With the increasing power and reach of search engines, there is an ongoing debate about the relative merits of algorithmic search engines and human-edited directories. The big search engines are locked in a constant battle with web developers chasing the all-powerful first place in search results. Do a Google search on "horse AIM icons", for example, and the first several results returned are all advertisements for various drugs.
So here's an idea: why doesn't Google implement an API that lets web users flag deceptive page results? By requiring people to sign up for an API key and limiting the number of sites each key can report per day, Google could curb malicious use of the service. Ultimately, a human could review the top offenders and maintain a blacklist of sorts to remove them from the top of the results.
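To make the idea concrete, here is a minimal sketch of what such a service could look like. Everything in it is hypothetical: the `SpamReportService` class, the per-key daily quota, and the `top_offenders` review queue are invented for illustration and are not any real Google API.

```python
from collections import defaultdict
from datetime import date

class SpamReportService:
    """Hypothetical spam-reporting service sketch.

    Each API key may file at most `daily_limit` reports per day,
    which caps how much damage a malicious reporter can do.
    """

    def __init__(self, daily_limit=5):
        self.daily_limit = daily_limit
        self.usage = defaultdict(int)    # (api_key, day) -> reports filed
        self.reports = defaultdict(int)  # url -> total report count

    def report(self, api_key, url, today=None):
        """Record one report; returns False if the key's daily quota is spent."""
        day = today or date.today()
        if self.usage[(api_key, day)] >= self.daily_limit:
            return False  # quota exhausted: reject to limit abuse
        self.usage[(api_key, day)] += 1
        self.reports[url] += 1
        return True

    def top_offenders(self, n=10):
        """Most-reported pages, queued for human review and possible blacklisting."""
        ranked = sorted(self.reports.items(), key=lambda kv: kv[1], reverse=True)
        return ranked[:n]
```

The daily quota means a single bad actor can only nudge the counts a little, while many independent reports of the same page push it to the top of the human reviewer's queue.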