In the last few weeks there has been a higher-than-usual number of articles on search quality, with mainstream media sources shining a bright light on companies achieving high search rankings through questionable methods. A small eyeglass manufacturer, for example, seemingly provided poor customer service on purpose so the company could benefit from a higher search rank as a result of negative comments from customers.
In another case, J.C. Penney used an outside SEO company to place links on low-quality pages all over the web, and when the New York Times wrote an article about it, Google scrambled to penalize the company's results. Then Overstock made headlines when it was learned the company had offered discounts to colleges while encouraging them to post links back to its site.
The point here is fairly obvious: hundreds of thousands of people, if not more, are dedicated to analyzing how search engines work in order to raise their rankings online. That is nothing new. What is new is Google aggressively taking action, in some cases within a few hours of a problem surfacing in the mainstream media or the blogosphere.
Google is doing its best to defend its public reputation.
But with so much social sharing taking place on the web, it is only natural that search engines will look to humans to help determine which information is of the most value. This isn't to say search engines don't already use humans to check results when they change algorithms; it's just that algorithms are somewhat consistent, meaning that over time it becomes possible to reverse engineer how they work.
Moreover, the last few years have shown us a more open Google, and the company has shared some bullet points about how its search technology works.
So now we are entering a new phase of search in which humans become more important. In response to negative sentiment online about Demand Media sites appearing frequently and ranking highly in search results, for example, Google took its experimental Toolbar feature that lets users block websites and made it broadly available.
You might recall that a few years back, Wikia Search was launched, and later shut down, by Wikia co-founder Jimmy Wales. The idea was to use humans to power search.
So humans alone aren't the answer. In a way, Google has already been using the power of preference to fuel its algorithms: its toolbar acts as a mechanism that lets the company know who is clicking on what in search results.
But when you factor in the massive amounts of data held by Twitter and Facebook, you have the ability to provide search results that match user preferences to some degree. The real threat to Google, then, is the data contained in social networks, since it would allow any company serious about competing to tap into a database of solid results.
And this is why Bing is integrating Facebook content into its results, as it has already done with Twitter. The same goes for Wolfram Alpha, another human-based source of information that could be relevant to search.
The question, of course, is whether Google can stay ahead of Bing, and whether any other company will have the guts to launch a new search engine that leverages social signals more than algorithms.
Facebook itself may be best positioned to pull it off, and this is likely one of the reasons the company is furiously hiring programmers.