Google is having a seriously tough time checking the tide of spam that threatens the very quality of its search results, which is in effect an attack on its supremacy. As expected, a furious Google is not willing to lose its numero uno position, and so we have the great Google Farmer update. It is now highly likely that websites with poor content will no longer surface in its search results, or at least that is what the company claims. But does this mean that simply having great content on your website will automatically boost its visibility for competitive keywords? Definitely not. Google is smarter than that. It no longer treats links as the only factor for deciding which results rank well. A sea change can be observed as Google turns its attention from links to other signals, which this article will try to cover.
Change Is the Only Constant
Google is in the habit of introducing small changes occasionally to keep its results fresh and relevant, and each time it introduces something new, it throws webmasters into a tizzy as they scramble to decipher the real significance of the changes.
In 1999, Google introduced a major change to its algorithm and started focusing on a unique meta title and meta description for each page.
In 2002, Google began favoring PageRank and on-page factors for ranking pages in search results.
In 2005, Google started giving more attention to the domain name, anchor text and on-page factors.
But it was in 2011 that Google introduced the Farmer update. It is evident, then, that updates are quite common with Google, which means that, as an SEO company, we need to learn the art of living with them.
Brand Value: It was not long ago that the JCPenney incident came to light, causing Google much discomfort. Strangely enough, Google took months to take punitive action against the website. The reason was simple: although the website was clearly manipulating Google’s algorithm to gain top rankings, most users found it good enough to spend their valuable time on. This is what made it really hard for Google to scrub the site from the SERPs. But Google finally devised a mechanism to solve the problem: it focused on brand value. The search engine began factoring in brand strength, meaning that established and renowned brands would get greater visibility over generic ones. The motivating assumption behind the decision was that users are more likely to be happy if their query surfaces information from sites such as Amazon and Puma.
Entity Associations: Another major change in the metrics the search engine uses for conferring ranks was the inclusion of ‘entity association’. With this change, certain sites were sure to hit the bull’s eye, provided their content carried the entity named in the search query. For example, if a searcher types ‘The King’s Speech’ into the search bar, the SERP might include sites such as RottenTomatoes, Flixster, Metacritic and IMDB.
However, this does not mean the search engine is biased. Google’s search results are merely influenced by the identification of an entity (which might be a specific person, place or thing) in the search query.
Google’s acquisition of Metaweb has helped it catalog different names for a single entity. As a result, search results can reflect unpredictable but relevant content. For example, if a search query contains words like Terminator, Governor, Kindergarten Cop or Conan the Barbarian, Google can easily associate them with the name Arnold Schwarzenegger and produce results that are relevant.
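The alias-to-entity cataloging described above can be pictured as a simple lookup table. The sketch below is purely illustrative: the alias list and the matching logic are hypothetical examples of the general idea, not Google’s (Metaweb-derived) implementation.

```python
# Illustrative sketch of entity association: a catalog that maps several
# known aliases to one canonical entity. The data here is a made-up example.
ALIASES = {
    "terminator": "Arnold Schwarzenegger",
    "governor": "Arnold Schwarzenegger",
    "kindergarten cop": "Arnold Schwarzenegger",
    "conan the barbarian": "Arnold Schwarzenegger",
}

def resolve_entity(query):
    """Return the canonical entity if any known alias appears in the query."""
    q = query.lower()
    for alias, entity in ALIASES.items():
        if alias in q:
            return entity
    return None
```

With a catalog like this, queries as different as “Kindergarten Cop cast” and “Conan the Barbarian remake” would all resolve to the same entity, which is what lets the engine surface related but differently worded content.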
Human Quality Raters & (Trusted) User Behavior: Over the past decade, Google’s ranking mechanism has been governed by two factors: user feedback and its algorithm. While users collectively could click on and rate sites, Google’s mathematical equations sought to sift sites on the basis of signals, and this scoring procedure determined the visibility of one site over another. But the release of Google Chrome indicates that Google is gradually decreasing its reliance on the algorithm alone, because Chrome’s web extension allows users to block sites they don’t like (as did SideWiki, SearchWiki and Starred Results before it). Even Bing is factoring in what is called ‘clickstream data’, a form of user-based information. It is a clear signal that user information and usage data will broadly influence result rankings. Other subsidiary data, such as which results deliver a good user experience, might also change the ranking procedure.