
Search engines torn between algorithmic and manual solutions to fight spam

February 28, 2011

Google has dominated the search engine landscape to the point that competitors have little confidence they can pull it down from that height. The tooth-and-nail competition, however, is apparent at present. It is probably this competitive impulse that persuaded Blekko to dump some sites outright from its index. The search engine's unannounced move looked like an aggressive strategy to climb the ladder, since Google has been facing allegations of returning thin results.

Since the term 'content farm' carries the fishy smell of spam and fabricated content, search engines must come up with ways to address at least two things. First, any new infrastructure must eliminate copycat sites that scrape or derive their content from original sources. Second, it must develop signals with which the authenticity of link graphs can be judged.
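Spotting copycat sites is, at heart, a near-duplicate detection problem. The sketch below illustrates one classic technique in Python, comparing overlapping word "shingles" with Jaccard similarity; the shingle size, threshold, and example pages are illustrative assumptions, not any engine's actual parameters.

```python
# Minimal near-duplicate check: compare word shingles with Jaccard similarity.
# The shingle size (5 words) and threshold (0.8) are illustrative assumptions.

def shingles(text, n=5):
    """Break text into overlapping n-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def looks_like_copy(candidate_text, original_text, threshold=0.8):
    """Flag the candidate page if it shares most of its shingles with a known original."""
    return jaccard(shingles(candidate_text), shingles(original_text)) >= threshold

if __name__ == "__main__":
    original = "Search engines struggle to separate original reporting from scraped copies of it."
    scraped = "Search engines struggle to separate original reporting from scraped copies of it today."
    print(looks_like_copy(scraped, original))  # True: near-duplicate of the original
```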

While there is no doubt that the search community disdains replicated material, the task of purging the SERPs of it is all the more difficult for the search engine. Search engines are nonetheless working out infrastructural and algorithmic changes. Some of these changes are already in the pipeline, while others are waiting to be implemented. The following is a brief discussion of some of them.

Personalization: Deeper personalization of search results is one of the key areas Google has been working on for quite a long time. Since the object of such personalization is to filter out unwanted sites and favor the ones you use frequently, the chance of spam reaching you is minimized and the results you see tend to be sites you already trust. Moreover, with the widespread adoption of smartphones, the search community expects even more personalized search in the near future, and the search engines are taking notice of these demands.
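As a rough illustration of how frequently visited sites could be favored, here is a minimal Python sketch; the per-user visit-count profile and the logarithmic boost are hypothetical assumptions, not Google's actual personalization model.

```python
# Minimal personalized re-ranking sketch: sites the user visits often get a
# logarithmic boost on top of the base relevance score. The weight is an
# illustrative assumption.
import math

def personalize(results, visit_counts, weight=0.3):
    """results: list of (url, base_score); visit_counts: {url: times visited}.
    Frequently visited sites are boosted; unknown sites keep their base score."""
    def score(item):
        url, base = item
        return base + weight * math.log1p(visit_counts.get(url, 0))
    return sorted(results, key=score, reverse=True)

if __name__ == "__main__":
    results = [("spammy-copy.example", 0.72), ("trusted-news.example", 0.70)]
    history = {"trusted-news.example": 25}  # this user visits the news site often
    print(personalize(results, history))    # the trusted site now ranks first
```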

Explicit User Feedback: In order to make your personalized search environment spam-free, Google has been using explicit feedback for some time. When a user takes a deliberate step to tell the search engine that a site matters, it works as a recommendation; this is known as 'explicit feedback'. Other ways of expressing preference include emailing a page, adding it to favorites, and so on. Traditionally such data have been very hard for Google to come by, but there is no doubt they will be very useful for powering personalized search.
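The sketch below shows one way such block/prefer signals might be folded into a user's results; the fixed boost value and the choice to drop blocked domains outright are illustrative assumptions, not a documented ranking rule.

```python
# Minimal sketch of applying explicit feedback: blocked domains are removed
# from this user's results, preferred domains get a fixed boost.

def apply_feedback(results, blocked, preferred, boost=0.5):
    """results: list of (url, score); blocked/preferred: sets of domains."""
    reranked = []
    for url, score in results:
        if url in blocked:
            continue  # explicit negative feedback: drop from this user's SERP
        if url in preferred:
            score += boost  # explicit positive feedback: promote
        reranked.append((url, score))
    return sorted(reranked, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    results = [("content-farm.example", 0.9), ("niche-blog.example", 0.6)]
    print(apply_feedback(results,
                         blocked={"content-farm.example"},
                         preferred={"niche-blog.example"}))
```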

Temporal Data: The algorithmic infrastructure of a search engine is sometimes inadequate for judging the relevancy of a page. It is well known that Google tends to favor pages it finds fresh for a given query space. But if a site inflates its relevancy by artificially pumping up its link graph, the algorithm can fail to catch it, and manual scrutiny becomes necessary. This manual review acts as a check: a rapidly growing link graph may actually weaken a site's visibility if the site is found to be spamming.
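One way temporal link data could feed such a check is to flag sites whose inbound links grow far faster than their own history. The sketch below is an illustration only; the weekly granularity and the spike threshold are made-up assumptions.

```python
# Minimal temporal spam signal: a sudden jump in new inbound links relative to
# the site's own baseline is flagged for manual review. The spike factor is an
# illustrative assumption.

def flag_for_review(weekly_link_counts, spike_factor=5.0):
    """weekly_link_counts: chronological list of new inbound links per week.
    Returns True if the latest week exceeds the earlier average by the spike
    factor, suggesting the link graph may have been artificially inflated."""
    if len(weekly_link_counts) < 2:
        return False
    *history, latest = weekly_link_counts
    baseline = sum(history) / len(history)
    return baseline > 0 and latest > spike_factor * baseline

if __name__ == "__main__":
    organic = [12, 15, 11, 14, 16]      # steady, natural link growth
    suspicious = [12, 15, 11, 14, 160]  # sudden tenfold jump in one week
    print(flag_for_review(organic), flag_for_review(suspicious))  # False True
```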

Social Graph: The social graph and real-time search are two other areas Google is exploring to make your search results more personalized by weeding out weak content. When you link your social network accounts to your Google profile, it pulls data from them into your personalized search environment, narrowing the range of sites a keyword returns.
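Here is a minimal sketch of what a social-graph boost could look like, assuming a hypothetical count of how many of the user's contacts shared each page; the per-share weight is an illustrative assumption.

```python
# Minimal social-graph boost sketch: pages shared by the user's contacts are
# promoted in that user's personalized results.

def social_boost(results, shares_by_contacts, per_share=0.1):
    """results: list of (url, score); shares_by_contacts: {url: number of the
    user's contacts who shared or recommended it}."""
    boosted = [(url, score + per_share * shares_by_contacts.get(url, 0))
               for url, score in results]
    return sorted(boosted, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    results = [("anonymous-aggregator.example", 0.80), ("friends-review.example", 0.75)]
    shares = {"friends-review.example": 3}  # three contacts shared this page
    print(social_boost(results, shares))    # the socially endorsed page ranks first
```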

Limitations of manual solutions: Non-algorithmic mechanisms appear to be more efficient at rooting out weak content sites, but a closer look at such a subjective approach makes the pitfalls apparent. One of the major ones is that Google would still be unable to judge how honest any piece of explicit feedback really is.

Moreover, there is little doubt that the user community would not want Google acting as judge and jury, deciding the relevancy of sites arbitrarily. Since a subjective strategy would be more complex and resource-intensive, the SERPs could end up poorer than they are at present, and critics would not hesitate to accuse Google of bias. Under such circumstances, manual solutions feel less friendly, and the objectivity of the algorithm appears more acceptable.
