April 30, 2021
The creation of Google Panda
On February 24, 2011, the Internet as we knew it changed its face: Google deployed its new filter, Google Panda, for the very first time. Unlike most of the algorithm updates made until then, this was not a relevance filter but a cleaning filter. For the first time, the quest to improve search results involved demoting, or even de-indexing, sites that had not respected Google’s guidelines.
Indeed, the American search engine had long been aware that clever operators were taking advantage of flaws in its algorithm. The famous grey hats and black hats (which we talked about in this article and this one) hijacked the way it worked to their advantage, rising illegitimately in the search results. So, ten days after Valentine’s Day 2011, Google began a long and ruthless hunt for fraudsters, an event that would forever mark the history of SEO and content optimization.
How Google Panda works
While its younger sibling, Google Penguin, would later relentlessly go after those who tried to boost their rankings without respecting the search engine’s rules, Google Panda’s primary objective is to check the quality of the content on indexed sites. It therefore focuses on four main families of practices that it tries to eliminate from search results as much as possible:
Duplicate content. Whether within a single site or across different sites, the same content spotted several times is automatically removed from the results, except for the original page, the one on which it was first published.
Automatically generated content. Although artificial intelligence is far more powerful nowadays, Google Panda’s goal at the time was to identify automatically generated content, which usually contained syntax, grammar and other errors indicating, most of the time, a text produced by a robot.
Hidden content. Often associated with cloaking (see the article on Google Penguin), hidden content is usually text concealed on a page of a site. Invisible to the user, it can in theory be spotted by robots, which then sanction the page concerned.
Over-optimization. Over-optimizing your content, whether by multiplying keywords, internal or external links, or by any other means, very often makes texts indigestible and poor in quality. Here again, Panda punishes.
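To make two of these practices more concrete, here is a minimal sketch of the kind of heuristics involved: word-shingle overlap (Jaccard similarity) as a crude duplicate-content signal, and keyword density as a crude over-optimization signal. This is purely illustrative; the function names are my own, and Google’s actual detection methods are far more sophisticated and not public.

```python
import re

def shingles(text, k=3):
    """Split text into a set of lowercase word k-grams ("shingles")."""
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(text_a, text_b, k=3):
    """Jaccard similarity between the shingle sets of two texts.
    1.0 means identical word sequences; 0.0 means nothing in common."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def keyword_density(text, keyword):
    """Fraction of words in `text` equal to `keyword`.
    A very high value is a naive sign of keyword stuffing."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)
```

Two pages whose shingle similarity is close to 1.0 would be near-duplicates under this heuristic, and a page where a single keyword accounts for a large share of all words would look over-optimized; real systems combine many such signals rather than relying on any one threshold.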
What to remember about Google Panda?
If Panda has not been officially updated since 2015, it is because this filter is now an integrated part of the core algorithm, just like Penguin. More generally, Google will keep hunting down and eliminating sites that do not respect its guidelines. Your content must therefore be unique and useful in order to meet Google’s quality standards and avoid being de-indexed from search results. You will have understood that SEO is a strategy that can be particularly rewarding when well mastered, but that can also turn against you when poorly applied. In short, it is always preferable to call on an SEO expert.