March 24, 2021
The creation of Google Penguin
Penguin is the little brother of Panda. These are two filters used by the famous search engine, or rather by its ranking algorithm. To be even more accurate, they are two major updates, both aimed at cleaning up the Google index by identifying and penalizing sites that do not respect its famous Guidelines.
Indeed, Google quickly realized that many sites were exploiting the flaws of its algorithm to improve their organic rankings. This is why, after hunting down sites with poor or plagiarized content in 2011, the search engine went after all those using fraudulent techniques to climb the results in 2012.
How Google Penguin works
Our oviparous algorithm hunts cheaters: all those who use methods flagged in the Guidelines as part of their SEO strategy. Thus, since 2012, crawlers, also called Google Bots, have been tasked with paying special attention to certain details that were once overlooked. Here are some of the techniques tracked by the American giant in its fight against abusive ranking practices:
Keyword stuffing: A method that is now almost obsolete, it consists of cramming a large number of keywords into a page, even ones with no direct relation to its content. Worse, a long time ago some webmasters did not hesitate to write entire lines of keywords in a font color identical to that of the background…
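As a purely illustrative sketch (this is not Google's actual detection logic, and the 15% threshold is an arbitrary assumption), a filter could flag stuffing by measuring keyword density, i.e. the share of a page's words taken up by a single term:

```python
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.15) -> bool:
    # 15% is an arbitrary illustrative threshold, not a documented Google value.
    return keyword_density(text, keyword) > threshold

stuffed = "cheap shoes cheap shoes buy cheap shoes cheap shoes online cheap shoes"
natural = "Our store offers a wide selection of footwear at reasonable prices."
```

On the first string, "cheap" accounts for 5 of 12 words (about 42%), well above any natural density; on the second, it does not appear at all.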
Cloaking: More recent, and above all more elaborate, this approach requires a little more technique. It consists of identifying, among a website's visitors, the robots sent by the search engines. Recognized by their IP address, these crawlers are automatically served a different version of the page than the one presented to human visitors.
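To make the mechanism concrete, here is a minimal, hypothetical server-side sketch of cloaking, shown only to explain the deception Penguin penalizes (it checks the User-Agent header rather than the IP address for simplicity):

```python
# Simplistic, illustrative list of crawler signatures.
BOT_SIGNATURES = ("googlebot", "bingbot")

def serve_page(user_agent: str) -> str:
    """Return a different page depending on who appears to be asking.
    Serving crawlers content that humans never see is exactly the
    practice that gets a site penalized."""
    if any(sig in user_agent.lower() for sig in BOT_SIGNATURES):
        # Keyword-rich version shown only to crawlers.
        return "<html>Best cheap shoes, buy cheap shoes online now</html>"
    # The version real visitors actually see.
    return "<html>Welcome to our shoe store!</html>"
```

The giveaway is precisely that the two return values differ: fetching the page as Googlebot and fetching it as a regular browser yields two different documents.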
Content duplication: As the name suggests, this is the partial or complete reproduction of existing content. Far from improving SEO, it degrades the quality of the visitor experience, which is why Google sanctions this type of practice as well.
Netlinking: Here again, abuses existed. Netlinking itself was not blacklisted by Google, since links are necessary for its crawlers to function properly, but certain ways of doing it were. Indeed, there are what are called “link farms”: sites dedicated exclusively to creating backlinks for a third-party site, known as the “money site”. Today, Penguin is used to detect these poor-quality, artificially created sites. As a result, a site whose backlinks grow exponentially from this type of source could end up blacklisted by Google.
The penalties of Google Penguin
Of course, once the algorithm has identified you as a fraudster, a penalty follows. There are two types of sanctions: automatic penalties, handled by the robots, and manual penalties. The latter are usually the result of an anomaly reported by one of the crawlers, followed by a thorough check by a member of the control team to determine whether or not a penalty should be applied.
Only in the second case is an appeal possible. For these so-called “manual” sanctions, the owner receives a message from Google explaining why the site was penalized. They can then correct what was wrong and ask to have the site examined again.
The penalties can vary from a few lost places on a query to the de-indexing of one or more pages, or sometimes even an entire website. As you can see, Google was not joking when it presented its ambition to pursue a policy of “Fear, Uncertainty and Doubt”.