We often get asked by clients what the different Google Updates are and what they target, so we have decided to give an overview here:
Google’s primary product is their ultra-famous search engine. It relies on an algorithm, which is designed to go out into the internet and seek out whichever websites most closely correspond to the query submitted by the user.
Since Google’s search engine is automated, it follows that it can be tricked, using a number of techniques. At one time, an unscrupulous webmaster might have created a page full of links and keywords, all designed to fulfil the criteria of Google’s algorithm without providing any content of value to Google’s users.
Google devote a significant chunk of their resources to discouraging such techniques and have a dedicated Webspam team, headed by Matt Cutts (although he is currently on leave). They frequently release updates for their algorithm, most of which occur behind the scenes. Occasionally, however, they will release an update significant enough to be given a codename. Each of these named updates serves a specific role in improving the quality of Google’s results and weeding out the poor-quality and spammy websites that are taking advantage.
Google have long maintained a list of technical guidelines for webmasters, in which they outline a number of principles that they expect to be adhered to. The guidelines broadly state that websites should be created with a human audience in mind, rather than for the benefit of a ranking algorithm. Each update to Google’s algorithm is a means of punishing those that violate this principle.
This punishment is not permanent, however. Whenever a new version of a Google update goes live, pages whose ranking was curtailed by a previous update have a chance of having that reversed – but only if they have made the appropriate changes in the interim. So, if a website suffers after an update but then takes steps to comply with Google’s guidelines, it may recover when the next one rolls out.
What is Panda?
The ‘Panda’ update was released in February 2011 and was designed to penalise poor-quality websites. There are many quantifiable factors which suggest the quality of a website. While Google are reticent to detail exactly what these factors are, their technical guidelines do advise against a number of specific techniques, such as ‘doorway’ pages and ‘scraped’ content, and it is relatively straightforward for Google to find sites which employ them. Panda will also examine a page’s ‘bounce rate’: the proportion of users who spend only a very short time on the page before returning to the search results.
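To make that metric concrete, here is a minimal Python sketch of how a bounce rate could be estimated from a list of visit durations. The ten-second threshold and the sample figures are purely illustrative assumptions on our part, not values published by Google.

# Purely illustrative sketch: estimate a bounce rate from visit durations (in seconds).
# The 10-second threshold and the sample visits are assumptions, not Google figures.
visit_durations = [3, 120, 8, 45, 5, 200, 7]
BOUNCE_THRESHOLD = 10  # visits shorter than this count as a bounce

bounces = sum(1 for duration in visit_durations if duration < BOUNCE_THRESHOLD)
bounce_rate = bounces / len(visit_durations)
print(f"Bounce rate: {bounce_rate:.0%}")  # 4 of 7 visits bounced -> 57%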
Panda seeks out and flags pages which it deems to be of poor quality, in much the same way that the spelling and grammar check on a word processor seeks out and flags spelling and grammar errors. As well as the bounce rate, Panda will also look for things like duplicate text, duplicate page design, pages with hardly any text and pages with hardly any media content at all. This is why the update was originally dubbed the ‘Farmer’ update by Search Engine Land, as it targeted content farms: sites with thousands of low-quality pages created purely to rank and bring in traffic.
Panda flags offending pages as ‘spam’. As one might expect, this causes the page’s ranking to drop, with the size of the drop depending on the severity of the violation.
What is Penguin?
In April 2012, Google released an update called ‘Penguin’, which was designed as a complement to Panda. Though the goals of the two updates are similar, they perform two quite distinct functions. While Panda is concerned with the content of a site, Penguin’s focus is narrower: it deals with the link profile of a site.
Webmasters may be tempted to artificially boost their rankings, either by purchasing links or through link networks designed for that specific purpose. Some of the links provided may be of dubious quality and relevance. For example, if a page about fridges links to a page about zoology, Penguin will interpret this as an attempt to game the system and punish not only the offending site but all of the sites which link to it. Another common technique that has caused people to be hurt by Penguin is the use of hundreds or even thousands of link directories pointing at their site. These directories are low quality, and their only purpose is to give sites more links in the eyes of the search engines.
Penguin also looks at the anchor text of the links pointing at a site to see whether it appears natural. In a natural link profile, it would not be uncommon for brand-term links to make up in the region of 60% of the overall profile. In the past, many webmasters bought links whose anchor text matched the keywords they were targeting, as this previously boosted rankings for those terms.
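As a rough illustration of that kind of check, the short Python sketch below works out what share of a backlink profile uses brand-term anchor text. The anchors and the ‘Acme Widgets’ brand terms are invented placeholders, and the 60% figure above is a rule of thumb rather than a published Google threshold.

# Purely illustrative sketch: what share of a backlink profile uses brand-term anchors?
# The anchor list and brand terms below are made-up placeholders, not real data.
anchors = [
    "Acme Widgets", "acmewidgets.com", "Acme Widgets",
    "buy cheap widgets", "click here",
    "Acme Widgets", "best widgets online",
]
brand_terms = {"acme widgets", "acmewidgets.com"}

brand_links = sum(1 for anchor in anchors if anchor.lower() in brand_terms)
brand_share = brand_links / len(anchors)
print(f"Brand-term anchors: {brand_share:.0%} of the profile")  # 4 of 7 -> 57%

A profile dominated by exact-match keyword anchors rather than brand terms is exactly the kind of unnatural pattern Penguin looks for.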
For further information on the link schemes that Penguin targets, have a look at Google’s list here.
This is bad news for the webmasters who employ such tactics. Happily for them, Google provide a convenient ‘disavow links’ tool, which can be used to indicate that the search algorithm should disregard links from a given page or domain. To be safe, though, it is recommended that offending links are removed altogether. However, the disavow tool is intended for advanced users, and if it isn’t used correctly it can cause a site further harm. Before submitting a disavow file, it is best to contact the webmasters of the sites that you no longer want linking to you and ask them to remove the links.
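For reference, a disavow file is a plain text file uploaded through the tool, with one URL or domain per line; lines beginning with ‘#’ are treated as comments, and a ‘domain:’ prefix disavows every link from that domain. The entries below are placeholder examples only:

# Placeholder disavow file - the entries below are examples only
# Lines beginning with "#" are comments and are ignored
domain:spammy-directory-example.com
http://www.example.com/page-with-paid-link.html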
What is Pigeon?
Pigeon is the latest Google filter to be released. Unlike the Panda and Penguin updates, Pigeon is not aimed at penalising sites. It is, instead, a fundamental change to Google’s algorithm, specifically concerning the way in which local search results are dealt with.
Since Pigeon was released only very recently, details on this update are sparse. It currently only affects searches in US English, though other languages are expected to follow. Google claim that Pigeon is designed to improve location-based services. What this means in practice is that businesses local to the user are prioritised over those which are further away. Statistics compiled by BrightEdge indicate that the change impacted centralised businesses, such as job and real-estate websites, most harshly; by contrast, more localised businesses, such as those in the hospitality, food and education sectors, enjoyed a boost.
There is currently talk in the industry suggesting that Google may have reversed Pigeon, or turned down its strength, due to the severe impact it has had and the concern it has caused among local search marketers.
What’s new in the latest updates?
Before we proceed, a clarification is in order. While Google will occasionally give a name and number to versions which they deem substantial, they do not always extend this practice to the intervening versions.
Panda (4.1)
The latest incarnation of the Panda update is 4.1, which has significantly impacted affiliate sites. There are many affiliate sites with very thin content on pages designed purely to drive users to partner sites. Another important finding among sites impacted by this latest update is keyword-stuffed pages, in particular pages stuffed with their location in order to rank locally. An example of this would be page titles with the following, or a similar, format:
Location Keyword | Keyword Location | Location Keyword
This is then followed by regular use of these keywords throughout the page. Panda itself doesn’t change; it just gets more aggressive at finding low-quality, spammy content whose main or sole purpose is to rank for key terms and drive organic traffic.
Penguin (2.1 and 3.0)
Version 2.1 is the latest iteration of Penguin; it essentially carries out a deeper analysis of a site, looking at pages further down a site’s hierarchy. Anyone who had been employing spammy deep-link tactics, such as directory and forum links, was therefore hurt this time around. For a more in-depth view of Penguin 2.1, read our previous post here.
Another major overhaul of the Penguin algorithm is imminent. As the numbering suggests, version 3.0 is not simply an update but a large-scale rewrite of the algorithm. Google’s Gary Illyes announced at a recent convention that it could arrive as early as mid-October.
Conclusion
Google is constantly evolving and looking for ways to improve its search results, listing the best-quality results to suit a query. Optimising your site to take advantage of this is not a quick or easy process, but it is one worth taking time over as part of a complete digital marketing strategy. You need to focus on creating great content that actually serves a purpose for the visitor. You also need to make sure that your site is easy to navigate, loads quickly and has its information clearly laid out. Finally, you need to focus on earning your authority, which means creating content and campaigns that naturally earn links and social shares. Forget about search engines and focus on the visitor!