As digital marketers, we all like to say that our way is the right way: that if you want your company to become a household name and your website to rank alongside the frontrunners in your industry, then we hold the key. The truth is, despite our varying campaigns and SEO techniques, we are all working towards cracking the same code. The ever-changing Google algorithms are at the centre of our digital marketing strategies, and our ever-increasing knowledge of them is one of our biggest assets in the SEO world.
There is a huge amount of speculation when it comes to what factors Google takes into account when ranking sites. The elusive ‘200 Ranking Factors’ have long been debated by leading voices in the field, and have never been confirmed entirely by Google themselves. As of March 2016, Google has confirmed that backlinks and content are, in no particular order, the top two ranking factors, with RankBrain having already been confirmed as the third. We are very much left to our own devices to discern what it is that Google is looking for from a website, meaning that the better we understand their algorithms, the more intelligently we can create strategies for our clients.
There are a few ways to approach understanding the Google algorithms, but to understand effectively how they work together, it is important to take them out of their chronological context and address them as individual parts of a whole mechanism.
Despite the Panda and Penguin algorithms having been released before Hummingbird, I believe it is important to start here because of the enormous overhaul that Hummingbird brought about. Before Hummingbird, the main Google algorithm didn’t have a name. It ran alongside Panda and Penguin, focusing on gathering information, but not sorting it in the most user-friendly way.
Released in September 2013, Hummingbird completely altered the way that Google manages search results. Whereas previous algorithms had merely been add-ons, Hummingbird was an entirely new engine – an engine that still utilises the older parts that are Panda and Penguin.
Hummingbird works to understand the semantics of a user’s search, understanding what they actually mean based on their search query. An example of this would be: if I searched “Places to eat pizza in London”, Hummingbird would interpret ‘places’ as ‘restaurants’ and show me pizza restaurants in London.
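To make the idea concrete, here is a toy sketch of that kind of query rewriting. To be clear, this is not how Hummingbird actually works internally – the phrase mappings below are invented for illustration – but it shows the principle of resolving a vague phrase into the intent behind it.

```python
# Toy illustration only: mapping vague phrases onto their likely intent,
# the way Hummingbird resolves "places to eat" into "restaurants".
# These mappings are invented examples, not Google's.
INTENT_SYNONYMS = {
    "places to eat": "restaurants",
    "somewhere to stay": "hotels",
}

def rewrite_query(query: str) -> str:
    """Replace known vague phrases with their inferred meaning."""
    result = query.lower()
    for phrase, meaning in INTENT_SYNONYMS.items():
        result = result.replace(phrase, meaning)
    return result

print(rewrite_query("Places to eat pizza in London"))
# → "restaurants pizza in london"
```

The real system infers meaning statistically rather than from a fixed lookup table, which is precisely why it can handle queries it has never seen before.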
This upgrade was met with mixed feelings; although it is most definitely an improvement on the previous algorithm, Hummingbird was blamed by many for a drop in rankings. However, this was more likely due to a refresh of the Penguin algorithm at the end of September, as Hummingbird had already been running for a month prior to its announcement, with no noticeable consequences.
The Panda algorithm was first launched in February 2011, and marked the beginning of a change in how Google vets the quality of websites. If Hummingbird is the engine, Panda is an additional part that runs in conjunction with it. Google’s intentions have always been to provide users with the most relevant search results possible, and not to bog them down with spammy and irrelevant sites practicing poor SEO. The purpose of Panda is to show sites with high-quality content higher in search results, and to relegate those that are of lower quality.
An article written by Google’s Amit Singhal at the launch of Panda explains the ins and outs of what the algorithm bases its judgement on. In layman’s terms, in order to be seen favourably by Panda, your content must be original, of good authority, trustworthy, and well written. There are far more areas to bear in mind, which can be found in the above article. Also, be wary of thin content – a page that adds little or no value to the person reading it. If a large proportion of your pages have only a sentence or two of content, then they are likely to be judged as low quality by Panda.
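If you want a rough-and-ready way to spot thin content on your own site, a simple word-count audit is a reasonable starting point. The 50-word threshold below is my own assumption for illustration – Google has never published a cutoff – but pages flagged this way are worth a closer look.

```python
# Hypothetical thin-content audit: flag pages whose body text falls
# below a word-count threshold. The 50-word cutoff is an assumption
# for illustration, not a figure Google has published.
def find_thin_pages(pages: dict, min_words: int = 50) -> list:
    """Return URLs of pages whose text contains fewer than min_words words."""
    return [url for url, text in pages.items()
            if len(text.split()) < min_words]

pages = {
    "/guide": "A long and genuinely useful original article. " * 20,
    "/stub": "Coming soon.",
}
print(find_thin_pages(pages))  # → ['/stub']
```

A real audit would also weigh uniqueness and usefulness, not just length – a short page that fully answers a question is not thin content.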
Penguin followed Panda in April 2012, with the aim being to reduce the trust that Google has in unnatural backlinks that are used to gain an advantage in the Google results. As previously mentioned, links are considered one of the top three Google search signals when it comes to determining ranking, meaning that any websites practicing black hat tactics were heavily penalised upon its release.
Whilst genuine links to and from other sites are looked upon favourably by Google, if a site has a large number of links from suspicious, low-quality sites, then Google isn’t going to trust it, and the website will be penalised.
Penguin can have a huge impact on your rankings if you are considered to have untrustworthy links on your site. If Google doesn’t trust the links you have, it will reduce the ranking of your entire website, not just the relevant page. The backlash of Penguin was huge in some cases, but there is a lot we can learn from it, too. Remember, Google doesn’t update their algorithms to benefit online business, but rather the people who use Google to search. If you always keep your visitors at the front of your mind when engaging SEO strategies, then your website should continue to grow.
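One practical takeaway is to audit your backlink profile yourself before Penguin does it for you. The sketch below is a simplified illustration, assuming you hold a hand-maintained blocklist of domains you consider low quality (the domain names here are made up); in practice you would feed the suspect links into Google's disavow process.

```python
# Illustrative sketch only: checking backlinks against a hand-maintained
# blocklist of low-quality domains. The domains below are invented
# placeholders, not real sites.
from urllib.parse import urlparse

LOW_QUALITY_DOMAINS = {"spamdirectory.example", "linkfarm.example"}

def suspicious_backlinks(backlinks: list) -> list:
    """Return the backlinks whose domain appears on the blocklist."""
    return [link for link in backlinks
            if urlparse(link).netloc in LOW_QUALITY_DOMAINS]

links = [
    "https://bbc.co.uk/news/article",
    "https://linkfarm.example/page123",
]
print(suspicious_backlinks(links))
# → ['https://linkfarm.example/page123']
```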
Google’s Pigeon algorithm is a fairly new addition to the family, having been released in July 2014. It focuses on finding more useful results for users based on their location. This is already a focus in Hummingbird; however, Pigeon offers improved distance and location ranking parameters to the user. By creating closer ties between the local algorithm and the core algorithm, the same SEO factors are used to rank local and non-local Google results. This means that if you are optimising your site to include your client’s location, then you shouldn’t see any repercussions from this update.
We have known about Google RankBrain since 2011, when its development originally began. Google then confirmed that they were using it for a “substantial percentage” of all Google queries in October 2015.
Through deep-learning techniques, Google are working on cracking the code of artificial intelligence. RankBrain is a machine-learning artificial intelligence: it uses mathematical processes and an advanced understanding of language semantics to gradually learn what makes people search, and then applies those conclusions to future search results. This means that it isn’t using a set of preprogrammed methods, so it isn’t held back by any preset ideas and doesn’t handle scenarios in a predetermined way.
There isn’t much that we know about RankBrain at this moment in time, only that it is utilised for the majority of online searches. However, we can determine what it isn’t. RankBrain isn’t a new algorithm in the same way as Panda and Penguin are; it is a modification that works in conjunction with Hummingbird to produce the most meaningful search results for the user. It is also important to remember that RankBrain isn’t a robot, nor is it conscious. It is just a series of mathematical equations that will continue to improve over time, providing users with increasingly accurate search results.
Keep It in Perspective
It is easy to get bogged down with algorithms, and to start scrutinising your every move so as not to be hit by updates. The truth is, Google makes over 600 changes to its algorithm a year, and the vast majority are unannounced. Therefore, questioning your every step is likely to hinder your progress rather than boost it.
The key motivation behind all of Google’s algorithms is to ensure that the user experience is as accurate and useful as possible. So, by keeping online visitors at the forefront of all of your SEO strategies, you are unlikely to come up against any backlash from Google.