Understanding Google Updates

Posted on Apr 24, 2017 in search algorithms

The SEO community generally perceives Google updates as targeting low quality. Any time a change occurs in Google's algorithm, the change is understood through the lens of what kinds of sites lost ranking. This is a less than ideal way to understand what Google is doing, and it's why so many theories contradict each other: they're built on the wrong metrics.

These are the typical touchstones the SEO community focuses on to understand what an update was:

  • low quality content
  • low quality user experience
  • low quality links

What Google Updates Really Target
The SEO industry frames nearly every Google update as targeting low quality. But Google also targets high quality. Targeting low quality is only a small part of what Google's algorithm does.

If you think about it, what the algorithm really does is understand what a user wants when they make a search query and understand how a web page can solve that user's problem. Excluding low quality is just a small part of that process.

If Google changed its algorithm in order to better target high quality web pages, the SEO industry would still see the change as targeting low quality. Does that sound like an accurate way to understand Google?

This is what Google might be targeting:

  • High quality content
  • High quality user experience
  • High quality links
  • Better user satisfaction (with Google)

Google Core Algorithm Updates
The fact is that Google updates its algorithm on a continual basis. In 2012 Google changed its algorithm 665 times. That ongoing stream of changes is what's called a Core Algorithm Update.

That's different from what is generally known as An Update, a profound change in Google's algorithm. Examples of major updates are the introduction of the Penguin Algorithm, the Panda Algorithm, and the Hummingbird Update. In the case of Hummingbird, Google introduced the update and announced it a month later. The SEO community never noticed the change.

Updates Go Unnoticed
A similar thing happened with the introduction of an artificial intelligence component that Google calls RankBrain. RankBrain was introduced sometime around the spring of 2015 and the search industry didn't notice. Some people (myself included) noticed changes. On May 26, 2015 I posted observations on WebmasterWorld about recent trends in search results that showed how user intent was a prime ranking consideration and keywords were less important. I described a process (one of many) where AI is used to understand language:

 For example, there is a paper related to artificial intelligence that attempts to understand words and their different contexts. Turns out the accuracy of the system improves when the source documents that form the basis of the semantic analysis are considered together with the links that interlink those core pages.

My point was that AI was making traditional "keyword-focused SEO" obsolete, and in 2015, before RankBrain was announced, I showed examples of sites that ranked without keywords or synonyms in their pages. These were web pages that were understood to meet the user intent of the search query. I didn't know it was RankBrain because it hadn't been announced. However, I understood that AI was playing a role in the SERPs and I said so. The response was, predictably, one of disbelief!
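To make that distinction concrete, here is a minimal sketch in Python. It is purely illustrative: the pages, vectors, and scores are invented, and nothing here reflects how RankBrain actually works. It simply demonstrates the pattern described above: a page with none of the query's keywords can still score well when the comparison is made on meaning rather than on literal word overlap.

```python
# Purely illustrative toy: hand-made vectors stand in for whatever learned
# representation a real system might use. Not Google's implementation.
import math

def keyword_score(query, page_text):
    """Fraction of query terms that literally appear in the page text."""
    terms = query.lower().split()
    words = set(page_text.lower().split())
    return sum(t in words for t in terms) / len(terms)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

query = "cheap flights to rome"
query_vec = [0.9, 0.7, 0.1]  # invented 3-d "meaning" vector for the query

pages = {
    # Page A repeats the query's keywords verbatim.
    "page_a": {"text": "cheap flights to rome and budget airfare deals",
               "vec": [0.9, 0.8, 0.0]},
    # Page B shares no keywords with the query but answers the same intent.
    "page_b": {"text": "low cost airlines flying into fiumicino airport",
               "vec": [0.85, 0.75, 0.05]},
}

for name, page in pages.items():
    print(name,
          "keyword match:", round(keyword_score(query, page["text"]), 2),
          "semantic match:", round(cosine(query_vec, page["vec"]), 2))
# page_b scores 0.0 on keyword match yet roughly the same as page_a on
# semantic match -- the "ranking without keywords" pattern described above.
```

An audit that only checks keyword overlap would conclude that page_b has no business ranking, which is exactly the blind spot being described here.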

The Wrong Way to Understand a Google Update
Unless Google announces what a major update is, the SEO community has a difficult time understanding it. The reason is that the industry continually views updates through the lens of who lost rankings. 99% of the time that you read about an update, it will be reviewed in the context of who the losers were and what kinds of sites were targeted. That may not be the best way to examine an algorithm change.

A Better Way to Understand a Google Update
If you spend any amount of time reading Google patents and information retrieval research, one thing will become clear: the state of the art in search engine science is focused on understanding concepts the way humans understand them. Another (large) area is understanding how to satisfy users. What you won't see much of is spam fighting. The truth seems to be that if you focus on identifying what satisfies people (which means understanding the queries and the documents that answer those queries), the low quality web pages take care of themselves.

In other words, there may be little need to target low quality web pages for demotion if you spend your effort on identifying high quality web pages.
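As a thought experiment, here is a tiny sketch of that idea (the URLs and scores are made up; this is not an actual ranking system): pages are ordered purely by a positive "does this satisfy the query" signal, and the thin page ends up at the bottom without any spam-specific demotion step.

```python
# Toy illustration of the point above. Scores and URLs are invented;
# there is no anti-spam penalty anywhere in this snippet.
pages = [
    {"url": "/in-depth-guide",    "satisfies_query": 0.92},
    {"url": "/decent-overview",   "satisfies_query": 0.71},
    {"url": "/thin-doorway-page", "satisfies_query": 0.12},  # never flagged as spam
]

# Rank purely by the positive signal -- no demotion step exists.
ranking = sorted(pages, key=lambda p: p["satisfies_query"], reverse=True)

for position, page in enumerate(ranking, start=1):
    print(position, page["url"])
# The doorway page lands last not because it was "targeted",
# but because nothing about it satisfies the query.
```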

If Google updates its algorithm to better understand what a web page is about, you will never figure out what is happening if you focus solely on what web pages no longer rank. How can someone understand an algorithm by examining the wrong metrics?

The Phantom Update Misunderstanding
Misunderstanding what Google was doing is precisely what may have happened with the so-called Phantom Updates. The reason so many theories about what happened conflicted is that they were examining the wrong metrics.

  • Did it affect sites with bad links?
  • Did it affect sites with too many ads?
  • Why did it affect ecommerce sites?

The confusion came from focusing on the sites that lost rankings instead of looking at the sites that gained rankings. This focus on the wrong metrics gave rise to concepts such as Phantanda (Phantom + Panda) and Phantenguin (a link-focused Phantom, i.e. Phantom + Penguin). I ignored the Phantom update dialogue because there was little to zero substance to it, which is why I posted the following on Twitter:

 

[Embedded tweet: Google Phantom Update]

A Perspective on the Google Algorithm
The most recent series of updates, from February 2017, was at first called Phantom and subsequently (and reluctantly) named Fred, after a Google employee's pet fish. At the time of the update, some people claimed it targeted sites with low quality links. Others reported that ad-heavy and thin-content sites were being targeted. Google insisted there was no update, only a tweak to the core algorithm.

A reasonable person can assume that it could very well have been an improvement to Google's ability to understand user queries and/or web page documents (probably both!). A change of this order would affect sites with poor links as well as ad-heavy sites and pages with thin content. Mystery solved? Probably!

The SEO community would not understand such an update because it is hung up on identifying what kinds of sites are being "targeted" for penalties. SEOs would be confused because they seem to think that every update is about Google identifying low quality sites. Enough is enough. It's a strange habit the search industry has gotten into, and it's time it stopped trying to understand Google by identifying the losers and broadened the inquiry to include positive ranking factors.

Here’s another way to think about a site that dropped ranking position:
Sometimes there is nothing wrong with a site that dropped ranking. It's simply not the right page for answering a specific query for most users making that search query; the pages that now rank simply satisfy the user intent better.

What is Google’s Core Algorithm?
The core algorithm is not just about fighting spam. Google's core algorithm is primarily about retrieving information. What I have been saying for the past month (and so have Googlers, between the lines) is that some of these Phantom Updates are not about targeting spam. They're the opposite. I think they're about improving the algorithm's ability to find content. Google and Bing have been on a long quest to be able to find and rank content on sites that do little to no SEO. This is a priority.

Maybe the SEO community needs to change how it thinks about Google updates?
