In the world of search engine optimization (SEO), few topics have generated as much collective attention as the emergence of Google algorithm updates. These changes to Google’s search ranking algorithm, varying in size and scope, have the capacity to fundamentally change the way the algorithm works—and they have, periodically, over the years.
Whenever a new update is released, SEO professionals go crazy, excited to dig deep and learn what changed. And between updates, we tend to wait and speculate about what changes could be coming down the pipeline in the future.
If you’re new to the SEO world, this can all seem very intimidating. Google has been releasing updates for the better part of two decades, which is a lot to catch up on—and there are no signs of its momentum slowing down in the near future. Google is a company committed to ongoing change and development, and its algorithm is a reflection of that; as search marketers, we need to be willing to change and develop alongside it.
To help educate newcomers to the SEO world, provide a refresher for those of us in the thick of it, and lay a foundation with which we can predict the future of Google developments, I’ve put together this comprehensive history of Google updates for your reading or perusing pleasure.
First, I want to cover some basics about Google updates. Because updates attract so much attention from SEO professionals, as well as webmasters, marketers, and entrepreneurs in general, a few misconceptions have developed over time.
You probably understand the idea behind a Google update well from a conceptual perspective. You’re definitely familiar with Google search in general; its primary function is to give users a list of the best, most relevant results for their search queries—and, unless you’re a hardcore Bing user, you’d probably agree it’s doing a pretty good job.
But it wasn’t always as impressive of a system as it is today. Search results are generally calculated based on two broad categories of factors: relevance and authority.
The relevance of a result is determined by evaluating how well a site’s content matches a user’s intention; back in the day, this relied on a strict one-to-one keyword matching system that hunted for web pages using a specific keyword term more often than competing pages did. The authority of a site is determined by PageRank, a system that looks at a site’s backlink profile to determine how it relates to other websites and authorities.
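To make the PageRank idea concrete, here’s a toy power-iteration sketch in Python. The three-page link graph and the 0.85 damping factor are illustrative assumptions, not Google’s actual data or parameters, and the real system is vastly more sophisticated.

```python
# Toy PageRank via power iteration over a tiny, made-up link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:  # each outlink receives an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
```

Pages that attract more inbound links (like "home" above) accumulate more rank, which is the core intuition behind authority scoring.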
These two systems are updated frequently, as Google discovers new, more sophisticated, and less exploitable ways to evaluate relevance and authority. It also finds new ways of presenting information in its search engine results pages (SERPs), and adds new features to make searchers’ lives easier.
When Google decides to create an update, depending on the size, it may be pushed directly to the main algorithm or be published as a kind of test branch, to be evaluated for effectiveness and functionality.
Either way, the update may be improved upon with subsequent iterations over time. Sometimes, Google announces these changes in advance, letting webmasters know what they can expect from the update, but most of the time, they roll out silently, and we only know of their existence because of the changes we see and measure in SERPs.
Updates matter because they affect how search works in general, including how your site is displayed and how users interact with it.
The common perception is that updates are bad news for SEO. They’re seen as bad for the SEO business, throwing a wrench into optimizers’ best-laid plans to earn higher ranks for their respective sites and causing mass panic when they roll out and crush everyone’s hard-earned rankings.
(Image Source: SearchEngineLand)
There’s some truth to this; people do tend to panic when a new Google update causes a significant changeup in how rankings are displayed. However, updates aren’t simply a way for Google to step in and harsh everyone’s SEO buzz; they’re complicated pieces of algorithmic machinery designed to do a better job of providing quality information to Google’s users.
Accordingly, they do have a massive effect on SEO, but it isn’t all bad.
One of the more frustrating elements of Google updates is their stunning level of ambiguity, which manifests in a number of different forms.
If Google releases these updates to improve the web, why does the company intentionally withhold details like these from webmasters? The answer is pretty simple. The biggest reason for Google releasing updates in the first place is to improve overall user experience.
Imagine Google releases the exact details for how it ranks sites; webmasters would strive to exploit this information for the sole purpose of gaining rank, compromising that user experience. Cloaking this information in ambiguity is a defensive measure to prevent this level of manipulation from taking place.
Because Google updates are so ambiguous, one of the best tools we have as search optimizers is our ability to detect patterns and commonalities in these updates. Not only does this give us a better understanding of Google’s motivation and a clearer knowledge of the full scope of an update’s effects, but it also allows us to make more meaningful predictions about what updates may be coming in the future.
I’ve already made reference to a handful of different named Google updates, so I wanted to take a minute to talk about these update naming conventions. Some updates are formally named by Google; for example, the major algorithms Panda and Penguin were given formal, official names so they could be easily understood and discussed by members of the community. However, since Google is quiet about most of its rollouts, only a handful of updates get official names.
Instead, it’s usually the collective power of the community that lands on a name. In the early days of Google updates, the WebmasterWorld community took charge of naming updates that were otherwise rolled out informally and silently. Sometimes these names were based on simple descriptors, such as the city where the update was announced; other times, they took on human names, much like hurricanes.
Today, most update names emerge from the SEO community as leaders attempt to either describe what’s going on (such as with the suggestively titled Mobilegeddon update) or adhere to Google’s habit of arbitrarily naming updates after animals that start with “P” (such as with Pigeon, Panda, and Penguin). Sequential updates are typically kept in numerical order, as you might imagine.
As we enter the main section of this article, where I detail each of Google’s significant updates, I want to warn you that these updates are grouped contextually. For the most part, they’ll be kept in chronological order, but there may be deviations based on each update’s respective category.
Now it’s time to dig into the actual updates that have shaped Google into the powerhouse universal search algorithm it is today.
Even though Google officially launched in 1998, it wasn’t until 2003 that it started making significant updates to its search process (or at least, held enough attention in the online marketing community for people to care about them). Before 2003, there were a handful of changes, both to the ranking process and to the visual layout of the search engine, but things really kicked off with the Boston update in February of 2003.
The 2003 Google dance era was the first major block of updates, but starting with the Florida update later that year, updates began to take on new roles and new significances for the online marketing community.
Aside from Google’s content evaluation and general improvement updates, this era was also privy to the development of several new features that search optimizers could utilize for greater content visibility.
(Image Source: Sitemaps.org)
It was around this time that Google started stepping up its game with even more features and functions for its users, going above and beyond the call of duty for search engines to give users the richest possible experience. This series of updates was focused on bringing new concepts to users, rather than improving existing infrastructure:
Throughout this era, Google was also coming up with even more “tweaking” updates, all of which committed some minor changes to its search algorithm or improved user experience in some way. These are the most important ones to note:
So far, we’ve seen some significant updates, revisions, changes, and additions to Google Search, but now we’re getting to the heavy hitters. Two of the most impactful, talked-about, and still-relevant updates to the Google ranking algorithm happened practically back-to-back in 2011 and 2012, and both of them forever changed how Google evaluates authority.
The Panda update rolled out on February 23, 2011, and fundamentally changed the way search optimizers operated online. This update was announced formally and explained by Google; the company stated that the update impacted roughly 12 percent of all search queries, which is a huge number compared to previous algorithm changes.
Digging into more specific details, the update targeted content farms, which previously existed to churn out meaningless “fluff” content to increase ranks. It also penalized sites with content that seemed to be unnaturally written, or stuffed with keywords, and instead rewarded sites that it evaluated to have detailed, well-written, valuable content.
Over the course of its months-long rollout, thousands of sites saw their rankings—and their traffic—plummet, and the backlash was intense.
(Image Source: Neil Patel)
Yet another follow-up, briefly known as Panda 3.0 until users realized it paled in significance next to 1.0 and 2.0, became known as 2.1; its changes were hardly noticeable. A round of further Panda updates, minor in scale but significant in nature, followed: Panda 2.2 in June 2011, 2.3 in July, and so on each month until October.
For others, the true 3.0 happened in November 2011, when an update on November 18th shook up rankings further, seemingly based on the quality of sites’ content.
Others would merely be data refreshes, “rebooting” Panda’s evaluation criteria. In any case, the number of Panda updates grew to be about 25 when Matt Cutts announced that in the future, monthly Panda updates would roll out over the course of 10 days or so, leading to a more seamless, rotating update experience for users and webmasters. This setup became known informally as the “Panda dance,” loosely referencing the Google dance of 2003.
Panda 4.0, released in May 2014, rolled out gradually, much like a Panda dance update, but was massive in scale, affecting about 7.5 percent of all queries. Evidence suggests it was both a fundamental algorithmic change and a data refresh, which lent power to the overall release.
Panda 4.1, a follow-up in September 2014, affected an estimated 3 to 5 percent of all search queries, but the typical slow rollout made it hard to tell. Panda 4.2 came in July 2015, but didn’t seem to have much of an impact.
Presumably, it’s still being updated on a rolling, gradual basis, but it’s hard to take a pulse of this because of the constant, slow updates and the fact that each successive update seems to be less significant.
Between and surrounding the Panda and Penguin updates were a number of other small updates, including the introduction of Schema.org microformatting, now a major institution in the search world.
(Image Source: Schema.org)
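To give a flavor of what that markup looks like, here’s a minimal JSON-LD snippet of the kind Schema.org defines, built in Python; the business details are invented for illustration.

```python
import json

# A minimal Schema.org-style structured-data object; every detail
# here (name, address, phone) is made up for the example.
structured_data = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee Shop",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
    },
    "telephone": "+1-555-0100",
}

# Embedded in a page inside a <script type="application/ld+json">
# tag, markup like this helps search engines parse entity details.
snippet = json.dumps(structured_data, indent=2)
```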
Unfortunately, it wasn’t able to sustain this momentum of growth, and today Google+ has no more impact on search rankings than any other social media platform.
Subsequently, Penguin was formally released on April 24, 2012. The update targeted a number of different activities that could be qualified as black-hat practices. For example, it cracked down on keyword stuffing, the practice of artificially inserting unnatural keywords into links and content for the sole purpose of increasing rankings for those terms.
It also seriously improved Google’s ability to detect “good” and “natural” backlinks versus unnatural or spammy ones. As a result, more than 3 percent of all search queries were affected, and webmasters went into the same outrage they did when Panda disrupted their ranking efforts.
Like Panda, it set a new tone for search optimization—one focused on quality over manipulation—and helped solidify Google as the leader in search it remains today.
A third Penguin update came in October 2012, though it didn’t seem to affect many queries (less than 1 percent).
Like Panda, Penguin still holds an esteemed reputation as one of the most significant branches of the Google search algorithm, and is responsible for the foundation of many SEO strategies.
The Knowledge Graph is a distinct section of indexed information within Google. Rather than indexing various sites and evaluating them for relevance to a given query, the Knowledge Graph exists to give users direct, succinct answers to their common questions. You’ve likely seen a box like this pop up when you Google something basic:
(Image Source: Google)
The Knowledge Graph was rolled out as part of an update back in May 2012. When it was released, it only affected a small percentage of queries, as the Knowledge Graph’s scope and breadth were both limited. However, Google has since greatly prioritized the value and prominence of the Knowledge Graph, and thanks to more websites utilizing structured markup, it has access to more information than ever before.
The next official Knowledge Graph expansion came in December 2012, when Google expanded the types of queries that could be answered with the KG and ported it to different major languages around the world. After that, an update in July 2013 radically increased the reach and prominence of the Knowledge Graph, raising the number of queries it showed up for by more than 50 percent. After this update, more than one-fourth of all searches featured some kind of Knowledge Graph display.
Since then, the prominence of rich answers and KG entries has been slowly and gradually increasing, likely due to the gradual nature of incoming knowledge and increasing capabilities of the algorithm. The Hummingbird update and its subsequent partner RankBrain (both of which I’ll explain in coming sections) have also amplified the reach and power of the Knowledge Graph with their semantic analysis capabilities.
The exact-match domains update was a seemingly small change with an outsized effect on certain queries. Exact-match domains are domains whose wording matches a user’s query exactly. In some cases this is highly valuable, since the user is likely searching for that exact brand; in other cases, it can be used as a deceptive way to poach organic traffic.
Google accordingly reevaluated the way it handled cases of exact-match domains, affecting nearly 1 percent of queries and reducing the presence of exact-match domains by more than 10 percent in search engine results pages.
After discussing the powerhouses of Panda and Penguin at length, all other search engine updates seem smaller by comparison. However, there have been some significant additions and modifications in recent years, some of which have opened doors to entirely new search visibility opportunities.
The Payday Loan update came around June 2013, and its purpose was to penalize sites with dubious intentions or spammy natures. As the name suggests, the primary target for these were “payday loan” sites and other similar sites that deliberately attempt to take advantage of consumers.
Porn sites were also targeted. The main scouting mechanism for this was evaluating certain types of link schemes in an effort to reduce spam overall. For most legitimate business owners, this update didn’t make much of a difference.
The Payday Loan update also saw future iterations—2.0 and 3.0—in 2014, which targeted the same types of sites.
Hummingbird was, and continues to be, a beautiful and particularly interesting update. Even though many users never noticed it, it fundamentally changed the way Google search works—and continues to influence it to this day. Released sometime around August 2013 and formally acknowledged later in September, the Hummingbird update was a core algorithm change that improved Google’s ability to evaluate queries, working on the “relevance” side of the equation rather than the “authority” side.
Specifically, the Hummingbird update changed the way Google looks at keywords in user queries. Rather than dissecting queries based on which individual keywords and phrases they contain, Hummingbird allows Google to semantically decipher what a user intends to search for, then find results that serve that intention throughout the web. This may seem like a small difference—and for most queries, these types of results are similar—but now that it exists, heavily keyword-focused strategies have been weakened, and the power of naturally written content with contextual relevance signals has increased even further.
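A toy sketch can illustrate the difference in spirit. The hand-made synonym table below stands in for the far richer language models real semantic search uses; this is an analogy, not how Hummingbird actually works.

```python
# Contrast strict keyword matching with a crude synonym-aware match.
# The synonym table is invented for illustration only.

SYNONYMS = {
    "fix": {"repair", "mend"},
    "bicycle": {"bike", "cycle"},
}

def normalize(word):
    """Map a word to a canonical form using the toy synonym table."""
    for canonical, alts in SYNONYMS.items():
        if word == canonical or word in alts:
            return canonical
    return word

def keyword_match(query, doc):
    """Strict one-to-one term overlap, as in early relevance scoring."""
    return set(query.split()) & set(doc.split())

def semantic_match(query, doc):
    """Overlap after normalizing synonyms, approximating intent matching."""
    q = {normalize(w) for w in query.split()}
    d = {normalize(w) for w in doc.split()}
    return q & d

query = "repair a bike"
doc = "how to fix a bicycle"
```

Strict matching connects these two phrases only through the stopword "a"; the synonym-aware version also links repair to fix and bike to bicycle, recognizing they express the same intent.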
Hummingbird has also been important in setting the stage for the future of Google developments, as we will soon see. For starters, it has fueled the rise in voice searches performed by users; voice searches tend to be more semantically complex than typed queries, and demand a greater semantic understanding for better results. Google has also modified Hummingbird with an important and game-changing update more recently with RankBrain.
One of the biggest motivations for getting a Google+ account and using it to develop content was to take advantage of the Authorship concept that developed alongside it. Authorship was a way for you to include your name and face (in other words, your personal brand) in Google’s search results, alongside all the material you’d authored.
For a time, it seemed like a huge SEO advantage; articles written with Authorship through Google+, no matter what site they were intended for, would display more prominently in search results and see higher click-through rates, and clearly Google was investing heavily in this feature.
(Image Source: Dan Stasiewski)
But starting in December 2013, Google started downplaying the feature. Authorship displays went down by more than 15 percent, with no explained motivation for the pull-back.
In June 2014, this update was followed up with an even more massive change—headshots and photos were removed entirely from search results based on Authorship. Then, on August 28, 2014, Authorship was completely removed as a concept from Google search.
In many ways, this was the death knell for Google+; despite an early boost, the platform was seeing declining numbers and stagnant engagement. Though search optimizers loved it, it failed to catch on commercially in a significant way.
Today, the platform still exists, but in a much different form, and it will never offer the same level of SEO benefits that it once promised.
Up until Pigeon rolled out in July 2014, Google hadn’t changed much about local search. Local results were generated by an algorithm separate from the national one, and though Google had toyed with various integrations and features for local businesses, such as Maps and Google My Business, the fundamentals remained more or less the same for years.
Pigeon introduced a number of different changes to the local algorithm. For starters, it brought local and national search closer together; after Pigeon, national authority signals, like those coming from high-authority backlinks, became more important to rank in local results.
The layout of local results changed (though they previously and have since gone through many different layout changes), and the way Google handled location cues (mostly from mobile devices and other GPS-enabled signals) improved drastically. Perhaps most importantly, Pigeon increased the power of third-party review sites like Yelp, boosting the authority of local businesses with a large number of positive reviews and even raising the rank of third-party review site pages.
It was a rare move for Google, as it was seen as a partial response to complaints from Yelp and other providers that their local review pages weren’t getting enough visibility in search engines. In any case, Pigeon was a massive win for local business owners.
I already discussed Panda and Penguin at length in their respective sections, but they’re worth calling attention to again. Panda and Penguin weren’t one-time updates that can be forgotten about; they’re ongoing features of Google’s core algorithm, so it’s important to keep them in the back of your mind.
Google is constantly refining how it evaluates and handles both content and links, and these elements remain two of the most important features of any SEO campaign. Though the updates are happening so gradually they’re hard to measure, the search landscape is changing.
Google has also introduced some significant layout and functionality changes; these haven’t necessarily changed the way Google’s search algorithm functions, but they have changed the average user’s experience with the platform:
Mobilegeddon is perhaps the most entertainingly named update on this list. In February 2015, Google announced that on April 21 of that same year, it would change its ranking algorithm to favor sites considered “mobile friendly” and penalize those that weren’t. Although Google had slowly been favoring sites with better mobile user experience, this was taken to be the formal, structured update cementing Google’s desire for better mobile sites in search results.
The SEO community went crazy over this, exaggerating the reach and effects of the update as apocalyptic with their unofficial nickname (Mobilegeddon). In reality, most sites by this point were already mobile-friendly, and those that weren’t had a wealth of tools at their disposal to improve, such as Google’s helpful and still-functional tool for testing and analyzing the mobile friendliness of your site.
Overall, “mobile-friendly” here mostly means that your site content is readable without zooming, all your content loads appropriately on all manner of devices, and all your buttons, links, and features are easily accessible with fingertips instead of a mouse. When the update rolled out, it didn’t seem very impressive; only a small percentage of queries were affected, but subsequent refreshes have given it a bigger impact.
Google also released a Mobile 2.0 update in May 2016, but since this basically reinforced concepts already introduced with Mobile 1.0, the effects of the update were barely noticeable to most business owners and search optimizers.
Now, we enter the modern era of Google updates. I’ve already mentioned a handful of updates that have come out in the past couple years or so, but now I want to take a look at the game-changers that will hold the biggest impact on how the search engine is likely to develop in the future.
RankBrain made headlines when it first emerged—or rather, when it was formally announced by Google. Google announced the update in October 2015, but by that point, the process had already been rolling for months. What makes RankBrain special is the fact that it doesn’t necessarily deal with authority or relevance evaluations; instead, it’s an enhancement to the Hummingbird update, so it deals with better understanding the semantics of user queries.
But RankBrain is even more interesting than that; rather than being a change to Google’s algorithm or even an addition, it’s a machine learning system that will learn to update itself over time. Hummingbird works by trying to analyze user intent of complex phrases and queries, but not all of these are straightforward for automatic algorithms to analyze; for example, the phrase “who is the current president of the United States?” and the phrase “who’s the guy who’s running the country right now?” are basically asking the same thing, but the latter is much more vague.
RankBrain is designed to learn the complexities of language phrasing, eventually doing a better job at digging into what users are actually searching for. It’s highly significant because it’s the first time Google has used a machine learning update, and it could be an indication of where the company wants to take its search algorithm in the future.
Google has also released a number of what it calls “quality updates,” which make alterations to what factors and signals indicate what’s determined to be “quality content.” The quality update of May 2015 was one of these, but other updates have presumably rolled out, making less of an impact and going largely unnoticed by search optimizers.
However, Google has recently opened up more about what actually qualifies as “quality content,” publishing and regularly revising a document called the search quality evaluator guidelines. If you haven’t read it yet, I highly recommend you check it out. A lot of it is common-sense stuff, or is familiar if you know the basics of SEO, but there are some hidden gems that are worth scoping out for yourself; I highlighted 10 of them in this infographic.
With regard to spam, Google has released two new updates in the past two years to combat black-hat tactics and practices that negatively affect users’ experiences. In January 2017, Google released an update it had announced five months earlier, called the “intrusive interstitial” penalty. Basically, the update penalizes any site that uses aggressive interstitials or pop-ups that harm the user’s experience.
In addition, Google launched a soft update in March 2017 called “Fred,” though it’s still officially unconfirmed. The specifics are cloudy, but Fred was designed to hit sites with low-value content, or those practicing black-hat link building tactics, penalizing them.
In September 2016, just before Penguin 4.0, the search community noticed significant ranking volatility for local searches. Though unconfirmed by Google, the informally named “Possum” update attracted significant attention. It appears the update increased the importance of location for the actual searcher, and updated local search entries to include establishments that were just outside city limits. Google also seems to have experimented with location filtering, separating individual locations that point to the same site.
Google has also been experimenting with its presentation of “featured snippets,” the selected, standalone search entries that pop up to answer your questions concisely, above organic search results, but below ads. Though the past few years have seen a steady increase in the number of featured snippets, there was a significant drop in October 2017; conversely, Knowledge Graph panels increased, suggesting Google may be rebalancing its use of featured snippets and the Knowledge Graph as a means of presenting information to searchers.
Later, in November 2017, Google increased the length of search snippets for most results, upping the character limit to nearly twice its previous cap of 155.
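For illustration, trimming a description to a character budget on a word boundary, the way snippet text gets cut off in results, might look like this hypothetical helper; the limits used are the rough caps discussed above, and the logic is a simplification of whatever Google actually does.

```python
# Hypothetical snippet truncation: cut at a word boundary within
# the character budget and append an ellipsis marker.

def truncate_snippet(text, limit=155):
    if len(text) <= limit:
        return text
    # Keep whole words only, then tidy trailing punctuation.
    cut = text[:limit].rsplit(" ", 1)[0]
    return cut.rstrip(" ,;") + " ..."

desc = ("word " * 60).strip()  # 299 characters of filler text
short = truncate_snippet(desc, limit=155)   # trimmed under the old cap
longer = truncate_snippet(desc, limit=300)  # fits the expanded cap whole
```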
Google also seems to be favoring micro-updates, as opposed to the behemoths that made search optimizers fear for their jobs and domains for years. Instead of pushing a massive algorithm change over the course of a weekend, Google is making much smaller tweaks and feeding them out over the course of weeks, or even months. For example, as of mid-April 2017, about half of all page-one organic search results were HTTPS; by the end of 2017, that figure had risen to 75 percent, heavily implying a slow-rollout update favoring secure sites.
In fact, it’s hard to tell when Google is updating its algorithm at all anymore, unless it announces the change directly (which is still rare, thanks to Google wanting to cut back on spam). These updates are going unnamed, unnoticed, and for the most part, un-worried about. November 2016, December 2016, February 2017, May 2017, September 2017, and November 2017 all saw significant ranking volatility associated with small updates.
Part of the reason why Google is cutting back on the massive updates and favoring much smaller ones is because its core algorithm is already doing such a good job on its own. It’s a high-quality foundation for the search engine, and it’s clearly doing a great job of giving users exactly what they’re looking for—just think of all the times you use Google search on a daily basis.
Especially now that Panda and Penguin are rolled into the core algorithm, Google’s not in a place to be making major infrastructural changes anymore, unless it someday decides to fundamentally change the way we think about search—and honestly, I wouldn’t put it past them.
It is interesting to see how Google has increased its attention to app indexing and displays. Thanks to the greatly increased popularity and use of mobile devices, apps have become far more relevant, and Google has responded.
Its first step was allowing for the indexation of apps, much like the indexation of websites, to allow apps to show up for relevant user queries. After that, it rolled out a process called app deep linking, which allows developers to link to specific pages of content within apps for users who already have the apps installed.
For example, if you have a travel app, a Google search on your mobile device could theoretically link you to a specific destination’s page within that app.
But Google’s pushing the envelope even further with a process now called app streaming. Now, certain apps are being stored on Google’s servers, so you can access app-based content in a Google search without even having the app installed on your mobile device. This could be a portent of the future of Google’s algorithm development.
With the knowledge of the modern era, and the pattern of behavior we’ve seen from Google in the past, we can look to the future and try to predict how Google will develop itself.
These predictions are speculative and ambiguous, but unfortunately that’s the nature of the beast. Historically, it’s been difficult to know what to prepare for, because not even Google engineers know exactly what technologies users will take to and which ones they won’t.
Everything in the search world—from algorithm updates to SEO and user experience design—demands an ongoing process of feedback and adaptation. You have to pay attention, remain flexible, and work actively if you want to stay ahead of the curve.
Google has come a long way in the past 19 years of its existence, and it likely has a long road of development ahead of it. If you attune yourself to its changes, you can take advantage of them, and ride those waves of visibility to benefit your own brand.