Google’s search algorithm has historically performed well against its original challenge: finding, understanding and organizing the world’s information. However, Google has always made continuous improvement central to its mission and a constant driver of progress across its immense line of products.
Over the years Google’s algorithm has also been susceptible to weaknesses and loopholes. As online marketers have reverse engineered the SERPs and learned how to exploit those weaknesses, Google has worked to shore them up and combat those SEO strategies in a variety of ways.
For both of the aforementioned reasons, algorithm updates have consistently rolled out over the years with the goal of improving search results. In this lesson we will examine the history of Google’s algorithm updates to help provide clarity and perspective on the evolution and current state of the world’s most popular search engine.
The content of this lesson will be kept in reverse chronological order, meaning we will discuss the most recent updates first, so that it is easier for you to access information on the latest updates as they continue to roll out. With each new update we will post the latest information here to keep you well informed.
History of Google Algorithm Updates
To be clear, there are minor algorithm updates rolling out all of the time. Most of the time Google is vague about the purpose of these updates, and oftentimes it is nearly impossible to pinpoint the differences in the SERPs. The goal of this lesson is to track and explain the major updates.
Google uses different algorithms to process and score different types of information. Different algorithms are then bundled together for use with different types of search results. For instance, there are specific algorithms triggered for queries whose user intent relates to news, shopping, events, food, sports and local results, among many others. Each type of search uses a unique bundle of different algorithms. Some algorithms are shared commonly across all of the search bundles, and other algorithms are only used for specific types of searches.
Penguin 3.0 Update October 17, 2014
After just over a year since the Penguin 2.0 update, Google released the Penguin 3.0 update in late October. The update, which rolled out over nearly a month, signified a refresh of the previous link data as well as some additional factors relating to tiered linking.
Panda 4.1 Update September 26, 2014
It was announced by a Google UK Webmaster Trends Analyst that a new Panda update had been rolled out. After collecting feedback from users and webmasters that identified additional signals which indicate low quality content, Google included these new data points in its organic search algorithm. The result seems to be that a greater diversity of high-quality small and medium-sized sites are ranking higher. It is clear that thin content is again the target, and that content which thoroughly explores every aspect of a topic, sends Google clear data signals (read: Schema.org structured data markup), and improves crawl efficiency will continue to pay dividends.
Pigeon Local Update July 25, 2014
Pigeon is an update of Google’s local search algorithm that gives it a more robust ability to recognize inconsistencies in NAP information, knowledge graph data, accuracy of location and distance, and more. This update has resulted in a major shift in Local Search rankings. It seems that the 7-listing local packs have been dropped, and local packs for queries such as “SEO” and “web design” are back after being removed in 2009. This update also fixed a problem with Yelp profiles not showing up in the rankings.
Google Drops Authorship Images June 2014
Google has decided to stop showing the Google+ profile images associated with search results that include rel="author" mark-up. Also, prior to this change the author’s name rich snippet used to trigger a search result listing articles exclusively associated with that author. After this update the author name rich snippet is simply a link to the author's Google+ profile page.
Google Spam Algorithms / Payday Loan Update 2.0 & 3.0 May 21 - June 12, 2014
It seems that this version of the Payday Loan Update targeted spammy websites in competitive niches. It has been speculated that this update strengthened reliance on brand trust signals to offset the impact of negative SEO. It was also announced that this update was intended to take action against potential negative SEO factors. This algorithm is supposed to be triggered only for a certain group of queries that Google has classified as “spammy,” such as payday loans, casinos, viagra, prescription drugs, etc.
Panda Update 4.0 May 23, 2014
This latest Panda update coincided nicely with the Payday Loan update, which makes sense: Panda identifies low quality, thin content and the Payday Loan update identifies link schemes. It seems that Google has gotten very good at finding tiered link strategies that include link layers with massive quantities of optimized anchor text links from low quality pages, used to build up the link equity of a “topically relevant” page that links back to the page that is intended to rank. It seems that the combination of optimized anchor text, low quality thin content and layered link funneling can now trigger an algorithmic penalty. Therefore, ensuring that all pages involved feature quality content and longer tail “phrase match” text links should avoid the penalty and actually make this basic strategy even more powerful than ever.
Payday Loan Update 2.0 May 19, 2014
This update to the original Payday Loan update from June 11, 2013 improved Google’s ability to identify strategies often used by niches that are notorious for spam. This update targets unique link schemes with tiered link strategies utilizing heavily optimized anchor text and relying on increasing levels of links from thin content pages at levels two and three.
No Name Update March 24, 2014
This update was reported on various SEO forums and seemed to be a Panda refresh. Sites that had addressed Panda related issues and implemented the appropriate Panda related strategies seemed to rise in the rankings, and sites with lower quality, thin content continued to drop.
Page Layout #3 February 6, 2014
Google refreshed their page layout algorithm again. The page layout algorithm was introduced in January 2012 and penalized sites with too many ads above the fold. Essentially Google believes that users are looking for content and they want to reward sites that place their content and the user experience as the top priority.
Reduction of Authorship Markup December 2013
In October 2013 at PubCon, Matt Cutts announced there could be up to a 15% reduction in the appearance of Authorship Rich Snippets in SERPs. By December a wave of complaints swept through SEO forums about losses of Authorship markup. Authorship markup still shows up, just not quite as often.
Penguin 2.1 October 2013
This was the 5th Penguin update, and while the results seemed fairly moderate, some sites were hit hard and other sites benefited. The original Penguin update was announced April 24, 2012. The main focus of Penguin was a recalibration of ranking factors associated with linking, including the addition of new metrics and penalties.
Hummingbird August / September 2013
On September 26, 2013 Google announced that for the first time since 2000 they had completely rewritten their entire ranking engine from the ground up. It is called Hummingbird because a hummingbird has one of the largest brain-to-body-mass ratios of any animal.
While the Hummingbird algorithm still utilizes most, if not all, of the same concepts that Google’s algorithm was already using just prior to the switch, it now does all of that in a much faster and more elegant way, while adding some new bells and whistles that make Google’s algorithm both immediately smarter and far more extensible to future changes.
Hummingbird incorporates “conversational search,” a feature that Google previously utilized only with knowledge graph answers. Conversational search is able to pull meaning and intent from natural language phrases instead of strings of keywords.
Knowledge Graph results are being incorporated into a wider variety of searches as Google is getting better at understanding entities and synonyms and how they relate to user intent.
In Depth Articles August 6, 2013
Google released a new type of search result called “in-depth articles”. This was the result of research that demonstrated up to 10% of users daily information needs involve learning about a broad topic. This new type of result will favor pages that feature an in-depth exploration of the subject matter, adhere to existing best practices and schema.org “article” markup, and utilize rel=next and rel=prev for pagination of the in-depth article.
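As a rough illustration, article markup of this kind might look like the following sketch (the headline, author and date values are hypothetical, and microdata is just one of the formats schema.org supports):

```html
<!-- Hypothetical schema.org "Article" markup using the microdata format -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">A Complete Guide to Widget Maintenance</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2013-08-01">August 1, 2013</time>
  <div itemprop="articleBody">
    <!-- the in-depth article content goes here -->
  </div>
</article>
```

The itemprop attributes label each piece of content so Google can identify the headline, author and publication date without guessing.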
Knowledge Graph Expansion July 19, 2013
All of a sudden, knowledge graph results began showing up in some form for more than a quarter of all searches. This is a clear sign of how integral knowledge graph is to Google’s perspective moving forward.
Panda Recovery July 18, 2013
This update seems to have followed through on a statement made by Matt Cutts in May, when he said that Panda would be lessening its impact on some grey-area sites. This update is supposed to make Panda more “finely targeted,” allowing some recovery for sites that were negatively impacted by Google.
Payday Loan Update June 11, 2013
This update was announced by Matt Cutts prior to it being pushed out. The update was very targeted to focus on “spammy queries” such as “Payday Loans”, porn terms and other spammy queries. Cutts explained that this algorithm was specifically targeting “unique link schemes” and that the update immediately rolled out worldwide.
Penguin 2.0 May 22, 2013
This update to Penguin seemed to further reward big brands and hurt spammy sites particularly in the gaming and porn industries. Of note was that among the top brands Yelp was hurt the most while Twitter was the biggest winner. What does this tell you about the impact of different business models and linking strategies? Penguin is the “Spam Fighting” algorithm and 2.0 is a slicker more powerful version.
Domain Crowding May 21, 2013
Domain crowding is where a page of the SERPs features multiple listings from the same domain. Google had previously addressed this on their page one results, so this update barely had any effect at all on page one. However, this update created significant changes to SERPs 2-10, with the instances of a domain being listed multiple times on the same SERP page dropping by 18.74%, according to Ben Milleare of HighPosition.com.
Panda #25 March 14, 2013
Matt Cutts addressed this update before rolling it out saying that it would likely be the last Panda refresh before Panda was rolled into the core algorithm. (See Panda Update for more information)
Panda #24 January 22, 2013
Google announced this update and claimed it affected 1.2% of searches. (See Panda Update for more information)
Panda #23 December 21, 2012
Google announced this Panda refresh affected 1.3% of English queries.
Knowledge Graph Expansion December 4, 2012
In this update Google added features of knowledge graph into SERPs for Spanish, French, German, Portuguese, Russian and Italian.
Panda Updates 2012
Google updated its Panda algorithm 13 times in 2012. Here are the updates in order, with the associated update number, dates and the associated percentage change in SERPs that Google claimed were affected by the update.
- November 21. 0.8% of English queries affected, 0.4% worldwide queries affected.
- November 5. 1.1% of English queries affected, 0.4% worldwide queries affected
- September 27. 2.4% of English queries affected
- September 18. 0.7% of English queries affected
- August 20. 1% of English queries affected
- July 24. 1% of English queries affected
- June 25. 1% of English queries affected
- June 9. 1% of English queries affected
- April 27. No change given.
- April 19. No change given.
- March 23. 1.6% of English queries affected
- February 27. No change given
- January 18. 1% of English queries affected
As you can see Google was constantly tweaking its content quality algorithm throughout 2012.
Page Layout #2 October 9, 2012
This update to the algorithm Google uses to penalize sites that feature “too many” ads above the fold on pages was announced and noted by Matt Cutts to only affect 0.7% of English queries. This is a reminder that Google wants to see content and usefulness as the centerpiece of pages it ranks… Question, would this algorithm penalize Google’s own SERPs? Probably… but let’s not poke the bear.
65 Changes August and September
This is a fascinating post from Google about the many updates they actually introduced over a two-month period. Remember, not every update gets a cool name. Smaller updates just get numbers. Still, if you’re geeking out on this stuff, this is a really interesting read and a glimpse into what they are looking for and how they want to display it. To read all of these changes it would be best to go directly to Google’s blog: Inside Search
Exact Match Domain (EMD) Update September 27, 2012
Google introduced a big change to the way it valued exact match domains. The target of the update was supposedly low quality sites that were only succeeding at ranking because their domain exactly matched a keyword; however, many sites reported drops in rankings at this time, which suggests the update may also have devalued exact match domains as a factor in ranking for closely related terms.
7 Result SERPs August 14, 2012
This update significantly changed organic rankings in Google, causing many search queries to display only 7 organic results on page 1 instead of 10. SEOmoz has done an incredible job of tracking these results and has found that the 7-result SERPs are always associated with a position 1 result showing additional Site Links or an Image Result block. This change seems to be algorithmic and constantly fluctuating… meaning that a given keyword phrase will show 7 results one day and the next day it may show 10 results again. What this definitely does is place increased importance on top rankings and on achieving enough trust and authority for your site that Google allows you to start displaying Site Links for queries that are strongly related to your brand.
DMCA Penalty August 10, 2012
Google introduced a new penalty that would apply to sites that receive a significant number of valid copyright removal notices, stating “sites with high numbers of removal notices may appear lower in our results”.
86 Changes June and July
This is another big list of all the algorithm changes that Google introduced during the months of June and July 2012. It’s a very interesting list that definitely set Google up for future changes to be introduced, here is the link: Inside Search
39 Changes May
Here is the big list of changes Google made in May 2012. Inside Search
Knowledge Graph May 16, 2012
In what is probably one of the biggest actual shifts in how Google “understands” the world’s information, Google has introduced what they call “Knowledge Graph”. Basically, this represents Google creating data associations much more like how the human mind works vs. how a directory works. From this point forward, Google is building a data set where everything is an entity. Entities have attributes and associations and a number of logical modifiers that can help Google better understand the semantics of natural language. While keywords will continue to be important as the basis for content, knowledge graph will allow Google to better determine the quality of content based on what relevant associations are brought up throughout the content. This will help Google become very good at understanding how a subject matter is being explored and what types of queries it is really relevant to.
52 Changes April
A list straight from Google of 52 changes that were introduced in April. Inside Search
The Penguin Update April 24, 2012
With the Penguin Update, Google introduced a new level of scrutiny to spammy backlinking practices, along with penalties to drive offenders down the SERPs and in some cases remove those sites from the SERPs almost completely.
It has been noted that Penguin is imperfect in that some sites get penalized while other sites that may be even worse offenders get by, however Google has stood by Penguin as having improved search results and thereby, has been a success.
When Penguin was first introduced, recovery from Penguin penalties was very difficult and unclear. Because Penguin penalties stem from backlinks that are found to be in violation of Google’s guidelines, the only way to “recover” is to have those backlinks taken down, and in many cases you may be getting links from sites that are not actively maintained. If you have no control over the links, you are at the mercy of the webmaster who is.
When so many people who had been using “black hat” or “grey hat” link building techniques became penalized, it was only a matter of time before Google found a better way to address those situations, which happened when Google introduced their backlink disavow tool and reconsideration request protocol.
At first Penguin was solely an algorithmic penalty, meaning it was applied automatically to any site found to be in violation. However, in subsequent iterations Google has drawn a line between an algorithmic penalty and the application of manual penalties. Some more borderline instances are now assigned to a Quality Rater for review, and their report and recommendation is given to another Googler who has the ability to apply a manual penalty.
If your site gets a manual penalty, then you have the ability to attempt to address the issue and file a reconsideration request. In the course “When Pandas, Penguins and Hummingbirds Attack – A guide to recovering from Google penalties”, we will take an in-depth look at the steps to Google recovery.
50 Changes March
Here is a link to the Google webmasters blog that lists all of the changes that were introduced in March 2012. Inside Search
40 Changes February
Venice Update February 27, 2012
The Venice update was all about Localized SEO. This changed the way Google understands user intent in relation to local business services. For instance, previously if you searched for “Chiropractor” you would get a cluster of local map results, but the organic results would be more general and national in nature. After the Venice update, Google would assume that your current location is important to queries that involve the types of services where being local is important. Therefore, now if you live in San Diego, CA and you search for “Chiropractor” you will still see the cluster of local map results, but you will also get more local Chiropractors ranking highly in the organic results.
17 Changes January
Google “Inside Search” blog listing 17 algorithmic changes that were introduced in January 2012. Inside Search
Page Layout (“Top Heavy”) Update January 19, 2012
Google introduced a new algorithm that devalues sites that seem to prioritize advertising over the content of the page. Specifically Google said that sites that dedicate too much ad space “above the fold” may not rank as highly moving forward. This is because Google organic results seek to deliver results based on relevance and importance of the content, not the ads that appear on the page. Therefore, users are expecting to see content, and if too many ads are obscuring the content, that is not a good user experience.
Search + Your World January 10, 2012
In a dramatic shift for organic search results, and in a move that quite obviously seeks to lure more people to Google’s own social network, Google+, Google began delivering organic search results based on your social connections to that content. For instance, if you have your brother in your Google+ circles, and you search for “motorcycle parts”, and your brother is following or has shared a “motorcycle store” on Google+, that motorcycle store may now show up in your organic search results, because Google feels you are likely to like something that someone you know has already liked. While many people feel that this could place you in a social bubble that limits your ability to truly discover new things online, Google has also included an easy way to turn off personalized search.
30 Changes December/January
Here is the Google blog listing 30 algorithmic changes that were introduced across December 2011 and January 2012. Inside Search
10 Changes December
The Google blog highlighting the changes that were introduced in December. Inside Search
Freshness Update November 3, 2011
Building upon the Caffeine web indexing system which allowed Google to crawl and index the web for fresh content on an enormous scale, the Freshness Update was introduced to help Google understand what search queries are more likely to be looking for fresh content, how fresh the content should be and deliver search results that meet the user intent.
For instance, if someone types in “March Madness” they are likely looking for information on the upcoming, current or most recent March Madness NCAA basketball tournament, and that person is not likely to be interested in search results that are several years old. Under the old algorithm, the older pages would be more likely to rank for a variety of factors… however in the case of this type of search, because freshness of the content is likely key to the user intent, the Freshness Algorithm will be triggered and help the most recent content rise to the top of the results.
The Freshness Update was estimated to impact 35% of queries, nearly triple the impact of Panda.
Secure Search – Analytics Keyword Encryption October 18, 2011
Google rolled out this update, claiming that it would protect user privacy and make search more secure by introducing SSL encryption for users that are logged in to Google. The net effect of this is that when users are logged in to Google, and they click on an organic search result, the keyword that they used to trigger that SERP is no longer reported to the webmaster of the site in Google Analytics.
So, if you are logged into Google and you type in “san diego tax attorney” and click on “thetaxlawyer.com” the webmaster of that site will know that Google sent someone to their site, but they will no longer see the keyword data. Instead that click will be counted as [not provided].
Over time Google has significantly expanded the number of keywords that fall into the [not provided] category, which demonstrates that this change had less to do with whether or not a user is logged into Google and more to do with gently weaning webmasters off of their keyword-obsessed approach to optimization.
Pagination Elements Update September 15, 2011
Google introduced rel=”next” and rel=”prev” link attributes to help more clearly identify the relationship between pages in a series. This would later become important for the “In Depth Articles” update. This update also helped improve canonicalization and consolidation of “View All” pages.
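As a sketch of how these attributes are used, the head of the middle page in a three-page series might contain the following (the example.com URLs are hypothetical):

```html
<!-- In the <head> of page 2 of a 3-page article series -->
<link rel="prev" href="http://www.example.com/article?page=1" />
<link rel="next" href="http://www.example.com/article?page=3" />
```

Together these tags tell Google that the pages form one logical sequence rather than three unrelated documents.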
Expanded Sitelinks August 16, 2011
After experimenting with this for some years, Google significantly expanded the delivery of sitelinks for many more search queries. Sitelinks are triggered when a search query strongly appears to be looking for a specific site, but the search is quite broad. Sitelinks will feature additional pages or categories within the site to help the user more quickly navigate to their desired destination.
Google also introduced the ability for webmasters to remove unwanted sitelinks from within Webmaster Tools.
Google+ June 28, 2011
With the introduction of Google+, Google finally introduced a viable social network that could be capable of competing with Facebook. Within 2 weeks Google+ reached 10M users. However, while people signed up quickly, not many of them were actually talking or sharing, and thus user engagement metrics were embarrassingly low. Google+ was deeply integrated with many other apps within the Google ecosystem; however, possibly the most important move was when Google turned their category-leading “Google Places” local business directory into “Google+ Local Pages,” thereby making every user interaction with the map listings from Google SERPs a Google+ interaction.
Also, while Google has denied that Google+ metrics such as +1’s and sharing have any impact on a page’s ability to rank well in SERPs, numerous studies have indicated a very strong correlation between Google+ activity and good rankings.
Schema.org June 2, 2011
Google, Yahoo and Microsoft announced support for structured data under a consolidated format, which is provided under the Schema.org project. Structured data gives search engines a much greater ability to identify and categorize content in a very granular way. This content can then be more deeply understood and delivered in a wide variety of new search results. Structured data would pave the way for Google’s knowledge graph.
Panda February 23, 2011
The Panda update affected up to 12% of search queries and would be one of the most frequently updated algorithms within Google over the next two years. Panda addressed a growing problem in Google wherein sites that were simply aggregating content and not adding any value, were often outranking the sites that were actually originating the content. Because of this, the originators of quality content were not receiving the high rankings and traffic that rewarded their efforts, and so content on the web was taking a noticeable decline in quality. Also, because aggregators (often called “content farms”) were proving to be a successful business model, they were growing and becoming a larger problem.
With the Panda update, Google addressed two major areas of concern. The first was to do a better job of promoting the originators of content higher in the rankings and demoting the content farms lower in the rankings. However, the second issue Panda addressed was even more important to the future of the web. Google needed to figure out how to determine whether content was high quality or low quality, and in order to do this they first needed a way to quantify high quality content algorithmically. Google’s engineers figured out that there are a number of different ways to explore content. Journalists are familiar with the need for an article to address who, what, when, where and why… however, if you look at an encyclopedia, you will find other patterns of content exploration such as definitions, synonyms, antonyms, history, organizations, events, problems, solutions, repercussions, etc. Google’s engineers were able to come up with a long list of the terms, phrases and semantic structures that identify how content is being explored. The more of these that a piece of content addresses on a particular topic, the more in-depth and higher quality that piece of content is likely to be.
Attribution Update January 28, 2011
This update was a precursor to the Panda Update and allowed Google to more clearly identify content as attributed to an author or publisher and thus more easily identify scrapers and aggregators.
Social Signals December 2010
Google confirmed that social metrics are being used as ranking signals. This makes sense because social networks offer a human curation of the web. Google is now very good at understanding both context and sentiment and therefore can understand if users are saying that a website is good or bad. Also, if people decide to share a website or if they publicly like a website, those are potentially more reliable trust signals than links (which is a system that has been gamed for a long time).
Negative Reviews December 2010
Google introduced an update that would negatively impact websites that were associated with a high percentage of negative reviews. Utilizing analysis of sentiment, along with 5 star ratings, Google is able to determine whether users are saying a site is good or bad. If the majority of your customers say you are bad, Google is going to make you feel the pain with a drop in rankings.
Instant Previews November 2010
Google introduced a new feature in the SERPs where if you hover over a result a thumbnail type preview will appear. This brings new focus and attention to landing page design.
Google Instant September 2010
This update introduced SERPs that actually change while a query is being typed. Therefore, if your query starts with the broadest term and then adds more specific long tail modifiers, you would see search results that start as a broad match to your term and then are instantly refined as you continue to type. In retrospect, this feature was more useful for competitive analysis than it probably was for most users.
Caffeine June 2010
The Caffeine update began being tested in late 2009; however, after months of testing and adjustments, the full rollout was launched in June. Caffeine represented a significant boost to Google’s ability to quickly crawl and index the web in search of fresh content. According to Google, this resulted in a 50% fresher index. Therefore, sites that were active participants in the web by providing a consistent stream of fresh content and content updates seemed to be rewarded with higher rankings.
May Day May 2010
May Day was an algorithm change that significantly impacted rankings for long-tail search terms. Prior to the May Day update, Google was only able to index 3-word phrases, and rankings for longer tail terms were the result of combining these three-word strings; therefore, the interpretations were not nearly as accurate. After May Day, Google was able to increase this to 5-word phrases. This had a significant impact on how Google was able to identify and understand phrase meaning and relevance, and it essentially reshuffled the deck on previous long-tail rankings.
Google Places April 2010
While “Places Pages” were rolled out nearly 8 months earlier, up to this point they had been a feature of Google Maps. The launch of Google Places introduced a Local Business Center with additional features and made the Places pages and map results much more integrated into SERPs from queries that are more likely to be looking for local businesses or local services.
Real-Time Search December 2009
With “Real Time Search” Google SERPs introduced significant integration to fresher content including Google News and Social Networks (Twitter feeds in particular), expanding into additional sources over time.
rel=”canonical” Tag Introduced February 2009
In order to help reduce problems associated with indexing duplicate content, and to help webmasters tell Google which version of a page should be displayed in SERPs (along with properly attributing link equity), Google introduced the rel=”canonical” tag.
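For illustration, a duplicate or parameter-laden URL can point Google to its preferred version with a single tag in the page’s head (the example.com URL is hypothetical):

```html
<!-- Tells search engines that this page is a variant of the canonical URL below -->
<link rel="canonical" href="http://www.example.com/widgets" />
```

Every duplicate version of the page carries the same tag, so link equity and rankings consolidate on the one canonical URL.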
Vince February 2009
This update seems to have favored big brands through the introduction of a new set of trust signals that tend to favor larger businesses. While the factors introduced may certainly be valid indicators of brand trust, they also seemed to have widened the naturally occurring gap between the online footprint of large and small businesses, thereby making it harder for smaller businesses to compete, or at least giving a new set of things that businesses must do in order to send all of the available trust signals.
Google Suggest August 2008
With Google Suggest, as you type your query into Google’s search box, a drop-down of suggested matches appears and instantly updates to try to match your intended query as you type. Once your intended query appears in the suggestions below the box, you can simply click on it to initiate the search. This also became a good keyword research tool, by automatically displaying longer tail versions of your base keyword.
Universal Search May 2007
With “Universal Search” Google began integrating a much wider set of result types including News, Video, Images, Local Maps and Business Listings and more.
Nothing Much 2006
In 2006 there were numerous small tweaks and updates but none caused much of a shake-up in the SERPs and none seemed to be very well understood outside of Google.
Big Daddy December 2005
The Big Daddy update was a major infrastructure update that changed the way Google handled URL canonicalization, redirects and other issues.
Jagger October 2005
The Jagger Update introduced penalties for links that violate Google guidelines, such as link farms and paid links, and devalued reciprocal links and other low-quality links. Coverage of the update was published by Search Engine Watch and Web Pro News.
Personalized Search June 2005
With the “Personalized Search” update Google began to use the history of a user’s search queries as indicators of the relevance of future queries. The thought behind this was that most users tend to begin with a broad term and then refine that query 3-5 times by adding and changing words until they find what they are looking for. With this update Google sought to help users find the relevant results faster, thus changing the results that individual users would see. While this was a small change at first, it set the stage for much deeper and more significant personalization to come.
XML Sitemaps June 2005
Google began allowing webmasters to submit XML sitemaps, suggesting to Google a specific list of URLs to crawl and index. This helped ensure that Google could find and crawl every page on your site.
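A minimal XML sitemap might look like the following sketch (the URL and date are hypothetical; tags such as lastmod and changefreq are optional hints to the crawler, not guarantees):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each page you want crawled gets its own url entry, and the finished file is typically uploaded to the site root and submitted through Webmaster Tools.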
No Follow January 2005
In an effort to combat spam, Google, along with Yahoo and Microsoft, came together to introduce the “nofollow” attribute, giving webmasters a way to mark up advertising links and links introduced through user-generated content (UGC) that could be used for link spam.
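For example, a paid or user-submitted link can be marked so that it passes no link equity (the example.com URL is hypothetical):

```html
<!-- The nofollow value tells search engines not to count this link as an endorsement -->
<a href="http://www.example.com/" rel="nofollow">Example Site</a>
```

Blog comment systems and forums commonly add this attribute automatically to every outbound link their users post.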
Brandy February 2004
The “Brandy Update” introduced a tremendous increase in Google’s index and introduced Latent Semantic Indexing which increased the importance of link anchor text as a relevance signal. The Brandy update also introduced the concept of link neighborhoods as indicators of relevance and quality.
Austin January 2004
The Austin update could have been called Florida 2.0. This update focused on penalizing on page spam tactics such as invisible text, keyword and meta tag stuffing, etc.
Florida November 2003
This was the first major update that significantly impacted the practices that websites were using to manipulate Google’s SERPs. While SEO was already picking up steam as an important and thriving industry, this update caused an industry boom as businesses who had enjoyed high rankings and lots of traffic suddenly were penalized and in desperate need of recovery.
Fritz July 2003
Prior to the Fritz update, the Google search results were refreshed on a monthly basis. Fritz introduced an incremental approach where the index changed daily.
Cassandra April 2003
This was the first update that cracked down on link quality issues such as creating link farms from lots of domains that you also own.
That covers all of the important Google updates
The above list may skip some of the smaller updates that were either less understood or made no significant verifiable impact to the search rankings or strategies used in SEO. However, the above list does document all of the major updates that an SEO professional or serious online business owner should understand.