[Updated] A Full Overview of Google Algorithm Updates
Google Search Algorithm Updates always have been, are, and, we suppose, always will be something that strikes fear into the hearts of webmasters who care about their site rankings. Google plays hardball, webmasters play whack-a-mole.
Why is this so when these updates presumably aim to provide the best user experience and results to searchers and to reward websites for high quality content?
Everybody engaged in the SEO field knows that Google applies more than 200 ranking factors before letting your website reach the highest positions on the SERP. However, before trying to improve each of those 200 ranking factors, you are better off first learning which of them impact your website negatively or positively, and only then starting your work.
It is difficult to account for even half of Google’s ranking factors when you optimize a website, and some webmasters don’t even try. Instead they go for black-hat SEO in order to get quick results. Their quick high positions eventually get lost and end with a Google Penalty, which, in a word, means trouble. Google is too smart, so don’t try to cheat.
[12.19.2019] Google never stops. This time we will talk with you about:
- Google September 2019 Core Update
- Google BERT Update
- Google November 2019 Local Search Update
- Minor Google Updates
Table of Contents
- Google May Day Update
- Google Panda Update
- Google Exact Match Update
- Google Penguin Update
- Google Hummingbird Update
- Google Pigeon Update
- Google Mobile-Friendly Update
- Google Possum Update
- Google Fred Update
- Google Medic Update
What Is a Google Penalty and Why Is It Dangerous for Your Website?
A Google Penalty is a negative impact on your rankings that hits you after Google conducts a manual review or rolls out an algorithm update. It means that you did something on your website that was against Google’s webmaster guidelines, or that your website was mentioned in a spam report. If your rankings and traffic suddenly drop over a matter of several days, especially after a new algorithm update, this usually means that a Google Penalty was applied according to the new rules.
You can check whether your website was punished in Google Search Console. However, this shows you only manual review results. A penalty as a result of the latest algorithm update can be confirmed only via website ranking and traffic analysis.
There are two types of Google Penalty: partial matches and sitewide matches. The first type of penalty is applied when several artificial or low-quality links point to specific pages of your website and those pages lose valuable organic traffic. The second type takes place when the backlink profile of your entire website needs significant and immediate auditing and cleaning.
A rank decrease may refer to all the pages of your website, a specific keyword, or a specific page.
What Stands Behind Google Algorithm Updates
A Google Algorithm Update covers a bunch of changes for Google’s ranking algorithm, including improvements in the existing algorithm or a set of new rules on how to analyze the quality of websites and then rank them in the SERPs.
Since Google launched in 1998, a lot of algorithm updates have happened. Of course, more often people pay attention only to the major ones, e.g. Panda, Penguin, Hummingbird, and Pigeon. However, there are actually more of them which have had a significant impact on the current SERPs view and which have been a total pain in the neck for many webmasters:
Google May Day Update
Release date: April 28 – May 3, 2010
This update aimed at rewarding websites with high quality long-tail content. Its peculiarity is that Google now paid more attention to the quality of the presented content: it could get you to higher positions on the SERPs for specific long-tail keywords despite a poor backlink profile. Its main goal was the relevancy of the content presented.
The greatest losses came to websites in niches where goods were sold generically. Focusing mostly on short-tail keywords, these sites went through significant traffic and ranking drops for long-tail keywords.
Remedy: fill your website with enough long tail keywords. This may come in the form of articles, extended reviews, long descriptions, a characteristics overview, guidance, and so on.
WebCEO’s Keywords Research Tool will help you to work on keywords and create unique and winning combinations for your website.
Google Panda Update
Release date: February 23, 2011
This update aimed at rewarding websites with high quality content and punishing low quality websites. Panda looks for everything you did in order to get a higher position on a SERP without quality in your pocket. This could have been:
- Low quality content, whether human or machine generated, will lead you nowhere. Google has said multiple times that quality is everything, and it is better to listen to these words if you want to be first and keep your positions for a long time. Make sure the text reads fluently.
- Unnatural language, in other words, this could be a keyword “overdose”. If you put too many keywords in your text, this will be noticed almost immediately, simply because neither we nor Google bots are used to reading content with a great repetition of specific words. Google also finds this suspicious and doesn’t delay penalties.
- Thin content or lack of content. This is when you have a really low amount of material on one of your website’s pages. Google likes it when you spend more time and present in-depth content to searchers. Writing a short paragraph with little sense, but an overdose of keywords, is not a good decision.
- Content farming, i.e. a method of creating a significant amount of low quality content, for example a bunch of very short articles written for popular search queries, with the aim of getting greater traffic and revenue. Google doesn’t like it when content is created with the aim of ad monetization. User experience and quality should always be first.
- Lack of authority/trustworthiness plays against your rankings. You can see this by analyzing your website’s performance: how often your content is updated; domain age, type, and authority; a poor or bad backlink profile; visitor behavior on your website; etc.
- Inappropriate ads. If there are a lot of advertisements on your website and those are not relevant to your content or disturbing for a website visitor, you can eventually expect a penalty from Panda. The situation may become especially risky if the amount of advertisements overtakes the amount of content (ad-to-content ratio).
Remedy: content improvement.
1. Work on the content of your website. Rewrite your material in order to make it of high quality, put keywords only in places where their presence is necessary and use only those keywords which are relevant to your niche and specifically to the page you are trying to improve. It will not be a bad decision if you remove all the pages with low quality content.
2. Forget about advertisements for a second. Go to websites that are trustworthy and popular and learn their advertisement profile: how many ads they have, whether they are disturbing, and their relevancy to a website’s niche. Then come back to your place and think about the same points regarding your website. Optimize things properly and create an ideal place for visitors.
3. No black-hat SEO. Honest and “clean”, well done content will bring you success and traffic, therefore heightening your authority. Searchers will come to your place and stay there for a long time. Google needs nothing more.
Google Exact Match Update
Release date: September, 2012
The Google Exact Match Domain (EMD) Update focused on websites with domain names that exactly repeated a searcher’s query and got to the top pretty much based on that alone.
Remedy: unfortunately, no advice will help here, because you either have such a domain name or you don’t. It’s no longer any guarantee that you will score for a keyword just because your domain name is an exact match.
Google Penguin Update
Release date: April 24, 2012
Penguin was released to punish websites that try to improve their positions by getting links from low quality websites and by stuffing pages and anchor links with an enormous amount of keywords. Such schemes are easily recognizable even for a user. Google only needs to check websites from which those links came and then present you a penalty.
Nobody likes low quality websites: a user will not find any good material there and can’t trust them, so it is better not to visit them at all. Accordingly, being linked to from such a website reflects on you.
Remedy: forget about black-hat SEO.
1. Don’t try to build fast, easy, or paid spammy backlinks, because those will more often bring you harm than success. Take your time and try to get links from websites which have already reached some popularity and high domain authority. WebCEO’s Backlink Quality Check Tool will help you to learn your backlink profile from A to Z, including link texts and linking domains, and the tool will show you toxic pages that link to your website.
Use white hat link building techniques like high quality guest blogging, link round-ups, the skyscraper technique, etc, which will be a win for both sides. By taking these steps, webmasters will gain a decent backlink profile which will be appreciated by Google, and build up domain authority for their website.
WebCEO’s Content Submission Tool will help you to find the best places where your content can be your best advertisement.
2. Avoid keyword stuffing in any form and place. Keywords are not flowers which you can put anywhere and enjoy them. Their main mission is to help you to find a reader, but not to attract Google’s attention. Find the best variants, build some relevant long tail or short tail keywords which are successful for your niche, find some synonymic alternatives and put them into your text so users and Google will not be annoyed. Google recognizes synonyms and can reward you even more for using them.
Google Hummingbird Update
Release date: August 20, 2013
Because semantic search is so complicated, the Google Hummingbird update went further than any previous update. It tried to understand a searcher’s way of thinking at the moment they wanted to find something on Google. This approach extended the borders of the information that should be presented. For instance, Google would not just show the definition of “pizza” on its local SERPs, but information connected to that word: recipes, history, the nearest pizza places, the most popular among them, recommendations in the form of “people also ask”, etc. Hummingbird tries to get you the most accurate results for your query by analyzing what you probably need the information for.
Hummingbird works with a “knowledge graph” which was first introduced in 2012. There one can usually find relevant answers to a query, and you won’t even need to enter a website! With this update, local search was vastly improved. Since Hummingbird gives you less primitive information and goes deeper into your wishes, local businesses were given a large incentive to do better with their Internet performance by improving title tags, keywords in descriptions, and conducting website updates.
Advice: structure your content properly.
1. Because Hummingbird uses knowledge graphs, it has become better to write your content in a way that answers the following questions: Who? What? Where? When? Why? How? By doing this you heighten the chances of your website being chosen as an answer for a searcher’s query and you can also be selected for a Featured Snippet (technically speaking, we are suggesting that you optimize your Open Graph code and your Schema code – webmasters will know what we mean). This helps to bring more traffic.
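For example, Open Graph data is nothing more than a handful of meta tags in a page’s <head> section (the titles and URLs below are invented placeholders, not a real site):

```html
<head>
  <!-- Open Graph tags describe the page for rich previews and sharing -->
  <meta property="og:type" content="article">
  <meta property="og:title" content="How to Make Neapolitan Pizza at Home">
  <meta property="og:description" content="A step-by-step guide to dough, sauce, and baking.">
  <meta property="og:url" content="https://example.com/neapolitan-pizza-guide">
  <meta property="og:image" content="https://example.com/images/pizza.jpg">
</head>
```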
WebCEO’s Rank Tracking Tool will show you your results in organic search and whether your website was shown in a Featured Snippet or Knowledge Panel (the box where Knowledge Graph data is presented).
2. Diversify your content. Long articles with comprehensive analysis are really great and Google likes them, but for a visitor’s convenience you can write paragraphs that are easy and quick to read. Moreover, these short articles can also be used by Google in a knowledge panel.
3. Your language should follow your niche. You can write in a simple, interesting, and engaging way, but don’t forget that you must create content related to a definite niche. Don’t make your article too easy; use up-to-date terms, statistics, diagrams, and so on. Remember that all those terms are your keywords, and Hummingbird can consider your information more relevant to somebody’s query than anything else.
4. Set up Schema markup. This markup influences whether your page can be featured in a Snippet. The data presented in your Schema markup can help visitors judge what your site represents: ratings, quantity of reviews, and skillful descriptions. If you are an owner of a local business you can present more data concerning your working hours, menu, and phone numbers.
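As a rough sketch, a local pizza place could add a JSON-LD block like the one below to its pages (every name, number, and URL here is invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Pizza Place",
  "telephone": "+1-555-0123",
  "openingHours": "Mo-Su 11:00-23:00",
  "menu": "https://example.com/menu",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "132"
  }
}
</script>
```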
Google Pigeon Update
Release date: July 24, 2014
Pigeon brought a lot of changes to local SEO after its release:
- With Pigeon, the 7-pack changed into the 3-pack: since the update, searchers can now see the three best results for local businesses instead of seven. You can also see a map on the SERP above the 3-pack which shows the distance to those three places;
- Pigeon shows you results not only depending on the closeness of venues to you, but also takes into account a website’s position for that keyword in organic search results. In simple words, Pigeon sees the nearest places to you, analyzes their organic positions on the SERP considering all SEO ranking factors, rates them, and then finally presents a list of the best variants for you in local search. This is a great feature because you receive not just the ordinary results of where you can go, but the best results;
Advice: become visible.
1. Make your business visible on all local business directories: Facebook, LinkedIn, Bing, Yelp, and many others. Local ranking factors have become more and more important: reviews, citations, links, social media engagement, and so on.
2. Use local search terms as your keywords. Write them down in a snippet of your own content, in title tags, and in the descriptions of your place in the local directories. Of course, don’t forget to mention them in your text.
3. Time to think about your website optimization. As Pigeon takes into account a website’s organic SERP results, you should take care of your website performance: high quality content, backlink profile, domain authority, mobile-friendliness, etc.
Google Mobile-Friendly Update
Release date: April 21, 2015
Mobile devices are everywhere nowadays. Users have traded their desktops for smartphones and prefer to chill out with them 24/7. Google sees trends and follows them. Trying to provide users with the best performance even on mobile devices, Google released its Mobile-Friendly Update, which had an impact only on those websites which weren’t optimized for smartphones. It has been a great motivator for webmasters to make their sites convenient for any type of device. Going into detail: with this update your search rankings on desktops are not lowered at all. As this update concerns only mobile devices, only your mobile rankings suffer from it if you haven’t optimized your website yet. However, it won’t necessarily affect the whole website: if some of your website’s pages are mobile-friendly, they will not be “touched” by Google. Google has even provided a test for website owners which can help to check whether a website is mobile-friendly.
Advice: make your website mobile-friendly. Use AMP for this purpose, a framework that provides for the fast and smooth loading of your website on mobile devices.
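For reference, the skeleton of an AMP page might look something like this (the URL is a placeholder, and this is a sketch rather than a complete template):

```html
<!doctype html>
<html amp lang="en">
  <head>
    <meta charset="utf-8">
    <!-- The AMP runtime must be loaded on every AMP page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <title>Hello AMP</title>
    <!-- Point back to the regular (canonical) version of the page -->
    <link rel="canonical" href="https://example.com/hello">
    <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
    <!-- AMP also requires its standard boilerplate <style> block here, omitted for brevity -->
  </head>
  <body>
    <h1>Hello, mobile visitors!</h1>
  </body>
</html>
```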
WebCEO’s SEO Analysis Tool will give you detailed information concerning your website’s mobile optimization, so you can discover any issues and see instructions on how to solve them.
With Google Mobile-First Indexing, which was enabled for all new websites on July 1, 2019, a website’s mobile-friendliness carries even more weight. Google has begun to crawl and index websites primarily from the mobile version’s point of view. If you run a website that Google hasn’t seen yet and it is not mobile-friendly, be ready to experience problems with your indexing and rankings.
Google Possum Update
Release date: September 1, 2016
Possum was an improved version of Pigeon in which developers addressed the disadvantages of its predecessor:
- This update let companies situated beyond a city’s borders be shown in a local search when searchers specified that city. Earlier this was impossible, because Pigeon focused on places which were strictly on the territory of a chosen city. Even if a website had good positions in organic search results, it could not be seen in local results because of this.
- Google improved its results sorting. Now it doesn’t show several results which belong to one address. For example, if there are two coffee houses in one place near you, Google will show you only one of them in order to avoid duplicate listings. The second result will still be on the list, but pushed down.
- Now you will see different results for keyword variations. Even a slight difference between them will give a list of new places: “pizza New York” and “pizza in New York”, for instance, may return different local results;
- Possum is more sensitive to a searcher’s physical location than it was before. Now the 3-pack shows you not just the best results for you, but also which of them are the closest.
- Local search has become more independent from the organic results. Even despite low rankings in organic search, some businesses do really well in local search results.
Advice: keep optimizing.
1. As Google still considers organic results while giving searchers the best local matches, it is important to constantly keep track of your website’s overall performance: backlink profile, domain authority, etc.
Google Fred Update
Release date: March 7-8, 2017
The codename “Fred” is not official: Gary Illyes jokingly suggested on Twitter that unnamed updates be called “Fred”. The target of this update was to punish websites that use black-hat SEO and too many advertisements for aggressive monetization. “Fred” fights against websites:
- that contain an excessive amount of advertisements;
- whose content is thin and of low quality;
- that contain text written about many unrelated topics with the aim of fast ranking gains;
- that have little benefit for users, a lot of page issues, and a bad impact on the user experience;
- that are not mobile-friendly.
Remedy: reevaluate your website’s quality.
1. Your website should belong to a specific niche and fulfill a user’s needs with relevant content, which is rich and well written, without keyword stuffing and without any signs of thin or duplicate content.
2. There should be no game playing with title tags, metadata, keywords, and schema code. Any attempts to use black-hat SEO must be stopped immediately. Google doesn’t like them and you should not either.
3. Be modest when the time for advertisements on your website comes. An excessive amount of them will always disturb users, and they will leave your site instantly, raising your bounce rate, which Google automatically doesn’t like. Remember that a lot of people nowadays use ad blockers, so the chances of getting anything from those advertisements may be minimal.
Google Medic Update
Release date: August 1, 2018
The “Medic” update presumably punishes websites which can negatively influence people’s well-being. This includes: health, financial security, safety of a user, and so on. To be specific, this update affects websites which:
- require personal information, e.g. name, date of birth, Personal Identification Number, Social Security Number, bank account number, driver’s license – on the whole, the type of information which may be used for identity theft;
- offer goods for buying and use insecure monetary transactions, e.g. online shops, where the information about your credit card and bank account number is used and may be potentially stolen for the sake of lucre;
- offer a user some advice or general information regarding the medical sphere and health, which in Google management’s estimation may be harmful;
- present information in the form of advice regarding important life problems and future decisions, e.g. serious purchases like cars, houses, stocks, some financial advice and so on.
Remedy: raise trust among users.
1. Work on your landing pages and content – make it of high quality and erase all features which Google doesn’t like: low quality, thin, duplicate content, keyword stuffing, and everything else that Panda hunts for. Take the freshest information from trustworthy and official sources, attaching statistics, tables, diagrams, etc. With this you will show users and Google that you haven’t pulled your data out of thin air.
2. E.A.T. concept – expertise, authoritativeness, trustworthiness. Write in detail on your About page who you are, why you can be useful, and why people should trust you – prove that you are a specialist who has, for example, the necessary background and education to be an expert in a chosen niche. If you present information that goes against the consensus of most scientists, then give your users more proof that what you say may be right (Copernicus was right after all). Trust rises with good reviews of your website, so ask people/your customers/visitors/subscribers to leave a comment about their attitude to your website.
3. Create an author bio which will present you as a specialist. This variant should be used if your About page gives information concerning the services you provide on your website. In your bio you can write about yourself as an expert in a specific sphere and why people can trust you. Maybe, you have a Bachelor’s, Master’s, or Doctoral Degree, completed some courses or internship, and so on. Google presumably also wants to trust you, so give it such an opportunity.
Google never stops developing and updating. And each time website owners encounter more and more new rules and limits, which they should obey in order to stay visible on the SERPs. 2019 follows this trend as well. The June Google Core Update made a lot of changes. Many YMYL websites were impacted and fell in the rankings. These are “Your Money or Your Life” sites that presume to sell you things that affect your health and financial situation. Meanwhile educational and informational resources gained more authority. Learn more about the June 2019 Google Core Update in order to protect your website from decreasing rankings and adapt to the new rules.
Google September 2019 Core Update
Release date: September 24, 2019
As the June 2019 Core Update’s successor, the September Core Update focused on websites that in any way might damage people’s well-being. Websites containing information regarding health, money, travel and medical topics were the target of this update. Some publishing websites like The Daily Mail recovered after this update, having fallen with the June update. Google says that any kind of information that might have any influence on people’s lives should be harmless. It’s still too early to describe everything this update might have brought, however, some points already cry for your attention.
Remedy: revise your content.
1. The E.A.T. concept is still necessary to follow: show your visitors that you don’t get information out of thin air. Disclose your qualifications and prove your credibility so that neither Google nor users would hesitate to come to your website and later apply the knowledge you provide.
2. Monitor your links, both those coming to you and those going from you: the sources you use while creating your content influence people’s trust in you as a professional and as a writer. Linking to unproven and suspicious websites will make people trust you less, because they will not be sure whether they can rely on the data unless you back it with recognized specialists’ opinions.
3. Freshness is always a winning feature: update your content with fresh and relevant information. If you use any statistics in your texts, make sure the figures are recent and a hundred percent true.
Google BERT Update
Release date: October 21, 2019
The BERT algorithm (Bidirectional Encoder Representations from Transformers) is not a simple update to the existing algorithm. It is the introduction of a new system that will help understand users’ natural language and provide them with more accurate results for their queries. We can call BERT Hummingbird’s successor to some extent, because both these updates are focused on the understanding of search intent.
BERT goes deeper into a searcher’s query analysis, catching its context. It considers whole word groups, the prepositions that surround the sense-leading word of a query, and other language units. BERT will analyze the linguistics to understand what a person really wants and will deliver the most accurate results. This update will also influence featured snippets.
Remedy: there are no particular instructions on how to write content or optimize your website for this update. BERT was created to understand people’s way of thinking while creating a query from a linguistic point of view. The advice is to write for people in a natural way: no machine-generated content.
Google November 2019 Local Search Update
Release date: November 2019
As Google announced on their official Twitter account, neural matching will be used in delivering local search results. Neural matching is used “to better understand how words are related to concepts”. In simple words, neural matching was implemented to help the search engine build connections between the information about a local business and a searcher’s query and deliver more accurate results regarding particular businesses someone might be looking for even without specific names in a query. This might also concern similar location and business names and how to distinguish between these.
There is no remedy for this type of update. This concerns only Google’s ability to better understand what people might be looking for.
Minor Google Updates
Release date: September 10, 2019
Google introduced additional link attributes. Besides rel=”nofollow” there will be two more attributes to use:
- rel=”sponsored”: an attribute to a link to specify that it is a part of a sponsored promotion;
- rel=”ugc”: an attribute to a link to specify that it is a part of user generated content: comments or forum posts.
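In page markup, these attributes simply go on the anchor tag, and they can be combined; a sketch (all URLs below are placeholders):

```html
<!-- A paid placement: mark the link as sponsored -->
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>

<!-- A link left in a user comment or forum post: mark it as user generated content -->
<a href="https://example.org/forum-post" rel="ugc">Forum post</a>

<!-- The classic hint that you don't vouch for the target page -->
<a href="https://example.net/unknown-site" rel="nofollow">Unverified source</a>

<!-- Attributes may be combined, e.g. a sponsored link you also don't vouch for -->
<a href="https://example.com/ad" rel="sponsored nofollow">Ad</a>
```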
There is no remedy for this update, just a helpful feature for webmasters. Starting from March 2020, these attributes will be considered by Googlebots when crawling and indexing to understand a website’s content better.
Release date: September 16, 2019
Rich snippets for the “LocalBusiness” and “Organization” schema types (including their sub-types) have become the center of attention. Google decided to “eliminate” self-serving reviews for these categories. These are the reviews that are placed on a website’s pages with the help of widgets. You don’t have to switch off such widgets if you are using them currently, it’s just that Google will no longer show these on the SERPs. This update concerns organic search alone. If there are reviews about your website on other directories – that are not managed by you – such reviews will still be shown in SERPs.
Remedy: there is no exact remedy for such updates. Our advice is to work and wait for 5-star reviews about your business on other websites. The name property has also become important: when reviewing a product, it’s necessary to mention its name for the feedback to be more useful for users.
Release date: October 2019
In June, France adopted a copyright reform according to which Google and other huge services have to pay media resources even for a tiny piece of content used on their platforms, for instance, an article’s abstract shown on the SERP. Due to these changes in EU legislation, Google introduced new robots meta tags that give webmasters an option to choose how their snippets will look on the SERP. Currently, Google search results for some queries in France look like an ordinary list of websites without previews.
Remedy: this is not a problem to be solved or an update that can influence your rankings. This is a feature that lets webmasters control whether and how Google can show a content snippet from a site in the SERPs. It is up to you whether to use these meta tags or not:
- “max-snippet:[number]” limits the number of characters in a text snippet;
- “max-video-preview:[number]” limits the length of a video preview (in seconds);
- “max-image-preview:[setting]” limits the size of an image preview.
You are also free to mix these.
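For instance, the directives can be combined in a single robots meta tag (the limits below are arbitrary example values):

```html
<!-- Cap text snippets at 100 characters, video previews at 10 seconds,
     and image previews at the "standard" size -->
<meta name="robots" content="max-snippet:100, max-video-preview:10, max-image-preview:standard">
```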
TO SUM UP, year by year, Google tries to improve the experiences of its users, releasing updates that have the potential to make each search for information easier and more satisfying (you may disagree if your site dropped in the rankings, but you can work with us to bring them back up). The updates we’ve mentioned in this article had a huge impact on the SERPs. There have actually been many more updates than the ones we covered, plenty of which were never announced to the public. Maintain your rankings with WebCEO’s Rank Tracking Tool and protect your website from further Google algorithm updates.