7 Latest Google Updates: Why It Matters for Your Business


‘How’s your site traffic and site ranking doing?’ is one question that will perhaps never get old. The stream of updates that come our way, some scheduled and others as surprises, never fails to deliver shocks to our site rankings and the web traffic associated with them.

For example, the recent Google June 2019 Core Update turned out to be something of a savior for some businesses, which saw traffic increases as high as 70%, while many others watched their traffic simply collapse.

This blog discusses seven Google updates that have recently been rolled out. It briefly explains why they should matter to you as a business owner and what steps (if any) can be taken to either make them work for you or at least insulate your business from their impact.

1. Remove Noindex Directives from Robots.txt

You Only Have a Month!

Still using the noindex directive in your Robots.txt file? Find an alternative. You only have until 1 September, as Google has announced that it will no longer support it.

Why Was Robots.txt Used?

Most site owners have been using this directive to keep certain pages out of Google’s index.

This matters because letting Googlebot index all your web pages may hurt your site’s search ranking, especially if it has a large number of pages. Besides, there are always certain pages you would not want to appear in the search results.

It is interesting to note that Google never officially stated that it supported the noindex directive in the first place. But since it never claimed the contrary either, site owners began using it as a primary tool to manage the indexing of their pages.

What’s the Solution?

Before coming to the solution, let’s start off with a caveat. It is best not to rely on unofficial SEO rules since they can be reversed at any time.

Moving on to the alternatives: Google mentioned some solutions in a recently shared blog post.
[See the full blog here.]

There are five ways to keep certain pages out of the search results. The best option is to password-protect the pages you don’t want indexed: hide them behind a login and they will not be crawled.

Apart from this, returning 404 or 410 HTTP status codes is another simple way to achieve the same result.
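Note that the still-supported Disallow rule only blocks crawling; it is the unofficial noindex line that Google is retiring. As a rough illustration, here is how a Disallow rule behaves, sketched with Python’s standard-library robots.txt parser (the domain and paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the Disallow rule remains officially supported,
# unlike the unofficial noindex directive Google is retiring.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages under /private/ are blocked from crawling; everything else is open.
blocked = parser.can_fetch("Googlebot", "https://example.com/private/report")
allowed = parser.can_fetch("Googlebot", "https://example.com/pricing")
print(blocked, allowed)  # False True
```

Keep in mind that a disallowed page can still end up in the index if other sites link to it, which is one reason Google points to password protection or proper noindex signals for pages that must stay out of search results.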

To conclude, site owners have a month to make the required changes. From now on, try to ensure that you do not depend on any unofficial rule.

2. Google My Business Adds “Request a Quote” Button to Listings – What It Means for Your Business

Google My Business has recently, and somewhat “secretly and silently,” rolled out a “Request a Quote” button. It automatically shows up for business listings that have enabled the messaging feature.

The prominent button appears in both desktop and mobile searches.

How It Will Impact Your Business

With the impossible-to-miss button on every listing, website traffic is likely to take a hit. As more and more users request quotes without so much as a cursory look at the website, the competition will likely come down primarily to the cost of the service, with the details of those services coming later.

Whether this button will benefit all businesses is hard to tell at this moment, but for users who make their final decision based on price, this feature will weigh heavily in choosing a particular service provider or business.

For businesses where the type of work offered by everyone is more or less the same, this newly added feature may help pull in a bigger pool of customers simply by lowering the quote.

So, for example, if one is offering plumbing services, a reduced quotation may fetch them a client. To counterbalance the reduced price, one might decide not to spend too much on website design, as users may never get that far before choosing a service.

Leads vs Traffic

Obviously, the quote button effectively makes you choose between leads and traffic. A message from a user requesting a quote is a lead that is likelier to convert than a simple site visit.

However, the efficacy of this button will vary from business to business, as explained in the section above.

How to Get this Feature Enabled

As mentioned above, this feature is automatically enabled for business listings that have turned on their messaging service. So, you can get it activated by simply activating your messaging feature.

Below are the steps to get it activated:

  • Step 1. Open the Google My Business app and go to the location you would like to manage
  • Step 2. Tap Customers
  • Step 3. Tap MESSAGES and turn on notifications

3. Google Cancels Support for Robots.txt Noindex

The noindex directive in the Robots.txt file, long a handy tool for controlling which of your pages appear in search results, is no longer supported by Google. Googlebot will no longer honor noindex rules in Robots.txt, and businesses have until September 1 to switch to an alternative.

What is a Robots.txt File?

To explain it in simple terms, Robots.txt is a plain-text file that tells Googlebot which pages to crawl on a given website. By using Robots.txt, you can make certain pages of your website inaccessible to Googlebot. Since a website may contain a large number of pages, letting Googlebot crawl every one of them might hurt your SEO ranking. Besides, there may be certain pages you do not want to show up in the public domain. The Robots.txt file serves all such purposes.

What You Can Do

In a recent blog post, Google shared some alternatives for those who rely heavily on the Robots.txt noindex directive.

You can read the full blog here.
Of the five ways explained in the blog, one of the simplest is to return 404 or 410 HTTP status codes for the pages you do not want indexed. Another is password protection: hide a page behind a login and it will not be crawled.
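Google’s post also lists the robots meta tag and the X-Robots-Tag HTTP header as officially supported replacements for the retired directive. A minimal sketch of the header approach, using only Python’s standard library (the page content and path are made up):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoIndexHandler(BaseHTTPRequestHandler):
    """Serves pages with an X-Robots-Tag header so crawlers skip indexing."""
    def do_GET(self):
        self.send_response(200)
        # Officially supported replacement for robots.txt noindex:
        self.send_header("X-Robots-Tag", "noindex")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Internal report</body></html>")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), NoIndexHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/report") as resp:
    robots_header = resp.headers["X-Robots-Tag"]
server.shutdown()
print(robots_header)  # noindex
```

In a real deployment the header would be set by your web server or framework rather than a hand-rolled handler, but the signal Google reads is the same.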

Search Console Data to Third Party Platforms

In a recent video on the Google Webmasters YouTube channel, Google’s Martin Splitt mentioned that the company is working on bringing Search Console data to third-party platforms. Though he did not go into the nitty-gritty of how it will happen or what steps have already been taken, he did say the rollout would come soon.
One of the main motivations for the change is to save site owners the effort of going into Search Console to retrieve data: once the integration is in place, they will be able to see that data directly inside the third-party tools they already use.

What is Google Search Console?

Google Search Console (GSC) is a free tool offered by Google that helps you keep tabs on your website’s performance and functionality. It generates reports and analyses, gives you different metrics to gauge how your site is doing, and indicates steps you can take to improve your site’s functionality.

What You Can Do

So far there is not much information about this change, so saying anything more would be premature. One risk of the data going to third-party platforms is that competitors might be able to access it, which could give them an advantage. However, it is still not clear whether the third-party access will be open to everyone or controlled along the lines of Google Search Console itself.

4. Mobile-first Indexing

Mobile-first indexing is exactly what the words suggest: Google uses the mobile version of a website as the primary version to index and rank. This means that if your website’s mobile version does not meet the prescribed ranking guidelines, your ranking may drop. It does not mean that your desktop version will not be crawled by Googlebot; it simply means the mobile version will be treated as the main one.

From July 1 onwards, all new websites will be indexed using mobile-first indexing. This clearly indicates that one may not survive in the digital arena for long without investing time and effort in building and running a mobile-friendly website. With more than 58% of site visits in 2018 coming from mobile, and user experience being what ultimately matters, this shift toward mobile-friendly websites is only natural.

Though this is a definite change for new websites, what about existing ones? It cannot be said with certainty when Google will enable mobile-first indexing for existing websites, but it certainly will. As Google keeps updating its algorithm based on user trends, it should not come as a surprise if it someday announces that all websites will be ranked mobile-first.

Interestingly, Google Search Console now surfaces three new sets of Googlebot crawling data. With this data, you can find out which crawler version is crawling your website: if your primary crawler is shown as Googlebot Smartphone, your website has likely switched to mobile-first indexing.

Apart from the primary crawler, you can also find out from which date your website has switched from desktop-first indexing to mobile-first indexing. This will remove all the guesswork and allow you to track with certainty as to how the change in indexing has impacted you.

Besides, you can also know which crawler has been used to generate a specific report after collecting your data.

What You Can Do

In simple terms, invest more in your mobile version and bring it on par with your desktop version.
Pay special attention to page loading speed and content. If the content is the same for both versions, then there’s no need to worry.
Lastly, start thinking about microformats: it’s always better to use schema and hreflang on all your pages.
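To make the hreflang advice concrete, here is a small, hypothetical sketch of generating the alternate-language link tags that belong in each page’s head section (the URLs and languages are invented for illustration):

```python
# Hypothetical language versions of a single page; the URLs are made up.
alternates = {
    "en": "https://example.com/en/pricing",
    "de": "https://example.com/de/preise",
    "x-default": "https://example.com/pricing",
}

# Each version's <head> should list every alternate, itself included,
# so Google can serve the right language to the right user.
hreflang_tags = "\n".join(
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in alternates.items()
)
print(hreflang_tags)
```

The x-default entry tells Google which version to show users who match none of the listed languages.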

5. Google June 2019 Core Update & Wikipedia

This is perhaps one of the most shocking updates of 2019. An obvious reason is that it turned the rankings of many websites upside down. But the scale of the impact may also be gauged from the fact that this was the first time Google announced a core update before rolling it out. Until now, Google had never made such announcements.

Though Google squashed any speculation by saying in another tweet that it “just wanted to be more proactive” so that people knew beforehand and did not “scratch their heads” trying to figure out what happened, the pre-emptive announcement may indicate an unexpected impact.

Adding to this is the Wikipedia angle. A relatively big alternative-health site run by Dr. Joseph Mercola lost almost 99% of its traffic after this update. Dr. Mercola believes that Google is assessing the relevance and trustworthiness of websites based on information available on Wikipedia. He says, “Google is now manually lowering the ranking of undesirable content, largely based on Wikipedia’s assessment of the author or site.”

This is news because if Wikipedia is being used by Google as an assessment tool, then anything new or any alternative to the already-established wisdom may not find its way up the ranking ladder.

The Quality Rater Guidelines do mention that the reputation and trustworthiness of a website play a vital role in its ranking. However, whether Google does indeed rely on Wikipedia or any other source to assess a site’s reputation cannot be said with certainty.

What You Can Do

Since there is no evidence of Google using Wikipedia as an assessment tool, nothing much can be done in this case.

What is possible is to improve the quality of your website in terms of content and information. This can be done by focusing on the E-A-T criteria, where ‘E’ stands for ‘expertise,’ ‘A’ for ‘authoritativeness,’ and ‘T’ for ‘trustworthiness.’ Try to publish content that is written by an expert, is well-researched and in-depth, and is backed by facts and studies that make it reliable.

As there are no fixes for the Google Core Update, you can only try making those changes that are known to increase ranking and traffic.

6. Update Related to Multiple Featured Snippets

This too came up in June and was addressed by John Mueller on a Google Webmaster Central hangout, in response to a query about using the ‘recipe’ and ‘how-to’ structured data types.

He mentioned that you can use any structured data type, but Google will choose only one (based on whichever is closest to the user’s query) and display it as a snippet. This means your chances of getting extra traffic from multiple snippets are reduced.

Though it is not prohibited to use both data types, only one will be shown.
Being inside a box, featured snippets stand out from the rest of the organic results and have a better chance of being clicked. They also give a better user experience.
Now that there is only one featured snippet per query, the competition becomes fiercer. Normally, you do not need to be #1 in the search rankings to qualify as a snippet; anything in the top 10 may do the job.

What You Can Do

As John Mueller suggested, use both types of structured data so that you never miss out on either type of query.
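Following that advice, a page can carry both markup blocks at once. A hypothetical sketch of emitting Recipe and HowTo JSON-LD side by side (names and steps are invented; Google will show at most one as a snippet):

```python
import json

# Hypothetical page marked up as both a Recipe and a HowTo.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Basic sourdough",
    "recipeIngredient": ["flour", "water", "salt", "starter"],
}
how_to = {
    "@context": "https://schema.org",
    "@type": "HowTo",
    "name": "How to bake basic sourdough",
    "step": [{"@type": "HowToStep", "text": "Mix, prove overnight, bake."}],
}

# Both blocks can live in the same <head>; Google picks whichever
# best matches the user's query for the featured snippet.
json_ld_blocks = "\n".join(
    f'<script type="application/ld+json">{json.dumps(data)}</script>'
    for data in (recipe, how_to)
)
print(json_ld_blocks)
```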

7. Google Ads Now Report on Shopping Campaign Landing Pages

Those following closely will have noticed that Google is aggressively making changes to its ads. Ads have not performed well in the recent past, and Google is now making regular changes to improve them for business owners. Some speculate that the June 2019 Core Update is primarily meant to boost Google Ads: with this update, ads no longer stand out as ads and appear to be part of the organic results, which simply means more clicks.

Ads will now be clicked more often because of their similarity in shape, size, and form to organic search results.
With this new update, Google Ads reporting becomes more valuable, letting you mold and strategize your ad campaigns in a direction that helps you reap greater benefits. Some of the metrics you can now record include:

– Impressions
– Clicks
– Average CPC
– Conversion Rate
– Conversions
– Cost

In addition to the above, you can see which page is linked with each of your ads, check the mobile-friendliness of each page, and identify which of your landing pages provide a better experience to users on mobile devices.
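To see how the reported metrics relate to one another, here is a quick, hypothetical calculation for a single shopping-campaign landing page (all numbers are invented):

```python
# Hypothetical report row for one landing page, using the metrics listed above.
impressions = 12_000
clicks = 480
conversions = 24
cost = 360.00  # total spend in dollars

ctr = clicks / impressions           # click-through rate
avg_cpc = cost / clicks              # average cost per click
conversion_rate = conversions / clicks

print(f"CTR: {ctr:.1%}")                          # CTR: 4.0%
print(f"Avg CPC: ${avg_cpc:.2f}")                 # Avg CPC: $0.75
print(f"Conversion rate: {conversion_rate:.1%}")  # Conversion rate: 5.0%
```

Comparing these derived numbers across landing pages is what makes the new report useful: a page with a high CTR but low conversion rate points to a landing-page problem rather than an ad problem.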

What You Can Do

Increasing your investment in Google Ads seems the most plausible option. With so many metrics now available, you can test different messaging combinations, with slight changes to the product, to see which resonates better with users.
Also, now that Google ads appear similar to organic search results, at least on mobile, you stand a better chance of getting more clicks.


As change is the only constant in the digital world, every update disrupts the status quo. However, the bright side is that you can always take advantage of the situation and try to make it work in your favor.

There are some updates for which there is no immediate solution, whereas some automatically begin to work for you. The key is simply to stay alert.

As Google responds to reality, it continues to advance and evolve with time. When news of millions of fake business listings on Google Maps began to circulate, Google was quick to respond that it would do what is needed to provide users with better results for their queries.
What is expected here is that it might become more difficult to list a business on Google Maps, or Google might remove existing listings it considers fake, which may also sweep up some genuine listings. So while this will surely be a cause for concern initially, in the long run it will help businesses.