How the New Google Search Console Makes Life Better for SEOs

According to Search Engine Land, most people should already have access to the new beta version of Google Search Console. If you haven’t received a notification yet, visit Search Console, type your website address in the search property field, and – voilà!

As an SEO you should definitely try it out! It gives you a lot of interesting insight into how Google crawls and indexes your domain.

In this article, I’m going to comb through Google Search Console’s beta version and point out some of its best new features. It should be noted, however, that at the time of writing this, the beta version has only been available for a few months and Google is more than likely to add more reports and features. While I will do my best to update this article, I can’t guarantee that Google will not remove or merge some of the reports.

Let’s get started…

GSC now offers 16 months of data!

This is a game-changer! The previous GSC only offered data from the past 90 days. For many SEOs this was simply not enough. For example, it was not possible to compare data quarter over quarter or to see whether you received more clicks than a year ago. Now, thankfully, it’s more than possible, and it opens up a whole new perspective for SEOs.
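If you prefer working with raw numbers, the Search Analytics API (discussed again near the end of this article) makes this kind of comparison easy to script. Below is a minimal Python sketch of a year-over-year clicks comparison; the service-account file, the property URL, and the dates are placeholders you would replace with your own:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # your verified GSC property

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path to your credentials
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=credentials)

def total_clicks(start_date, end_date):
    """Sum clicks for the property between two ISO dates (inclusive)."""
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={"startDate": start_date, "endDate": end_date},
    ).execute()
    return sum(row["clicks"] for row in response.get("rows", []))

print("Q1 2018:", total_clicks("2018-01-01", "2018-03-31"))
print("Q1 2017:", total_clicks("2017-01-01", "2017-03-31"))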

An Updated Performance section

The Performance section looks very similar to the Search Analytics report in the old Google Search Console (you can access it by clicking Search Traffic -> Search Analytics), but it now has an updated UI and provides data from the last 16 months.

If you aren’t familiar with this section, it shows which queries brought Google users to your website, highlights the most popular landing pages, and reports impressions, CTRs, and positions.

You can easily filter the date ranges. For example, you can view only the data from the previous half-year.

If you scroll down the Performance section, you can see the most popular queries, landing pages, countries, and devices:

Index coverage – the new section to rule them all!

For me, the most important section that was added to the new GSC is Index Coverage. It can give you invaluable insight into why Google decided not to index some of the URLs on your website. It’s also a vault of information about the canonical versions of your content.

Below is the list of reports provided by the Index Coverage section:

Google decided to divide the Index Coverage section into four parts:

  1. Error pages
  2. Valid with warnings
  3. Valid
  4. Excluded (URLs not indexed in Google)

You can easily toggle the categories. If you want to see only the error and excluded pages, just toggle the Error and Excluded buttons.

What is very useful is that you can filter the results and see only the issues related to the URLs listed in a particular sitemap.

The list of indexed pages

Now, Google Search Console gives you a list of indexed pages.

For example, let’s suppose a shop manager added 4 products in a new category called “garden” to an online shop. How do you check if these products are indexed?

In the new Google Search Console, you can go to Index Coverage -> Submitted and indexed and filter your results to URLs containing “/garden”.
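If the category contains more products than you want to eyeball, you can also export the report and filter it locally. Here is a hypothetical Python sketch; the export filename, the URL column name, and the product URLs are all made-up examples:

import csv

EXPECTED = {
    "https://shop.example.com/garden/hose",
    "https://shop.example.com/garden/rake",
    "https://shop.example.com/garden/shovel",
    "https://shop.example.com/garden/gloves",
}

# Read the exported "Submitted and indexed" CSV and keep only /garden URLs.
with open("submitted_and_indexed.csv", newline="", encoding="utf-8") as f:
    indexed = {row["URL"] for row in csv.DictReader(f) if "/garden" in row["URL"]}

print("Indexed:", sorted(indexed & EXPECTED))
print("Missing:", sorted(EXPECTED - indexed))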

Pages with redirects

If you have a big website, crawl budget is very important.

In the screenshot above, you can see that Googlebot visited 5 million URLs with a 301 status.

If Googlebot has to visit over 5 million 301s, it will definitely slow down the crawling process.

Advice: If you see a lot of URLs with redirects in the Google Search Console reports, perform a full crawl of your website and analyze the redirect chains. Try to reduce them.

For example, in Screaming Frog you can click on Reports -> Redirect Chains.

Please note that by default, Screaming Frog doesn’t follow redirects. If you want to see the redirect chain report, go to Spider -> Advanced and tick “Always Follow Redirects” before you start a crawl.
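If you’d rather script a quick spot-check instead of (or in addition to) a full crawl, here is a minimal Python sketch using the requests library; the sample URL is a placeholder, and a real check should be run against the redirecting URLs from your report or crawl:

import requests

def redirect_chain(url):
    """Return the list of (status, URL) hops requests followed for a URL."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in response.history]
    hops.append((response.status_code, response.url))
    return hops

for url in ["http://example.com/old-category/"]:
    chain = redirect_chain(url)
    if len(chain) > 2:  # more than one redirect before the final URL
        print(f"{url} has a chain of {len(chain) - 1} hops:")
        for status, hop in chain:
            print(f"  {status} -> {hop}")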

URLs excluded by “noindex” tag

It’s a known fact that Googlebot also crawls pages that have a <meta name="robots" content="noindex"> tag.

As an SEO, you should encourage Googlebot to crawl valuable, indexable pages. If you see that Googlebot is strongly focused on crawling noindexed pages, it might be a sign that you need to review your internal linking strategy.

There is another reason why you should check this report regularly: to make sure you didn’t accidentally exclude any valuable pages from indexing.
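A quick way to double-check a list of important URLs is to look for a noindex directive yourself. Below is a minimal Python sketch using requests and BeautifulSoup; the URL list is a placeholder:

import requests
from bs4 import BeautifulSoup

def is_noindexed(url):
    response = requests.get(url, timeout=10)
    # Header-level directive
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return True
    # Meta robots tag in the HTML
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    return bool(meta) and "noindex" in meta.get("content", "").lower()

for url in ["https://www.example.com/important-page/"]:
    print(url, "-> noindex" if is_noindexed(url) else "-> indexable")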

URLs Blocked by robots.txt

Unfortunately, it’s common that developers mistakenly block some parts of a website by adding some rules in robots.txt.

The most disastrous rule you can end up with is:

User-agent: *
Disallow: *

But it can be much more subtle as well.

By auditing the “Blocked by robots.txt” report you can easily spot such URLs.

Also, you can review the “Indexed, though blocked by robots.txt” report. Go through the list and try to spot any URLs that were blocked accidentally.
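You can also test your most important URLs against your live robots.txt before Googlebot ever reports a problem. Here is a minimal sketch using Python’s standard-library robots.txt parser; the robots.txt location and the URL list are placeholders:

from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"
IMPORTANT_URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/garden/",
    "https://www.example.com/product/12345/",
]

parser = RobotFileParser(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

for url in IMPORTANT_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(("OK     " if allowed else "BLOCKED"), url)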

Indexed, not submitted in sitemap

The “Indexed, not submitted in sitemap” report is another interesting one that didn’t exist before.

Why is this report really important to you?

  • You can easily spot low-quality landing pages that should never be indexed in Google. If your sitemap contains links to all of your valuable resources, some of the URLs listed in the “Indexed, not submitted in sitemap” section are probably irrelevant.
  • You can spot URLs that should be added to a sitemap. Google recommends including links to all valuable resources in a sitemap, since it makes the crawling process easier (see the sketch below for a quick way to compare the two lists).
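To compare the two lists quickly, you can diff your XML sitemap against an export of this report. The sketch below assumes a local sitemap.xml file, a CSV export named indexed_not_submitted.csv, and a URL column; all of these names are assumptions you would adjust:

import csv
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# URLs you have declared in the sitemap
tree = ET.parse("sitemap.xml")
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

# URLs from the "Indexed, not submitted in sitemap" export
with open("indexed_not_submitted.csv", newline="", encoding="utf-8") as f:
    indexed_not_submitted = {row["URL"] for row in csv.DictReader(f)}

# URLs Google indexed that you never declared: either add them to the
# sitemap (if valuable) or review why they are indexed at all.
for url in sorted(indexed_not_submitted - sitemap_urls):
    print("Review:", url)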

Discovered – currently not indexed

In the “Discovered – currently not indexed” section you can see which links were discovered but not yet crawled by Googlebot.

Let’s suppose a few weeks ago your client added a bunch of products, but these are still not indexed. In such a case you can analyze the following sections in GSC:

  • “Discovered – currently not indexed”
  • “Crawled – currently not indexed”

If these products are not listed here, it may indicate that Googlebot is struggling with your site structure. To make things clearer, I recommend reading Don’t Let Crawlers Get Lost on Your Website.

The Google Search Console documentation mentions “Queued for crawling”, which is a similar report; however, I haven’t seen it in Google Search Console yet.

Duplicate page without canonical tag

“Duplicate page without canonical tag” is a really nice report that shows the URLs categorized as duplicates.

Take some sample URLs and perform an “info:” query to check whether Google correctly deduplicates your content. I saw some examples where the deduplication was not working properly.

There were 3 URLs with the same content:

  1. http://example.com
  2. http://example.com?parameter1=x
  3. http://example.com?parameter1=x;parameter2=y

None of them had a canonical tag implemented. For some reason, Googlebot picked the second URL (http://example.com?parameter1=x) as the main version instead of the first one.

You can fix such an issue by introducing proper canonicals or setting these parameters in the old Google Search Console (unfortunately, the URL parameters section has not been migrated to the new GSC yet).
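If you want a rough overview of how many parameterized duplicates you have before deciding on canonicals or parameter settings, you can group URLs by their bare path. Below is a minimal Python sketch; the URL list simply mirrors the example above:

from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

urls = [
    "http://example.com",
    "http://example.com?parameter1=x",
    "http://example.com?parameter1=x;parameter2=y",
]

groups = defaultdict(list)
for url in urls:
    parts = urlsplit(url)
    # Drop the query string and fragment to get the would-be canonical URL.
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    groups[canonical].append(url)

for canonical, variants in groups.items():
    if len(variants) > 1:
        print(f"{canonical} has {len(variants)} variants; point them all to one canonical URL")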

Google chose a different canonical than user

As you may know, a canonical tag is just a hint. It may or may not be respected by Google.

And this report is huge. Now you have a list of the URLs for which Google decided not to respect your canonical tags.

Update: Mobile Usability report

Recently, Google introduced the Mobile Usability report in the new Google Search Console. Now you can easily spot which URLs have content wider than the screen or text that is too small to read.

Fixing the issues reported by the Mobile Usability report may help you provide a better user experience for your audience.

Does the Mobile Usability report remind you of something?

If so, you’re right. The list of issues is the same as in the Mobile-Friendly Test. See the table below for reference:

Update: URL Inspection Tool – One to Rule them All

There is another great feature in the new Google Search Console: the URL Inspection Tool.

It’s a vault of information that can help you discover why a particular URL is indexed, while another isn’t.

See the example below:

As you probably noticed, in this case, Google didn’t respect the canonical tag and indexed the inspected URL.

The URL Inspection Tool will also inform you whether a:

  • URL was blocked by robots.txt
  • URL was not selected as a canonical
  • URL was discovered but not indexed
  • URL was indexed, but not submitted in a sitemap.

URL inspection shows the latest info

The biggest advantage of the URL Inspection Tool is that it shows you the most recent information.

“The Inspect URL tool shows the latest info (doesn’t need to wait for the aggregate report to be updated)” – said John Mueller, Webmaster Trends Analyst at Google.
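If you want the same information programmatically, the Search Console API has since gained a URL Inspection endpoint (it arrived well after the tools described here, so check availability for your property). A minimal sketch, assuming a service-account JSON file and placeholder URLs:

from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path to your credentials
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)

result = service.urlInspection().index().inspect(body={
    "siteUrl": "https://www.example.com/",            # your property
    "inspectionUrl": "https://www.example.com/page/",  # URL to inspect
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
print("Coverage:        ", status.get("coverageState"))
print("User canonical:  ", status.get("userCanonical"))
print("Google canonical:", status.get("googleCanonical"))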

It’s not all roses…

GSC ONLY OFFERS 1000 ROWS

While it’s easy to praise the console, there are some things that still need to be addressed.

For instance, Google Search Console still only shows you a sample of the URLs that have issues, up to 1000 rows per report. So if your website has a really big structure (100,000+ or even 1 million+ URLs), the reports may not reflect your entire website.

For now, you can partially work around it with proper filtering:

Google is planning to release the new Google Search Console API in the near future. It may allow for exporting more than 1000 rows.

Update: Finally, Google announced that the Search Console API lets you access 16 months of Search Analytics data!
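Once you have API access, you can page past the 1000-row UI limit by combining rowLimit and startRow. A minimal Python sketch; the credentials file, property URL, dates, and dimensions are placeholders:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"
PAGE_SIZE = 5000  # the API accepts up to 25,000 rows per request

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=credentials)

all_rows, start_row = [], 0
while True:
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            "startDate": "2018-01-01",
            "endDate": "2018-03-31",
            "dimensions": ["page", "query"],
            "rowLimit": PAGE_SIZE,
            "startRow": start_row,  # page through the full result set
        },
    ).execute()
    rows = response.get("rows", [])
    all_rows.extend(rows)
    if len(rows) < PAGE_SIZE:
        break  # last page reached
    start_row += PAGE_SIZE

print(f"Fetched {len(all_rows)} rows")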

THERE’S STILL A…DELAY

Another issue is the delay before Google Search Console informs you of a problem. This is why it’s always good to be proactive and regularly crawl your website to check that it’s still in good condition. There are many crawlers you can use, such as Screaming Frog, Deepcrawl, Ryte, OnCrawl, and SiteBulb.

If you accidentally added a noindex tag, the crawler will inform you about it.

If you deal with a specific situation like website migration, you definitely shouldn’t wait for Google Search Console to tell you about any issues that Googlebot encountered. Instead, you should perform crawls and analyze server logs to see if some 404 errors occurred.
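As a starting point for the log side of that analysis, here is a rough Python sketch that counts 404s served to Googlebot in a combined-format access log; the log path and the regular expression are assumptions you would adapt to your server’s log format:

import re
from collections import Counter

LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

not_found = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LOG_LINE.search(line)
        # Note: matching on the user-agent string alone can be spoofed;
        # verify Googlebot via reverse DNS if you need certainty.
        if match and match["status"] == "404" and "Googlebot" in match["agent"]:
            not_found[match["path"]] += 1

for path, hits in not_found.most_common(20):
    print(f"{hits:5d}  {path}")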

GOOGLE SEARCH CONSOLE IS STILL IN BETA

You should also keep in mind that the new Google Search Console is in beta, so it may contain some bugs.

If you see some anomalies in your data, temporarily switch to the old version and see if the issue is related to GSC or to your website.

SOME REPORTS AND TOOLS ARE STILL MISSING

As of this writing, the new Google Search Console doesn’t contain all the reports offered in the old Google Search Console. Examples of missing tools/reports include:

  • Disavow tool
  • Internal links section. Update: it was added recently
  • Messages section
  • Remove URLs
  • Crawl statistics
  • Sitemaps indexing report. Update: it was recently introduced in the new Google Search Console
  • URL Parameters
  • Security Issues
  • Fetch and Render

Google claims that the most likely reason for missing reports is that they haven’t migrated them yet. Google intends to introduce them in the future.

Currently, Google is collecting feedback from webmasters and SEOs. You can vote in the “Which report do you most need from old Search Console?” poll.

Bonus:

Although the new Google Search Console has made many improvements, there are still better tools to visualize data.

Recently, Aleyda Solis shared her Google Search Console template for Google Data Studio with the SEO community.

All you need to do is copy the template and connect it to your website’s Google Search Console property. That’s it!

By using Aleyda’s template for Google Data Studio, you can:

  • See the list of queries and landing pages that brought users to your website.
  • See many more rows of data than in Google Search Console, which is limited to 1000 rows. Additionally, you can easily download the data without any limitations.
  • Easily filter reports by date, query, landing page, and device category.
  • See the most important KPIs and stats.

Wrapping Up

If none of the things I mentioned in this article sound interesting to you (or you simply hate life!), don’t worry, you can still switch to the old version. All you need to do is click on “Go to the old version” at the top-left corner:

As Google stated, this option will be available until the final version of the new GSC is made available: “Until the new Search Console is complete, both versions will live side-by-side and will be easily interconnected via links in the navigation bar, so you can use both.”

But if you are one of those people who love life, then as an SEO professional you should definitely take advantage of the data offered in the new Google Search Console. It can provide invaluable insight into why Google doesn’t index some of your pages.

Now it’s time for you to play around with all the reports and familiarize yourself with them. Have fun 🙂