SEO Office Hours – June 4th, 2021


This is a summary of the most interesting questions and answers from the Google SEO Office Hours with John Mueller on June 4, 2021. 

Googlebot and cookie consent

00:56 – “Is it really important for Googlebot to be able to see a cookie consent message that we serve to users? Because we kind of decided to serve it upon user interaction, so Googlebot won’t be able to see it. I wonder if that can cause us any problems in terms of Google-friendliness?”

John said, “In general, that should be fine because Googlebot doesn’t really need to see a cookie banner. Googlebot also doesn’t keep cookies, so it wouldn’t accept cookies anyway, if you gave it to Googlebot […] Lots of sites serve the cookie consent banner to Googlebot, as well, just because they serve it to everyone […] The important part is, essentially, that Googlebot is not (…) blocked from crawling the website, so that you don’t have a kind of an interstitial that blocks the access to the rest of the content”.
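To make the distinction concrete: the consent prompt can sit as an overlay on top of the page while the underlying content stays in the HTML for every visitor, including Googlebot. Below is a minimal TypeScript sketch of a banner that is only injected after a user interaction; the element IDs, styling, and copy are illustrative and not from the episode.

```typescript
// Minimal sketch: show a consent banner as an overlay on the first user
// interaction. The page content stays in the HTML for every visitor,
// including Googlebot, which never triggers the interaction.
function showConsentBanner(): void {
  if (document.getElementById("consent-banner")) return; // already shown

  const banner = document.createElement("div");
  banner.id = "consent-banner";
  banner.setAttribute("role", "dialog");
  // Fixed overlay at the bottom of the viewport; the content underneath
  // remains fully present in the DOM and is not replaced by an interstitial.
  banner.style.cssText =
    "position:fixed;bottom:0;left:0;right:0;padding:1rem;background:#fff;" +
    "box-shadow:0 -2px 8px rgba(0,0,0,.2);z-index:1000";
  banner.textContent = "We use cookies. ";

  const accept = document.createElement("button");
  accept.textContent = "Accept";
  accept.addEventListener("click", () => banner.remove());
  banner.appendChild(accept);

  document.body.appendChild(banner);
}

// Only wire the banner up after a real user interaction (first scroll or
// click), so it never appears in the initially rendered page.
["scroll", "click"].forEach((evt) =>
  window.addEventListener(evt, showConsentBanner, { once: true })
);
```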

Keywords in SEO vs. natural language processing

02:15 – “I have a directive to use keywords, specifically target keywords, in meta tags, use it here in the h1, use it this many times in a piece of content. And that really just seems outdated to me, especially with all the advances in semantic search and all the cool MUM and all that other stuff that’s coming down the pipe […] Do you think that that’s still a legitimate SEO tactic, or should we not be focused on using this particular keyword this many times on a page?”

According to John, “(…) The number of times that you use a keyword on a page, I don’t think that really matters or makes sense. When you’re writing naturally, usually that resolves itself automatically. And also, with regards to the individual keywords, I think that’s something where I wouldn’t disregard it completely, but at the same time, I wouldn’t over-focus on exact keywords. So, in particular, things like singular and plural, or kind of like the different ways of writing individual words. That’s something that you probably don’t need to worry about”.

“But mentioning what your site is about and kind of what you want to be found for, that’s something I would still do. So, in particular, what we sometimes see when we look at things like news articles, if a news site doesn’t really understand SEO, they might write in a way that is more, I don’t know, almost like literature. You read it, and you kind of understand what it means, but the exact words that are used on a page don’t really map to exactly that topic.

So that’s something where, from an SEO point of view, if there’s something you want to rank for, I would still mention that on a page. I wouldn’t go overboard with the number of mentions. I wouldn’t go overboard with all of the synonyms and different ways of writing it, but mentioning it at least once definitely makes sense.”

URL removal tool and canonicalization

04:47 – “I have a question regarding the URL removal tool. So my question is, if you use that tool, does it only affect the canonical version of the URL, since I guess this is affecting the index you’re using for publishing your search results. So, does it only affect the canonical version, or does it affect the entire duplicate content cluster, which this canonical is a part of? So, for instance, if I write in a URL, which, in my opinion, is the one to be excluded, but is basically a non-canonical variant, what happens in that situation?”

John’s answer was, “We don’t consider the canonical at all when it comes to URL removals, but rather we match it one-to-one exactly as you submitted it. And we include the HTTP, HTTPS, and WWW, non-WWW versions of that URL.

And essentially, what happens is we don’t remove it from our index. So the whole indexing side stays the same. The crawling side stays the same. We just don’t show it in the search results”.
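In other words, a single removal request covers the protocol and www variants of exactly the path you submit. Purely to illustrate that description (Google's actual matching logic is not public), here is how those variants could be enumerated:

```typescript
// Illustrative only: expand a submitted URL into the protocol/host variants
// that John says a single removal request also covers.
function removalVariants(submitted: string): string[] {
  const url = new URL(submitted);
  const host = url.hostname.replace(/^www\./, "");
  const rest = url.pathname + url.search;
  return [
    `http://${host}${rest}`,
    `https://${host}${rest}`,
    `http://www.${host}${rest}`,
    `https://www.${host}${rest}`,
  ];
}

console.log(removalVariants("https://example.com/old-page?ref=1"));
// -> the four http/https x www/non-www combinations of that exact path
```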

Here are some articles for further reading:

If your pages are reported with the “Blocked by page removal tool” status in Google Search Console, read how the URL removal tool works in the wild and how to fix this issue.

Also, if you struggle with optimizing duplicate content and your canonical signals get ignored by Google, read our article on how to fix “Duplicate, Google chose different canonical than user.”

Duplicate above-the-fold content

06:30 – “My question relates to templated content above the fold from a mobile-first perspective […] Each of [our] topic landing pages, they’re unique URLs, unique titles, but they have that same header and just above the fold content […] Is it detrimental to non-brand rankings for those topic landing pages?”

John said, “So the important part for us is really that there is some amount of unique content in the above-the-fold area. So if you have a banner on top, and you have a generic URL image on top, that’s totally fine. But some of the above-the-fold content should be unique for that page. And that could be something like a heading that’s visible in a minimum case, but at least some of the above-the-fold content should be there.

[…] I mean, it’s probably also something where you want to look at how users interact with those pages afterwards. But that’s kind of more from a non-SEO perspective. But it’s always, I think, important for cases like that that you take a look and see what actually happens with the users afterward”.
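As a rough illustration of “some of the above-the-fold content should be unique”, a topic landing page template can keep the shared banner but render a page-specific heading and intro first. A hedged TypeScript sketch, with made-up interface and field names:

```typescript
// Sketch of a topic landing page template: the banner/navigation is shared
// across pages, but each page gets its own heading and intro above the fold.
interface Topic {
  slug: string;
  title: string;
  intro: string;
}

function renderLandingPage(topic: Topic, sharedBanner: string): string {
  return [
    sharedBanner,               // identical on every topic page
    `<h1>${topic.title}</h1>`,  // unique, visible above the fold
    `<p>${topic.intro}</p>`,    // unique intro copy for this topic
    // ...templated content shared with other pages can follow below the fold
  ].join("\n");
}

console.log(
  renderLandingPage(
    { slug: "running-shoes", title: "Running Shoes", intro: "Compare our range of running shoes." },
    "<header>Shared site banner</header>"
  )
);
```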

A/B testing and cloaking

22:29 – “I know Googlebot doesn’t like cloaking on a site, but that’s essentially what happens when you use Google Analytics A/B testing, for example, because it reaches the content underneath (…) when it does the A/B testing. So anything that is optimized for SEO cannot use Google Analytics A/B testing. Is that right?”

John replied, “Kind of, kind of. So the important part for us with A/B testing is that the A/B testing is not a permanent situation and that the A/B testing is something where Googlebot essentially also falls into that A/B test. And essentially, we can kind of see what users are seeing. 

And when it comes to A/B tests, the important part for us is also that the purpose of the page remains equivalent. So if you’re A/B testing a landing page and you’re selling one product, then it shouldn’t be that, instead of selling a car, suddenly you’re selling a vacation, or a flight, or something like that. […] The cloaking side that is more problematic is if– I don’t know– you’re selling a car, and then when Googlebot looks at it, it’s showing a car. When a user looks at it, it’s going to a pharmacy. That’s usually kind of the more spammy cloaking that we worry about, where the webspam team would get involved.

And all of these subtle changes– also if you have device-specific changes on a page, that’s perfectly fine for us”.
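The practical takeaway is that variant assignment should not special-case Googlebot, and the test should not run indefinitely. A minimal sketch of deterministic bucketing that ignores the user agent entirely; the hash and variant names are illustrative and not how any particular testing tool works:

```typescript
// Minimal sketch: assign every visitor, including Googlebot, to a test
// variant using a deterministic hash of a visitor ID. Nothing here inspects
// the user agent, so the crawler falls into the test like any other user.
function assignVariant(visitorId: string, variants: string[] = ["A", "B"]): string {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple rolling hash
  }
  return variants[hash % variants.length];
}

// Both calls use the same rule; there is no special branch serving a
// different version to Googlebot, which is the kind of cloaking John
// describes as problematic.
console.log(assignVariant("visitor-123"));       // e.g. "B"
console.log(assignVariant("googlebot-session")); // bucketed by the same rule
```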

Length of content in ranking

29:23 – “How important is the length of the blog? We are following the guideline of 300 plus words, and recently, I’ve read in many places that Google favors long-form content. So maybe we are missing on that front? We are writing shorter blogs?”

John replied, “We don’t use the word count at all. So the number of words in your articles, that’s totally up to you. I think some people like to have a guideline with regards to the number of words, but that’s really an internal guideline for you, for your authors. It’s not something that we would use for SEO purposes”.

Disavowing spammy links

30:06 – “I have a question about Google Search Console. My website has around 120 external links, and about 40% of them are non-working Japanese domains. I have no idea where they came from, and what do I have to do with them?”

“Probably, you don’t need to do anything with them. If these are just random links from the internet, I would just ignore them. It’s not specific to Japanese links, but sometimes spammers include normal URLs within the URLs that they use to promote spam, and that means on random forums and blogs, they will drop these URLs, as well. And sometimes that ends up with a lot of links that are posted in non-English or in foreign-language content. And I’ve seen that a lot with Japanese, Chinese, Arabic, all kinds of languages”.

“If these are not links that you placed, that you bought […] then I would just ignore them.”

Need help with disavowing spammy backlinks?

Take advantage of link risk management to reevaluate your backlink strategy.

Separate landing pages for separate physical locations

33:17 – “Let’s say I have 500 physical shops that are selling my products, and I want to create, for each of them, a specific landing page. Would this be considered doorway pages?”

John said, “No, that would be essentially fine. It’d be kind of like having different products, because these are unique locations. These are physical locations. Having individual pages for them is perfectly fine. Sometimes it might make sense to combine these and put them on a shared page, such as if you have a lot of shops in specific countries. Maybe just list the shops there instead of individual pages per shop. But that’s totally up to you”.

Site structure and indexing issues

34:31 – “How might a poor site structure affect indexing? Some of my older articles are getting de-indexed, but these were articles on pages eight to nine of my blog. I’ve since created static category pages that make it easier for Google to keep track and find these pages. But what could be the indexing issues here?”

“It’s certainly the case that we don’t index all content of all websites on the web, so at some point, it can happen that we say, oh, these pages are not critical for the web […] So maybe we will drop them from our index so that we can focus more strongly on the important pages within your website. And that’s something where we use the internal site structure quite a bit to try to figure that out. If we understand that your site is important and we see, within your website, that you’re telling us this part of my website is important for you, then we will try to focus on that. Whereas if you’re saying these pages are also on my website and they’re linked, like you mentioned here, like eight to nine pages away from essentially your home page, then we might say, well, you don’t think these are important, so maybe we won’t focus so much on them. We’ll focus on the other ones.

So that’s something where your internal site structure can help us a little bit to understand that, but it’s definitely also the case that we just don’t index everything on all websites.”

Is disavowing links still necessary?

36:14 – “How much of an impact is disavowing links in bulk going to make? We recently disavowed many backlinks on our site. They were all HTTP sites with low domain authority. Many of them were comments with a link back to our site. However, we haven’t seen any positive improvement. There are currently no manual actions against us. Is disavowing not necessary these days?”

“Probably, you could save yourself the effort of disavowing those links. So probably, they wouldn’t have any effect at all. I would mostly use a disavow links tool really for cases where you either have a manual action, or where you look at what you’ve done in the past, and you realize probably I will get a manual action if anyone from Google sees this.

And in those cases, I would go off, and try to disavow that and try to clean that up. But just generally, links from low domain authority sites or links that essentially are part of the web since years and years, I would not disavow those. I don’t think that makes any difference at all.”

Pagination and breadcrumbs in ranking

37:55 – “Does Googlebot still heed pagination and breadcrumbs, or does it affect the ranking? What’s the best practice?”

John said, “We use pagination and breadcrumbs as a way of understanding the site’s internal structure a little bit better. So that’s something that definitely still plays a role with regards to crawling and indexing a site. So if you have pagination and breadcrumbs on your site and you remove that, then that does mean that the internal structure of the site is now different.

A good way to kind of double-check how those factors play a role within your website is to use an external crawler, some kind of a tool that you can run across your website to kind of crawl the website the way that it is now, and then you could analyze from there, are these breadcrumb links the reason why these pages are being crawled or not? And then based on that, you can make decisions on whether or not to remove them, or to change them, or however you want to kind of modify things within your website.”
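One way to run the kind of check John describes is to export the link graph from a crawl and compute how many clicks away each page sits, with and without the links you are considering removing. A small TypeScript sketch over a made-up link graph:

```typescript
// Sketch: compute click depth from the home page over a link graph, e.g.
// one exported from a site crawler. Pages the search never reaches would
// only be discoverable through links you removed (breadcrumbs, pagination).
type LinkGraph = Record<string, string[]>;

function clickDepths(graph: LinkGraph, start: string): Map<string, number> {
  const depths = new Map<string, number>([[start, 0]]);
  const queue: string[] = [start];
  while (queue.length > 0) {
    const page = queue.shift()!;
    for (const target of graph[page] ?? []) {
      if (!depths.has(target)) {
        depths.set(target, depths.get(page)! + 1); // one click deeper
        queue.push(target);
      }
    }
  }
  return depths;
}

// Hypothetical graph: without breadcrumb links, /blog/old-post/ is only
// reachable through deep pagination pages.
const graph: LinkGraph = {
  "/": ["/blog/", "/products/"],
  "/blog/": ["/blog/page/2/"],
  "/blog/page/2/": ["/blog/old-post/"],
  "/products/": [],
};
console.log(clickDepths(graph, "/")); // Map { '/' => 0, '/blog/' => 1, ... }
```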

Setting up 301 redirects and timing

39:05 – “In a scenario where a group of URLs have been changed, but for some reason, the 301 redirects have not been set up right away, roughly how long is the time frame that you have to implement redirects to transfer the ranking authority from the old to the new pages and prevent ranking drops?”

Here’s what John said: “So it’s tricky, because there is no specific time for this, especially because there are different variations of this kind of problem situation that you have here. In particular, if the old content still exists and you’ve created a copy of that on a new URL, then in a case like that, we will treat those two URLs as being a part of the same cluster, and we’ll try to pick a canonical URL between those two URLs.

And it can happen that we switch over to your new URL for that. And if that’s the case, then, essentially, we will forward all of the signals from the old URL to the new URL automatically, even without a redirect in place. So in that scenario, you will probably not see a big difference if, at some point, later on, you add a redirect.

The main difference you would see is that it would be a lot clearer for us that you want the new URLs to be indexed and not the old URLs. So in that setup, you probably wouldn’t see a ranking change, but probably you would see that we would switch over to the new URLs a little bit more consistently.

In a situation where you delete the old URLs and just add the same content somewhere else on your website, then that’s something where we would essentially, in a first step, lose all of the information we would have about this page because suddenly it’s a 404. And we would treat the new page as being something new. And we would essentially say, well, there’s a new page here, and we would not have any connection between the old page and the new page.

Do you have 404 pages on your website?

Read our article on how to fix the “Not found (404)” status in Google Search Console.

And that’s something where, at some point, we will drop the old page from our index and lose all of those signals. And if you wait too long and add a redirect much later, then those signals are already gone and that redirect is not going to forward anything anymore.

So that’s kind of– in that situation, where if you delete things and just move it somewhere else, then probably after a certain period of time– I don’t know how long that would be. Depends on the website– you would not see any improvement from adding redirects.

And in a case like that, it would, from my point of view, still make sense to start adding redirects there, just so that you’re sure that, if there is any small value still associated with those old URLs, then at least that is still forwarded over. So those are kind of the main scenarios there”.
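For completeness, setting up those redirects usually comes down to mapping each old URL to its new one and answering with a 301 status. A minimal Node/TypeScript sketch, with placeholder paths and port that are not from the episode:

```typescript
import { createServer } from "node:http";

// Hypothetical mapping of moved URLs; in practice this would be generated
// from your CMS or a migration spreadsheet.
const redirects: Record<string, string> = {
  "/old-category/old-post/": "/new-category/new-post/",
  "/old-about/": "/about/",
};

createServer((req, res) => {
  const target = redirects[req.url ?? ""];
  if (target) {
    // Permanent redirect, so signals for the old URL are forwarded
    // to the new one.
    res.writeHead(301, { Location: target });
    res.end();
    return;
  }
  res.writeHead(404, { "Content-Type": "text/plain" });
  res.end("Not found");
}).listen(8080);
```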

Free listings on Google Shopping

42:36 – “Some insight into the optimization of free listings on Google Shopping? If we manually edit a description and title to match keywords that we’re interested in positioning our listings for, will we be penalized if these keywords do not appear on our site? For example, edit the free listing to include the word ‘low cost’ or ‘cheap’ in the description of a product like a gold-plated ring, where the words are not referenced on the reference domain for that listing.”

John admitted he wasn’t sure. “My feeling– or my understanding is we do try to map the landing page with the products that you have in your Merchant Center feed. And if they don’t align, then we will have to make a call with regards to which of these versions we actually use.

So that’s something where I would generally recommend trying to make sure that these two versions align as much as possible so that you don’t give Google kind of this situation where you’re saying, this data doesn’t really match.”

Size of anchor text

44:12 – “Question about two different anchor texts to a unique URL. Which anchor link does Google value? Does Google value the size of the anchor text– or size that the anchor text takes up on the screen?”

John’s answer was: “I don’t think we have that defined at all. So those are the kind of things where, from our systems, it can happen that we pick one of these, and we try to understand which one is the most relevant one. But I wouldn’t make the assumption that we just naively take the first one on a page and just only use that, or only take the one that has the longest anchor text and ignore the other ones.

We essentially try to understand the site structure the way that a user might understand it and take that into account. And the way that this ambiguous situation is handled, that can also vary over time”.

Getting into the Top Stories carousel

45:12 – “We’re a news site, and we’re thinking about implementing AMP, but Google announced that AMP is not necessary to rank in the Top Stories carousel, and the AMP badge will be removed. So my understanding is that we need to focus on Core Web Vitals and fine-tune our website to make it fast and create high-quality content. Can you give us some more information on getting into Top Stories carousel?”

John said, “So yes, we did announce that AMP is no longer required for the Top Stories carousel. And instead, we will be focusing on things like the Core Web Vitals and the Page Experience factors to try to understand which pages we should be showing there. I think the important part with AMP is that it’s a really easy way to make pages extremely fast and to kind of make sure that you’re in an easy way, almost by default, achieving the metrics for the Core Web Vitals.

So that’s something where, if you’re trying to make your pages fast and you don’t know which framework to use, and maybe AMP has a good approach there, maybe that’s also something where you can take individual elements out of AMP and just reuse those on your pages, and then, over time, migrate more of your pages to the AMP framework. But I would see it more as a framework than a feature that you have to turn on or off.

So with that in mind, there are other ways you can make your pages really fast. You don’t have to use AMP, but sometimes using AMP is an easy way to do that, especially if you have something like WordPress. If your site is built on WordPress and you can just enable the AMP plugin, then sometimes that can kind of automatically shift your site over into the good side of the Core Web Vitals”.

Need some help with your Core Web Vitals? Explore our Core Web Vitals optimization services, and contact us.