Popular Websites that May Fail in Mobile First Indexing

Google is enabling mobile-first indexing for more and more websites. Many webmasters have already received notifications in their Google Search Console.

All of this means that Google will look at the mobile version of a website, and that version will be indexed and used for ranking.

“Mobile-first indexing is a matter of when, not if. It will happen to your site sooner or later, and you need to be ready for it [emphasis mine],” wrote Barry Adams in his Mobile SERP survival: Technical SEO checklist guide.

Below, I am presenting some examples of popular websites (ToysRus, IKEA, Gearbest, AliExpress, Live.com) that may lose their rankings if Google enables mobile-first indexing for them.

When Mobile Googlebot Can’t Enter the Second Page of Pagination

I’ve found some websites (IKEA, AliExpress, Gearbest, ToysRus.ca) that let mobile Googlebot visit only the first page of pagination. In some cases, mobile Googlebot can access as little as 1% of the products!

All of the websites above seem to have a proper link structure on desktop, and Google has no trouble following it. In contrast, their mobile versions implement pagination in a way that makes the deeper pages impossible for Google to discover. If Google enables mobile-first indexing for websites like IKEA, AliExpress, or Gearbest, they will suffer.

The risk: some product pages will be deindexed or ranked low.

To let Google index your products and rank them high, you should ensure there are internal links pointing to them. Simple enough?

But it’s not always the case…

When you open the websites listed above on your cell phone and want to see more products, you can just click the “Load more” button.

However, mobile Googlebot CAN’T do this on the websites above. It simply can’t access the products from the second page of pagination onward. Here’s why: on their mobile versions, these websites require user interaction (a click or scroll) to fetch the next batch of products from the server. If you inspect the rendered DOM, you will notice there are no links to the next page of pagination that Google could follow.

It works like this:

  1. Users click on the “Load more” button
  2. Then the next set of products is obtained from the server
  3. Finally, users see these products.

But Googlebot is a “lazy” user. It doesn’t scroll or click. So if you require user interaction (user click, scroll) and don’t provide any links, Googlebot simply won’t be able to access products from the second page of pagination and beyond.
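In miniature, the problem is what URL discovery finds in the rendered DOM. The toy extractor below is an illustrative simplification (Googlebot parses pages properly, not with a regex), but it shows why a JS-only “Load more” button is a dead end while a plain link is not:

```javascript
// Crawlers discover new URLs from <a href> attributes in the rendered DOM.
// This naive regex-based extractor stands in for a real link parser:
function extractLinks(html) {
  return [...html.matchAll(/<a[^>]+href="([^"]+)"/g)].map((m) => m[1]);
}

// A JS-only "Load more" button exposes no URL at all:
console.log(extractLinks('<button class="load-more">Load more</button>'));
// → []

// The same control rendered as a real link is discoverable:
console.log(extractLinks('<a href="/phones?page=2">Load more</a>'));
// → ["/phones?page=2"]
```

The second page of products exists on the server in both cases; only in the second case does the rendered DOM contain a path to it.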

Is it Really Dangerous?

Yes. Let’s assume that an online retailer has 200 categories, with 100 products per category on average, and Google is able to see just the first 25 in each category. That means that out of 20,000 products, Google can access only 5,000 (25%). The AliExpress and Gearbest cases (as you will see later in the article) are even more extreme!
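The arithmetic, spelled out as a sanity check:

```javascript
// 200 categories × ~100 products each, with only the first 25 of each
// category reachable by mobile Googlebot:
const categories = 200;
const productsPerCategory = 100;
const visiblePerCategory = 25;

const totalProducts = categories * productsPerCategory;   // 20000
const visibleProducts = categories * visiblePerCategory;  // 5000

console.log(visibleProducts / totalProducts); // → 0.25
```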

These Kinds of Issues Are Difficult to Spot

If you’re looking at SEO organic visibility charts, you may not notice that something has gone wrong. It’s a little tricky. The most popular pages (such as the homepage and product listings) will still rank high. No magic here.

However, in such a case, 75% of product pages may either rank low or not be indexed at all. This will cause a drop in sales, but you may not spot it by looking solely at visibility charts.

An important note: Google may still access orphan pages via sitemaps (but it usually will not rank these pages high).


AliExpress

AliExpress is one of the most popular online marketplaces worldwide.

Risk: If Google enables mobile-first indexing for them, Google will see only the first ~20 products per category.

Navigate to the mobile version of Aliexpress.com and choose one of the categories. For instance: Mobile Phones.

There are 1,100 products in this category, but mobile Googlebot can see only 20 of them. That’s as low as 2%!

Why does mobile Googlebot see only 2% of the products on AliExpress? There are simply no links that Googlebot can follow to access beyond the first page of pagination.

The next page of products can be downloaded from AliExpress’s server if, and only if, somebody clicks on the “View more” button. As I mentioned earlier, Googlebot is a “lazy” user: it doesn’t click any buttons. AliExpress definitely needs to catch up here.

You may say, “Ok, but there are some subcategories/tags (like Android, 1920×1080, Dual SIM Cards), and Google can visit these to discover more products.”

Unfortunately not, as these links are also implemented as JavaScript events, and Google is unable to discover them.

So, if mobile-first indexing is enabled for AliExpress.com, Google will be able to see just a tiny fraction of their products.

Of course, AliExpress is not an exception here. There are many websites (like Ikea.com or Gearbest, one of AliExpress’s main competitors) that allow mobile Googlebot to see just the first page of pagination.

How to Implement Pagination and Lazy Loading Properly

It may sound obvious, but if you implement pagination or some sort of lazy loading, you have to make sure that Google can deal with it. There are plenty of ways to accomplish this.

How to Deal with a “Tap to See More” Button

  1. You can toggle visibility with CSS (all the products, from the first page of pagination to the last, are already placed in the DOM; you just make them visible on the screen). I don’t recommend this solution, though.
  2. You can use the traditional solution: clicking the “tap to see more” button navigates users to another page. This is fine as far as SEO is concerned; however, developers try to avoid it because it provides a worse user experience. The browser has to reload the entire page, which takes more time and consumes more data, a crucial concern for mobile users.
  3. You can implement a solution similar to the New York Post (see below).
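Option 1 from the list above can be sketched as follows. All product names, class names, and the rendering helper are illustrative assumptions; the point is that every product ships in the initial HTML, and “tap to see more” only removes a CSS class:

```javascript
// Render every product into the DOM, hiding the later "pages" with CSS.
// Googlebot sees all products without clicking – at the cost of a heavy
// initial payload, which is one reason this approach isn't recommended.
function renderAllProducts(productNames, visibleCount) {
  return productNames
    .map((name, i) =>
      // Items beyond the first page are hidden with CSS, not omitted:
      `<li class="product${i < visibleCount ? '' : ' hidden'}">${name}</li>`)
    .join('\n');
}

console.log(renderAllProducts(['Phone A', 'Phone B', 'Phone C'], 2));
```

The third item ships in the HTML with `class="product hidden"`; client-side JS would simply strip the `hidden` class on tap, with no extra request to the server.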

How is it implemented in the New York Post?

Open https://nypost.com/business/page/4 and scroll down to the bottom of the page. Then click on the “See more stories” button.

You can see that:

  1. Users don’t need to reload the page to see more news (which is good for the user experience).
  2. Googlebot gets proper signals: plain links pointing to the next page of the pagination.

It’s a creative and interesting solution. Beneficial for both users and Google.
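The pattern can be sketched as follows. The URL scheme mirrors nypost.com’s /page/N paths, but the helper function and the handler wiring described in the comments are illustrative assumptions, not the site’s actual code:

```javascript
// Each "See more stories" control is a real <a href> to the next paginated
// URL; JavaScript merely enhances it. This helper derives that next URL.
function nextPageUrl(currentUrl) {
  // e.g. /business/page/4 -> /business/page/5
  return currentUrl.replace(/\/page\/(\d+)$/, (_, n) => `/page/${Number(n) + 1}`);
}

console.log(nextPageUrl('https://nypost.com/business/page/4'));
// → https://nypost.com/business/page/5

// In the browser, a click handler would preventDefault(), fetch this URL,
// and append the new stories – users stay on the page, while Googlebot
// simply follows the ordinary link.
```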

I strongly recommend you take a look at the approach demonstrated by John Mueller a couple of years ago. Infinite scroll was implemented, but there are additional links to the subsequent pages of pagination that are apparent both for users and search bots.


How to Deal with Infinite Scrolling

If you’re implementing infinite scrolling, you can use the pushState function from the HTML5 History API and add link rel="next" annotations to point to the next URLs of the pagination.

However, the latest observation from Kyle Blanchette of Botify seems to show that using rel="next" alone is a weak signal for Google and should be strengthened by traditional link tags.



The ideal solution is to combine rel="next" with plain, crawlable links.

Edit: John Mueller has now confirmed that in such situations, the “Make Sure Google Can See Lazy-loaded Content” guide from Google’s documentation is helpful.
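Putting the pieces together, here is a hedged sketch of the markup signals for one page of an infinitely scrolling category. The URL pattern, base path, and function name are assumptions for illustration:

```javascript
// Emit both kinds of signal for page N of a paginated category:
// rel="prev"/"next" hints plus the traditional plain <a href> link,
// which is the stronger signal per the observations above.
function paginationSignals(basePath, page, lastPage) {
  const tags = [];
  if (page > 1) {
    tags.push(`<link rel="prev" href="${basePath}?page=${page - 1}">`);
  }
  if (page < lastPage) {
    tags.push(`<link rel="next" href="${basePath}?page=${page + 1}">`);
    // The crawlable fallback link – keep it in the body even though users
    // normally reach the next products by scrolling:
    tags.push(`<a href="${basePath}?page=${page + 1}">Next page</a>`);
  }
  return tags;
}

console.log(paginationSignals('/mobile-phones', 2, 10));
```

As the user scrolls, the browser-side code would additionally call history.pushState() to switch the address bar to these same ?page=N URLs, so each scroll position corresponds to an indexable URL.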

Less Content on Mobile than on Desktop

Let’s move to another category of issues: less content on mobile. There are plenty of websites that have virtually no content on mobile.

Live.com is a great example.

Mobile users (and Googlebot Mobile) can see only Outlook’s logo, a Sign In button, links to Google Play, and a Sign Up page. No real content here.

Of course, their desktop website is more robust:

Is this a real issue?

Yes. If Live.com is switched to mobile-first indexing, Google will index its mobile version. As a result, Google will index a website with no real content, which may ultimately lose its organic visibility.

There are Many Websites that Make Similar Mistakes

Don’t get me wrong; this is not a list of shame. I just wanted to show you that even big brands are making mistakes. There are plenty of other websites with similar issues.

You can find many websites in which Google can’t access menu links or content that is hidden under tabs.

I wouldn’t blame their internal SEOs or their developers. Search engine optimization is becoming more complex every single day, and new challenges keep emerging. JavaScript SEO is a very hot topic these days: interesting, yet complicated.

Internal SEOs may struggle with topics like JavaScript SEO or crawl budget optimization. While they are great specialists, they usually don’t specialize in this particular niche.

I wrote an article illustrating why popular websites like Giphy, Pinterest, and Instagram lost their organic traffic. The issues facing those sites, and the problems discussed here, I believe, illustrate why it is worth investing in technical SEO services.