On November 30th, 2021, Bartosz Góralewicz and Tomasz Rudzki sat down with Fabrice Canel, the Principal Program Manager at Bing, to discuss one of the biggest challenges search engines face right now: how to keep crawling and indexing a web that is constantly getting bigger and heavier.
If you want to know how the whole conversation went, you can watch the webinar here.
In this article, I want to present the most exciting insights and announcements from the webinar.
IndexNow helps optimize the crawl budget
The conversation started with a popular topic in the SEO industry right now – what problem is IndexNow trying to solve?
Search engines have limited insight into what's happening on the internet, so they need to keep recrawling pages to discover whether there's any new or updated content. As a result, they may waste crawl budget on low-quality pages or on pages that haven't changed at all.
Search engines don’t know when you’re going to post your next blog article – but you do! Notifying them about new content on your site brings mutual benefits – your content can appear quicker in SERP, and search engines can reduce the amount of resources they need to crawl and index a website.
Pinging search engines is not a new idea. A few years ago, webmasters could submit URLs to search engines directly. The problem was that it was easy to spam search engines with unnecessary submissions, so the mechanism needed improvement.
Compared to the old solution with a simple ping sent to the search engines, IndexNow adds a layer of trust. You need to generate an API key and host it on your website. This way, search engines can verify that the submitted URL belongs to you, minimizing the noise in the resulting data.
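In practice, the protocol works by hosting a text file containing your key at your site's root and then sending changed URLs to an IndexNow endpoint. Here is a minimal sketch in Python using only the standard library; the domain and key values are placeholders you would replace with your own:

```python
import json
import urllib.request

# Hypothetical values -- substitute your own domain and generated API key.
# The key itself must also be hosted at https://<HOST>/<KEY>.txt
HOST = "example.com"
KEY = "a1b2c3d4e5f64a1b2c3d4e5f64a1b2c3"


def build_payload(host: str, key: str, urls: list[str]) -> bytes:
    """Build the JSON body for a batch IndexNow submission."""
    return json.dumps({
        "host": host,
        "key": key,
        # Search engines fetch this file to verify the submitted
        # URLs really belong to you.
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,
    }).encode("utf-8")


def submit_urls(urls: list[str]) -> int:
    """POST changed URLs to the shared IndexNow endpoint.

    Returns the HTTP status code (200 means the submission
    was accepted).
    """
    req = urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=build_payload(HOST, KEY, urls),
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

A submission to the shared endpoint is forwarded to all participating search engines, so one notification reaches Bing and the other IndexNow partners at once.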
Sitemaps are not enough
If you're wondering why well-optimized sitemaps aren't enough to support crawling and why we still need IndexNow, the difference is that sitemaps have to be pulled and re-parsed by search engines, while IndexNow lets you push time-sensitive content to them the moment it changes. This matters most for news websites, but any website benefits from getting new content indexed as quickly as possible.
Additionally, sitemaps are great for listing URLs but not so much for signaling content updates; the lastmod tag is often abused or set incorrectly.
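For reference, lastmod should hold a valid W3C datetime and should change only when the page content genuinely changes. A hypothetical well-formed sitemap entry (URL and date are illustrative) looks like this:

```xml
<url>
  <loc>https://example.com/blog/indexnow-webinar</loc>
  <!-- Update this value only when the page content actually changes,
       not on every sitemap regeneration. -->
  <lastmod>2021-11-30</lastmod>
</url>
```

Regenerating lastmod on every build is one of the common mistakes that makes the tag untrustworthy to search engines.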
IndexNow is less noisy than sitemaps. Fabrice stated that the data coming from IndexNow is very promising and has proven to guide bots to high-quality content while minimizing crawling.
One thing to remember: if you submit URLs that haven't actually changed, Bing may lose trust in your IndexNow submissions and stop crawling the pages you submit.
All this doesn't mean you should stop using XML sitemaps. Fabrice pointed out that sitemaps work particularly well when paired with IndexNow; it's not a matter of choosing one or the other.
Cloudflare is now supporting IndexNow
During the webinar, Fabrice announced that Cloudflare is now supporting IndexNow.
Cloudflare has a feature called Crawler Hints, which provides search engines with data about changes in your content. Crawler Hints now supports IndexNow and can directly notify search engines about any new, updated, or deleted content via this new protocol.
All you need to do is enable Crawler Hints on your website with one simple click, and Cloudflare will take care of the rest! You can find more information on how to turn on Crawler Hints here.
IndexNow doesn’t give a ranking boost
Bartosz asked Fabrice why many domains that started using IndexNow now seem to get more traffic from Bing. The answer: IndexNow doesn't give you any ranking boost.
However, with the amount of Internet content that’s being continuously created, the sooner you get indexed, the more chances you have to appear higher in SERPs.
This is especially important for sites with news sections, where being indexed as soon as possible is crucial. If search engines are late with crawling and indexing your content, you might fall behind your competition.
That’s why websites that use IndexNow might do better in SERPs. Webmasters know best when their content has changed and can notify search engines to crawl and index the page quickly.
IndexNow frees up resources needed for rendering
As Fabrice mentioned in the webinar, rendering is vital for search engines to understand content, which is why Bing strives to render as much of the web as possible. Rendering is also resource-intensive, so the crawling that IndexNow saves frees up resources that can be spent on rendering more pages.
Future of search engine collaboration
In the past, search engines collaborated on essential initiatives like schema.org or the robots.txt protocol.
During the webinar, Fabrice was asked about potential areas where such cooperation may occur in the near future. He pointed to the issue of hreflang tags and how they are responsible for generating trillions of redundant URLs that generate waste for search engines and the internet overall.
Fabrice said it’s his desire to work with other search engines on finding new ways to target international markets that would also allow search engines to consolidate signals to a single page for one language. This would have to involve a solution for website owners to direct users of the same language to content aimed at specific markets (e.g., Spanish speakers in Argentina and Uruguay would land on the same page and then be directed to content specific to Argentina and Uruguay).
Fabrice summed up that there should be no need to optimize your website for specific search engines in the future. Instead, search engines should follow common protocols and create solutions to improve the current limitations and benefit the industry.