If you build or manage a business website, your typical workday probably involves monitoring the website’s performance, reviewing various KPIs, and comparing how it’s doing against the competition.
1. You update your website with new products
Every day, you probably add many new pages to your website. Whether these are product pages, articles or news, they are probably fairly similar:
- There’s the main body containing text, photos and maybe a video,
- There are links to related pages,
- There is a comment or review section.
After your new pages are published, all you have to do is eagerly wait for Google to discover them and start showing them to potential customers.
2. Google crawls your website without seeing all the links
This is where things may go downhill.
Google uses complex software known as Googlebot. It’s a system of algorithms that discover new links on the internet and follow them to crawl the pages that they point to. Once Googlebot knows what the page is about, the page is sent to Google’s index.
This sounds fairly simple, right? But consider the scale: millions of new links need to be followed every day, and pages that have already been indexed also need to be recrawled whenever their content is updated.
Due to limited resources, the search engine sets priorities for its crawling algorithms and must determine the number of pages to visit for every domain.
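The effect of a per-domain page limit can be sketched with a toy crawler. The site structure and URLs below are hypothetical, purely for illustration: a breadth-first crawl that stops when its budget runs out will reach shallow pages but may never get to deeper product pages.

```python
from collections import deque

# Toy link graph standing in for a small site: page -> pages it links to.
# (Hypothetical URLs, for illustration only.)
SITE = {
    "/": ["/category", "/about"],
    "/category": ["/product-1", "/product-2", "/product-3"],
    "/about": [],
    "/product-1": ["/product-2"],
    "/product-2": [],
    "/product-3": [],
}

def crawl(start, budget):
    """Breadth-first link discovery that stops once the page budget is
    spent, mimicking a crawler with limited per-domain resources."""
    visited = []
    queue = deque([start])
    seen = {start}
    while queue and len(visited) < budget:
        page = queue.popleft()
        visited.append(page)
        for link in SITE.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return visited

print(crawl("/", budget=3))   # only the shallow pages are reached
print(crawl("/", budget=10))  # the whole site fits within the budget
```

With a budget of 3, none of the product pages are ever visited, even though they are only two clicks from the homepage.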
Google defines these boundaries as the crawl budget – the number of URLs Googlebot can and wants to crawl.
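You can get a rough sense of how Googlebot spends that budget on your site by counting its requests in your server access logs. Here is a minimal sketch, assuming logs in the common combined format; the log lines and URLs below are made up for illustration.

```python
import re
from collections import Counter

# Sample access-log lines in combined log format (fabricated for this sketch).
LOG = """\
66.249.66.1 - - [10/May/2024:06:25:01 +0000] "GET /product-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [10/May/2024:06:25:02 +0000] "GET /product-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"
66.249.66.1 - - [10/May/2024:06:26:11 +0000] "GET /category HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

def googlebot_hits(log_text):
    """Count requests per URL whose user agent claims to be Googlebot.
    (A real audit should also verify the client IP via reverse DNS,
    since the user-agent string alone can be spoofed.)"""
    hits = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        match = re.search(r'"(?:GET|POST) (\S+) HTTP', line)
        if match:
            hits[match.group(1)] += 1
    return hits

print(googlebot_hits(LOG))
```

Aggregating these counts over a few weeks shows which sections of your site Googlebot actually spends its budget on.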
Limited resources mean that Googlebot prefers to first discover content that is easily accessible.
Because of the restrictions of the crawl budget, you shouldn’t expect your newly added pages to be fully rendered by Googlebot.
3. Googlebot only crawls part of the domain without finding products (valuable content)
If it’s a product page, Googlebot doesn’t discover the product description or the photos; if it’s an article, it can’t see the full text.
To summarize, when Googlebot can’t discover the most important parts of your website, it will spend the entire crawl budget crawling through low-quality content instead. If that happens, you shouldn’t be surprised that Google won’t recognize the value you provide for the users.
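A quick sanity check is to look at the raw HTML your server returns and see whether the important content is already there, or only appears after JavaScript runs. The two page sources below are hypothetical examples: one ships the product description in the initial HTML, the other injects it client-side.

```python
# Simplified page sources (fabricated for illustration): one is
# server-rendered, the other relies entirely on a JavaScript bundle.
SERVER_RENDERED = """
<html><body>
  <h1>Blue Widget</h1>
  <p class="description">A durable widget, available in three sizes.</p>
</body></html>
"""

CLIENT_RENDERED = """
<html><body>
  <div id="root"></div>
  <script src="/bundle.js"></script>
</body></html>
"""

def content_in_raw_html(html, marker):
    """Return True if the marker text is already present in the raw HTML,
    i.e. visible to a crawler even if it never executes JavaScript."""
    return marker in html

print(content_in_raw_html(SERVER_RENDERED, "durable widget"))  # True
print(content_in_raw_html(CLIENT_RENDERED, "durable widget"))  # False
```

In practice you would fetch the page with a plain HTTP client (no JS execution) and compare it against the rendered DOM in your browser; if the description only exists in the rendered version, a crawler with no rendering budget left will never see it.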
4. Google is confused and the crawl budget falls
5. The crawl budget is too low to render JS
This is where we come full circle. With the crawl budget already depleted, Googlebot doesn’t have the resources left to render your JavaScript, so your valuable content stays invisible.
As a consequence, your crawl budget is lowered even further.
How can you break the cycle?
Here’s another cycle to consider:
- With a low crawl budget, your website may become more difficult to crawl.
- Entire pages from your website may not get discovered and indexed for weeks or even longer.
- Your customers may never discover your new content in the search results.
- You may lose money.
From the technical SEO perspective, it’s painful to watch businesses continue to spend money on online advertising while their websites aren’t even fully indexed in Google.
If you think your website is trapped in a vicious cycle, Onely can provide your team with the solutions your website needs to break free.