In a recent statement on LinkedIn, Google Analyst Gary Illyes shared his mission for the year: to figure out how to crawl the web even less.
This comes on the heels of a Reddit post discussing the perception that Google is crawling less than in previous years.
While Illyes clarifies that Google is crawling roughly the same amount, he emphasizes the need for more intelligent scheduling and a focus on URLs that are more likely to deserve crawling.
Illyes’ statement ties into the ongoing discussion among SEO professionals about “crawl budget,” the idea that search engines allot each site a limited number of pages they will crawl per day, and that sites must stay within that allotment to get their pages indexed.
However, Google’s Search Relations team recently debunked this idea in a podcast, explaining that Google’s crawl prioritization is driven by factors such as search demand and content quality.
Crawling Prioritization & Search Demand
In a podcast published two weeks ago, Illyes explained how Google decides how much to crawl:
“If search demand goes down, then that also correlates to the crawl limit going down.”
While he didn’t define “search demand” precisely, it likely refers to the volume of queries Google sees for a given topic. In other words, if searches for a particular topic decline, Google may have less reason to crawl websites related to that topic.
Illyes also emphasized the importance of convincing search engines that a website’s content is worth fetching.
“If you want to increase how much we crawl, then you somehow have to convince search that your stuff is worth fetching, which is basically what the scheduler is listening to.”
Although Illyes didn’t elaborate on how to achieve this, one interpretation could be to ensure that content remains relevant to user trends and stays up to date.
Focus On Quality
Google previously clarified that a fixed “crawl budget” is largely a myth.
Instead, the search engine’s crawling decisions are dynamic and driven by content quality.
As Illyes put it:
“Scheduling is very dynamic. As soon as we get the signals back from search indexing that the quality of the content has increased across this many URLs, we would just start turning up demand.”
The Way Forward
Illyes’ mission to improve crawling efficiency, by crawling less and sending fewer bytes over the wire, is a step towards a more sustainable and practical web.
To that end, Illyes invites suggestions from the community: internet drafts or standards from the IETF or other standards bodies that could contribute to this effort.
“Decreasing crawling without sacrificing crawl-quality would benefit everyone,” he concludes.
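Illyes didn’t name specific mechanisms, but one long-standing HTTP feature that already cuts bytes on the wire is the conditional request (RFC 9110): when a crawler sends an If-Modified-Since header and the page hasn’t changed, the server can answer 304 Not Modified with an empty body instead of re-sending the full HTML. Here’s a minimal sketch of that behavior in Python with Flask; the route, timestamp, and page body are hypothetical.

```python
# Minimal sketch: answer 304 Not Modified to conditional requests so
# unchanged pages cost almost no bytes on the wire.
from flask import Flask, request, make_response
from email.utils import format_datetime, parsedate_to_datetime
from datetime import datetime, timezone

app = Flask(__name__)

# Hypothetical page data; a real site would look this up per URL.
PAGE_LAST_MODIFIED = datetime(2024, 5, 1, tzinfo=timezone.utc)
PAGE_BODY = "<html><body>Example article</body></html>"

@app.route("/article")
def article():
    # If the crawler's cached copy is still current, answer 304 with an
    # empty body instead of re-sending the full HTML.
    ims = request.headers.get("If-Modified-Since")
    if ims:
        try:
            if parsedate_to_datetime(ims) >= PAGE_LAST_MODIFIED:
                return "", 304
        except (TypeError, ValueError):
            pass  # Malformed date header: fall through and send the page.

    resp = make_response(PAGE_BODY)
    # Advertise the modification time so clients can revalidate next time.
    resp.headers["Last-Modified"] = format_datetime(PAGE_LAST_MODIFIED, usegmt=True)
    return resp
```

A 304 simply tells the crawler its cached copy is still current, sparing both sides the transfer when nothing has changed.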
Why SEJ Cares
Illyes’ statement on reducing crawling reinforces the need to focus on quality and relevance. SEO isn’t just about technical optimizations but also about creating valuable, user-centric content that satisfies search demand.
By understanding the dynamic nature of Google’s crawling decisions, we can all make more informed choices when optimizing our websites and allocating resources.
How This Can Help You
With the knowledge shared by Illyes, there are several actionable steps you can take:
- Prioritize quality: Focus on creating high-quality, relevant, and engaging content that satisfies user intent and aligns with current search demand.
- Keep content current: Regularly update and refresh your content to ensure it remains valuable to your target audience.
- Monitor search demand trends: Adapt your content strategy to address emerging trends and topics, ensuring your website remains relevant and worthy of crawling.
- Implement technical best practices: Ensure your website has a clean, well-structured architecture and a robust internal linking strategy to facilitate efficient crawling and indexing (one concrete example is sketched after this list).
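On that last point, honest freshness signals are one concrete way to help crawl schedulers spend requests wisely. Below is a minimal sketch, in Python, of generating an XML sitemap whose lastmod values reflect real modification dates; the page list is hypothetical, and a real site would pull it from a CMS or database. Google has said it uses lastmod only when the values are consistently accurate, so stamping every URL with today’s date defeats the purpose.

```python
# Minimal sketch: an XML sitemap with honest <lastmod> values, giving
# crawl schedulers an accurate signal about which URLs changed recently.
from datetime import date
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical (url, last-modified) pairs; a real site would pull these
# from a CMS or database.
PAGES = [
    ("https://example.com/", date(2024, 5, 20)),
    ("https://example.com/blog/crawl-scheduling", date(2024, 5, 18)),
    ("https://example.com/about", date(2023, 11, 2)),
]

def build_sitemap(pages):
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url, last_modified in pages:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
        # Use the real change date; stamping today's date on every URL
        # makes the signal useless to the scheduler.
        SubElement(entry, "lastmod").text = last_modified.isoformat()
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap(PAGES))
```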
As you refine your SEO strategies, remember the key takeaways from Illyes’ statements and the insights Google’s Search Relations team provided.
With these insights, you’ll be equipped to succeed if and when Google reduces crawling frequency.
Featured Image: Skorzewiak/Shutterstock