Google Crawls Sitemap File Based On Update Frequency

Apr 21, 2010 • 8:06 am | Filed Under Google Search Engine Optimization

A Google Webmaster Help thread asks how often Google crawls your XML Sitemap file. Google's JohnMu replied that it depends on how often the file is updated.

John said:

Google's Sitemaps crawler usually reacts to the update frequency of your Sitemap files. If we find new content there every time we crawl, we may choose to crawl more frequently. If you can limit updates of the Sitemap files to daily (or whatever suits your sites best), that may help. Similarly, if you create a shared Sitemap file for these subdomains, that could help by limiting the number of requests we have to make for each subdomain -- you could let us know about the Sitemap file by mentioning it in your robots.txt file using a "Sitemap:" directive (the Sitemap file does not have to be on the same host or domain as the site itself). If we're generally crawling your sites too frequently, you can also set the crawl rate in Webmaster Tools for those sites.

In short, John said "Google's Sitemaps crawler usually reacts to the update frequency of your Sitemap files."
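John also mentions that you can point Google at a Sitemap from robots.txt, and that the Sitemap file doesn't even have to live on the same host. As a rough illustration (the hostnames and paths here are made up for the example), a robots.txt entry on a subdomain might look like:

```
# robots.txt on blog.example.com
# The "Sitemap:" directive may reference a file on a different host,
# e.g. one shared Sitemap covering several subdomains.
User-agent: *
Allow: /

Sitemap: https://www.example.com/shared-sitemap.xml
```

Using one shared Sitemap file this way cuts down the number of Sitemap requests Google has to make for each subdomain, which is exactly the point John raises in the quote above.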

This is pretty much how Google decides how often to crawl your pages in general. If you update your pages frequently and you have a normal PageRank, Google will likely crawl your web site more often.
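Since crawl frequency reacts to whether Google finds new content on each visit, it can help to only change the Sitemap (and its `<lastmod>` dates) when pages actually change. A minimal Sitemap entry, with placeholder URL and date, looks like:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical URL; update <lastmod> only when the page really changes -->
    <loc>https://www.example.com/some-page.html</loc>
    <lastmod>2010-04-20</lastmod>
  </url>
</urlset>
```

If the file itself only changes daily (or whatever suits your site), per John's advice, that sets a sensible update cadence for Google's Sitemaps crawler to react to.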

I should add that more crawling of your web site does not necessarily mean higher rankings in Google. It may mean fresher content from your site is included in Google, but it doesn't mean you will rank higher.

Forum discussion at Google Webmaster Help.
