Google Crawl Rate Errors With Sitemap Indexing

Apr 27, 2011 • 8:16 am | Filed Under Google Search Engine Optimization
 

Over the past few days, there have been reports from webmasters that changing the crawl rate in Google Webmaster Tools has prevented Google from indexing their Sitemap files.

A Google Webmaster Help thread has reports of this issue dating back five days with several webmasters complaining thereafter.

The error webmasters are seeing reads:

We were not able to download your Sitemap file due to the crawl rate we are using for your server. For information on increasing crawl rate, please see our Help Center.

Why is this happening? It seems to happen only when the crawl rate is changed to a manual setting, but I cannot confirm that.

One webmaster affected by this said:

I am having the same problem too. I got the message yesterday, and changed my crawl rate to manual and moved the bar all the way to the right to allow the fastest crawl rate, but I still see the same "crawl rate problem" message as I saw yesterday.

So far Google has not addressed these concerns in the forum.

Forum discussion at Google Webmaster Help.

Update: Googler JohnMu replied to the thread with more details. He said:

This message generally means that we'd like to crawl more (in this case, your Sitemap file) from your site if your server (or crawl-rate-setting) would allow it.

If you have a manually set crawl-rate in Webmaster Tools, you may wish to reset that back to "let Google determine my crawl rate," so that our systems can try to automatically raise it to match your server's accessibility (the manual setting is mostly to limit it even lower). Somewhat simplified: should we notice that crawling too much causes your server to slow down, we will generally reduce our crawl rate to avoid causing problems.

Should you notice that Googlebot is regularly crawling less than you would want, then you may want to consider these points:

* Work to reduce the number of crawlable URLs on your website. For example, if you have session-IDs in your URLs, or use complex, dynamically generated URLs, that will generate a large number of crawlable URLs. Normal canonicalization techniques can generally help in a case like that: http://www.google.com/support/webmasters/bin/answer.py?answer=139066

* Check your Webmaster Tools crawl-stats to see if crawling of your site (and also -- if you have access -- other sites on the same server) is particularly slow, and then work with your hoster and/or web-developer to see if that can be improved.

* Use the "Report a problem with Googlebot" form behind the "learn more" link next to the crawl-rate settings. Keep in mind that if Googlebot is not crawling as much as you'd want due to technical issues (too many URLs being crawled and/or server issues), then we'd really recommend fixing those first.

Hope this helps! Feel free to post back should you have any questions.
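JohnMu's first point, stripping session-IDs so that equivalent URLs collapse to one crawlable form, can be sketched in Python. This is a minimal illustration, not anything Google publishes; the parameter names in the set below are common examples and would need to match whatever your own site actually uses:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters that often act as session IDs. Which ones apply is
# site-specific -- these names are illustrative assumptions.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def canonicalize(url: str) -> str:
    """Drop session-ID parameters and sort the remaining ones so that
    equivalent URLs map to a single canonical form."""
    parts = urlparse(url)
    kept = sorted(
        (key, value)
        for key, value in parse_qsl(parts.query)
        if key.lower() not in SESSION_PARAMS
    )
    return urlunparse(parts._replace(query=urlencode(kept)))

# Two URLs that differ only by a session ID and parameter order
# normalize to the same address:
print(canonicalize("http://example.com/page?b=2&PHPSESSID=abc123&a=1"))
# http://example.com/page?a=1&b=2
```

Server-side redirects to the canonical form, or a rel="canonical" link element on the page, are the usual ways to expose that canonical URL to Googlebot; the help article JohnMu links covers the details.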

Comments:

Googlebot In Leet

04/29/2011 09:07 am

Why do so many SEOs have problems with Google while I can only embrace them? I honestly feel that there are more sour grapes out there than legit webmasters. While Google's Webmaster Tools does have its inaccuracies, I've never had a problem with submitting a sitemap. This is just one of the millions of issues I've read about where I've never been affected. PS: I've never been affected by the Panda update either...
