Google Fetch As Googlebot Won't Crawl When Site Is Too Slow

Dec 23, 2013 • 8:30 am | comments (2) | Filed Under Google Search Engine Optimization
 

A Google Webmaster Help thread has a webmaster complaining that his site isn't being crawled by Google and isn't showing up in the search results. The reason: his site can't handle Googlebot crawling it.

The site is pretty static and basic, but it is hosted on a cheap or free server that can't handle much activity. Googlebot can't crawl it without taking the site down, so it stays away until it can get through without negatively impacting the site.

The interesting thing is that the Fetch as Googlebot feature fails in this situation as well. So you can actually use Fetch as Googlebot to help diagnose a major site speed issue.
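
If you want a rough read on this yourself, alongside Fetch as Googlebot, you can simply time a few full-page fetches against your own server. Below is a minimal sketch in Python; the URL is a placeholder, and the point is only that consistently slow responses (several seconds per page) are the kind of signal that makes crawlers hold back:

    import time
    import urllib.request

    URL = "https://example.com/"  # placeholder: use your own page

    def measure(url, samples=5):
        # Time a handful of full-page fetches and return the durations.
        times = []
        for _ in range(samples):
            start = time.monotonic()
            with urllib.request.urlopen(url, timeout=30) as resp:
                resp.read()  # read the whole body, like a crawler would
            times.append(time.monotonic() - start)
        return times

    if __name__ == "__main__":
        times = measure(URL)
        print(f"avg: {sum(times)/len(times):.2f}s  max: {max(times):.2f}s")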

John Mueller from Google said:

Looking at your site, I do see that we'd like to crawl more from the server, but we're holding back because we think the server might not be able to handle the load. This is the reason why the Fetch as Google requests aren't making it through. In particular, we're seeing a fairly high response-time for URLs from the server, which often signals that the server is pretty busy even without us crawling.
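
To make the back-off behavior John describes concrete, here is a toy sketch of an adaptive crawl delay. This is purely illustrative and not Google's actual algorithm: slow responses widen the gap between requests, and healthy ones shrink it back toward a baseline.

    import time
    import urllib.request

    def crawl(urls, base_delay=1.0, slow_threshold=2.0):
        # Toy adaptive crawler: NOT Google's algorithm, just the general idea.
        delay = base_delay
        for url in urls:
            start = time.monotonic()
            try:
                with urllib.request.urlopen(url, timeout=30) as resp:
                    resp.read()
            except Exception as exc:
                print(f"error fetching {url}: {exc}")
                delay = min(delay * 2, 60.0)  # errors: back off hard
                time.sleep(delay)
                continue
            elapsed = time.monotonic() - start
            if elapsed > slow_threshold:
                delay = min(delay * 2, 60.0)  # server looks busy: ease up
            else:
                delay = max(delay / 2, base_delay)  # server healthy: speed back up
            time.sleep(delay)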

Forum discussion at Google Webmaster Help.


Comments:

Stuart David

12/23/2013 04:22 pm

Had this issue during some real high peaks over the years, where web traffic was OTT. Googlebot is really responsive to high loads and backs off so as not to pile even more pressure on, then gently tests the waters and builds it back up. I am talking responsive in terms of minutes and not days as well; kudos to them for that sensitivity.
