Google: The Increase In Crawl Errors Are Nothing To Worry About; Just Hungry Crawlers

Sep 9, 2016 • 7:32 am | Filed Under Google Search Engine Optimization


Over the past week or so, some webmasters have been reporting an increase in crawl errors. It is mostly documented in this WebmasterWorld thread.

Some suspect it has to do with an algorithm update, but we've covered not once but twice that these crawl changes in Google Search Console have no relation to upcoming algorithm updates - at least, that is what Google told us.

John Mueller was asked about this by Simon in the Google Hangout, the last one using Google+, at the 24:07 mark in the video. John said he looked into some of these reports and confirmed they are unrelated to any algorithm update. In the cases he checked, it was Google's crawling algorithm deciding it wanted to double-check a lot of old URLs. So maybe GoogleBot felt a burst of energy and was eager to try out a bunch of old URLs.

Here was the question:

On the 3rd August we saw our crawl errors jump from 1,000 to 10,000, and then on the 1st September we saw a jump from 10,000 to 50,000. Now, nothing is wrong with the site - these are all crawl errors from two, three, four years ago. And you mentioned to look where they're linked from, whether it's a sitemap file and so forth. But even the sitemap files referenced in the 'linked from' are two years old. And I'm just wondering - I appreciate that you've already said that an increase in googlebot activity doesn't mean anything's happening or there's anything to worry about, and I appreciate that this is information only and these crawl errors don't mean there's anything wrong, although you may want to check if there is.

But is there any reason why people are seeing this at the moment? And this sudden increase - is it penguin related at all?

John said:


So I've seen a few reports like that and I wanted to double check with the team on that as well.

We looked into a couple of these reports and they were essentially just our algorithm saying, oh, I have all these old URLs, I just want to double check.

It's weird that a lot of these are kind of jumbled together now and, like you mentioned, that they're kind of all at the same time. So I wonder if we will change anything specific there. But in practice, from a webmaster's point of view, you don't need to take action on this. For us it's also a good push to try to figure out how we can make these reports a little bit more actionable, so that we don't scare people off.

Here is the video embed:

Forum discussion at Google+.
