Google: Persistent 5xx Errors Result In Slower Crawling But Won't Say How Many URLs Are Required

Jan 2, 2020 - 7:50 am


Google's John Mueller said that persistent 5xx error responses from your server to Googlebot's requests will cause Google to slow its crawl of your web site. John wouldn't say whether a specific number or percentage of a site's URLs has to return errors before this kicks in; he only said that if Google sees persistent 5xx errors, it will slow how it crawls your site.

He said this in a Twitter thread. "Persistent 5xx's would cause us to crawl slower than usual," he said. He added that "Persistent errors can mask real errors though, so I'd clean that up."

On whether there is a threshold, he said, "There is no limit, we'd just crawl slower." But crawling slower might not be a bad thing, he added: "For some sites, crawling slowly is also fine. For others (eg, lots of updates), you might not be so happy with that. In short, .... it depends."


I say, if you see 5xx errors, you probably want to fix them.
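One practical way to see whether Googlebot is hitting those errors is to scan your server's access logs. Below is a minimal sketch, assuming a standard combined-format Apache/Nginx log; `count_googlebot_5xx` is a hypothetical helper of my own, not anything Google or the article provides:

```python
from collections import Counter

def count_googlebot_5xx(lines):
    """Return a Counter mapping 5xx status codes to how often Googlebot got them.

    Assumes combined log format:
    IP - - [DATE] "REQUEST" STATUS BYTES "REFERER" "USER-AGENT"
    """
    counts = Counter()
    for line in lines:
        # Splitting on double quotes puts the status field at index 2
        # and the user-agent string at index 5.
        parts = line.split('"')
        if len(parts) < 6:
            continue  # not a combined-format line; skip it
        status_field = parts[2].split()
        if not status_field:
            continue
        status, agent = status_field[0], parts[5]
        if status.startswith("5") and "Googlebot" in agent:
            counts[status] += 1
    return counts
```

Note that user-agent strings can be spoofed, so before acting on the numbers you may want to verify that the requests really came from Google (for example, via a reverse DNS lookup on the client IP).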

Forum discussion at Twitter.

