Google: Persistent 5xx Errors Result In Slower Crawling But Google Won't Say How Many URLs Are Required

Jan 2, 2020 - 7:50 am


Google's John Mueller said that if your server returns persistent 5xx error responses to Google's requests, Google will slow its crawl of your web site. John wouldn't say whether a specific percentage or number of your site's URLs is required to trigger this. He just said that if Google sees persistent 5xx errors, it will slow how it crawls your site.

He said this in a Twitter thread. "Persistent 5xx's would cause us to crawl slower than usual," he said. He added that "Persistent errors can mask real errors though, so I'd clean that up."

Asked whether there is a limit to the number of errors, he said, "There is no limit, we'd just crawl slower." But crawling slower might not be a bad thing, he said: "For some sites, crawling slowly is also fine. For others (eg, lots of updates), you might not be so happy with that. In short, .... it depends."


I say, if you see 5xx errors, you probably want to fix them.
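One quick way to check whether Googlebot is hitting 5xx responses is to scan your server's access log. Here is a minimal sketch; it assumes Combined Log Format and identifies Googlebot by user-agent string (the sample log lines are hypothetical), so adapt it to your own server's log format.

```python
import re

# Hypothetical sample lines in Combined Log Format; in practice you would
# read these from your server's access log file.
SAMPLE_LOG = """\
66.249.66.1 - - [02/Jan/2020:07:50:00 +0000] "GET /page-a HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [02/Jan/2020:07:50:05 +0000] "GET /page-b HTTP/1.1" 503 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [02/Jan/2020:07:50:10 +0000] "GET /page-c HTTP/1.1" 500 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Matches the quoted request line, the status code, and the quoted user agent.
LOG_PATTERN = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_5xx_rate(log_text):
    """Return (5xx count, total hits) for requests whose user agent mentions Googlebot."""
    errors, total = 0, 0
    for line in log_text.splitlines():
        m = LOG_PATTERN.search(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        total += 1
        if m.group("status").startswith("5"):
            errors += 1
    return errors, total

errors, total = googlebot_5xx_rate(SAMPLE_LOG)
print(f"{errors} of {total} Googlebot requests returned 5xx")
# → 2 of 3 Googlebot requests returned 5xx
```

Note that self-reported user agents can be spoofed; for anything beyond a quick check, verify Googlebot by reverse DNS lookup as Google recommends.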

Forum discussion at Twitter.
