Google: Do Not Block GoogleBot From Crawling 404s

Jul 15, 2020 - 7:57 am


John Mueller of Google said it would be "a really bad idea which will cause all sorts of problems" if you block Google or other search engines from crawling pages that return a 404 server status code. He said "billions of 404 pages are crawled every day" by Google and it is normal.

One webmaster wrote that his "website automatically blocks user agents that get more than 10 404 errors, including Googlebot, so that's a problem." John responded that this is a really bad idea, saying "That sounds like a really bad idea which will cause all sorts of problems.. You can't avoid that Googlebot & all other search engines will run into 404s. Crawling always includes URLs that were previously seen to be 404."

He said in a different tweet, the same day, "Billions of 404 pages are crawled every day - it's a normal part of the web, it's the proper way to signal that a URL doesn't exist. That's not something you need to, or can, suppress."

So while there are other ways to deal with your 404 pages, automatically blocking Google from accessing 404 pages, without first understanding how Google is finding those URLs, can be a really bad idea.
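To make the distinction concrete, here is a minimal sketch (a hypothetical server, not anything from Google or the webmaster in question) of the behavior John Mueller describes as correct: a missing URL gets a proper 404 status code, which tells the crawler the page doesn't exist, and no user agent is tracked, rate-limited, or blocked for hitting 404s. The paths and handler names are illustrative assumptions.

```python
# Sketch: return a proper 404 for unknown URLs instead of blocking the
# client that requested them. Uses only the Python standard library.
import http.server
import threading
import urllib.request
import urllib.error

# Hypothetical site content; any other path should 404.
KNOWN_PATHS = {"/": b"home page"}

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = KNOWN_PATHS.get(self.path)
        if body is None:
            # The proper way to signal "this URL doesn't exist" is a 404.
            # Note there is no per-user-agent 404 counter here: Googlebot
            # routinely recrawls URLs it previously saw return 404, so
            # blocking clients after N 404s would block normal crawling.
            self.send_error(404, "Not Found")
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence default per-request logging for this example.
        pass

def serve_in_background():
    # Bind to an ephemeral port and serve on a daemon thread.
    server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve_in_background()
    port = server.server_address[1]
    try:
        urllib.request.urlopen(f"http://127.0.0.1:{port}/missing")
    except urllib.error.HTTPError as e:
        # The crawler is informed the URL is gone, not blocked.
        print(e.code)  # 404
    server.shutdown()
```

The point of the sketch is what it leaves out: there is no logic that counts 404s per client and starts refusing connections, which is the behavior that got the webmaster into trouble.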

Forum discussion at Twitter.
