Google Will Ignore Robots.txt Rules If It Serves A 4xx Status Code

Jan 17, 2023 - 7:41 am

Here is another PSA from Gary Illyes of Google. In short, if you serve a 4xx status code with your robots.txt file, then Google will ignore the rules you have specified in that file.

Why? A 4xx status code means the document is unavailable, so Google won't process the file because the server says it cannot be accessed. Gary posted this after receiving a complaint or two about Google not respecting robots.txt rules.

Gary wrote on LinkedIn, "PSA from my inbox: if you serve your robotstxt with a 403 HTTP status code, all rules in the file will be ignored by Googlebot. Client errors (4xx, except 429) mean unavailable robotstxt, as in, a 404 and a 403 are equivalent in this case."

In short, make sure your robots.txt file returns a 200 status code so that Google can access and obey it.
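To make the behavior described above concrete, here is a minimal sketch of how Googlebot reportedly interprets the robots.txt HTTP status code, based on Gary's note (all 4xx codes except 429 are treated as "no robots.txt, no rules"). The function name and return strings are illustrative, not an official API:

```python
def googlebot_robots_interpretation(status: int) -> str:
    """Illustrative sketch (not Google code) of how Googlebot reportedly
    treats the HTTP status code returned for robots.txt, per Gary Illyes:
    client errors (4xx, except 429) mean the file's rules are ignored."""
    if 200 <= status < 300:
        # Success: the file is fetched and its rules are applied.
        return "rules parsed and applied"
    if 400 <= status < 500 and status != 429:
        # 403, 404, etc. all count as "unavailable robots.txt":
        # Googlebot ignores the rules and crawls unrestricted.
        return "rules ignored (robots.txt treated as unavailable)"
    # 429 and 5xx are not treated as "file missing" in this sketch.
    return "not treated as a missing robots.txt"


# A 403 and a 404 are equivalent in this case, as Gary says:
print(googlebot_robots_interpretation(403))
print(googlebot_robots_interpretation(404))
```

Note how a 403 (which some site owners use deliberately, thinking it blocks crawlers) ends up in the same bucket as a 404: the rules in the file are simply never read.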

Forum discussion at LinkedIn.
