Google: It Is A Bad Sign If Your Robots.txt Or Sitemap File Is Ranking For Normal Queries

Nov 8, 2019 • 7:47 am | Filed Under Google Search Engine Optimization

We had a slew of tweets, triggered by Gary Illyes of Google and followed up by John Mueller of Google, around robots.txt and XML sitemap files ranking in Google. In short, if they rank for normal queries, John Mueller said "that's usually a sign that your site is really bad off and should be improved instead."

Let's start with Gary's tweet:

What he is saying is that a robots.txt file can be indexed and can rank in Google.

John then added that you can block these files from being indexed using the X-Robots-Tag HTTP header.
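As a sketch of what that could look like on an nginx-served site (hypothetical configuration; the paths and server setup are assumptions, and `X-Robots-Tag: noindex` is the header Google documents for this):

```nginx
# Serve robots.txt and the sitemap with a noindex header so
# Google can still fetch them but won't index them as pages.
location = /robots.txt {
    add_header X-Robots-Tag "noindex";
}

location = /sitemap.xml {
    add_header X-Robots-Tag "noindex";
}
```

The equivalent in Apache would be a `Header set X-Robots-Tag "noindex"` directive scoped to those files.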

But if you do see your robots.txt file or your sitemap file ranking, he said, "if your robots.txt or sitemap file is ranking for normal queries (not site:), that's usually a sign that your site is really bad off and should be improved instead."

You can also use a robots.txt disallow, John added:

Maybe I am misunderstanding, but didn't John say back in 2018 that disallow doesn't work here? Here is his tweet from back then:

I guess by the time you disallow it in robots.txt, it is too late anyway.
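For what it's worth, a disallow for the sitemap would be a one-line addition to robots.txt (the path here is an assumption; note that Google documents disallow as blocking crawling, not necessarily indexing, which may be why the header approach above is the safer bet):

```
User-agent: *
Disallow: /sitemap.xml
```

Disallowing robots.txt from within robots.txt itself is the paradoxical case, since Google has to fetch the file to read the rule.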

John said there is really no reason to let Google index your sitemap file; Google processes sitemaps differently:

Anyway, I thought you'd find these tweets, compiled in one post, useful.

Forum discussion at Twitter.
