
Google's John Mueller said that if Google is not convinced there is new and important content to index on your site, it won't use the sitemap file on your site.
Just because you have a sitemap file does not mean Google will index all the pages in that file. This isn't really new; we have discussed it before.
John wrote on Reddit a few days ago:
One part of sitemaps is that Google has to be keen on indexing more content from the site. If Google's not convinced that there's new & important content to index, it won't use the sitemap.
We know Google does not index everything; in fact, very few sites have all of their pages indexed by Google (maybe unless it is a five-page website).
So adding a sitemap file, while useful for many reasons, doesn't mean those pages will be indexed.
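For reference, a sitemap file is just an XML list of URLs you want crawled, which Google treats as a hint, not a command. A minimal sketch looks like this (the URLs and dates here are hypothetical placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> helps signal fresh content -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2026-02-20</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-post</loc>
    <lastmod>2026-02-21</lastmod>
  </url>
</urlset>
```

Even with every page listed like this, Google decides on its own which entries, if any, are worth indexing.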
Also, here is a somewhat related post on Bluesky from over the weekend:
In the extreme case where Google can't crawl at all, then of course at some point pages start to drop out of the index. For everything else, our systems tend to find a good balance. I don't think it's possible to define an absolute cut-off point, & sites that care tend to watch out for speed too.
— John Mueller (@johnmu.com) February 21, 2026 at 4:03 AM
You can calculate how long it would take to crawl the whole site assuming no duplicates, but imo don't think of this as being the problem - it's more like the symptom of a number of things.
— John Mueller (@johnmu.com) February 23, 2026 at 4:59 AM
Forum discussion at Reddit.

