Google: If You Can't Block No Result Found Internal Search Results Then Block All Search Results

Apr 18, 2023 - 7:41 am


Google's John Mueller went on a bit of a rant on Reddit about leaving your internal search result pages crawlable and indexable. As you know, Google generally does not want to index search results pages, but John said this is especially true for search result pages that return no results.

In short, John said, "if you can't select which search results pages should be indexable, you should block all of them from being indexable - use robots.txt disallow or noindex robots meta tag."
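For sites that can't be selective, the blanket block is a small change. Here is a minimal sketch of the two options John mentions, assuming the internal search lives under a /search path (the path is an assumption, not something from the article):

User-agent: *
Disallow: /search

Or, on each search results page, a robots meta tag (the page must remain crawlable for the tag to be seen, since meta tags on a robots.txt-blocked URL are never fetched):

<meta name="robots" content="noindex">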

The rant is kind of fun to read, so here it is:

Unfortunately, lots of CMSs, hosting platforms, ecommerce platforms, etc still don't robot or noindex search results pages by default. We've given that guidance for probably over a decade. Especially if a search results page returns no results, there's no reason for it to be indexable. And even for other search results pages, it's a good practice to either block them all, or to only allow a hand-selected set to be indexed (eg, known product-type queries, where the results are more like category pages). If you can't limit the indexable search results pages, I would strongly recommend noindexing or robotting *all* of the search pages. It's still a regular occurrence that we see sites spam the search results with open search results pages -- it doesn't take much work to prevent, and cleaning it up afterwards is such a hassle.
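What Mueller describes (noindex the zero-result pages, and only let a hand-picked set of query pages be indexed) is straightforward to wire up server-side. Here is a minimal sketch in Python/Flask; the /search route, the INDEXABLE_QUERIES allowlist, and the search_products() helper are hypothetical stand-ins, not anything from his post:

from flask import Flask, request, make_response
from markupsafe import escape

app = Flask(__name__)

# Hand-selected queries whose result pages read more like category pages.
INDEXABLE_QUERIES = {"running shoes", "rain jackets"}

def search_products(query):
    """Hypothetical stand-in for the site's real search backend."""
    return []

@app.route("/search")
def search():
    query = request.args.get("q", "").strip().lower()
    results = search_products(query)

    html = f"<h1>Results for {escape(query)}</h1><p>{len(results)} items found</p>"
    resp = make_response(html)

    # Zero-result pages and anything outside the allowlist get noindex,
    # so Google neither indexes them nor later reports them as soft 404s.
    if not results or query not in INDEXABLE_QUERIES:
        resp.headers["X-Robots-Tag"] = "noindex"
    return resp

The X-Robots-Tag response header does the same job as the robots meta tag, just without touching page templates.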

In 2007, Google told webmasters to block internal search results from being indexed. The original guideline reads "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines." Now it reads "Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages."

Then, ten years later, Google's John Mueller explained why Google doesn't want your search result pages in its index. He said, "they make infinite spaces (crawling), they're often low-quality pages, often lead to empty search results/soft-404s." He later explained that indexing them waters down a site's overall quality and that these pages can show up as soft 404 errors in Search Console.

Forum discussion at Reddit.

 
