Google: When Unindexing Pages From Search Use Noindex, Not Robots.txt

Mar 9, 2021 - 7:41 am


Google's John Mueller said on Twitter that when you want to "unindex" a page, that is, remove an indexed page from Google Search, you should not block Google with robots.txt, but rather use noindex. He said to only unindex pages if you don't care whether they rank; if you do want a page to rank, improve it instead.

Here are those tweets:

So we see a few things from these tweets:

(1) Only unindex pages when they are not ranking for queries you care about.

(2) If you care about those queries, then improve the pages and do not unindex them.

(3) If you do decide to unindex them, then use noindex instead of robots.txt.

Yes, noindex beats robots.txt for this job: robots.txt only blocks crawling, so a disallowed URL can still stay in the index, and Google cannot even see a noindex directive on a page it is not allowed to crawl.
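To make that distinction concrete, here is a minimal sketch (assuming a Flask app and a hypothetical /old-page route, neither of which comes from Mueller's tweets) of serving a page with a noindex signal via the X-Robots-Tag response header rather than disallowing it in robots.txt; for HTML pages, a <meta name="robots" content="noindex"> tag in the head does the same thing.

```python
# Minimal sketch: unindexing a page with noindex instead of robots.txt.
# Assumes Flask and a hypothetical /old-page route (not from the tweets).
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    resp = make_response("This page is being retired.")
    # X-Robots-Tag is the HTTP-header form of <meta name="robots" content="noindex">.
    # Googlebot must be able to crawl the URL to see it, so do NOT also
    # disallow this URL in robots.txt.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```

Once Google recrawls the URL and sees the noindex, the page drops out of the index; blocking the same URL in robots.txt would keep Google from ever seeing that signal.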

Also, I don't know if "unindex" is a word but hey.

Forum discussion at Twitter.

 
