Double Check Your Robots.txt: Google Testing New Crawler Directives

Nov 21, 2007 - 7:54 am

In Validate your robots.txt - Googlebot becomes smarter, Sebastian reports official confirmation from Google that they are testing new crawler directives.

He explains that adding "Noindex: /" to your robots.txt file will now deindex your entire site. Google told us about the new REP META tags protocol and X-Robots-Tag support a while back, so just be careful with your old tags.
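
For illustration, a minimal robots.txt using the experimental directive might look like the sketch below. The Noindex line is the unofficial part under test; Disallow is standard REP, and the /private/ path is just a made-up example:

  User-agent: Googlebot
  # Standard REP: blocks crawling of matching URLs
  Disallow: /private/
  # Experimental: asks Google to also drop matching URLs from its index
  Noindex: /private/

Per Sebastian's test, a "Noindex: /" line would apply that to every URL on the site. The officially announced alternatives remain the noindex META tag and the X-Robots-Tag HTTP header (X-Robots-Tag: noindex).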

Google commented on Sebastian's post, saying:

Good catch, Sebastian. How is your experiment going? At the moment we will usually accept the “noindex” directive in the robots.txt, but we are not yet at a point where we are willing to set it into stone and announce full support.

Forum discussion at Sphinn.
