Double Check Your Robots.txt: Google Testing New Crawler Directives

Nov 21, 2007 • 7:54 am | comments (2) | Filed Under Google Search Engine Optimization
 

In Validate your robots.txt - Googlebot becomes smarter, Sebastian reports official confirmation from Google that it is testing new crawler directives in robots.txt.

He explains that adding "Noindex: /" to your robots.txt file can now deindex your entire site. Google told us about the new REP META tags protocol and X-Robots-Tag support a while back, so be careful with your old tags.
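To make the distinction concrete, here is a minimal sketch of what such a robots.txt might look like. The paths are hypothetical, and the Noindex directive is experimental and undocumented, so don't rely on it:

```
# Fully supported directive: blocks crawling of /private/,
# though already-known URLs can still appear in results.
User-agent: Googlebot
Disallow: /private/

# Experimental directive Google is reportedly testing: asks that
# matching URLs be dropped from the index. Per Sebastian's post,
# "Noindex: /" would deindex the whole site, so use with caution.
Noindex: /drafts/
```

Note the difference: Disallow only blocks crawling, while the experimental Noindex asks for removal from the index itself.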

Google commented on Sebastian's post, saying:

Good catch, Sebastian. How is your experiment going? At the moment we will usually accept the “noindex” directive in the robots.txt, but we are not yet at a point where we are willing to set it into stone and announce full support.
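Until Google announces full support, the documented ways to keep content out of the index remain the robots meta tag and the X-Robots-Tag HTTP header. A minimal sketch, assuming an HTML page and an Apache server with mod_headers enabled (the filename is a hypothetical example):

```
<!-- In the page's <head>: keep this URL out of the index -->
<meta name="robots" content="noindex">
```

```
# In .htaccess: send the equivalent header for non-HTML files
# such as PDFs, which cannot carry a meta tag
<Files "report.pdf">
  Header set X-Robots-Tag "noindex"
</Files>
```

These are the mechanisms Google has formally announced, per the REP META tags and X-Robots-Tag support mentioned above.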

Forum discussion at Sphinn.

Comments:

manas

12/18/2008 11:18 am

We just want to check whether my website is OK for a top 10 Google ranking with the keywords "Yahoo Store Design", "Yahoo Store", "Yahoo Store Development", and "SEO India". Please check and let me know.

seo marketing

04/07/2012 09:36 am

Well, I really liked your tips, which are worthy and excellent. Business SEO web design services are better because competition is very high today, especially if you want your company to appear in the search engines. Keep sharing.
