GoogleGuy Explains Robots.txt Handling

Aug 14, 2006 • 7:36 am | comments (3) | Filed Under Google Search Engine Optimization

A featured WebmasterWorld thread shows examples of Google possibly disobeying a site's robots.txt file. GoogleGuy and Vanessa Fox both chime in to offer guidance on the perceived issue.

GoogleGuy first explains that "a more specific directive takes precedence over a weaker one." GoogleGuy then comes back to explain in greater detail:

The rule of thumb I always use is "the most specific directive applies." So if you say "Everyone in the room, leave. g1smd, please stay; we need to chat" then everyone but g1smd would mosey.
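In robots.txt terms, GoogleGuy's rule of thumb looks something like this hypothetical file (the paths are made up, and this assumes Google's longest-match handling, where the most specific Allow/Disallow rule for a given URL wins):

```
User-agent: Googlebot
Disallow: /room/
Allow: /room/g1smd.html
```

Here the broad Disallow tells Googlebot to leave everything under /room/ alone, but the more specific Allow for /room/g1smd.html takes precedence for that one URL, so it can still be crawled.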

Vanessa Fox from Google offers links to Google's help documentation and tools that let you test how Google will treat a specific robots.txt file before you take it live.

Forum discussion at WebmasterWorld.



Robert Charlton

08/15/2006 06:23 am

Barry - The link with the anchor text "offers" (Vanessa Fox from Google *offers*...) isn't working properly. It goes to the WebmasterWorld thread, but doesn't show a post.

Barry Schwartz

08/15/2006 09:52 am

Works for me...

SEO Egghead

08/15/2006 06:49 pm

Barry - I think this is a common misconception that arises from an ambiguous statement in the robots.txt standard. But according to a close reading of the specification, the rules for a specific user agent _entirely override_ the "User-agent: *" rules. Therefore, any rule under "User-agent: *" that should also be applied to googlebot must be repeated under "User-agent: googlebot." I mention this here as well:
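SEO Egghead's reading can be sketched with Python's standard-library robots.txt parser, which applies the same group-override behavior: a crawler obeys only the most specific User-agent group that matches it, not the "*" group on top of it. (A sketch only; the robots.txt contents and bot names below are hypothetical.)

```python
import urllib.robotparser

# Hypothetical robots.txt: the * group blocks /private/, while the
# Googlebot-specific group blocks only /archive/.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /archive/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own group, so the * rules do not apply to it:
# /private/ is fetchable, /archive/ is not.
assert rp.can_fetch("Googlebot", "/private/page.html")
assert not rp.can_fetch("Googlebot", "/archive/page.html")

# Any other crawler falls back to the * group and is blocked from /private/.
assert not rp.can_fetch("OtherBot", "/private/page.html")
```

This is exactly why a rule under "User-agent: *" that should also apply to Googlebot must be repeated under "User-agent: Googlebot", as the comment above notes.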
