Yahoo! Slurping Wildcards Via Robots.txt File

Nov 6, 2006 • 7:14 am | comments (2) by | Filed Under Yahoo Search Engine Optimization

Last Thursday night, the Yahoo! Search Blog published Yahoo! Search Crawler (Yahoo! Slurp) - Supporting wildcards in robots.txt. I am honestly a bit shocked by the SEO community's response to this, or rather the lack thereof.

I have spotted two threads on the topic, and both simply note that Yahoo! announced it, nothing more. I know this is something webmasters have requested in the past, and that they have looked down upon Yahoo! for not supporting it until now.

The two new wildcard characters are:

- You can now use '*' in robots.txt directives for Yahoo! Slurp to match any sequence of characters in your URL.
- You can now use '$' in robots.txt directives for Slurp to anchor the match to the end of the URL string.
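To illustrate, here is a minimal robots.txt sketch using both wildcards; the paths are hypothetical examples, not from Yahoo!'s announcement:

```
User-agent: Slurp
# '*' matches any sequence of characters in the URL path
Disallow: /private*/
# '$' anchors the match to the end of the URL, blocking any URL ending in .gif
Disallow: /*.gif$
```

The first rule would block any directory whose name starts with "private" (such as /private1/ or /private-files/), and the second would block any URL ending in .gif.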

More details are available in the Yahoo! Search Blog post.
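For readers who want to see how these wildcards behave, here is a short Python sketch of the matching logic as described in the announcement. The function names are my own and this is an illustration of the pattern semantics, not Yahoo!'s actual crawler code:

```python
import re

def slurp_pattern_to_regex(pattern: str) -> "re.Pattern":
    """Convert a robots.txt path pattern using Yahoo! Slurp's wildcard
    syntax into a compiled regex: '*' matches any sequence of characters,
    and a trailing '$' anchors the match to the end of the URL string."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.compile(regex + ("$" if anchored else ""))

def is_disallowed(path: str, disallow_pattern: str) -> bool:
    """A robots.txt rule matches from the start of the URL path."""
    return slurp_pattern_to_regex(disallow_pattern).match(path) is not None

# 'Disallow: /*.gif$' blocks only URLs that end in .gif
print(is_disallowed("/images/photo.gif", "/*.gif$"))    # True
print(is_disallowed("/images/photo.gif?v=2", "/*.gif$"))  # False: does not end in .gif
```

Without the '$' anchor, a pattern like /*.gif would also match URLs that merely contain ".gif" somewhere in the path, which is why the end-of-URL anchor is the more interesting of the two additions.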

Forum discussion at WebmasterWorld & Search Engine Watch Forums.
