Yahoo! Slurping Wildcards Via Robots.txt File

Nov 6, 2006 • 7:14 am | comments (2) | Filed Under Yahoo Search Engine Optimization
 

Last Thursday night, the Yahoo! Search Blog posted Yahoo! Search Crawler (Yahoo! Slurp) - Supporting wildcards in robots.txt. I am honestly a bit shocked by the SEO community's response to this, or rather the lack thereof.

I have spotted two threads on the topic, both simply noting that Yahoo! announced it and nothing more. I know this is something webmasters have requested in the past, and that they have looked down on Yahoo! for not supporting it until now.

The two new wildcard characters are:

- You can now use '*' in robots directives for Yahoo! Slurp to wildcard-match a sequence of characters in your URL.
- You can now use '$' in robots directives for Slurp to anchor the match to the end of the URL string.
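For example, with the paths here being purely hypothetical, the new syntax lets you write rules like:

```
User-agent: Slurp
# Block any URL containing a session ID parameter
Disallow: /*?sessionid=
# Block all PDFs, but only when the URL actually ends in .pdf
Disallow: /*.pdf$
```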

More details at http://help.yahoo.com/help/us/ysearch/slurp/slurp-02.html.
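To make the matching semantics concrete, here is a minimal Python sketch (my own illustration, not Yahoo!'s actual code) of how a crawler might translate these patterns into regular expressions: '*' becomes "match any sequence of characters," and a trailing '$' anchors the pattern to the end of the URL.

```python
import re

def slurp_pattern_to_regex(pattern):
    """Translate a robots.txt path pattern using '*' and '$' into a
    compiled regex. Illustrative sketch only."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as "any sequence"
    regex = re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.compile(regex)

def is_disallowed(url_path, disallow_pattern):
    """A path is blocked if the pattern matches from the start of the URL."""
    return slurp_pattern_to_regex(disallow_pattern).match(url_path) is not None

# '/private*.html$' blocks '/private/page.html' but not '/private/page.htmlx'
```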

Forum discussion at WebmasterWorld & Search Engine Watch Forums.

Comments:

SEOEgghead

11/07/2006 12:27 am

I still don't think this matters much, because MSN doesn't honor it yet. Maybe if all three supported it I'd be more enthusiastic. But it's progress. I have to update my book copy now :) Thanks for the update.

SEOEgghead

11/07/2006 12:31 am

Actually, I retract that comment, as it appears that MSN actually supports wildcards. That's nice news, then.
