Yahoo! Slurping Wildcards Via Robots.txt File

Nov 6, 2006 - 7:14 am
Filed Under Yahoo SEO

Last Thursday night, the Yahoo! Search Blog published Yahoo! Search Crawler (Yahoo! Slurp) - Supporting wildcards in robots.txt. I am honestly a bit shocked by the SEO community's response to this, or lack thereof.

I have spotted two threads on the topic, and both simply note that Yahoo! made the announcement, nothing more. I know this is something Webmasters have requested in the past, and they have looked down upon Yahoo! for not supporting it until now.

The two new wildcard characters are:

You can now use '*' in robots directives for Yahoo! Slurp to match any sequence of characters in your URL.

You can now also use '$' in robots directives for Slurp to anchor the match to the end of the URL string.
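Combined, these let you block URL patterns that plain prefix matching cannot express. A quick sketch of what a robots.txt using the new directives might look like (the paths here are illustrative examples, not taken from Yahoo!'s post):

```text
User-agent: Slurp
# '*' matches any sequence of characters, so this blocks any URL
# containing "sessionid=" anywhere after the leading slash
Disallow: /*sessionid=
# '$' anchors the match to the end of the URL, so this blocks
# URLs ending in .pdf but not something like /report.pdf.html
Disallow: /*.pdf$
```

Without the '$' anchor, a directive like Disallow: /*.pdf would also match any URL merely containing ".pdf" somewhere in its path.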

More details at http://help.yahoo.com/help/us/ysearch/slurp/slurp-02.html.

Forum discussion at WebmasterWorld & Search Engine Watch Forums.
