Yahoo! Slurping Wildcards Via Robots.txt File

Nov 6, 2006 - 7:14 am
Filed Under Yahoo SEO

Last Thursday night, the Yahoo! Search Blog published Yahoo! Search Crawler (Yahoo! Slurp) - Supporting wildcards in robots.txt. I am honestly a bit shocked by the SEO community's response to this, or lack thereof.

I have spotted only two threads on the topic, and both simply note that Yahoo! made the announcement, nothing more. I know this is something webmasters have requested in the past, and they have looked down upon Yahoo! for not supporting it until now.

The two new wildcard characters are:

You can now use '*' in robots directives for Yahoo! Slurp to wildcard match a sequence of characters in your URL.

You can now also use '$' in robots directives for Slurp to anchor the match to the end of the URL string.
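
To illustrate, here is a minimal, hypothetical robots.txt sketch using both characters (the Disallow paths are made up for illustration and are not from the Yahoo! announcement):

User-agent: Slurp
# '*' matches any sequence of characters, so this blocks any URL containing a session ID parameter
Disallow: /*?sessionid=
# '$' anchors the match to the end of the URL, so this blocks only URLs that end in .pdf
Disallow: /*.pdf$

The first rule matches '?sessionid=' anywhere in the URL. The second matches only URLs that actually end in '.pdf'; without the '$', it would also block URLs such as /file.pdf?page=2.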

More details at http://help.yahoo.com/help/us/ysearch/slurp/slurp-02.html.

Forum discussion at WebmasterWorld & Search Engine Watch Forums.

 
