Google's URL Removal Tool Lacks Support for Wildcards

Jan 16, 2006 - 8:58 am

Dan Thies reports over at the Search Engine Watch Forums that Google's URL removal tool doesn't support robots.txt extensions. He explains that even though "Googlebot supports an extension to the robots.txt syntax, which allows webmasters to use wildcards in disallow directives," the URL removal tool does not honor those same extensions. He said it "will generate an error message telling you that wildcards aren't allowed, if you feed it a robots.txt file which makes use of these extensions."
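For context, here is a minimal sketch of the kind of wildcard extension in question; the paths are hypothetical, for illustration only, not taken from Dan's post:

User-agent: Googlebot
# Googlebot's extended syntax: * matches any sequence of characters
Disallow: /*?sessionid=
# $ anchors the pattern to the end of the URL
Disallow: /*.pdf$
# Googlebot honors these directives when crawling, but per Dan's report the
# URL removal tool rejects such a file with a "wildcards aren't allowed" error.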

Dan goes on to explain that "Matt Cutts confirmed this... but it really shouldn't be a huge problem under normal circumstances, since it should only take a few days for Googlebot to pick up changes in the robots.txt file, and drop any pages that are disallowed."

So I would expect wildcard support to be added to the removal tool soon.

Forum discussion at Search Engine Watch Forums.

 
