Google's URL Removal Tool Lacks Support for Wildcards

Jan 16, 2006 - 8:58 am

Dan Thies reports over at the Search Engine Watch Forums that Google's URL removal tool doesn't support robots.txt extensions. He explains that even though "Googlebot supports an extension to the robots.txt syntax, which allows webmasters to use wildcards in disallow directives," the URL removal tool does not accept those same extensions. He said it "will generate an error message telling you that wildcards aren't allowed, if you feed it a robots.txt file which makes use of these extensions."
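For those unfamiliar with the extension Dan is describing, it lets Googlebot match URL patterns rather than just path prefixes in Disallow lines, using "*" as a wildcard and "$" to anchor the end of a URL. A minimal sketch of a robots.txt that uses it (the paths here are only illustrative):

User-agent: Googlebot
# "*" matches any sequence of characters; "$" anchors the match to the end of the URL
Disallow: /*?sessionid=
Disallow: /*.pdf$

Per Dan's report, feeding a file like this to the URL removal tool produces the wildcard error rather than a removal.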

Dan goes on to explain that "Matt Cutts confirmed this... but it really shouldn't be a huge problem under normal circumstances, since it should only take a few days for Googlebot to pick up changes in the robots.txt file, and drop any pages that are disallowed."

So I would expect wildcard support to be added to the removal tool soon.

Forum discussion at Search Engine Watch Forums.
