Google's URL Removal Tool Lacks Support for Wildcards

Jan 16, 2006 - 8:58 am

Dan Thies reports over at the Search Engine Watch Forums that Google's URL removal tool doesn't support the robots.txt extensions Googlebot itself honors. He explains that even though "Googlebot supports an extension to the robots.txt syntax, which allows webmasters to use wildcards in disallow directives," the URL removal tool does not support those same extensions. He said the tool "will generate an error message telling you that wildcards aren't allowed, if you feed it a robots.txt file which makes use of these extensions."
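
For reference, here is a minimal robots.txt sketch using the wildcard extensions in question (the paths are made up for illustration). Googlebot honors rules like these, but feeding a file like this to the removal tool would trigger the wildcard error Dan describes:

    User-agent: Googlebot
    # * matches any sequence of characters in the URL path
    Disallow: /private*/
    # $ anchors the match to the end of the URL
    Disallow: /*.pdf$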

Dan continues to explain that "Matt Cutts confirmed this... but it really shouldn't be a huge problem under normal circumstances, since it should only take a few days for Googlebot to pick up changes in the robots.txt file, and drop any pages that are disallowed."

So I would expect wildcard support to be added to the removal tool soon.

Forum discussion at Search Engine Watch Forums.
