Robots.txt Google Sitemaps Bug Fixed

Apr 27, 2006 • 7:36 am | Filed Under Google Search Engine Optimization

After Google announced the new Google Sitemaps features, a bug became visible in the tool. I noticed it when the features were first presented at SES yesterday, but the presenter was quick to gloss over it. The folks at WebmasterWorld reported the glitch around 11:55 AM (EST), right when the update launched: Google Sitemaps would incorrectly report that your robots.txt file was blocking your site or Sitemap when, in fact, it was not. Google confirmed and fixed the bug soon after.
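To see why a false "blocked by robots.txt" report is alarming, here is a minimal sketch of how a robots.txt check works, using Python's standard `urllib.robotparser` and a hypothetical robots.txt file (the domain and paths are illustrative, not Google's actual parser):

```python
import urllib.robotparser

# Hypothetical robots.txt: home page is crawlable, /private/ is not
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A correct parser should allow the home page here; the Sitemaps bug
# was that Google reported pages like this as blocked anyway.
print(rp.can_fetch("Googlebot", "http://example.com/"))           # True
print(rp.can_fetch("Googlebot", "http://example.com/private/x"))  # False
```

With a file like this, only URLs under /private/ should be reported as blocked; the bug made Sitemaps flag sites whose robots.txt permitted crawling.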

Thanks to our users for alerting us to an issue with incorrectly reporting that sites and Sitemaps were being blocked by robots.txt files. We have resolved this issue. If you were unable to add a site or Sitemap because of this issue, you should now be able to add them.

If Sitemaps was reporting that your home page was blocked by robots.txt, you should soon see an updated status. Thanks for your patience as we refresh the display of this data.

Forum discussion at WebmasterWorld.

As an FYI, we reported a total of four Google bugs in the past two days. (1) Google Fixes Extended URL Broken Page Issue (2) Google AdWords Glitch: Bid Tool Conflicts With Position Preference Tool (3) Google AdWords Showing Same Two Ads On Search Results Pages at Google.com (4) And this one.
