FeedBurner Allows Access Again To feedproxy.google.com Via Robots.txt File

Sep 17, 2008 • 9:46 am | comments (3) by | Filed Under Other Google Topics

In case you didn't know, you can upgrade your FeedBurner URL to feedproxy.google.com. However, some people noticed that the robots.txt file at feedproxy.google.com was actually disallowing all crawlers -- odd, huh? In fact, the problem has been breaking some feeds.

Fortunately, Google has been paying attention to the relevant discussion and will be adding the following code snippet to the robots.txt file for feedproxy.google.com:

User-agent: *
Disallow: /~a/
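To see why this snippet unblocks feeds, note that a `Disallow: /~a/` rule blocks only that one path prefix, leaving every other path implicitly allowed. A minimal sketch of how a crawler would interpret these rules, using Python's standard-library robots.txt parser (the feed path below is a hypothetical example):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules quoted above: block only /~a/, allow everything else.
rules = """User-agent: *
Disallow: /~a/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A typical feed URL is allowed again...
print(parser.can_fetch("*", "http://feedproxy.google.com/example-feed"))  # True

# ...while only the /~a/ path prefix stays blocked.
print(parser.can_fetch("*", "http://feedproxy.google.com/~a/something"))  # False
```

Under the previous configuration, a blanket `Disallow: /` would have produced `False` for every path, which is why readers and crawlers were getting blocked responses on feed content.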

Google has updated the issues page to confirm this update has been completed. Google said this will fix the issue: "This should permit all readers/crawlers that previously retrieved feed content, but now get a blocked response, to start working properly again. Our apologies for any inconvenience you may have encountered!"

Forum discussion continues at Google Groups.
