Google Crawls Robots.txt Files Daily

Sep 3, 2009 • 8:42 am | Filed Under Google Search Engine Optimization

JohnMu from Google posted in a Google Webmaster Help thread that Google typically crawls a site's robots.txt file once a day. This is the first time, at least that I can remember, that I have seen a Googler make a statement on the crawl frequency of robots.txt files.

JohnMu said:

We usually only check the robots.txt file once a day for most sites, so I assume you were just still seeing the version that we fetched yesterday.

I have not validated this with my site's log files, but that is not the point. The point is that a Googler has said, at a general level, how often Googlebot will refetch a site's robots.txt file.
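For anyone who does want to validate it against their own site, a short script along these lines can pull the fetch times out of a server access log. This is a minimal sketch, assuming an Apache/nginx "combined" log format and a file named access.log; both are placeholders you would adjust for your own server.

import re

# Placeholder path; point this at your server's access log.
LOG_PATH = "access.log"

# Combined-log timestamp, e.g. [03/Sep/2009:08:42:00 +0000]
TIMESTAMP_RE = re.compile(r"\[([^\]]+)\]")

fetches = []
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # Count only Googlebot requests for the robots.txt file itself.
        if "GET /robots.txt" in line and "Googlebot" in line:
            match = TIMESTAMP_RE.search(line)
            if match:
                fetches.append(match.group(1))

print(f"{len(fetches)} Googlebot fetches of /robots.txt:")
for ts in fetches:
    print(" ", ts)

If John's statement holds for your site, the printed timestamps should come out roughly 24 hours apart.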

Forum discussion at Google Webmaster Help.

Comments:

Michael Martinez

09/03/2009 05:11 pm

What would happen if you put this entry into your robots.txt file?

User-agent: *
Disallow: /robots.txt

Ruud Kok

09/04/2009 07:27 am

So does this mean that if you put a new folder online that you don't want crawled, you need to add that folder to your robots.txt about 24 hours in advance?
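If John's once-a-day figure is accurate, that does seem to be the safe pattern: add the rule before the folder goes live, along these lines (the folder name here is just a placeholder):

User-agent: *
Disallow: /new-folder/

That way, the cached copy of robots.txt that Googlebot is working from would already block the folder by the time it is published.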
