Google: Don't Make A Dynamically Generated robots.txt

Oct 29, 2015 - 7:56 am

John Mueller from Google said in a Stack Exchange thread that while it is good practice to generate your XML sitemap file dynamically, it is not good practice to generate your robots.txt file dynamically.

He said the robots.txt file should probably be kept static and updated by hand.
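For context, a static robots.txt is just a hand-maintained plain-text file served from the site root. A minimal illustrative example (the host and paths here are hypothetical, not from the thread):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```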

John wrote:

Making the robots.txt file dynamic (for the same host! Doing this for separate hosts is essentially just a normal robots.txt file for each of them.) would likely cause problems: it's not crawled every time a URL is crawled from the site, so it can happen that the "wrong" version is cached. For example, if you make your robots.txt file block crawling during business hours, it's possible that it's cached then, and followed for a day -- meaning nothing gets crawled (or alternately, cached when crawling is allowed). Google crawls the robots.txt file about once a day for most sites, for example.

In short, changing your robots.txt file throughout the day can confuse Google's crawlers: a cached "allow" version may lead them into areas you meant to block, while a cached "block" version can keep them out of pages you want crawled.
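To make that failure mode concrete, here is a minimal sketch, in Python with Flask, of the business-hours pattern John warns against. The route, hours, and rules are hypothetical and for illustration only:

```python
# Anti-pattern: serving different robots.txt content by time of day.
from datetime import datetime

from flask import Flask, Response

app = Flask(__name__)

BLOCK_ALL = "User-agent: *\nDisallow: /\n"
ALLOW_ALL = "User-agent: *\nDisallow:\n"

@app.route("/robots.txt")
def robots():
    # Intends to block crawling during "business hours" (9am-5pm server time).
    # The problem: Google fetches robots.txt roughly once a day, so whichever
    # version happens to be fetched is honored until the next fetch -- either
    # nothing gets crawled all day, or the block never takes effect.
    hour = datetime.now().hour
    body = BLOCK_ALL if 9 <= hour < 17 else ALLOW_ALL
    return Response(body, mimetype="text/plain")
```

Because the file is cached between fetches, the handler's output in the intervening hours is never seen, which is exactly why John recommends keeping the file static.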

Forum discussion at Stack Exchange.
