Google: Don't Make A Dynamically Generated robots.txt

Oct 29, 2015 - 7:56 am


John Mueller from Google said in a Stack Exchange thread that while it is good practice to dynamically generate your XML sitemap file, it is not good practice to dynamically generate your robots.txt file.

He said the robots.txt file should probably be kept static and updated by hand.

John wrote:

Making the robots.txt file dynamic (for the same host! Doing this for separate hosts is essentially just a normal robots.txt file for each of them.) would likely cause problems: it's not crawled every time a URL is crawled from the site, so it can happen that the "wrong" version is cached. For example, if you make your robots.txt file block crawling during business hours, it's possible that it's cached then, and followed for a day -- meaning nothing gets crawled (or alternately, cached when crawling is allowed). Google crawls the robots.txt file about once a day for most sites, for example.

As you can see, changing your robots.txt file throughout the day can confuse Google's crawlers, leading them to places they should not go while sometimes blocking them from places they should be free to venture into.
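To make the caching problem concrete, here is a minimal sketch of the kind of setup John is warning against: a web server that returns a different robots.txt depending on the time of day. It uses only the Python standard library; the nine-to-five blocking window and the port number are invented for illustration, not anything Google or John specified.

    # Anti-pattern sketch: robots.txt contents vary by time of day.
    # Because Google fetches robots.txt roughly once a day, whichever
    # version happens to be cached can govern crawling for ~24 hours.
    from datetime import datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RobotsHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path != "/robots.txt":
                self.send_error(404)
                return
            # Hypothetical rule: block all crawling during "business
            # hours" (9am-5pm server time), allow it otherwise.
            if 9 <= datetime.now().hour < 17:
                body = b"User-agent: *\nDisallow: /\n"
            else:
                body = b"User-agent: *\nDisallow:\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Port choice is arbitrary; in practice this would sit behind
        # whatever serves the site's root.
        HTTPServer(("", 8000), RobotsHandler).serve_forever()

If Googlebot happens to fetch robots.txt inside the blocked window, the "Disallow: /" version can be cached and honored for about a day, which is exactly the failure mode John describes.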

Forum discussion at Stack Exchange.

 
