Google: Keep Your Robots.txt Simple

Mar 29, 2013 - 8:40 am

A Google Webmaster Help thread has Google's John Mueller responding to some complex robots.txt questions.

In that thread, he strongly recommends keeping your robots.txt file very simple when you do set one up. John said:

When possible, I'd really recommend keeping the robots.txt file as simple as possible, so that you don't have trouble with maintenance and that it's really only disallowing resources that are problematic when crawled (or when its content is indexed).

Heck, John has even recommended removing the robots.txt file completely if it is not needed, but if you do have to go more complex, make sure to keep the file under 500KB.
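
For illustration only, a minimal robots.txt along the lines John describes might look something like this (the disallowed path and the sitemap URL here are hypothetical placeholders, not anything Google specifically recommends):

User-agent: *
Disallow: /internal-search/

Sitemap: https://www.example.com/sitemap.xml

A file like that is nowhere near the 500KB limit and only blocks crawling of one problematic section, rather than a long, hard-to-maintain list of rules.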

To see the full context, read the Google Webmaster Help thread.

Personally, I am all for not using robots.txt files when possible.

Forum discussion at Google Webmaster Help.

 
