Google: Keep Your Robots.txt Simple

Mar 29, 2013 - 8:40 am

A Google Webmaster Help thread has Google's John Mueller responding to some complex robots.txt questions.

In it, he strongly recommends that when you do set up a robots.txt file, you keep it very simple. John said:

When possible, I'd really recommend keeping the robots.txt file as simple as possible, so that you don't have trouble with maintenance and that it's really only disallowing resources that are problematic when crawled (or when its content is indexed).

Heck, John has even recommended removing the robots.txt file completely if it is not needed, but if you do have to go more complex, always make sure to keep the file under 500KB.
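For illustration, a simple robots.txt along the lines John describes might only block the handful of paths that are genuinely problematic when crawled; the paths and sitemap URL below are hypothetical examples, not anything from the thread:

User-agent: *
Disallow: /search-results/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml

Everything not explicitly disallowed stays crawlable, which keeps maintenance to a minimum.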

To see the full context, read the Google Webmaster Help thread.

Personally, I am all for not using robots.txt files when possible.

Forum discussion at Google Webmaster Help.
