Google: Keep Your Robots.txt Simple

Mar 29, 2013 • 8:40 am | Filed Under Google Search Engine Optimization

A Google Webmaster Help thread has Google's John Mueller responding to some complex robots.txt questions.

In it, he strongly recommends that when you do set up a robots.txt file, you keep it very simple. John said:

When possible, I'd really recommend keeping the robots.txt file as simple as possible, so that you don't have trouble with maintenance and that it's really only disallowing resources that are problematic when crawled (or when its content is indexed).

Heck, John has even recommended removing the robots.txt file completely if it is not needed, but if you do have to go more complex, always make sure to keep it under 500KB.
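For illustration, a simple robots.txt in the spirit of John's advice might look like this (the disallowed paths here are hypothetical examples, not a recommendation for any particular site):

```
User-agent: *
Disallow: /search/
Disallow: /cart/
```

A file like this only blocks crawling of resources that are problematic when crawled, and it is short enough that maintenance is trivial.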

To see the full context, see the Google Webmaster Help thread.

Personally, I am all for not using robots.txt files when possible.

Forum discussion at Google Webmaster Help.

Previous story: Google Webmaster Academy Adds More Languages


Taylor Toussaint

03/29/2013 03:28 pm

It's still good for referencing a sitemap. There are other search engines, not just Google.
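For reference, the sitemap pointer Taylor mentions is a single line in robots.txt, which any crawler that reads the file can pick up (the URL here is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
```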

Soni Sharma

03/29/2013 05:31 pm

I haven't seen any 500KB robots.txt file yet... I want to check. Have you noticed any site with one?

Perahu kertas

03/29/2013 05:38 pm

It gets so complicated if there are too many robots.txt rules.

Chris Faron

03/30/2013 11:34 am

I agree with keeping it simple (I remember the Google one being quite complex though...). Also, it seems strange to even recommend not having one at all, since that creates many 404 errors in your log files.


03/30/2013 03:21 pm

Here are some useful examples of how to write robots.txt rules: Hope this helps.

hire seo expert

03/30/2013 05:20 pm

Nice post. Good explanation about the robots.txt file. Thanks for such a great share.

John Britsios

03/31/2013 12:20 am

I prefer using robots.txt only for setting crawl-delays for user-agents that support it.
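For reference, a crawl-delay rule of the kind John describes is written like this (note that Googlebot ignores the Crawl-delay directive, while engines such as Bing and Yandex have supported it; the agent and value here are illustrative):

```
User-agent: bingbot
Crawl-delay: 10
```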

Jawad Latif

04/01/2013 09:49 am

Totally agreed. Keeping it simple is the best policy.

Jaydeep Kapadia

04/01/2013 12:20 pm

I hadn't read anything about the size limit of the robots.txt file before either.

Chelsea Bowling

04/01/2013 08:25 pm

I understand the reasoning behind keeping a robots.txt simple, but there are also cases when using one is much more convenient than the alternative: going through individual pages and slapping on noindex/nofollow tags, for example, when working with old flat HTML sites. Although maybe the point is to be more selective in what should be noindex/nofollow-ed? Hm, interesting, anyway!


04/02/2013 07:11 am

It is very important indeed to deal properly with robots.txt. As the owner of a digital agency specializing in web design and development, I can understand the importance of this feature.


04/02/2013 07:21 am

Not sure if Google still takes the sitemap reference in your robots.txt seriously. But other than that, keep it simple, and yes, if you don't have anything that you don't want Google to crawl, then I don't think it's necessary to have a robots.txt in the first place.


04/02/2013 11:03 pm

I still advise clients to use the robots.txt. I don't know of any large, dynamic sites that don't have some areas they want blocked. The robots.txt seems to get more respect than robots meta tags and is more versatile. Keep it simple is just good advice for anything you do.
