Is a Robots.txt File Required for Search Engine Optimization?

Aug 31, 2007 • 7:46 am | comments (1) | Filed Under SEO - Search Engine Optimization
 

A Search Engine Watch Forums thread has a simple question. Is a robots.txt file required for SEO?

The answer is no, a robots.txt file is not required.

If you want the search engines to crawl your site, you do not have to do anything. If you do not want them to crawl your site, you can tell them not to with a robots.txt file.

That is not to say the robots.txt file isn't useful. You can use it to control which pages the search engines should not crawl, which can be very useful for duplicate content and SEO purposes.
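For example, a minimal robots.txt sketch for keeping crawlers out of duplicate-content areas might look like this (the `/print/` and `/search/` paths are hypothetical; substitute whatever sections of your own site generate duplicate URLs):

```
# Applies to all crawlers
User-agent: *
# Hypothetical duplicate-content areas to keep out of the index
Disallow: /print/
Disallow: /search/
```

The file lives at the root of the domain (e.g. `/robots.txt`), and an empty `Disallow:` line, or no robots.txt at all, simply lets crawlers access everything.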

In addition, most search engines have added layers of features that you can control via the robots.txt file. So if you want to institute a crawl-delay for Yahoo, you can. If you want to specify a sitemap location, you can. If you want to try to catch rogue spiders, the file can be helpful there too.
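A sketch of those extra features, assuming Yahoo's crawler (Slurp) and a hypothetical sitemap URL, might look like:

```
# Ask Yahoo's crawler (Slurp) to wait between requests
User-agent: Slurp
Crawl-delay: 5

# All other crawlers: no restrictions
User-agent: *
Disallow:

# Sitemap autodiscovery (hypothetical URL)
Sitemap: http://www.example.com/sitemap.xml
```

Note that `Crawl-delay` is a non-standard extension that not every engine honors, and rogue spiders can simply ignore the file, so robots.txt is a request, not an enforcement mechanism.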

There are many useful things you can do with a robots.txt file. But it is not a required element for ranking well.

Forum discussion at Search Engine Watch Forums.

 

Comments:

Gidseo

08/31/2007 01:07 pm

How useful do you think - "distribution" content="global" is if you have a UK site and want to improve it in USA/Australia? Any thoughts much appreciated...
