Make Sure Your Robots.txt File is UTF-8

Jul 28, 2008 • 7:45 am | comments (1) | Filed Under SEO - Search Engine Optimization
A Google Groups thread tells the tale of a webmaster who had issues with his robots.txt file. The robots.txt file had been uploaded with a byte-order mark (BOM) at the start, which threw off Google when it tried to retrieve and parse the webmaster's robots.txt file.

Google Groups member Phil Payne spotted the issue right away by inspecting the raw HTTP response with rexswain.com/httpview.html. The HTML editor the webmaster was using had saved his robots.txt file with a BOM. Google and other search engines prefer to see the robots.txt file in plain UTF-8 encoding, without a leading BOM.

Googler JohnMu confirmed the issue, saying:

Phil was right on target there, it seems the BOM at the beginning of the file might be throwing us off. The easiest way to get around this issue is to have an empty line (or a comment) in the top of your robots.txt file -- that way it'll work even if you have a BOM in your file.
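JohnMu's workaround can be illustrated with a robots.txt like the one below. The directives themselves are just placeholders; the point is the throwaway first line, so that if a BOM does sneak in, it only corrupts a comment rather than the first real directive.

```
# This leading comment line absorbs any stray BOM,
# so the first real directive still parses correctly.
User-agent: *
Disallow:
```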

In short, the webmaster fixed the encoding issue by editing the file manually and re-uploading it.

Forum discussion at Google Groups.

Comments:

Vincent Wehren | SEO

04/11/2011 03:54 am

Robots.txt files with a UTF-8 signature throwing off Google is still very much an issue today. I ran into this, as documented in my post at http://vincentwehren.com/2011/04/09/robots-txt-utf-8-and-the-utf-8-signature/, along with some theories about the possible why.
