Don't Block Your 301 Redirects with a Robots.txt File

May 27, 2008 • 5:56 am | comments (4) | Filed Under SEO - Search Engine Optimization
 

A Google Groups thread has a very interesting discussion that is nearly wrapped up. The thread takes you through the life cycle of a 301 redirect: a site owner moved from domain.com to domain.info as part of a domain name sale, but wanted to retain his links, so he set up a 301 redirect from the .com to the .info for a certain period of time.

Beyond the thread covering a ton of details that are critical to such a move, I wanted to highlight one point made by Googler JohnMu. John said that you should not use "the robots.txt to block crawling while you have a 301 redirect enabled for the domain. By blocking crawling, you're effectively blocking the search engines from recognizing the redirect properly."
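To make John's point concrete, here is a quick sanity check - a sketch of my own, not anything from the thread. It asks the old domain for its homepage without following redirects (so you can see the 301 itself) and then checks whether the old domain's robots.txt still lets crawlers in; "olddomain.com" is just a placeholder for whatever domain you moved away from.

    # Sketch: verify the old domain answers with a 301 and isn't blocked by robots.txt.
    # "olddomain.com" is a placeholder for the domain you moved away from.
    import http.client
    import urllib.robotparser

    OLD_HOST = "olddomain.com"  # hypothetical old domain

    # 1. Request the homepage without following redirects; we want to see the 301 itself.
    conn = http.client.HTTPConnection(OLD_HOST, timeout=10)
    conn.request("HEAD", "/")
    response = conn.getresponse()
    print("Status:", response.status)                       # should be 301
    print("Redirects to:", response.getheader("Location"))  # should be the new domain
    conn.close()

    # 2. Confirm robots.txt on the old domain isn't blocking crawlers; if it is,
    #    search engines may never fetch the URL and never see the redirect.
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url("http://" + OLD_HOST + "/robots.txt")
    robots.read()
    print("Googlebot allowed:", robots.can_fetch("Googlebot", "http://" + OLD_HOST + "/"))

If the first check shows a 301 pointing at the new domain but the second check prints False, you are in exactly the situation John warns about.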

I wonder how many people actually do that, because I never would have thought anyone would.

Beyond that, there is some discussion on how long the 301 should stay in place before handing the old domain over to someone else. If you leave the 301 up for three weeks, hand the old domain over to the new owner, and that owner then drops the redirect, will Google return the old links to the old domain or keep them credited to the new one? Some suggest keeping the 301 live for at least six months.

There are many tips in the thread for such a process, including collecting as much linkage data as you can from the previous domain. You can collect linkage data via Yahoo Site Explorer, Google Webmaster Tools, your web analytics, your own database scripts and more. That way you can go back to those sites and ask them to update their links to point to the new domain.
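If you want to turn that linkage data into an outreach list, here is a rough sketch of one approach, assuming you have downloaded the link reports as CSV files with the linking URL in the first column; the file names and column layout below are my own assumptions, not anything from the thread.

    # Sketch: pool backlink exports and print the unique linking domains to contact.
    # The file names and the "URL in the first column" layout are assumptions.
    import csv
    from urllib.parse import urlparse

    EXPORT_FILES = ["yahoo_site_explorer.csv", "webmaster_tools_links.csv"]  # hypothetical exports

    linking_domains = set()
    for path in EXPORT_FILES:
        with open(path, newline="") as handle:
            for row in csv.reader(handle):
                if not row or not row[0].startswith("http"):
                    continue  # skip header rows and blank lines
                linking_domains.add(urlparse(row[0]).netloc.lower())

    # A deduplicated list of sites to ask about updating their links to the new domain.
    for domain in sorted(linking_domains):
        print(domain)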

Forum discussion at Google Groups.

Comments:

John Honeck

05/27/2008 03:39 pm

I don't know what Google officially suggests, but for me, as long as there is a link out there, I'd keep the redirect in place: 6 weeks, 6 months or 6 years, whatever it takes. The old idea of designing for people and not search engines comes into play; remember, humans actually do click on those links sometimes, and I'd want them directed to the right page and not some 404 or someone else's site. As far as the robots.txt thing goes, I've seen it often when people want stuff removed from the index. They'll add a NOINDEX meta tag to a page, then block the page/directory with robots.txt, meaning a crawler will never see the NOINDEX tag, so the listing will just go URL-only but stay in the index.

Jim Lee

05/28/2008 01:40 am

I agree with keeping the redirect in place for as long as possible unless you have some compelling reason to remove it. I have one in place now and I'm leaving it and the original URL in place for at least a year.

Matthew Kee

12/16/2008 11:22 pm

Thanks. Very helpful. We had our forum at www.*****.us and a second site at www.*****.org. We created a search engine and realized we would have to have two (one for each site), so we moved www.*****.us to www.*****.org/forum to make things simple. Now the site will be bigger and should rank better.

No Name

11/16/2009 11:52 pm

I just redirected my site and hope the old backlinks will carry over, since I owned both domains from the start and had many unique backlinks. I'm also keeping the redirect in place for as long as possible.
