A Drop in Traffic After Relaunching an SEO Friendly Version of a Site

May 14, 2007 • 7:03 am | comments (4) | Filed Under SEO - Search Engine Optimization

You follow all the rules, you test all the scenarios, you make your version two site 100% search engine friendly. Then you launch the site and bam, your search referrals drop 60%. How could this have happened?

That is the current discussion in a Cre8asite Forums thread. The honest answer is that it does happen. Even if you follow all the recommendations in my Version 2: Relaunching a Site: SEO Considerations, including the incredible feedback in the comments, you are very likely to see a drop in traffic.

Why? Because you are potentially changing thousands of URLs and the hundreds of signals that go along with those URLs. Even the fastest computer needs time to compute this data, trust it and then push those signals through to the new URLs.

Here is an interesting question I was asked recently. A friendly competitor came to me the other day with a question. They launched a site and changed all the URLs, but did not set up any 301s yet. They did set up new URLs, but the thing is, the new URLs and old URLs both worked. Yes, a duplicate content issue. The site had just launched, and he asked what he could do now.

I suggested the following and I would love your feedback: (1) Get working on 301ing the new URLs to the old URLs (the old URLs were more search engine friendly because it was a static site). (2) Set up a database parameter to control which page gets which URL (more manual work, but it made more sense in this case, since they were keeping the old URLs). (3) Nofollow the new URLs or exclude them in the robots.txt file until numbers 1 and 2 can be implemented, hopefully within a week or so.
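To make step 1 concrete, here is a minimal sketch of what those 301s might look like in an Apache .htaccess file. The URLs here are made up for illustration; the real rules would depend on how the new dynamic URLs are structured:

```apache
# Hypothetical example: 301 the new dynamic URLs back to the old static pages.
# RewriteRule patterns match the path only; the query string is tested
# separately with RewriteCond.
RewriteEngine On

# e.g. /product.php?id=42  ->  /products/blue-widget.html
RewriteCond %{QUERY_STRING} ^id=42$
RewriteRule ^product\.php$ /products/blue-widget.html? [R=301,L]

# A one-off page rename can use a simple Redirect instead:
Redirect 301 /new-about-page /about.html
```

The trailing `?` in the RewriteRule target strips the old query string so the redirect lands on a clean static URL.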

I suspect you need more information to judge for yourself, but that is all I can provide.

In the case at the top of this article, they tested every scenario and set up all the 301s and mod_rewrites from the start. Even so, they still saw an initial drop in traffic. I am confident that traffic will get better sooner rather than later - if the content is worthy of it. In the case I mentioned above, I found it interesting that the site was launched, there was a duplicate content issue and a potential issue of losing the current rankings. But in both cases, the issues should only be temporary.

Forum discussion at Cre8asite Forums.



Simon Cullum

05/14/2007 11:56 am

A great article indeed. One other stance on this scenario is that it's not always possible to utilise 301 redirects, possibly because of the way the content management system operates. This was the case recently for the company that I work for. After a lot of hard thinking and many headaches, I came up with the following workaround:

1) The website has its "Friendly URLs" edited within the content management system and synchronised so that they appear live.

2) As soon as this is done, the Google Sitemap is re-created, listing all the new URLs. However, for the original URLs that we want to keep, as they are appearing high in the search engine rankings, we replace the new URLs in the sitemap with the original ones. This means that Google's spiders will pick up all the new URLs plus a few of the original ones which would otherwise now be invisible to the spiders (the new URLs have cloaked the original ones, not replaced them, which was potentially a huge duplicate content issue). The sitemap is then resubmitted through Google's Webmaster Tools. (Please note that although I refer to these as "Google Sitemaps", they now also work for the other major search engines' spiders, so long as a line is included in the robots.txt file pointing the spiders to the sitemap.)

3) Next, the robots.txt file is edited, adding a Disallow: for the new URLs that we do not want spidered, because we want the corresponding old URLs spidered instead. This negates the possibility of having duplicate content (one page with two different URLs).

So in practice, when a spider comes along, it first reads the robots.txt file, is immediately pointed to the Google Sitemap and follows the URLs listed there to spider, while at the same time obeying the robots.txt file and not spidering the Disallowed URLs. The only original URLs that are spidered are the ones we list in the Google Sitemap, whilst the robots.txt file prevents the corresponding new URLs from being spidered. This has worked for us and seems to have the same end result as using a 301 redirect, if only we were able to use one!
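As a sketch of the robots.txt side of this workaround (the paths and domain here are invented for illustration, not Simon's actual site):

```
# Point all spiders at the sitemap; since the sitemaps.org autodiscovery
# line was adopted, this works for the other major engines too, not just Google.
Sitemap: http://www.example.com/sitemap.xml

User-agent: *
# Block the new "friendly" URLs whose old equivalents we want to keep ranking:
Disallow: /products/blue-widget/
Disallow: /products/red-widget/
```

The sitemap then lists the old URLs for those two pages and the new URLs for everything else, so each page is exposed to spiders under exactly one address.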

Paul Connolley

05/14/2007 10:46 pm

Just wanted to stick my 2p in here as I've just transferred my personal blog from one bespoke CMS to a new one (my own creation for the firm I work for). As I am a firm believer in persistent URIs and because I had decided on creating a better URI scheme, I built in some nifty 301 handling that transfers the old resource to the new. It only took Google about 2 weeks, and Yahoo roughly the same, to find the new resources. I also saw a drop in traffic. Admittedly I'm only talking about a personal website, but it was noticeable. On other websites where I didn't have the ability to create 301s (clients with MS IIS hosting) I have struggled. I also find that blocking those formerly important URIs using the robots.txt helped immensely, after experiencing problems with another client whose former pages were still in Google's index after four months.
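The kind of 301 handling Paul describes can be sketched in a few lines. His CMS is bespoke, so this Python snippet with a made-up old-to-new URI map is only an illustration of the idea, not his implementation:

```python
# Minimal sketch of persistent-URI handling: look up the requested path in a
# mapping of retired URIs and answer with a 301 to the new location;
# otherwise serve the page normally.

# Hypothetical mapping from old URIs to their replacements.
REDIRECTS = {
    "/blog/2006/my-old-post.php": "/2006/my-old-post/",
    "/blog/archive.php": "/archive/",
}

def handle_request(path):
    """Return a (status, location) pair for a request path."""
    if path in REDIRECTS:
        return (301, REDIRECTS[path])  # permanent redirect to the new URI
    return (200, path)                 # not a retired URI; serve as usual

print(handle_request("/blog/archive.php"))
```

The point of using 301 (permanent) rather than 302 (temporary) is that it tells the engines to transfer the old URI's signals to the new one.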


05/15/2007 01:43 am

When we change our site, the traffic usually gets affected. This article is one of the good resources about Search Engine Optimization.

Greg Howlett

05/15/2007 11:33 pm

We recently did a massive update to Vitabase, which included moving from ASP to .NET. This, of course, meant that most page names changed. We did 301 redirects and hoped for the best. We were shocked at the carnage we faced. We lost most search engine rankings and worked for months to get them back (we eventually did). We are now stronger than ever, and a lot smarter. Do not expect SE-related changes to help you immediately. You are likely to see a big drop initially.
