We are now one day away from the one-year anniversary of the Google Panda update, and there are still many, many webmasters who have been unable to rescue their sites from the Panda quality factors.
They have tried everything, from removing tons of "low quality" pages, to disallowing portions of their sites, to rewriting all their content, but they still find themselves with horrible Google rankings, leading to horrible sales and revenue for the site.
A WebmasterWorld thread has one Panda victim saying he is out of options. He wants to throw a Hail Mary pass and see what happens; at this point, he said, he has "nothing to lose." He added that his "site is deader than that parrot on Monty Python."
So what is his Hail Mary pass?
Completely disallow Googlebot from accessing his web site for six months, then re-allow access and see what happens.
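The thread does not spell out exactly how he plans to block Googlebot, but the typical way to do it is via robots.txt. A minimal sketch, assuming a robots.txt-based block, might look like this:

```
# robots.txt at the site root - blocks only Google's crawler
User-agent: Googlebot
Disallow: /
```

Note that blocking crawling this way does not immediately remove pages from Google's index; it only stops Googlebot from fetching them, so the effects of a six-month block (and of lifting it) are hard to predict.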
Will it work? Who knows.
WebmasterWorld administrator Tedster said:
It's your business and your choice. I have heard from webmasters who made this choice (this was before Panda) when it seemed like Google was just not going to forgive whatever they hadn't liked. In at least one case, moving on did bring some relief and focusing on other traffic sources returned them to profitability.
What do you think?
Forum discussion at WebmasterWorld.
Image credit to ShutterStock for Panda picture.