Lazy Google Quality Raters, Can It Be?

Sep 9, 2011 • 8:38 am | Filed Under Google Search Engine Optimization
 

A WebmasterWorld thread has a post from a webmaster who decided to find out how much investigative work a Google Quality Rater actually does when reviewing and responding to a Google reconsideration request.

The user wrote some scripts to pinpoint the Googler(s) accessing his site via his log files. He then combed through the logs to see how many pages the Googler looked at before responding to the reconsideration request. He was very disappointed with the results.
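The poster didn't share his scripts, but the general approach is straightforward: filter the access log for hits from the IP range he mentions (216.239.x.x) and list the pages those visits requested. Here is a minimal sketch, assuming an Apache "combined" log format; the regex and the sample log lines are illustrative, not the webmaster's actual data.

```python
import ipaddress
import re

# Matches the Apache "combined" log format (an assumption about the
# webmaster's setup; adjust the pattern for other server log formats).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

# The subnet the WebmasterWorld poster saw the hits come from.
GOOGLE_NET = ipaddress.ip_network("216.239.0.0/16")

def google_hits(log_lines):
    """Return (timestamp, path) pairs for requests from the target subnet."""
    hits = []
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        try:
            ip = ipaddress.ip_address(m.group("ip"))
        except ValueError:
            continue  # skip malformed or hostname-only entries
        if ip in GOOGLE_NET:
            hits.append((m.group("time"), m.group("path")))
    return hits

# Hypothetical sample entries for illustration only.
sample = [
    '216.239.51.10 - - [08/Sep/2011:09:15:00 -0400] "GET /forum/thread-123 HTTP/1.1" 200 5120',
    '93.184.216.34 - - [08/Sep/2011:09:16:00 -0400] "GET / HTTP/1.1" 200 1024',
]
print(google_hits(sample))
```

Note the obvious caveat, which the post itself runs into: an IP range alone can't distinguish a human rater from an automated Google fetch, so any conclusions drawn this way are tentative.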

He said:

So, anyhow, I was able to see a visit exactly 1 day (approx 25 hrs) before each of the events - ban and the response to my recon request. Both were from the 216.239.x.x subnet although earlier there were hits from other Google networks, too.

I was rather disappointed to see that before banning the site the rater visited a very drab and ordinary page on my site. Not a smoking gun of some incriminating evidence of a hacker break-in or some such I was looking for. Also disappointing is the fact that they visited one page only. I can't tell how long they have stayed on the page but can you make such a drastic decision about a 400,000+ pages site by looking at just one of those pages?

Probably even more disappointing yet is the way they treated the reconsideration request. A person came in and, indeed, only looked at a single page again. Only this time it was simply the homepage. My site is a forum, so the homepage contains pretty much only a list of the most recent threads - not much else to see there. At least the page they looked at before banning was representative of the layout (including ads layout which I hear they hate so much now). The only conclusion they could possibly have made by looking at the homepage and weighing my reconsideration request was that the site's still up. Apparently, that was enough to reject the request.

Now, he may not have considered that his script might have been wrong in isolating the human Google evaluators. Maybe they are not on Google IP ranges? Maybe they are not at Google at all?

Also, maybe the site's issue isn't with its content - maybe the issue is related to off-page factors. Or maybe an algorithm detected the issue, in which case he has to wait for the algorithm to lift the penalty.

There are many factors in question here, but it is an interesting observation nevertheless.

Forum discussion at WebmasterWorld.

Photo Credit: Nathanael VALERO on Flickr.
