Lazy Google Quality Raters, Can It Be?

Sep 9, 2011 • 8:38 am | comments (7) | Filed Under Google Search Engine Optimization
 

A WebmasterWorld thread has a post from one webmaster who decided to see how much investigation a Google Quality Rater actually does when reviewing and responding to a Google reconsideration request.

The user wrote some scripts to pinpoint the Googler(s) accessing his site via his log files, then ran through the logs to see how many pages the Googler looked at before responding to the reconsideration request. He was very disappointed with the results.

He said:

So, anyhow, I was able to see a visit exactly 1 day (approx 25 hrs) before each of the events - ban and the response to my recon request. Both were from the 216.239.x.x subnet although earlier there were hits from other Google networks, too.

I was rather disappointed to see that before banning the site the rater visited a very drab and ordinary page on my site. Not a smoking gun of some incriminating evidence of a hacker break-in or some such I was looking for. Also disappointing is the fact that they visited one page only. I can't tell how long they have stayed on the page but can you make such a drastic decision about a 400,000+ pages site by looking at just one of those pages?

Probably even more disappointing yet is the way they treated the reconsideration request. A person came in and, indeed, only looked at a single page again. Only this time it was simply the homepage. My site is a forum, so the homepage contains pretty much only a list of the most recent threads - not much else to see there. At least the page they looked at before banning was representative of the layout (including ads layout which I hear they hate so much now). The only conclusion they could possibly have made by looking at the homepage and weighing my reconsideration request was that the site's still up. Apparently, that was enough to reject the request.
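For context, the kind of log filtering he describes could be sketched roughly like the snippet below. This is a minimal sketch, not his actual script: the log file name and the combined log format are assumptions, and the only detail taken from his post is the 216.239.x.x range he says the hits came from.

```python
# Minimal sketch: scan an Apache/nginx "combined" access log for hits from
# the 216.239.x.x range mentioned in the thread and list which pages were
# requested and when. LOG_FILE and the log format are assumptions.
import ipaddress
import re
from collections import Counter

LOG_FILE = "access.log"  # hypothetical path to the raw access log
GOOGLE_NET = ipaddress.ip_network("216.239.0.0/16")  # range quoted in the post

# Combined log format starts with: IP, identd, user, [timestamp], "METHOD path ..."
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST|HEAD) (\S+)')

hits = []
with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        match = LINE_RE.match(line)
        if not match:
            continue
        ip, timestamp, path = match.groups()
        try:
            addr = ipaddress.ip_address(ip)
        except ValueError:
            continue  # skip malformed client addresses
        if addr in GOOGLE_NET:
            hits.append((timestamp, ip, path))

# Print each matching visit, then summarize how many distinct pages were viewed.
for timestamp, ip, path in hits:
    print(f"{timestamp}  {ip}  {path}")
distinct_pages = len(Counter(path for _, _, path in hits))
print(f"\n{len(hits)} hits from {GOOGLE_NET}, {distinct_pages} distinct pages")
```

A count of one or two distinct pages in the day before a ban or a reconsideration response is essentially the observation the poster is making; whether those hits really belong to a human rater rather than a crawler is the open question discussed below.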

Now, he may not have considered that his script might have been wrong in isolating the human Google evaluators. Maybe they are not on Google IP ranges? Maybe they are not at Google at all?

Also, maybe the site doesn't have an issue with content - maybe the issue is related to off-page factors? Or maybe the algorithms detected an issue, and thus you have to wait for the algorithms to lift the penalty?

There are many factors in question here, but it is an interesting observation nevertheless.

Forum discussion at WebmasterWorld.

Photo Credit: Nathanael VALERO on Flickr.

 

Comments:

Will Spencer

09/09/2011 04:03 pm

Google does not care about you. To Google, dropping one search result just means that another site moves up to take its place. The top 11 results, minus #2, are virtually indistinguishable from the top 10 results. The net cost to Google for removing a useful site from the search results is very very very close to zero. We as web publishers are in a very competitive space. We're all competing for a limited number of SERPs. Google, on the other hand, has very little in the way of serious competition. They are simply not motivated to do better.

jeffyablon

09/09/2011 04:26 pm

Let's remember something: If you have "400,000 pages", you're either a Content Farm (hello AOL), a forum, or someone who's figured out how to make your site LOOK like it's "big" by cross-linking (this being an SEO forum I'll assume you all understand that one). Which of those actually "deserve" high ranking? I concur that the Big G doesn't care about anyone (other than advertisers), and their reviews of reconsideration requests are cursory once-overs at best. But seriously, what do you expect? Puh-leaze; the 400,000 pages almost certainly doesn't equal "high quality".

Manuel

09/10/2011 07:06 pm

He says his site is a forum.

F-Google

09/11/2011 02:04 am

Google has no incentive to give a phack, now that they are a monopoly. That's why they need to get a reality check.

darrencross

09/11/2011 08:40 pm

"Or maybe the algorithms detecting an issue and thus you have to wait for the algorithms to lift the penalty?" Maybe they know it's an algo penalty and therefore are powerless to do anything other than just look at the site out of curiosity? Of the number of sites penalised I'm guessing a very small percentage are done so manually?

jeffyablon

09/16/2011 12:25 pm

Yes, I saw that, too. But forums aren't good content. They're LOTS of content, and some may be good (and some forums may have lots OF good content), but to look at 400,000 comments, call each a page (as it ... well ... is), and think that makes you worth ranking? Nonsense.

Anon

07/29/2013 04:30 pm

A source has told me they've worked with lazy and unintelligent raters who can't even form a logical sentence and many don't even understand English. The program is closing, and there are many other imposters that are not legitimate. Some contracts failed to live up to their standards and have not been renewed.
