Google Has 10,000 Human Evaluators?

Jul 10, 2007 • 7:44 am | comments (9) | Filed Under Google Search Engine Optimization

At the Seattle Conference on Scalability, Google's Marissa Mayer revealed, in response to an audience question, that Google has 10,000 human evaluators who manually go through the search results and rate them.

You can find coverage of this keynote at Dare Obasanjo's blog.

Q: How do they tell if they have bad results?

A: They have a bunch of watchdog services that track uptime for various servers to make sure a bad one isn't causing problems. In addition, they have 10,000 human evaluators who are always manually checking the relevance of various results.

We have reported on Google's Human Touch, and on people finding trails of these human reviewers on their sites, in the past. But 10,000 human evaluators! That seems a bit large.

Most people agree that the 10,000 figure sounds large. The question is, does it include internal Google staff who have this rater hub evaluation software installed on their machines?

Forum discussion at WebmasterWorld.




07/10/2007 02:35 pm

Google does use human 'vetting agents', albeit through third parties, one of which is Lionbridge.


07/11/2007 05:27 am

A human process plus a little technology can go a long way... especially if that human process costs less due to offshoring.


07/11/2007 02:53 pm

I find it so hard to believe... and if true, I agree with the comment that it's probably offshore [ zimbobway is rating your site :) ]

Joseph B

07/11/2007 03:52 pm

Jokes like that are funny, if you can actually spell. It's Zimbabwe silly.

Shaun Morgan

07/11/2007 04:36 pm

Google. You love 'em and you hate 'em. When I am searching the net, trying to learn something, they are a godsend; I can usually find what I'm looking for in the first few results. On the other hand, when I am running on a shoestring budget and trying to make some money with AdSense, they are a nightmare. Google wants webmasters to run ads, but then Google does not want those webmasters to make any type of suggestion that would lead a customer to click the ad.

So, with Google having 10,000 covert storm troopers out there, I suppose that all the things one might do to compete with the big-money and grandfathered sites are a waste of time. For instance, Google frowns on cloaking, which they explain (in my words) as showing the user something different than what you show the search engine. That describes anyone who uses CSS to position stuff like links and ads at the bottom of the page when in fact they are near the top in the markup. Googlebot sees the links at the top of the page and likely assigns them more value. The user, however, will see the content and ads first, and be forced to scroll the whole page past the ads before finding the navigation links. If a webmaster is trying to make money using AdSense, that type of page layout is the ideal way to get customers to click through ads.


07/11/2007 06:30 pm

Google could probably afford all those evaluators and then some... especially since they're probably all outsourced anyway.

Mark H

07/12/2007 02:23 am

Bot + algorithm = impartial monitoring of human use patterns. Human reviewer = personal bias, company bias, big-spend advertiser bias. I.e., DMOZ.


07/13/2007 07:49 pm

They are the 'borg'. Remember those query-handling server dumps? Look more closely at the configuration parameters. P.S. If you're in the UK, it was common knowledge, and confirmed by me ages ago, that they recruited degree-qualified contractors from all sorts of backgrounds via Kelly Services; we even employ a few ex-evaluators. You can still see the job ads online :-)


09/17/2008 04:54 pm

I know this is a post from 14 months ago, but recently I have noticed an increase in visitors from Google Inc. around some high-traffic terms, even country-specific ones... very strange.
