Google Image Labeler Plagued With Pornography?

Apr 27, 2009 • 8:36 am | comments (4) | Filed Under Other Google Topics
Google Image Labeler is a fun game people can play to help Google Image Search better understand and tag pictures within the Google Image Search index. This game launched in September 2006 and has been somewhat under the radar since then.

Recently, I came across two separate threads in the Google Web Search Help forums, started by two different people who both spotted pornography in Google Image Labeler. One even reported seeing child pornography. Here is what they had to say:

I was doing the Google Image Labeler when suddenly I am confronted with child pornography to label! I am both shocked and outraged that Google allowed such a thing to happen. How can Google make sure this type of filth does not show up again?

Why is there porn on google image labeler? At first it was just women without anything covering their breasts but I saw a sex scene the other day. Is there any way to report this?

Is there a way to report images that are offensive or inappropriate in Google Image Labeler? No, not really. You can in image search, but not here.

Forum discussion at Google Web Search Help.

Update: Googlers have replied to both threads.

Jaime from Google said:

Thanks for taking the time to post here; we take any instance of abuse toward minors very seriously and will be in contact with you privately so that we can further investigate and take the appropriate action.

While we will certainly report any legitimate abuse we become aware of to the appropriate authorities, I'd also welcome you to help keep children safe by directly contacting the National Center for Missing & Exploited Children's (NCMEC) CyberTipline 24-hours per day, 7 days per week online at www.cybertipline.com or by calling 1-800-843-5678. Reports can be made regarding eight categories of child sexual exploitation such as online enticement, child pornography, or the prostitution of children. Learn more about the categories here.

As I mentioned, someone here will be emailing you shortly to investigate. If you encounter this type of material in our results or in the Image Labeler in the future, don't hesitate to let us know.

Evan from Google said:

The system is designed to only show safe images and we believe we are doing a good job at it, however, false positives do happen. Thanks for pointing this out for us.

Comments:

jackie

12/14/2009 03:45 am

The other day I was on Image Labeler and I got two pornographic images. I think the images just come up at random; if so, it would be hard for Google to review them all, but something still has to be done about it.

timtim

02/25/2010 03:20 am

Porn images are as common as flowers. The goal is to classify, label, and filter out images that promote your or other people's stuff. Deal with it or never get on the image side. Your support is needed.

Rob

05/08/2010 03:29 am

The adult content of google image labeler is why I don't spend much time on it anymore. It's an intriguing system, and some of the landscape images are really amazing, but yes, adult images do frequently appear as part of the system, and this is unfortunate.

BobBobson

02/03/2012 03:49 am

Wow! I have not played Google Image Labeler in a long time, and there was nothing like this back then. I guess it must have changed quite a bit since the last time I played, and the porn must be why they eventually shut it down.
