Google Image Labeler is a fun game people can play to help Google Image Search better understand and tag pictures within the Google Image Search index. The game launched in September 2006 and has flown somewhat under the radar since then.
Recently, I spotted not one but two threads in the Google Web Search Help forums, started by two different people, both reporting pornography in Google Image Labeler. One even saw child pornography. Here is what they had to say:
I was doing the Google Image Labeler when suddenly I am confronted with child pornography to label! I am both shocked and outraged that Google allowed such a thing to happen. How can Google make sure this type of filth does not show up again?
Why is there porn on google image labeler? At first it was just women without anything covering their breasts but I saw a sex scene the other day. Is there any way to report this?
Is there a way to report offensive or inappropriate images in Google Image Labeler? No, not really. You can report images in Google Image Search, but not within the Labeler itself.
Update: Googlers have replied to both threads.
Jaime from Google said:
Thanks for taking the time to post here; we take any instance of abuse toward minors very seriously and will be in contact with you privately so that we can further investigate and take the appropriate action.
While we will certainly report any legitimate abuse we become aware of to the appropriate authorities, I'd also welcome you to help keep children safe by directly contacting the National Center for Missing & Exploited Children's (NCMEC) CyberTipline 24-hours per day, 7 days per week online at www.cybertipline.com or by calling 1-800-843-5678. Reports can be made regarding eight categories of child sexual exploitation such as online enticement, child pornography, or the prostitution of children. Learn more about the categories here.
As I mentioned, someone here will be emailing you shortly to investigate. If you encounter this type of material in our results or in the Image Labeler in the future, don't hesitate to let us know.
Evan from Google said:
The system is designed to only show safe images and we believe we are doing a good job at it, however, false positives do happen. Thanks for pointing this out for us.