Google's Strict SafeSearch Shows Porn & Moderate SafeSearch Doesn't

Jul 31, 2012 • 8:13 am | comments (6) | by Barry Schwartz | Filed Under Other Google Topics
 

I know computers and algorithms can be funny, but clearly something is off here: when you search for [wank] in Google Images with the SafeSearch filter set to moderate, little to no pornographic or nude content comes up. But when you raise the SafeSearch filter to the strict setting, plenty of nude and pornographic content comes up.

If the lesser version of SafeSearch is blocking it, what is wrong with the stricter version of the filter?

Here is a picture of the current results under moderate SafeSearch (click for full size):

Here is a picture of the same search after simply changing the filter from moderate to strict (click for full size):

Of course, I blurred out the nude ones. Keep in mind, there are plenty more when you scroll down.

Like I said, I assume this is just a weird quirk in the Google Images SafeSearch algorithm and that Google will fix it, but it is strange nevertheless.

A system administrator who manages about 700 computers noticed this while running some tests. He posted the issue in the Google Web Search Help forums and wrote:

We have about 700 computers and our internal policies ensure that Google settings for safe search are set to “Strict”. With this we normally have no issues with people accessing pornography. During normal testing we have found that searching for the word “wank” provides very inappropriate images that should not be displayed with safe search in use. I tried phoning Google with no luck. Does anyone know how to contact someone who can assist with this?


Forum discussion at Google Web Search Help.

 

Comments:

Justin Seibert

07/31/2012 12:23 pm

It's been bad for a long, long time and not only on suggestive terms, Barry. I won't let my children do image searches by themselves. At least if we're searching photos of kittens or backyard clubhouses together and something questionable pops up, I can quickly close the screen.

Lee Beirne

07/31/2012 12:46 pm

Ok, should I ask how you came across this discrepancy, Barry?

Lee Beirne

07/31/2012 12:49 pm

Ok, should I ask how you found that discrepancy, Barry?

Barry Schwartz

07/31/2012 01:57 pm

I mentioned above how I found it. I watch the forums and a sys admin said this was an issue. See above, I linked to the source.

Barry Schwartz

07/31/2012 10:45 pm

I said how in the post. I watch the forums and someone complained. I linked to it. This is how I find everything I write here.

BobBobson

03/18/2013 10:27 pm

It makes sense that Google would want to avoid being a purveyor of porn, but how does SafeSearch know when a picture is “offensive” or not? Obviously they cannot write an algorithm that can reliably flag pictures without false-positives, so of course they have to be manually flagged by people. Considering that there are a couple of trillion pictures on the Internet, it’s not going to be practical since many/most will not be accurately flagged. Moreover, what is stopping accidental or even purposeful false-positives? For example, what’s to stop a company from having a bunch of people sit around and flag pictures of their competitor’s products all day? Google employees would have to manually review each and every flag which is as impractical as it gets. SafeSearch is a good *idea*, but it’s not foolproof, so it should be up to users to use it or not, not to Google. For scenarios where a user *must* filter out inappropriate images (schools, offices, etc.), there are better, more appropriate filtering technologies than SafeSearch, so again, it should be up to the user to enforce filtering, NOT Google.
