Google Search Suggests "Gay Should Die": Google Policy Team?

Nov 25, 2013 • 8:03 am | Filed Under Google Search Engine

If you go to Google and enter into the search box [gay should ], Google will offer up suggestions.

One searcher was doing a report on reasons why "gay should marry," but didn't complete his search and saw these suggestions:

Google Search Suggests

In fact, his first suggestion, as his screenshot in the Google Web Search Help forums shows, was "Gay Should Die." I am not sure why his search showed that first and mine did not.

But as you can imagine, this can be incredibly upsetting.

A top contributor said, "I have escalated your question for the Google Team to look at. It's a weekend for many and it may not be addressed today."

It will be interesting to see how Google handles this. If you have not seen it yet, you should read our search policy post, where Google talks about how they walk the thin line between leaving the algorithm untouched and dealing with situations like this.

Forum discussion at Google Web Search Help.



11/25/2013 01:20 pm

This is really a reflection on the overwhelming ignorance and fear of 'the other' that is so dutifully preached to various flocks and easily harped on, because fear is easy to stoke and keep roiling. I think they shouldn't clear it up manually; let it be a testament to how far we still have to go, lest we think ignorant hatred isn't alive and well. Maybe one day that changes on its own... Until then it is a shameful part of our species that shouldn't simply be swept under the rug and left to fester.

Besides, that is nothing. You want to see full-throated bigotry and hatred on display? Head on over to the Yahoo news comments; they are a fantastic reality check for those of us who find ourselves living in a self-created bubble of 'live and let live'.

Want more depressing auto-completes? "woman should "/"women should not "; or how about "muslims should "/"muslims should not "; or how about "jews should "/"jews should not "... ...unfortunately there are many shameful auto-completes out there.


11/25/2013 01:25 pm

Ugh, typed out a response only to have it disappear. Barry, is there some rhyme or reason as to why certain posts are automatically not shown after submitting? I find length can be one factor... but there must be certain words or phrases that trigger this as well, no?


11/25/2013 01:42 pm

...hmm, or I am just impatient. ::embarrassed::


11/25/2013 03:15 pm

This is exactly what I've been talking about here :


11/25/2013 04:30 pm

You know that Google suggests "men should ... all die" too? Note that a French Jewish association successfully managed to stop "jew" from being appended to celebrity names in Suggest, so it's possible ... if there is legal action.

PM Fiorini

11/25/2013 05:05 pm

@barry Interesting... There is another search that could be construed as offensive as well. Type in "muslims should" and Google Suggest shows "muslims should be killed" and "muslims should be banned". Google's algorithm is clearly showing intolerance.


11/25/2013 05:07 pm

The users who search on Google are showing intolerance. The algorithm doesn't know or care about the differences between groups of people.


11/25/2013 05:38 pm

Well put. It's not Google at fault here, it's us as a species.


11/25/2013 05:42 pm

Has anyone heard reporters like Barry or Danny ask Googlers pointed questions at their conferences, like: "How does this correlate with your endless reports about your smart algo? How is your algo smart when it promotes obviously stupid things? Or do such things not look stupid to you?"


11/25/2013 05:56 pm

No comments

Emma North

11/25/2013 06:12 pm

Completely agree. I don't think it's down to Google to stop showing this sort of thing in auto-suggest - if people are searching for this stuff then it's going to be shown there. That said, perhaps Google could/should better educate the casual user as to what that auto-suggest actually is, to ensure they know that it isn't Google's idea of right or wrong and is just what some ignorant morons search for.


11/25/2013 06:17 pm

Then this is the wrong algo. The Internet was created to make the world better, not to promote the worst it has. I even remember one company had the motto "Don't be evil".


11/25/2013 06:32 pm

Auto-complete is based on the most frequently searched words/phrases. Why should they go in and change what is displayed because it might show a negative side of the people using the service? And how is this against 'don't be evil'? Censorship isn't an ideal, even if it covers up something ugly; especially if it covers up something ugly.
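As an illustration only (a hypothetical sketch, nothing like Google's actual system), a frequency-based completer can be as simple as counting a query log and returning the most common entries matching a prefix:

```python
from collections import Counter

def build_suggester(query_log):
    """Build a suggest function from a list of logged queries."""
    counts = Counter(query_log)  # how often each full query was typed

    def suggest(prefix, k=5):
        # Return up to k logged queries starting with the prefix,
        # most frequent first.
        matches = {q: c for q, c in counts.items() if q.startswith(prefix)}
        return [q for q, _ in Counter(matches).most_common(k)]

    return suggest

suggest = build_suggester([
    "gay marriage laws", "gay marriage laws", "gay rights history",
    "gadget reviews",
])
print(suggest("gay "))  # most frequent completions first
```

The point the comment makes falls out of the sketch: the suggestions are purely a ranking of what people actually typed, with no judgment attached.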


11/25/2013 07:09 pm

Well, I also never could understand what the problem is with occasionally showing porn on major TV channels, when sex is something most people do on a regular basis. Or why did this get the red band (watch the trailer)? I'm absolutely against any censorship, and that's not the issue here. Bing uses an algo as well. But somehow Bing's algo invites discussion (see pic above). Does that mean Bing visitors are more tolerant? Or does it mean there is a way to keep an algo from being evil without censorship?


11/25/2013 08:37 pm

@ethalon How about this: is it also just a reflection?


11/25/2013 08:52 pm

I do not know what this is supposed to prove. You found a bug in a big, complex system... does that prove anything?


11/25/2013 09:06 pm

Maybe it means that Bing has a much smaller sample size to pull from? Maybe Bing isn't the 'default' search engine for most people, and therefore the sample is being pulled from a different type of user base? Maybe Bing actively polices its auto-complete for 'controversial' topics? I don't know... ...but change the query to the way native English speakers would phrase the start of that query: "gays should ". You see much uglier results, probably because you now have a much larger sample size due to the way that query would most often be entered (using a plural for the noun).


11/25/2013 09:08 pm

This is not just a bug, but a bug related to objective information. I suppose I don't need to explain to you that objective information is pure information not influenced by anything. This is a pure SELECT query, and if Google can't handle even such simple queries, then the problem is much deeper than just a bug or a reflection. Ask any serious programmer.


11/25/2013 09:15 pm

Don't get me wrong, it is certainly an error that should be addressed... ...I just don't see it as a deep indicator of something worse boiling under the surface. Who knows, maybe I will be the mayor in Jaws on this one and later realize I should have taken it more seriously.


11/25/2013 09:23 pm

1. According to the science called statistics, Google and Bing have equivalent samples. Past a certain number it doesn't matter whether the group is 1 million or 5 million people; results will be approximately the same. This is how exit polls are collected during elections. 2. Again, it doesn't matter; approximate figures remain the same. 3. Maybe. I also don't know... ...the fact is, Bing doesn't offend. I used the query Barry started from. But even "gays should" doesn't call for action.


11/25/2013 09:29 pm

1 and 2) Fair point. Didn't think that one through. The last bit: interesting that the plural of 'gay' produces drastically different results. Try 'gays should ' in Google and you get much more positive auto-completes... but it is the opposite in Bing. 'gay should ' = not terrible auto-complete in Bing; terrible auto-complete answers in Google. 'gays should ' = not terrible auto-complete in Google; terrible auto-complete answers in Bing. So your argument against the algo/auto-fill... would seem to apply to both, no?


11/25/2013 09:35 pm

"Deep indicator of something": when we hear success stories from Google almost every month, but reality shows different stories, then this is the deep indicator. Pay attention again that this is objective information we're talking about. Not influenced, not hacked, not spammed, nothing from the outside; a pure inner SELECT query )


11/25/2013 09:37 pm

I hear you, I just don't agree with your conclusion. Like I said, maybe I am wrong and you are out hunting the shark. Fair enough.


11/25/2013 09:44 pm

"gays should" is really much better in Google. And I don't see why you believe it's terrible in Bing. Because of "executed"? But that one is related to a specific person, the "Arizona pastor". That is objective information: the pastor said it, and the search engine reflects it as related to the quote. In the first case, about Google, it is not objective, not related to a specific person. If it were (as in Bing), then it would be different.


11/25/2013 09:45 pm

Ok )


11/25/2013 09:57 pm

"And I don't see why you believe it's terrible in Bing": I am searching from their homepage and the auto-complete results are:

1) the one you mentioned
2) n't be able to adopt
3) not be allowed in church
4) not have equal rights
5) burn in hell
6) be allowed in the military
7) be allowed to adopt
8) not be allowed to adopt

The pastor one wasn't what I was referring to...


11/25/2013 10:06 pm

I'm out ) Too much time here today ... Back to work ... Thanks for conversation )


11/26/2013 11:29 am

Hope they fix the bug and don't change the algorithm as a whole, or else there will be yet another Google update.

Link Juice

11/26/2013 12:21 pm

Tricky one... Another suggestion was "gay should not be allowed to marry", but what if it was in the context "many socially retarded people think that gay should not be allowed to marry"? What if the example in this story was generated by a big story condemning a march in the deep south where people were reported to be chanting "gay should die"?

PM Fiorini

11/26/2013 04:30 pm

I know...I'm being sarcastic - What I said is supposed to be funny - not taken seriously - LOL!


11/26/2013 04:38 pm

Again I need to start the lobby for an accepted sarcasm tag so I can avoid looking like an asshat.


11/26/2013 04:55 pm

It's not a 'simple select'; it's the system that automatically generates the items to go inside the select list that is the problem. The select list itself works fine; it's the more complex system behind it that has the problem, and the problem is that it doesn't account for the stupidity of some humans!


11/26/2013 04:58 pm

How is that different from the 'die' statement being related to many stories of gay people dying? The simple fact is, it's based on what a lot of users have typed. Whether it should be auto-censored or not is a different question.


11/26/2013 05:24 pm

When I entered [jew should], the top Google search suggestion was [jew should be killed]. For comparison, Bing's suggestion was [jews should worry about dan brown's success].


11/26/2013 05:26 pm

Sorry man, I can't understand you. Obviously you're an SEO who has never written a piece of code with a 'select' query.


11/26/2013 05:28 pm

That was already answered. Read above.


11/26/2013 08:10 pm

Not a fair comparison, as you aren't comparing the same queries. Type 'jews should ' and you get different results.


11/27/2013 05:26 am

Just curious: how do these auto-suggested phrases come about?


11/27/2013 09:30 pm

Without wanting to sound insensitive, Google is still a search engine. One of the common search engine features is search suggestions, auto-complete, or whatever you may call this feature, and I think it's commonly agreed that the feature is a good one. In my view, Google offers the list of completions because these are the most likely completions its visitor might be interested in entering; Google has learned from previous searches that this is the most likely continuation of the search phrase. If we agree on that, then we might agree that the problem is not Google but the people using Google. Why is no one raising concerns about that? If Google were to vet every potentially insensitive phrase for auto-complete, then people might complain about Google manipulating them... Just a thought...


11/29/2013 04:43 pm

I'm primarily a web developer, so you couldn't be more wrong. It's got nothing to do with the select query; it's what you fill the option values with. Building the select is the easy part.


11/29/2013 04:57 pm

And how do you fill the option values? By dancing the Futterwacken dance? ))


12/02/2013 09:24 am

No, by a program far, far more complex than the select query itself.


12/02/2013 09:49 am

And what is the core of any "far, far more complex" program extracting data from databases?


12/03/2013 09:25 am

I'm fairly sure it's far more than a simple SQL select. There might be one, but the whole list is likely stored locally in RAM, in an indexed structure, to improve lookup speed. Even if it were stored in SQL, the select would be just the first bit of it; we know there is some filtering and analysis done to try to help prevent Google bombs as well. Like I said, it's far more complex than just a select statement.
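For what it's worth, here is a minimal sketch of the kind of in-memory prefix index being described: a trie keyed by characters, with query frequencies at the terminal nodes. All names here are hypothetical; a real suggest system adds filtering, ranking signals, and much more.

```python
class TrieNode:
    def __init__(self):
        self.children = {}  # next character -> child node
        self.count = 0      # times a query ending here was logged

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, query):
        node = self.root
        for ch in query:
            node = node.children.setdefault(ch, TrieNode())
        node.count += 1

    def complete(self, prefix, k=5):
        """Walk to the prefix node, then collect completions by frequency."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        results = []
        def collect(n, suffix):
            if n.count:
                results.append((n.count, prefix + suffix))
            for ch, child in n.children.items():
                collect(child, suffix + ch)
        collect(node, "")
        # Highest-frequency completions first
        return [q for _, q in sorted(results, reverse=True)[:k]]

trie = Trie()
for q in ["cat videos", "cat videos", "cat food", "car parts"]:
    trie.insert(q)
print(trie.complete("cat"))
```

Note that the SELECT-vs-complexity argument in the thread dissolves here: the hard part is what populates and ranks the structure, not how it is read back out.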


12/03/2013 09:43 am

Just try to write a very, very complex program extracting data from a DB (then filter it, store it, do whatever you want) and then see what the core of your thousands of lines of code will be )


12/03/2013 10:03 am

Ah, so it's gone from "a simple select statement" to a program using multiple select statements and other logic. Even your version is getting more complex by the minute.


12/03/2013 10:23 am

Any program extracting data from a DB, no matter how complex it is, has a simple SELECT query as its base. It's like the cell being the base of any life. Is it clearer now?


12/03/2013 10:25 am

I never said it wasn't. I said it was more complex than a simple select query, which you were arguing with. Apparently you are now agreeing with me while arguing (since any creature is more complex than a single cell, barring amoebas).


12/03/2013 11:02 am

Write the code, man ) Write the code to see the full picture.
