Funny Google Search Results, Possibly NSFW

May 13, 2010 • 8:19 am | comments (1) | Filed Under Google Search Engine
 

I spotted a couple of funny Google search results via the forums that I wanted to share with you.

The first one I spotted via a Google Blogoscoped Forum thread for an image search for [david cameron side profile]. David Cameron is the new Prime Minister of the United Kingdom, by the way. A search for that returns the following image in either the first or second position:

david cameron side profile on Google

That is an image you would not expect for such a search!

The next funny search result is how Google phrases its "did you mean?" response. Spotted via the Google Web Search Help forum, a search for [women's gear blog] returns a smart response, telling you that no, you are not looking for a women's gear blog but rather a [men's gear blog].

Weird Google Did You Mean

Forum discussion at Google Web Search Help & Google Blogoscoped Forum.

Comments:

No Name

05/13/2010 03:08 pm

As Google gets closer to reaching "artificial intelligence" status, these types of things will become more and more common. These are the kinds of problems I ran into a lot when I was writing an artificial intelligence program many years ago. The result is, essentially, similar to "Spock": the answers are "matter-of-factual", and they leave people scratching their heads wondering how a particular conclusion was derived. The answers also often lack the "social etiquette" and other human emotions needed to avoid uncomfortable situations. For instance, if statistically there are more African American basketball players than non African American basketball players, a computer will automatically jump to conclusions that humans know shouldn't be jumped to, simply out of respect.

One example I remember: my AI program was let loose on several BBSes (before the WWW became big). At some point, when my AI program was in a chat room, person A called person B a "jerk". (I am censoring here... they used harsher words.) Later, when the chat room consisted of a different group of people, someone asked my AI program about person B, and it said, "person B is a jerk". It was only repeating what it believed to be true, based on what someone else had said. From that point on, people started getting malicious with the program, calling everyone various choice names, then asking about those people later in large chat sessions.

So, I tweaked the program. Rather than simply saying, "Person B is a jerk," it would say, "Person B is a jerk, according to person A." As soon as this update was unleashed, people very quickly stopped being so mean to each other. AI, lacking any emotional simulation, can be pretty funny, which is what made Spock an amusing character at times in the old Star Trek series. What's even more amusing is to see how people interact with computers when they think the computers are sentient beings.
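The tweak the commenter describes boils down to storing each learned claim together with its source and citing that source whenever the claim is repeated. Below is a minimal, hypothetical Python sketch of that idea; the class and method names are invented for illustration and are not the commenter's actual program.

```python
class ChatBot:
    """Toy bot that repeats learned claims only with attribution."""

    def __init__(self):
        # claims[subject] is a list of (statement, source) pairs
        self.claims = {}

    def learn(self, subject, statement, source):
        """Record a claim about `subject` along with who made it."""
        self.claims.setdefault(subject, []).append((statement, source))

    def ask_about(self, subject):
        """Repeat learned claims, always naming the person who said them."""
        if subject not in self.claims:
            return f"I don't know anything about {subject}."
        return " ".join(
            f"{subject} is {statement}, according to {source}."
            for statement, source in self.claims[subject]
        )


bot = ChatBot()
bot.learn("person B", "a jerk", "person A")
print(bot.ask_about("person B"))
# prints: person B is a jerk, according to person A.
```

The design point is simply that attributing a statement to its speaker shifts responsibility back to that speaker, which is why, in the commenter's account, people stopped feeding the bot insults once the attribution was added.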
