Being "Condemned to Google Hell" and Matt's Rebuttal

May 2, 2007 • 10:37 am | comments (5) by twitter | Filed Under Google Search Engine Optimization
 

Yesterday, a popular article on Digg (stay tuned for the Digest) was the Forbes piece on Google's impact on search engine rankings and how losing those rankings can hurt your business.

A WebmasterWorld forum post discusses the topic of Google Hell, a term coined, I believe, by Jim Boykin, and links to Matt Cutts's response.

In a nutshell, a jeweler was delisted from Google, and people were confused as to why. The article mentions some reasons why you may lose your visible rankings:

Web designers have found that pages with duplicate content, few words or pictures, and a lack of links to other quality sites are the most likely to be pulled in [to the supplemental index].

Matt, however, provides another take: Google received spam reports about the site's link exchange emails, and that cost the site its credibility in the search engine's eyes.

Reciprocal links by themselves aren't automatically bad, but we've communicated before that there is such a thing as excessive reciprocal linking.

Still, a few questions remain:

If Google is weighing in on these email reports, did they just admit that they're looking at email? The thought is unsettling.

Did he just let us all know that Google is now looking at emails?

What's the point of the supplemental index anyway?

What exactly is the purpose of the “supplemental” index? Why do they need it? Why do they need to have two categories of results? The index, and then the supplemental index?

Personally I don’t see anything “supplemental” about it. Why not just a continuum of results, based upon relevance?

Another member believes that the supplemental results do serve a purpose:

Supp index exists because of all the crap that is made everyday. Unfortunately a lot of good stuff could get thrown in too.

I will chime in and say that I think ranking for regular results would become all the more competitive if there were no way of filtering out extraneous results and putting them into some sort of supplemental index.

Discussion continues at WebmasterWorld.

Comments:

Matt Cutts

05/02/2007 03:44 pm

"If Google is weighing in on these email reports, did they just admit that they're looking at email?" Nope, that's not the case. I probably didn't explain it very clearly; what happened is someone outside of Google received that unsolicited email. Then they took that email and did a spam report to Google with the content of the email.

Tamar Weinberg

05/02/2007 03:50 pm

Hey Matt, thanks for the clarification. I didn't think so either. I've seen a fair share of spam reports in my sysadmin days too, and I know that these things eventually make it up the ladder if they're really that widespread.

Blackbeard

05/02/2007 03:51 pm

What I found funny about all of this is that when it comes down to it, these sites were playing in the gray areas and got caught. That's the risk you take in doing these things and ultimately I think these site owners should know better.

CVOS

05/02/2007 07:44 pm

If Forbes had researched "supplemental index" on Matt Cutts's blog, they would have published a more accurate article. However, even though the article was low on facts, it was high on sensationalism and garnered a lot of links from the SEO community. +1 for forbes.com's Google ranking.

Michael Martinez

05/02/2007 11:47 pm

"I think ranking for regular results would become all the more competitive if there was no way of filtering out extraneous results and putting them into some sort of supplemental index." Google's Supplemental Results Index is NOT a repository for "extraneous results". The sooner the SEO community divests itself of these myths and nonsense about the Supplemental Results Index, the better. It's not the "duplicate content repository". It's not the "suspicious spammy content repository". It's the "place where your pages go because they don't have enough inbound links from other pages in the Main Web Index."

Google won't parse pages in the Supplemental Index. You cannot find unique text expressions embedded on Supplemental Index pages unless there are links pointing to those pages from the main index. Google has effectively silenced a large portion of the Web -- perhaps the majority of the Web -- by requiring that pages obtain links from the Main Index. This is what Inktomi did in the late 1990s, so we see now that Bigdaddy has only served to move Google's quality back into the dark ages of search.

Instead of addressing the core issue here -- which is the fact that Google allows people to distort relevance and degrade its search quality results by passing link anchor text -- Google has once again made all Webmasters the victims of bad judgement. You cannot determine the relevance of a document by looking at the link anchor text pointing to it. You cannot find the "best, most authoritative" content by deliberately omitting millions of pages from the queries just because they don't have some minimal amount of INTERNAL PageRank passing to them from links found in the Main Web Index.

Googlers, you can expect to feel your ears sizzle and itch even more as the days pass by because, frankly, people are just getting tired of the "there is nothing wrong with being in the Supplemental Index" song and dance. That dog won't hunt.
