"Phrase Based Re-Ranking" Algorithm To Blame for the Google 950 Penalty?

Feb 8, 2007 • 7:59 am | comments (11) | Filed Under Google Search Engine Optimization
 

There has been a lot of recent talk about the Google Minus-950 Penalty, so named because affected sites found their pages nowhere before roughly the 950th result. The question is: what changed recently at Google.com to cause such a drop in rankings?

WebmasterWorld administrator tedster believes it has to do with a new patent application by Anna Lynn Patterson of Google, named Detecting spam documents in a phrase based information retrieval system. There is a large conversation about that patent application at WebmasterWorld. In short, the patent abstract says:

Phrases are identified that predict the presence of other phrases in documents. Documents are then indexed according to their included phrases. A spam document is identified based on the number of related phrases included in a document.
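The abstract's core idea can be sketched in a few lines: keep a map from a phrase to the phrases that tend to co-occur with it, count how many related-phrase occurrences a document contains, and flag documents whose count far exceeds what natural writing exhibits. This is only an illustration of the concept, assuming a made-up phrase inventory and threshold; `RELATED_PHRASES` and `SPAM_THRESHOLD` are hypothetical, not Google's actual data or values.

```python
# A minimal sketch of the patent's spam-detection idea. The related-phrase
# map and the threshold below are hypothetical illustrations only.

RELATED_PHRASES = {
    "digital camera": {"memory card", "megapixel", "optical zoom"},
    "cheap flights": {"airline tickets", "last minute", "travel deals"},
}

SPAM_THRESHOLD = 10  # hypothetical cutoff on related-phrase occurrences


def count_related_phrases(text: str, topic_phrase: str) -> int:
    """Count occurrences in text of phrases related to topic_phrase."""
    text = text.lower()
    return sum(text.count(p) for p in RELATED_PHRASES.get(topic_phrase, ()))


def looks_like_spam(text: str, topic_phrase: str) -> bool:
    """Flag a document whose related-phrase count exceeds the threshold.

    The patent's intuition: an honest document uses a statistically
    expected number of related phrases, while a keyword-stuffed page
    uses far more of them than natural writing would.
    """
    return count_related_phrases(text, topic_phrase) > SPAM_THRESHOLD
```

In the real system the related-phrase statistics would be derived from co-occurrence counts across the whole index, and the cutoff would be statistical rather than a fixed constant; the sketch only shows the shape of the test.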

Tedster broke off the original thread on the 950 penalty and created a new one at WebmasterWorld on "Phrase Based Re-ranking." Tedster feels the Phrase Based Re-ranking algorithm is the cause of the minus-950 effect. He explains, "What it seems to be is some kind of 'Phrase-Based Reranking' - possibly related (we're still probing here) to the Spam Detection Patent invented by Googler Anna Lynn Patterson." Let me pull more quotes from the thread to clarify his theory.

It's like getting a poor health symptom in one area of your body from not having enough of some important nutrient -- even though you've got plenty of others and plenty of good health in many ways.

In one case I know of, the signs of this problem disappeared with one solid new inbound link from a very different domain, with the problematic phrase used as the anchor text. By "very different" I mean the linking domain was not in the top 1,000 results for the given search.

So, not less "SEO" fixed it, but more. The purely OOP (over-optimization penalty) assumptions don't sit right with me, given this anecdotal result. Now it's only one case, so it's certainly not "proof" of anything, but the fix has now been stable for several data refreshes at Google, so it is a decent data point to keep in mind.

Of course, this is just a theory, and that is why these threads are fun. Nice work, tedster!

Forum discussion at WebmasterWorld.


Comments:

Chris Beasley

02/08/2007 02:12 pm

Why can't people just admit their [url=http://www.threadwatch.org/node/11502]site sucks[/url] instead of making up a penalty for it?

Chris Beasley

02/08/2007 02:13 pm

...and there I go thinking I'm in vbulletin and not a blog....

Michael Martinez

02/08/2007 04:58 pm

Highly unlikely. People need to stop using the patent applications as excuses for their poor search engine optimization skills.

theGypsy

02/11/2007 04:54 pm

While I am NOT a believer in -whatever penalties - there is much to be seen with Phrase Based Indexing and Retrieval. I started a more appropriate thread. http://www.webmasterworld.com/google/3247207.htm

Clint Dixon

02/12/2007 01:55 pm

Well, the solution does not make sense based on what Tedster is claiming the penalty is..he references a spam document, which implies there is spam content on the page, and the fix for this is to add a link??? Somebody's reading things wrong.... It is like saying "add gasoline" to a transmission that is slipping and it will fix things...

Rob Abdul

02/14/2007 03:03 am

I agree with Clint Dixon.

Marcia

02/15/2007 07:48 pm

>>he references a spam document which is implied there is spam content on the page and the fix for this is to add a link???

Maybe it is NOT a spam document or a spam site, and maybe that site and page do not suck, and maybe that site/page might just be a respected authority site for its topic (as in HITS - hubs/authorities). And maybe tedster saw it happen with his own two eyes, saw said page and saw the link, which maybe was not just a "link" but a link from an on-topic page on an on-topic site with a decent hub score that ranks well in Google. He's not saying that's the whole enchilada, or that "links" are the magic ingredient for the secret sauce. What I saw is that he was initiating discussion-based thinking rather than whining, weeping, wailing and gnashing of teeth.

Marcia

02/15/2007 08:53 pm

Incidentally, what I see as a major issue is that, just as Dr. G. pointed out in one of his papers, a while back (and still) there were "snake-oil peddlers" trying to sell "LSI optimization" (there is no such thing as an LSI-optimized page). The same thing will start to happen with "phrase based optimization," which is theoretically impossible unless someone has some means to control the whole co-occurrence base of the entire Google index.

Jim Olsson

02/26/2007 03:36 pm

Only a fanatical fan of Google can deny the "end of results" or -950 penalty. There are many examples of good and bad articles that have been hit. Clearly, Google does not like content with commercial keywords competing with sponsored AdWords links.

Brad Dudley

02/26/2010 04:10 pm

I agree with Jim. I think Google likes to do what benefits Google and no one else.

Thomas

12/17/2010 09:43 pm

Well, Google is a business, and they exist to make as much money as possible! I have noticed this with many of my clients....somehow they get dinged for spam, selling links, whatever....and their homepage drops to about the 1000th position.
