A rebuttal of Phil Craven's "Google Explained" - Flawed

Feb 15, 2005 • 10:22 am | comments (1) | Filed Under Google Search Engine Optimization
 

Phil Craven is the author of one of the more popular documents on PageRank, PageRank Explained. A person by the name of Michael Martinez posted a thread named A rebuttal of Phil Craven's "Google Explained" in several forums (SEW, Spider Food and others), in which he tries to pick apart the article. His rebuttal is long and detailed, but flawed according to two individuals (not including Phil Craven).

Ammon Johns, in a thread at Cre8asite Forums discussing this topic, said:

If Martinez can show that Google aren't doing that he'll be famous.
However, he seems to miss what that average is about. It is a normalization value that is absolutely fundamentally essential, and is the entire reason that the reiterative link calculations can work. The convergence of the average value of 1 is where the reiterative calculations can stop reiterating.
The value of 1 helps ensure that there is not more total link popularity than there is links to create it. It means that on average, across the web as a whole, a link is worth 1 link, and not on an endless scale where the value of one single link has no true value, thus causing all other calculations to be valueless. It would seem that Martinez is no mathematician.
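The "average of 1" claim in the quote above can be checked numerically on a toy example. This is a minimal sketch with a hypothetical three-page graph (not from the thread), using the original non-normalized Page/Brin formula PR(p) = (1 - d) + d · Σ PR(q)/L(q); when every page has at least one outlink, the total PageRank is conserved at each iteration, so the average stays at 1:

```python
# Toy PageRank power iteration on a hypothetical 3-page graph (illustration only).
# Original non-normalized formula: PR(p) = (1 - d) + d * sum(PR(q) / outdeg(q))
links = {            # page -> pages it links to (no dangling pages)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
d = 0.85             # damping factor (the 15% "stop clicking" probability)
pr = {p: 1.0 for p in links}   # start every page at PR = 1

for _ in range(50):  # iterate until the values settle
    new = {p: 1 - d for p in links}
    for q, outs in links.items():
        share = d * pr[q] / len(outs)   # each outlink passes an equal share
        for p in outs:
            new[p] += share
    pr = new

avg = sum(pr.values()) / len(pr)
print(round(avg, 6))   # average stays at 1: total PR is conserved each pass
```

Note the conservation argument only holds cleanly when no page is a dead end; dangling pages leak PageRank out of the system, which is one place where the two sides of the debate talk past each other.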

If Ammon's word is not good enough for you, how about the word of the individual who wrote what I believe was the first PageRank article, Chris Ridings (see PeterD's interview with Chris)? ChrisR replied with his thoughts on the rebuttal at the Spider Food forum.

I do not believe Phil Craven, Ammon Johns and Chris Ridings are all buddies, so this not only makes for a good technical read, but it can also be a bit humorous.


Comments:

Michael Martinez

10/23/2006 06:50 am

The link is no longer active, probably due to some reorganization at Cre8asite. Ammon was unable to back up his remarks, due to his confusion over PageRank and other issues. Part of the discussion is still archived at <a href="http://web.archive.org/web/20050305113853/http://cre8asiteforums.com/viewtopic.php?t=21646" target="_blank">Archive.Org</a>. The comment you cited above ("It is a normalization value that is absolutely fundamentally essential, and is the entire reason that the reiterative link calculations can work. The convergence of the average value of 1 is where the reiterative calculations can stop reiterating.") was absolute nonsense. PageRank does not converge to an average of 1, as I have now shown many times since that discussion. In the section of the thread still available at Archive.Org, Ammon also misexplained the 15% damping factor, which was just an assumption by Page and Brin about the probability that their "random surfer" would get tired of following a link path and stop clicking. The damping factor represents a decay rate in random clicking. Several technical papers have suggested that almost any damping factor works well enough, but most seem to stay with the 15% factor.
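Martinez's description of the damping factor can be illustrated with a short sketch (a hypothetical graph and values, not taken from the discussion): the "random surfer" follows a link with probability d and stops with probability 1 - d, and the iteration converges for essentially any damping factor below 1, with smaller d converging faster, which matches the claim that "almost any damping factor works well enough":

```python
# Hypothetical sketch: PageRank iteration converges for a range of damping factors.
# The surfer follows a link with probability d and stops with probability 1 - d.
links = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}  # toy graph, no dangling pages

def iterations_to_converge(d, tol=1e-8):
    """Return how many iterations until PR values change by less than tol."""
    pr = {p: 1.0 for p in links}
    for i in range(1, 1000):
        new = {p: 1 - d for p in links}
        for q, outs in links.items():
            for p in outs:
                new[p] += d * pr[q] / len(outs)
        if max(abs(new[p] - pr[p]) for p in links) < tol:
            return i
        pr = new
    return None  # did not converge within the iteration budget

for d in (0.5, 0.85, 0.95):
    print(d, iterations_to_converge(d))  # smaller d -> fewer iterations
```

The error shrinks by roughly a factor of d per pass, so the 0.85 figure is a trade-off between convergence speed and how much weight link structure gets relative to the random restarts.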
