There is a new WebmasterWorld thread that made it to the front page very quickly, named Flattening Effect of Page Rank Interations - explains the "sandbox"? I feel I have to quote most of the post for you to understand this new Sandbox theory, which many in the thread find "refreshing" and "intelligent."
Note the PageRank equation (sans filters) is:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
The first observation about this equation is that it can only be calculated iteratively; the values settle only after a sufficient number of iterations.
If you analyze a site with 5 pages that all link to each other, with the homepage starting at an initial PageRank of roughly 3.5, what you see in the first iteration of PageRank is that the homepage is PR 3.5 and all other pages are PR 0.365, the largest PR gap that will ever exist across the iterations in this example.
This homepage PR represents a surge because Google has not yet distributed PR across the site, so the homepage carries an artificial and temporary inflation of PR (which explains the sudden but transient surge in PR, and hence in the SERPs).
In the second iteration, the homepage goes down to PR 1.4 (a drop of over 50%!), and the secondary pages get lifted to .9, explaining the disappearing effect of “new” sites. Dramatic fluctuations continue until about the 12th iteration when the homepage equilibrates at about a lowly 2.2, with other pages at about .7.
I believe that the duration of the “sandbox” is the same amount of time it takes Google to iterate through its PageRank calculations.
Therefore, I think that the "sandbox" is nothing more than the time it takes Google to run enough iterations to equilibrate the PR across the link structure of a given site.
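The flattening the poster describes is easy to reproduce. Here is a minimal sketch of the iterative calculation, assuming a damping factor d = 0.85 and a 5-page site where every page links to every other page; the homepage is seeded at 3.5 and the other pages at 0.365 to roughly match the quoted starting numbers (the poster's exact iteration-by-iteration figures depend on details the post doesn't give, but the qualitative behavior is the same: the gap collapses and all pages converge toward PR 1.0).

```python
def iterate_pagerank(pr, d=0.85, rounds=12):
    """Apply PR(A) = (1-d) + d * sum(PR(T)/C(T)) once per page, per round.

    Each page links to all n-1 other pages, so every linker has n-1 outlinks.
    Returns the list of PR vectors, one per iteration (index 0 = initial seed).
    """
    n = len(pr)
    history = [list(pr)]
    for _ in range(rounds):
        new = []
        for i in range(n):
            # Inbound PR to page i: every other page's PR, split over its outlinks.
            inbound = sum(pr[j] for j in range(n) if j != i) / (n - 1)
            new.append((1 - d) + d * inbound)
        pr = new
        history.append(list(pr))
    return history

# Homepage seeded high, four inner pages seeded low (assumed values).
history = iterate_pagerank([3.5, 0.365, 0.365, 0.365, 0.365])
print(f"seed:         home={history[0][0]:.2f}  others={history[0][1]:.2f}")
print(f"iteration 1:  home={history[1][0]:.2f}  others={history[1][1]:.2f}")
print(f"iteration 12: home={history[12][0]:.2f}  others={history[12][1]:.2f}")
```

After a dozen rounds the initial 3.5-vs-0.365 gap has essentially vanished and every page sits near PR 1.0, the fixed point of this fully-interlinked graph, which is the "flattening effect" the thread is arguing about.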
Did you digest that? WebmasterWorld Administrator tedster adds that deep links may "short circuit the flattening effect that PR iterations might produce, especially if they were added at decent intervals." To which another WebmasterWorld Administrator, trillianjedi, adds, "You have to begin to consider whether actually the entire PageRank system of old has been replaced with something entirely different..." But he goes on to explain that this and all the other theories are speculation, which is why these threads are so enjoyable.
Google has continued to say that PageRank is used and is part of its algorithms. Many SEOs believe it is now used only (1) to determine which site should rank higher when site A and site B are equal in all other characteristics (as if that ever happens) and (2) to determine the crawl frequency of certain documents. But maybe this theory is right, or maybe it is wrong - maybe Google is using PageRank for this purpose? Who knows...
Forum discussion at WebmasterWorld.
Update: Please read the comments in this entry by clicking here.