Can Google Ever Spider 29.5 Million Pages on One Site?

Sep 13, 2007 • 9:18 am | Filed Under Google Search Engine Optimization
 

A WebmasterWorld member asks how long Google will take to spider all 29.5 million pages on his website. He just launched the site 2 weeks ago and is hoping that *all* pages get spidered.

That's a generous request of Google. After all, releasing too much content at once can be risky, as one forum member notes:

Matt Cutts discusses the possibility of triggering a penalty if too many pages are released at once.

That being said, I'd go for it because if you've created a quality site you're going to attract the links you need to start climbing the SERPs.

However, only about 2,000 pages have been crawled so far. If Googlebot continues down the path of spidering all of these pages, it will take a while.

Based on the current indexing routines of your site (143 pages per day), I'd say you have about 565.2 years in the queue. ;)
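That estimate is simple division. A quick sketch of the arithmetic, using the thread's figures of 29.5 million pages and a crawl rate of roughly 143 pages per day (about 2,000 pages in the first two weeks):

```python
# Back-of-envelope crawl-time estimate from the forum thread's figures.
TOTAL_PAGES = 29_500_000   # pages on the site
PAGES_PER_DAY = 143        # observed crawl rate (~2,000 pages in 2 weeks)

days = TOTAL_PAGES / PAGES_PER_DAY
years = days / 365

print(f"~{days:,.0f} days in the queue, or about {years:.1f} years")
# → ~206,294 days in the queue, or about 565.2 years
```

Of course, crawl rates aren't static; as a site accumulates links and authority, Googlebot tends to visit more often, so a linear projection like this is a worst case, not a forecast.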

Maybe it will. After all, even a large site like Amazon.com has only 18.7 million pages indexed. It will be hard to compete with authoritative sites like that, even if they're not in the same space.

But the site, which appears to be a city directory of some sort, will have its challenges. It's rare to get that many pages crawled. And moderator pageoneresults adds that a site like Amazon "is seeing upwards of 5+ million Googlebot visits daily." By contrast, the webmaster says that Googlebot has visited him only 22 times since launch.

It's possible for Google to crawl more pages, but I'd be surprised to see all 29.5 million pages in the index. As tedster says, it will really depend on the backlinks to the site. The more quality backlinks and quality deep links you have, the better you'll fare.

Forum discussion continues at WebmasterWorld.

This post was composed on September 11th and scheduled for publication on September 13th.

 

Comments:

Michael Martinez

09/13/2007 04:08 pm

There is more to search engine optimization than backlinks.

Matt Cutts

09/13/2007 05:45 pm

29.5 million pages? Wow. It must have taken forever for them to write that many pages. :)

William Vicary

09/14/2007 01:41 pm

If the user had created 25 million unique pages then possibly a few hundred k of them would be used. But without unique content he is soon going to be flagged for spam and be penalised left right and center!
