Google: We Know About 30 Thousand Trillion URLs On The Web But...

Jun 3, 2015 - 8:20 am

At SMX Advanced last night, Google's Gary Illyes said that Google knows of over 30 thousand trillion URLs on the web. That number is not new - I thought it was - but it was mentioned back in 2013 when Google released the Inside Search portal.

What Gary did say, though I don't think he meant it quite that way, was that Google doesn't have the server capacity to store them all. He went on to explain crawl capacity, PageRank, and how Google decides what to index and what not to - all basic stuff for SEOs.

I don't think Gary meant that Google lacks the ability to store all those URLs, but rather that Google chooses not to.

Travis Write tweeted a pretty accurate quote from Gary last night at SMX, in my opinion:

But Gary later responded on Twitter that the quote was incorrect:

Either way, crawl priority is important for SEOs to understand. Google doesn't want to bother indexing and storing web pages that are not useful.
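To make the crawl priority idea concrete, here is a minimal, purely illustrative sketch in Python: URLs sit in a priority queue ranked by some importance score (a stand-in for signals like PageRank), and only the top-scored pages get fetched within a fixed crawl budget. The function name, scores, and URLs are all made up for illustration; this is not Google's actual implementation.

import heapq

def crawl_by_priority(scored_urls, crawl_budget):
    # scored_urls: list of (importance_score, url) pairs; higher score = more important.
    # heapq is a min-heap, so negate the scores to pop the most important URL first.
    frontier = [(-score, url) for score, url in scored_urls]
    heapq.heapify(frontier)

    crawled = []
    while frontier and len(crawled) < crawl_budget:
        _, url = heapq.heappop(frontier)
        crawled.append(url)  # in a real crawler, the fetch and indexing decision happen here
    return crawled

# Example: with a budget of 2, only the two highest-scored URLs are crawled;
# the low-value page is skipped entirely.
print(crawl_by_priority(
    [(0.9, "https://example.com/"), (0.1, "https://example.com/tag/old"), (0.7, "https://example.com/post")],
    2))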

Forum discussion at Twitter.

 
