Google Crawl Limit Per Page Now Couple Hundred Megabytes

Aug 3, 2017 • 7:25 am | Filed Under Google Search Engine Optimization
 


The last time we reported on this, Google was able to crawl 10 megabytes per page. That limit is now up to a couple hundred megabytes per page. (Note: the 10MB limit applied only to the Search Console tool.)

John Mueller of Google said this in a hangout yesterday, at the 45:38 mark in the video. He said:

We do also have a cut-off for indexing but usually that's fairly large. So that’s, as far as I remember from the last time I looked at this, a couple hundred megabytes. So if your pages are bigger than that then probably you have other issues than trying to get the text on the bottom of the page.

Back in 2015, about two years ago, the limit was only 10MB; now it is at least 200MB. That's a nice, sizable increase.

Glenn Gabe caught this and posted about it on Twitter, saying, "The cutoff for indexing per page (the last time John checked) was a FEW HUNDRED MB. That's a big jump."


Forum discussion at Twitter.
