
Earlier this month, we reported that Google updated two of its help documents covering Google's crawler file size limits. Well, Google clarified one of those documents the other day after some confusion within the SEO industry.
The updated help document now specifically says that "a Google crawler like Googlebot may have a smaller size limit (for example, 2MB), or specify a larger file size limit for a PDF than for HTML."
The new version reads:
By default, Google's crawlers and fetchers only crawl the first 15MB of a file, and any content beyond this limit is ignored. However, individual projects may set different limits for their crawlers and fetchers, and also for different file types. For example, a Google crawler like Googlebot may have a smaller size limit (for example, 2MB), or specify a larger file size limit for a PDF than for HTML.
The older version read:
By default, Google's crawlers and fetchers only crawl the first 15MB of a file. Any content beyond this limit is ignored. Individual projects may set different limits for their crawlers and fetchers, and also for different file types. For example, a Google crawler may set a larger file size limit for a PDF than for HTML.
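The behavior described in both versions is effectively a truncation: content past the crawler's byte limit is simply ignored. The sketch below is only an illustration of that idea, not Google's actual implementation; the function name and the 3MB sample page are hypothetical, while the 15MB default and the 2MB Googlebot example come from the help document quoted above.

```python
# Illustrative only: simulates a crawler that ignores content past a byte limit.
DEFAULT_LIMIT_BYTES = 15 * 1024 * 1024  # documented default: first 15MB of a file

def crawlable_portion(content: bytes, limit_bytes: int = DEFAULT_LIMIT_BYTES) -> bytes:
    """Return only the prefix of `content` a size-limited crawler would process."""
    return content[:limit_bytes]

# A specific crawler like Googlebot may use a smaller limit (for example, 2MB):
GOOGLEBOT_HTML_LIMIT = 2 * 1024 * 1024

page = b"<html>" + b"x" * (3 * 1024 * 1024)  # a roughly 3MB page (hypothetical)
seen = crawlable_portion(page, GOOGLEBOT_HTML_LIMIT)
# everything beyond the first 2MB of this page would be ignored
```

The practical takeaway for site owners is the same under either version of the document: anything critical for indexing should appear within the first portion of the file.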
This change was made a couple of days ago.
Forum discussion at X.

