Google: URLs Disallowed Through Robots.txt Do Not Affect Crawl Budget

Jun 21, 2019 • 7:40 am | Filed Under Google Search Engine Optimization

Gary Illyes from Google added a new question and answer to the big crawl budget article he published on the webmaster blog in January 2017. In short, it says that if you disallow URLs in your robots.txt file, those URLs do not affect your crawl budget.

He added this question and answer:

Q: Do URLs I disallowed through robots.txt affect my crawl budget in any way?
A: No, disallowed URLs do not affect the crawl budget.
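To illustrate what "disallowed through robots.txt" means here, below is a minimal sketch using Python's standard urllib.robotparser. The site, user agent string, and paths are hypothetical examples, not anything from Google's article; it simply shows that a compliant crawler will not fetch a URL matched by a Disallow rule, which is why such URLs do not consume crawl budget.

from urllib import robotparser

# Hypothetical robots.txt for an example site; the disallowed path is illustrative.
ROBOTS_TXT = """\
User-agent: *
Disallow: /internal-search/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A URL under the disallowed path: a compliant crawler will not fetch it,
# so (per the Q&A above) it does not count against crawl budget.
print(rp.can_fetch("Googlebot", "https://www.example.com/internal-search/?q=widgets"))  # False

# A URL not matched by any Disallow rule: it can be crawled normally.
print(rp.can_fetch("Googlebot", "https://www.example.com/products/widget"))  # True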

He posted last night that he added the Q&A for a former Googler, Pierre Far.

Forum discussion at Twitter.
