Google & WordPress Robots.txt Handling Is Being Looked Into

Nov 6, 2019 • 7:28 am | comments (5) by | Filed Under Google Search Engine Optimization


One of the takeaways from the Google Webmaster Conference was that if Google tries to access your robots.txt file and it is unreachable but does exist, then Google won't crawl your site. Google said that about 26% of the time GoogleBot cannot reach a robots.txt file. WordPress might make changes in order to reduce this error rate.
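The distinction matters: a missing robots.txt is not the same as an unreachable one. A minimal sketch of that decision logic in Python, based on Google's documented handling of robots.txt fetch outcomes (the function name and wording are my own, not Google's):

```python
def crawl_decision(status):
    """Map a robots.txt fetch result to a crawl decision.

    `status` is an HTTP status code, or None for a network error
    (DNS failure, timeout, connection reset).
    """
    if status is not None and 200 <= status < 300:
        # File fetched successfully: crawl, obeying its rules.
        return "crawl, obeying robots.txt rules"
    if status is not None and 400 <= status < 500:
        # 404 and similar: treated as if no robots.txt exists,
        # so the site is crawled without restrictions.
        return "crawl without restrictions"
    # 5xx errors and network failures: the file may exist but
    # cannot be read, so the safe choice is not to crawl at all.
    return "do not crawl"
```

This is why a server that errors out on `/robots.txt` can silently stop a site from being crawled, even though the rest of the site is fine.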

Here is one of many tweets about this:

Now, on the WordPress side, Joost de Valk from Yoast asked: "for sites you can't reach the robots.txt for, is a subset of those WordPress sites? A larger subset than you'd normally expect maybe?" He added that he is "trying to figure out if we should be safer in how WordPress generates robots.txt files."

Gary Illyes from Google said he believes WordPress is generally okay on this issue, but he will look into it further to see if WordPress can make some small changes here.

I love this dialogue between Google and Yoast (which is very tied to WordPress).

Forum discussion at Twitter.

Update: I upset Gary again, and for the record, the latest intel was that the percentage refers to the share of robots.txt files Google cannot reach.
