Google & WordPress Robots.txt Handling Is Being Looked Into

Nov 6, 2019 - 7:28 am


One of the takeaways from the Google Webmaster Conference was that if your robots.txt file exists but is unreachable when Google tries to access it, then Google won't crawl your site. Google said that about 26% of the time, GoogleBot cannot reach a robots.txt file. WordPress might make changes in order to reduce this error rate.
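To see why an unreachable robots.txt is worse than a missing one, here is a minimal sketch of how a crawler along these lines distinguishes the cases by HTTP status. The function name and return labels are my own illustration, not Google's actual implementation; the general behavior (a 404 means crawl freely, a server error means hold off) matches how Google has described GoogleBot's handling.

```python
def robots_txt_crawl_decision(status_code: int) -> str:
    """Simplified sketch of a crawler's reaction to the HTTP status
    returned when fetching /robots.txt. Labels are hypothetical."""
    if 200 <= status_code < 300:
        # robots.txt fetched successfully: parse it and obey its rules
        return "parse-and-obey"
    if 400 <= status_code < 500:
        # robots.txt does not exist: treated as no restrictions
        return "crawl-unrestricted"
    # 5xx or network failure: the file may exist but is unreachable,
    # so the conservative choice is to not crawl at all
    return "do-not-crawl"
```

So a site returning a 500 on its robots.txt, even briefly, can stop crawling entirely, which is why a CMS like WordPress generating that file reliably matters.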

Here is one of many tweets about this:

Now, with WordPress, Joost de Valk from Yoast said " for sites you can't reach the robots.txt for, is a subset of those WordPress sites? A larger subset than you'd normally expect maybe?" He added that he is "trying to figure out if we should be safer in how WordPress generates robots.txt files."

Gary Illyes from Google said he believes WordPress is generally okay with this issue but he will look into it further to see if WordPress can make some small changes here.

I love this dialog between Google and Yoast (which is very tied to WordPress).

Forum discussion at Twitter.

Update: I upset Gary again, and for the record, the latest intel is that the 26% figure is the percentage of robots.txt files Google cannot reach.
