Google & WordPress Robots.txt Handling Is Being Looked Into

Nov 6, 2019 - 7:28 am

Google and WordPress

One of the takeaways from the Google Webmaster Conference was that if your robots.txt file exists but is unreachable when Google tries to access it, Google won't crawl your site. Google said that about 26% of the time GoogleBot cannot reach a robots.txt file. WordPress might make changes in order to reduce this error rate.
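To make the behavior concrete, here is a minimal sketch (in Python, using the requests library) of how a crawler distinguishes the three cases Google describes: a robots.txt that is fetched fine, one that doesn't exist, and one that exists but can't be reached. The function name and return strings are hypothetical, purely for illustration.

```python
import requests

def robots_txt_crawl_status(site: str) -> str:
    """Roughly mirror how Googlebot interprets robots.txt availability."""
    url = site.rstrip("/") + "/robots.txt"
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        # Network failure / DNS error: robots.txt unreachable -> no crawling.
        return "unreachable: crawling withheld"

    if resp.status_code == 200:
        # File fetched successfully: crawl according to its rules.
        return "ok: crawling per robots.txt rules"
    if 400 <= resp.status_code < 500:
        # Missing robots.txt (e.g. 404) is treated as "everything allowed."
        return "not found: full crawl allowed"
    # 5xx: the file exists but can't be fetched, so Google won't crawl the site.
    return "server error: crawling withheld"

if __name__ == "__main__":
    print(robots_txt_crawl_status("https://example.com"))
```

The point of the sketch is the last branch: a server error on robots.txt is not the same as a missing robots.txt, and it is the case that leads to Google not crawling the site at all.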

Here is one of many tweets about this:

Now, on the WordPress side, Joost de Valk from Yoast asked, "for sites you can't reach the robots.txt for, is a subset of those WordPress sites? A larger subset than you'd normally expect maybe?" He added that he is "trying to figure out if we should be safer in how WordPress generates robots.txt files."

Gary Illyes from Google said he believes WordPress generally handles this fine, but he will look into it further to see if WordPress can make some small changes here.

I love this dialog between Google and Yoast (which is very tied to WordPress).

Forum discussion at Twitter.

Update: I upset Gary again; for the record, the latest intel was about the percentage of robots.txt files Google cannot reach.

 
