Google Shares Its Robots.txt Parser Code With Open Source World

Jul 2, 2019 - 7:46 am
Filed Under Google


Google announced yesterday, as part of its effort to standardize the Robots Exclusion Protocol, that it is open sourcing its robots.txt parser. That means the way Googlebot reads and obeys robots.txt files will be available for any crawler developer to examine or use.

It is rare for Google to share anything it does in core search with the open source world - that is its secret sauce - but here Google has published the code to GitHub for all to access.

Google wrote that it "open sourced the C++ library that our production systems use for parsing and matching rules in robots.txt files. This library has been around for 20 years and it contains pieces of code that were written in the 90's. Since then, the library evolved; we learned a lot about how webmasters write robots.txt files and corner cases that we had to cover for, and added what we learned over the years also to the internet draft when it made sense."
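To give a rough sense of what "parsing and matching rules" means in practice, here is a minimal sketch using Python's standard-library robots.txt parser. This is not Google's open-sourced C++ library, and the sample robots.txt file and URLs are made up for illustration; the idea is simply that a crawler parses the file's User-agent and Disallow rules, then checks each URL it wants to fetch against them.

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt file: block every crawler from /private/.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Matching: the crawler asks, per user agent, whether a URL may be fetched.
print(parser.can_fetch("Googlebot", "https://example.com/private/secret.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))           # True
```

Google's production parser does the same kind of check, but also handles the many real-world corner cases (encoding quirks, typos, precedence of Allow vs. Disallow) the company says it accumulated over 20 years.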

Forum discussion at Twitter.
