Bing: We Ignore Default Robots Directives If There Is A Bingbot Section

Jan 3, 2019 - 7:55 am
Filed Under Bing SEO


Frédéric Dubut from Bing's search team said on Twitter that if you create a robots.txt section specifically for Bingbot, Bing's crawler, then Bing will only look at that section. So if you do that, make sure to copy into the Bingbot section all of the default directives you want Bing to comply with.

He said:

Useful robots.txt reminder - if you create a section for #Bingbot specifically, all the default directives will be ignored (except Crawl-Delay). You MUST copy-paste the directives you want Bingbot to follow under its own section. #SEO #TechnicalSEO

Google works a bit differently and, when not told otherwise, goes with the strictest directive it can find.

Forum discussion at Twitter.

Update: John Mueller said it works the same way for Google; he said on Reddit, "This is standard for any user agent section in the robots.txt :)"
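This most-specific-group behavior can be seen in Python's standard-library `urllib.robotparser`, which also falls back to the default `*` group only when no specific user-agent section matches. A quick sketch, using a hypothetical robots.txt with a default section and a Bingbot section:

```python
from urllib import robotparser

# Hypothetical robots.txt: a default group and a Bingbot-specific group.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Bingbot
Disallow: /drafts/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Bingbot matches its own section, so the default Disallow: /private/
# is ignored for it:
print(rp.can_fetch("Bingbot", "https://example.com/private/"))  # True
print(rp.can_fetch("Bingbot", "https://example.com/drafts/"))   # False

# A crawler with no dedicated section falls back to the default group:
print(rp.can_fetch("SomeOtherBot", "https://example.com/private/"))  # False
```

Note how `Disallow: /private/` was not copied into the Bingbot group, so Bingbot is free to crawl it, which is exactly the pitfall Dubut is warning about.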
