Bing: We Ignore Default Robots Directives If There Is A Bingbot Section

Jan 3, 2019 - 7:55 am
Filed Under Bing SEO


Frédéric Dubut from Bing's search team said on Twitter that if you create a robots.txt section specifically for Bingbot, Bing's crawler, then Bing will only look at that section. So if you do that, make sure you copy every directive you want Bing to comply with from the default section into the Bingbot section.

He said:

Useful robots.txt reminder - if you create a section for #Bingbot specifically, all the default directives will be ignored (except Crawl-Delay). You MUST copy-paste the directives you want Bingbot to follow under its own section. #SEO #TechnicalSEO
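
To illustrate with a minimal sketch (the path and crawl-delay value here are made up for the example), a file like this would leave Bingbot free to crawl /private/, because its own section does not repeat the Disallow line:

User-agent: *
Disallow: /private/

User-agent: Bingbot
Crawl-delay: 5

To keep that restriction for Bing, copy it into the Bingbot section:

User-agent: Bingbot
Crawl-delay: 5
Disallow: /private/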

Google works a bit differently and goes with the strictest directive it can find when not told otherwise.

Forum discussion at Twitter.

Update: John Mueller said it works the same way for Google. He said on Reddit, "This is standard for any user agent section in the robots.txt :)"
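
In other words, under that standard behavior, the same copy-the-directives approach applies to any crawler-specific section, not just Bingbot's. As another made-up example, a Googlebot-specific section would also need the shared rules repeated for Googlebot to follow them:

User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /private/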

 
