Craigslist Blocks Most Spiders: Millions of Pages Delisted

Jan 3, 2006 - 8:10 am

A thread started at our forums, named Craigslist Delists Millions of Pages from Search Engine Indexes, uncovers the new robots.txt file now in place over at Craigslist. It basically reads:

##############################
# Exclude robots from these

User-agent: YahooFeedSeeker
Disallow: /forums
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj

User-agent: *
Disallow: /cgi-bin
Disallow: /cgi-secure
Disallow: /forums
Disallow: /search
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj

#####################################
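For readers curious how a crawler actually interprets these rules, here is a minimal sketch using Python's standard urllib.robotparser. The example paths below are hypothetical Craigslist-style URLs chosen only for illustration; they are not taken from the forum thread.

from urllib import robotparser

# The catch-all group from the Craigslist robots.txt quoted above.
RULES = """\
User-agent: *
Disallow: /cgi-bin
Disallow: /cgi-secure
Disallow: /forums
Disallow: /search
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj
"""

parser = robotparser.RobotFileParser()
parser.parse(RULES.splitlines())

# Hypothetical example URLs: one under a disallowed prefix, one not.
for url in ("http://www.craigslist.org/forums/123.html",
            "http://www.craigslist.org/sfo/apa/index.html"):
    verdict = "allowed" if parser.can_fetch("ExampleBot", url) else "blocked"
    print(url, "->", verdict)

Running this prints "blocked" for the /forums URL and "allowed" for the listing-style URL, which is the behavior a well-behaved spider such as Googlebot would follow when honoring these Disallow prefixes.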

Craigslist reportedly had millions of pages indexed at Google, 3.6 million to be exact, and millions more at the other search engines. Now? 211,000 at Google, 280,000 at Yahoo and 4,695 at MSN.

Forum discussion at Search Engine Roundtable Forums.

 
