Craigslist Blocks Most Spiders: Millions of Pages Delisted

Jan 3, 2006 - 8:10 am

A thread started at our forums named Craigslist Delists Millions of Pages from Search Engine Indexes uncovers the new robots.txt file in place over at Craigslist. It basically reads:

##############################
# Exclude robots from these

User-agent: YahooFeedSeeker
Disallow: /forums
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj

User-agent: *
Disallow: /cgi-bin
Disallow: /cgi-secure
Disallow: /forums
Disallow: /search
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj

#####################################
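
For context, robots.txt exclusion works by simple path-prefix matching applied per user-agent group: any URL whose path begins with a Disallow value is off-limits to a compliant crawler matching that group. The short Python sketch below uses the standard library's urllib.robotparser against the rules quoted above to illustrate the effect; the example URLs are illustrative only, and this is not how any particular search engine actually evaluates the file.

from urllib.robotparser import RobotFileParser

# The 2006 Craigslist rules quoted above, pasted in as a string so the
# example does not depend on fetching the (long since changed) live file.
robots_txt = """\
User-agent: YahooFeedSeeker
Disallow: /forums
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj

User-agent: *
Disallow: /cgi-bin
Disallow: /cgi-secure
Disallow: /forums
Disallow: /search
Disallow: /res/
Disallow: /post
Disallow: /email.friend
Disallow: /?flagCode
Disallow: /ccc
Disallow: /hhh
Disallow: /sss
Disallow: /bbb
Disallow: /ggg
Disallow: /jjj
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Category-style paths like /ccc match a Disallow prefix in the catch-all
# group, so a compliant crawler must skip them.
print(parser.can_fetch("Googlebot", "http://www.craigslist.org/ccc"))     # False
print(parser.can_fetch("Googlebot", "http://www.craigslist.org/search"))  # False

# A path not covered by any Disallow line remains crawlable.
print(parser.can_fetch("Googlebot", "http://www.craigslist.org/about"))   # True

# YahooFeedSeeker has its own, narrower group that does not block /search,
# so the same URL is allowed for that agent.
print(parser.can_fetch("YahooFeedSeeker", "http://www.craigslist.org/search"))  # True

The last check shows why the two groups matter: a crawler matching a named User-agent group follows only that group's rules and ignores the catch-all block.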

Craigslist supposedly had 3.6 million pages indexed at Google and millions more at the other search engines. Now? 211,000 at Google, 280,000 at Yahoo and 4,695 at MSN.

Forum discussion at Search Engine Roundtable Forums.
