Google Stops Indexing Craigslist; Matt Cutts Fixes

Mar 18, 2013 • 8:27 am | Filed Under Google Search Engine Optimization
 

A HackerNews thread highlights a blog post by Tempest Nathan reporting that Google had stopped indexing Craigslist.

It was true, Google did stop indexing Craigslist. But why?

Did Craigslist spam Google? Did they violate Google's webmaster guidelines? Did they add the noindex directive to their pages? Nope. None of this.

It was a technical quirk.

Matt Cutts, Google's head of search spam, explained in the HackerNews thread that Google is fixing the issue on its end, and described what technically happened:

To understand what happened, you need to know about the “Expires” HTTP header and Google’s “unavailable_after” extension to the Robots Exclusion Protocol. As you can see at http://googleblog.blogspot.com/2007/07/robots-exclusion-protocol-now-with-even.html , Google’s “unavailable_after” lets a website say “after date X, remove this page from Google’s main web search results.” In contrast, the “Expires” HTTP header relates to caching, and gives the date when a page is considered stale.
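The distinction Cutts draws can be sketched in code. This is a hypothetical illustration, not Google's implementation: the header names and date format follow Google's 2007 blog post on `unavailable_after` (which is delivered via the `X-Robots-Tag` response header), while the parsing helper and its timezone handling are simplifying assumptions.

```python
from datetime import datetime, timezone

# Format used in Google's unavailable_after examples, e.g.
# "25-Mar-2013 15:00:00" (timezone handling simplified to UTC here).
ROBOTS_DATE_FMT = "%d-%b-%Y %H:%M:%S"

def unavailable_after(headers):
    """Return the unavailable_after cutoff from an X-Robots-Tag
    header, or None if the directive is absent.

    Note that the Expires header is deliberately ignored: it governs
    caching (when a response goes stale), not indexing.
    """
    tag = headers.get("X-Robots-Tag", "")
    marker = "unavailable_after:"
    if marker not in tag:
        return None
    raw = tag.split(marker, 1)[1].strip()
    # Drop a trailing timezone abbreviation ("GMT", "PST", ...) if present.
    head, _, tail = raw.rpartition(" ")
    if tail.isalpha():
        raw = head
    return datetime.strptime(raw, ROBOTS_DATE_FMT).replace(tzinfo=timezone.utc)

headers = {
    # Caching hint: response is stale after this moment.
    "Expires": "Mon, 18 Mar 2013 08:42:00 GMT",
    # Indexing directive: drop the page from results after this moment.
    "X-Robots-Tag": "unavailable_after: 25-Mar-2013 15:00:00 GMT",
}
print(unavailable_after(headers))  # 2013-03-25 15:00:00+00:00
```

The point of the sketch is simply that the two headers carry unrelated meanings, so an indexer that conflates them is making a guess about the site's intent.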

A few years ago, users were complaining that Google was returning pages from Craigslist that were defunct or where the offer had expired a long time ago. And at the time, Craigslist was using the “Expires” HTTP header as if it were “unavailable_after”–that is, the Expires header was describing when the listing on Craigslist was obsolete and shouldn’t be shown to users. We ended up writing an algorithm for sites that appeared to be using the Expires header (instead of “unavailable_after”) to try to list when content was defunct and shouldn’t be shown anymore.

You might be able to see where this is going. Not too long ago, Craigslist changed how they generated the “Expires” HTTP header. It looks like they moved to the traditional interpretation of Expires for caching, and our indexing system didn’t notice. We’re in the process of fixing this, and I expect it to be fixed pretty quickly. The indexing team has already corrected this, so now it’s just a matter of re-crawling Craigslist over the next few days.

So we were trying to go the extra mile to help users not see defunct pages, but that caused an issue when Craigslist changed how they used the “Expires” HTTP header. It sounded like you preferred Google’s Custom Search API over Bing’s so it should be safe to switch back to Google if you want. Thanks again for pointing this out.
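The special-case algorithm Cutts describes, which inferred that some sites were using `Expires` as a listing-expiry date, could plausibly have rested on a heuristic like the following. This is purely a guess at the shape of such a check (the function, its name, and the two-day threshold are all invented for illustration); Cutts gives no details of Google's actual logic.

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def looks_like_listing_expiry(expires_values, fetched_at,
                              min_horizon=timedelta(days=2)):
    """Hypothetical heuristic: a site whose Expires values consistently
    land days-to-weeks in the future is probably using Expires as a
    content-expiration date (like Craigslist listings), not as a
    short-lived cache freshness hint.
    """
    horizons = [parsedate_to_datetime(v) - fetched_at for v in expires_values]
    return all(h > min_horizon for h in horizons)

fetched = datetime(2013, 3, 18, 8, 0, tzinfo=timezone.utc)
samples = ["Mon, 25 Mar 2013 08:00:00 GMT",   # a week out
           "Sun, 31 Mar 2013 08:00:00 GMT"]   # two weeks out
print(looks_like_listing_expiry(samples, fetched))  # True
```

Under a rule like this, Craigslist's switch to short, cache-style `Expires` values would flip the classification, which matches Cutts's account of the indexing system failing to notice the change.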

Interesting....

Forum discussion at HackerNews.

 

Comments:

Andrea Moro

03/18/2013 02:00 pm

Interesting, yes, but are we really sure they have implemented the Expires header properly? I had a look at a couple of their pages, and the expiration was set 15 minutes out for listings that expire much later than that.

John McCheap

03/18/2013 04:04 pm

Indexing or not, Craigslist pages no longer show up in the top SERPs, so what's the point? Too few people know how to use the site:craigslist.org "query" operator to find stuff on the site.

Gabe Garcia

03/19/2013 01:22 pm

So did Craigslist alert Google that their pages were no longer being indexed and, if yes, how? Or, did Google realize the issue and take the initiative?
