New York Times Allowed to Cloak Content?

Jun 16, 2006 • 8:25 am | Filed Under Search Engine Cloaking / IP Delivery

A SearchDay article by Danny and Chris over at Search Engine Watch, titled Getting The New York Times More Search Engine Friendly, discusses how Marshall Simmonds (first at About.com, which was later acquired by the NY Times) made NYTimes.com search engine friendly. Part of that process was allowing the search engines, including Google, to access, crawl, index, and rank content that would require a username and password from a normal Web user.

Danny and Chris pose the question and answer it themselves: "Isn't this cloaking—serving different pages to a search engine and an individual web browser? Yes, it is." But there is a caveat:

Although both Google and Yahoo warn against cloaking, Marshall says both companies are aware of what the Times is doing, and apparently condone the practice.

"They want the content, and they're very interested in displaying it," says Marshall.

Reviewing Google's latest statements on cloaking, you see that Matt Cutts makes a clear distinction:

So IP delivery is fine, but don't do anything special for Googlebot. Just treat it like a typical user visiting the site.
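The distinction Cutts draws can be sketched in a few lines of code. This is a hypothetical request handler, not anything from NYTimes.com's actual implementation; the function names and return values are purely illustrative:

```python
# Illustrative sketch of Matt Cutts' distinction. Not real NYTimes.com
# code; all names and values here are hypothetical.

def serve_page(user_agent: str, logged_in: bool) -> str:
    """Acceptable: every visitor, Googlebot included, hits the same rule."""
    if logged_in:
        return "full article"
    return "login wall"

def serve_page_cloaked(user_agent: str, logged_in: bool) -> str:
    """Cloaking: the crawler is special-cased past the login wall."""
    if "Googlebot" in user_agent:
        return "full article"  # something special for Googlebot
    if logged_in:
        return "full article"
    return "login wall"
```

In the first function, Googlebot is treated "like a typical user visiting the site" and sees the login wall; in the second, the bot is detected and handed content no anonymous visitor would get, which is exactly the special treatment Cutts warns against.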

NYTimes.com is clearly doing something "special for Googlebot" here, and under Matt Cutts' definition of acceptable IP delivery, this does not qualify. At other engines like Yahoo!, Ask, and MSN, which have not taken as strong a stance on cloaking, this most likely would be acceptable. But at Google, I believe, based on Matt Cutts' continued campaign against cloaking, this would not fall within Google's webmaster guidelines.

Forum discussion at Search Engine Watch Forums.
