NY Times Is Cloaking But Not Spamming, Says Danny Sullivan

Jun 19, 2006 • 7:20 am | comments (0) by | Filed Under Search Engine Cloaking / IP Delivery

On Friday, I asked New York Times Allowed to Cloak Content?, where I explained that I felt the NYTimes was indeed cloaking content, based on Matt Cutts's interpretation, and that it was receiving special treatment from Googlebot. Danny Sullivan posted his thoughts in the forum thread, stating clearly:

Do I think the NYT is spamming Google? No. Do I think they are cloaking? Yes. Do I think they should be banned because Google itself warns against cloaking? No.

Yes, Danny believes they are cloaking. But Danny, like many others, feels that Google should not ban NYTimes.com or sites like it.

Of course, there are others who do not feel this is a typical case of cloaking, and cloaking can be defined in different ways. But I prefer to use Google's definition of cloaking, or at least Matt Cutts's definition.

A strong question was also raised in the thread:


That would be nice. I always try to let my readers know when I link out to a subscription-required page. Some news search engines do that as well, by adding "registration req." or "subscription" in small text. If Google is allowing this, then at least give us that detail. And at least enable all publications to do the same. And clarify your policy on such "cloaking" practices.

Continued forum discussion at Search Engine Watch Forums.
