Current Issues With Cloaking Using IP Delivery Technology

Mar 15, 2006 - 7:47 am
Filed Under Cloaking

Rand Fishkin started an excellent thread at the Cre8asite forums named Cloaking Beyond IP Delivery, Discussing Other Methods to Limit Access. In it, he asks for alternatives to cloaking through IP delivery, noting that several black-hat SEOs have told him the technique "has become passe." Instead of focusing on those alternatives, which can get a site kicked out of the index if implemented incorrectly, let's focus on the history of using IP-based delivery to perform what is known as cloaking.

For that we must read Ammon Johns' post, which goes through some of the history of IP delivery and its faults. Ammon explains that, because of how IP delivery works, if a spider's IP address is missing from your list, that spider will be served your standard page rather than the page you intended for it (a minimal code sketch of this lookup appears after the list below). Ammon then moves on to what he calls "decloaking hazards" that have shown up over the past several years. Here is his list of some of those "decloaking hazards:"

  • Translation services, such as AltaVista's Babel Fish, which would translate the page that was served to the spider rather than the page served to end users.
  • The cache feature at the search engines: clicking a cached result would show the page the spider was served, not what the end user was served. You could tell the search engine not to cache the page, but back in the day that was a great way for search engines to spot sites that were likely cloaking, since it raised a red flag. Ammon explained that cloakers had to use hidden text on the cloaked pages so the cached copy looked similar to what end users saw.
  • Toolbars and desktop search came out, giving search engines yet another method "to 'sample' exactly what the user is getting if it is even one bit different to what the engine has recorded."
  • Finally, search engines can, and probably do, send spiders through proxies and use spiders that behave more like humans, making them extremely hard to detect and cloak for properly.

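To make Ammon's point about missing spider IPs concrete, here is a minimal sketch of IP delivery in Python. The crawler addresses, file names, and function name are illustrative assumptions, not taken from the thread; the point is simply that any crawler whose address is absent from the list falls through to the standard page.

# Minimal sketch of IP delivery (hypothetical IP list and page names).
# Any request from an IP not on the crawler list -- including an
# unrecognized search engine spider -- receives the standard page.

KNOWN_SPIDER_IPS = {          # illustrative addresses, not a real crawler list
    "66.249.66.1",            # e.g. a Googlebot-range address
    "157.55.39.5",            # e.g. a Bingbot-range address
}

def select_page(client_ip: str) -> str:
    """Return the page to serve for this request."""
    if client_ip in KNOWN_SPIDER_IPS:
        return "spider_optimized.html"   # page meant for search engine spiders
    return "standard.html"               # page meant for human visitors

# A spider crawling from an address missing from the list is treated like a
# normal visitor and sees standard.html -- exactly the failure mode Ammon describes.
print(select_page("66.249.66.1"))    # spider_optimized.html
print(select_page("203.0.113.42"))   # standard.html (unlisted spider or user)

The same shortcoming is what the "decloaking hazards" above exploit: translators, caches, toolbars, and proxied spiders all fetch the page from an address the cloaker did not anticipate.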
So before deploying any form of IP delivery, discuss it with professionals, and be sure to check out the Cre8asite forum thread.

 
