Imagine that: an army of bots that crawl the Web for pages that look like they're spamming the search engines. This makes me scratch my head and want to build it myself.
Some of the downsides or challenges:

- Would these bots need to comply with the robots.txt file? If someone excludes "spam bot", can it crawl anyway?
- How is spam defined? I guess we could assign some sort of spam level and chart it from green to red. That could work (thinking out loud).
- If the search bots can't pick up the spam, then how can my army of spam bots pick it up? Do the spam bots need to wear camouflage? :)
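Just to think through the robots.txt question: Python's standard library already ships a parser for the exclusion protocol, so checking whether a given user agent is allowed to crawl a page is a few lines. A minimal sketch, assuming a hypothetical "SpamBot" user agent and a site that blocks it:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for a site that excludes our "SpamBot" crawler.
robots_txt = """\
User-agent: SpamBot
Disallow: /
"""

rp = RobotFileParser()
# Parse the rules directly (in practice you'd fetch http://site/robots.txt).
rp.parse(robots_txt.splitlines())

# A polite crawler honors the exclusion for its own user agent...
print(rp.can_fetch("SpamBot", "http://example.com/page"))   # False
# ...while agents not named in the file are still allowed.
print(rp.can_fetch("OtherBot", "http://example.com/page"))  # True
```

Of course, nothing technically *stops* a bot from crawling anyway; robots.txt is a convention, not an enforcement mechanism, which is exactly why spammers could block such a bot by name.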
This topic is currently being discussed over at the Search Engine Watch Forums.