Del.icio.us Blocks Search Engine Spiders

Feb 18, 2008 • 9:21 am | Filed Under Social Search Engines & Optimization

Colin Cochrane noticed that del.icio.us has blocked search engine spiders. He believes it's not a simple robots.txt exclusion; instead, del.icio.us is serving 404 errors based on the User-Agent header. Barry Welford confirmed this by changing his own User-Agent.
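
For what it's worth, here is a minimal sketch of how a check like Barry's could be reproduced: request the same URL twice, once with a crawler User-Agent and once with a browser one, and compare the status codes. The URL and User-Agent strings below are illustrative, not necessarily the ones he used.

    import urllib.request
    import urllib.error

    URL = "http://del.icio.us/popular"  # any page on the site; illustrative

    # One self-identified crawler UA and one ordinary browser UA (both illustrative).
    USER_AGENTS = {
        "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
        "browser": "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) Firefox/2.0.0.12",
    }

    for label, ua in USER_AGENTS.items():
        req = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            resp = urllib.request.urlopen(req)
            print(label, resp.getcode())
        except urllib.error.HTTPError as err:
            # Cloaking on User-Agent shows up here: the crawler UA gets a
            # 404 while the browser UA gets a 200 for the same URL.
            print(label, err.code)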

How did he come across this? He was using a Firefox add-on and couldn't locate a page he had referenced before. It was only when he ran the search directly on del.icio.us that he found it.

Not many people believe that this approach is a good idea. A 403 response code would be better, says Pierre (aka eKstreme). Pierre has also noticed a bunch of errors lately within Yahoo properties, including JavaScript errors and pop-up alerts that indicate something has broken. Some folks have turned the thread into a rant about Yahoo's competence at this point. Barry Welford puts it this way: "this may be a sign of a debilitating decline for del.icio.us and Yahoo! is in no position to invest massively in a property that has uncertain monetization."
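
To illustrate Pierre's point, here is a toy sketch of the distinction: a server that openly refuses crawlers with a 403 ("forbidden") instead of pretending the page doesn't exist with a 404 ("not found"). This is hypothetical, not Yahoo's actual setup, and the bot-token list is illustrative.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Substring tokens found in common crawler User-Agents (illustrative list).
    BOT_TOKENS = ("Googlebot", "Slurp", "msnbot")

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            ua = self.headers.get("User-Agent", "")
            if any(token in ua for token in BOT_TOKENS):
                # A 403 is honest: the page exists, but you may not have it.
                # Serving a 404 instead claims the page does not exist at all.
                self.send_error(403, "Crawlers are not permitted here")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Hello, human visitor.\n")

    HTTPServer(("", 8000), Handler).serve_forever()

The practical difference is what the crawler concludes: a 404 tells it the URL is gone and may get the page dropped from the index entirely, while a 403 signals a deliberate access policy.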

I honestly hope that is not the case.

But EGOL offers another take. It is possible that many people are gaming their way onto the del.icio.us front page (heck, I've seen some pretty bad-quality sites there myself), and this is the site's way of not passing link juice to them. It's not the most ideal solution, though, and it's a mistake to do this without being forthright about it.

Most people believe this is simply a mistake on Yahoo's part. Yahoo has been having a difficult time lately, and this doesn't help matters at all.

Forum discussion continues at Cre8asite Forums.
