Got ASP.NET 2.0? Googlebot Can Trip a Bug in the Code

Nov 8, 2007 • 10:14 am | Filed Under Other Google Topics

Brendan Kowitz wrote a blog post about an interesting anomaly he noticed as Googlebot was crawling his site. Apparently, in March of 2006, Googlebot's User-Agent string was changed, which triggered a bug in ASP.NET 2.0 web applications. As a result, Googlebot could not crawl the affected pages properly; it received 500 Internal Server Errors and assumed the pages were unavailable, even though they were not.
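If you want to check whether your own server trips on Googlebot's User-Agent, a minimal sketch is to request a page while presenting that string and watch the status code. The URL below is a placeholder for your own site, and the User-Agent shown is the widely reported post-2006 Googlebot string; verify it against Google's documentation before relying on it.

```python
import urllib.request
import urllib.error

# Reported Googlebot User-Agent string after the March 2006 change
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
URL = "http://www.example.com/"  # placeholder: point this at your own site

request = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
try:
    with urllib.request.urlopen(request) as response:
        print(f"Server answered {response.getcode()} -- page looks crawlable")
except urllib.error.HTTPError as e:
    # A 500 here, but not with a normal browser User-Agent, suggests the
    # server is choking on the bot's User-Agent string
    print(f"Server answered {e.code} -- Googlebot would treat this page as unavailable")
```

Comparing the result against a second request sent with an ordinary browser User-Agent makes it clear whether the failure is specific to the bot string.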

A number of WebmasterWorld members were hit by this bug, and their rankings dropped because Googlebot kept finding 500 Internal Server Errors when spidering their pages. The bug itself now seems to be fixed; the only problem is that the rankings have not been restored. (Google, can you fix this?)

Forum discussion continues at WebmasterWorld.
