Got ASP.NET 2.0? Googlebot Can Trip a Bug in the Code

Nov 8, 2007 - 10:14 am 0 by
Filed Under Misc Google

Brendan Kowitz wrote a blog post about an interesting anomaly he noticed while Googlebot was crawling his site. In March 2006, Googlebot's User-Agent string changed, and the new string triggered a bug in ASP.NET web servers. As a result, Googlebot could not crawl affected pages properly; instead, it received 500 Internal Server Error responses and assumed the pages were unavailable, even though they were not.
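If you suspect your site is affected, one quick check is to request a page while presenting Googlebot's post-March-2006 User-Agent string and see whether the server answers with a 500. The sketch below does this with Python's standard library; the URL is a placeholder, and the User-Agent string shown is the commonly cited Googlebot string from that change, not something taken from Kowitz's post.

```python
import urllib.request
import urllib.error

# The "Mozilla/5.0 (compatible; ...)" form Googlebot switched to in
# March 2006, which is what tripped the ASP.NET bug described above.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def check_as_googlebot(url):
    """Fetch `url` with Googlebot's User-Agent and return the HTTP
    status code. A 500 here, when a normal browser gets a 200,
    suggests the server mishandles the crawler's User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # e.g. 500 if the server trips on the UA

# Example (placeholder URL):
#   status = check_as_googlebot("http://www.example.com/default.aspx")
#   if status == 500:
#       print("Server errors out for Googlebot's User-Agent")
```

Comparing the status you get here against a request with your browser's normal User-Agent narrows the problem to User-Agent handling rather than the page itself.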

A number of WebmasterWorld members were hit by this bug, and their rankings dropped because Googlebot found 500 Internal Server Errors when spidering their pages. The bug itself appears to be fixed -- the remaining problem is that the rankings have not been restored. (Google, can you fix this?)

Forum discussion continues at WebmasterWorld.
