A WebmasterWorld thread asks if cleaning up code will improve Google Rankings. Can removing bad code get better rankings in the SERPs?
Not so much for Google, as one webmaster points out; he saw better results in MSN after cleaning up his code.
Other benefits of clean code include more repeat visitors (thanks to faster load times) and a better chance of earning links from authority sites.
Meanwhile, some expect that invalid code may become a problem in the future. One member writes:
I spent a lot of time over the past six months cleaning up my code so that each page validates. While I don't think there's any benefit with Google if you skip an alt tag here and there, there is the idea that this could matter in the future. As Google marches along, it adds to the characteristics that are important to the algo. By doing this housekeeping, you are ready if this happens.
Tedster echoes this sentiment:
The most problematic kinds of true errors in your HTML mark-up can be very difficult to spot by eye - they really need a tool, such as the W3C HTML Validator, to be ruled out with certainty.
What are these problems? Things like an unclosed quotation mark or a missing angle bracket on a tag. You can stare for hours at your source code and miss that kind of thing. But until you fix that kind of error, a section you intended as content may just look like an invalid attribute, or something like that. Browsers have different error recovery routines, and the fact that the content displays on screen is no guarantee that Google's index will "see" it as content.
Eventually Google's error recovery routines may pick up a clue farther along in the code - and after that point, the rest of the page can be indexed. But there can easily be a gap, sometimes with important content, that just gets skipped. I speak here from painful experience.
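Tedster's point is easy to see with a toy example. The sketch below is a deliberately simplified text extractor (not Googlebot's actual parser, whose behavior is unknown): inside a tag, a quoted attribute value runs until the next matching quote, so a single missing closing quote can swallow everything that follows it.

```python
def extract_text(html: str) -> str:
    """Naively extract visible text by skipping over tags. Inside a tag,
    a quoted attribute value runs until the matching quote -- so a missing
    closing quote consumes everything up to the next quote or end of file."""
    text, i, n = [], 0, len(html)
    while i < n:
        if html[i] == "<":
            i += 1
            # scan to the end of the tag, honoring quoted attribute values
            while i < n and html[i] != ">":
                if html[i] in "'\"":
                    quote = html[i]
                    i += 1
                    while i < n and html[i] != quote:
                        i += 1  # text here is treated as attribute value
                i += 1
            i += 1  # skip past ">"
        else:
            j = i
            while j < n and html[j] != "<":
                j += 1
            text.append(html[i:j])
            i = j
    return "".join(text)

valid  = '<p><a href="/page">link</a> Important paragraph.</p>'
broken = '<p><a href="/page>link</a> Important paragraph.</p>'  # closing quote missing

print(repr(extract_text(valid)))   # -> 'link Important paragraph.'
print(repr(extract_text(broken)))  # -> '' (the whole paragraph vanishes)
```

One dropped character and the entire paragraph disappears from the extracted text, even though a lenient browser would still render it on screen. That is exactly the class of error a validator catches in seconds and eyes miss for hours.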
It's something to keep in mind for the future, but it's not a big deal now unless your code is bad enough that the Googlebot is choking on it.
Forum discussion continues at WebmasterWorld.