I am glad I do not have 404 OCD, because if I did, I'd spend all my waking hours fixing 404 (page not found) errors on my websites.
A WebmasterWorld thread has some webmasters complaining that Google Webmaster Tools is reporting 404 errors for URLs that are not real pages and are not being linked to at all.
I came across some 404 pages on my site in Google WMT. These were pages that I know I definitely do not have on my site. Anyway, I investigated and looked at the linking pages, and they were external websites that are referencing my URLs in plain text, not actually linking with an <a href> tag.
It seems GoogleBot has tried to follow these URLs and is reporting them as broken links.
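To illustrate the distinction the webmaster is describing, here is a minimal sketch (my own illustration, not anything from the thread) that separates URLs a page actually links to via <a href> from URLs that merely appear as plain text, which a crawler might still try to fetch:

```python
from html.parser import HTMLParser
import re

class LinkExtractor(HTMLParser):
    """Collect href values from real <a> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def classify_urls(html):
    """Split URLs found in a page into real links vs. plain-text mentions."""
    parser = LinkExtractor()
    parser.feed(html)
    # Stop URL matches at whitespace, quotes, and angle brackets
    all_urls = set(re.findall(r'https?://[^\s"<>]+', html))
    linked = set(parser.hrefs)
    return linked, all_urls - linked

page = """<p>See <a href="https://example.com/real-page">this</a>.
Also check https://example.com/mentioned-only for details.</p>"""

linked, mentioned = classify_urls(page)
print(linked)     # real <a href> links
print(mentioned)  # plain-text mentions a crawler might still fetch
```

Only the second set is the kind of "link" the webmasters above are complaining about: nothing clickable exists, yet the URL can still end up in a crawl queue.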
This is supposedly a common issue.
But which would you prefer: not knowing about potential 404 errors, or having Google hide 404 errors that you cannot do much about?
Most 404 issues can be handled by keeping the 404, because the 404 makes sense; by setting up sensible redirects; or by placing real content at the URL.
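Those options can be sketched as a tiny dispatcher. This is a hypothetical illustration (the path names and the `resolve` helper are made up), not a recommendation from the thread:

```python
# Hypothetical mappings illustrating the three choices: redirect moved
# content, mark deliberately removed pages, and let the rest stay 404.
REDIRECTS = {
    "/old-post": "/new-post",   # content moved: answer with a 301 redirect
}
GONE = {"/retired-page"}        # deliberately removed: 410 Gone

def resolve(path):
    """Return (status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in GONE:
        return 410, None
    return 404, None            # everything else: an honest 404 is fine

print(resolve("/old-post"))      # (301, '/new-post')
print(resolve("/retired-page"))  # (410, None)
print(resolve("/bogus"))         # (404, None)
```

The point is that a 404 is only a problem worth fixing when you have somewhere better to send the visitor; otherwise it is the correct response.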
Forum discussion at WebmasterWorld.