Matt went on to answer at length, but basically said yes, Google does have whitelists, which they call "exception lists." It is important to note, however, that these exception lists exist on a per-algorithm basis. There is no single global whitelist where a site can never get hurt in the Google results - what some people call the Wikipedia whitelist. Google said that most of its algorithms don't have whitelists at all, but some do, and must, because no search algorithm is 100% perfect.
Danny and I wrote about this in more detail at Search Engine Land, and Google then provided a statement:
Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with computer algorithms. In our experience, algorithms generate much better results than humans ranking websites page by page. And given the hundreds of millions of queries we get every day, it wouldn't be feasible to handle them manually anyway. That said, we do sometimes take manual action to deal with problems like malware and copyright infringement. Like other search engines (including Microsoft's Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don't keep a master list protecting certain sites from all changes to our algorithms.
The most common manual exceptions we make are for sites that get caught by SafeSearch, a tool that gives people a way to filter adult content from their results. For example, "essex.edu" was incorrectly flagged by our SafeSearch algorithms because it contains the word "sex." On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.
Of course, we would much prefer not to make any manual changes and not to maintain any exception lists. But search is still in its infancy, and our algorithms can't answer all questions.
Yes, Microsoft said they also use exception lists. But those lists are continually revised and cleaned up; that is, when they push out an update to an algorithm, they try to ensure that the sites on the exception list can be removed from it, thus improving that algorithm.
To me, this is a big deal for the SEO space, and I am not sure why it hasn't gotten more attention yet.
I might be able to publish a video recording of the session where Google and Bing admitted to having these exception lists, so check back later today.
Forum discussion at Google Webmaster Help.
Update: I posted the full transcript and audio at Search Engine Land - enjoy!