This is a question I have seen asked many times on the forums, usually met with speculation and careful consideration of the potential risk. It might be of interest to those of you who run large websites or who, for one reason or another, like to add TONS of new pages to the Google index regularly. If you are doing it currently, then you probably know your risk level and what you can and cannot do. I hadn't seen any good explanations with possible answers until I came across some good information yesterday from Donna at SEO Scoop that I thought deserved a mention. Donna gave the true scoop from the WMW conference in New Orleans last week. She went about finding the best tidbits and common gossip and posted some good short summaries. While attending the Meet the Google Engineer session she picked up some good info. In regards to adding hundreds if not tens of thousands of pages, she reports that:
Google does not specifically filter for any one particular thing like that. Instead the algorithm looks at other similar situations and determines if the action is good or bad. For example, if a 2-page site suddenly adds 10,000 pages, there may in fact be a legitimate reason for it to do so. But the algorithm will first make the assumption that the action is "suspicious" and will then look at a large sample of other 2-page sites that have suddenly added 10,000 pages. If the majority of those sites were considered spammy, then your site will get lumped into the same spammy category. Of course, if the majority of those sites were deemed to be legitimate, then your site would likewise be deemed legitimate. Basically, he admitted that there may in fact be absolutely nothing wrong with your site, but it may be filtered or penalized if it fits the profile of other sites that have been marked as spammy.
He suggests that if your site falls into such a category (of being legit, but being lumped into a spammy classification), then you should let Google know about it. He implied that the matter would be looked into, and that the algorithm may be adjusted to be able to handle situations like yours in the future, but that nothing may be done to directly impact your particular site (other than adjusting the algorithm in general).
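The heuristic Donna describes, where a site's sudden page growth is judged by the verdict already handed to a cohort of similar sites, could be sketched roughly like this. This is purely illustrative, assuming a simple majority vote; the function and label names are hypothetical and nothing here reflects Google's actual implementation:

```python
# Illustrative sketch only (not Google's actual code): classify a sudden
# page-count jump by the majority verdict of other sites with a similar
# before/after profile, as described in the quoted report above.

def classify_growth_event(pages_before, pages_added, cohort_labels):
    """Return a verdict for a site that grew from `pages_before` pages
    by `pages_added` pages, based on labels ("spam" or "legit")
    previously assigned to comparable sites. All names are hypothetical.
    """
    if not cohort_labels:
        return "unknown"  # no comparable sites to judge against
    spammy = sum(1 for label in cohort_labels if label == "spam")
    # The site inherits the majority verdict of its cohort, even if
    # there is nothing actually wrong with the site itself.
    return "spam" if spammy > len(cohort_labels) / 2 else "legit"

# Example: a 2-page site adds 10,000 pages; most comparable sites
# that did the same thing were marked spammy.
print(classify_growth_event(2, 10_000, ["spam", "spam", "legit", "spam"]))  # → spam
```

Notice the uncomfortable consequence the engineer admitted to: a perfectly legitimate site gets the "spam" verdict whenever its cohort is mostly spammy, which is why he suggests contacting Google if you get lumped in unfairly.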
Check out the full report at SEO Scoop. There are a few forum threads about this, but at this late hour I can't seem to find them. Check your local hangout for related discussion.