A huge topic at the SES conference last week was duplicate content. The definition of duplicate content, and how search engines handle it, have changed a lot over the past few years, which is why I dated the title of this post.
A Cre8asite Forums thread discusses just that.
In short, duplicate content is not a penalty, and it hasn't been treated as one in years.
When you have 20 URLs serving the same page of content, a search engine will do its best to pick the best one on your behalf and filter out the rest.
Why? The search engines do not want the same page in their index more than once, because it wastes resources and provides a bad search experience (showing the same result twice is not good).
So search engines (Google, Yahoo, MSN, Ask.com) all try to pick the best page (the one with the cleanest URL, the most links, and so on). But if they pick the wrong URL (not the page you consider best), you may see it as a penalty, when it is not.
This is why you should help the search engines out by using 301 redirects and robots.txt files to tell them which pages are the important ones. With Google, you can also use Sitemaps and raise the priority score of the important pages relative to the others.
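To make that concrete, here is a minimal sketch of both techniques, assuming an Apache server with mod_rewrite enabled; `example.com` and the `/print/` directory are placeholders, not anything from the conference sessions. First, a 301 that consolidates the non-www and www versions of a site onto one hostname, so the engines only see one URL per page:

```
# .htaccess sketch (assumes Apache with mod_rewrite; example.com is a placeholder)
RewriteEngine On
# If the request came in on the bare domain...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...permanently redirect it to the www version.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

And a robots.txt that keeps crawlers out of a hypothetical printer-friendly directory that duplicates the main pages:

```
# robots.txt sketch -- block a duplicate "print" version of the site
User-agent: *
Disallow: /print/
```

The 301 tells the engines which URL is canonical and passes the links along to it; the robots.txt simply keeps the duplicates from being crawled at all. For the Sitemaps route, the protocol's optional <priority> tag takes a value from 0.0 to 1.0 (0.5 is the default), so you can mark the page you want indexed with a higher value than its duplicates.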
So it is your choice: let the search engines choose for you, or make the choice yourself.
Forum discussion at Cre8asite Forums.