Do Blogs Suffer From Duplicate Content?

Oct 20, 2006 • 7:06 am | comments (5) | Filed Under SEO - Search Engine Optimization
 

A featured WebmasterWorld thread goes over the technical reasons WordPress blogs in particular are so susceptible to duplicate content issues. But it is not just WordPress; most blog platforms have the same problem, including this one.

This single article I am writing right now can be found on a number of different pages, all on the same site. You will be able to find it on the home page, on the search engine optimization category page, on the 10/20/2006 daily archive, on the October 2006 monthly archive, and on the individual entry page. And if I placed this article in multiple categories, you would find the same content on those category pages as well.
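To make the overlap concrete, here is a minimal Python sketch of the idea, assuming hypothetical example.com archive paths rather than real URLs. It flags URLs that serve byte-identical pages; a serious check would extract and compare just the article text, since archive pages wrap the same body in different navigation.

import hashlib
import urllib.request

# One article reachable through several archive views. These URLs are
# hypothetical stand-ins for the paths a typical blog generates.
urls = [
    "https://example.com/",                                 # home page
    "https://example.com/category/seo/",                    # category archive
    "https://example.com/2006/10/20/",                      # daily archive
    "https://example.com/2006/10/",                         # monthly archive
    "https://example.com/archives/duplicate-content.html",  # entry permalink
]

pages_by_digest = {}
for url in urls:
    try:
        body = urllib.request.urlopen(url).read()
    except OSError as exc:  # covers 404s on the made-up paths, DNS errors, etc.
        print(f"skipped {url}: {exc}")
        continue
    digest = hashlib.sha256(body).hexdigest()
    pages_by_digest.setdefault(digest, []).append(url)

for pages in pages_by_digest.values():
    if len(pages) > 1:
        print("same content at:", pages)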

Is this a problem? I don't see it as a huge one. I have always been an advocate of Google figuring out the problem rather than having webmasters figure it out, because most people outside the SEO world have no idea this is an issue. And for the most part, I think they do a good job with the widely used blog software.

But if you are worried, the WebmasterWorld thread has tips and further discussion on what to do.

Forum discussion at WebmasterWorld.

Comments:

Mike, The Internet Guy

10/20/2006 02:15 pm

There is some really bad advice in that thread. Basically, there is no such thing as 'noindex,follow'. If Google does not index a page, doesn't it make sense that they would not follow the links on that page? I have tried the 'noindex,follow' solution and it does not work; it resulted in the entire blog I was working on being dropped from the index. Just my 2 cents.

Joe Dolson

10/20/2006 03:11 pm

And, furthermore, from a usability perspective I think that having multiple paths to a specific article is useful. Being able to comb the site by date or category to find an article is beneficial to me. If I'm trying to find a particular article, I expect to be able to click on a category and have a chance of finding it there; having articles in multiple categories helps me, because I may not think exactly the same way you do. I don't think there should be any penalties or filters on creating multiple pathways to the same goal, so I'm inclined to agree that this should be the search engine's problem. Besides, the evidence seems to show that blogs have a disproportionately strong representation in SERPs, which hardly points to heavy duplicate content filtering.

Barry Schwartz

10/20/2006 03:15 pm

Thanks for your comments, all very good and important points...

Blackbeard

10/20/2006 04:04 pm

Another thing to consider is what causes pages to be indexed in the first place. Graywolf and QuadZilla both blogged recently about how more internal linking gets more pages indexed. QuadZilla put a link to every post on his blog in the sidebar, and it had a positive effect on how his pages were indexed. In my own experience I've also found that the more a page is linked to on your site, the more likely it is to be indexed and rank well. With a blog, there is often a lot of content with a pretty poor linking structure: once content gets older, it might sit 2-5 clicks from the front page. So it might not be a dupe content issue at all; it might just be that deep pages don't have enough links (votes) to be indexed in the first place.
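To picture the depth problem, here is a toy Python sketch, using hypothetical page names rather than any real blog: a breadth-first search over a small internal-link graph that counts the fewest clicks from the home page to each post.

from collections import deque

# Hypothetical internal-link graph: the home page links to the newest post
# and the first archive page; each archive page links one step further back.
links = {
    "home":           ["newest-post", "archive-page-1"],
    "archive-page-1": ["recent-post", "archive-page-2"],
    "archive-page-2": ["older-post", "archive-page-3"],
    "archive-page-3": ["oldest-post"],
}

def click_depth(start="home"):
    """Breadth-first search: fewest clicks from start to every reachable page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

for page, d in sorted(click_depth().items(), key=lambda item: item[1]):
    print(f"{page}: {d} click(s) from home")

The oldest post comes out 4 clicks deep, while a sidebar that links every post, as QuadZilla did, would put each one a single click from the home page.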

Wordpress Workshop

10/20/2006 05:10 pm

Refining the path to more important content has its benefits. I do not visit that forum anymore, but if you look at any of my blogs you will see that I have removed many duplicate features. Google can also handle duplicates well now, so no worries. And yes, lots of bad advice in there for sure.
