Matt Cutts: Google May Treat The Same Subdirectories As Unique

May 7, 2014 • 8:27 am | comments (7) | Filed Under Google Search Engine Optimization

Someone asked Matt Cutts, Google's head of webspam, on Twitter whether Google would treat the same subdirectory as different pages if the URLs are slightly different.

The examples given were:

- .com/dir
- .com/dir/
- .com/dir.htm
For most of you, the answer is pretty obvious: most of the time, no, Google will figure it out. But Matt Cutts said that, theoretically, Google can treat them differently. When? I assume when the content on each of those versions is unique.

Matt Cutts responded on Twitter, saying "in theory those can all be different." He added that Google decides when to merge them and when not to.
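To illustrate the kind of merging being discussed, here is a minimal sketch of URL normalization in Python. The rules below (case-fold the host, drop a trailing slash, drop the fragment) are illustrative assumptions; Google's actual merging logic is not public.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url):
    # Hypothetical normalization rules -- Google's real merging logic is not public.
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    netloc = parts.netloc.lower()
    path = parts.path
    # Treat "/dir/" and "/dir" as the same resource
    if path.endswith('/') and path != '/':
        path = path.rstrip('/')
    # Drop the fragment: "#top" never reaches the server
    return urlunsplit((scheme, netloc, path, parts.query, ''))

variants = [
    'http://example.com/dir',
    'http://example.com/dir/',
    'http://example.com/dir#top',
]
# Note: example.com/dir.htm is NOT in this list -- as Matt Cutts says,
# it can legitimately be a different page, so we leave it alone.
print({normalize_url(u) for u in variants})  # prints {'http://example.com/dir'}
```

The point of the sketch is the judgment call: some variants are safe to collapse, while others (like a .htm suffix) can genuinely be different documents, which is exactly why the merge decision is made case by case.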


Forum discussion at Twitter.



Adam Buchanan

05/07/2014 04:43 pm

.com/dir, .com/dir/, and .com/dir.htm ARE all different pages. It would be more annoying if Google over-assumed and treated them as the same when they aren't. I think Google handles this fine. SEOs and developers should code their sites like big boys and quit relying on Google to fix their issues for them. Not to mention that Google isn't the only search engine, and I'm sure the others aren't going to make that assumption either.

Joe Preston

05/07/2014 05:01 pm



05/07/2014 06:05 pm

And they treat http different from https... and URL params and # anchors... AND if you don't use canonical tags on your pages it opens the door to certain types of duplicate content negative SEO attacks using these alternate URL forms... it gets even more hairy when your DNS can wildcard subdomains to the same content... There is a lot of room to shoot yourself in the foot if you are not very strict about your URL structure and navigation practices.

Jitendra Vaswani

05/07/2014 07:10 pm

If they treated them the same, then SEO would totally change. Amen

klungus funghsu

05/08/2014 09:57 am



05/08/2014 03:42 pm

why would SEO totally change?

TheeDesign Studio

05/08/2014 08:59 pm

Oh the possibilities:

- http and https versions
- /index.html or .php or whatever extension
- # bookmark locations on the page
- query parameters
- session ids
- searches being indexed

Anyone want to calculate how many possibilities this is? Don't rely on Google to clean this up, fix your site.
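Taking the commenter's question literally, here is a back-of-envelope count. The dimensions and the number of options in each are illustrative assumptions, not measured values; the point is that the variants multiply.

```python
from math import prod

# Illustrative variant dimensions for a single page (assumed counts)
variant_dimensions = {
    'http vs https': 2,
    'trailing slash or not': 2,
    'bare /, /index.html, or /index.php': 3,
    'tracking query parameter present or not': 2,
    'session id present or not': 2,
}
total = prod(variant_dimensions.values())
print(total)  # 2 * 2 * 3 * 2 * 2 = 48 URL variants for one page
```

Even this conservative sketch yields dozens of URLs for one document, which is why strict URL hygiene and canonical tags matter.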
