Google Getting Stricter On The Supplementals?

Feb 9, 2007 • 7:13 am | comments (6) | by Barry Schwartz | Filed Under Google Search Engine Optimization

A WebmasterWorld thread reports many cases of people noticing pages on their sites going into the supplemental index. Even more striking, they are noticing that a site:www.domain.com query returns one result and then shows you:

In order to show you the most relevant results, we have omitted some entries very similar to the 1 already displayed. If you like, you can repeat the search with the omitted results included.

And when you click to display those results, up come pages that are not necessarily carrying the supplemental tag. Those pages also tend to have unique titles and content - so they are technically not all that similar.
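
For the curious, the "repeat the search with the omitted results included" link simply re-runs the same query with Google's duplicate filter switched off, by appending filter=0 to the results URL. Below is a minimal Python sketch that fetches a site: query both ways and looks for the omitted-results notice. Google's result markup changes and automated queries are frowned upon, so the notice string and the crude length comparison are illustrative assumptions, not a supported API:

    # Illustrative sketch only: compare a site: query with Google's
    # duplicate filter on (the default) and off (filter=0, the parameter
    # the "omitted results" link appends). The notice text is an
    # assumption based on the message quoted above and may change.
    import urllib.parse
    import urllib.request

    NOTICE = "we have omitted some entries very similar"

    def fetch(url):
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8", errors="replace")

    def check_site_query(domain):
        query = urllib.parse.quote("site:" + domain)
        base = "http://www.google.com/search?q=" + query
        filtered = fetch(base)                   # duplicates collapsed
        unfiltered = fetch(base + "&filter=0")   # omitted results included
        return {
            "notice_shown": NOTICE in filtered,
            "grew_when_unfiltered": len(unfiltered) > len(filtered),
        }

    print(check_site_query("www.example.com"))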

I do have some examples validating these reports - but I cannot share them (I promised).

But then you have sites, like this one, that do not have that problem (i.e., site:www.seroundtable.com).

Preferred member Whitey writes:

- Our 2nd result is not supplemental
- On some of our sites we are near 100% free of supplementals when clicking on "show more results"
- A site: search on a directory, such as site:oursite.com/Widget1/, shows the same pattern described above.
- Meta titles and meta descriptions are unique
- Currently our traffic is increasing.

Current observations:

- Header text is sometimes being displayed as the SERP "description"
- Sites with good "trust rank" seem to be unaffected
- Affected members report that traffic climbed first, then crashed
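
Since duplicate titles and meta descriptions are one suspected trigger for this collapsing, it is worth auditing your own pages before assuming a penalty. Here is a minimal Python sketch that flags repeated <title> and meta description values across a list of pages - the URL list at the bottom is a hypothetical placeholder, so substitute your own:

    # Minimal audit sketch: fetch a few of your own URLs and report any
    # <title> or meta-description values that repeat across pages.
    # The URLs below are placeholders; point this at real pages.
    import urllib.request
    from collections import defaultdict
    from html.parser import HTMLParser

    class HeadParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.title, self.description = "", ""
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self._in_title = True
            elif tag == "meta":
                d = dict(attrs)
                if (d.get("name") or "").lower() == "description":
                    self.description = d.get("content") or ""

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    def audit(urls):
        seen = {"title": defaultdict(list), "description": defaultdict(list)}
        for url in urls:
            html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
            parser = HeadParser()
            parser.feed(html)
            seen["title"][parser.title.strip()].append(url)
            seen["description"][parser.description.strip()].append(url)
        for field, values in seen.items():
            for value, pages in values.items():
                if value and len(pages) > 1:
                    print("duplicate %s on %d pages: %r" % (field, len(pages), value))

    audit(["http://www.example.com/", "http://www.example.com/widgets/"])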

I have a feeling it has to do with page popularity (i.e. linkage).

Forum discussion at WebmasterWorld.

Comments:

Matt Cutts

02/09/2007 03:45 pm

If you do find someone willing to mention their domain that has a) one result showing up for site:theirdomain.com and b) plenty of unique titles/descriptions, I'd be interested to get examples so that I can ask someone about this. Just FYI, I think we'll be changing the rule-of-thumb for site: searches so that even if sites do have the same title/meta descriptions, we'll still show plenty of results instead of that "click to see more results" link. It will probably take a few weeks before it's fully live, but I know it's on someone's to-do list.

Barry Schwartz

02/09/2007 04:02 pm

Cool, good to know!

Michael Martinez

02/09/2007 05:11 pm

If it's link-related, it's not due to "popularity". I have sites with few inbound links that don't show this behavior. I have sites with substantial inbound links that do show this behavior. So if it's link-related, I would guess (at this time) that it may be more linking pattern-related.

Keri Morgret

02/09/2007 10:38 pm

Matt, were you guys messing with some of the site: SERPs today? On several sites, I did a site: search and hit refresh. Randomly, some pages would display a description and some would not. None of the descriptions came from the meta description tags; they were taken from the actual site text itself. The link to show the hidden pages from the site was also missing. One site I have with identical meta description tags across 20,000 pages (working with the CMS as we speak) did not do this; it still showed just the identical meta description when I went to the display-more-pages link. Anyone else notice this? It was happening about 2:00 pm PST; by 2:30 PST it had cleared up.

BigBerries

02/10/2007 11:08 am

Thanks for the information. Traffic has really been odd lately. It seems we follow the rules and still get dinged. But help me understand this: why does it matter if you don't have thousands of results in a site:domain.com search? If a user is searching for your site, the one result would and should be sufficient - so all the supplemental results are in fact supplemental. If a user searches for site:domain.com + "search terms", they would ideally get what they are looking for. The average user is not going to troll through 50,000 pages looking for the widget information on your site; most barely use advanced search. If the terms in your meta tags, or a subject a person was searching for, suddenly stopped returning your site, then I could see the reason for the uproar. But if you're looking to use site:domain.com as a type of sitemap or index, shouldn't that be your own job as a webmaster and not Google's? As far as I know, average users don't search using site:domain.com - for the most part, only seekers and webmasters do that. Like I said, I'm looking to get a better understanding, but that is what makes sense to me. On the flip side, if Google drops results such as site:domain.com + "search term", then I can see that as a problem. Now what would be great is if your site used Google search for internal queries: when the results are "supplemental", users could still see them via the powered-by-Google search on your site. Don't mind me if that makes no sense - just realize it's 3 AM.

MetroWebDev

02/10/2007 04:15 pm

I noticed, several days before the new "links" tab was added to the Webmaster console, that the site: search for one of the major sites I work on showed a HUGE drop in the number of indexed pages - but when clicking on the "show omitted" link, the normal number of indexed pages showed and there weren't any new supplemental pages. I just figured it was an error with the site: command (wouldn't be the first time). Then, when the new links tab was added, I noticed that the number of indexed pages showing in the site: search pretty closely matched the number of pages on my site that Google showed inbound links to. Then I started doing site: searches on the top 20 competitors for one of my main keywords and found that a couple of the sites showed the exact same number of indexed pages (260) and then had the "show omitted" link at the bottom. This led me to believe that the site: search issue was only affecting sites that use Google Sitemaps and that the number of indexed pages shown was, for some reason, capped at 260. Maybe the 260 thing was just a coincidence, and maybe it is more dependent on the number of pages on your site that have inbound links, plus the fact that a site uses Google Sitemaps.
