Google Site Command Becoming More Accurate?

Oct 30, 2006 • 7:31 am | comments (3) | Filed Under Google Search Engine Optimization
 

A WebmasterWorld thread has some credible sources claiming that Google's site: command (e.g., site:www.domain.com) is becoming more accurate. In the past, running a site: command on a domain could return far more pages than are actually accessible on the site.
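For reference, the query operators discussed in this post and its comments take roughly these forms (example.com is a placeholder domain, not one from the thread):

```
site:www.example.com          pages Google has indexed on the host
site:www.example.com/blog/    indexed pages within one sub-directory
info:www.example.com          Google's summary information for a single URL
```

The reported page counts for the site: forms are estimates, which is what the thread says has become more accurate.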

WebmasterWorld administrator Tedster said:

I also see sites returning numbers that are reasonable now when they were always 4X or worse -- and I see this even in cases where there was no canonical fix (or issue) on the part of the site owner. I think theBear got it right. The site: operator is returning better url number estimates now. Matt Cutts said that this was in the works.

This may be directly related to some folks using Google Sitemaps within Google Webmaster Central.

Forum discussion at WebmasterWorld.


Comments:

Michael Martinez

10/30/2006 04:13 pm

Actually, quite the opposite seems to be the case. None of the Google queries that SEOs can use seem to be reporting accurate results. The changes they have made over the past few months make it almost impossible to get reliable information about what they have actually indexed. This is starting to look like a classic case of "New Improved Garbage!" syndrome, where an intended upgrade actually degrades the quality of results. The primary query mechanism seems to be working fine. If you just search for keywords, you get pretty good results. They've just managed to break almost everything else.

Matt Cutts

10/30/2006 04:54 pm

I believe that site: estimates have gotten more accurate. The estimates aren't perfect, but they're much better than they were at (say) the beginning of the summer. Michael, what queries don't seem to be as accurate in your experience lately?

Michael Martinez

10/30/2006 08:00 pm

Matt, I'm finding that an info: query may or may not work. I can't tell if I'm looking at different data centers or not. And when I do a site: search for a sub-section of a site, I get all the pages from the entire domain. This type of query worked as I would expect up until recently. For example, if I just want to see everything in a particular sub-directory (not a sub-domain), I get the entire domain. URL searches (aka "contain the term") are producing erratic results; I may or may not get something back. Over the past week or two, I've been able to run URL queries that produce no results, but if I do an info: query on the domain name I get results. When trying to find out what a client's or prospect's coverage in Google may be (in terms of indexed pages versus known total pages), I'm having to make wild guesses. It's bad enough having to explain that there are multiple data centers involved, but now I have to explain that "Google has changed things," and that just doesn't sound too good.
