Why SEOs Shouldn't Obsess Over the Site Command

Nov 26, 2008 • 8:10 am | comments (5) | Filed Under SEO - Search Engine Optimization

Many SEOs use the site command to see how healthy their site is in a particular search engine. You plug site:www.mydomain.com into a search engine, and it returns the number of pages it has indexed for that domain. If you know you have a hundred pages and the search engine has indexed 90% of them, then you are in pretty good shape.

But the problem is, the site command is often not all that reliable. We had recent reports that Google is dropping pages, and recent reports that Microsoft Live Search is dropping pages as well. Most SEOs determine a drop in pages indexed by the number of results returned by the engine for a site command.

But is this a valid way of determining how many pages of your site a search engine has indexed? From what I am hearing from search engine representatives at both Google and Microsoft, the answer is no. A webmaster should not depend on the number returned by a site command as a reliable indicator of the number of pages a search engine has indexed for their site.

Googler JohnMu wrote in a recent Google Groups thread three reasons why SEOs and Webmasters should not depend on this number:

  • The previous approximation was incorrect; the current one is closer to the actual number of URLs that we have indexed or would show to users
  • The previous approximation was close and the current one is worse than before (this can happen)
  • A change in our algorithms (we make a lot of changes that will impact crawling, indexing and ranking -- for some sites perhaps more than for others)

At the same time, Microsoft's Jeremiah Andrick told me that it "is problematic to use the 'site:' operator to determine how many pages for a site are included in the Live Search index. The 'Site:' operator generates an estimate of the pages in the index. These numbers can vary wildly depending on when you execute the query."

That being said, how can you get an accurate number of pages indexed by a search engine for your site?

I know Google's Webmaster Tools has, in its Sitemaps section, a place that shows the number of pages submitted in your Sitemap compared to how many URLs were actually indexed. This might be a better indicator, but I am nervous about that number, because I hear of reporting glitches in Webmaster Tools far too often.

Another option is to track each and every keyword phrase your pages rank for. Then check by keyword, not by site command, whether those pages rank. This can be time-consuming, but there are ways to automate it.
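The article doesn't name any specific tool for this, so here is a minimal sketch of what such automation might look like. The `fetch_results` function is a hypothetical stand-in for however you retrieve results (an official search API, a rank-tracking service, etc.); it is not a real API, and the keywords and URLs are made up for illustration:

```python
# Hedged sketch: check per-keyword rankings instead of relying on site:.
# fetch_results is a hypothetical placeholder -- plug in your own data source.

def fetch_results(keyword):
    """Hypothetical: return an ordered list of result URLs for a keyword."""
    raise NotImplementedError("plug in a real search API or rank tracker")

def check_rankings(targets, fetch=fetch_results):
    """targets maps keyword -> the URL you expect to rank.

    Returns keyword -> 1-based rank, or None if the page is absent
    from the results, which would suggest it fell out of the index.
    """
    report = {}
    for keyword, url in targets.items():
        results = fetch(keyword)
        report[keyword] = results.index(url) + 1 if url in results else None
    return report

# Example run with canned data standing in for live search results:
canned = {"blue widgets": ["http://example.com/widgets", "http://other.com/"]}
report = check_rankings(
    {"blue widgets": "http://example.com/widgets"},
    fetch=lambda kw: canned.get(kw, []),
)
# report == {"blue widgets": 1}
```

Run on a schedule, a report like this flags pages that stopped ranking for their keywords, which is a more direct health signal than a fluctuating site: count.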

Overall, the site command may not be the best way to determine how healthy your site is in a particular search engine. I know many SEOs use it as a factor, but maybe it is time to think again about this.

Forum discussion at Google Groups.



Michael Martinez

11/26/2008 05:21 pm

There is no "best" way to determine how many pages are indexed for a particular site, if by "best" we mean reasonably accurate and reliable. However, if Webmasters dig deep and perform consistent sampling with site queries, they can build reliable profiles of indexing activity for their sites. When you're used to seeing 2400 pages indexed (and you can easily verify that all 2400 appear in a search index by breaking down your queries for site sections), then finding only 1200 indexed is an indication of something.

Of course, no one should panic just because their search visibility has been halved, as long as that is happening to other people. Search engines occasionally rebuild their indexes and the process can be a bit unnerving for people. Through the years, I've watched sites lose search visibility on a temporary basis. It's good to know when that happens but people should not obsess over it.

One final thought, though: Webmasters who rely upon site search tools to augment their navigation should be sampling their query results frequently, making necessary adjustments in on-page optimization and navigation to help the search engines find the right content.


11/26/2008 07:21 pm

Search engine optimization is a never-ending process. Regardless of where you rank today, be it the top or the bottom, you still need to continue to optimize to either maintain or advance your ranking. Good website development, search engine optimization, and online marketing will forever go hand in hand.


11/27/2008 12:17 pm

There's also the Indexed URL count per Sitemap file which is available in Google Webmaster Tools. This count is (as far as I know) fairly accurate. As a bonus, since the count is per Sitemap file, you could create separate Sitemap files for the different parts of your site to find out which parts need help getting indexed properly.
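The per-section idea this comment describes can be sketched with a Sitemap index file following the sitemaps.org protocol, pointing to one Sitemap per site section; the domain and file names here are illustrative, not from the original post:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative Sitemap index: one Sitemap per site section, so Webmaster
     Tools reports an indexed-URL count for each section separately. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.mydomain.com/sitemap-blog.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.mydomain.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.mydomain.com/sitemap-archive.xml</loc>
  </sitemap>
</sitemapindex>
```

If the products Sitemap shows a much lower indexed ratio than the others, that section is the one that needs help getting indexed properly.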

Affan Laghari

11/30/2008 08:21 pm

If you are in doubt about a particular page, can't you do a "phrase search" or something else like intitle, inurl, etc. to confirm?


12/01/2008 09:40 am

I find that some of the other commands give me irregular results too, "intitle" specifically. Sometimes you have to do things the long way.
