Specifically, he estimates that 30% to 50% of the links to a site are not reported, even cumulatively, across both third-party tools and search-engine-backed tools. He said he uses Open Site Explorer, Majestic SEO, Webmaster Tools, and Ahrefs, and he thinks that at best only 50% of the links to a site are reported back.
Where does he come up with this number? He says:
I base these statements on my experiences with helping sites recover from link penalties. At Stone Temple Consulting we have helped more than 50 sites recover from penalties this year, and it has happened over and over again that we would help these sites by cleaning out bad links only to have Webmaster Tools report lots of new links the next time it was queried. The reported new links were not new and I have no doubt that Google knew about them before, but simply did not choose to include them in the Webmaster Tools report. However, once we cleaned out some of the bad ones, we got exposed to some more of the links residing in their database.
Do you think 30-50% is accurate?
Is there even a reliable way for us to count all the links to a site without building our own spider, one guaranteed to crawl every page on the internet? Even then, you couldn't be sure.
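One way to approximate the "cumulative" reporting he describes is to union the backlink exports from each tool and see what share of the combined set each tool found. A minimal Python sketch (the tool names and links below are made up, and the true denominator stays unknowable, since links missed by every tool never show up at all):

```python
def combined_coverage(reports):
    """Union per-tool backlink sets and report each tool's share of the union."""
    all_links = set().union(*reports.values())
    shares = {tool: len(links) / len(all_links) for tool, links in reports.items()}
    return all_links, shares

# Hypothetical exports from two tools for the same target site.
reports = {
    "tool_a": {"site1.com/p1", "site2.com/p2", "site3.com/p3"},
    "tool_b": {"site2.com/p2", "site4.com/p4"},
}

all_links, shares = combined_coverage(reports)
# tool_a covers 3 of the 4 known links, tool_b covers 2 of the 4.
```

The catch, as the post points out, is that this only measures coverage of the links some tool has already found; the 30-50% allegedly missing everywhere is invisible to this calculation.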
Forum discussion at WebmasterWorld.
This post was written earlier this week and scheduled to be posted today.
Image credit to BigStockPhoto for zipper