SEOs Rejoice Over SEOmoz's New Linkscape Tool

Oct 7, 2008 • 7:17 am | comments (7) | Filed Under Search Engine Optimization Tools

I often try to stay away from covering tools, but in this case, with a Sphinn thread that has over 70 votes and 35 comments, I cannot pass it up.

SEOmoz has launched a new tool called Linkscape. It draws on a growing index of 30 billion pages crawled by the SEOmoz team to plot detailed linkage data for SEOs. In the past, SEOs relied on search engine data obtained through APIs or scrapers; SEOmoz decided to crawl the web itself to get this data, reducing the reliance on search engines for critical linkage data.

While most SEOs are really happy to have this tool, some are eager to block SEOmoz from crawling their sites. One said:

Cool tool, but I want to know the user agent so I can block it - no sense giving my competitors my data if the tool ends up working well.

I am not sure what the user agent is, but most believe the crawler obeys the standard robots.txt protocol. The tool, along with their whole site, does seem to be down right now, probably due to heavy usage. I am told the user agent discussion is happening over at SEOmoz's feedback thread.

In any event, I commend SEOmoz for undertaking this project. I know it is expensive and complex, but it is also a valuable new tool that should help SEOs get their job done.

Forum discussion at Sphinn and SEOmoz.

Update: I spoke with Rand, and he said they will announce a single user agent that people can use to block the SEOmoz bots from crawling their sites. What do you want to name that user agent?




10/07/2008 11:53 am

"...most are of the belief that they do obey the standard robots.txt protocol" This was actually <a href="" rel="nofollow">confirmed</a> by Nick Gerner: "...we're crawling everything we can get our hands on (excluding REP protected pages of course!)."


10/07/2008 12:04 pm

Actually, Linkscape is quite a useful metrics tool.

Michael Martinez

10/07/2008 05:25 pm

You have to know a user-agent in order to block a well-behaved robot through robots.txt. Otherwise, you have to hope it obeys whatever you put in the asterisk (*) section, and some webmasters may use that section to only partially block spiders. SEOmoz needs to publish a single user-agent that everyone can block with a dedicated section.
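The point above can be sketched with Python's standard-library robots.txt parser. Note that "SEOmozBot" is a purely hypothetical user-agent name here, since SEOmoz has not yet announced the real one: a dedicated section can fully block one crawler while the wildcard section only partially restricts everyone else.

```python
import urllib.robotparser

# Hypothetical rules: block "SEOmozBot" entirely, while the wildcard
# section only keeps all other crawlers out of /private/.
ROBOTS_TXT = """\
User-agent: SEOmozBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The dedicated section blocks the named bot everywhere...
print(rp.can_fetch("SEOmozBot", "/index.html"))  # False
# ...while other crawlers are only blocked from /private/.
print(rp.can_fetch("Googlebot", "/index.html"))  # True
print(rp.can_fetch("Googlebot", "/private/data.html"))  # False
```

Without a published user-agent name, the first section cannot be written at all, which is exactly the problem the comment raises.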

diarmuid ryan

10/07/2008 06:23 pm

the more tools the better...anything that makes my life easier is a welcome addition to my toolbox...

Affan Laghari

10/09/2008 02:06 am

"anything that makes my life easier is a welcome addition to my toolbox..." -> AND to my competitors' toolboxes

Bharath Reddy

10/13/2008 09:30 am

Thanks to SEOmoz for introducing a new tool, "Linkscape". This tool is amazing and gives everyone using it a huge competitive advantage. However, I was told that Linkscape can be used to give you a picture of how the other search engines view your site. Is that true?


10/15/2008 09:31 am

The first and last question I asked was the User-agent and the IP address for LinkScape.
