I often try to stay away from covering tools, but with a Sphinn thread garnering over 70 votes and 35 comments, I cannot pass this one up.
SEOmoz launched a new tool called Linkscape. The tool draws on a growing index of 30 billion pages crawled by the SEOmoz team to plot detailed linkage data for SEOs. In the past, SEOs relied on search engine data obtained through APIs or scrapers. SEOmoz decided to crawl the web itself to gather this data, reducing the reliance on search engines for critical linkage data.
While most SEOs are really happy to have this tool, some are eager to block SEOmoz from crawling their sites. One said:
Cool tool, but I want to know the user agent so I can block it - no sense giving my competitors my data if the tool ends up working well.
I am not sure what the user agent is, but most believe the crawler obeys the standard robots.txt protocol. The tool, along with the whole SEOmoz site, seems to be down at the moment, probably due to heavy usage. I am told the user agent discussion is taking place over at SEOmoz's feedback thread.
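For anyone planning to opt out once the user agent is announced, the standard robots.txt approach would look something like this. Note that the bot name below is purely a placeholder, since SEOmoz has not yet published the actual user agent token:

```
# Hypothetical example: replace "SEOmozBot" with the actual
# user agent token once SEOmoz announces it.
User-agent: SEOmozBot
Disallow: /
```

Assuming the crawler honors the protocol as people believe, a record like this in your site's robots.txt would keep it out of your link data entirely.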
In any event, I commend SEOmoz for undertaking this project. I know it is expensive and complex, but it is also a valuable new tool that should help SEOs get their job done.
Update: I spoke with Rand, and he said they will announce a single user agent that people can use to block the SEOmoz bots from crawling their sites. What do you want to name that user agent?