Andy Beard Blocks Googlebot with Robots.txt

Feb 18, 2008 • 8:47 am | comments (4) | Filed Under Google Search Engine Optimization

This past weekend, Andy Beard did what many people considered unthinkable. In a blog post, he announced that he has blocked Google from crawling the paid reviews on his site. His reasoning is clear:

I have spent a long time deciding on a course of action, and have decided that blocking my content using Robots.txt is ultimately better for me, and better for people hiring my services.
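Andy's post doesn't reproduce his exact rules, but blocking a crawler from a section of a site via robots.txt looks something like the following sketch (the `/reviews/` path is hypothetical, chosen just to illustrate the idea):

```text
# robots.txt at the site root
# Keep Googlebot out of the paid-review section (directory name is an assumption)
User-agent: Googlebot
Disallow: /reviews/

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Because the rule targets only `Googlebot`, other search engines can still crawl and index the reviews, which is part of why this is a selective sidestep rather than pulling the content entirely.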

But there's a certain brilliance to this strategy. He argues that nofollow is not the answer to Google's paid-link troubles, and states:

My content will still be in the index, filtered through an extra layer of editorial control, but there is going to be a whole lot more of it.

Google have made it clear that they are only worried about the existence of links, and not the time it takes to create content, expertise, and whether links within reviews were specified or given in an editorial capacity.

I honestly don't like junk reviews written purely for SEO purposes, but as Google seem determined to impose the letter of the law rather than the spirit, throwing the baby out with the bath water, whilst I will comply to the letter of the law, I can't see a reason why I shouldn't sidestep the charging bull.

As the discussion ensues, some people are wondering what would happen if everyone implemented this practice:

If everybody did that, Google would have some problems.

Other people feel that Andy's approach is a bit extreme. Tim Dineen says, "Blocking Google is like wishing it'd be daylight for all 24 hours - it's not gonna be worth the time you spend wishing. It's not worthwhile, though it's certainly nice to wish."

Others, like Barry Welford, say that it's great for Andy and not as great for Google:

It sounds a great idea. I'm sure he'll do very well and if anything Google loses a little in relevancy.

What do you think? Forum discussion continues at Sphinn.



Andy Beard

02/18/2008 02:23 pm

Hi Tamar, thanks for the coverage. It is good to see a few people who appreciate the subtle difference between using nofollow and robots.txt. I don't track how many sites are still "syndicating" my content, as their trackbacks all get eaten by Spam Karma, but I usually see a new blog almost every day.
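The "subtle difference" Andy mentions comes down to what each mechanism controls. A nofollow attribute on a link tells Google not to pass endorsement (PageRank) through that link, but the linked page can still be crawled and indexed; a robots.txt disallow stops Googlebot from fetching the page at all. A rough illustration (URLs and paths are placeholders):

```text
<!-- nofollow: no endorsement flows through this link,
     but Google may still crawl and index the target page -->
<a href="http://example.com/some-review" rel="nofollow">review</a>

# robots.txt: Googlebot never fetches anything under this path
# (so links inside those pages are never even seen)
User-agent: Googlebot
Disallow: /reviews/
```

Under Andy's approach, the review content stays live for human readers and other engines, while Google simply never crawls it, which is a stronger guarantee than sprinkling nofollow on individual links.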

Jonathan Dingman

02/18/2008 03:14 pm

It's an interesting move, but one thing to keep in mind is those people who, sadly, depend on Google for traffic and revenue. Not all of us can afford to simply "block" Google :(

Michael Woo

02/21/2008 12:14 pm

Very interesting, but I think Google will somehow find a way to filter out these things and 'still penalize' you. He got Google's motive wrong: Google is not simply upholding its TOS when it comes to PR, it wants to monopolize the advertising market altogether.

Pavan Somu

07/03/2012 11:32 am

This is really funny. Every blogger wants his blog to be crawled by Google's bots, but Andy looks quite unique :P
