Google Updates "Can Competitors Harm Ranking" Statement

May 29, 2012 • 8:44 am | comments (21) | Filed Under Google Search Engine Optimization

The time stamp on the Google document that answers the question "Can competitors harm ranking?" was updated on May 22nd. Although I swear it was updated a couple of months ago, and I thought I covered it (maybe I was dreaming; prophecy ;-) ).

Now it reads:

Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.

Before it read:

There's almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.

Google: Can Competitors Harm Ranking

There are several threads saying that this change means that negative SEO is possible and Google admits it by changing this statement. We have threads at WebmasterWorld, TrafficPlanet and Google Webmaster Help.

In fact, the Google Webmaster Help thread was from April 18th, so I am not sure what was changed on May 22nd, because the content copied and pasted in that thread is exactly what I see there now. I am thinking maybe the video above the content was added, since Google published that video last week.

What do you make of this statement change, which again happened at least as far back as April 18th? Update: it was around March 14th, as Hobo-Web had it on their blog then.

Forum discussion at WebmasterWorld, TrafficPlanet and Google Webmaster Help.

Image credit to Shutterstock for waiting cow



Alex Leigh

05/29/2012 01:08 pm

It's clear that Penguin allows you to screw over competitor sites in a big way. Rand covered it pretty well in an SEOmoz WBF a few weeks back, with the well-known examples where this has happened. I think it's good that Google is acknowledging this in their own small way, but really the way forward is to simply devalue spammy links rather than allowing them to hurt sites. That way this problem doesn't exist and spam content (in theory) falls to the bottom. If they did that, then I might agree that they are "working hard" to correct this problem.


05/29/2012 01:11 pm

Of course they won't say that competitors can't harm a site. In real life, they would cut into their own profits if website owners started to put each other out of business and into bankruptcy by denouncing or blackmailing, both techniques that would be possible if negative SEO really exists.


05/29/2012 01:13 pm

I found that update message on April 6, and I posted to WMW


05/29/2012 01:13 pm

Acknowledging the problem is the first step of solving it...

Barry Schwartz

05/29/2012 01:23 pm

Yeah, I am sure I covered it; I just can't find it now, so I posted again…

James Gurd

05/29/2012 03:09 pm

I think it's a sensible change of wording - how can they guarantee you won't get harmed by someone else's bad practices? The fact is, Google has created its own monster: with search penetration over 80% and market domination, the financial gains for being at the top and nailing your competitors are high. For every 99 hard-working business peeps out there trying to succeed via hard work and intelligence, there will be the one who just wants the money and doesn't care how they achieve it. So, as Russell says, acknowledging the issue is a sensible step. Next is how to ensure websites that get nailed are quickly reinstated and the people responsible punished. How long is that piece of string...... Thanks, James

SEO Profi

05/29/2012 04:17 pm

In my interview with Kaspar - in Polish :) - he stated already on February 27th that it is possible that competitors may harm our site :) - question 11 -

Scott McKirahan

05/29/2012 04:19 pm

Actually, Rand admitted in that article that those websites had done things themselves which could have caused the drop in rankings post-Penguin. There is still no definitive proof of negative SEO being implemented and working. It sure gives bad SEO companies, who don't know how to do anything but deliver webspam, something to sell. They love the panic this conjecture is causing!


05/29/2012 04:24 pm

I think this actually changed back in March... Dave Harry wrote about it. The May change seems to have been the format changing to add the pic of MC.

Jaan Kanellis

05/29/2012 04:54 pm

Google updated it again: "We are pretty positive that we can't produce relevant and accurate SERPs without your help. We need you to use the nofollow attribute, submit spam reports and somehow control who links to you and why. If you do all this work for us, it will not guarantee anything, and your site will still more than likely suffer in the SERPs."

Rob Woods

05/29/2012 04:56 pm

Google needs to do a better job of allowing webmasters to specify which links should pass no value. It would allow site owners to combat negative SEO and clean up either their own former wrongdoing or that of a consultant or agency. It can clearly be done. I know of someone who bought a domain that had formerly been an "adult" site and had a backlink profile to match. They were able to get those links devalued, essentially resetting the link profile and starting to grow natural links. The problem is that they had to go to Google I/O and corner a search engineer to make it happen, which most site owners obviously are not going to be able to do. If it's my site and third parties can damage it through no action of my own, at least allow me to manage which backlinks "count". Sure, it's a little more data to manage on each site, but it would be relatively easy to handle in Webmaster Tools.

David Sewell

05/29/2012 05:57 pm

As well as robots.txt we may need ignore.txt containing a list of domains we want crawlers to disconnect from our backlink graph.

Dustin Williams

05/30/2012 02:54 am

I am not surprised at this update. It is entirely possible that negative SEO can be used to harm a competitor's rankings. You can read my thoughts on this topic at


05/30/2012 10:47 am

That's right, Miranda. I recall a discussion on our forum that mentioned this over a month ago:


05/31/2012 12:08 am

For people who do not think negative SEO is possible, there is a simple way to test it. Do the following:

1) Go to
2) Buy 5 x 70k+ Scrapebox blasts and aim them at your own site.
3) Buy 5 x big XRumer blasts.
4) If you feel like really going hard, add a few linkwheel gigs.
5) Repeat for 6 months.

Use your main/highest-paying anchor text in all of these gigs. Only 3 possible things can happen: 1) you go up the SERPs, 2) you go down, or 3) you stay where you are. Yes, you will be attempting negative SEO on yourself, but Google won't know who is making these links, will they? I personally think not too many people here would dare try this on their own site. Except maybe Barry ;)

Samuel Junghenn

05/31/2012 03:14 am

Sure Scott, want to put your money site here so that all the negative SEOers reading this can come and help you out with testing?

Ashutosh Kumar

05/31/2012 05:58 am

I see it as a way for Google to fix web quality, like it did with the page speed factor. :) But this act is more than a cruel thing...

Rigseo Seo Optimization

06/08/2012 03:47 am

In the past, Google had a policy that one-way incoming links could do no harm, which was logical. It's absurd to contact the sites linking to my site for every bad link and wait for them to remove it, or to report to Google that this is a bad link; in some cases it's not even possible. Why is Google not simply ignoring bad links? Bikram, RiG SEO Service

David M.

06/19/2012 02:10 pm

I'm pretty sure negative SEO IS possible, but Alan is right - few people are going to actually "test" it on their own sites, and they're not going to waste time and money testing it on their competitors. Ultimately, I think the real question is - what company IS going to waste resources trying to slam other people in the SERPs?

Marc Perez

07/18/2012 04:58 pm

I know what they are referring to. At one point, you could request a URL to be removed from Google's index in GWMT. Someone figured out how to trick the URL removal tool into removing a competitor's site from the index using their own GWMT account. That was one of the major bugs.

David Sewell

11/04/2012 10:39 am

and five months later, my wish is granted with the disavow tool.
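For readers who land here later: the disavow tool takes a plain-text file uploaded in Webmaster Tools. Lines beginning with # are comments, a domain: line disavows every link from that host, and a bare URL disavows a single page. A minimal sketch (the domains below are made up for illustration):

```text
# Spammy blast pointed at us in May 2012; webmaster never replied
domain:spam-linkfarm.example.com

# One-off bad page; the rest of the site is fine
http://www.example.net/bad-directory/page.html
```

Google has described disavowed links as a strong suggestion it will generally honor, rather than a guaranteed, instant removal from the link graph.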
