Win/Win: Block Unwanted Links In Google Webmaster Tools

May 23, 2012 • 9:31 am | comments (59) | by Barry Schwartz | Filed Under Google Search Engine Optimization

For years, webmasters and SEOs have been asking Google for a way to specify which links they do not want Google to count (positively or negatively) toward their rankings.

The concept is simple: you go to your link report in Google Webmaster Tools and click an action button that says "don't trust this link," or something like it. Google then takes that as a signal not to use that link in its link graph and ranking algorithm.
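
To make the idea concrete, here is a purely hypothetical sketch of what such a submission might look like. Google has published no format, so every line below is an illustration, not a spec:

    # Hypothetical "don't trust these links" file (illustrative only).
    # Flag a single linking page:
    http://spammy-directory.example.com/links/page37.html
    # Flag every link from an entire domain:
    domain:scraper-network.example.net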

The reason webmasters and SEOs want it is simple: it would give them an easy way to remove negative links that may be hurting their sites' rankings.

But do SEOs really want it?

What I can't understand is why Google hasn't released it yet. It is a great way for Google to do mass spam reporting by webmasters and SEOs without calling it spam reporting. After a penalty, webmasters will rush to call out which links they feel are hurting them. Google can use that data to back up its algorithms on links it already knows are spam, and also to find new spam it might not have caught.
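
To see why that data would be valuable, consider a toy sketch of the crowdsourcing angle, written in Python and entirely my own invention, not Google's pipeline: if many unrelated webmasters all flag links from the same domain, that domain becomes a strong spam candidate.

    # Toy sketch (not Google's actual system): aggregate "don't trust
    # this link" reports and surface domains flagged by many distinct sites.
    from collections import defaultdict
    from urllib.parse import urlparse

    def suspect_domains(reports, min_reporters=50):
        """reports: iterable of (reporting_site, flagged_url) pairs."""
        reporters = defaultdict(set)
        for reporting_site, flagged_url in reports:
            reporters[urlparse(flagged_url).netloc].add(reporting_site)
        # A domain flagged independently by many different webmasters is a
        # much stronger spam signal than any single report.
        return sorted(domain for domain, sites in reporters.items()
                      if len(sites) >= min_reporters)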

That is a win for Google. They will find more spam this way than via a basic spam report tool.

It may be perceived as a win for SEOs and webmasters, because they can easily discount links that might be hurting them.

But is it really a win? I have no doubt that most of the links a webmaster decides to discount would never have been marked as spammy by Google in the first place. So who is really benefiting?

You go through your links, including links that might be helping your site, discount them, and then hand Google your reasons for discounting them.

I really do not see why Google has not released this functionality yet. Am I missing something? Who cares if it really discounts anything; Google would be collecting tons of data.

Forum discussion at Google Webmaster Help.

Update: Matt Cutts of Google said at SMX Advanced to expect this type of tool within a few months.

Comments:

Alistair Lattimore

05/23/2012 01:40 pm

The most obvious reason why not is that it'd allow someone to spam, get caught, and then simply undo the damage. If you're willing to spam your way to the top of the rankings, are you running the kind of high quality site, and indirectly business, that Google wants to see at the top of their rankings?

Barry Schwartz

05/23/2012 01:41 pm

But like I said, it doesn't have to really work. Google can just put it there and treat it as a hint without actually acting on it.

Fedor

05/23/2012 01:44 pm

Would be nice, but would you really want to sit there and disable links all day for a bunch of sites? I get incoming links from scraper sites that come up as 404s because their spiders are garbage. It's easy enough for Google to filter out problem domains, but Google is not perfect, so a tool like this would provide additional data and hopefully be put to some use.
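
(For readers who want to do the kind of triage Fedor describes by hand, here is a minimal sketch in Python. It assumes a backlink export in CSV form with the linking page in the first column and the linked target on your site in the second; that layout is an assumption for illustration, not a documented format.)

    # Minimal sketch: flag backlinks whose targets on your own site now 404,
    # a common signature of garbage scraper links. The CSV layout is assumed.
    import csv
    import requests

    def broken_targets(csv_path):
        broken = []
        with open(csv_path, newline="") as f:
            for source, target in csv.reader(f):
                try:
                    status = requests.head(target, allow_redirects=True,
                                           timeout=10).status_code
                except requests.RequestException:
                    status = None  # unreachable; handle separately if needed
                if status == 404:
                    broken.append((source, target))
        return broken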

Sean

05/23/2012 01:46 pm

Interesting idea Barry; I for one would like the opportunity to do this. The first thing that came to my head is that I could harm my competitors by creating a resource that they link to and then informing Google via GWT that those links are not to be trusted. Having said that, this would still be a better situation than the current one!

Anti-SEO

05/23/2012 01:56 pm

First of all, it's not so easy when such a huge algo is involved. Secondly, I suppose Google will move further away from links as an algo base and toward users' behaviors. Links will still be counted, but much less, the same way PageRank went from major factor to nothing. Links are too easy to manipulate compared to behaviors. Why waste more time on this algo factor?

Joel Mackey

05/23/2012 02:11 pm

Well, what's the point of that? Give people who have enough work to do a gerbil wheel to run on while Google laughs at us? I think it's a great idea, but adding it and making it useless? Let's just not and say we did.

AJ Kohn

05/23/2012 02:42 pm

I've been talking about this for a couple of years, not just since the Penguin update. I think it's a fairly straightforward bit of crowdsourcing data about bad link neighborhoods. It's another 'line' of data that Google could use and integrate into their efforts. I'm not sure I'd use the term block, but simply 'disavowing' those links would be a nice option. Of course, much of this would be predicated on GWT displaying ALL the links.

Barry Schwartz

05/23/2012 02:43 pm

Yep, this is not a new thing. I am sure I blogged it before. But I am not sure if it is a win/win as everyone is saying.

Samuel

05/23/2012 02:44 pm

I am trying to find this option in Webmaster Tools but am not seeing it... where is it?

Barry Schwartz

05/23/2012 02:46 pm

It isn't there. It's a request for Google to maybe add it.

AJ Kohn

05/23/2012 02:57 pm

Oh, it'll have some negative consequences and I'm sure someone will get their GWT hacked (or will simply let some vendor stay connected) and they'll mark all the links as bad. And there's a huge bias here of those who do it and those who don't. The vast majority of sites don't use GWT so I'm sure Google's concerned that it wouldn't be equitable. But there's a Pareto principle involved with those who would participate. I think Google may not be interested because it would simply play into the obsession with links.

Lyndon NA

05/23/2012 03:03 pm

This has come up in a few discussions with Google over the years, and recently as well. There are plenty of reasons G are hesitant to do this, such as:
* Shooting their own foot: plenty of naive site owners will misreport and lose the value of quality links.
* It will take more resources and consume even more time/effort from G.
* Many site owners don't even use GWT, so you will end up with a division of benefit, which an "automated" approach should avoid.
* Quick escape: repenting after being caught is a bit of a cheat for cheaters. Why should they be able to own up, shed the dodgy, and then recoup rapidly?
It's not an easy thing. G already discount some link sources, link types, etc. Why don't they simply extend that? They know the spammy sites, so why not automatically discount those from the word go? Why are they forcing people to dig through all that limited info when they can do it themselves?

Webstats Art

05/23/2012 04:01 pm

It does not solve the problem. For a start, SEO Book might consider my links to them spammy while I do not. It is all subjective. At the end of the day, Google is deciding, like a god, who will reign and who will not. We have to find ways to make our business models more independent of Google search and to make Google search more irrelevant. You guys have managed to do this here with me, because I regularly come back to read your stuff and I don't need Google to help me find SEO ROUND TABLE. But I can search for SEO ROUND TABLE and see some Dejan SEO site, which I have not bothered to visit much since you guys are more entertaining.

Scott McKirahan

05/23/2012 04:39 pm

To Lyndon's comment - "Why should they be able to own up, shed the dodgy and then recoup rapidly?" ... If they "shed the dodgy" which is what got them up in the rankings to begin with (and then led to their demise), I'm not sure how they will "recoup rapidly" if the only way they know to get high in the rankings is to employ methods that a new Algorithm change has devalued.

RyanMJones

05/23/2012 04:49 pm

My guess: never. It's not robust and it doesn't scale well. And I'm still not 100% convinced that you can harm a legit site that hasn't done any spam simply by pointing bad links at it.

Henway

05/23/2012 04:52 pm

The problem this solves is the possibility of negative SEO. But negative SEO doesn't become a problem if Google just devalues bad links as opposed to penalizing them. Maybe the Penguin update just wanted to plant the thought that spammy links will hurt you in the long run and that doing any type of link building is bad. It's like the movie Inception... maybe they aren't really doing anything different in their algorithm but just want to scare you and change the perception of SEO. It's working, because people are doing things like de-optimizing and not obsessing over anchor text now.

Steve Gerencser

05/23/2012 04:56 pm

This was a great idea the first time we started asking for it 5 years ago. And while it may be seen as a way to spam and then disavow the spam, wouldn't a better solution be to simply not count those links to begin with?

Jesse Friedman

05/23/2012 05:15 pm

Keep in mind, not every industry is impacted by social influence. To rely on social as a major factor might work for some industries/subjects but not others.

Anti-SEO

05/23/2012 05:28 pm

Users' behaviors are much broader than just social signals, and they can definitely be measured for every website in every niche.

Jesse Friedman

05/23/2012 05:35 pm

Sorry, I thought you meant social signals. In re-reading your comment, I do not know why I thought that. Yes you are right about user behavior in general but I would think that could be gamed quite easily.

Takeshi Young

05/23/2012 07:41 pm

That would actually involve Google providing a list of all your backlinks, something which they have been reluctant to do.

Talprihar

05/23/2012 08:12 pm

Yes, you are missing something:
1. Such a function would give everyone a license to spam. That means: "I will blast myself with 1,000 low-quality links that push me up the SERPs. Oops, I got penalized? Not a big deal, I will just block the links via WMT" and then do that whole process again, and again, and again. Kind of a backdoor...
2. Google does not create functions that should be used only by SEOs.
3. Google has no interest in encouraging webmasters to spend too much time exploring their links rather than "making great content that people will link to."
4. One has no way of knowing the real value of links. As Matt Cutts stated in a previous video, people would try to copy links while having no idea most of them are devalued anyway; the same goes for blocking them.
In short: never gonna happen.

Barry Schwartz

05/23/2012 08:15 pm

1) They already have spam reporting tools, so your logic would apply there as well. 2) Yes they do. It is called Google Webmaster Tools. 3) They have a link reporting tool in Google Webmaster Tools, so webmasters do spend time there. Finally, I really think it will happen within the next 6 months. I could be wrong, but hey, I think it will happen.

Carlos Fernandes

05/23/2012 08:45 pm

Totally agree with this post Barry... I too have been saying this tirelessly for years. I can't see any downside. There is already an obsession with links; putting this new feature out there would at least combat the notion of being unarmed against potential negative SEO from competitors.

Carlos Fernandes

05/23/2012 08:46 pm

I just wish they would listen to requests.

Carlos Fernandes

05/23/2012 08:50 pm

You can download thousands of your backlinks in Webmaster Tools, and you can get even more if you use the Google Webmaster Tools API and its download functions to pull CSV files (the official scripts are currently in PHP, though we did a .NET version of them).
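
(For illustration only: a rough Python sketch of the kind of scripted CSV pull Carlos mentions. The URL, parameters, and auth scheme below are placeholders, not the real Webmaster Tools endpoints; Google's official PHP download scripts handle authentication and construct the actual URLs.)

    # Placeholder sketch of a scripted backlink-CSV download. The endpoint,
    # parameters, and auth header here are stand-ins, not the real API.
    import requests

    def download_links_csv(auth_token, site_url, out_path):
        url = "https://www.google.com/webmasters/tools/EXAMPLE-links-csv"  # placeholder
        resp = requests.get(
            url,
            params={"siteUrl": site_url},
            headers={"Authorization": "GoogleLogin auth=" + auth_token},
            timeout=30,
        )
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            f.write(resp.content)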

Talprihar

05/23/2012 09:07 pm

Barry, I'm sorry to disagree.
1) What does spam reporting have to do with that? Spam reporting cannot save you from spamming yourself and then recovering. Blocking my own links, on the other hand, lets me spam myself all day and never stay penalized, because I will keep blocking my own spam once I get penalized, recover, and then spam again and block-recover again, etc.
2) That is not only for SEOs but for webmasters (sitemaps, crawl errors, malware, etc.).
3) Yes they have, but pay closer attention to what I said: Google has no interest in encouraging webmasters to spend too much time exploring their links rather than "making great content that people will link to." They give you the data, but they will never push you to spend much time dealing with those links, or in general give you any incentive to pay more attention to your links than to your on-page work. I'm sure you know what I mean...
4) As mentioned... I recently wrote a post about a related issue and we came to the same discussion. My bottom line was: "I wish they would give us such an option, for reason number 1 above, but that is also the reason I believe they never will."
* I think a more realistic option would be "report this link," so it's not exactly blocking the link but sending it to some sort of review. But then again, I don't see why Google would want to spend more resources to make life easier for SEOs.

Barry Schwartz

05/23/2012 09:19 pm

1) Like I said in my article, Google can choose not to honor your command if you abuse it. 2) Whatever you need to tell yourself. :) 3) I disagree; they do have a whole team to help webmasters who want to make their content more visible in Google.

Peter Watson

05/23/2012 11:18 pm

I agree with Barry. I think we will see this option in our GWT within 6 months. Google has to put control of link profiles in webmasters' hands; then no one can bitch to Google about negative SEO. Taking Google's word for it is simply not enough when it comes to devaluing 'inorganic' links.

Matthew Forzan

05/23/2012 11:44 pm

Interesting concept. Pros and cons to both sides, but I do like the idea of crowd-sourcing the algorithm.

Aaron Friedman

05/24/2012 02:22 am

AJ, it appears Barry beat me to this :) darn! I am still working on a more in-depth post to explain this, because it is for sure a win/win for everyone!

Casey

05/24/2012 02:23 am

We're discussing a very similar concept. Check out our recommendation to Google made on 4/19/12 at the end of the article below. http://www.visibilitysquad.com/panda-and-potential-link-rejection-capabilities.php

Alan

05/24/2012 02:40 am

So what you are saying, Alistair, is that negative SEO is possible? I.e., someone does some negative SEO on their own site and wants to remove it? In that case, what about people who have had it done to them? Shouldn't they be allowed to remove the negative SEO?

AJ Kohn

05/24/2012 02:48 am

I did think of you when I saw this. Looking forward to your in-depth post on the topic.

Barry Schwartz

05/24/2012 02:49 am

Is it a win/win? Really?

Aaron Friedman

05/24/2012 03:25 am

@rustybrick:disqus For sure it is. It makes the idea of linking more dangerous. More spammers will be caught. People won't be so willy-nilly about sharing links, which will boost their quality. Ultimately, it will keep the web more honest if webmasters have more control over reporting spam that directly hits their sites. Aside from the few situations @AJ_Kohn:disqus mentioned (i.e., hacking), those are more extreme and always a concern anyway. What would be the downside at all?

Codex M

05/24/2012 07:08 am

You are discussing nothing. If Google won't add this feature, it simply means the Penguin update only gives SEOs the illusion that they are being penalized for these bad links. In reality, they're not. If negative SEO and spammy links are so real that webmasters can't control them, wouldn't it be logical for Google to step up and offer some help? The reality is that Google already devalues these links, and the claim from so-called opinionated experts that spammy links can penalize your site after Penguin is simply a MYTH.

Talprihar

05/24/2012 07:21 am

Once again you choose to ignore the fact that if "Google can choose not to use your command if you abuse it," the tool becomes totally useless. If my site got blasted with 200K links ($20 on Fiverr) and I try to block them, does that mean I'm abusing it (or the second time it happens)? 2. I'm sure you know that Webmaster Tools shows only a portion of your links, so what's the use of blocking links when, in many cases, you have no clue about most of them? I still can't see the logic, from *Google's point of view*, in giving us such an option (from an SEO point of view it's clear why we would want it). If they wanted to give us a break on link issues, they would just devalue, not penalize. "They do have a whole team to help webmasters who want to make content more visible in Google" — good thing I never based my SEO knowledge on that team :)

cutey

05/24/2012 09:34 am

It's a great idea, but it would show up Google's weakness in failing to identify spammy links.

Barry Schwartz

05/24/2012 09:52 am

I guess you won't stop so I will.

Barry Schwartz

05/24/2012 09:54 am

I said in my story why it isn't a win for webmasters.

Eyepaq

05/24/2012 11:13 am

True, but this would also mean that Google would need to display all, or at least the majority, of the links a site has, and they don't want that. This feature could then be turned into a different tool/weapon. Overall it's true, it would bring a huge benefit for most webmasters and also for Google, but in my opinion it also has some major downsides, and SEOs will start obsessing over this and reviewing and tracking those links like there's no tomorrow.

aaronfriedman

05/24/2012 01:40 pm

@rustybrick:disqus True, it's not a "win" in the classic sense for webmasters, but Google isn't perfect. If, as a webmaster, you notice something and can inform Google about it, I think that is a win. Not a direct benefit, but in the long term a benefit for sure.

TJmailly

05/24/2012 03:01 pm

Barry, I couldn't agree with you more; this would be of great benefit for webmasters, making it a very easy way to filter through unwanted links. Being a small business, I have been affected by Penguin and my site completely disappeared from the SERPs. Currently working on recovery :)

Anonymous

05/24/2012 07:13 pm

Negative SEO should not work. Right now it is working, as proven by some members of Traffic Planet.

Talprihar

05/24/2012 08:19 pm

OK, I assumed you wanted to create some real discussion, even about the hard parts that need to be taken into consideration. Guess not; never mind then. * Sorry for posting the previous comment a few times; there was some error with the post button.

Barry Schwartz

05/24/2012 08:19 pm

I do but I have to move on and provide new content. :)

Nathaniel Bailey

05/25/2012 09:31 am

I'm kind of split. It would be nice to have a tool to tell Google not to count links which have become spammy over the years, but on the other hand, such a tool could be abused. For example, you could harm a competitor's site by giving them a bad name if you were to build links on their site and then report them as spam. Would Google check each link report to combat this, or what could be done to make sure innocent sites/links are not wrongly flagged as spam? Surely people could also abuse this by building spam links each week to gain fast rankings and then simply removing them and replacing them with new spam links the following week, keeping top positions with spam links that Google doesn't catch in time to punish them, because you would be reporting them as not built by you! So to answer your question Barry, I think Google would have a lot to think about and to cover to make sure such a tool couldn't be abused; that's likely a good reason why Google hasn't given us a tool like this yet. But let's hope they are thinking about it and have ways to combat the abuse that will mostly be tried by spammers and black-hat people who don't know how to do real SEO :)

Nathaniel Bailey

05/25/2012 09:33 am

Double post somehow?! Please delete it, Barry :(

Nathaniel Bailey

05/25/2012 09:33 am

Bring in the super bots so Google can see you have thousands of users loving your site! User behaviour and social signals could be manipulated just as easily as links.

John Britsios

05/25/2012 12:17 pm

Barry, I really hope that happens soon. It would save us a lot of time and effort compared to wasting our time doing "link pruning."

Sheldon Campbell

05/26/2012 05:17 pm

On the surface, having the ability to disavow incoming links seems like a good approach, but I think Google simply devaluing links it rates as spammy is a better long-term solution. I think there's a flaw in allowing bad-actors to acquire a ton of crappy links, and be forgiven in the confessional, even though I'd like to assume that Google would be critical of repeat offenders.

Samuel Junghenn

05/31/2012 03:25 am

Hehehe http://screencast.com/t/rpjn1jVpJT

Royur

06/03/2012 06:58 am

Because black hatters would then know whether their sites are considered spammy or not? They could test, one by one, which of their sites are considered spammy and which are not. Then the spammy sites could be sold as a negative SEO service...

jasjotbains

06/10/2012 05:48 pm

This tool would be one of the best things Google could do post-Panda and post-Penguin... waiting for it eagerly!!

Pretush

07/06/2012 07:07 am

Just one question: how do we know that a site is low quality, and how do we know which links to ignore?
