Google: Invalid URLs Generally Don't Hurt Your Rankings

Dec 4, 2012 • 9:03 am | Filed Under Google Search Engine Optimization
 

A busy thread at Google Webmaster Help has one webmaster asking whether a large number of broken links found internally can have a negative impact on your Google rankings.

The answer: invalid URLs won't necessarily hurt the site's overall rankings in Google, but they do degrade your internal linking structure and hurt the user experience on your site. The specific invalid pages obviously cannot rank, since they are invalid, and you lose the full benefit of an optimally used internal link structure. But invalid URLs, taken in isolation, do not have a negative impact on the site's rankings as a whole.

John Mueller of Google said on Google Webmaster Help the following:

The number of crawl errors on your site generally doesn't affect the rest of your site's crawling, indexing, or ranking. It's completely normal for a site to have URLs that are invalid and which return 404 (or 410, etc). That wouldn't be something which we would count against a site -- on the contrary, it's something which tells us that the site is configured correctly.

For more information about 404's in particular, I'd also check out our blog post over here.

Forum discussion at Google Webmaster Help.
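If you want to audit your own site for the kind of broken internal links discussed above, the first step is collecting every same-host link from a page. Below is a minimal sketch in Python using only the standard library; the site URL is illustrative, and fetching each collected link to check for a 404 or 410 is left as a follow-up step so the parsing logic stays testable offline.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def internal_links(html, base_url):
    """Return absolute URLs pointing at the same host as base_url."""
    parser = LinkCollector()
    parser.feed(html)
    host = urlparse(base_url).netloc
    links = []
    for href in parser.hrefs:
        absolute = urljoin(base_url, href)  # resolve relative hrefs
        if urlparse(absolute).netloc == host:
            links.append(absolute)
    return links
```

Each URL this returns could then be requested (e.g. with urllib.request) and any 404/410 responses logged for cleanup.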

Image credit to BigStockPhoto

 

Comments:

Yakezie

12/04/2012 02:09 pm

What do you think is the main cause of broken links? For example, I never change my links once published, so nobody linking to Financial Samurai or Yakezie would ever have a broken link. Internal linking would be fine too. Thx, Sam

ScottyMack

12/04/2012 02:51 pm

If that is true, why does Webmaster Tools issue "Warnings" for too many broken links? I'm very skeptical about this statement!

Josh

12/04/2012 04:37 pm

This contradicts previous statements provided by Google regarding 404 errors...

Nick Ker

12/04/2012 04:43 pm

Well, this doesn't really make much sense. How can a site with lots of broken internal links still be considered "good quality"?

Ian Scott

12/04/2012 06:43 pm

Actually, I am seeing the opposite of what Mueller claims. I have a site that had a huge subdirectory added - it was indexed by Google before the subdirectory was removed. There are no external links to anything within that subdirectory, but because it also had the standard template of the site, internal site links existed. For some reason, Google has been very slow... VERY SLOW at discovering there are no more links to anything within that subdirectory, which no longer exists. At one point, GWMT said there were 300,000 404s. Six months later, it is saying 26,000 - but sometimes it decreases the number and then reverts back, i.e. a month ago it said 19,000 404s... then it will jump back up... then go back down. There IS a correlation between the increase in 404s and rank dropping. When GWMT decreases its count of 404s, the rank increases. In reality, there are NOT that many 404s at all... it just seems Google takes forever to get its database cleaned up. Google appears to be very inefficient at that.

Odez

12/04/2012 08:46 pm

To notify you of potential problems that the crawler has detected, so that you can fix them if necessary. Of course, they could simply not bother providing that information. However, it can be useful to know, especially as you can see the source (if it was from a link). Maybe someone made a link to your site, but there was a typo. A quick 301 and you've got the link. Or you delete/move some content and forget to 301 it.

Odez

12/04/2012 08:47 pm

Are you sure? Did you read the blog post Mueller links to? It seems consistent.

Odez

12/04/2012 08:50 pm

Just imagine this. If Google made it so simple that broken links could affect rank, then how easy would it be for a competitor to make thousands of links to invalid URLs? Or if you have a UGC site, it could be heavily spammed with broken internal links. No, that would be a really dumb way to handle the situation.

Nick Ker

12/04/2012 09:08 pm

Broken links from other sites have very little to do with this article or the original webmaster forum discussion. That is why I said "internal" links. Internal links would be under the control of the site owner/webmaster. Even if they are user generated, it is still the responsibility of the site owner to keep it in the best shape for users. Later in the thread, Mueller did clarify that 404s would normally indicate that the site is handling the incorrect or missing URLs properly. So from Google's point of view, that would not necessarily mean poor quality. That makes sense. My first comment must have been written before my first caffeine of the day.

Upendra Kumar Chansoria

12/06/2012 08:54 am

Then we just need to ignore Google Webmaster Tools' crawl errors report...

Upendra Kumar Chansoria

12/06/2012 09:05 am

But how would I know where someone has made a link to an internal page of my website?

The Big K

12/06/2012 09:44 am

I'm 'that' webmaster who asked the question. I've asked the question on several webmaster forums and the general 'experience' seems to be quite different from what Google's suggesting. Let me explain the situation:

As covered by seoroundtable (link: http://www.seroundtable.com/disqus-google-errors-15663.html), the DISQUS comment system created ~99k broken URLs on our website, 'pointing to non-existing locations on our own website'. Note: no external domain is involved in this situation. The broken URLs were 'found' on our domain and linked to non-existing locations (404) on our own domain. There are a few comments in that post saying those websites have suffered traffic loss, but Google continues to refute that 'internal broken links' hurt rankings. To me, it's quite obvious that a large number of broken links signals to Google that the website isn't configured properly.

What we found in GWT is that around 4-5 September, the number of 'Not Found' URLs went up sharply - from ~9k to 17k - and by 8 September GWT reported over 50k of these 'Not Found' URLs. The traffic drop happened exactly on the day Google reported the massive rise in 404 errors. I've already fixed the errors by setting up 301 redirects, and Google's dropping the error count by ~1k every day. I've recently seen ~8k drops and the count is down to ~34k as of today. It's still a large number, but I do see 'some' improvement in traffic. I will write a case study if our traffic comes back once our error count drops below 10k. My aim, obviously, is to get it down to

The Big K

12/06/2012 09:48 am

We're *NOT* talking about 'other domains' linking to non-existing locations on our domain. We're *specifically* talking about our own domain linking to non-existing locations on itself! There's a huge difference between the two cases. Of course we shouldn't be penalised for mistakes made by other webmasters.

The Big K

12/06/2012 09:50 am

Nick, that might mean that the site's handling the errors properly - but Google does have this "User Experience" signal which is critical in determining rankings. When a site has a ton of broken internal links, Google might think "this site is directing users to 404s through so many links - that's bad user experience". That's why it *may* rank the domain down. I'll post my case study soon.
