Google's "Big Memory" On 404 Pages

Apr 9, 2013 • 8:51 am | comments (12) | Filed Under Google Search Engine Optimization

A Google Webmaster Help thread brings up a common question I see from webmasters, publishers and SEOs: why does Google Webmaster Tools continuously show old 404 errors for pages that are no longer linked to or that were removed long ago?

Google's John Mueller said you don't have to worry about it. He said "Google has a big memory" and often Google will continue to retry the 404ed URLs over and over again, just to be sure. Plus, if anyone on the Internet is still linking to a 404ed page and Google crawls that link, they will try it again.

Having 404ed pages is not an issue by itself. Of course, if a page shouldn't be returning a 404, that is an issue. But if a page does not exist, it should report a 404 error; that is exactly what the status code is for.

Google's John Mueller said:

If we've seen a URL once, we're likely to retry it again and again, especially if we should run across new links to that URL. This isn't something that you really need to worry about, it's absolutely fine and even expected that a website returns 404 for URLs that don't exist. Assuming these are URLs that you don't want to have indexed, then these 404 errors will not negatively affect your site's performance in search. Having 404s is fine.

For more details, see this Google blog post.

Forum discussion at Google Webmaster Help.




04/09/2013 01:19 pm

Well, at least we now know that they are retrying 404ed pages. Just a pity Google always takes so long to respond to these kinds of things, especially since it's a common question amongst webmasters and SEOs.

Josh Zehtabchi

04/09/2013 01:51 pm

Meh, the 404s can't hurt you - just annoy you in GWT. I applaud the fact that they keep trying, in case a site was down or compromised, without removing the URL from the index. Annoying, sure. Practical, of course. Good Guy Google for once.

Prashant Bairwa

04/09/2013 02:31 pm

I also got these types of errors in Webmaster Tools. The pages are not linked anywhere, so now there's no worry. Thanks!


04/09/2013 03:07 pm

It makes sense that Google would retry again and again. 404 means "temporarily unavailable", it's only code 410 which means "gone, remove from your index".


04/09/2013 04:29 pm

Way more important than 404s, regarding Google's memory, is whether removed external links pointing to your site, which nonetheless still appear in Webmaster Tools, should be disavowed. Many people have contacted countless webmasters with requests to remove a link. After the links got removed, they usually still appear in Webmaster Tools for many months, in some cases a year.


04/10/2013 12:39 am

You are incorrect. 404 means 'Not Found' and carries no indication of whether the resource is gone temporarily or permanently.


04/10/2013 05:32 am

Touché - that is often how it's used though.
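For reference, the distinction the commenters are circling can be shown in a minimal sketch (plain Python standard library, nothing Google-specific; the paths and page data are hypothetical): 404 says "Not Found" with no hint of permanence, while 410 says "Gone" and explicitly signals that the page was removed on purpose.

```python
# Minimal sketch: a handler that returns 404 for unknown URLs
# and 410 for pages that were deliberately removed.
from http.server import BaseHTTPRequestHandler

# Hypothetical site state: pages we serve, and pages we intentionally removed.
PAGES = {"/": b"home"}
REMOVED = {"/old-promo"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in PAGES:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(PAGES[self.path])
        elif self.path in REMOVED:
            # 410 Gone: the resource existed and was intentionally removed.
            self.send_response(410)
            self.end_headers()
        else:
            # 404 Not Found: nothing is known about this URL, now or ever.
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet
```

Either status keeps the page out of search results; the comments later in the thread note that 410 is the one that explicitly means "remove from your index."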


04/10/2013 06:19 am

I think you are totally off-topic here. The post is about Google reassuring us that it won't remove a page from its index or impact your site in the SERPs because you had a couple of 404ed pages in the past, as Google retries them even if people are still linking to them. Yet you are talking about disavowed/removed links still showing up in GWT, which is a different topic altogether. Google has explained countless times that it takes time before removed/disavowed links are properly reflected in GWT.


04/11/2013 12:18 am

Even after two years of having 404ed them, I don't see the point.


06/03/2013 08:24 am

I had the same error code. My fix was to open a new tab, highlight inside the search box, then save or copy it to a folder. Then open the folder, right-click on the Google emblem, select "send shortcut to desktop," delete the old Google shortcut on the desktop, and use the new desktop shortcut.

Prince Bhalani

01/16/2014 04:51 pm

Removing an outdated page or link:

Google updates its entire index regularly. When we crawl the web, we automatically find new pages, remove outdated links, and reflect updates to existing pages, keeping the Google index fresh and as up-to-date as possible.

If outdated pages from your site appear in the search results, ensure that the pages return a status of either 404 (not found) or 410 (gone) in the header. These status codes tell Googlebot that the requested URL isn't valid. Some servers are misconfigured to return a status of 200 (Successful) for pages that don't exist, which tells Googlebot that the requested URLs are valid and should be indexed.

If a page returns a true 404 error via the HTTP headers, anyone can remove it from the Google index using the webpage removal request tool. Outdated pages that don't return true 404 errors usually fall out of our index naturally when other pages stop linking to them.
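As a quick sanity check on the advice above, here is a sketch (Python standard library; the helper names are my own, not any Google tool) for verifying that a removed page returns a true 404 or 410 in its HTTP headers rather than a misconfigured 200 ("soft 404"):

```python
# Sketch: fetch a URL and check whether its status code tells
# Googlebot that the URL is invalid (404 or 410), per the quoted advice.
import urllib.request
import urllib.error

def status_of(url: str) -> int:
    """Return the HTTP status code for a GET request to `url`."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urllib raises for 4xx/5xx; the code is still what we want.
        return e.code

def returns_true_error(status: int) -> bool:
    """True if the status signals an invalid URL (404 Not Found or 410 Gone)."""
    return status in (404, 410)
```

Usage would be something like `returns_true_error(status_of("https://example.com/removed-page"))` - if that comes back False for a page you deleted, the server is likely serving a soft 404.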


06/11/2014 09:27 pm

We upgraded our website to SharePoint which changed virtually every link on the site. We added redirect rules and IIS rewrites to handle the majority of these issues, but our 404 errors have increased continuously since we upgraded. From 17 pre-upgrade to over 53,000 this week. Should we be concerned? It's been 2 months, so I would think it should be dropping.
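For what it's worth, a redirect rule of the kind this commenter describes might look like the following in IIS's URL Rewrite module web.config. This is a sketch only: the rule name and URL patterns are hypothetical placeholders, not the commenter's actual configuration.

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Hypothetical example: permanently redirect an old
             pre-SharePoint path pattern to its new location, so
             crawlers see a 301 rather than a 404. -->
        <rule name="OldDocsToSharePoint" stopProcessing="true">
          <match url="^docs/(.*)$" />
          <action type="Redirect" url="/sites/docs/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

A 301 (Permanent) redirect passes visitors and crawlers to the new URL; any old URLs with no new equivalent should be left to return 404 or 410, as the article discusses.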
