Google Search Console 404 Error Report For /1000 URLs Spike

Jan 16, 2024 - 7:51 am


I am seeing an unusual number of complaints about 404 errors reported in Google Search Console. The complaints are specific to site URLs with /1000 appended to the end. Google is aware of the issue and is telling site owners to ignore the errors, since the links go to 404 pages.

I was able to replicate it for this site: Google is showing 404 error reports for URLs that do not exist on the site, so the report itself is accurate. But many of these URLs end in /1000, and tons of SEOs and site owners are asking what is up with these /1000s.

Here is what I see:

[Screenshot: Google Search Console 404 report showing URLs ending in /1000]
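If you want to gauge how widespread the pattern is in your own data, you could export the not-found report from Search Console and filter it. Here is a minimal sketch in Python; the CSV filename and the `URL` column name are assumptions based on a typical Search Console export, so adjust them to match your file:

```python
import csv

def count_1000_urls(csv_path):
    """Count how many URLs in an exported Search Console 404 report
    end in /1000 (trailing slashes ignored).

    Assumes the export has a column named "URL" (adjust if your
    export uses a different header).
    """
    total = 0
    matches = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            url = row.get("URL", "").strip()
            total += 1
            if url.rstrip("/").endswith("/1000"):
                matches += 1
    return matches, total
```

Running it against the export would tell you what fraction of the reported 404s are the /1000 variety, as opposed to genuine broken links worth fixing.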

Google's John Mueller replied to this issue saying "No need to disavow; it's probably just random spam." He also replied on LinkedIn saying, "You can ignore them. 404s are fine." Google has said 404s are fine in the past and that links pointing to 404 pages do not count anyway.

I spotted the complaints in the comments here, and there are also complaints about this on Reddit, on LinkedIn, and several on X.

In the comments, someone posted an example of sites linking to /1000 URLs:

[Screenshot: example of sites linking to /1000 URLs]

When I checked some of the referring pages that link to my /1000 URLs that 404, most of them are redirects leading to porn and malware sites.

Have you seen this?

Forum discussion at Reddit.

Update on February 2, 2024: John Mueller from Google posted a more detailed response in this Reddit thread saying don't worry about it, these links do not help or hurt you. He wrote:

You can just ignore these. There's absolutely nothing you need to do about them. They have no negative (or positive!) effect on your site's SEO.

I've seen threads where people robot them out (you can do that, it also does nothing, it just changes how they're reported in Search Console).

I've seen threads where people redirect them (you can do that, it also does nothing, there's no value to be gained from these links, and redirecting them elsewhere brings no effects either).

I've seen threads where people use the disavow links tool (you can do that, it also does nothing, and given the number of domains, it's a lot of useless work).

I've seen threads where people block those requests at a server level, 410, 403, 5xx, etc - (you can do that, it also does nothing here and at most changes how it's reported in Search Console, but you better not mess up and accidentally break your site's actual content because that's very easy to get wrong)

Google & the other search engines have a lot of practice dealing with random spammers & random link-spam. The web is full of weird & wonderful things, and full of so much useless spam. All search engines discard or de-prioritize a lot of junk in order to find the highlights that users want to find (and I hope your site is one of those highlights). Please have your site be one of the highlights of the web.
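For reference, the server-level option Mueller mentions (answering these requests with a 410 instead of a 404) could be sketched as a small WSGI middleware. This is purely illustrative, not anything Google recommends; per Mueller it has no SEO effect and at most changes how the URLs show up in Search Console:

```python
def gone_for_1000(app):
    """WSGI middleware: answer 410 Gone for paths ending in /1000.

    Hypothetical sketch of the server-level blocking Mueller describes.
    He notes this does nothing for rankings; it only changes how the
    URLs are reported in Search Console, and a bad pattern match here
    could accidentally block real content, so test carefully.
    """
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        if path.rstrip("/").endswith("/1000"):
            start_response("410 Gone", [("Content-Type", "text/plain")])
            return [b"Gone"]
        # Everything else passes through to the real application.
        return app(environ, start_response)
    return middleware
```

The same idea applies to an nginx `location` block or an Apache rewrite rule; the caution is identical either way: a too-broad pattern can break legitimate pages, which is the "very easy to get wrong" part Mueller warns about.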
