New Google Index Report Doesn't Show Blocked URLs From Sitemaps

Jul 5, 2018 - 7:51 am


Alan Bleiweiss shared a screenshot showing the Google Sitemaps report in Google Search Console flagging over 11,000 URLs blocked by robots.txt as an issue and warning. Alan asked why the new Index Coverage report in the new Google Search Console is not reporting these errors.

John Mueller said on Twitter that the new report won't surface errors found on sample URLs tested at the sitemap submission level. He said, "those are sample URLs tested before being submitted to indexing - this is done at the sitemap submission, so it wouldn't be in the indexing report in the new SC."

Here is Alan's screen shot:

[Screenshot: Alan's Sitemaps report showing over 11,000 URLs blocked by robots.txt]

I should note that on this site, I block only one URL via the robots.txt file, and it shows as an error both in my Sitemap submission report and in the Index Coverage report.

Sitemap report:

[Screenshot: Sitemap report]

New Index coverage report:

[Screenshot: Index Coverage report showing the blocked URL]
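If you want to check your own site for this kind of mismatch, here is a minimal sketch (not Google's implementation) that uses Python's standard library to compare a sitemap's URLs against robots.txt. The domain and file locations are hypothetical placeholders; the RobotFileParser check mirrors what the Sitemaps report is warning about: URLs submitted for indexing that Googlebot is not allowed to crawl.

# Minimal sketch: find sitemap URLs that robots.txt blocks for Googlebot.
# The site, sitemap location, and paths below are hypothetical placeholders.
from urllib import robotparser
from urllib.request import urlopen
from xml.etree import ElementTree

SITE = "https://www.example.com"  # hypothetical site

# Load and parse the site's robots.txt rules
rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

# Pull every <loc> entry out of the XML sitemap
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ElementTree.parse(resp)
urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

# Report any submitted URL that Googlebot would be blocked from crawling
blocked = [u for u in urls if not rp.can_fetch("Googlebot", u)]
print(f"{len(blocked)} of {len(urls)} sitemap URLs are blocked by robots.txt")
for u in blocked:
    print("blocked:", u)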

Forum discussion at Twitter.

 
