New Google Index Report Doesn't Show Blocked URLs From Sitemaps

Jul 5, 2018 - 6:51 am


Alan Bleiweiss shared a screenshot showing how the Google Sitemaps report in Google Search Console flags over 11,000 URLs blocked by robots.txt as an issue and warning. Alan asked why the new Index report in the new Google Search Console does not report these errors.

John Mueller said on Twitter that the new report won't show errors for sample URLs tested at the sitemap submission level. He said "those are sample URLs tested before being submitted to indexing - this is done at the sitemap submission, so it wouldn't be in the indexing report in the new SC."

Here is Alan's screenshot:


I should note that on this site, I block only one URL via the robots.txt file, and it shows as an error both in my Sitemap submission report and in the Index coverage report.
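To see why a sitemap URL gets flagged this way, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The robots.txt rules and URLs below are hypothetical stand-ins, not the actual rules from this site or Alan's:

```python
from urllib import robotparser

# Hypothetical robots.txt blocking a single path, similar to the
# one-URL block described above.
rules = """User-agent: *
Disallow: /private-page/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# If a sitemap lists /private-page/, Search Console would flag it as
# "blocked by robots.txt", since Googlebot is disallowed from crawling it.
print(rp.can_fetch("Googlebot", "https://example.com/private-page/"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public-page/"))   # True
```

Submitting a URL in a sitemap while disallowing it in robots.txt sends Google mixed signals, which is exactly what both reports are trying to surface.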

Sitemap report:


New Index coverage report:


Forum discussion at Twitter.
