Google: Page Indexed Without Content Notice Is About Page Being Blocked By Robots.txt

Mar 23, 2021 • 7:21 am

We've all seen the message in Google Search Console's index coverage report for "Page indexed without content." Google's Gary Illyes said that when you see this notice, most of the time (though not always) it is about "pages that are blocked by robots.txt."

The error is defined in Google's help document as "Page indexed without content: This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google can't index. This is not a case of robots.txt blocking. Inspect the page, and look at the Coverage section for details."

Gary Illyes was asked whether this error can be caused by "heavy loading time or time-outs," but he said no. If it were a loading-time or time-out issue, you'd likely see a soft 404 notice instead, Gary explained. He said "this error is really just for pages that are blocked by robots.txt."
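To see how a robots.txt block works in practice, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The `example.com` URLs and the `/private/` rule are hypothetical; the point is that a disallowed page cannot be fetched by Googlebot, yet it can still end up in the index (for example via external links), which is the situation behind this notice.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a directory. Google can still index
# a blocked URL (e.g. from links pointing at it) without ever reading its
# content, which is when "Page indexed without content" can appear.
robots_txt = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is not allowed to fetch the blocked page...
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
# ...but is allowed to fetch pages outside the disallowed path.
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Note that `Disallow` only stops crawling, not indexing; to keep a page out of the index entirely, Google's guidance is to use a `noindex` directive on a crawlable page instead.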


Forum discussion at Twitter.
