Google: Avoid Blocking Pages That Are Important Enough To Have Links To Them

Dec 14, 2020 - 7:21 am


Google's John Mueller said that "if you see that these pages are important enough that people are linking to them, then I would try to avoid blocking them by robots.txt."

In short, if you have important or popular pages with a lot of links to them, make sure Google can access those pages.

If you block such a page with robots.txt, Google may drop the links pointing to it, and those links won't help Google understand the true importance of your website. That means your rankings can decline in Google Search.

Of course, it all depends on the specific situation of that piece of content. So this is not simple blanket advice across every situation.
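To see the mechanics, here is a minimal sketch using Python's standard `urllib.robotparser` with a hypothetical robots.txt and example.com URLs: a `Disallow` rule stops Googlebot from crawling the blocked page, which is why links to it can't be evaluated.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a heavily linked page.
ROBOTS_TXT = """\
User-agent: *
Disallow: /popular-page/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot cannot fetch the blocked page, so links to it pass no signals.
print(parser.can_fetch("Googlebot", "https://example.com/popular-page/"))  # False

# Pages outside the Disallow rule remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/other-page/"))  # True
```

Removing the `Disallow` line (or narrowing it so it no longer matches the linked page) is what Mueller's advice amounts to in practice.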

John said this at the 6:44 mark in Friday's video:

Forum discussion at YouTube Community.
