Google: Avoid Blocking Pages That Are Important Enough To Have Links To Them

Dec 14, 2020 - 7:21 am


Google's John Mueller said, "that's something where if you see that these pages are important enough that people are linking to them, then I would try to avoid blocking them by robots.txt."

In short, if you have important or popular pages with a lot of links pointing to them, make sure Google can access those pages.

If you block such a page with robots.txt, Google may drop the links pointing to it, and those links won't help Google understand the true importance of your website. That means your rankings can decline in Google Search.
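For illustration, a robots.txt rule like the following (the path is hypothetical) would stop Googlebot from crawling a page, meaning links pointing to it could not pass their signals:

```
# Blocks Googlebot from crawling this page entirely.
# If many sites link to /popular-guide/, those links are wasted.
User-agent: Googlebot
Disallow: /popular-guide/
```

Note that robots.txt blocks crawling, not necessarily indexing; a blocked URL can still appear in results without a description, but Google cannot see its content or fully credit its links.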

Of course, it all depends on the specific situation of that piece of content. So this is not simple blanket advice across every situation.

Here is the embed where John said this, at the 6:44 mark of Friday's video:

Forum discussion at YouTube Community.
