Google: Avoid Blocking Pages That Are Important Enough To Have Links To Them

Dec 14, 2020 - 7:21 am


Google's John Mueller said he would advise: "That's something where, if you see that these pages are important enough that people are linking to them, then I would try to avoid blocking them by robots.txt."

In short, if you have important or popular pages with a lot of links to them, make sure Google can access those pages.

If you block that page with robots.txt, Google may drop the links pointing to it, and those links won't help Google understand the true importance of your web site. That means your rankings can decline in Google Search.
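For illustration only (this is not from Mueller's comments), here is a minimal sketch of how a hypothetical robots.txt Disallow rule blocks a well-linked page, checked with Python's standard urllib.robotparser:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a popular, well-linked page.
robots_txt = """
User-agent: *
Disallow: /popular-guide/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot (like every other crawler) is told not to fetch the blocked page,
# so Google cannot crawl its content even if many sites link to it.
print(parser.can_fetch("Googlebot", "https://www.example.com/popular-guide/"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/other-page/"))     # True
```

Removing the Disallow line (or narrowing it so it no longer matches the page) would let Googlebot fetch the page again.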

Of course, it all depends on the specific situation of that piece of content, so this is not blanket advice that applies to every situation.

Here is the embed where John said this, at the 6:44 mark of Friday's video:

Forum discussion at YouTube Community.
