Can You Hide Text & Links From Your Users, If Search Engines Won't See?

Jan 22, 2008 • 7:35 am | comments (5) | Filed Under SEO - Search Engine Optimization

Here is an unusual scenario for you. You have a web page, one of many on your site. This specific page blocks search engines from crawling it via a robots.txt file. Now, the webmaster wants to hide links or text on that page from the end user. If you hide text on a page that search engines are not supposed to crawl (though technically they still could, since the page is not password protected), are you at risk of a penalty from a search engine?

Got that? Page A contains a noindex, nofollow META tag. Page A hides links and text. Can Page A cause the whole site to be penalized by Google, Yahoo, Live, or other search engines?

We know hiding links and text is against every search engine's terms of service. But can you hide links and text on pages that search engines are asked not to crawl?
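For reference, the two blocking mechanisms the scenario mixes look like this (a minimal sketch; the path /private-page.html is a hypothetical placeholder):

```text
# robots.txt — asks compliant crawlers not to fetch the page at all
User-agent: *
Disallow: /private-page.html
```

```html
<!-- In the page's <head>: lets crawlers fetch the page,
     but asks them not to index it or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

The distinction matters for the debate: a robots.txt Disallow prevents crawling but not indexing (the URL can still end up indexed from inbound links), while the meta tag only works if the page is crawled and the tag is seen.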

Why would someone want to do this? Designers might do this for 'creative reasons.'

There is currently a debate going on about this topic at WebmasterWorld. Personally, I think it would be fine to hide links or text in this situation. At the very least, it would not violate a search engine's terms, since the search engine is being asked not to access that page. Your end users might object, but technically, a page a search engine should not be crawling should not be of interest to that search engine.

Am I right?

Forum discussion at WebmasterWorld.




01/22/2008 01:21 pm

You're spot on, Barry. Uncrawlable content, as well as noindex'ed pages, can't violate any search engine's TOS with respect to hidden text/links. There might be one exception: if a page is disallow'ed in robots.txt but has strong inbound links, it will be indexed by all major search engines. Then, if the page does a sneaky redirect, Matt's gang in building 43 would be upset. In that case I'd add a "noindex" X-Robots-Tag, or robots meta tag, and allow crawling. "Disallow:" in robots.txt allows indexing by design, so a search engine can argue that a webmaster performing sneaky redirects or other nasty stuff on disallow'ed pages has asked for indexing webspam, and hence deserves a penalty.
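The X-Robots-Tag the commenter mentions is an HTTP response header rather than on-page markup. A minimal sketch of serving it from Apache, assuming mod_headers is enabled (the file name is a hypothetical example):

```apache
# .htaccess / Apache config: send a noindex header with the response,
# while leaving the page crawlable (no robots.txt Disallow needed)
<Files "private-page.html">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

This fits the commenter's suggestion: the crawler is allowed to fetch the page, sees the header, and keeps the URL out of the index, which a robots.txt Disallow alone does not guarantee.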


01/22/2008 04:13 pm

I'd have to agree with you, Barry: just because something is available to the general public doesn't mean it needs to comply with search engines' standards if you are not going to use those search engines for the page. Tedster's point about human reviews is a good one, but even as a simple amateur I always look at the robots.txt and meta tags when reviewing a site to see whether a page should be indexed; I'd imagine the pros at Google consider that as well.


01/22/2008 10:58 pm

I have several links on my sites that are both hidden and disallowed via robots.txt. I even use the nofollow attribute. The purpose of those hidden links is to catch robots that do not follow the robots.txt protocol. If I understood your basic question correctly, I see nothing wrong with it.
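A bot trap like the commenter describes can be as simple as a hidden link to a disallowed URL: well-behaved crawlers never request it, so any hit there flags a misbehaving bot (a minimal sketch; /bot-trap/ is a hypothetical path):

```html
<!-- Hidden honeypot link: humans never see or click it, and compliant
     crawlers skip it because robots.txt disallows the target URL -->
<a href="/bot-trap/" rel="nofollow" style="display:none">trap</a>
```

```text
# robots.txt — the trap URL is off-limits to compliant crawlers,
# so any request for /bot-trap/ in the server logs is a rogue bot
User-agent: *
Disallow: /bot-trap/
```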

No Name

01/23/2008 07:28 am

I can't see how you could be penalized if the search engine is not supposed to be on that page. Technically, if they did penalize you, they would be violating the agreement not to crawl that page.


01/23/2008 12:41 pm

I think you are OK, Barry. If a search engine will not crawl that page, then why would SEO rules need to apply to it?
