Hidden Divs Not Indexed by Google

Dec 15, 2005 • 9:58 am | Filed Under Google Search Engine Optimization

There are two very interesting threads on the topic of hidden divs. The first one I found this morning at the Search Engine Watch Forums, named Is Google No Longer Indexing Hidden Divs?, but I decided to hold off on posting about it until we received some more confirmation in the thread. Later, WilliamC at article distribution notified me of a thread he posted at Phil C's forum named Jagger and why so many fell and still dont know why?. OK, here is the scoop, as I understand it.

Google is no longer indexing the contents of hidden div tags. Spammers used hidden divs to hide content, but many non-spammers, and even non-SEOs, have used them for design purposes. The theory is that the Jagger update hurt any site using hidden divs. One example thrown out in Phil's forum was a white hat site at ducor.com that used hidden divs for its menus. If you look at the site's CSS, you will notice the line #elnav {position:absolute; visibility:hidden;}. The member reports that the site "has been all but delisted from Google now.. Right now they show just 4 of our pages in the index.."
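For reference, the pattern under discussion looks roughly like this. This is a minimal sketch: the `#elnav` selector is taken from the ducor.com stylesheet quoted above, but the surrounding markup and link targets are hypothetical, just to illustrate a hidden-div menu.

```html
<!-- Menu container hidden by default; a DHTML script would toggle
     visibility:visible on mouseover. The rule below is the one
     Google is reportedly reacting to when it appears on-page. -->
<style type="text/css">
  #elnav {position:absolute; visibility:hidden;}
</style>

<div id="elnav">
  <a href="/products.html">Products</a>
  <a href="/contact.html">Contact</a>
</div>
```

The content and links inside that div are real and meant for visitors, which is why a blanket "don't index hidden divs" rule catches legitimate navigation menus.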

My own sites that use a form of this do not seem to have been hurt; I see no penalty. It may be because the CSS is off the page. Supposedly, if you do not have it on the page, this won't be an issue. Maybe that is why only some sites have been affected. Should they be? Most should not. But that is a Google update for you.

So what can you do now? Move that CSS off the page and into an external style sheet. Then, perhaps, block spiders from those files via robots.txt.
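In practice, that advice amounts to something like the following. This is a hypothetical layout (the `/css/` directory name and filename are assumptions, not anything from the original post); adjust the paths to your own site.

```text
# /robots.txt -- keep spiders out of the stylesheet directory
User-agent: *
Disallow: /css/

# Then, in each page's <head>, replace the inline <style> block with:
#   <link rel="stylesheet" type="text/css" href="/css/menus.css">
```

With the visibility rules living in `/css/menus.css` and that directory disallowed, a crawler that honors robots.txt never sees the rule that hides the div.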

Forum discussion at Web Work Shop Forums and Search Engine Watch Forums.



Barry Schwartz

12/15/2005 03:50 pm

Jaimie, Google will not have spiders touch files that have been excluded via the robots.txt file. They are very strict about it. The only spiders that don't care are rogue spiders.

Jaimie Sirovich

12/15/2005 03:56 pm

Is it fair to say, then, that this technique could be (very) successfully used to cloak? That's my only problem with this line of reasoning -- why bother with Fantomaster when you can just block a css file that makes your viagra spam invisible to everyone except Google ...

Barry Schwartz

12/15/2005 04:25 pm

No, it's not fair to say that.


12/15/2005 04:28 pm

I think people forget that many times, when a search engine suspects funny stuff, it can easily have a real person go look at the file. Blocking the CSS file via robots.txt would only stop automatic filtering of its hidden properties.

Jaimie Sirovich

12/15/2005 04:41 pm

I know it's possible for this to happen, but in practice, it doesn't. There are too many spam sites in too many prominent areas for me to believe that Google directs a lot of effort into this. Of course, I may be wrong. But Barry, to be quite honest, if I can scrape a few results off of some engine, scrape those sites for content, display:none it in a stylesheet that's blocked by an exclusion in robots.txt, then use some 2nd-tier PPC feed (Searchfeed?) to monetize the spam, wouldn't this be a clear and flagrant reason for Google _NOT_ to honor my exclusion?

Barry Schwartz

12/15/2005 04:55 pm

They can manually review your site and boot you, like William said above.

Gerard Manning

12/15/2005 04:56 pm

Just wanted to point out that ducor.com is blocking visitors who don't accept cookies, with no exception for spiders, and Yahoo is also showing some pages with the error page cached. It's part of the problem, at least.

Jaimie Sirovich

12/15/2005 05:16 pm

The same can be said about Fantomaster, no? In fact, I'd say this method, though in some ways primitive, might be HARDER to detect than Fantomaster-style IP-origin cloaking. Just a thought ... I'm totally not a black hat spammer, and rather ignorant of the "dark side," but TBH, I don't see cloaking having a major edge over clever CSS spam.


12/15/2005 05:17 pm

Does Fantomaster know the IPs of everyone at Google -- not just the spiders? Meaning, their "QA" team? I doubt it ...

Barry Schwartz

12/15/2005 05:18 pm

Let's leave Fantomaster out of this.


12/15/2005 06:22 pm

I would go further and ban Google's IPs from accessing the CSS file through .htaccess.
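For what it's worth, the .htaccess approach this commenter describes could be sketched as below. This is Apache syntax of the era; the IP range shown is purely illustrative, not an actual or complete list of Google's crawler addresses, and blocking a search engine's IPs like this is exactly the kind of cloaking behavior the thread warns can draw a manual review.

```apache
# Deny a (hypothetical) crawler IP range access to CSS files only.
# Everything else on the site remains reachable from those IPs.
<FilesMatch "\.css$">
  Order Allow,Deny
  Allow from all
  Deny from 66.249.64.0/19
</FilesMatch>
```

Note that unlike the robots.txt approach, this is enforced by the server rather than relying on the crawler's good behavior.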


12/15/2005 06:41 pm

I have been able to look into this a bit more since posting at SEW. In all the text caches I have looked at, pages that use this type of menu system are missing the contents of the hidden divs used in the menus. But similar menu systems where the hidden divs are declared in an external CSS file instead of on the page are intact. Can it be long until Google dumps all of those too? IMO this is a boo-boo, a throwing-the-baby-out-with-the-bathwater type of thing.


12/15/2005 06:41 pm

I wouldn't agree with that. But I'm a whitehat. I guess it's one extra stopgap measure, though Barry seems to indicate that Google won't touch it anyway. And if they manually audit your site, I'm sure they won't use Google IPs, as I've said above. J.

Barry Schwartz

12/15/2005 07:31 pm

This is not about spamming. It is about those who use the divs for non-spam reasons and are totally unaware of this. So if a manual audit were to occur, the review would show a clean site that got hit for no reason.

Shawn Hogan

12/15/2005 07:44 pm

I don't buy it. vBulletin legitimately uses hidden divs for its DHTML drop-down menus, and every vBulletin forum I look at (including mine) is not banned from Google.

Barry Schwartz

12/15/2005 07:59 pm

Shawn, I was thinking the same thing about 4 hours ago. But all I do is cover the buzz, not come up with it. A very valid point; however, I also thought that Google might have something special in there for vBulletin forums, since it's so widely used...

Jaimie Sirovich

12/15/2005 08:04 pm

Shawn, it's not causing banning. People aren't saying it gets you banned; they're saying the hidden content just doesn't get indexed. This means if you have some links in divs with visibility:hidden, Google is ignoring them. This is akin to the fad in the '90s of using JavaScript menus that didn't get spidered. And I'm not a blackhat, I'm a whitehat. Barry, I'd argue that in order to be a competent SEO, you also have to analyze this as a blackhat would. If you don't, you aren't fully understanding Google's motives. So it IS about spamming, IMHO, because spammers are what caused this aggressive tactic. J.

Sebastian Schneider

12/15/2005 08:17 pm

You can nearly always blame the spammers... Anyway, I wouldn't necessarily recommend invisible divs for menus, because it was somewhat predictable that they would not be seen or indexed by bots, as has probably happened now.


12/15/2005 08:20 pm

But how many thousands of sites out there that use image replacement for legitimate reasons (on-page content == image content) would actually <em>receive</em> a manual audit? I seriously doubt that even 5% would; their SERPs would just tank for no reason.

Shawn Hogan

12/15/2005 09:33 pm

Jaimie - Whatever the case may be, I don't buy it. :) My forum openly uses hidden divs, and there are 576,000 pages in the Google index; the content of those divs is seen in Google's cache (and searchable with a web search), and the links within them are also seen with the link: command. It *may* be one of the million factors Google uses. It probably doesn't trigger a penalty individually, but if you combine it with other stuff, maybe it does. {shrug}

Jaimie Sirovich

12/15/2005 09:52 pm

Shawn, Barry suggested something I'd hesitate to believe... that they hacked in faux whitelisting for common forum software. TBH, it looks like that to me. Thing is, from a programming point of view, a purist designing an algorithm meant to function generally would be tainted by hacks like that. That said, I used to openly use style='display:none'. I stopped that. And I'm putting my CSS in a folder and denying bots access. Cloaking a 404 for the file is a bit overboard, because as Barry said, search engines honor robots.txt rather well... Jaimie.


12/16/2005 06:12 pm

This is about spamming??? (It probably is.) This seems like tremendous overkill that will affect many clean sites. Seriously, if a spammer can't put his CSS in an external file... is he or she really capable of creating spam that's threatening to Google? OK, so great move, G. Spammers will now move to external CSS, and the ratio of clean to dirty sites hit by this will keep (un)improving!


12/17/2005 12:39 am

It's not just about menus, right? Hiding and displaying content via toggles makes for a more efficient, user-friendly UI. It is akin to a tabbed browser, or multiple application windows on a desktop. I trust Google's spiders will accommodate such UIs, though it may mean the spider has to do a logical analysis of the CSS and JavaScript. If not, they set the web back. EP

L. Mohan Arun

03/21/2009 06:05 pm

Hmm, no. I can confirm that hidden divs are indexed by Google, and they even show up in search results. See for yourself: http://www.google.com/search?hl=en&client=firefox-a&rls=org.mozilla%3Aen-US%3Aofficial&hs=6VQ&q=http%3A%2F%2Fwww.thinkwiki.org%2Fwiki%2FCategory%3AModels&btnG=Search First result: notice the description, go to the page, view source; the description is in a hidden div.


07/16/2009 04:36 pm

"So what can you do now? Move that CSS off the page, to an external style sheet. Then maybe, block the robots.txt from those files." That entire last line shows how little the author knew/knows about CSS/SEO/Google/search engines. The fact that they are blogging about something they know little about is funny, though.
