Google Panda 4.0 Hit Sites Blocking CSS & JavaScript

Jun 23, 2014 • 8:24 am | Filed Under Google Search Engine Optimization
 

Joost de Valk published a blog post showing how some of the sites hit by Google Panda 4.0 were blocking CSS & JavaScript, and how, in one case, unblocking the CSS & JavaScript resulted in the site returning to its normal rankings.

Joost wrote, "they’ve returned on almost all of their important keywords. Just by unblocking Google from spidering their CSS and JS."

Well, I have an issue with this for a couple of reasons:

(1) Panda is an algorithm, and it needs to run again for any change to have an impact. So first you need to unblock your CSS & JavaScript (see the robots.txt sketch after these points), then wait for GoogleBot to crawl and pick it up, then Google needs to process all of that, and then Panda has to be rerun.

(2) I haven't seen enough evidence from the community to prove this works.

(3) Much of his supporting evidence is Google recommending, at SMX Advanced a couple of weeks ago, that you not block CSS & JavaScript. Google has been saying that for years and years; they just keep saying it.
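For those wondering what "unblocking" actually looks like, it is typically a robots.txt change. A minimal sketch, assuming your CSS and JavaScript live under paths like the ones below (your paths will differ):

    # Before: GoogleBot cannot fetch the rendering resources
    User-agent: *
    Disallow: /css/
    Disallow: /js/

    # After: drop the Disallow lines, or explicitly allow the assets
    User-agent: *
    Allow: /css/
    Allow: /js/

Even once that is live, you still have to wait for the recrawl, the processing, and a Panda rerun, per point (1) above.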

The truth is, a lot of sites that may have benefited or taken a hit from Panda 4 actually saw reversals a week ago (last weekend). We have a story on that over here. So something did happen with Panda 4, but Google would not chime in on it.

I think it is unrelated to the blocking or allowing of CSS & JavaScript.

Robert Charlton, moderator of WebmasterWorld, wrote in the WebmasterWorld thread:

He associated a site's drop with its accidental blocking of CSS and Javascript files. This was shortly after Google had announced its new Fetch and Render feature in Webmaster Tools. Assumption is that this is now being used in the page layout algorithm. Unblocking CSS and JS appeared to produce quick recoveries.

This kind of association is entirely consistent with algorithmic changes I've seen over the years, where Google has been quick to make use of a capability that we first see in a reporting feature.

What do you think?

Forum discussion at WebmasterWorld.

Update: We have something of a response on this from Google; new story over here.

 

Comments:

Joost de Valk

06/23/2014 12:48 pm

I'm not saying (or should probably say: didn't want to say) that it's necessarily Panda; the timing coincided, which makes people think it's Panda when it doesn't have to be. Google could have rolled out several things at once, and they're, if not the same algorithm, at least related in thinking. My reasoning still stands that Google is "penalizing" you when you have a lot of ads and it can't render your site.

Michael Martinez

06/23/2014 01:11 pm

No, Joost's case study shows the traffic loss occurred almost 2 weeks before the Panda 4.0 release. There is no connection between the traffic drop and Panda 4.0.

StevenLockey

06/23/2014 01:36 pm

Makes a lot of sense. If Google is comparing your site (where it can't see the JS/images) with another site where it can, why wouldn't it prefer the one where it can see all of the content? It also has more faith that it can actually see what is being presented to visitors, without bits being hidden, added, etc.

John Britsios

06/23/2014 02:44 pm

But for the record, adding the following line to the .htaccess file will not prevent Google from rendering the pages effectively: Header set X-Robots-Tag "noindex,noarchive,nosnippet,follow". I would assume it's about blocking CSS and JS via robots.txt.
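To illustrate the distinction being drawn here (a sketch; the file patterns and paths are just examples): an X-Robots-Tag header keeps files out of the index but still lets GoogleBot fetch them for rendering, whereas a robots.txt Disallow stops the fetch entirely.

    # .htaccess (mod_headers) - noindex the assets, but Google can still fetch and render with them
    <FilesMatch "\.(css|js)$">
      Header set X-Robots-Tag "noindex, noarchive, nosnippet"
    </FilesMatch>

    # robots.txt - this is the kind of rule that actually blocks fetching, and therefore rendering
    User-agent: *
    Disallow: /css/
    Disallow: /js/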

WP-Site-Robots

06/23/2014 02:50 pm

The default best practice for WordPress in the past has been to block /wp-content/plugins/ via robots.txt. Seeing as a lot of JS and CSS can sit behind that block, do you think it is now wise to remove Disallow: /wp-content/plugins/ from the robots.txt file?
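For anyone weighing that change, here is a sketch of the two options, assuming a default WordPress layout (GoogleBot supports wildcard Allow rules, and the more specific rule wins):

    # The old default advice - also hides plugin CSS/JS from GoogleBot:
    User-agent: *
    Disallow: /wp-content/plugins/

    # A middle ground - keep the block but open up the rendering assets:
    User-agent: *
    Disallow: /wp-content/plugins/
    Allow: /wp-content/plugins/*.css
    Allow: /wp-content/plugins/*.js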

Donna D. Fontenot

06/23/2014 02:58 pm

Based on the title of your blog post, and the first two sentences, which say, "A month ago Google introduced its Panda 4.0 update. Over the last few weeks we’ve been able to “fix” a couple of sites that got hit in it."... I kind of think you are saying it's necessarily Panda. Maybe you're right, maybe you're wrong, but it seems to me you were definitely saying so.

tukyr

06/23/2014 03:19 pm

RTFA... the dates in the charts being used as the supporting data are in the right time frame... The author clearly made a mistake saying "a month ago" in the opening line when he should have said "a while ago"... But your statements are totally false if you read past the opening sentence. Stop being such a doosh.

Yo Mamma

06/23/2014 03:37 pm

So many have success with your W3 caching, but for me, oh boy, I could never get it to work well.

josh bachynski (SEO)

06/23/2014 05:21 pm

I have another site that blocked CSS and JS and was ALSO hit on May 19 - the Panda 4 date. They have yet to remove the block as far as I know and are still hit.

Fedor

06/24/2014 12:21 am

You'd have to be an idiot to block CSS and JS from Google in the first place. Why not just strip everything and give them pure text? Best idea ever.

Michael Martinez

06/24/2014 12:28 am

You clearly need a magnifying glass to see when the traffic dropped off around May 8-9.

Michael Martinez

06/24/2014 12:30 am

Panda 4.0 rolled out on May 19-20. Your article's chart shows the traffic dropped around May 8-9. That's a very broad interpretation of "the timing coincided".

Michael Martinez

06/24/2014 12:37 am

"You'd have to be an idiot to block CSS and JS from Google in the first place." If you pay attention to things like crawl and how much bandwidth you pay for with your Web hosting, and if you see a LOT of unnecessary and repetitive crawl for files that don't change, it makes a LOT of sense to block the crawlers from those files. They don't need to refetch that content nearly as often as they do. We need to see more case studies or hear a clear statement from Google.

Fedor

06/24/2014 12:53 am

Buddy, it's called minification and compression... In almost all cases the size of your images will be a bigger problem than compressed CSS and JS. I'm guessing you don't have much experience running web servers? A web server can just load-balance and serve the content from wherever it wants. Bots cache; they are more sophisticated than web servers in some regards, and since their bandwidth and processing costs are more intensive and expensive, they have to be more efficient.

Michael Martinez

06/24/2014 02:19 am

"In almost all cases the size of your images will be a bigger problem than compressed CSS and JS. " You need to learn much more about how hard drives work. The constant fetching of files might be satisfied by disk cache but there is no guarantee of that. Unregulated crawl wears out servers. Web servers don't do much on their own. They have to be told what to do. And the search engines do not need to see the same files over and over again.

asdfa

06/24/2014 03:37 am

Modern web applications and web servers use request caching... Real web servers have dozens of gigabytes of RAM... Static files that get requested a lot cause next to no disk I/O... It's like your understanding of the web application technology stack is twenty years old and has never been updated. You can look it up... the source code for Apache is open source.

Michael Martinez

06/24/2014 05:18 am

"Real web servers have dozens of gigabytes of RAM.." Wow...and the FAKE Web servers must not. Meanwhile, if you pack hundreds of Websites on a "real" server it's still going to be taxed. The math is pretty simple. You can learn to do it yourself.

Fedor

06/24/2014 05:48 am

OK. I give up.

Michael Martinez

06/24/2014 01:28 pm

That's better. Spend a little more time actually looking at server logs and less time reading hard drive specs and you'll start to see the big picture.

phr3ak

06/24/2014 05:03 pm

1st off, if you do not have unlimited bandwidth for your server, you are with the wrong host. Almost all the big guys offer unlimited... 2ndly, Fedor is correct: with proper caching and minification, the effects of Google crawling your website will be extremely minimal compared to the traffic you would receive. 3rdly, real web servers do have more than 16 GB of RAM. If you're using a shared web host, that's the biggest mistake you've made. If any of the other websites on the shared server have security flaws, your website is potentially open to an indirect attack. Finally, if you do have a proper web server for your website with a lot of RAM, asdfa is also correct that the majority of the fetching is done via RAM and not disk I/O. I've been developing enterprise websites for over 12 years, working with Fortune 500 companies (over 2 dozen; Cisco, Microsoft, HP, to name a few), and have set up countless data centers...

phr3ak

06/24/2014 05:15 pm

Forget what you read; those reports are based simply on when rankings were most volatile. Google spends days, even weeks, deploying some of their updates; they do not just do it over a 24-48 hour period... I track well over 100 properties, and I saw the first effects of Panda starting on Mother's Day (May 11th), a day that should have seen an increase in web traffic because of people searching for flowers, for example, but rankings and volume of traffic were still down across the majority of websites. In addition, May 11th data in GWT was not available and did not update for 72 hours. There was almost a full week of data missing in GWT before they finally updated it... this is one of the major signs that Google is rolling out updates: they want to keep the data to themselves and avoid transparency, so people like us cannot detect the updates before they're actually in full effect.

Michael Martinez

06/24/2014 07:14 pm

"1st off, if you do not have unlimited bandwidth for your server you are with the wrong host" I stopped reading right there. Much to learn about the ways of unlimited bandwidth have you yet, my young padawan. And that really has nothing to do with the toll that crawl spam takes on a server.

Michael Martinez

06/24/2014 07:15 pm

"Forget what you read, those reports are based simply on when rankings were most volatile. " Now that John Mueller has definitively stated that Google's (in)ability to access CSS and Javascript files has no connection/relationship with Panda downgrades/upgrades, I suggest you read up a bit more on the topic.

phr3ak

06/24/2014 08:23 pm

LOL, troll somewhere else then if you cannot grasp the fact that I was answering multiple threads... you brought up the bandwidth issue... no one else...

Michael Martinez

06/24/2014 08:37 pm

All I ask is that you reply in context when you try to start an argument with random strangers.

phr3ak

06/24/2014 09:20 pm

It was totally within context; you mentioned bandwidth being an issue... stop dodging and trying to fix what you already said wrong - FLAMER

Michael Martinez

06/24/2014 09:31 pm

Dude, you clearly don't understand what the word "flamer" refers to. Your behavior is exemplary in that respect. I suggest you post under your real name. You'll be less inclined to act this way when you feel a sense of accountability.

phr3ak

06/24/2014 09:37 pm

You're no longer worthy of a response... now I am a flamer...

phr3ak

06/24/2014 10:04 pm

Read up; you cannot even read people's replies in their entirety, so what gives you the sense that you've read enough...

Michael Martinez

06/24/2014 10:13 pm

Well, that was your choice. No one forced you to be a flamer.

Nico

06/28/2014 06:00 pm

LOL, unlimited bandwidth, where do these noobs come from? Go go shared hosting, wahahaha!!

Nico

06/28/2014 06:03 pm

Certainly, those WP docs are so outdated they could kill your site, SEO-wise.
