Google: Don't Block Your CSS & JavaScript Files

Mar 27, 2012 • 8:22 am | comments (14) | Filed Under Google Search Engine Optimization

Yesterday, Google's Matt Cutts posted a video as a "public service announcement," asking SEOs and webmasters who block GoogleBot from accessing their CSS and JavaScript files to stop doing so.

Matt Cutts said it will make for a better crawl, give Google a better idea of what is going on with your site, and let it return better results to the user. Here is the video:

Going forward, will you block GoogleBot from accessing your CSS and JavaScript files? Take my poll:

Here is the transcript of this video:

Hi everybody. Matt Cutts here. I just have a short public service announcement rather than a question today. And the PSA is this: if you block Googlebot from crawling JavaScript or CSS, please take a few minutes and just take that out of the robots.txt. Let us crawl the JavaScript, let us crawl the CSS, and get a better idea of what's going on on the page.

A lot of people block it because they think these are resources they don't want to spend the bandwidth on, or something like that. But Googlebot is pretty smart about not crawling stuff too fast. And a lot of people will do things like check for Flash, but they're including some JavaScript to do it, and they don't realize that because the JavaScript is blocked, we're not able to crawl the site as effectively as we would like.

In addition, Google is getting better at processing JavaScript. It's getting better at things like looking at CSS to figure out what's important on the page. So if you do block Googlebot, I would ask: please take a little bit of time, go ahead and remove those blocks from the robots.txt, so you can let Googlebot in and get a better idea of what's going on with your site and your pages. And that just helps everybody: if we can find the best search results, we'll be able to return them higher to users.

So thanks, if you can take the chance. I know it's kind of a common idiom for people to just say, "I'm going to block JavaScript and CSS." But you don't need to do that now. So please, in fact, actively let Googlebot crawl things like JavaScript and CSS if you can. Thanks.
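In practice, "taking that out of the robots.txt" just means deleting the Disallow rules that cover your script and style directories. A minimal before-and-after sketch (the /js/ and /css/ paths here are hypothetical - use whatever directories your own robots.txt actually blocks):

    # Before: these rules keep Googlebot away from scripts and styles
    User-agent: *
    Disallow: /js/
    Disallow: /css/

    # After: an empty Disallow value permits everything
    User-agent: *
    Disallow: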

Personally, I rarely block these files from Google or anyone else.
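If you're not sure whether your robots.txt blocks these files, you can check programmatically. A minimal sketch using Python's standard urllib.robotparser module (the domain and file URLs are placeholders):

    import urllib.robotparser

    # Fetch and parse the live robots.txt (example.com is a placeholder)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()

    # Ask whether Googlebot may fetch a given script or stylesheet URL
    for url in ("http://example.com/js/app.js",
                "http://example.com/css/style.css"):
        print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")

If it prints "blocked" for your script or stylesheet URLs, the matching Disallow lines are the ones Matt Cutts is asking you to remove.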

Forum discussion at WebmasterWorld.

Comments:

Alyssa

03/27/2012 02:13 pm

I don't like Matt Cutts. There, I said it. I don't trust him one bit. I guess we're supposed to take what he says as gospel and quickly run around and make the changes he tells us to. It's quite worrying though, as it suggests that JS and CSS are ranking factors, or at least that websites Google "believes" are using questionable JS or CSS have been or will be penalised.

Jon

03/27/2012 03:32 pm

"It suggests that js and CSS are ranking factors, or at least websites that Google "believe" are using questionable js or css, have or will be penalised? " I don't believe that's what he's saying completely, he's saying that if you let us see your JS/CSS we may give you higher rankings (because we'll see what's important on-page), not that they'll penalize if you don't. Normally Cutts-balls says "there's no need to block anything, but feel free to do so". It's interesting that they're actively pleading with Webmasters to let them see the shit. One reason folders can we blocked is so Google doesn't know they're affiliate link - wonder if this is related to their request...

Abro, from Kerpen (DE)

03/27/2012 03:47 pm

Uhm...they're crawling it anyway if they want to - no matter if the file is 'blocked' or not.

Barry Adams

03/27/2012 04:24 pm

Yet another scare tactic. From stirring the bushes with alarming GWT notifications, to making small business website owners panic about 'over-optimisation' penalties, to now trying to make cloakers defecate their trousers with a thinly veiled hint at a penalty for blocked JS & CSS files - all this serves no other purpose than to prime the SEO mindset to blindly obey Google's guidelines and best practices, and not ever do anything that might possibly stray from what Google says SEO should be. Disgusting.

SEO Expert Steve Wiideman

03/27/2012 04:50 pm

Am I really the only one who sees the August 2011 date on this video? tedster in WebmasterWorld apparently missed that and the whole world picked up on the story like it's new news. Am I off here?

Jon

03/27/2012 05:38 pm

Haha, nicely spotted. Er, yeah you're right.

Eric Wu (╯°□°)╯︵ ┻━┻

03/27/2012 05:39 pm

MattCutts mentioned it on Twitter yesterday: "Today's webmaster video is a request to let Googlebot crawl JavaScript and CSS". I imagine that's the source and not a Webmaster World thread.

SEO Expert Steve Wiideman

03/27/2012 06:08 pm

Oh "I see now". Perhaps the post should link to the Twitter status first and the video second? Just my opinion anyway. Thanks for the note Eric. I'll try to find the original thread.

suzukik

03/27/2012 06:40 pm

How did you get the transcript, Barry? I don't think you listened to the whole talk and wrote it down.

Nuclei

03/27/2012 07:15 pm

It is Google yet again showing they cannot do the job they chose to do, and they want to scare webmasters into doing it for them. Nothing new here, move along...

SEO India

03/27/2012 08:11 pm

Tedster is the big loser because he always defends Google. He knows nothing now but defends Google in everything they do. Maybe he works for Google?

Website Consultancy

03/29/2012 08:11 am

Why would you want to block CSS & JavaScript files unless you have something to hide, or you're one of those robots.txt obsessives?

nicolas de France

07/03/2012 07:22 pm

I know I don't control lots of things regarding Google's crawling of my websites. Nevertheless, I noticed that the more disallowed folders I added to robots.txt, the more traffic I lost. Then I also noticed that "beginners" who did not add anything to robots.txt could have better traffic than me. That was enough for me to remove all blocked folders from robots.txt. I only work with WordPress. Today I have Allow: /. In /wp-includes/ there are lots of JS files that are sometimes indexed by Google. What Matt Cutts said is right: Google can recognize what should and should not really be indexed. There is no "garbage" data in the snippet descriptions when I look at site:mydomain. My own guess: adding a complicated robots.txt adds cycles to crawling and indexing.

Melody Paxton

04/12/2013 04:36 pm

Hi, I see a lot of bashing here but not any kind of rebuttal worth saving as factual. It all looks like opinion. I appreciate the dialogue but I am still unsure of what the real takeaway is, other than Google is garbage and everyone else has the 4-1-1 on SEO, etc. Can anyone post some relevant links and make an argument? I would really like to hear more information. Thanks.
