Google Compression Test:

May 3, 2010 • 7:40 am | comments (1) by | Filed Under Other Google Topics

A Google Blogoscoped Forums thread notes that Google recently disallowed the /compressiontest URL in their robots.txt file. The question is: what is this all about?

Google does have a code base named compressiontest, described as "a small test framework that performs benchmark comparisons between a variety of open source compression libraries." But how does this impact searchers or SEOs?

On February 17th, a few searchers noticed a bug with their browsers associated with this file. There are threads at Google Web Search Help and Google Custom Search Help with questions on this. Here they are:

When we browse to Google, it asks us to download a 1kb file called 'compressiontest', which is a file with no extension.

The call to the file can be seen in this excerpt from the source code:

;google.neegg=1;google.y.first.push(function(){(function(){
function b(a){document.cookie=a}
function c(){if(!document.cookie.match(/GZ=Z=[0,1]/)){b("GZ=Z=0");var a=document.createElement("iframe");a.src="/compressiontest";a.style.display="none";(document.getElementById("xjsd")||document.body).appendChild(a)}}
c();
})()
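Read plainly, the script records a cookie GZ=Z=0, then loads /compressiontest in a hidden iframe; presumably the compressed test file flips the cookie to GZ=Z=1 once the browser successfully decodes it, so later requests tell Google whether gzip worked. Here is a minimal sketch of that cookie logic (the helper names and exact flow are assumptions for illustration, not Google's actual code):

```javascript
// Hypothetical sketch of the cookie-based compression probe.
// 1. Before the test runs, the page records GZ=Z=0 ("result not yet confirmed").
// 2. /compressiontest is served gzip-compressed; if the browser decodes it,
//    a script inside that file would flip the cookie to GZ=Z=1.
// 3. Subsequent requests carry the cookie, so the server learns whether
//    compression worked in that browser.

function hasRun(cookie) {
  // The test only needs to run once per browser: skip it if either
  // result (0 or 1) is already recorded in the cookie string.
  return /GZ=Z=[01]/.test(cookie);
}

function markResult(ok) {
  // Returns the cookie value that would record the test's outcome.
  return ok ? "GZ=Z=1" : "GZ=Z=0";
}
```

The hidden-iframe approach lets the probe run once in the background without affecting the page the user is looking at.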

Hi, I'm also experiencing the same problem when I open Google. It says file download - security warning. Name: Compressiontest. Type: Unknown File Type. From:

A few months later, Google blocked this file in its robots.txt. For what reason? Just cleanup?

Ionut Alex. Chitu from Google Operating System blog said in the thread:

Nothing that interesting. I've noticed a request to a similar URL when visiting Google. It's just an almost empty HTML file that tests HTTP compression.

Is Google just that obsessed with speed? Does it really mean nothing?

Forum discussion at Google Blogoscoped Forums.
