Google's Site Performance Shows Data For Some Blocked Pages

Dec 30, 2010 - 9:38 am
Filed Under Google Updates

A Google Webmaster Help thread has one webmaster asking why Google's Site Performance reports in Google Webmaster Tools are showing pages he blocked using robots.txt.

The reason is pretty simple: GoogleBot is not used to calculate the speed of a page.

Instead, Google uses data from real users browsing your web pages with the Google Toolbar installed. Blocking GoogleBot does not stop those users from accessing your site, so their page load times are still reported.
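To make the distinction concrete, here is a minimal robots.txt rule of the sort the webmaster likely used (the path is hypothetical). It tells GoogleBot not to crawl the page, but it has no effect on a visitor with the Google Toolbar installed, who can still load the page and have its load time reported back to Google:

User-agent: Googlebot
Disallow: /slow-page.html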

So if you are trying to hide your slow pages from Google, I'd recommend other methods, such as hiding your slow pages from your users.

Forum discussion at Google Webmaster Help.

 
