Google's Site Performance Shows Data For Some Blocked Pages

Dec 30, 2010 • 9:38 am | Filed Under Google PageRank & Algorithm Updates
 

A Google Webmaster Help thread has one webmaster asking why Google's Site Performance reports in Google Webmaster Tools are showing pages he blocked via robots.txt.
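For reference, a robots.txt rule like the one presumably involved simply asks crawlers not to fetch the listed paths (the path below is a hypothetical example):

```
User-agent: Googlebot
Disallow: /slow-pages/
```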

The reason is pretty simple: GoogleBot is not used to calculate the speed of a page.

Instead, Google uses data from real users accessing your web pages with the Google Toolbar installed. Blocking GoogleBot will not stop ordinary users with the Google Toolbar from visiting your site.
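To see why, here is a minimal Python sketch (not Google's code; the URLs are hypothetical) illustrating that robots.txt is only consulted by well-behaved crawlers, while a visitor's browser, Toolbar and all, never checks it:

```python
# Minimal sketch: robots.txt rules are advisory and only affect
# crawlers that choose to honor them. The URLs are hypothetical.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# A well-behaved crawler checks the rules before fetching a page...
print(rp.can_fetch("Googlebot", "http://www.example.com/slow-page.html"))

# ...but nothing enforces them: a user's browser never consults
# robots.txt, so Toolbar users still generate load-time data for
# "blocked" pages, and that data feeds the Site Performance report.
```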

So if you are trying to hide your slow pages from Google, I'd recommend other methods, such as hiding them from your users as well.

Forum discussion at Google Webmaster Help.

 

Comments:

SEO Nepal

12/31/2010 04:14 am

Pages disallowed in robots.txt but submitted via a sitemap in Webmaster Tools are still showing up in Google's results. I thought they should remain excluded.
