Google: Blocked URL Count Updates Slowly

Apr 4, 2013 • 8:24 am | comments (3) | Filed Under Google Search Engine Optimization
 

A Google Webmaster Help thread has a webmaster who is looking to lower the number of blocked URLs reported in the Google Webmaster Tools Index Status report.

To make a long story short, the webmaster used robots.txt to block hundreds of thousands of pages. Eventually they removed the pages themselves and deleted the corresponding Disallow lines from robots.txt, but Google still shows the URLs as blocked in the Index Status report.
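To illustrate the mechanics, here is a minimal Python sketch (standard library only; the example.com domain and paths are hypothetical) that checks whether a URL is still disallowed by the live robots.txt. It's a quick way to confirm the blocks really are gone on your end while the Index Status number lags behind:

    import urllib.robotparser

    # Hypothetical domain; substitute your own site.
    ROBOTS_URL = "https://www.example.com/robots.txt"

    def is_blocked(url, user_agent="Googlebot"):
        """Return True if the live robots.txt still disallows this URL."""
        parser = urllib.robotparser.RobotFileParser()
        parser.set_url(ROBOTS_URL)
        parser.read()  # fetch and parse the current robots.txt
        return not parser.can_fetch(user_agent, url)

    # URLs that used to be blocked; if the Disallow lines were really
    # removed, these should now print False.
    for url in ["https://www.example.com/old-section/page1",
                "https://www.example.com/old-section/page2"]:
        print(url, "blocked:", is_blocked(url))

If these print False but Webmaster Tools still reports the URLs as blocked, that gap is the reporting lag John Mueller describes below.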

Google's John Mueller explained that it can take a long time for Google to recrawl those URLs and notice the pages are no longer there. He wrote:

It's likely going to take quite some time for those URLs to either drop out of the index or be recrawled again, so I would not expect to see that number significantly drop in the near future (and that's not a bad thing, it's just technically how it works out).

We also know the Index Status report data is delayed by about a week.

Forum discussion at Google Webmaster Help.


Comments:

Solangi Naeem

06/24/2013 04:08 pm

Thanks for sharing, but I don't understand how to solve the problem of URLs blocked by robots.txt in the Webmaster Tools blocked URLs section. Please help me.

Mostafa Nastary

09/14/2013 10:22 am

Hi, you should edit your robots.txt and remove the command. For example, if your robots.txt contains these lines:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-*

and you want to unblock all the /wp-admin/ URLs, you delete the Disallow: /wp-admin/ line and wait for Google.

Karl Beeton

03/04/2014 11:19 pm

Hello Mostafa, what if the robots.txt file returns a 404 error page? Thank you, Karl
