A Google Webmaster Help thread features a webmaster who is looking to lower the number of blocked URLs reported in the Google Webmaster Tools Index Status report.
To make a long story short, they used robots.txt to block hundreds of thousands of pages. Eventually they removed the pages themselves and deleted the blocking rules from robots.txt, but Google still shows the URLs as blocked in the Index Status report.
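For context, blocking a large section of a site that way typically looks something like the sketch below (the `/archive/` path is a hypothetical example, not from the thread); removing the block means deleting the `Disallow` line, which is what the webmaster did:

```
# Hypothetical robots.txt blocking a large section of a site
User-agent: *
Disallow: /archive/

# After cleanup, the Disallow line is removed:
# User-agent: *
# Disallow:
```

Note that removing a `Disallow` rule only permits recrawling; it does not prompt Google to immediately revisit every previously blocked URL, which is why the reported count lingers.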
Google's John Mueller explained that it can take a long time for Google to recrawl those URLs and notice the pages are no longer there. He wrote:
It's likely going to take quite some time for those URLs to either drop out of the index or be recrawled again, so I would not expect to see that number significantly drop in the near future (and that's not a bad thing, it's just technically how it works out).
We also know the Index Status report itself is delayed by about a week.
Forum discussion at Google Webmaster Help.