Don't get me wrong, Google Sitemaps is a great tool for Webmasters, and we should thank Google for setting it up (and Danny Sullivan for pushing it). When Google announced the Google Sitemaps Verification Service, Webmasters were overjoyed. But after a second look, many Webmasters are confused or unhappy with what was rolled out.
A WebmasterWorld Thread notes some of the confusion and disappointment with the service. One Webmaster reports, "I followed their instructions, but it's been two days now and I still see a red "NOT VERIFIED" message." And another said, "i have tried and failed with this." Many issues can cause a Google Sitemaps verification to fail, and Google does provide documentation explaining the reasons. I set up Google Sitemaps on one particular site, and the error I received was that my 404 pages were not returning proper 404 headers. We corrected that, and the Sitemaps were verified.
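The 404-header problem above is easy to check for yourself: request a made-up URL on your site and look at the status code in the response headers. If your server sends 200 OK with a "not found" page body (a "soft 404"), verification can fail because Google cannot distinguish a missing page from a real one. Here is a minimal sketch in Python; the local demo server stands in for your own site and is purely illustrative.

```python
import http.server
import threading
import urllib.error
import urllib.request

def check_404_status(url):
    """Return the HTTP status code a request to `url` actually gets.

    A correctly configured site should answer 404 for missing pages;
    a 200 here for a bogus path indicates a soft-404 problem.
    """
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # proper setups land here with code 404

# Demo: a tiny local server (hypothetical stand-in for your web
# server) that returns a real 404 status for unknown paths.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            body, status = b"home", 200
        else:
            body, status = b"not found", 404  # 404 header, not a soft 200
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

ok = check_404_status(f"http://127.0.0.1:{port}/")
missing = check_404_status(f"http://127.0.0.1:{port}/no-such-page")
server.shutdown()
print(ok, missing)  # 200 404
```

On a live site, pointing `check_404_status` at a path like `/this-page-does-not-exist` tells you in one call whether your error pages send the right header.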
Other feedback, which I agree with, includes: "I was a little dismayed that it only listed pages it had problems with, was hoping for a report on all the pages it has crawled." Also, "The stats you get after verifying your site is more of a problem report. For one site it told me about two broken internal links." A problem report is not really a verification report. The report description says, "We have been crawling your site as part of our regular crawl process. This includes following links from your pages and the pages of other sites. Below we have listed URLs that we were unable to reach during this crawl process, with links that explain why we could not reach them." But what about a filter showing pages crawled within the past 6 hours, 12 hours, 1 day, 2 days, 5 days, 1 week, 2 weeks, 1 month, and so on? That would be very useful.