A WebmasterWorld thread digs into a detailed issue with how Google's spider, GoogleBot, crawls some pages. Here is the poster's explanation:
I've tried: Checking for the HTTP_IF_MODIFIED_SINCE header and returning "304 Not Modified" when possible.
Problem: Googlebot doesn't always send this header, even for pages it already knows about.
I've tried: Using the Expires header to tell Google that each page should expire a month after the request.
Problem: Googlebot keeps requesting the pages; it seems to ignore this header.
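The conditional-GET logic the poster describes might look like this minimal sketch (the function name and dict-based headers are illustrative, not from the thread; it follows the standard HTTP rule that a 304 is sent when the resource has not changed since the client's If-Modified-Since date):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def conditional_status(request_headers: dict, last_modified: datetime) -> int:
    """Return 304 if the client's If-Modified-Since date covers our
    Last-Modified timestamp, otherwise 200 (serve the full page)."""
    ims = request_headers.get("If-Modified-Since")
    if ims is None:
        # As the poster notes, Googlebot often omits this header,
        # so we fall back to a full response.
        return 200
    try:
        since = parsedate_to_datetime(ims)
    except (TypeError, ValueError):
        return 200  # unparseable date: serve the full page
    return 304 if last_modified <= since else 200
```

The key point is that the server can only honor a conditional request when the crawler actually sends the header; the complaint in the thread is that Googlebot frequently does not.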
Brett Tabke, founder of WebmasterWorld, said he noticed these issues as well. jdMorgan, a WebmasterWorld moderator, tried to offer some advice:
Check that the 'expires' header is relative -- Expires after so much time, rather than Expires at a certain time.
You should check your Cache-control server response headers as well.
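jdMorgan's advice can be sketched as follows: compute the Expires date relative to the time of the request and send a matching Cache-Control max-age alongside it. The helper below is a hypothetical illustration, not code from the thread:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def caching_headers(max_age_days: int = 30) -> dict:
    """Build a relative Expires header (now + max_age) and a
    consistent Cache-Control header, per jdMorgan's suggestion."""
    expires = datetime.now(timezone.utc) + timedelta(days=max_age_days)
    return {
        # Relative: a month from *this* request, not a fixed date.
        "Expires": format_datetime(expires, usegmt=True),
        # Cache-Control max-age is expressed in seconds.
        "Cache-Control": f"max-age={max_age_days * 86400}",
    }
```

Sending both headers, and keeping them consistent, removes one common source of confusion when a cache or crawler prefers one over the other.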
Is this a Webmaster issue or GoogleBot issue?
Forum discussion at WebmasterWorld.