A Google Webmaster Help thread has a member upset that Google is crawling his site during times when his server is overloaded. Is there a way to tell GoogleBot to stay away during these times?
JohnMu of Google said yes, there is: you can return a 503 status code to tell GoogleBot to come back later. The catch is that you would need to serve this status code only to spiders and not to your visitors - and that might get a bit sticky. Let me quote John:
One thing you can do is to encourage Googlebot (and other crawlers) to not visit your site at busy times by returning a 503 HTTP result code. This tells us that you currently can't serve the content, but that we should come back at some later time. The difficulty would be to recognize search engine crawlers and to only serve this result code when the server is actually under load - but it might be worth following up on if your server resources are limited.
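The tricky part John mentions - recognizing crawlers - is usually done by matching the User-Agent header. Here's a minimal, hypothetical sketch of that idea (the token list and function names are my own, not anything John specified); note that User-Agent strings can be spoofed, and Google's own guidance is to verify Googlebot with a reverse DNS lookup if it really matters:

```python
# Hypothetical sketch: decide whether a request looks like a known crawler
# by matching the User-Agent header. User-Agent strings can be spoofed, so
# this is a heuristic, not real verification.

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string names a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def status_for_request(user_agent: str, server_overloaded: bool) -> int:
    """Serve crawlers a 503 while the server is busy; everyone else gets 200."""
    if server_overloaded and is_crawler(user_agent):
        return 503
    return 200
```

So a regular visitor always gets a 200, while GoogleBot gets a 503 only during the overload window - which is exactly the "sticky" part the thread worries about.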
In the past, John has recommended using the 503 during site downtime and site maintenance, so that Google doesn't think your site went bye-bye for good. The 503 simply tells the spider that your site is fine but temporarily unavailable, and to check back later.
Now, automating this at specific times or under specific CPU utilization patterns might be fun for coders. You could set up logic that says: serve GoogleBot or other spiders a 503 status code when it is between the hours of X and Y, or when the server's CPU load is above Z. The only issue is whether this would be considered a form of cloaking, since the search spider isn't seeing what the visitor sees. A bit of a gray area.
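That "hours X to Y, or load above Z" logic could be sketched like this - the hours, the load limit, and the Retry-After value below are all made-up placeholders, not recommendations:

```python
# Hypothetical scheduling logic for deferring crawlers. All thresholds are
# illustrative placeholders.

BUSY_START, BUSY_END = 2, 4   # tell crawlers to come back later between 02:00-04:00
LOAD_LIMIT = 8.0              # ...or whenever the 1-minute load average exceeds this

def should_defer_crawler(hour: int, load_avg: float) -> bool:
    """True when crawlers should be told to come back later."""
    return BUSY_START <= hour < BUSY_END or load_avg > LOAD_LIMIT

def crawler_response(hour: int, load_avg: float):
    """Return (status, headers). A 503 can carry a Retry-After header so
    well-behaved crawlers know roughly when to try again."""
    if should_defer_crawler(hour, load_avg):
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

On Unix servers, the current load average is available from `os.getloadavg()[0]` in Python's standard library, and the current hour from `time.localtime().tm_hour`.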
Forum discussion at Google Webmaster Help.