A Google Webmaster Help thread has Google's John Mueller responding to some complex robots.txt questions.
In it, he strongly recommends that when you do set up a robots.txt file, you keep it very simple. John said:
When possible, I'd really recommend keeping the robots.txt file as simple as possible, so that you don't have trouble with maintenance and that it's really only disallowing resources that are problematic when crawled (or when its content is indexed).
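Following that advice, a simple robots.txt might look something like this (the directory name is hypothetical, just to illustrate disallowing only a resource that is problematic when crawled):

```
# Applies to all crawlers
User-agent: *
# Block only the one problematic area; everything else stays crawlable
Disallow: /internal-search/
```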
To see the full context, see the Google Webmaster Help thread.
Personally, I am all for not using robots.txt files when possible.
Forum discussion at Google Webmaster Help.