This post ends with my current robots.txt for a standard installation of the Drupal content management system. It borrows heavily from many sources, including this thread on the Drupal website.
That same thread includes a discussion about whether a robots.txt should be distributed with Drupal. I don't feel strongly either way. What I want instead, and it will likely have to be a combination of things, is a plaintext module that serves nodes as plain text, without themes or anything fancy. This would allow the authors of independent Drupal sites on the same server to each have a distinct robots.txt, as well as other plain files. Another possible use is publishing a comment-spam blacklist: the plain-text list would be parsed at semi-regular intervals, and comments containing the listed words deleted.
As the thought has evolved, I now think the way to do it is a special node type. A "plain-jane" node type would simply regurgitate whatever it is fed, verbatim. With URL aliasing, that node can then become robots.txt or any other file one prefers.
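To make the idea concrete, here is a rough sketch of what such a module might look like. This is hypothetical code, not anything I've written or tested: the module name (`plaintext`), the path, and the callback are all invented for illustration, and the exact hook and function signatures (`hook_menu`, `node_load`, `drupal_set_header`) vary between Drupal versions, so treat this as a sketch of the approach rather than a working module.

```php
<?php
// plaintext.module -- hypothetical sketch only.
// Serves a node's body verbatim as text/plain at plaintext/<nid>,
// so that URL aliasing can map e.g. robots.txt to plaintext/12.

function plaintext_menu($may_cache) {
  $items = array();
  if ($may_cache) {
    $items[] = array(
      'path' => 'plaintext',
      'callback' => 'plaintext_page',
      'access' => TRUE,
      'type' => MENU_CALLBACK,
    );
  }
  return $items;
}

function plaintext_page($nid = NULL) {
  // Older Drupal APIs take node_load(array('nid' => $nid)) instead.
  $node = node_load($nid);
  if ($node) {
    drupal_set_header('Content-Type: text/plain');
    // No theming, no filtering: regurgitate the body as-is.
    print $node->body;
    exit();
  }
  drupal_not_found();
}
```

With a URL alias mapping `robots.txt` to `plaintext/12`, a request for /robots.txt would return node 12's body as plain text, which is exactly the "plain-jane" behaviour described above.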