The problem: You've built a multisite solution (either for multiple languages or just multiple sites in one Umbraco installation). You want to offer an XML sitemap per site, but a single static robots.txt can't point each hostname to its own sitemap URL.
The solution: DynamicRobots! This HTTP handler looks for the string {HTTP_HOST} in your robots.txt file and replaces it with the hostname of the current request, so that spiders get a complete and valid sitemap URL for each site.
Example:
Sitemap: http://{HTTP_HOST}/sitemap
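
A minimal sketch of how a handler like this could work (illustrative only, not the package's actual source; the class name and the assumption that it is mapped to robots.txt are mine):

```csharp
using System.IO;
using System.Web;

// Illustrative sketch: serve robots.txt with {HTTP_HOST} replaced
// by the hostname of the current request.
public class RobotsHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Read the physical robots.txt from the site root
        string path = context.Server.MapPath("~/robots.txt");
        string robots = File.ReadAllText(path);

        // Swap the placeholder for the host of the current request
        string host = context.Request.Url.Host;
        robots = robots.Replace("{HTTP_HOST}", host);

        context.Response.ContentType = "text/plain";
        context.Response.Write(robots);
    }
}
```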
Note: if IIS is caching .txt files, a response generated for one hostname can be served to another, so make sure to disable caching for those files (see the snippet below).
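
One way to switch off IIS output caching for .txt responses is an output cache profile in web.config (IIS 7+; this is a sketch, adjust to your own setup):

```xml
<system.webServer>
  <caching>
    <profiles>
      <!-- Don't cache .txt responses, so each hostname gets its own substituted robots.txt -->
      <add extension=".txt" policy="DisableCache" kernelCachePolicy="DisableCache" />
    </profiles>
  </caching>
</system.webServer>
```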
You can still edit your robots.txt file with the excellent robots.txt editor package by Lee Kelleher: http://our.umbraco.org/projects/developer-tools/robotstxt-editor
---
Also check out the Cultiv SearchEngineSitemap package, which supports multisite solutions out of the box: http://our.umbraco.org/projects/website-utilities/cultiv-search-engine-sitemap