Robots.txt and multiple sites
Great search engine sitemap package - it was accepted at once by https://www.google.com/webmasters/tools.
Do you have any recommendation on how to use robots.txt with a Sitemap setting for a multi-site installation? I never paid much attention to that file, to be honest.
I tried to omit the domain, but when I checked that with an online tool (http://tool.motoricerca.info/robots-checker.phtml) I got an error saying "sitemap requires the full url" (just as in your example).
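So it seems a valid file has to spell the Sitemap line out as an absolute URL, along these lines (example.com is only a placeholder here):

    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml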
But then: running multiple domains will require a dynamically created robots.txt that responds with the correct sitemap URL for each site, right?
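What I have in mind is roughly the following sketch - just an illustration of the idea in plain Node/TypeScript rather than Umbraco, with placeholder hostnames and port:

    import { createServer } from "http";

    // Sketch: answer /robots.txt with a Sitemap line built from whatever
    // host the request came in on, so each domain advertises its own
    // absolute sitemap URL from a single handler.
    const server = createServer((req, res) => {
      if (req.url === "/robots.txt") {
        const host = req.headers.host ?? "www.example.com"; // placeholder fallback
        res.writeHead(200, { "Content-Type": "text/plain" });
        res.end(
          ["User-agent: *", "Disallow:", `Sitemap: http://${host}/sitemap.xml`, ""].join("\n")
        );
      } else {
        res.writeHead(404);
        res.end();
      }
    });

    server.listen(8080); // placeholder port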
Regards / Jonas
Oh... http://our.umbraco.org/projects/website-utilities/cultiv-dynamicrobots :-)
Cool, thanks again!
Yes, I had the same problem once. ;-) You're welcome!
Just for the record, it's fine to have just one sitemap.xml with multiple domains in it - at least according to Google: https://support.google.com/webmasters/answer/75712?hl=en
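So a single file along these lines should be acceptable (placeholder domains; as that article notes, cross-site submission expects all the sites to be verified in Webmaster Tools):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example-one.com/</loc>
      </url>
      <url>
        <loc>http://www.example-two.com/page/</loc>
      </url>
    </urlset>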