robots.txt for multisite configuration
Hi! We have several sites and domains under one Umbraco installation. They share a single wwwroot directory, so we can't create a separate robots.txt file for each site. Do you have any suggestions other than one file for all sites?
Well, you could look into letting ASP.NET handle the robots.txt request, and then use URL rewriting to serve the appropriate content for the current domain. You could even use Umbraco to edit the content for each domain.
Let me know if this doesn't make sense, and I'll try to explain a bit better.
I've tried it with the UrlRewriting tool but failed to write rules for the different domains. Could you give me some advice on how to do this? I have three variants of site naming: http://maindomain.com, http://subsite.maindomain.com and http://secondDomain.com.
thanks
You could do something like this:
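For example, with UrlRewritingNet (a rewriter commonly used with Umbraco) you could add one rule per domain to web.config. This is only a sketch: the exact attribute names and the ability to match on the host in `virtualUrl` should be double-checked against the UrlRewritingNet documentation for your version.

```xml
<!-- Sketch: one rewrite rule per domain, each passing its own
     domain value to robots.aspx as a querystring parameter. -->
<urlrewritingnet rewriteOnlyVirtualUrls="true"
                 xmlns="http://www.urlrewriting.net/schemas/config/2006/07">
  <rewrites>
    <add name="robotsMain"
         virtualUrl="^http://maindomain\.com/robots\.txt"
         destinationUrl="~/robots.aspx?domain=maindomain.com"
         ignoreCase="true" />
    <add name="robotsSub"
         virtualUrl="^http://subsite\.maindomain\.com/robots\.txt"
         destinationUrl="~/robots.aspx?domain=subsite.maindomain.com"
         ignoreCase="true" />
    <add name="robotsSecond"
         virtualUrl="^http://seconddomain\.com/robots\.txt"
         destinationUrl="~/robots.aspx?domain=seconddomain.com"
         ignoreCase="true" />
  </rewrites>
</urlrewritingnet>
```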
Then you could make a robots.aspx page in umbraco that uses the querystring parameter to serve different content.
The url would be like: /robots.aspx?domain=maindomain.com
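A minimal sketch of what that robots.aspx page could look like, assuming a plain ASPX page with inline code (the per-domain rule sets here are placeholders; in practice you'd pull them from Umbraco content):

```aspx
<%@ Page Language="C#" ContentType="text/plain" %>
<script runat="server">
    protected void Page_Load(object sender, System.EventArgs e)
    {
        // Use the querystring set by the rewrite rule, falling back
        // to the request host if the parameter is missing.
        string domain = Request.QueryString["domain"] ?? Request.Url.Host;

        switch (domain.ToLowerInvariant())
        {
            case "maindomain.com":
                Response.Write("User-agent: *\nDisallow: /umbraco/\n");
                break;
            case "subsite.maindomain.com":
                Response.Write("User-agent: *\nDisallow: /\n");
                break;
            default:
                Response.Write("User-agent: *\nDisallow:\n");
                break;
        }
    }
</script>
```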
Now all you need to do is tell IIS to send the .txt request through the aspnet_isapi handler so it actually hits the URL-rewriting mechanism.
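On IIS 6 that means adding a script mapping for .txt (or a wildcard mapping) to aspnet_isapi.dll in IIS Manager, with "Verify that file exists" unchecked. On IIS 7+ with the classic pipeline, a handler entry in web.config along these lines should work; the framework path is an assumption for .NET 2.0, so adjust it to your server:

```xml
<!-- Sketch: route robots.txt through the ASP.NET ISAPI handler
     so the managed rewriter sees the request (IIS 7+, classic pipeline). -->
<system.webServer>
  <handlers>
    <add name="RobotsTxt" path="robots.txt" verb="GET"
         modules="IsapiModule"
         scriptProcessor="%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_isapi.dll" />
  </handlers>
</system.webServer>
```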
Thanks a lot, I'll try it.