Robots.txt still showing http://{HTTP_HOST}/sitemap
Hi
I've installed this package and everything seems to have installed correctly, however when I browse to my site the contents of robots.txt still shows:

Sitemap: http://{HTTP_HOST}/sitemap
The DLLs are definitely in the bin directory and web.config looks as it should to me, but the replacement of {HTTP_HOST} doesn't appear to be happening.
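For reference, a handler registration for robots.txt in web.config looks roughly like the following. The type name is a placeholder rather than the package's actual class, so compare against the entries the installer added. Note that the classic pipeline and the IIS 7 integrated pipeline read different sections:

<!-- Classic pipeline (IIS 6, or IIS 7 in classic mode) -->
<system.web>
  <httpHandlers>
    <!-- Placeholder type name; the installer writes the real one -->
    <add verb="GET" path="robots.txt" type="Cultiv.DynamicRobots.RobotsHandler, Cultiv.DynamicRobots" />
  </httpHandlers>
</system.web>

<!-- Integrated pipeline (IIS 7) -->
<system.webServer>
  <handlers>
    <add name="DynamicRobots" verb="GET" path="robots.txt" type="Cultiv.DynamicRobots.RobotsHandler, Cultiv.DynamicRobots" />
  </handlers>
</system.webServer>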
Whilst testing I've noticed that any changes I make to web.config don't seem to make the page recompile when refreshing robots.txt, whereas if I make a change to web.config and then view a normal page within the site it takes the usual 10 seconds to reload.
Am I missing something? Is there anything else I need to do in IIS?
Thanks
Ben
Have you looked at the answer on this post?
http://our.umbraco.org/projects/website-utilities/cultiv-dynamicrobots/requests-and-questions/14052-Sitemap-http%7BHTTP_HOST%7Dsitemap#comment81563
Hi
Thanks for the quick reply.
I did read through that post, but I think it may have been something else.
I was testing it in IIS 6; I quickly tried it in IIS 7 and it works fine, so it must've been something to do with that. Luckily our production server is using IIS 7, so that's fine.
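For anyone hitting the same thing, that would explain both symptoms: IIS 6 serves .txt files through its static file handler and only passes a request to ASP.NET when the extension is script-mapped to aspnet_isapi.dll (or a wildcard mapping is in place), so the managed robots.txt handler never runs there and editing web.config has no effect on that URL. IIS 7's integrated pipeline runs every request through the managed handlers. As a minimal sketch (class and file names are illustrative, not the package's actual code), a handler like this does the substitution at request time:

using System.IO;
using System.Web;

// Illustrative sketch only; the real Cultiv DynamicRobots handler may differ.
public class RobotsHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Read the robots.txt template from disk.
        string template = File.ReadAllText(context.Server.MapPath("~/robots.txt"));

        // Substitute the placeholder with the host of the current request,
        // so each bound hostname gets its own sitemap URL.
        string output = template.Replace("{HTTP_HOST}", context.Request.Url.Host);

        context.Response.ContentType = "text/plain";
        context.Response.Write(output);
    }
}

Under IIS 6 this code would only ever execute if .txt requests were routed to ASP.NET via a script map, which is why the placeholder came through untouched.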
Thanks
Ben