Various sites with one domain - Google crawling
Hi,
I was wondering if anyone has run into this problem: we have various sites within one Umbraco instance, and there is one site we don't want to be crawled. But you can't specify a relative link in robots.txt.
Each site has a different domain, but they all use the same templates and document types. Does anyone know a way around this?
Preferably not using an Umbraco package.
Thanks for any help in advance.
Thanks
Jason
Hi Jason
You could add a setting on your Home document type - a textstring field where you write noindex,nofollow - and then create a macro that inserts the value from that textstring into a <meta name="robots" content="..." /> tag in the <head> section on every page of the site.
On the other sites you would of course write index,follow - and if the field is empty you can add some logic that skips writing the <meta> tag altogether.
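A minimal sketch of such a macro as a Razor partial - assuming the textstring on the Home document type has the alias robotsDirective (the alias is hypothetical, and the exact API calls may differ depending on your Umbraco version):

```csharp
@* Razor macro sketch: reads a hypothetical "robotsDirective" textstring
   from the site's Home node and emits a <meta name="robots"> tag.
   Call this macro from the <head> of your shared master template. *@
@inherits Umbraco.Web.Macros.PartialViewMacroPage
@{
    // Walk up to the site root (the Home node for this domain), so
    // every page on the site inherits the same setting.
    var home = Model.Content.AncestorOrSelf(1);
    var directive = home.GetPropertyValue<string>("robotsDirective");
}
@if (!string.IsNullOrWhiteSpace(directive))
{
    @* e.g. "noindex,nofollow" on the site you want hidden,
       "index,follow" (or an empty field) on the others *@
    <meta name="robots" content="@directive" />
}
```

Because the lookup starts from the current page's ancestor at level 1, each domain in the instance picks up its own Home node's value, so the shared templates still work for all sites.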
I hope the above makes sense - otherwise let me know :)
/Jan