Robots.txt Issue with Cultiv.DynamicRobots
I have installed the https://our.umbraco.com/packages/website-utilities/cultiv-search-engine-sitemap/ package.
It is working as expected, but when I use a robots.txt file to link to the sitemap, I get the error below and the request returns a 500.
Could not load file or assembly 'Cultiv.DynamicRobots' or one of its dependencies. The system cannot find the file specified.
However, when I visit the sitemap URL listed in the robots.txt file directly, the sitemap.aspx page loads correctly with the XML on the page.
Robots.txt file contents:
User-agent: *
Allow: /
Sitemap: http://URL/sitemap.aspx
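
For context, I suspect the package (or the related Cultiv Dynamic Robots package) registers an HTTP handler for robots.txt in web.config, which would explain why only the robots.txt request tries to load the Cultiv.DynamicRobots assembly. A minimal sketch of what I mean is below; the handler name and type are my guess, not taken from the actual package, and may differ:

<system.webServer>
  <handlers>
    <!-- Hypothetical mapping: serves robots.txt via the Cultiv.DynamicRobots assembly.
         If Cultiv.DynamicRobots.dll is missing from /bin, any request to robots.txt
         would fail with exactly the "Could not load file or assembly" error above. -->
    <add name="DynamicRobots" path="robots.txt" verb="GET"
         type="Cultiv.DynamicRobots.RobotsHandler, Cultiv.DynamicRobots" />
  </handlers>
</system.webServer>

I haven't confirmed this is what my web.config contains, so please correct me if the cause is elsewhere.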
Thank you for any help :)