We are currently unable to enable a dynamic robots.txt file. When the user ticks the checkbox to enable it and clicks Save, it appears to save, but when the user navigates away from the page, they are prompted that they have unsaved changes.
On returning to the page, the "Enable dynamic robots.txt" checkbox is no longer selected, and http://website-url/robots.txt takes the user to a 404 page.
The site is using the following versions:
Umbraco Version: 7.12.4
SEOChecker Version: 2.2.1
Any help or advice on this would be much appreciated
Looks like the HTTP handler is not configured, probably a file-permissions issue during install. Please check the manual and add the handler to your web.config file; then it should work again.
Thanks for your reply. I have just had a look through the manual and can't find the section on adding the HTTP handlers.
Do you know what section of the manual this is in?
If you add the following to the handlers section in web.config, it should be OK:
<remove name="SEOCheckerRobotTxt" />
<add name="SEOCheckerRobotTxt" type="SEOChecker.Handlers.HttpHandlers.RobotsTxtHandler" path="robots.txt" preCondition="integratedMode" verb="*" />
<remove name="SEOCheckerSitemapxm" />
<add name="SEOCheckerSitemapxm" type="SEOChecker.Handlers.HttpHandlers.XmlSitemapHandler" path="sitemap*.xml" preCondition="integratedMode" verb="*" />
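For reference, those entries belong inside the handlers element under system.webServer in the site's root web.config. A minimal sketch of the surrounding structure (the element placement is assumed from a standard IIS integrated-mode / Umbraco 7 setup, not taken from the SEOChecker manual):

```xml
<configuration>
  <system.webServer>
    <handlers>
      <!-- SEOChecker dynamic robots.txt handler -->
      <remove name="SEOCheckerRobotTxt" />
      <add name="SEOCheckerRobotTxt" type="SEOChecker.Handlers.HttpHandlers.RobotsTxtHandler" path="robots.txt" preCondition="integratedMode" verb="*" />
      <!-- SEOChecker XML sitemap handler -->
      <remove name="SEOCheckerSitemapxm" />
      <add name="SEOCheckerSitemapxm" type="SEOChecker.Handlers.HttpHandlers.XmlSitemapHandler" path="sitemap*.xml" preCondition="integratedMode" verb="*" />
    </handlers>
  </system.webServer>
</configuration>
```

The remove entries are there so the add entries don't throw a duplicate-key configuration error if a parent config already registers handlers with the same names.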
I'll give it a go and get back to you.
Checking now, I can see the handler for robots.txt appears to be missing; I'll add this in.