Wrongly reported Configuration issues
In Issues -> Configuration issues it is reporting that I have no custom 404 page configured and no Robots.txt in place.
Neither of these are true.
My 404 section says:

<errors>
  <!-- the id of the page that should be shown if the page is not found -->
  <!--<error404>
    <errorPage culture="default">1</errorPage>
    <errorPage culture="en-US">200</errorPage>
  </error404>-->
  <error404>1126</error404>
</errors>
and here's my Robots.txt which is being half generated by the 'Enable dynamic robots.txt file' setting.
Regards, Matt
I can stop it complaining about the 404 page by setting one on the domain in 'Domain settings', but I don't know why it otherwise ignores what's in umbracoSettings.config.
Why does the error description call it umbraco.settings? Could that be a clue?
You can disable the 404 settings in the SEO Checker configuration; then it will use the normal Umbraco ones. I think your web.config doesn't have the handler for Robots.txt configured.
Do you see this one in the httpHandlers section?
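(For reference, a handler registration of this kind would typically sit in web.config along the lines of the sketch below. The type and assembly names here are illustrative assumptions, not SEO Checker's documented values; the actual entry should be taken from the package documentation.)

<system.web>
  <httpHandlers>
    <!-- Illustrative sketch only: maps requests for /robots.txt to a handler.
         The type and assembly names are assumptions, not SEO Checker's documented values. -->
    <add verb="GET" path="robots.txt" type="MyPackage.RobotsTxtHandler, MyPackage" />
  </httpHandlers>
</system.web>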
Best,
Richard
Hello Richard,
With regards to the web.config, I have the following SEO Checker related bits:
I notice that it's a fair bit more than mentioned in the current documentation.
Regards, Matt
Hi Matt,
That is great. Is your webserver running in integrated mode or classic mode?
Best,
Richard
Integrated, Richard.
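(Under IIS integrated mode, handler registrations are read from <system.webServer><handlers> rather than <system.web><httpHandlers>, so an equivalent entry along these lines is normally needed as well. Again, the name, type and assembly values below are illustrative assumptions rather than SEO Checker's documented ones.)

<system.webServer>
  <handlers>
    <!-- Illustrative sketch only: the integrated-mode equivalent of the httpHandlers entry.
         Name, type and assembly values are assumptions, not SEO Checker's documented ones. -->
    <add name="RobotsTxt" verb="GET" path="robots.txt" type="MyPackage.RobotsTxtHandler, MyPackage" />
  </handlers>
</system.webServer>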
I think the problem here is that there is a delay between changing the config options, seeing the results, and the Configuration issues report recognising that things have changed.
It seems to be working now using either the Umbraco 404 or the SEO Checker 404, but if you select the SEO Checker 404 option you must then choose a page for the domain. Seems obvious, but I missed that last bit.
I mainly noticed the delay when testing the different Robots.txt options. These all now seem to be working as expected; however, I did notice one annoyance.
If you've previously right-clicked the domain and edited its Robots.txt file but then untick 'Enable dynamic robots.txt file' in the configuration, you lose any changes you've made the next time you re-tick the 'Enable dynamic robots.txt file' setting.
Regards, Matt