  • Matt Taylor 873 posts 2086 karma points
    Aug 19, 2014 @ 17:36

    Wrongly reported Configuration issues

    In Issues -> Configuration issues, it is reporting that I have no custom 404 page configured and no robots.txt in place.

    Neither of these are true.

    My 404 section says:

    <errors>
      <!-- the id of the page that should be shown if the page is not found -->
      <!--
      <error404>
        <errorPage culture="default">1</errorPage>
        <errorPage culture="en-US">200</errorPage>
      </error404>
      -->
      <error404>1126</error404>
    </errors>

    and here's my robots.txt, which is partly generated by the 'Enable dynamic robots.txt file' setting.
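
    For reference, a dynamically generated robots.txt for an Umbraco site typically looks something like this (illustrative only, not my actual rules; the sitemap line assumes SEO Checker's sitemap handler is also active):

    User-agent: *
    Disallow: /umbraco/
    Disallow: /config/

    # illustrative: SEO Checker also serves sitemap*.xml, so a reference like this may be appended
    Sitemap: http://www.example.com/sitemap.xml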

    Regards, Matt

  • Matt Taylor 873 posts 2086 karma points
    Aug 19, 2014 @ 18:05

    I can stop it complaining about the 404 page by setting one on the domain under 'Domain settings', but I don't know why it otherwise ignores what's in umbracoSettings.config.

    Why does the error description call it umbraco.settings? Could that be a clue?

  • Richard Soeteman 4046 posts 12899 karma points MVP 2x
    Aug 19, 2014 @ 18:39

    You can disable the 404 settings in the SEO Checker configuration; then it will use the normal Umbraco ones. I think your web.config doesn't have the handler for robots.txt configured.

    Do you see this one in the httpHandlers section?
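
    That is, something like the standard SEO Checker registration (shown here for both pipeline modes; these are the same lines that appear in the reply below):

    <!-- classic mode: system.web/httpHandlers -->
    <add path="robots.txt" verb="*" type="SEOChecker.Handlers.HttpHandlers.RobotsTxtHandler" />

    <!-- integrated mode: system.webServer/handlers -->
    <add name="SEOCheckerRobotTxt" path="robots.txt" verb="*" type="SEOChecker.Handlers.HttpHandlers.RobotsTxtHandler" preCondition="integratedMode" />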

    Best,

    Richard

  • Matt Taylor 873 posts 2086 karma points
    Aug 20, 2014 @ 13:14

    Hello Richard,

    With regard to the web.config, I have the following SEO Checker-related bits:

    <system.web>
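      <!-- classic-mode registrations: read by IIS6 or an app pool running the classic pipeline -->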
      <httpModules>
       <add name="SEOCheckerValidationqueueModule" type="SEOChecker.HttpModules.ValidationqueueModule, SEOChecker" />
       <add name="SEOCheckerUrlModule" type="SEOChecker.HttpModules.UrlModule, SEOChecker" />
      </httpModules>

      <httpHandlers>
        <add path="sitemap*.xml" verb="*" type="SEOChecker.Handlers.HttpHandlers.XmlSitemapHandler" />
        <add path="robots.txt" verb="*" type="SEOChecker.Handlers.HttpHandlers.RobotsTxtHandler" />
      </httpHandlers>
    </system.web> 

    <system.webServer>
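      <!-- integrated-mode registrations; note the preCondition="integratedMode" on the handlers below -->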
      <modules runAllManagedModulesForAllRequests="true">
        <remove name="SEOCheckerValidationqueueModule" />
        <add name="SEOCheckerValidationqueueModule" type="SEOChecker.HttpModules.ValidationqueueModule, SEOChecker" />
        <remove name="SEOCheckerUrlModule" />
        <add name="SEOCheckerUrlModule" type="SEOChecker.HttpModules.UrlModule, SEOChecker" />
      </modules>

      <handlers accessPolicy="Read, Write, Script, Execute">
        <remove name="SEOCheckerSitemapxm" />
        <add name="SEOCheckerSitemapxm" path="sitemap*.xml" verb="*" type="SEOChecker.Handlers.HttpHandlers.XmlSitemapHandler" preCondition="integratedMode" />
        <remove name="SEOCheckerRobotTxt" />
        <add name="SEOCheckerRobotTxt" path="robots.txt" verb="*" type="SEOChecker.Handlers.HttpHandlers.RobotsTxtHandler" preCondition="integratedMode" />
      </handlers>
    </system.webServer>
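
    A quick sanity check is to request the file directly and see whether the dynamically generated content comes back (the hostname here is just a placeholder):

    curl -i http://localhost/robots.txt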

    I notice that it's a fair bit more than is mentioned in the current documentation.

    Regards, Matt

  • Richard Soeteman 4046 posts 12899 karma points MVP 2x
    Aug 20, 2014 @ 13:15

    Hi Matt,

    That is great. Is your web server running in integrated mode or classic mode?
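
    If you're not sure, you can check the app pool's 'Managed pipeline mode' in IIS Manager, or from a command prompt (the app pool name here is just an example):

    %windir%\system32\inetsrv\appcmd.exe list apppool "MyUmbracoSite" /text:managedPipelineMode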

    Best,

    Richard

  • Matt Taylor 873 posts 2086 karma points
    Aug 20, 2014 @ 13:21

    Integrated, Richard.

  • Matt Taylor 873 posts 2086 karma points
    Aug 21, 2014 @ 11:49

    I think the problem here is that there is a delay between changing the config options and seeing the results, i.e. the Configuration issues report takes a while to recognise that things have changed.

    It seems to be working now with either the Umbraco 404 or the SEO Checker 404, but if you select the SEO Checker 404 option you must then choose a page for the domain. Seems obvious, but I missed that last bit.

    I mainly noticed the delay when testing the different robots.txt options. These all now seem to be working as expected; however, I did notice one annoyance.

    If you've previously right-clicked the domain and edited its robots.txt file, but then untick 'Enable dynamic robots.txt file' in the configuration, you lose any changes you've made the next time you re-tick that setting.

    Regards, Matt
