  • Luke Johnson 61 posts 80 karma points
    Mar 28, 2012 @ 18:38

    $url caused URL doubling

    Google Webmaster Tools informed me that all the file paths in my sitemap were invalid. They were being printed as: http://website.comhttp://www.website.com/page.aspx

    The following code was the problem:

    <loc>
    <xsl:value-of select="$url"/><xsl:value-of select="umbraco.library:NiceUrl(@id)"/>
    </loc>

    When I commented out <xsl:value-of select="$url"/>, the problem was resolved, and my Google Webmaster Tools errors went away.
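
    In other words, the working version is just the NiceUrl call on its own (shown here with the $url line commented out rather than deleted):

    <loc>
    <!-- <xsl:value-of select="$url"/> -->
    <xsl:value-of select="umbraco.library:NiceUrl(@id)"/>
    </loc>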

    I believe the problem occurred because I installed the "CultivDynamicRobots" package. Both that package and this sitemap package read HTTP_HOST, so the root of my domain may have been doubled as a result. (Just a guess... I'm no expert.)

    Just thought I'd share in case anyone else has experienced this problem.

  • Stefan van Leusden 21 posts 73 karma points
    Mar 11, 2014 @ 11:13

    Thanks for sharing, Luke, it resolved my issue. I guess it could also be an issue if you use Umbraco hostnames.
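
    A possible explanation (just a guess on my part, assuming a standard Umbraco setup): when hostnames are assigned to the site, or useDomainPrefixes is enabled in umbracoSettings.config, umbraco.library:NiceUrl(@id) already returns an absolute URL including the domain, so prepending $url doubles it. If you want the domain included without relying on that behaviour, umbraco.library:NiceUrlWithDomain is an alternative; a minimal sketch for the same <loc> element:

    <loc>
    <xsl:value-of select="umbraco.library:NiceUrlWithDomain(@id)"/>
    </loc>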
