$url caused URL doubling
Google Webmaster Tools informed me that all the file paths in my sitemap were invalid. They were being printed as: http://website.comhttp://www.website.com/page.aspx
The following code was the problem:
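The original snippet wasn't preserved here, but based on the description, the relevant sitemap XSLT fragment likely looked something like the sketch below. This is a reconstruction, not the package's actual code: the surrounding `<loc>` element and the `RequestServerVariables` call are assumptions based on how the domain could get prepended twice.

```xml
<loc>
  <!-- HTTP_HOST supplies the domain, e.g. http://website.com -->
  <xsl:value-of select="umbraco.library:RequestServerVariables('HTTP_HOST')"/>
  <!-- If $url is already an absolute URL (http://www.website.com/page.aspx),
       concatenating it after HTTP_HOST produces the doubled form:
       http://website.comhttp://www.website.com/page.aspx -->
  <xsl:value-of select="$url"/>
</loc>
```

If `$url` already resolves to an absolute URL, removing (or commenting out) one of the two `<xsl:value-of>` lines would stop the doubling.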
When I commented out <xsl:value-of select="$url"/>, the problem was resolved, and my Google Webmaster Tools errors went away.
I believe the problem occurred because I had installed the "CultivDynamicRobots" package. Both this sitemap package and CultivDynamicRobots read HTTP_HOST, so the root of my domain may have been prepended twice as a result. (Just a guess... I'm no expert.)
Just thought I'd share in case anyone else has experienced this problem.
Thanks for sharing, Luke, it resolved my issue. I guess it could also be an issue if you use Umbraco hostnames.