Dynamic Sitemap.xml solution using feed cache
So I tried something today, and I want to hear what you all think of the solution. I'm working with an SEO partner who wanted a physical sitemap.xml document at the root of the website, but I wanted that XML document to stay dynamic based on the content of the site. So I used the Razor from the Cultiv Search Engine Sitemap package to build me a nice sitemap, which can be found at http://myurl/sitemap.aspx. I then installed the Feed Cache package and used it to read that sitemap.aspx page and write a physical sitemap.xml file to the root of the site, set to run every 24 hours.
I did this more to see if I could do it; obviously I could have just pointed the robots.txt file at the sitemap.aspx page as the sitemap for the bots to use, but my SEO expert wanted to see a physical sitemap.xml document. Was this an experiment in futility, or do you think having a physical sitemap.xml document will really benefit the site?
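In case it helps to picture the moving parts, the refresh step boils down to something like this. This is just a rough sketch, not the Feed Cache package's actual code, and the URL and file path are placeholders:

```csharp
// Rough sketch of the refresh step: fetch the dynamically generated
// sitemap and save a physical snapshot at the web root. A scheduler
// (the Feed Cache package, in my case) re-runs this every 24 hours.
using System.IO;
using System.Net;

class SitemapRefresh
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // The dynamic sitemap rendered by the Cultiv Razor macro.
            string xml = client.DownloadString("http://myurl/sitemap.aspx");

            // Placeholder path; point this at your actual site root.
            File.WriteAllText(@"C:\inetpub\mysite\sitemap.xml", xml);
        }
    }
}
```

The robots.txt alternative would have just been the one-liner "Sitemap: http://myurl/sitemap.aspx".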
Chuck
Hi Chuck,
Honestly, from my own experience, Google generally doesn't care about the URL/filename of the XML Sitemap, as long as it is accessible from the website root and is in the correct structure/format.
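For reference, "correct structure/format" just means the standard sitemaps.org protocol. A minimal valid sitemap looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://myurl/</loc>
    <lastmod>2012-01-01</lastmod>
  </url>
</urlset>
```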
SEO people usually want the "sitemap.xml" file as that's the standard approach - it rules out any ambiguity and follows Google's handbook to the letter. Also, given that many of them are not developers, they may take comfort in having a "physical" file that can't be tampered with, whereas a dynamically generated sitemap can seem like voodoo magic to them.
I don't think your efforts were futile - it's good to keep customers/partners happy rather than start a holy war over something minor.
Cheers, Lee.
I concur with Lee. Google Webmaster Tools will accept a sitemap in a subdirectory or under an alternate name, as long as it is accessible and in the correct structure/format. I can report first-hand that Google Webmaster Tools and Bing Webmaster Tools have confirmed submitted XML sitemaps that are NOT in the root or named sitemap.xml, and have indexed the pages as expected. Our typical dynamic sitemap, based on the Umbraco content tree, lives at www.mysiteurl.com/sitemapxml.
From an SEO perspective, I don't believe extra effort just to produce a file named sitemap.xml is worth it. I do believe a dynamic sitemap is very much worth the effort, particularly where content is frequently added or changed, such as blogs, user-submitted/generated content, and new product listings. The dynamic sitemap serves as an up-to-date notice to Google of new or changed content; my assumption is that Google recognizes the <lastmod> tag in the sitemap.
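For anyone wanting to roll their own, the core of such a sitemap macro is only a few lines. Here's a rough Razor sketch - not the Cultiv package's actual code - assuming the standard DynamicNode members (AncestorOrSelf, DescendantsOrSelf, Url, UpdateDate). It skips niceties like umbracoNaviHide checks, and omits the optional <?xml?> declaration for brevity:

```csharp
@* Rough sketch of a dynamic sitemap macro over the Umbraco content tree.
   Serve it with a text/xml content type from the hosting template. *@
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
@foreach (var page in Model.AncestorOrSelf().DescendantsOrSelf())
{
    <url>
        <loc>http://www.mysiteurl.com@(page.Url)</loc>
        @* UpdateDate is the node's last edit date - this is what gives
           Google the <lastmod> signal mentioned above. *@
        <lastmod>@(page.UpdateDate.ToString("yyyy-MM-dd"))</lastmod>
    </url>
}
</urlset>
```

Because it walks the content tree on every request, new or changed pages show up in the sitemap immediately, with no snapshot to regenerate.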
Hope that helps you feel better about your effort.
Loyan