  • Graeme W 113 posts 289 karma points
    Sep 21, 2014 @ 18:00

    Anything in Umbraco config that would stop robots.txt being accessed by Google?

    I've set up the new version of our site in Google Webmaster Tools, as I wanted to upload sitemaps for the new URLs. However, I've realised that Google can't fetch the robots.txt file to index the new site. I've tried several variations, including having no robots.txt file at all and allowing everything.
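    (For reference, the standard allow-all robots.txt is just two lines — an empty Disallow permits crawling of everything:)

    ```
    User-agent: *
    Disallow:
    ```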

    We have moved to a new host, so it's possible something is configured at their end. However, I can't get in touch with them until tomorrow, and I was wondering if there are any settings within Umbraco that could cause this.

    Can anyone help?

    thanks

  • Graeme W 113 posts 289 karma points
    Sep 21, 2014 @ 18:39

    It looks like it must be something to do with Umbraco. I can create another website on the same hosted server and its robots.txt can be accessed OK. Help! :-)

  • Graeme W 113 posts 289 karma points
    Sep 21, 2014 @ 22:17

    Problem solved! The .txt extension was configured in the IIS mappings to go through aspnet_isapi.dll. This was probably because individual extensions need to be mapped like that for file types to be protected when using the Media Protect package on IIS 6. It means we won't be able to protect .txt files (can't see why we'd want to!), but that's a small price to pay to be indexed by Google!
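    For anyone hitting the same symptom on IIS 7+ (where handler mappings live in web.config rather than the IIS 6 Manager UI as above), a sketch of the fix would be routing *.txt back to the static file handler. The handler names here are illustrative, not taken from this site's actual config:

    ```xml
    <!-- Hypothetical web.config fragment: stop .txt requests going through
         ASP.NET and serve them with the built-in static file module instead. -->
    <configuration>
      <system.webServer>
        <handlers>
          <!-- Remove whatever managed mapping is intercepting .txt
               ("TxtThroughAspNet" is a placeholder name). -->
          <remove name="TxtThroughAspNet" />
          <!-- Serve .txt files (robots.txt included) as plain static content. -->
          <add name="TxtStatic" path="*.txt" verb="GET,HEAD"
               modules="StaticFileModule" resourceType="File"
               requireAccess="Read" />
        </handlers>
      </system.webServer>
    </configuration>
    ```

    On IIS 6 itself, the equivalent change is removing the .txt entry from the site's application extension mappings in IIS Manager, which is what solved it here.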
