

A simple, lightweight .NET solution for websites that need easy management of multiple robots.txt files.

Once you have downloaded ChordsTech.MultiSiteRobots.dll, add the following handler registration to your web.config to start intercepting robots.txt requests.


   <handlers accessPolicy="Read, Write, Script, Execute">
     <remove name="ChordsTech.MultiSiteRobots" />
     <add name="ChordsTech.MultiSiteRobots" verb="*" path="/robots.txt" type="ChordsTech.MultiSiteRobots.RobotsTxt" />
   </handlers>
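
For context, here is a minimal sketch of where this registration typically sits in an IIS web.config; the surrounding `configuration` and `system.webServer` elements are assumed standard IIS placement, not taken from the package documentation.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <!-- Handlers run in order; the remove guards against duplicate registration. -->
    <handlers accessPolicy="Read, Write, Script, Execute">
      <remove name="ChordsTech.MultiSiteRobots" />
      <add name="ChordsTech.MultiSiteRobots" verb="*" path="/robots.txt" type="ChordsTech.MultiSiteRobots.RobotsTxt" />
    </handlers>
  </system.webServer>
</configuration>
```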



You can then add robots.txt files for each of your domains.
The code will intercept the robots.txt request and match the domain to the correct robots.txt file.
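
The package's actual handler implementation isn't shown on this page, but a handler of this kind can be sketched as follows. This is a hypothetical illustration only: the class name `MultiSiteRobotsHandler`, the `~/Robots/robots.{host}.txt` file-naming convention, and the fallback content are all assumptions, not the package's documented behaviour.

```csharp
using System.IO;
using System.Web;

// Hypothetical sketch of a robots.txt-intercepting handler.
// The real ChordsTech.MultiSiteRobots.RobotsTxt handler may use a
// different file layout and naming convention.
public class MultiSiteRobotsHandler : IHttpHandler
{
    public bool IsReusable => true;

    public void ProcessRequest(HttpContext context)
    {
        // Match the requested host to a per-domain robots.txt file,
        // e.g. a request to example.com maps to ~/Robots/robots.example.com.txt
        var host = context.Request.Url.Host;
        var path = context.Server.MapPath($"~/Robots/robots.{host}.txt");

        context.Response.ContentType = "text/plain";
        if (File.Exists(path))
        {
            context.Response.WriteFile(path);
        }
        else
        {
            // Assumed fallback when no domain-specific file exists.
            context.Response.Write("User-agent: *\nDisallow:");
        }
    }
}
```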

Tested predominantly in Umbraco 8, but it should work with all versions.



Package Compatibility

This package is compatible with the following versions as reported by community members who have downloaded this package:
Untested or doesn't work on Umbraco Cloud
Version 8.10.x (untested)
Version 8.9.x (untested)
Version 8.8.x (untested)
Version 8.7.x (untested)
Version 8.6.x (untested)
Version 8.5.x (untested)
Version 8.4.x (untested)
Version 8.3.x (untested)
Version 8.2.x (untested)
Version 8.1.x (untested)
Version 8.0.x (untested)
Version 7.15.x (untested)
Version 7.14.x (untested)
Version 7.13.x (untested)
Version 7.12.x (untested)
Version 7.11.x (untested)
Version 7.10.x (untested)
Version 7.9.x (untested)
Version 7.8.x (untested)
Version 7.7.x (untested)
Version 7.6.x (untested)
Version 7.5.x (untested)
Version 7.4.x (untested)
Version 7.3.x (untested)
Version 7.2.x (untested)
Version 7.1.x (untested)
Version 7.0.x (untested)
Version 6.2.x (untested)


Package Information

  • Package owner: SinkyPars
  • Created: 04/05/2020
  • Current version: 1.0.0
  • .NET version: 4.6.2
  • License: MIT
  • Downloads on Our: 16
  • Downloads on NuGet: 205
  • Total downloads: 221