Load balancing file replication
What are folks using to perform file replication across load-balanced Umbraco instances (well, any website for that matter)?
I'm aware that using SAN storage is preferable, but that's not an option in the environment I'm using (Rackspace Cloud), so I need to replicate files across servers.
DFS is the "out of the box" solution, but i) there are possible latency issues, and ii) I don't want to run a full AD domain for my webservers.
Robocopy/etc will work on a scheduled basis, but that'll leave gaps in my filesystem content (eg media items) until the next sync happens.
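(For anyone weighing that option: the Robocopy variant would just be a scheduled task running a mirror command along these lines. Paths, share name, and retry settings here are illustrative, not from a real setup — the sketch just prints the command a Windows task would execute.)

```shell
# Sketch only: the Robocopy command a Windows scheduled task would run.
# /MIR mirrors the tree (including deletions); /R and /W keep retries short
# so one locked file doesn't stall the whole pass. Paths are made up.
SRC='C:\inetpub\wwwroot\media'
DEST='\\www2\media'
CMD="robocopy $SRC $DEST /MIR /R:2 /W:5"
printf '%s\n' "$CMD"
```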
Just interested to see how folks are doing this in practice...
Phil
I assume you're talking about media?
You've just named the most common practices.
The best solution is SAN.
Alternatively you could use a CDN.
@fastchicken on Twitter can tell you about media on a CDN.
For one of our clients we used PeerSync to keep servers up to date - http://www.peersoftware.com/products/peersync/peersync.aspx
We had a master Umbraco instance, and when a file is updated, PeerSync listens for the change and replicates it across as many servers as you have set up. They used it in a cloud environment and had 5 servers, I think.
Cheers
For the record, I have this working now, using DeltaCopy (a Windows wrapper around rsync) running as a scheduled task on the media folder every minute.
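Under the hood DeltaCopy is just driving rsync, so the per-minute job boils down to a one-way mirror roughly like this (paths here are illustrative temp dirs so the sketch is self-contained; the real destination would be the DeltaCopy/rsync service on the other box, e.g. `www2::media`):

```shell
# Sketch of the one-way media mirror. Illustrative paths only; in a real
# DeltaCopy setup the destination is an rsync daemon module on www2.
SRC=$(mktemp -d)   # stand-in for the media folder
DEST=$(mktemp -d)  # stand-in for the replica on the other server
echo "fake image" > "$SRC/pic.jpg"

if command -v rsync >/dev/null 2>&1; then
  # -a preserves attributes, -z compresses, --delete mirrors removals too
  rsync -az --delete "$SRC"/ "$DEST"/
else
  cp -a "$SRC"/. "$DEST"/  # fallback so the sketch still runs without rsync
fi
```

The `--delete` flag is what makes it a true mirror; without it, anything removed from the media section lingers on the other server forever.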
It seems to work in testing, although I'm not entirely enthusiastic about this approach as it means only one server can run the backoffice, and that leads to a mess of URLRewrite rules to stop duplicate content problems when using www1/www2 hostnames.
I'd much prefer SAN and just have both IIS instances pointing to the same place, but we have to work with what we have...
Just updating this a week on: I've since switched to using Unison for the file synchronisation. It's a bit messy to use, as it's a command-line tool with Unix roots, but importantly it offers two-way sync, meaning I can dispense with the URLRewrite stuff, as changes can now be made from either server.
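For anyone trying the same thing, the setup boils down to a small Unison profile per synced folder; something along these lines (hostname, port, and paths are made up for illustration):

```
# ~/.unison/media.prf -- illustrative profile, not an exact config
root = C:/inetpub/wwwroot/media
root = socket://www2:4444//inetpub/wwwroot/media
auto = true      # don't prompt for non-conflicting changes
batch = true     # run unattended, suitable for a scheduled task
prefer = newer   # on a genuine conflict, keep the most recently modified copy
log = true
```

The other server runs `unison -socket 4444` to listen, and the scheduled task just runs `unison media`. The `prefer = newer` line is what resolves the occasional genuine conflict without manual intervention.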
I created some code that creates a text file whenever changes are published. Then a batch script checks for that file every 15 seconds; if it finds it, it triggers an FTPsync.
Unfortunately, something about FTPsync occasionally brings down the servers. I'm looking for something else to do the job.
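The watcher half of that is simple enough to swap the transfer tool out from under; sketched in shell below (the flag filename and the sync command are placeholders — a Windows batch equivalent has the same shape):

```shell
# Hypothetical publish watcher: the CMS drops a flag file on publish,
# and this check picks it up and fires one sync per publish event.
FLAG="publish.flag"

do_sync() {
  # Placeholder for the real transfer (FTPsync, rsync, whatever replaces it)
  echo "sync triggered"
}

check_once() {
  if [ -f "$FLAG" ]; then
    rm -f "$FLAG"   # consume the flag before syncing, so a re-publish
                    # during the sync leaves a fresh flag for the next pass
    do_sync
  fi
}

# Production would loop:  while true; do check_once; sleep 15; done
# One pass shown here:
touch "$FLAG"
check_once
```

Deleting the flag before the sync (rather than after) is deliberate: a publish that lands mid-sync re-creates the flag and gets picked up on the next pass instead of being lost.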