Creating a content editing and content delivery environment
Hi
We have a client that has a requirement to run an Umbraco website in a split environment. They want to do all of their content management on an internal installation behind a firewall and then push the content to a content delivery server that is only used for serving content. The content delivery server will not be accessible to content editors; as such, all content and media management must be performed on the content authoring server and "pushed" from there to the content delivery server.
Initially we thought Courier would be suitable, but now we're not so sure. As far as we're aware, Courier is not going to work very well with updates and deletions of content (happy to be proved wrong on this).
Now we're thinking of writing some custom logic that serialises content from the content authoring environment and sends it to a service on the content delivery server for deserialisation and publishing.
Are we missing a trick here (as this sounds like a massive job)?
I think there are a couple of questions that might help narrow down your options.
1. Will the final website perform any dynamic tasks? i.e. contact forms and the like?
2. Will the final website have any user generated content? Using Contour, or something else that will generate nodes on the live environment?
1: Dynamic tasks yes
2: Though nothing that will be creating content on the CDN.
I'm interested in the details as to why you think Courier won't work.
I'm in need of a similar solution with my first Umbraco site and was hoping Courier would fill the gap.
Hi Janusz,
Curious if you found a workable solution for this?
Cheers, Lee.
In my head, there are pretty much 3 options.
1) Courier. Like Janusz said, I'm not sure if it handles deletes
2) Umbraco has inbuilt support for web farms, so you could have your main install push out to the CDN that way
3) Write something to package up the changes in the local environment, i.e. take a DB diff and package up any changed files, then just apply it on the live server. As you haven't got any user generated content, you could just keep overwriting the CDN versions.
Not much in the way of detail, but a few options to look into.
Matt
@Matt: With the web-farms approach, would the master instance push out to the slave instances on publish? (Wondering if there was a way to separate the "publish" from the "push"?)
@Lee from what I remember yes. Being meant for web farms, it expects you are trying to keep a few instances in sync, so that's what it does.
As another idea, you could maybe do something with the XML cache? i.e. push that to the server and have something to sync it to the database, seeing as you are never going to create anything on the CDN servers.
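If you went the XML cache route, the main gotcha would be swapping the file in atomically so the live site never reads a half-written copy. A rough sketch of the write-then-rename pattern (the file name and cache location are assumptions; this is language-agnostic, just shown in Python):

```python
import os
import tempfile

def push_cache(xml_bytes: bytes, dest_path: str) -> None:
    """Write the new XML cache next to the live file, then swap it in atomically.

    Readers either see the old file or the complete new one, never a partial write.
    """
    dest_dir = os.path.dirname(dest_path) or "."
    # Write to a temp file on the same filesystem so the rename is atomic.
    fd, tmp_path = tempfile.mkstemp(dir=dest_dir, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(xml_bytes)
            f.flush()
            os.fsync(f.fileno())  # make sure bytes hit disk before the swap
        os.replace(tmp_path, dest_path)  # atomic rename on POSIX and Windows
    except Exception:
        if os.path.exists(tmp_path):
            os.remove(tmp_path)  # don't leave orphaned temp files behind
        raise
```

The same pattern would apply whatever transport pushes the cache to the delivery server; the point is only that the final rename is the one visible step.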
Matt
Sorry for the late reply on this one, manic project delivery time again.
The solution we are currently looking at is to use database replication to keep the databases in sync, with the content entry database essentially pushing transactions to the CDN server database. On the CDN server we have a process that monitors the database for new transactions; when a new one occurs, it starts rebuilding the XML content cache on a separate thread. Once rebuilt, it swaps the live content cache with the newly rebuilt one (i.e. trying to ensure that the website isn't blocked whilst the cache is being rebuilt, which could be quite frequent).
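The swap itself can be reduced to an atomic reference exchange: rebuild into a fresh structure on the worker thread, then repoint the live reference, so request threads are never blocked mid-rebuild. A rough sketch of the pattern (the transaction polling and load function are assumptions, just to illustrate the shape; shown in Python but the idea carries over):

```python
import threading
import time

class ContentCache:
    """Holds the live cache behind a single reference that is swapped atomically."""

    def __init__(self):
        self._live = {}                # current cache; reads never block on rebuilds
        self._lock = threading.Lock()  # serialises concurrent rebuilds

    def get(self, key):
        return self._live.get(key)     # plain reference read; no lock needed

    def rebuild(self, load_all):
        """Rebuild into a fresh dict off to the side, then swap it in as live."""
        with self._lock:
            fresh = dict(load_all())   # the slow part happens on a private copy
            self._live = fresh         # atomic swap; readers see old or new, never partial

def watch_for_transactions(cache, last_seen_id, fetch_new_ids, load_all, poll_seconds=5):
    """Poll the replicated database for new transaction ids; rebuild on change."""
    while True:
        new_ids = fetch_new_ids(last_seen_id)  # hypothetical query on the transaction table
        if new_ids:
            last_seen_id = max(new_ids)
            threading.Thread(target=cache.rebuild, args=(load_all,)).start()
        time.sleep(poll_seconds)
```

The lock only stops two rebuilds racing each other; readers never take it, which is what keeps the site responsive during frequent rebuilds.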
Still not overly happy about it, but I think leveraging the database replication will save us a lot of headaches.
We'd still need to sync media though (probably via DFS or similar).
Any thoughts?
Also Steve
Courier by its very nature is designed not to break things on the live server. It is designed to deploy things, not keep things in sync. As such it's good at pushing, but it's designed not to delete things.
Thanks. I understand the rationale. However deleting things is not the only way to break things.
The same argument would prevent allowing the deployment of css or js wouldn't it?
Of course I'm new to Umbraco, so apologies if I sound clueless and miss the obvious.
As for DB replication, I used a similar solution for a DNN site, where I triggered a replication job via a "publish" button using RMO. I then synced the asset directories for images, etc. DNN is cleaner though, as the content is purely from the DB (no XML cache to deal with).
Hey Steve
Courier handles files separately. Masterpages, CSS files, JavaScript files, macros, XSLTs etc. can all be deployed and will overwrite the corresponding item on the destination server (i.e. it's only the developers who are going to be editing these files, hopefully in a development environment, so there is no issue with overwriting these files on the live server). It's only content that is handled in this way, as content has more issues with concurrency.