Now then fellas. This one has been raised a few times but we've yet to see any really useful suggestions as to how it could be done. So I wanted to throw this question out to the core team to see if they could offer any insight, ideas or secret know-how on how to crack this nut.
I run a little code shop, we score 10 on the Joel test (we are too small to get the whole 12), we have a build server, we use source control (SVN), and we like automated tests. We regularly work on projects together, running them locally on our own machines, checking in/out changes, and everything works well.
Then I found this great little CMS that is kick ass powerful, great community and very fun to work with. I beaver away at nights and weekends learning more and more about the ins and outs. I build a site or two. Then I put them live. Then I decide to show the rest of the gang the new CMS that I've worked with enough to say "this is the one". First question...how can we all work on that together? I fell at the first hurdle, I had no good answer and I'd broken all my own rules!
So, how on earth can I have multiple developers working on the same lot of code? If a client reports a problem, how can I get a local version running quickly so I can hack around with it without affecting the live site? How do I develop new functionality and get it live while clients (and other developers) are updating the site (either live or, in the case of developers, locally)? How do I get this thing into SVN when most of the CMS is a database!
Now I've not just come up with the question then dumped it on you. I've given it some thought but I still can't seem to see a clear way of getting it to work. Here are some of the ideas so far and the problems that come with them.
* Everyone has a cut of all the images/css/templates, etc. locally but points to live or a dev server. This dev server "could" be synced with new content from live overnight or on request using some funky script - I've no idea which 30 tables I need to sync, or how you'd know which records were "dirty" and would need copying over, especially without overwriting anything you've done locally. Tricky to overcome without some more in-depth knowledge, a recurring problem as you'll see.
* Have a local copy and every time you want to work on the DB you do a sync with live. Trouble is, let's say you create a new template which the DB gives the id of 101. Now someone else working on another problem also creates a template for a different page, and that template gets the id of 101 in his DB. When we try to check it all back in, our template ids clash. Plus what about the client, who all this time has been adding content of their own, so your DB dump now overwrites all their new content items... not the best.
* Have a local version of the DB and write some code that runs before the creation/update of each doc type, template, macro or content item, which dumps out the SQL for that addition/update. These can then be SVN'ed in when you are ready to commit your changes and can be run, similar to migrations in Rails, to bring your DB up to date. Tricky... so many tables. To truly work like migrations in Rails you need some DB mods to record migration numbers.
* Have a script that can grab JUST everything but the actual content; then you can do a full sync from live to your locally running version (again scripted for ease), but when you are done you can just pull out the data for the templates, macros, etc.
* You could create a bunch of "spare" templates, macros, etc. for each developer. Then when you need to create a new template you can just rename one of your spares. That way you are not clashing with anyone else's created templates. You still have the problem of getting these out from your local DB to live/dev though.
* Work on live and be really, really, careful...but what if you wanted to develop a macro on the homepage, would you not have to copy the homepage, copy the content, not publish it, hit it directly (not even sure if you can do this without publishing?). Not the smoothest of methods of working.
* Create a package for every mod you do and SVN in the package; these can then act like your migrations. Packages are a very manual process and show you everything, not just the bits you've been working on. I don't know of a quick and easy way of generating packages. What if your package needs to make a change to something that someone else's package messes with - how do you sort that one out? Are package installs smart enough to work out clashes between two templates with the same name, etc.? Admittedly I don't know enough about packages to answer that one.
So hopefully you can see it's defo something I've been pondering, but... I still don't have an answer. Does anyone have any other ideas, is there anything in the pipeline to solve this, is there a best practice for doing what I'm after?
All help gratefully received :)
Usually in small teams we work as follows (for dev environment)
We sync dev environment to live using Courier (umbraco pro). I've discussed this before http://our.umbraco.org/forum/getting-started/installing-umbraco/2918-Update-an-Umbraco-website?p=0#comment11311
"If a client reports a problem how can I get a local version running quickly" - Using our approach we find we just have to download the live DB, get our code out of source control and republish the site... done.
I use the dev environment for developing and testing new functionality only. I very rarely care if the content is the same as is on the Live site - this removes all the issues to do with node ID's, syncing content etc.
Rather than passing node IDs as parameters in macros, I prefer to set them in the web.config. That way I can copy updates to templates without having to check parameter IDs are correct.
With regards to Macros, document types, creating new templates and changes to the web.config I will either do these manually or package them up. I come from a background of working in an environment where I wouldn't even have access to the Live environment so this would be standard procedure i.e. create manual instructions that someone else can follow. Once the macro or template is created (even if it is just a stub) changes can be easily copied over as part of a release.
Document types are more difficult to manage so manual changes for small changes and package for larger/new document types works for me.
Rails migrations are geared more toward schema changes, whereas document type, template and macro IDs are data changes. While you can do data population with migrations, I don't think it would resolve the ID issues, so it would need to be adapted in some way.
Easy releases of changes is one of the main things I would like to see for Umbraco (for code changes only - I think having Courier is the best option for content syncing if that is required)
Currently I'm also in the process of setting up a proper dev environment.
I think a centralized dev database is the best way to go.
Furthermore everyone runs IIS locally and checks in and out all the files locally. In IIS6 you can export the website and put the config file in SVN as well for easy setup on additional dev boxes. In IIS7 you can choose to use a shared website config for all your dev boxes or you can use the Web Deployment tool to generate a website package and put that in SVN to do easy installation on other dev boxes: http://www.iis.net/extensions/WebDeploymentTool
Also, I think you should never change any document type or developer-related stuff directly on your production environment. And I don't think you need the exact same content and data which is on your production environment on your dev environment. Like Paul says, if you really need the same content on dev, just make a backup of the production database and restore it over your dev database, or restore it to a temporary place on dev.
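If it helps, a minimal T-SQL sketch of that backup/restore refresh could look like the following. The database names and file paths are examples only, not from this thread:

```sql
-- Sketch: refresh the dev database from a production backup.
-- Database names and file paths below are examples only.
BACKUP DATABASE UmbracoLive
    TO DISK = 'D:\backups\UmbracoLive.bak' WITH INIT;

RESTORE DATABASE UmbracoDev
    FROM DISK = 'D:\backups\UmbracoLive.bak'
    WITH REPLACE,
         MOVE 'UmbracoLive'     TO 'D:\data\UmbracoDev.mdf',
         MOVE 'UmbracoLive_log' TO 'D:\data\UmbracoDev_log.ldf';
```

Remember to point the dev site's web.config connection string at the restored database afterwards.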
If you don't have a pro license you probably need to export all the items you create in Umbraco (document types, macros etc.); you can put these files in SVN as well. Maybe in some kind of versioning folder structure so you can see what you still need to release to your acceptance and/or production environment.
Content is another story. You could probably script content manually but I do not have a lot of experience with scripting Umbraco content yet.
Seems Courier might be an important missing piece of my puzzle. Seems it can do all the syncing up and id managing for us.
I'm liking the idea of having a separate dev DB and doing NO dev work on live at all; that makes sense and is no different to what we already do for other non-Umbraco sites I suppose.
Still causes problems when you want to branch though; different mods could still interfere with a shared dev DB, that's my worry.
I'll keep my thinking cap on and have a look to see what we can do. I think I'm going to have to play around with some of your suggestions and see how they feel and work in practice. I'm sure there is a best practice way of doing it, I just need to find it.
Cheers for your input and please keep it coming.
Ok, so we started a new project and from the off worked towards getting it all setup for multiple developers.
Here's what we did for reference:
All good, right? Well, we did have one problem: if I created a new template, the code for it lives on my machine, but the fact it exists is visible to all other developers. If another developer clicks on the template "before" I've checked it in (so it does not exist on his machine yet), his version of Umbraco creates a template for him, but it messes up the template creation and includes the default template code twice in the same template, one nested in the other. This then plays havoc with his install and with our check-in process as we get conflicts.
But that's the only problem we've hit so far; we've only been at it for half a day though :)
Just a question for the Umbraco guys related to this. Do you have any plans to release Courier as a standalone developer tool? I think it would be much better if a developer could buy a license and use it on all the sites they manage, rather than having to get each client to agree to the costs.
You could solve the "template" problem with a little communication I guess. Just do a standup-meeting every morning where you discuss what you have been working on and what you will be working on.
I have been looking into the MS Web Deployment tool for IIS7, btw. Might also come in handy for a development environment. With this tool you can export IIS7 settings and even your database (files on the filesystem are also possible, but you have these in SVN as well). This will result in a package which you can import on other systems.
I did create a package but I didn't have time yet to import it on another dev box.
@Matt: We have plans to release Courier as a stand alone package (without Umbraco PRO), but it'll still be licensed per box or per site.
I thought I would just say what we have done, as at one time I was the only Umbraco developer across our offices, but now we have 3 across 3 offices and were having the same problems.
Working on issues from live sites.
Wish I had 70 karma points so I could vote up this thread.
As already mentioned, we also found a shared database to be essential (although this model never really worked with remote connections due to the latency). I have heard talk of setups where each developer has their own range of umbraco NodeIds so as to simplify db merges...
Anyone got any thoughts on CI setups?
When working with a different range of node IDs per developer (and having a local database per developer so you don't run into latency problems), you could reseed all the databases so you make sure you won't be in the same range of node IDs as another developer.
Check out this T-SQL command:

dbcc checkident('tablename', reseed, <newid>)
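To make that concrete, here's a hedged sketch of giving each developer's local DB its own range; the table name and range values are just examples:

```sql
-- Give each developer's local copy a non-overlapping identity range so new
-- rows never collide on merge. The ranges here are arbitrary examples.
DBCC CHECKIDENT ('umbracoNode', RESEED, 10000)    -- run on developer A's DB
-- DBCC CHECKIDENT ('umbracoNode', RESEED, 20000) -- run on developer B's DB
```

Note that RESEED sets the current identity value, so the next inserted row gets the new seed + 1.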
An interesting article just popped up in the umbraco blog section that might be of interest.
Roel, great idea (I understand it even more after reading Matt's linked article). Matt, that's an awesome link; I wonder how they are getting on with it?
Reseed could be the answer then: use it to create a buffer of ids for dev, live and even per developer. SQL Data Compare has a command line interface if you get the expert edition. Could be possible to knock up a script that could do the merging for you... hmmm, need much more time to have a noodle with this one. At least it's in my head now so I can let it rattle around for a bit.
Just moved to a newer, bigger, nicer office so not had as much time as I normally like on these sort of problems.
Funny you should ask how those guys are getting on, I just noticed they have a follow up post.
Good to see you yesterday, and sorry I couldn't stay as long as I'd have liked.
So, as we discussed, I reckon the best tack would be to have the dev DB seeded at 0 and, the first time the DB goes live, reseed all tables to something like 10000 (you'd probably also want to reseed any staging server DB as well). This way, anything the client creates will have IDs 10000+ and anything below this is content / doc types / etc. imported by yourself. It should hopefully then just be a case of doing a DB compare to get the differences and generate a SQL script.
Also mentioned yesterday was another DB compare tool by xSQL Software, who offer a lite edition of their tools which is fully functional and completely free if your DB is a SQL Express DB. You can find out more here: http://www.xsqlsoftware.com/LiteEdition.aspx
Hopefully this will be a good start to finally getting this nailed.
How about this for a SQL script to reseed all tables with an identity column?
DECLARE @tblName varchar(200)
SET @tblName = ''

WHILE EXISTS
(
    SELECT 1
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo'
    AND COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsIdentity') = 1
    AND TABLE_NAME > @tblName
)
BEGIN
    SELECT @tblName = MIN(DISTINCT TABLE_NAME)
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo'
    AND COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsIdentity') = 1
    AND TABLE_NAME > @tblName

    EXEC('DBCC CHECKIDENT ('+ @tblName +', reseed, 10000)')
END
Slight improvement. Now only reseeds tables that haven't already been reseeded.
DECLARE @seed int
DECLARE @seedStr varchar(10)
DECLARE @tblName varchar(200)

SET @seed = 10000
SET @seedStr = CAST(@seed AS varchar(10))
SET @tblName = ''

WHILE EXISTS
(
    SELECT 1
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo'
    AND COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsIdentity') = 1
    AND TABLE_NAME > @tblName
    AND CAST(IDENT_SEED(TABLE_NAME) AS int) < @seed
)
BEGIN
    SELECT @tblName = MIN(DISTINCT TABLE_NAME)
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo'
    AND COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsIdentity') = 1
    AND TABLE_NAME > @tblName
    AND CAST(IDENT_SEED(TABLE_NAME) AS int) < @seed

    EXEC('DBCC CHECKIDENT ('+ @tblName +', reseed, '+ @seedStr +')')
END
Nice work gents, I am looking forward to putting this into practice.
My plan is to spring for SQLDataCompare Professional and to run the Sync via Command Line from NANT / CruiseControl.net. I will automate the comparison of our dev and staging databases, and save the results to script files that can later be run against production.
The tool is a little pricey but I am already using this method for schema changes using SQLCompare on other projects and it works beautifully.
If you give this a go before us, it would be cool if you could post your build scripts.
All the best
I definitely will post them. Right now, I'm still learning to walk with Umbraco (our first project). Once I get more comfortable (and committed to using it), I'll likely get to work on the build process. Could be pretty soon, things are going fairly well, and the community has been great so far.
This brings up a question:
The part of Umbraco that I am least familiar with is the DB schema, knowledge of which would seem to be pretty crucial in creating the SQLDataCompare project.
When creating scripts, I would want to make sure not to move any "page content" (documents), only structural objects like document types/properties, dictionary items, templates, macros etc. (I'm not even sure what is in the DB vs. the file system at this point.)
Are you fellas on the same page, or are you more interested in moving the actual content from environment to environment? I notice that the articles on FarmCode are more oriented toward migrating content.
Sounds like we are all singing off the same hymn sheet :)
I'm not really all that familiar with the DB tables either, which is a bit of a drag. Might be worth posting a question to the Core forum to see if we can get some insider info on this one?
Matt great script. I'm hoping to have some time on this on this week.
Keep the ideas coming though. Build servers I'm OK with; I'm a bit rusty on all the DB stuff though :)
Just an update to this thread: we have created a command line utility that installs packages. We are calling it from nant/cruisecontrol, and storing our document type and other changes as packages in our SVN repo.
We are just starting to test today. I hope to have a CodePlex project started in time for the new year.
How did your testing go? I've had a lot going on in my life of late that's been more important than Umbraco. Now that's behind me, I'm trying to get back into it.
Just catching up on where you got with this one.
Are you storing complete additions as packages then committing them? Tell us some more if you can.
SMercer and others,
I'm just looking to get started with Umbraco myself and am considering our source control options as well. I have a console app that I use to manage environment setups for different dev environments. It can restore backups and run SQL scripts and code against whatever APIs are necessary. (This also may make it to CodePlex soon.) I was hoping to be able to use a database restore of a clean install, then somehow load the appropriate packages and custom document types, and then use the new Linq API to create the folders and default documents that I need. I'm just getting started but was really hoping that someone else had solved most of these issues by now, and from this thread it doesn't really seem like it. I'm used to being able to completely rebuild a site from SVN whenever I need to. Anyone think I'll be able to get to that point with Umbraco? Sorry to ramble, but I'm just a little nervous as I'm about to start a very large project with whichever CMS I pick.
Randy, there is no uber solution to this "yet" but it is something several of us are/have been looking into.
Your best bet is to follow the steps covered so far.
You can build a site from SVN but you need to:
* Take a SQL backup and SVN that in
* SVN in all your templates, master pages, css, js etc.
* Create your DB and import the backup
* Change your connection string in web.config
That should do it. Bit of a pain, and it's something we want to automate. We also want to do the reseed idea Matt's been playing with so that dev and live can play nice with each other when you need to push stuff live.
Packages are another option but again you need to manually create those and then check them in and then manually install them. Not ideal.
Just a note on the reseeding approach: I did try my script on a live deploy and it didn't quite work (couldn't create any new pages), so I'm going to have to look into the specific tables used, I think, rather than doing a global sweep. So until I find the right tables (which I need to find soon, as I have a site going live and want to make it as easy to manage as possible), please nobody use the script I posted previously.
Hmmmm, weird. Just created a copy of a DB, reseeded all tables and it worked fine. The only small issue is that some of the tables have column types of "tinyint" and "smallint" which can't go high enough. Do you think there would be any problems if we wrote a script that changed all "tinyint" and "smallint" columns to "int" instead?
Which tables are affected? Anything major? If they are using tinyint for identity fields then might be safe to leave be and not reseed them?
You might be able to get away with upping from tinyint to smallint as I'm guessing that in the code it would just be mapping to c# int type anyway which can store as much as a SQL smallint. Might be a developer question.
Have asked Niels via the medium of the tweet and will await a response :)
The tables and columns are as follows:
Regarding the mapping to int, isn't a C# int a 32-bit signed integer? Which I'm pretty sure maps to a SQL int (I believe tinyint is 8-bit, smallint is 16-bit and int is 32-bit). If this is the case, I would imagine it is pretty safe to make them all ints.
I have managed to change these on my dummy table, and it seems to be running fine (although I can't claim to have tested it fully). The only problem I had is that there are a number of constraints on the tables, so right now the easiest way is to change these manually. I should be able to use a SQL diff tool though to generate a schema change script (I just need to make sure there is data in all the tables, so that the diff tool takes this into account).
I guess it depends how far down this road we want to go, as we shouldn't really need to reseed everything, but without knowing more about the DB structure, it seems the safest option.
As this thread is to do with multiple environments, I thought I'd share something else I'm trying out. I've started using a tool called XmlConfigMerge which you can find on CodeProject. Combined with a post-build script, you can use this to configure your environment depending on the build mode and/or the computer you are building on. My post-build script is currently this:
REM #################################################
REM # Copy files
REM #################################################

xcopy /s /y "$(ProjectDir)Css\*.css" "$(WebProjectOutputDir)\..\Web\css\"
xcopy /s /y "$(ProjectDir)Images\*.jpg" "$(WebProjectOutputDir)\..\Web\images\"
xcopy /s /y "$(ProjectDir)Images\*.gif" "$(WebProjectOutputDir)\..\Web\images\"
xcopy /s /y "$(ProjectDir)Images\*.png" "$(WebProjectOutputDir)\..\Web\images\"
xcopy /s /y "$(ProjectDir)MasterPages\*.master" "$(WebProjectOutputDir)\..\Web\masterpages\"
xcopy /s /y "$(ProjectDir)Scripts\*.js" "$(WebProjectOutputDir)\..\Web\scripts\"
xcopy /s /y "$(ProjectDir)UserControls\*.ascx" "$(WebProjectOutputDir)\..\Web\usercontrols\"
xcopy /s /y "$(ProjectDir)Xslt\*.xslt" "$(WebProjectOutputDir)\..\Web\xslt\"
xcopy /s /y "$(ProjectDir)favicon.ico" "$(WebProjectOutputDir)\..\Web\"
xcopy /s /y "$(TargetDir)*.dll" "$(WebProjectOutputDir)\..\Web\bin\"

REM #################################################
REM # Merge config files
REM #################################################

FOR /F %%a IN ('dir /B $(WebProjectOutputDir)\..\Web\Config\*.config') DO (
    IF EXIST "$(ProjectDir)Config\%%~na.config" (
        "$(SolutionDir)Tools\XmlConfigMerge\XmlConfigMergeConsole.exe" "$(WebProjectOutputDir)\..\Web\Config\%%~na.config" -m "$(ProjectDir)Config\%%~na.config"
    ) & IF EXIST "$(ProjectDir)Config\%%~na.$(ConfigurationName).config" (
        "$(SolutionDir)Tools\XmlConfigMerge\XmlConfigMergeConsole.exe" "$(WebProjectOutputDir)\..\Web\Config\%%~na.config" -m "$(ProjectDir)Config\%%~na.$(ConfigurationName).config"
    ) & IF EXIST "$(ProjectDir)Config\%%~na.$(ConfigurationName).%COMPUTERNAME%.config" (
        "$(SolutionDir)Tools\XmlConfigMerge\XmlConfigMergeConsole.exe" "$(WebProjectOutputDir)\..\Web\Config\%%~na.config" -m "$(ProjectDir)Config\%%~na.$(ConfigurationName).%COMPUTERNAME%.config"
    )
)

IF EXIST "$(ProjectDir)Config\Web.config" (
    "$(SolutionDir)Tools\XmlConfigMerge\XmlConfigMergeConsole.exe" "$(WebProjectOutputDir)\..\Web\Web.config" -m "$(ProjectDir)Config\Web.config"
)
IF EXIST "$(ProjectDir)Config\Web.$(ConfigurationName).config" (
    "$(SolutionDir)Tools\XmlConfigMerge\XmlConfigMergeConsole.exe" "$(WebProjectOutputDir)\..\Web\Web.config" -m "$(ProjectDir)Config\Web.$(ConfigurationName).config"
)
IF EXIST "$(ProjectDir)Config\Web.$(ConfigurationName).%COMPUTERNAME%.config" (
    "$(SolutionDir)Tools\XmlConfigMerge\XmlConfigMergeConsole.exe" "$(WebProjectOutputDir)\..\Web\Web.config" -m "$(ProjectDir)Config\Web.$(ConfigurationName).%COMPUTERNAME%.config"
)
The first part I'm pretty sure everyone is aware of; the second part is basically a for loop through the config folder in the root of the site, which then looks in a config folder in the project folder for config files to merge. It looks for 3 different files:
Name.config - always merged
Name.BuildMode.config - merged depending on the build mode (i.e. Debug, Release, any others you have defined)
Name.BuildMode.ComputerName.config - merged depending on the build mode and the current user's computer name
This way, we can put all config settings under version control, and the post build event auto configures the environment for the current user.
I think the only down side at the moment, is that it can be a tad slow (both building, and refreshing the site), but it's a start.
Ok, back on the SQL script, I think I know why the script failed the first time I tried it. When I ran it, I reseeded the database to 10000; however, the cmsPropertyData table's identity had already exceeded this amount, which means when a new entry was created, it threw an error, as 10001 was already in use.
What do you think should happen in this case? Should it all just be wrapped in a transaction and, if the current identity is already higher than the new seed, fail? This would mean the script can only be used for reseeding upwards (not sure if you would ever need to reseed downwards).
Ideally it should only be run on a fresh DB right? As in a fresh install?
Could use it on an existing install I suppose, bit scary though :)
Is there no way of getting Max( ID ) + 10,000?
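Something like this might do it; a sketch only, using umbracoNode as an example table and the same dynamic-SQL style as the loop scripts above:

```sql
-- Sketch: reseed relative to the current max ID rather than a fixed value.
-- You'd repeat this per identity table, as in the earlier loop scripts.
DECLARE @newSeed varchar(20)
SELECT @newSeed = CAST(MAX(id) + 10000 AS varchar(20)) FROM umbracoNode
EXEC('DBCC CHECKIDENT (''umbracoNode'', reseed, ' + @newSeed + ')')
```

The trade-off is that each table would end up with a different seed, which makes a single "everything above N is live" rule harder to apply.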
The script to convert to ints, yes, should ideally be run on a fresh DB; however, I can see the need to be able to update an existing install, so it currently supports both (it's definitely worth doing a backup first though, as to modify the types you basically have to create a new table, copy all the data over, re-set up all the constraints and finally delete the old table). The reseed script would more than likely be run on first deploy to the live environment, so this one definitely needs to be run on an existing install.
Re the reseed, yes I could do what you mentioned and just increase the seed by 10k; however, I would think you would want all your seed limits to start from the same point. The reason being that when you do come to do a DB merge, you can definitively say that ANYTHING with an ID greater than 10k was produced on the live site; anything below this was done during development. If you did just increase by 10k, it would be much harder to know where the live ids start.
For this to work, you are going to have to try and look into the future a little. Is this a small content site that is not going to grow a lot? Starting at 1,000 should be fine. Is there a blog? Does it have potential? Depending on the amount of potential, 10,000 could work, but maybe even 50,000 is necessary.
When estimating, also think of the growth rate: if you think that 50,000 could be reached, but only after 5 years, it might not be necessary to go that far (though it couldn't hurt). Most websites do not have a life expectancy of 5 years before a complete rebuild is done.
That's true, but just to add to this, the thing you need to think about is how much dev you are likely to do over time. Live content isn't the issue; what you want to do is leave enough room (enough IDs) to be able to develop new features and then easily merge back with the live DB. So whilst your comments are true, I just wanted to clarify what you are leaving room for.
I think the table most likely to cause problems is the cmsPropertyData table, as just with a simple site populated with content, I had already reached the 30k mark before going live. If I'm correct, the maximum value for an int is 2,147,483,647, so it may be worth upping the seed to something more like 1,000,000 as a general baseline?
Man I wish some core devs would take a look at this...some pointers would be good.
Well, looks like that answers that question. You can't just convert the "tinyint" and "smallint" fields to int, as there seem to be some explicit calls to SqlDataReader.GetInt16 which just throw a "Specified cast is not valid" exception. Which in turn means you can't just reseed the whole DB. So I guess the question now is, of the tables listed below, would you ever really need to reseed them? (i.e. is it likely you would ever change them directly on the live environment?)
Hmmmm, I've just realised we are still going to have issues separating developer content from, say, just document type definitions, as both put values in some shared tables. So whilst with reseeding we can distinguish between developer and user content, it's still not going to be straightforward to separate features (doc types, templates, macros) from test content.
Anybody got any ideas on this?
doc types, templates and macros should only be created on the dev machine though right? Users should only be creating content not features so we "should" be able to only worry about content on live. We "should" be able to blast over the top of any features in the DB with the master stuff from Dev?
I was more concerned about the test content on the dev site used to test the features, and how to separate this out when it comes to merging the changes in with live.
This is an interesting thread.
I've tried getting towards the end-all-woes setup around this for some time (though mostly from a single developer perspective), but I don't think it's possible to get to a completely pain-free setup with the current architecture.
I've blogged a bit about it here: http://blog.claudihauge.dk/post/443264785/the-ideal-developer-cms
Disclaimer: there's no solution to this problem in the linked blog post! Only some thoughts about the origin of the problem, and how a CMS should be constructed to be easier to work with from a developer's viewpoint.
Ok, back again =)
I've had a thought. As we stand, we are able to reseed the live DB no problem; the issue we have is merging dev functionality without merging dev content. So I got to thinking and remembered the FarmCode link posted earlier (http://farmcode.org/post/2009/09/09/How-to-sync-data-across-multiple-Umbraco-environments.aspx). What they are doing is merging only content, and we basically want to do the opposite, so if they have identified the tables that are "linked" with content, it should give us a decent list of tables to look into to be able to separate the content out.
After a quick look, it does look promising. In terms of merging, the following tables are purely to do with content, so can be ignored in the merge:
- cmsContent
- cmsContentVersion
- cmsDocument

(Might also be worth ignoring cmsDocumentXml too)
This leaves us with umbracoNode and cmsPropertyData. Now all we need to do is identify which node IDs are content. Luckily, it looks like any content node (Documents and Media items) creates an entry in the cmsContent table, so if we did a lookup to ignore anything that has a value in this table, it should leave us with just functionality. Bingo!
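As a hedged sketch, that lookup could be something like the following (assuming the standard cmsContent.nodeId column references umbracoNode.id):

```sql
-- Sketch: list nodes that have no row in cmsContent, i.e. candidate
-- "functionality" nodes (doc types, templates, etc.) rather than documents
-- or media. Assumes cmsContent.nodeId references umbracoNode.id.
SELECT n.id, n.text
FROM umbracoNode n
WHERE NOT EXISTS
(
    SELECT 1 FROM cmsContent c WHERE c.nodeId = n.id
)
```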
I think I'll have another look in the DB though to see if there is anything else we should miss out, as off the top of my head, members should be another thing we want to miss out.
After a quick look through the DB, these are the tables I think are "content only" based, and thus can be ignored during the merge:
These ones I am not too sure about, but I think we should also be able to ignore them:
Anybody see any problems with these?
A nice find by Mr Duncanson turned up the following link:
What is interesting is that it shows how to identify the different types of nodes within the umbracoNode table. Taking a look through the source, here are the GUIDs for the main "content" types:
Document - c66ba18e-eaf3-4cff-8a22-41b16d66a972
Media - b796f64c-1f99-4ffb-b886-4bf4bc011a9c
Member - 39eb0f98-b348-42a1-8662-e7eb18487560
With these, it should now be possible to easily separate out content nodes.
Can anybody think of any other node types that are generated either by the client, or users?
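The GUID list above can be turned into a simple classifier. A hedged sketch: the three GUIDs are the ones quoted from the Umbraco source above, and the rule "not in this map means functionality" is the working assumption of this thread, not something Umbraco guarantees.

```python
# Object-type GUIDs for the "content" node types, as found in the Umbraco
# source (quoted earlier in this thread). Keys are stored lowercase.
CONTENT_OBJECT_TYPES = {
    "c66ba18e-eaf3-4cff-8a22-41b16d66a972": "Document",
    "b796f64c-1f99-4ffb-b886-4bf4bc011a9c": "Media",
    "39eb0f98-b348-42a1-8662-e7eb18487560": "Member",
}

def is_content_node(node_object_type: str) -> bool:
    """True if this umbracoNode row holds user/client generated content
    and should therefore be skipped in a functionality-only merge."""
    return node_object_type.lower() in CONTENT_OBJECT_TYPES

print(is_content_node("C66BA18E-EAF3-4CFF-8A22-41B16D66A972"))  # -> True
```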
Ok, time to actually start trying some of this out, so I have created a test setup as follows:
1) Create a bog-standard Umbraco setup including Runway and all suggested modules during setup
2) Create a member type, member group and a test member
3) Backup the database
4) Create a second DB, restore the previous backup, reseed to 100000
5) Still with the site pointing to DB1, install the Google Datatype package and the Blog4Umbraco package
6) Update web.config to point to DB2
7) Connected to DB2, create a new member and some new content pages
So now I have 2 DBs:
DB1 = Developer DB. Now contains new functionality that needs merging, but also contains test data (the blog4umbraco package installed a test blog that we don't want to merge). I also created an extra member which again, we don't want to merge.
DB2 = Live DB with ids reseeded to 100000. Now contains user generated content (Extra pages and a new registered member).
So, my next step will be to see if I can generate a merge script.
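The core of such a merge script is a keyed diff: find rows that exist on dev but not on live, and emit them as inserts. A minimal sketch, assuming both DBs share a schema and live IDs were reseeded to 100000+ as described above; `cmsDataType` is a real Umbraco table name but the rows here are invented, and this ignores updates, deletes and the content filtering discussed earlier.

```python
import sqlite3

def rows_missing_from_live(dev, live, table, key="id"):
    """Collect rows present in the dev DB but missing from live, by primary key."""
    live_keys = {r[0] for r in live.execute(f"SELECT {key} FROM {table}")}
    return [row for row in dev.execute(f"SELECT * FROM {table}")
            if row[0] not in live_keys]

dev = sqlite3.connect(":memory:")
live = sqlite3.connect(":memory:")
for db in (dev, live):
    db.execute("CREATE TABLE cmsDataType (id INTEGER PRIMARY KEY, name TEXT)")
    db.execute("INSERT INTO cmsDataType VALUES (1, 'Textstring')")
# New functionality exists only on dev; live rows were created with reseeded IDs,
# so the two sets of new IDs can never collide.
dev.execute("INSERT INTO cmsDataType VALUES (2, 'Google Map picker')")
live.execute("INSERT INTO cmsDataType VALUES (100001, 'Editor-created row')")

missing = rows_missing_from_live(dev, live, "cmsDataType")
print(missing)  # -> [(2, 'Google Map picker')]
```

Each missing row would then become an `INSERT` statement in the generated merge script, which is essentially what the SQL diff tool does for you.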
Well, that didn't go too badly for a first attempt.
I've had to create a couple of additional primary keys so that my SQL diff tool was able to compare some of the tables. But after that the only error I seemed to have was a conflict on a foreign key, which I'm pretty sure is because I forgot to limit the entries copied to the cmsPropertyData table to non-"content" nodes.
I'll give it another try and see how it goes.
Attempt 2 went much better. Using XSQL (a SQL diff tool, which is free for Express databases) I have created a reusable XML config which will generate a SQL script of all FUNCTIONALITY unique to the dev DB which needs to be copied to live, and it successfully ran with no errors. After a quick check in the admin, all features look to be successfully merged over, and there is no rogue content from the dev DB.
Looking into the documentation for XSQL, I should also be able to do the comparison without creating the primary keys I mentioned earlier. I can just tell XSQL which column to use as a primary key (I would imagine the Redgate SQL tool can do the same).
So, it looks like success. Time to do more testing, and I also want to look at my XML config as I currently have all the tables to compare declared manually (but I dunno if that's a bad thing).
I'm now starting to think what types of setup may be needed:
DEV - Generate a script to copy FUNCTIONALITY only from the DEV db to STAGING.
STAGING - Generate a script to copy FUNCTIONALITY and CONTENT from the STAGING db to LIVE.
LIVE - Generate a script to copy CONTENT only from the LIVE db to DEV or STAGING.
Would people agree?
Oooh, found another article that might come in handy:
Just below the first db diagram is a SQL script to create a db lookup table of all the different object types. It will come in handy to be able to cross-reference node types (I have it hard-coded into my script at the moment).
You have been busy. Think we might need another beer to see where we are up to with all this :)
Ok latest idea.
Each developer has their own cut of the DB and Umbraco running locally. Files are stored in SVN; it's the DB stuff that is the problem.
Latest solution is to have a play back facility. New package which would write down every DB mod you make (not content but functionality, add a doc type, add a field, etc.). To begin with this acts as a change log so you at least have a history of what it is you have touched.
Step 2 is to then use that log to drive the CMS API to recreate the action in your change log on other systems. Tricky to manage the Ids but some placeholders should do it.
Idea is when you check in, you create a new change log file (somehow, hopefully automagically). This is what gets checked in; every time you check in you should create a new change log.
Our package would check for new change logs (like when you do an SVN update) and prompt you to play them back to get your DB up to date with the latest changes. Hopefully all within a nice transaction so you can roll back should anything go wrong.
Now, how to name the change logs in such a way that our package can know which order to install them in (I pondered username_date_time). We can store a table in the DB to make sure we only install change logs once.
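The playback bookkeeping described above can be sketched in a few lines. Assumptions to be loud about: the `umbracoChangeLog` table name is invented for this sketch (the thread only says "a table in the DB"), and log names are assumed to follow the proposed `username_date_time` scheme so that sorting on the date/time suffix gives playback order.

```python
import sqlite3

def pending_logs(db, log_names):
    """Return the change logs not yet applied, in chronological order.
    umbracoChangeLog is a hypothetical bookkeeping table for this sketch."""
    db.execute("CREATE TABLE IF NOT EXISTS umbracoChangeLog (name TEXT PRIMARY KEY)")
    applied = {r[0] for r in db.execute("SELECT name FROM umbracoChangeLog")}
    # Sort on the date_time suffix, not the username prefix.
    return sorted((n for n in log_names if n not in applied),
                  key=lambda n: n.split("_", 1)[1])

def play_back(db, name, actions):
    """Replay one change log inside a transaction so a failure rolls back cleanly."""
    with db:  # sqlite3 connection-as-context-manager commits or rolls back
        for sql in actions:
            db.execute(sql)
        db.execute("INSERT INTO umbracoChangeLog VALUES (?)", (name,))

db = sqlite3.connect(":memory:")
logs = ["pete_20100902_1500", "duncan_20100901_0930"]
print(pending_logs(db, logs))  # -> ['duncan_20100901_0930', 'pete_20100902_1500']
```

Recording the applied log name inside the same transaction as its actions is what guarantees each log runs exactly once, even if playback dies halfway through.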
Started having a poke around, watch this space.
Any additional updates on this "project"? Has version 4.5.1 caused any change in your methods? We are about to go into full development mode and roll out a production copy and want to allow multiple developers access without stepping on each others toes.
This is still just an idea at the moment; every time we try to flesh out how we would do it we hit problems. There is a rumour that Contour might fix some of our problems in its next release, but we have no solid details. I've also been told that the Umbraco devs think solving this problem will be a "1 hour coding job", but again I've had no more on that either...
In the meantime, SVN* all your templates etc. and keep a change log in the root of your application (also checked into SVN).
* Other version control systems are available :)
I'm trying to get a local development copy running. I can get into the Umbraco admin and see my content and nodes. However, when I try and browse the site, via localhost, I get a message that no pages can be found.
Connie, you probably need to post that as a separate issue; it will sort of get lost in this discussion, and we are trying to keep this one on topic if possible.
Connie, have you already created a document type (i.e. a homepage)?
If you are new to Umbraco I would advise you to install a starter kit so you get used to the interface and concepts.
If you like Umbraco it's certainly worth getting an Umbraco.TV membership.
Did anyone ever come up with a workable solution for this problem?
I've just stumbled upon this thread, hugely interesting. It's covered most of what I've been musing over for the last week.
I want to create a dev environment where I can work from home and in the office without too much hassle... the code side of things is covered using Kiln.
As a lot of other people are saying, it's the DB that's the issue. Looks like my DB needs to be available to both internal and external networks, or I go down the merge route for SQL... is there source control for DBs?
Or I could use Azure for the DB.
(Commercial product recommendation) Source control for DBs: https://www.red-gate.com/products/sql-development/sql-source-control/
We also love the SQL Compare and Data Compare tools as well; bundles are available and in my opinion, these products have paid for themselves several times over.
Try this one : http://koske88.wordpress.com/2014/08/19/umbraco-7-version-control/