Hey there :)
We are currently growing the team, and soon we will have 1-2 frontend and 2 backend developers. We are looking for suggestions on how to handle our development/deployment environment, from both a frontend and a backend perspective. So right now I'm seeking some advice on how to handle this situation.
We have just switched from regular SVN to Mercurial (on Bitbucket) to become even more flexible.
Most of our Umbraco sites contain a little custom code (it could be a custom section, a datatype, or maybe some usercontrol) or no custom code at all.
Info: our frontenders don't use Visual Studio.
It would be awesome if you could share how you handle projects where multiple people are working on the same codebase.
Good question, here's what we do (two full-time devs):
Both of us use Visual Studio, and all our code is stored in Mercurial - we have our own repositories, of course. Occasionally we work on the same project at the same time, so we make sure we don't cross over work-wise, to minimise merges.
For our local dev environments, we try to keep the same folder structure on all machines - it just makes things easier with post-build scripts etc. We use Dropbox to back up our code in between check-ins to Mercurial.
We also run SQL Server locally and do all our dev directly in our own little environment before pushing to staging.
Staging-wise, we have a centralised dev server cluster where we client-test our sites before either pushing to live or sending to the client for them to install themselves. This is where we find out how our processes are working: if everything tests and runs fine then we are good; otherwise something is different between our dev and live machines.
In some cases, we will restore a copy of the live site + db to our dev/staging environment before testing new code so we can be sure we are testing with real data.
If Roger is working on PSD -> HTML work before adding to Umbraco, he will develop locally but use his Dropbox for backups. We don't yet check this into source control but will do shortly, so our folder structure in Mercurial will allow for templates and their assets from PSD work etc.
Back to Visual Studio: if we are doing any custom code, we build a class library project for it and add the website to the solution. Build events ensure we copy over everything that is needed, i.e. DLLs, usercontrols etc. We then attach to the process to debug, although there are other ways I've read about recently that simply need an F5 to debug, which is nice. However, we run in a true IIS environment for all development and don't use the built-in VS web server.
That's it in a snapshot - could go into a lot more detail, but I'm in the middle of a rollout :-)
Yeah, that's a real good question... We have several teams working in different constellations all the time, even mixed Macs and PCs.
Therefore VS is not the right base for us to rely on completely. What we do (and we are still inventing our process) is:
We have a Mercurial setup with a repo per customer, then a release branch if the site is live, otherwise just the default branch. When a dev (frontend or backend) takes on a task that is more than a single commit, we make a feature branch for it (then it's easier in deploy situations to see what's changed).
At any time we have a changelog telling us what SQL scripts, Umbraco doctypes etc. (basically all task-based and db-based stuff) we have to deploy manually.
We also have a TeamCity server for deployment and test (if the client wants it), so that whenever I push something to the default branch, it's built to our test server (using xcopy or something our server dude knows a lot about). Whenever we push to our feature branches, nothing happens. If we push to a release branch, TeamCity builds to our staging server. (At the moment we don't have the balls to auto-deploy to live servers, but maybe in the future.)
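That push-triggered routing is essentially one rule: the branch name decides the target environment. A hedged sketch of the rule as stated (the branch names and targets here are assumptions, not the actual TeamCity config):

```shell
#!/bin/sh
# Map the pushed branch to a deploy target, per the rules above.
# "none" = feature branches trigger no build at all.
deploy_target() {
  case "$1" in
    default)  echo "test" ;;     # push to default -> built to the test server
    release*) echo "staging" ;;  # push to a release branch -> staging server
    *)        echo "none" ;;     # feature branches: nothing happens
  esac
}

deploy_target default              # prints: test
deploy_target release-2.0          # prints: staging
deploy_target feature/navigation   # prints: none
```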
When more devs develop on the same feature, they typically share the dev database, which makes dev deployment easier.
What the VS guys do to spin up their solutions I actually don't know, but it's not that important in this context, since we don't have to run everything inside VS. Us frontenders, we like our Macs :-)
Hope it can be helpful, and I would like some more inspiration for developing our environment :-)
Thx for your comments :)
Hope more will come. I will soon write up the idea we are tinkering with, but it's much the same as you are mentioning. We are at the moment in a transition to some new servers, so it's not quite clear yet.
I've been looking at different deployment tools, and none of them sounds awesome.
The idea of writing in the commit comment what task-based things have to be done manually is a good one, at least until macros etc. become file-based.
Here's our setup currently (we're still learning, so any feedback is welcome):
Subversion, shared SQL Server database, Umbraco files not in the repository, except for some .config files and all assemblies (moved to the _bin_deployable folder). Visual Studio with a Web Application project that only includes templates, scripts, stylesheets and user controls (plus macro scripts and XSLT files). We usually remove code-behind files from .master files, unless really needed.
Move all assemblies from Umbraco into the _bin_deployable folder, because sometimes a rebuild or clean of the Web Application project will wipe out the contents of the bin folder; _bin_deployable restores the assemblies after each build.
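What that restore step amounts to, sketched as a script (on Windows this would really be an MSBuild post-build event; the `www` path is a placeholder):

```shell
#!/bin/sh
# After a rebuild/clean wipes www/bin, copy the assemblies back from
# _bin_deployable. Folder names follow the post; the script itself is
# an illustration, not the actual build step.
SITE="www"
restore_assemblies() {
  mkdir -p "$SITE/bin"
  cp -f "$SITE/_bin_deployable/"*.dll "$SITE/bin/" 2>/dev/null || true
}
restore_assemblies
```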
No experience with adding custom sections or controls. Not sure where the code and files would go, in a separate project?
Use a separate Class Library project for code. In it, reference the Umbraco assemblies from the _bin_deployable folder of the web project; then reference the Class Library from the Web Application project.
To deploy: use the Publish command in Visual Studio to copy project files, assemblies, and transformed .config files for the staging or live environments. Manually recreate document types and other meta data, or use Courier/uSiteBuilder (doesn't work well in all situations, and doesn't cover all types of meta data).
Document in a simple text file: version of Umbraco, any packages installed, anything else needed to recreate the same environment on other developers' machines.
No custom build scripts, no unit testing of any code that touches Umbraco, no automated deployment or builds, and sometimes serious headaches to recreate the same environment on dev, staging and live.
Can you detail some of the things you do? I am curious how you push to staging, what build scripts you have written specifically for Umbraco, and how you share meta data that is in the database with the other developer? And is there any reason in particular you can't use IIS Express to emulate a true IIS environment?
@Mads Do you have Umbraco files under source control? If you have to deploy document types and other db changes manually, how can you have a build server for deployment?
@Michiel, nope, no Umbraco files in the repo... only some .config files.
The build server can actually execute some xcopy steps after the build (I'm not 100% sure) - it somehow builds to a local instance and xcopies the files to the staging and dev servers.
Not really, a build server can copy files, but it can't (without writing lots of code) recreate document types and other meta data in the staging server. So even if the build server is fully automated, after it completes a developer still has to go into the CMS on the staging server and repeat manually the steps to recreate meta data.
Unless I'm missing something?
Ahh yeah, forgot we are now starting to use Courier on some sites, although I'm not impressed overall - it seems very buggy and hit-and-miss whether your changes get sent up. Support is not great either for a paid product, but that's another rant!
For the other sites, we develop locally and test. Once happy, we move the code over to a staging server and test again - this server would normally be an exact copy of the live environment, i.e. copy the live site files and db and set them up under a new hostname. This way we can be sure it works. We then have the option to either switch hostnames in IIS and point live at the staging folders, or upgrade live - the former is a lot quicker, i.e. ~10 seconds of downtime, whereas the latter is a lot more time-consuming.
BTW, on one site I set up a batch script that I call from VS to copy the site's files over to the target server - that works well as I'm on the same network when building the site; doesn't work when remote though!
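The batch script itself isn't shown, but the idea (mirror the built site to the target, skipping runtime data) can be sketched like this. This is a POSIX-shell stand-in for the real xcopy batch file, not the poster's script: the paths are made up, and App_Data is excluded on the assumption you don't want to clobber the target's cache and media.

```shell
#!/bin/sh
# Stand-in for the xcopy batch file mentioned above: copy the built
# site into a target folder, excluding App_Data (runtime cache/media
# that shouldn't be overwritten). SRC/DEST are placeholders; the real
# script targets a network share.
SRC="www"
DEST="staging"
deploy_site() {
  mkdir -p "$DEST"
  (cd "$SRC" && tar cf - --exclude='App_Data' .) | (cd "$DEST" && tar xf -)
}
```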
One day I'll write a blog about how we do it in more detail - difficult to get done atm :-)
Build scripts wise, pretty much run of the mill really and along the lines outlined in this post on our
Here's one I've got in my current project:
XCOPY "$(ProjectDir)\bin\*.*" "$(SolutionDir)\www\bin" /Y /S /E /R /V
XCOPY "$(ProjectDir)\usercontrols\*.ascx" "$(SolutionDir)\www\usercontrols" /Y /S /E /R /V
XCOPY "$(ProjectDir)\dashboard\*.ascx" "$(SolutionDir)\www\usercontrols\dashboard" /Y /S /E /R /V
XCOPY "$(ProjectDir)\pages\*.aspx" "$(SolutionDir)\www" /Y /S /E /R /V
XCOPY "$(ProjectDir)\admin\*.aspx" "$(SolutionDir)\www\umbraco\admin" /Y /S /E /R /V
XCOPY "$(ProjectDir)\admin\*.ascx" "$(SolutionDir)\www\umbraco\admin" /Y /S /E /R /V
Not perfect but works well at the moment, still fine tuning this process :-)
I didn't read the complete topic, but I have a question about the build script. We also use a build script and copy all the DLL files over to the bin folder when we build the solution, but of course this does a full recycle of the website, which means it takes a while to load the first time after that. We build our solutions a lot, so is there a way we can copy our files over without a full recycle? That would speed up development quite a bit.
What exactly is the reason to put those files in a separate project and copy them over? Why not simply add www (in your example) as a project and have the files in there? The reason is not explained on the wiki page either, and that page is quite old (2009).
Just noticed that I'm copying all the DLLs over - not normally the case, but as Jerome says, it's sometimes nice to refresh everything, especially if there have been upgrades of other DLLs and libraries that would not normally be copied.
In my case, and on this particular project, I've got six class libraries and the Umbraco website in my solution. There is one library that references all the other projects' DLLs etc., and it's this one that has the script to copy everything across upon build.
I've set the build configuration options not to build the Umbraco site itself, and once built, I currently attach to the process to debug as needed.
Our teams mostly consist of 1 frontender and 2 developers. We use VS2010 and normally a central Umbraco database on the development server. We use SVN in combination with automatic builds via TeamCity. For our customers we always run a full OTAP environment; dev is responsible for O and T, while A & P are managed by our application management team.
I have no real 4.x experience, so feel free to ignore this, but our 5.x setup was very similar to what I did in my years customizing CommunityServer.
We used VS & local databases as others have mentioned, but ALL database work is done via code. In the Umbraco case, this was via package installs. All datatypes & doctypes were created by the various package installs, and any test data was either created via code or imported. So whenever I want to merge another dev's work, I just get latest, run a quick uninstall/install cycle on my local Umbraco instance, re-install the packages, and I'm good to go.
Clearly, that stops working so cleanly once a site is in production, but that can probably be handled via package updates if necessary.
Hope this helps.