
How to deploy: database, source and binary changes in 1 patch?

I'm part of a development team that works on many CMS-based projects, using systems like Joomla and Drupal.

In our development process, all of our code changes are managed in Git. At the end of a sprint, we create a diff that we can apply via `patch` to the live site.

The problem is that most of the time, the changes include

  • Database Schema Changes
  • Database Data Changes
  • Source Code changes
  • Binary file changes (like images)

Git diff handles source code changes beautifully. Binary files are not included in the diff at all, beyond a note that the files have changed.
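For illustration, the workflow looks roughly like this (the tag names are made up); `git diff --binary` can at least embed the binary changes in the patch, and `git apply --check` gives a dry run for the code side:

```bash
# Create the sprint patch (tag names are hypothetical)
git diff v1.2 v1.3 > sprint.patch            # source only; binaries just flagged as changed
git diff --binary v1.2 v1.3 > sprint.patch   # embeds the binary changes as well

# On the live site
git apply --check sprint.patch               # dry run: would the patch apply cleanly?
git apply sprint.patch                       # applies source + binary changes (nothing database-side)
```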

Database Schema Changes and Database Data Changes are a mess.

I was wondering if anything like a unified patch system exists that could be used to deploy all of these changes in one patch.

So the question is: "Is there a system that can be used to deploy all of these changes in one shot?"

Ideally, this system would allow a dry run, like `patch --dry-run` does, but for all four of the change types above.

Edit: Thank you everyone for the feedback you provided; it was a starting point for my research in this area.

Here is what I found so far:

  1. It's difficult to deploy PHP-based applications using a Linux packaging system, because changes to the project happen iteratively rather than as releases.

  2. It would be possible to use dbconfig to deploy changes to a project, but the problem is generating MySQL DB diffs (schema and data); see the sketch after this list.

  3. What is really missing for deployment of PHP-based applications is a deployment manager that would be installed on the server and act as the interface for deploying the patches.
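To illustrate point 2: the closest thing I found to a MySQL schema diff is dumping and comparing schemas, but the output is a textual diff for human review, not SQL you can run (the database names are hypothetical):

```bash
# Dump schema only (no data) from the dev and live databases
mysqldump --no-data --skip-comments dev_db  > dev-schema.sql
mysqldump --no-data --skip-comments live_db > live-schema.sql
diff -u live-schema.sql dev-schema.sql > schema.diff
# schema.diff shows what changed, but it is not executable SQL;
# the ALTER statements still have to be written by hand
```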

I started a Google Wave on this topic, and it produced a lot of information. If anyone is interested in reading this Wave, please let me know and I will add you.


For handling installation and upgrade of our application, we use the [Debian packaging system][2] (.deb packages).

Context: we are making a J2EE + Flex application, shipped and administered through a VPN. So not so far from your case.

Fresh installs and upgrades from one version to another are done through Puppet (a system for automating system administration tasks: it installs our .deb).

In the .deb we have:

  1. our compiled source code
  2. the schema of the database (handled by [db-config][1])
  3. binary stuff
  4. instructions for installing, through apt, all the other applications needed (MySQL, Tomcat, ...)

=> Everything needed for a fresh install. (A minimal layout sketch follows.)
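As a rough sketch (the paths and names are invented, not our real package), the tree of such a package and the command that builds it look like this:

```bash
# Hypothetical layout of the package staging directory
# myapp-1.3/
#   DEBIAN/control        -> package name, version 1.3, dependencies (mysql, tomcat, ...)
#   DEBIAN/postinst       -> post-install hook: runs db-config, starts services
#   opt/myapp/lib/        -> compiled source code
#   opt/myapp/resources/  -> binary stuff (images, ...)
#   opt/myapp/sql/        -> base schema + one upgrade script per version

dpkg-deb --build myapp-1.3 myapp_1.3_all.deb   # produce the installable package
```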

We also add the information needed to go from one version to another:

  1. the scripts for upgrading the database (one for each version)
  2. the new binaries
  3. new things to launch at machine start (e.g., some weeks ago we added an ActiveMQ server)

=> Once the .deb is made correctly, we can install or upgrade seamlessly in one operation. (It's done automatically, without any prompt.)
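A minimal sketch of how the per-version database scripts can be driven from the package's postinst maintainer script (the paths and database name are assumptions): on an upgrade, dpkg calls postinst with "configure" and the previously installed version.

```bash
#!/bin/sh
# DEBIAN/postinst (sketch): $1 is "configure"; $2 is the old version, empty on a fresh install
set -e
if [ "$1" = "configure" ] && [ -n "$2" ]; then     # upgrades only: a fresh install loads the full schema
    for script in /opt/myapp/sql/*-update.sql; do
        [ -e "$script" ] || continue               # nothing to do if there are no scripts
        ver=$(basename "$script" -update.sql)      # e.g. "1.3" from "1.3-update.sql"
        if dpkg --compare-versions "$ver" gt "$2"; then
            mysql myapp_db < "$script"             # run only the upgrades newer than the old version
        fi
    done
fi
```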

There is one .deb per release; each .deb has a version number and a signature. You can pick any of our .debs and do a fresh install, or upgrade from the currently installed version to the version it holds.

The .deb is built by our continuous integration system. (We build a .deb every hour, as if we were about to release a new version.)


What are the benefits?

  • Install / upgrade automatically, with confidence.
  • Roll back to a previous version.
  • Dry runs are natively supported.

In your specific case:

  • Database Schema Changes
  • Database Data Changes
  • Source Code changes
  • Binary file changes (like images)

Database => you will have to write migration scripts, one for each version (e.g., 1.2-update.sql, 1.3-update.sql); an example follows.
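For example, a purely hypothetical 1.3-update.sql (the tables are invented) covering both a schema and a data change:

```sql
-- 1.3-update.sql: everything needed to go from 1.2 to 1.3 (hypothetical tables)
ALTER TABLE article ADD COLUMN thumbnail_path VARCHAR(255) NULL;   -- schema change
UPDATE setting SET value = '1.3' WHERE name = 'schema_version';    -- data change
```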

Source code and binaries => add them, and state in which version they have to be copied/used.

Edit: I'm not sure about source code; we are doing this with compiled code...


Some links to start:

https://wiki.ubuntu.com/PackagingGuide/Complete

http://www.debian.org/doc/manuals/maint-guide/index.fr.html#contents (in French)

[1]: http://pwet.fr/man/linux/formats/dbconfig "dbconfig"

[2]: http://www.debian.org/doc/FAQ/ch-pkg_basics.en.html "debian"


I don't think you'll find a fail-safe mechanism.

I recommend that, when possible, you take into account compatibility with the currently published source when making schema/data changes.

This way you can make a very simple tool that runs database scripts committed to a particular SVN location. (You don't want diffs for database changes: if you need further modifications, you need different statements, not a textual delta.)

With the above done, you can have a simple command that runs the database changes, then the binary and source code changes.
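A sketch of what that simple command could look like, assuming the change scripts live in a db-changes/ folder in SVN and the code changes arrive as a git patch (all names are invented):

```bash
#!/bin/sh
# deploy.sh (sketch): database changes first, then binary & source code changes
set -e
svn export --force "$REPO_URL/deploy/db-changes" /tmp/db-changes
for script in /tmp/db-changes/*.sql; do
    mysql "$DB_NAME" < "$script"           # assumes the scripts are safe to run in order
done
git apply --check /tmp/release.patch       # dry run before touching the live code
git apply /tmp/release.patch               # source + binary changes (from git diff --binary)
```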

For the database there is also the option of schema & data comparison tools; these could be used to compare environments and make sure nothing unexpected is missing from the change scripts. They could also generate the change scripts, but as I said, you really want to make sure those won't break the current source.


You can create a tool to do the migrations painlessly, something similar to PeopleSoft's Patch Upgrade Assistant.

It is basically a standalone executable that reads an "Upgrade Template" and carries out tasks. The upgrade template declaratively describes the upgrade tasks, or "steps". The steps could be: copy (for backing up or moving precompiled objects such as classes and other binaries), database (for altering schema elements), or SQL scripts (for loading or transforming current data). The steps can carry some predicate logic: if the condition holds, do this; otherwise skip it and go to the next step.

The template is usually an XML file. It also provides for manual steps, with instructions for manual actions. Each step also specifies whether or not it is recoverable, and validates whether it has succeeded.
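Purely as an illustration (an invented format, not PeopleSoft's actual one), such a template could look like this:

```xml
<!-- Hypothetical upgrade template: step types, predicates, recoverability, validation -->
<upgrade from="1.2" to="1.3">
  <step type="copy" recoverable="yes" if="exists:/opt/app/classes">
    <src>/opt/app/classes</src>
    <dest>/opt/backup/classes-1.2</dest>   <!-- back up precompiled objects -->
  </step>
  <step type="database" recoverable="no">
    <ddl>ALTER TABLE article ADD COLUMN thumbnail_path VARCHAR(255)</ddl>
  </step>
  <step type="sql-script" recoverable="no"
        validate="SELECT 1 FROM setting WHERE name='schema_version' AND value='1.3'">
    <file>1.3-update.sql</file>            <!-- load or transform current data -->
  </step>
  <step type="manual">
    <instruction>Restart the application server once the steps above succeed.</instruction>
  </step>
</upgrade>
```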

It may be possible to build an open-source project around this requirement, which is quite common.


You could save the git commit objects in a local file and then import them into the other repo/branch.
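`git bundle` does exactly this: it packs commit objects into a single file that another repository can fetch from. A minimal example (the tag and branch names are assumed):

```bash
# In the source repository: pack everything on master since tag v1.2
git bundle create sprint.bundle v1.2..master

# In the receiving repository:
git bundle verify sprint.bundle               # check that the prerequisites are present
git fetch sprint.bundle master:sprint-import  # import the commits into a new branch
```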
