
Keeping evolving DB schema and disaster recovery scripts in sync

We have a rather large DB schema that is constantly changing - every application release comprises the app itself, plus a migration script to apply to the live DB as required.

Now, in parallel, we maintain a schema creation script which we can use at any point to build a DB from scratch (for testing purposes).

The thing that vexes me slightly is that this seems to be a violation of DRY - if I add a column to a table, I have to create a script to do it and make a similar change to the schema build script.

Is there any strategy to avoid this? I thought of maybe having a 'reference DB' with no dynamic data in it that we could simply export after every build is installed. So we create a migration script and then, once the build is live, export the schema back into the 'create' script.

I'm not convinced that wouldn't be more work than the process it replaces, though.


I have the same problem/concern/vexation. The only way I've found to follow DRY is to use a database modeling tool (e.g. CA ERwin, Sybase PowerDesigner) where all the modeling work is done to build and maintain the reference schema. You can then leverage the change-management capabilities of the tool to generate both the diff script that gets executed to go from release A to release B and the reference implementation itself.
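To make the change-management step concrete, here is a minimal sketch of what such a tool does under the hood: compare two schema definitions and emit the DDL needed to move from one release to the next. The table and column names are hypothetical, and a real tool also handles type changes, drops, renames, constraints, and data migration - this only covers added tables and columns.

```python
def diff_schemas(old: dict, new: dict) -> list:
    """Return DDL statements that migrate schema `old` to schema `new`.

    Each schema is a dict mapping table name -> dict of column name -> type.
    """
    statements = []
    for table, columns in new.items():
        if table not in old:
            # Whole table is new: emit a CREATE TABLE for it.
            cols = ", ".join(f"{name} {ctype}" for name, ctype in columns.items())
            statements.append(f"CREATE TABLE {table} ({cols});")
            continue
        for name, ctype in columns.items():
            if name not in old[table]:
                # Column added to an existing table: emit an ALTER TABLE.
                statements.append(f"ALTER TABLE {table} ADD COLUMN {name} {ctype};")
    return statements


# Hypothetical release A and release B schemas.
release_a = {"customer": {"id": "INT", "name": "VARCHAR(100)"}}
release_b = {
    "customer": {"id": "INT", "name": "VARCHAR(100)", "email": "VARCHAR(255)"},
    "invoice": {"id": "INT", "customer_id": "INT"},
}

for stmt in diff_schemas(release_a, release_b):
    print(stmt)
```

The point is that both artifacts - the migration (diff) script and the full create script - are derived from one source of truth, the model, which is what restores DRY.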

You may, of course, find places where the two have drifted apart, but the comparison tools will show all differences, so you can suck "Oh, I had to change that on the fly" changes back into the reference implementation as well.

Obviously, all of it - diff script, reference implementation, and the model itself - then gets protected through source code management.
