Managing DB migration: scripts vs tools

Our project has about 20 developers, but our application makes relatively light use of databases. We have about five databases, all of them very small, with fewer than 20 tables each and nothing approaching millions of rows.

We have two options on the table for how to manage the evolution of the databases over time:

  • Some kind of tool. Currently we're using Visual Studio database projects, which contain the current definition of the schema and compare it against a reference database to generate a diff script. We then use that diff script to bring the reference database up to date.
  • Use versioned scripts to build the database up from a baseline. The scripts are placed manually in source control. Any data migration needed to move data from old columns/tables to new ones would be part of these scripts. A version number would be recorded somewhere in the DB, and upgrading would run every script between the DB's recorded version and the current version (a minimal sketch of such a runner follows this list).
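
To make the second option concrete, here is a minimal sketch of what such an upgrade runner could look like. The folder layout, the numbered script names (e.g. 001_create_orders.sql), the schema_version table, and the use of sqlite3 as a stand-in driver are all assumptions for illustration, not a prescription.

```python
import re
import sqlite3                      # stand-in driver for the sketch; a SQL Server shop would use pyodbc or similar
from pathlib import Path

SCRIPTS_DIR = Path("migrations")    # hypothetical layout: 001_create_orders.sql, 002_add_status.sql, ...

def current_version(conn):
    """Read the version recorded in the database, creating the tracking table on first run."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER NOT NULL)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] or 0

def upgrade(conn):
    """Apply, in order, every script numbered above the version recorded in the DB."""
    applied = current_version(conn)
    for script in sorted(SCRIPTS_DIR.glob("*.sql")):
        version = int(re.match(r"\d+", script.name).group())   # filenames are assumed to start with their version
        if version <= applied:
            continue
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO schema_version (version) VALUES (?)", (version,))
        conn.commit()               # a real runner would wrap script + version bump in one explicit transaction

if __name__ == "__main__":
    upgrade(sqlite3.connect("example.db"))
```

The appeal is that the same command runs against dev, test, and (given a known starting version) production, so no environment needs ad-hoc handling.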

The second option seems to be widely used, and I found an in-depth discussion here: http://odetocode.com/blogs/scott/archive/2008/01/31/versioning-databases-the-baseline.aspx

The problem with our current approach is that we don't have access to our production databases. To create a release package, we have to restore a backup of production into another location, generate a diff against that reference DB, and hand the script to the production DB team. So our release to production is different from our releases to other environments.

This makes the idea of running versioned scripts appealing, because we would use the same scripts in all environments and there would be no ad-hoc work in deployment (e.g. manually restoring production into a reference DB). But given how small our databases are, I feel we can hardly be a difficult case for the DB tools out there. What we want is something as simple as possible that is easy to understand.

Do tools such as Red Gate's suite make sense for this kind of scenario, or should we go with versioned scripts? Cost isn't much of an issue; it's more about creating a Pit of Success where maintaining and deploying the DB is as basic and automated as possible.


I'm the product manager at Red Gate for SQL Compare, which generates diff scripts between two databases. I'd encourage you to take a look at our SQL Source Control tool, which allows you to track schema changes as and when they're made in development. When it comes to deployment, if you know which schema version is in production, you can generate a deployment script from your source-controlled versions. Of course, you should always test this in a staging environment before running it on production.

Scott's article makes an excellent point with regard to migration scripts, and Denis alludes to more complex changes that can't realistically be second-guessed by comparison tools and would therefore require custom migration scripts to be managed and applied appropriately. The next version of SQL Compare, in conjunction with SQL Source Control, will therefore manage both your schema versions and your migration scripts, giving you the best of both worlds. If you'd like to see early screenshots of this, please email me at David dot Atkinson at red-gate dot com. I'd really love to discuss your requirements so we can better design the tool.


In my experience there is always more to it than mere schema changes. If you split a column in two, or move a column to a separate table, or make other such changes, you need to migrate both the schema and the data.

No tool or script will migrate the actual data for you automatically. At most you'll get a diff of the schema, which your devs may find useful as a reminder/checklist when writing DB version migration scripts (sequences of create/alter/drop and insert/update/delete done in a single transaction).
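
To illustrate, a hand-written migration for something like splitting a full_name column might look roughly like the sketch below. The person table, the column names, and the naïve split on the first space are hypothetical, and sqlite3 again just stands in for the real driver; the point is that the UPDATE in the middle is the part no comparison tool can generate.

```python
import sqlite3                      # stand-in driver for the sketch

# Hypothetical migration: split person.full_name into first_name and last_name.
# A diff tool could produce the two ALTERs, but never the UPDATE that moves the data.
STATEMENTS = [
    "ALTER TABLE person ADD COLUMN first_name TEXT",
    "ALTER TABLE person ADD COLUMN last_name  TEXT",
    """UPDATE person
       SET first_name = substr(full_name, 1, instr(full_name, ' ') - 1),
           last_name  = substr(full_name, instr(full_name, ' ') + 1)""",   # naïve split on the first space
]

conn = sqlite3.connect("example.db", isolation_level=None)   # autocommit mode; transaction controlled explicitly
conn.execute("BEGIN")
for stmt in STATEMENTS:
    conn.execute(stmt)              # schema change and data move are committed together below
conn.execute("COMMIT")
```

In the versioned-scripts approach this would simply live in source control as the next numbered script, with the transaction syntax appropriate to the target engine.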
