
Deploying database changes with Python

I'm wondering if someone can recommend a good pattern for deploying database changes via Python.

In my scenario, I've got one or more PostgreSQL databases and I'm trying to deploy a code base to each one. Here's an example of the directory structure for my SQL scripts:


my_db/
    main.sql
    some_directory/
        foo.sql
        bar.sql
    some_other_directory/
        baz.sql

Here's an example of what's in main.sql


/* main.sql has the following contents: */
BEGIN TRANSACTION;
\i some_directory/bar.sql
\i some_directory/foo.sql
\i some_other_directory/baz.sql
COMMIT;

As you can see, main.sql defines a specific order of operations and a transaction for the database updates.

I've also got a Python / Twisted service monitoring SVN for changes in this DB code, and I'd like to automatically deploy the code whenever new revisions appear in the SVN repository.

Can someone recommend a good pattern to use here?

Should I be parsing each file? Should I be shelling out to psql? ...


What you're doing is actually a decent approach, as long as you control all the servers and they're all PostgreSQL servers.
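
If you keep the plain-SQL approach, shelling out to psql from your deployment service is the simplest way to run main.sql. A minimal sketch, assuming psql is on the PATH, credentials come from ~/.pgpass or PG* environment variables, and the host/user names below are placeholders:

import subprocess

def deploy(db_name, sql_dir="my_db", host="localhost", user="deployer"):
    """Run the top-level SQL script for one database via psql."""
    subprocess.check_call(
        [
            "psql",
            "-h", host,
            "-U", user,
            "-d", db_name,
            "-v", "ON_ERROR_STOP=1",  # abort on the first failing statement
            "-f", "main.sql",         # main.sql handles the \i includes and the transaction
        ],
        cwd=sql_dir,  # so the relative \i paths inside main.sql resolve correctly
    )

if __name__ == "__main__":
    for db in ("my_db",):  # repeat for each target database
        deploy(db)

The -v ON_ERROR_STOP=1 flag makes psql exit non-zero on the first error, so check_call raises and the deployment stops instead of silently committing a partial run.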

A more general approach is to keep a directory of "migrations": usually classes with apply() and undo() methods that do the actual work against your database, often backed by abstractions like .create_table() that generate the DDL specific to whatever RDBMS you're using.

Generally, you have some naming convention that ensures the migrations run in the order they were created.
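
To make that concrete, here's a minimal, hypothetical sketch of the pattern (the Migration base class, the apply/undo methods, and the version numbers are invented for illustration, not taken from any particular library):

class Migration(object):
    """One schema change; subclasses implement apply() and undo()."""
    version = None

    def apply(self, conn):
        raise NotImplementedError

    def undo(self, conn):
        raise NotImplementedError

class AddBazTable(Migration):
    version = 2

    def apply(self, conn):
        conn.cursor().execute("CREATE TABLE baz (id serial PRIMARY KEY)")

    def undo(self, conn):
        conn.cursor().execute("DROP TABLE baz")

def upgrade(conn, migrations, current_version):
    """Apply, in version order, every migration newer than current_version."""
    for m in sorted(migrations, key=lambda m: m.version):
        if m.version > current_version:
            m.apply(conn)
            current_version = m.version
    conn.commit()
    return current_version

Real tools derive the ordering from file names (0001_add_baz_table.py, 0002_..., and so on) and record the last applied version in a table in the database itself, so repeated runs are no-ops.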

There's a migration library for Python called South, though it appears to be geared specifically toward Django development: http://south.aeracode.org/docs/about.html


We just integrated sqlalchemy-migrate, which has some pretty Rails-like conventions but the power of SQLAlchemy behind it. It's shaping up to be a really good tool, but it does have some rough edges. Integration, at least, is pretty seamless.
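
For what it's worth, sqlalchemy-migrate keeps its migrations in a versioned repository and is usually driven either from its migrate command-line tool or from migrate.versioning.api. A rough sketch from memory of that API (the URL and repository path are placeholders, so double-check against the current docs):

from migrate.versioning import api

DB_URL = "postgresql://deployer@localhost/my_db"  # placeholder connection URL
REPO = "db_repository"                            # path to the migration repository

# One-time setup: create the repository and put the database under version control.
# api.create(REPO, "my_db repository")
# api.version_control(DB_URL, REPO)

# On each deploy: upgrade the database to the latest migration in the repository.
api.upgrade(DB_URL, REPO)
print("Now at schema version %s" % api.db_version(DB_URL, REPO))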

