Keeping databases up to date while developing thick modules for a multi-application Kohana environment
I'm currently planning out how to maintain several thin projects which share a bunch of thick modules in Kohana.
The plan is to have some modules, like "blog", "forum" etc., and then a bunch of separate projects which share some of those modules.
As you can imagine, most of the modules will have a database dependency.
So, when a module is deployed to a project, the back-end database must be updated accordingly. (Also note that when an update of a module is deployed, the database might have to be altered.)
A good solution to this is migrations. I've seen some implementations of them for Kohana, but almost all of them are application-specific and cannot be used with mere modules. (Please correct me if I'm wrong.)
What I really want to know is: Is there a good way to automatically update the back-end database on deploy with Kohana?
The first thing you need to address is which migrations module you are going to use. There's a nice one built on top of a module called Minion; see the tasks-migrations repository. It lets you specify migration locations, which are directories inside your migrations directory:
    classes/migrations/[location]
Passing the argument --location=module will only execute the migrations inside that directory; that's how you'd separate and run per-module migrations. If you leave --location off, then all migrations are run (need to double check this).
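To make that concrete, here is a rough sketch of what a per-module migration file might look like under that layout. The Minion_Migration_Base base class, the class naming, the up()/down() signatures and the example file path are assumptions based on how Minion-style migration modules are typically laid out, so check them against the version of tasks-migrations you actually install:

    <?php defined('SYSPATH') or die('No direct script access.');

    // e.g. modules/blog/classes/migrations/blog/001_create_posts.php
    // Base class, class name and method signatures are assumptions based on
    // typical Minion migration modules -- verify against your installed version.
    class Migration_Blog_001_Create_Posts extends Minion_Migration_Base {

        // Apply the change: create the table this module needs.
        public function up(Kohana_Database $db)
        {
            $db->query(NULL, 'CREATE TABLE blog_posts (
                id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
                title VARCHAR(255) NOT NULL,
                body TEXT NOT NULL
            )');
        }

        // Roll the change back.
        public function down(Kohana_Database $db)
        {
            $db->query(NULL, 'DROP TABLE blog_posts');
        }
    }

Because each module ships its own classes/migrations/[location] directory, deploying the module to a project brings its migrations along with it.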
The neat thing about Minion is that each location can use a different database group if you specify it, which allows for a lot of flexibility.
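As a minimal sketch of what that could look like, here is a standard Kohana config/database.php with one group per back end; the group names and connection details are placeholders for illustration only:

    <?php defined('SYSPATH') or die('No direct script access.');

    // application/config/database.php -- one group per back end.
    return array(
        'default' => array(
            'type'       => 'mysql',
            'connection' => array(
                'hostname'   => 'localhost',
                'database'   => 'main_app',
                'username'   => 'app',
                'password'   => 'secret',
                'persistent' => FALSE,
            ),
            'table_prefix' => '',
            'charset'      => 'utf8',
        ),
        // A separate group that the "blog" migration location could be
        // pointed at, so that module's tables live in their own database.
        'blog' => array(
            'type'       => 'mysql',
            'connection' => array(
                'hostname'   => 'localhost',
                'database'   => 'blog_module',
                'username'   => 'app',
                'password'   => 'secret',
                'persistent' => FALSE,
            ),
            'table_prefix' => '',
            'charset'      => 'utf8',
        ),
    );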
The next thing you have to deal with is deployment. I personally do this with Capistrano, so on a new deployment I'd run the Minion command to bring the database up to the latest version.
Take a look at these two code samples to see how you might integrate running migrations with Capistrano.
- http://pastie.org/459506
- https://gist.github.com/410363