Large tables of static data with DBGhost

We are thinking of restructuring our database development and deployment processes using DBGhost. We want to move away from a central development database and bring the database under source control.

One of the problems we have is a big table of static data (containing translated language strings); it has close to 200K rows.

I know that our best solution is to move these strings into resource files, but until we implement that, will DBGhost be able to maintain all this static data and generate our development and deployment databases in a short time? And if not, is there a good alternative for filling this table whenever we need to?


This is an older question with an accepted answer, but I have some different input into this.

We use DBGhost and we have lots of static table data, although the largest is only about 20K rows, rather than 200K rows.

DBGhost has a feature to script data (as a series of insert statements). We used that to export our static data into scripts and put those scripts under version control. We tweaked those scripts to clear the data before adding the data back in, so we can use a single script to "reset" the static data for a table. This addition was for our specific needs, and is not the only way that you could handle static data with DBGhost.
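As a rough illustration of that "reset" pattern, a tweaked static-data script might look like the sketch below. The table and column names (`dbo.LanguageStrings` and its columns) are hypothetical, not from the original post; DBGhost's data-scripting feature generates the INSERT statements, and the DELETE is the manual addition described above.

```sql
-- Hypothetical "reset" script for a static language-strings table.
-- Table and column names are illustrative only.
SET NOCOUNT ON;

-- Clear the existing static data so the script can be re-run safely.
DELETE FROM dbo.LanguageStrings;

-- Re-insert the static rows (as generated by DBGhost's data-scripting feature).
INSERT INTO dbo.LanguageStrings (StringId, LanguageCode, StringText)
VALUES (1, 'en', 'Save');
INSERT INTO dbo.LanguageStrings (StringId, LanguageCode, StringText)
VALUES (1, 'fr', 'Enregistrer');
-- ... one INSERT per row, for the full static data set
```

Because the script clears the table first, running it as an after-build or after-sync ad-hoc script always leaves the table in a known state, regardless of what data was there before.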

The "build from scripts" and "sync" processes both support running ad-hoc scripts before and after the process. We added the static data scripts as ad-hoc scripts to run after the build/sync.

DBGhost also supports data synchronization as part of its sync process: the sync can be configured to synchronize data on selected tables. With this approach, the build process adds the data via the scripts, and the sync process then keeps the data in those tables up to date automatically, so you would not need to modify the scripts as we did.


Would you be able to take a look at SQL Source Control? We've just added static data support and are looking for feedback prior to the full release.

http://www.red-gate.com/MessageBoard/viewtopic.php?t=12298

Would you be able to explain why you're moving away from a central database development model?


"DBG is not really designed for moving massive amounts of data."

That's from an email received from Innovartis regarding the same question as yours. You've probably found this out by now, though!


Maybe they didn't offer an evaluation when you asked this, though I'm not sure that's true. The only way you will know is to test it and see how it works.

http://www.innovartis.co.uk/evaluation.aspx
