
Strategy for handling daily batch jobs when required to run 24x5

We have an application that we are attempting to move to 24x5 operation. We have a few batch jobs that import data feeds overnight to apply customer changes for the following day. The job is structured so that the entire table (several million rows) is wiped and rebuilt, and the table is unavailable for about two hours while it runs.

It's large enough that it's not feasible to load it into memory before updating. I wanted to know the best approach to making the data available some other way until the batch job is finished.

Dumping the table to disk in XML format? Copying the table before the batch job?

We have a DR stack, but I think the way it's set up it will try to sync in real time, so it will be unavailable as well during the job execution.

The database is DB2 and the front end is IBM WebSphere.


First, evaluate whether you actually need the kill-and-fill strategy. Check whether you can instead apply just the appropriate deltas for the additions and changes and get away with that.

Kill-and-fill puts a huge load on the database just to release and recapture all of those pages.
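If the nightly feed can be staged, DB2's MERGE statement can apply the deltas in one pass while the table stays online. A minimal sketch, assuming hypothetical table and column names (CUSTOMER, CUSTOMER_FEED, CUST_ID); yours will differ:

```sql
-- Assumes the feed has been loaded into a staging table CUSTOMER_FEED
-- and the live table CUSTOMER is keyed on CUST_ID (hypothetical names).
MERGE INTO CUSTOMER AS t
USING (SELECT CUST_ID, NAME, STATUS FROM CUSTOMER_FEED) AS s
    ON t.CUST_ID = s.CUST_ID
WHEN MATCHED THEN
    -- Existing customer: apply the overnight changes in place.
    UPDATE SET t.NAME   = s.NAME,
               t.STATUS = s.STATUS
WHEN NOT MATCHED THEN
    -- New customer: insert the row.
    INSERT (CUST_ID, NAME, STATUS)
    VALUES (s.CUST_ID, s.NAME, s.STATUS);
```

Committing the merge in batches (for example, by ranges of CUST_ID) keeps the lock list small so readers are never blocked for long.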

If you can do a delta and only need to keep a few days' worth of data, you can use partitioning on the table and drop partitions as they go out of scope instead of having to delete records, as in the sketch below.
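In DB2 this maps to range partitioning plus DETACH PARTITION, which turns the expired partition into a standalone table you can drop without touching live rows. A rough sketch, assuming a hypothetical CUSTOMER_HISTORY table partitioned by a LOAD_DATE column:

```sql
-- Hypothetical table partitioned by the date the feed was loaded.
CREATE TABLE CUSTOMER_HISTORY (
    CUST_ID   INTEGER NOT NULL,
    LOAD_DATE DATE    NOT NULL,
    STATUS    VARCHAR(10)
)
PARTITION BY RANGE (LOAD_DATE) (
    PARTITION P20230101 STARTING ('2023-01-01') ENDING ('2023-01-02') EXCLUSIVE,
    PARTITION P20230102 STARTING ('2023-01-02') ENDING ('2023-01-03') EXCLUSIVE
);

-- Roll the oldest day out of scope: detach it into a standalone
-- table, then drop that table instead of deleting millions of rows.
ALTER TABLE CUSTOMER_HISTORY
    DETACH PARTITION P20230101 INTO CUSTOMER_HISTORY_OLD;
DROP TABLE CUSTOMER_HISTORY_OLD;
```

Note that in recent DB2 releases the detach completes asynchronously, so the DROP may have to wait until the detach has finished.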

If you must stick with kill-and-fill, use your original table (let's call it BatchTable) to create a new table called BatchTable_Process with the same structure. Use BatchTable_Process to process the batch job's data, and once processing is done, swap the table names.
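A sketch of that swap in DB2, using the table names above. Keep in mind that RENAME TABLE invalidates dependent packages and marks dependent views inoperative, and that indexes must already exist on the process table before the swap:

```sql
-- Build a shadow table with the same structure as the live one.
CREATE TABLE BatchTable_Process LIKE BatchTable;

-- Run the two-hour rebuild against BatchTable_Process here;
-- the live BatchTable stays fully available in the meantime.
-- (Create the same indexes on BatchTable_Process before swapping.)

-- The swap itself: two quick renames instead of a two-hour outage.
RENAME TABLE BatchTable TO BatchTable_Old;
RENAME TABLE BatchTable_Process TO BatchTable;

-- Keep BatchTable_Old until the new data is verified, then:
DROP TABLE BatchTable_Old;
```

Keeping yesterday's table around as BatchTable_Old also gives you an instant fallback: if the feed turns out to be bad, two more renames put the old data back.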

