Issue with huge table archive
I have been assigned to move data from some huge tables (around 2 million records each) into identical history tables. But while my query runs, the transaction log file grows too large and causes problems. I have tried the following:
- Each table being archived is handled in its own transaction (see the sketch below)
- I did not specify a primary key on the history tables (could this be a problem?)
- All of the transactions are written in a single stored procedure
Can anyone tell me whether there is an issue with my approach, or whether this is simply not the right way to do it?
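For reference, a minimal sketch of the kind of procedure described above; the table names (Orders, OrdersHistory) are placeholders:

```sql
-- Simplified sketch of the archiving procedure described above.
-- Table names are placeholders.
CREATE PROCEDURE dbo.ArchiveTables
AS
BEGIN
    SET NOCOUNT ON;

    -- Each table is copied and purged in its own transaction.
    BEGIN TRANSACTION;
        INSERT INTO dbo.OrdersHistory
        SELECT * FROM dbo.Orders;

        DELETE FROM dbo.Orders;
    COMMIT TRANSACTION;

    -- ...repeated for each remaining table...
END;
```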
You can minimise logging by using table locks with a bulk import.
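For example, under the BULK_LOGGED or SIMPLE recovery model, an INSERT ... SELECT with a TABLOCK hint on the target can be minimally logged when the target is a heap or an empty table (table names here are placeholders; see the linked article for the exact conditions):

```sql
-- Minimally logged copy into the history table.
-- Minimal logging requires BULK_LOGGED or SIMPLE recovery and,
-- broadly, a heap or empty target.
INSERT INTO dbo.OrdersHistory WITH (TABLOCK)
SELECT *
FROM dbo.Orders;
```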
Lots of great info is found here:
http://msdn.microsoft.com/en-us/library/ms190422.aspx
Some pointers from the article:
- change the database recovery model to BULK_LOGGED
- apply indexes after the import, not before
- import in batches
- take a log backup after each batch (a sketch combining these steps follows this list)
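Putting those pointers together, here is one way the batch loop might look; the database name, batch size, backup path, and table names are all assumptions, and it uses DELETE ... OUTPUT INTO to move each batch in a single statement:

```sql
-- Assumes the recovery model was already switched, e.g.:
-- ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

DECLARE @BatchSize int = 50000;

WHILE 1 = 1
BEGIN
    BEGIN TRANSACTION;

    -- Move one batch from the live table into the history table
    -- (requires a matching column layout in OrdersHistory).
    DELETE TOP (@BatchSize)
    FROM dbo.Orders
    OUTPUT DELETED.* INTO dbo.OrdersHistory;

    IF @@ROWCOUNT = 0
    BEGIN
        ROLLBACK TRANSACTION;  -- nothing left to move
        BREAK;
    END;

    COMMIT TRANSACTION;

    -- Back up the log after each batch so the space can be
    -- reused instead of the file growing.
    BACKUP LOG MyDb TO DISK = N'D:\Backups\MyDb_log.trn';
END;
```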