What should I set the Transaction Isolation Level to for a conversion of a large number of records?

I am using SQL Server 2008 R2 and I have a conversion written in VB.NET that takes several hours to run and converts millions of records. There are times when I need to cancel the process, which rolls back the transaction. When that happens, I don't want the rollback to take several hours like it currently does.

Therefore, I removed the transaction altogether so that there is no rollback time, but that leaves all records in whatever state they were in when the process was cancelled. However, I have one table with millions of records in it, and I delete all of them at the start of the conversion. It takes hours just to delete those records. When I look at the row count of the table it says zero, as if it deleted them all, but it is waiting on something and I can't figure out what. I am hoping that putting the DELETE command in a transaction with an isolation level of READ UNCOMMITTED will help.
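For reference, this is roughly what I was planning to try (Conversion_Staging is a placeholder for my real table name):

    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

    BEGIN TRANSACTION;
        -- Conversion_Staging stands in for the real table name.
        DELETE FROM Conversion_Staging;
    COMMIT TRANSACTION;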

I wanted to get an opinion on best practice for transaction isolation levels on large amounts of data before I attempt this, because the attempt itself will take several hours. Also, do you think that if I put the whole conversion in a READ UNCOMMITTED transaction, cancelling the process would take no time?


Isolation levels have nothing to do with rolling back or committing transactions; they exist to give you control over concurrency anomalies such as lost updates, dirty reads, non-repeatable reads, and phantom reads. If something takes 2 hours to do, it will probably take about 2 hours to roll back.

Recovery models, on the other hand, do matter: the SIMPLE or BULK_LOGGED recovery model will be faster for inserting data when using bcp or BULK INSERT. See the sketch below.
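A minimal sketch of switching the recovery model around a bulk load, assuming a database named ConversionDB (a placeholder):

    -- ConversionDB is a placeholder database name.
    -- BULK_LOGGED minimally logs bulk operations such as BULK INSERT.
    -- Take a log backup before and after switching, since point-in-time
    -- restore is not possible across a bulk-logged interval.
    ALTER DATABASE ConversionDB SET RECOVERY BULK_LOGGED;

    -- ... run bcp or BULK INSERT here ...

    ALTER DATABASE ConversionDB SET RECOVERY FULL;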

Is your code written in a set-based fashion, or are you looping with evil cursors? A set-based statement replaces an entire cursor loop, as sketched below.
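For illustration, this is the set-based shape to aim for; the table and column names are made up:

    -- One set-based statement replaces a row-by-row cursor loop.
    -- Customers, Status, and LastOrderDate are hypothetical names.
    UPDATE Customers
    SET    Status = 'Inactive'
    WHERE  LastOrderDate < '20100101';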

Do you have triggers on the table that are unoptimized?

To delete all rows, truncate the table, since truncation is minimally logged. If you don't want to delete everything, batch the delete process in batches of, for example, 10000 rows, so each transaction stays short; see the sketch below.
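A minimal sketch of both options, assuming a placeholder table named BigTable:

    -- Option 1: remove every row with minimal logging. TRUNCATE resets any
    -- identity column and fails if the table is referenced by foreign keys.
    TRUNCATE TABLE BigTable;

    -- Option 2: delete in batches so each transaction (and any rollback)
    -- stays short. Drop the WHERE clause to delete everything.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (10000) FROM BigTable
        WHERE CreatedDate < '20100101';  -- hypothetical filter

        IF @@ROWCOUNT = 0 BREAK;
    END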


I have one table with millions of records in it. I delete all records to start. It takes hours just to delete all records.

TRUNCATE TABLE xxx -- should not take more than a fraction of a second, regardless of table size.

I wanted to get an opinion on best practice for transaction isolation levels on large amounts of data before I attempt this, because the attempt itself will take several hours.

That depends on the process, the amount of data, and the requirements; there is no generic answer. Having non-desktop-level hardware helps, though. I have no problem deleting 2-5 million rows in minutes... on a database server (i.e. not 1-2 discs but quite a few).

Also, do you think that if I put the whole conversion in a READ UNCOMMITTED transaction, cancelling the process would take no time?

No. The rollback takes time because SQL Server is not optimized for rollback and has to undo all the changes made so far.
