
SQL Server 2000: Deleting and inserting huge amounts of data

We have a SQL Server 2000 instance in place with a table of about 6,000,000 records, and one column has a PDF stored in it.

Every month, on a specific day, we delete about 250,000 records and insert about 250,000. After that there are no updates, only reads.

The question is: is it better to delete 500 records, insert 500 records, then delete, then insert, and so on, alternating between the two?

Or to finish all 250,000 deletes first and then do all 250,000 inserts, each still done in batches of 500?

Which option is optimal and has the best memory management?

+1 for anyone who points me to an MSDN article or something similar.


http://msdn.microsoft.com/en-us/library/aa178096%28v=sql.80%29.aspx
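
One practical note: a common way to implement the "small batches" approach the question describes is a loop that deletes a capped number of rows per statement. The table and column names below (dbo.Documents, DocDate, the cutoff date) are hypothetical, just to sketch the shape of it; on SQL Server 2000, SET ROWCOUNT caps the rows each DELETE touches, so every transaction, and the log growth, stays small:

    -- Hypothetical schema: dbo.Documents(DocId INT, DocDate DATETIME, PdfBlob IMAGE)
    -- Purge the old month's rows 500 at a time so each transaction stays small.
    SET ROWCOUNT 500

    WHILE 1 = 1
    BEGIN
        DELETE FROM dbo.Documents
        WHERE DocDate < '20100101'   -- hypothetical cutoff for the rows being purged

        IF @@ROWCOUNT = 0 BREAK      -- nothing left to delete
    END

    SET ROWCOUNT 0                   -- restore the default (no limit)

The monthly inserts can be chunked the same way from a staging table, so neither operation holds one huge transaction open.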


Since you don't mention it, it's worth underlining the standard practice when inserting, deleting, or updating a very large volume of data on an RDBMS: drop all indexes before applying the changes and reapply them afterwards (a sketch follows the two reasons below).

Two reasons for this.

  1. It's faster for the system to rebuild the indexes in one go rather than on a record-by-record basis (less head movement on the disk).

  2. If you rebuild the index from scratch, subsequent accesses through it are likely to be faster, because the index tree is more likely to be well balanced.
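
As a rough illustration of that drop/reload/recreate pattern (the index and table names here are hypothetical, not from the original post):

    -- 1. Drop the nonclustered index before the monthly load.
    DROP INDEX Documents.IX_Documents_DocDate

    -- 2. Run the large delete and insert while the table is unindexed.
    --    (the DELETE / INSERT ... SELECT statements go here)

    -- 3. Rebuild the index in a single pass afterwards.
    CREATE NONCLUSTERED INDEX IX_Documents_DocDate
        ON Documents (DocDate)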


You might want to consider partitioning. If you organize the data so that you can simply drop one partition at a time, deleting will take milliseconds. See http://msdn.microsoft.com/en-us/library/aa902650(v=sql.80).aspx
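
On SQL Server 2000 this would mean a partitioned view over per-month tables rather than native table partitioning. A rough sketch with hypothetical names (one member table per month, each carrying a CHECK constraint on the partitioning column so the optimizer can skip the other months):

    -- One table per month; the CHECK constraint defines the partition range.
    CREATE TABLE dbo.Documents_201001 (
        DocId   INT      NOT NULL,
        DocDate DATETIME NOT NULL
            CHECK (DocDate >= '20100101' AND DocDate < '20100201'),
        PdfBlob IMAGE,
        CONSTRAINT PK_Documents_201001 PRIMARY KEY (DocId, DocDate)
    )
    -- ...repeat for the other months...
    GO

    -- A view UNIONs the monthly tables into one logical table for readers.
    CREATE VIEW dbo.AllDocuments
    AS
        SELECT DocId, DocDate, PdfBlob FROM dbo.Documents_201001
        UNION ALL
        SELECT DocId, DocDate, PdfBlob FROM dbo.Documents_201002
    GO

Removing a month is then a DROP TABLE plus an ALTER VIEW to take that branch out of the UNION, instead of a 250,000-row DELETE.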
