How to Speed Up the Effects of Foreign Keys
We have a highly normalized database. When I insert and delete data in tables that have foreign keys to tables holding huge amounts of data, the deletes take forever, not to mention the locks that come with them. What can I do so that foreign keys remain usable but don't hurt the performance of my bulk inserts and deletes? All foreign keys are indexed.
Basically, foreign keys slow down my bulk-data scenarios significantly. Our clients import hundreds of thousands of records a day, and we also delete large amounts of data every day. Foreign keys provide data integrity, but at the same time they significantly reduce the speed of these processes.
Is there any way to sidestep the side effects of a normalized schema? How have you gotten around these issues?
I am using SQL Server 2005.
You have not given enough information about the problems you are trying to solve using a database.
It sounds to me like the problem is not the foreign keys themselves, but that the referenced keys are not static and/or that you are using cascading deletes.
Inserts may require normalization and staged loads so that the primary-key rows are loaded before the rows whose foreign keys refer to them, so there is a little more work there. Deletes (for range purges) can sometimes be improved with partitions or scheduled batches, using soft deletes during production hours.
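As a rough sketch of both ideas (table, column, and staging names here are hypothetical, not from your schema): load the parent rows before the child rows so the FK checks succeed, and break large purges into small batches so locks are held only briefly. `DELETE TOP` is available in SQL Server 2005:

```sql
-- Hypothetical schema: dbo.Orders.CustomerID references dbo.Customers.CustomerID.

-- Staged load: insert the referenced (parent) rows first, then the
-- rows that carry the foreign keys.
INSERT INTO dbo.Customers (CustomerID, Name)
SELECT CustomerID, Name
FROM dbo.StagingCustomers;

INSERT INTO dbo.Orders (OrderID, CustomerID, OrderDate)
SELECT OrderID, CustomerID, OrderDate
FROM dbo.StagingOrders;

-- Batched delete: each DELETE TOP is its own small transaction,
-- so locks are released between batches instead of being held
-- for the whole purge.
WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.Orders
    WHERE OrderDate < '20050101';

    IF @@ROWCOUNT = 0 BREAK;
END;
```

The batch size (5000 here) is a tuning knob: small enough to avoid lock escalation to a table lock, large enough that the loop doesn't run forever.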
So the question is: why is your data churning so much? What is your specific usage scenario, and is a normalized model a good fit for it? Typically in OLTP environments, data is accretive and updated in place. INSERTs don't often do much blocking, and DELETEs will only block operations on the same data entities involved in the delete (which is exactly what you want). In an OLAP environment where data is loaded and unloaded regularly for analysis, a relational model isn't always a good fit, and a dimensional model may be more appropriate, since fact loading and unloading will generally not block across different time periods.
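In that OLAP/dimensional case, one SQL Server 2005 pattern for range purges (sketched here with hypothetical names) is to partition the fact table by date and switch an old partition out to an empty, identically structured table. The switch is a metadata-only operation, not a row-by-row delete:

```sql
-- Hypothetical monthly partitioning of a fact table by LoadDate.
CREATE PARTITION FUNCTION pfByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('20050101', '20050201', '20050301');

CREATE PARTITION SCHEME psByMonth
AS PARTITION pfByMonth ALL TO ([PRIMARY]);

-- dbo.FactSales_Purge must have the same structure as dbo.FactSales,
-- be empty, and sit on the same filegroup as the switched partition.
ALTER TABLE dbo.FactSales
SWITCH PARTITION 1 TO dbo.FactSales_Purge;

-- The old month's rows can now be discarded in one cheap operation.
TRUNCATE TABLE dbo.FactSales_Purge;
```

Note the connection to your original question: a table that is the target of foreign key references cannot be switched out, so this pattern fits fact tables in a dimensional model rather than heavily referenced OLTP tables.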