DataTable and thread safety
I'm storing a DataTable in the ASP.NET Cache. The operations performed on that DataTable are:
- binding to a grid control (that 3rd-party grid manages its data source internally; after a postback its DataSource is null, so I assume that once the data is bound the grid no longer uses the source DataTable)
- removing rows from the DataTable (DataRow.Delete())
I added basic reader/writer locks around explicit work on that DataTable instance, but I wonder whether there are any other thread-safety issues with this solution. I suspect something could go wrong if one thread removes rows while the grid control is in the middle of data binding. If so, how can I synchronize access to the table so that no Delete calls are made while the grid is binding? Is there any combination of events between which I can put the AcquireWriterLock and ReleaseWriterLock calls?
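For context, the kind of bracketing I mean looks roughly like this (a sketch only; the ReaderWriterLockSlim, member names, and grid type are my placeholders, not actual code from my app):

```csharp
// Sketch: one shared lock guarding one cached table.
private static readonly ReaderWriterLockSlim tableLock = new ReaderWriterLockSlim();
private static DataTable cachedTable; // the instance held in Cache

// Readers hold the read lock for the entire bind,
// so no writer can delete rows mid-iteration.
public void BindGrid(GridView grid)
{
    tableLock.EnterReadLock();
    try
    {
        grid.DataSource = cachedTable;
        grid.DataBind();
    }
    finally
    {
        tableLock.ExitReadLock();
    }
}

// Writers take exclusive access while deleting.
public void DeleteRow(DataRow row)
{
    tableLock.EnterWriteLock();
    try
    {
        row.Delete();
    }
    finally
    {
        tableLock.ExitWriteLock();
    }
}
```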
Thanks, Pawel
If you are exposing the DataTable via data-binding, then forget it; you cannot make that thread-safe. Even if you wrap the DataView somehow (in a custom ITypedList), that doesn't do enough: data-binding makes assumptions about the data, in particular the IList etc. For example, it assumes the list isn't going to randomly change length in a thread-contended way in the middle of iterating the data or adding a row on the UI thread.
There is provision for changes on the same thread via events ... but not cross-thread.
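A common workaround is to never bind the shared table at all: hold the lock only long enough to take a copy, then hand the private copy to the grid. A minimal sketch, assuming the same lock object the writers use (variable names are illustrative):

```csharp
DataTable snapshot;
lock (tableSyncRoot)                 // same lock the writers take
{
    snapshot = sharedTable.Copy();   // deep copy: schema + rows
}
grid.DataSource = snapshot;          // grid iterates a table nobody else touches
grid.DataBind();
```

The lock is held only for the duration of Copy(), so contention is brief, and the binding itself runs entirely on thread-private data.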
As people mentioned in other answers:
- generally you only cache immutable data (or at least, treat it as immutable once it is there); otherwise it isn't a cache
- DO NOT expose common data to multiple requests if somebody is ever going to edit it
- If you are exposing the datatable via data-binding, then forget it; you cannot make that thread-safe
However, I really needed to cache a DataTable, for these reasons:
- the query that produces the result is really huge and time-consuming, and the page that issues it refreshes frequently. As a result the DB engine was getting busy. With caching, the DB engine gets only one call every 2 minutes
- the result of that query changes over time, but it's totally acceptable for the user to see the same result for, let's say, 2 minutes, so storing the data in the cache for 2 minutes is fine. There are also no other concerns like concurrency or optimistic/pessimistic offline locks...
However, when the user operates on the data they see, some changes are made to that data:
- each change has to be applied to the DB. Before implementing the cache, after the change was applied the application once again issued the huge query just to get a result with only slight differences
- now, with caching, the change is applied to the DB and to the cached DataTable, and that DataTable is bound once again to the data-bound control. Benefits: no need to make the WCF call, fetch the data with the huge query, and transfer the DataTable to the web application
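The two-minute caching described above can be sketched with an absolute expiration (the cache key and the LoadQueueFromDatabase helper are illustrative placeholders, not my actual code):

```csharp
// On a cache miss, run the expensive query at most once per 2 minutes.
DataTable table = (DataTable)HttpRuntime.Cache["AllocationQueue"];
if (table == null)
{
    table = LoadQueueFromDatabase();          // the huge, slow query
    HttpRuntime.Cache.Insert(
        "AllocationQueue",
        table,
        null,                                 // no cache dependency
        DateTime.UtcNow.AddMinutes(2),        // absolute 2-minute expiration
        System.Web.Caching.Cache.NoSlidingExpiration);
}
```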
This is how I implemented locking for that solution:
The cached data is stored in a singleton-like wrapper:
public class AllocationQueue
{
    private static object tableSyncRoot = new object();
    // ...
}
This is the only piece of code that modifies the cached DataTable:
internal void RemoveTaskRowFromAllocationQueue(Guid queueId, Guid taskId)
{
    var allocationQueueEntry = GetAllocationQueueEntry(queueId);
    var queueData = allocationQueueEntry.TaskIdIndexedView;
    lock (tableSyncRoot)
    {
        int rowIndex = queueData.Find(new object[] { taskId });
        queueData[rowIndex].Delete();
    }
}
This is the only piece of code that exposes data for data binding:
public DataTable GetAllocationQueue(Guid queueId, string filter)
{
    var allocationQueueEntry = GetAllocationQueueEntry(queueId);
    lock (tableSyncRoot)
    {
        var rows = allocationQueueEntry.Table.Select(filter);
        if (rows.Length > 0)
        {
            return rows.CopyToDataTable<DataRow>();
        }
    }
    return null;
}
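Binding then looks something like this (the grid variable, singleton accessor, and filter string are my illustrative assumptions). Because CopyToDataTable returns a private copy, later deletes on the cached table cannot race with the grid's iteration:

```csharp
// Bind the private copy returned by GetAllocationQueue, never the cached table itself.
var queue = AllocationQueue.Instance;   // assumed singleton accessor
grid.DataSource = queue.GetAllocationQueue(queueId, "State = 'Pending'");
grid.DataBind();
```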
Thread-safe & works like a charm (am I right? :) ). But it's very specific to my requirements.
Here is a simple way to add rows in a thread-safe manner, where dt is my DataTable and dr is my DataRow:
lock (dt.Rows.SyncRoot)
{
dt.Rows.Add(dr);
}