Using 'HttpContext.Current.Cache' safely
I am using the Cache in a web service method like this:
var pblDataList = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");
if (pblDataList == null)
{
    var PBLData = dc.ExecuteQuery<blabla>(@"SELECT blabla");
    pblDataList = PBLData.ToList();
    HttpContext.Current.Cache.Add("pblDataList", pblDataList, null,
        DateTime.Now.Add(new TimeSpan(0, 0, 15)),
        Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
}
But I wonder: is this code thread-safe? The web service method is called by multiple requesters, and more than one requester may attempt to retrieve the data and add it to the cache at the same time while the cache is empty.
The query takes 5 to 8 seconds. Would introducing a lock statement around this code prevent any possible conflicts? (I know that multiple queries can run simultaneously, but I want to be sure that only one query is running at a time.)
The cache object is thread-safe, but HttpContext.Current will not be available from background threads. This may or may not apply to you here; it's not obvious from your code snippet whether you are actually using background threads. But if you are now, or decide to at some point in the future, you should keep this in mind.
If there's any chance that you'll need to access the cache from a background thread, then use HttpRuntime.Cache instead.
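For example, the lookup from the question could be rewritten like this (a sketch using the question's placeholder type `blabla`; `HttpRuntime.Cache` refers to the same underlying cache instance, it just doesn't depend on there being a current request):

```csharp
// Works on background threads as well as request threads,
// because it does not go through HttpContext.Current.
var pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");
```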
In addition, although individual operations on the cache are thread-safe, sequential lookup/store operations are obviously not atomic. Whether or not you need them to be atomic depends on your particular application. If it could be a serious problem for the same query to run multiple times, i.e. if it would produce more load than your database is able to handle, or if it would be a problem for a request to return data that is immediately overwritten in the cache, then you would likely want to place a lock around the entire block of code.
However, in most cases you would really want to profile first and see whether or not this is actually a problem. Most web applications/services don't concern themselves with this aspect of caching because they are stateless and it doesn't matter if the cache gets overwritten.
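If you do decide that you need the lookup/store pair to behave atomically, the usual pattern is double-checked locking around the block. A sketch based on the question's code follows; the lock field name and method wrapper are illustrative, and `dc` is the data context from the question:

```csharp
private static readonly object CacheLock = new object();

public List<blabla> GetPblData()
{
    // First check without the lock, so the common case (cache hit) stays cheap.
    var pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");
    if (pblDataList == null)
    {
        lock (CacheLock)
        {
            // Re-check inside the lock: another thread may have populated
            // the cache while this one was waiting to acquire the lock.
            pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");
            if (pblDataList == null)
            {
                pblDataList = dc.ExecuteQuery<blabla>(@"SELECT blabla").ToList();
                HttpRuntime.Cache.Add("pblDataList", pblDataList, null,
                    DateTime.Now.Add(new TimeSpan(0, 0, 15)),
                    Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
            }
        }
    }
    return pblDataList;
}
```

With this shape, at most one thread runs the 5-8 second query while the cache is empty; every other concurrent request blocks briefly on the lock and then reads the freshly cached list.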
You are correct. The retrieving and adding operations are not being treated as an atomic transaction. If you need to prevent the query from running multiple times, you'll need to use a lock.
(Normally this wouldn't be much of a problem, but in the case of a long running query it can be useful to relieve strain on the database.)
I believe the Add should be thread-safe - i.e. it won't error if Add gets called twice with the same key - but obviously the query might execute twice.
Another question, however, is whether the data itself is thread-safe. There is no guarantee that each List<blabla> is isolated - that depends on the cache provider. The in-memory cache provider stores the objects directly, so there is a risk of collisions if any of the threads edit the data (add/remove/swap items in the list, or change properties of one of the items). With a serializing provider, however, you should be fine. Of course, that then demands that blabla is serializable...
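With the in-memory provider, one defensive option is to hand each caller its own copy of the list instead of the shared cached instance (a sketch; whether the copy cost is acceptable depends on the list size):

```csharp
var cached = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");
// new List<T>(source) copies the list structure, so a caller can
// add/remove/sort without affecting other requests. Note that the
// elements themselves are still shared: mutating a property of an
// item would still be visible to every other request.
var pblDataList = cached == null ? null : new List<blabla>(cached);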