One hashtable cache item holding similar data values, or a separate cache item for each data value – which is the more efficient approach?
I have broadly two classes of data caching requirements, based on data size: 1) Very small data (2-30 characters) – for example, the type code for a given entityId. The system is built around a parent-child entity hierarchy, and actions are authorized against values constructed in combination with the entity type code. Caching these type codes for the different entities saves a db fetch each time. 2) Medium/large data – general data such as product descriptions and pages.
I'm confused as to which approach is better suited for the first class of data. I can cache it like this:
HttpRuntime.Cache.Insert("typeCode" + entityId, entityTypeCode);
or like this:
// assumes the dictionary has already been added to the cache, e.g. at application startup
Dictionary<int, string> etCodes =
    (Dictionary<int, string>)HttpRuntime.Cache["typeCode"];
etCodes[entityId] = entityTypeCode;
Clearly, the second approach saves me from creating a separate cache item for each entityId. Or is having the Cache populated with many items of such small size okay?
Which of these approaches is better in terms of performance and overhead?
Personally, I would take your second approach of one single object, and use a custom object instead of a Dictionary. This would enable me to later control more aspects, such as expiration of individual items within the object, or to change the implementation.
I would do it similar to this:
public class MyCacheObject
{
    public static MyCacheObject Instance
    {
        get
        {
            // ...omitted locking here for simplification...
            var o = HttpRuntime.Cache["MyCacheObject"] as MyCacheObject;
            if ( o == null )
            {
                o = new MyCacheObject();
                HttpRuntime.Cache["MyCacheObject"] = o;
            }
            return o;
        }
    }

    public object GetEntity( string id, string code )
    {
        // ...
    }

    public void SetEntity( object entity, string id, string code )
    {
        // ...
    }

    // ...
}
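The omitted method bodies could, for example, be backed by a plain Dictionary inside the object. A minimal sketch, assuming entities are keyed by the combination of type code and id – the composite key and the locking field are my own illustration, not part of the original answer:

using System.Collections.Generic;
using System.Web;

public class MyCacheObject
{
    // One inner dictionary for all entities; the whole object
    // lives in HttpRuntime.Cache as a single cache entry.
    private readonly Dictionary<string, object> _entities =
        new Dictionary<string, object>();
    private readonly object _sync = new object();

    public static MyCacheObject Instance
    {
        get
        {
            var o = HttpRuntime.Cache["MyCacheObject"] as MyCacheObject;
            if ( o == null )
            {
                o = new MyCacheObject();
                HttpRuntime.Cache["MyCacheObject"] = o;
            }
            return o;
        }
    }

    public object GetEntity( string id, string code )
    {
        lock ( _sync )
        {
            object entity;
            _entities.TryGetValue( code + ":" + id, out entity );
            return entity; // null if nothing is cached yet
        }
    }

    public void SetEntity( object entity, string id, string code )
    {
        lock ( _sync )
        {
            _entities[code + ":" + id] = entity;
        }
    }
}

Usage would then be a single line per lookup, e.g. MyCacheObject.Instance.SetEntity(entityTypeCode, entityId.ToString(), "typeCode"), with GetEntity(entityId.ToString(), "typeCode") to read it back.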
If you have a custom base class for the entities, the GetEntity and SetEntity methods could be optimized further, e.g. as in the sketch below.
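To illustrate that last point, here is a hedged sketch assuming a hypothetical EntityBase class – the name and its properties are my own invention, not from the original answer:

// Hypothetical base class that carries its own key information.
public abstract class EntityBase
{
    public string Id { get; set; }
    public string TypeCode { get; set; }
}

// Additional members inside MyCacheObject: the key is derived from
// the entity itself, and the cast is handled once, generically.
public void SetEntity( EntityBase entity )
{
    SetEntity( entity, entity.Id, entity.TypeCode );
}

public T GetEntity<T>( string id, string code ) where T : EntityBase
{
    return GetEntity( id, code ) as T;
}

Callers then no longer need to pass the key parts separately or cast the result themselves.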