
Can I constrain a HashMap by the amount of memory it takes up?

I am implementing a simple cache using LinkedHashMap based on the instructions found here. I use the following code:

import java.util.LinkedHashMap;
import java.util.Map;

public class Cache<K, V> extends LinkedHashMap<K, V> {
  private final int capacity;

  public Cache(int capacity) {
    // Initial capacity and load factor chosen so the map never needs to rehash;
    // access order (the final "true") gives LRU eviction order.
    super(capacity + 1, 1.1f, true);
    this.capacity = capacity;
  }

  @Override
  protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
    // Evict the least-recently-used entry once the fixed capacity is exceeded.
    return size() > capacity;
  }
}

This is very easy. However, it only imposes a fixed number of entries on the map. I am running on a very small heap, and depending on the size of the cached objects and my chosen capacity this could still run out of memory. The objects are arbitrary, so I can't estimate how big they might be. I don't want to depend on SoftReferences to prune the cache, because the way those are cleaned up is unreliable: it varies from VM to VM, and they might either get reclaimed too soon or never get reclaimed until they fill up my heap.

Is there any way for me to monitor the size of the map and constrain based on that?


If soft/weak references are out of the question, then I see 2 (non-trivial) options:

1) Use Java instrumentation to check the actual size of the items added to the map. The instrumentation interface provides the "shallow" size of an object, and you will need more code to explore the references (and avoid counting duplicates!). Here is a solution that calculates the deep size of one object.

2) Use JMX to track the heap size after GCs, and change the map's behavior when a dangerous threshold is reached. See the "notifications" section in the MemoryMXBean javadoc.
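
A minimal sketch of option 2, assuming a hypothetical onLowMemory callback that evicts a batch of cache entries; the listener arms a post-GC usage threshold (80% here, an arbitrary choice) on every pool that supports one:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryNotificationInfo;
import java.lang.management.MemoryPoolMXBean;
import javax.management.Notification;
import javax.management.NotificationEmitter;
import javax.management.NotificationListener;

public class HeapWatcher {

  // Fire the callback whenever a GC finishes and a pool is still above
  // 80% of its maximum capacity.
  public static void watch(final Runnable onLowMemory) {
    for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
      if (pool.isCollectionUsageThresholdSupported()) {
        long max = pool.getUsage().getMax();
        if (max > 0) {
          pool.setCollectionUsageThreshold((long) (max * 0.8));
        }
      }
    }
    MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
    ((NotificationEmitter) memory).addNotificationListener(
        new NotificationListener() {
          public void handleNotification(Notification notification, Object handback) {
            if (MemoryNotificationInfo.MEMORY_COLLECTION_THRESHOLD_EXCEEDED
                .equals(notification.getType())) {
              onLowMemory.run();  // e.g. drop the oldest entries from the cache
            }
          }
        }, null, null);
  }
}

The cache itself then only needs a method that removes its eldest entries until the pressure is relieved.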


The map itself contains only fixed-size entries, which hold references to the actual objects "contained" in the map. You would need to override all map-mutating methods (e.g. put(), putAll(), the copy constructor, and so on) to keep track of the sizes of the objects referenced from the map (can you even determine how much memory a Java object takes up?). Then consider that the objects you add to the cache might themselves contain references to other objects and/or collections. How deep do you go?

Take a look at http://www.javapractices.com/topic/TopicAction.do?Id=83
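
On the "can you even determine how much memory a Java object takes up?" point: java.lang.instrument can report an object's shallow size when the JVM is started with a -javaagent. A minimal sketch (the class name and the Premain-Class manifest entry are whatever you choose):

import java.lang.instrument.Instrumentation;

// Packaged in a jar whose manifest contains "Premain-Class: ObjectSizer"
// and loaded with -javaagent:objectsizer.jar.
public final class ObjectSizer {
  private static volatile Instrumentation instrumentation;

  public static void premain(String agentArgs, Instrumentation inst) {
    instrumentation = inst;
  }

  // Shallow size only: referenced objects have to be walked separately,
  // taking care not to count shared objects twice.
  public static long shallowSizeOf(Object o) {
    if (instrumentation == null) {
      throw new IllegalStateException("Agent not loaded (-javaagent missing)");
    }
    return instrumentation.getObjectSize(o);
  }

  private ObjectSizer() {
  }
}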


As others have mentioned, you can use agent instrumentation to do this. The SizeOf project provides a handy utility for this approach. It can be combined with ConcurrentLinkedHashMap's concept of weighted values, where a Weigher determines how many units of capacity a value consumes. That enables caches to properly handle collection values or memory limits in addition to the traditional maximum-number-of-entries constraint.

If you wish to bound by the heap, then there is a fork of an earlier version of ConcurrentLinkedHashMap that does this. The fork retains the Apache license of the original, so even though it is packaged with Voldemort it could be adapted to your needs.

http://sizeof.sourceforge.net/

http://code.google.com/p/concurrentlinkedhashmap/

http://github.com/Omega1/voldemort/blob/master/src/java/voldemort/store/memory/ConcurrentLinkedHashMap.java
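
For a rough illustration of the weighted-values idea, a byte-weighted cache might be built along these lines with ConcurrentLinkedHashMap's Builder and Weigher (names taken from the project linked above; the API has moved around between releases, so check the version you actually depend on):

import java.util.concurrent.ConcurrentMap;

import com.googlecode.concurrentlinkedhashmap.ConcurrentLinkedHashMap;
import com.googlecode.concurrentlinkedhashmap.Weigher;

public class WeightedCacheExample {
  public static void main(String[] args) {
    // Bound the cache by total value bytes (1 MiB here) rather than entry count.
    ConcurrentMap<String, byte[]> cache =
        new ConcurrentLinkedHashMap.Builder<String, byte[]>()
            .maximumWeightedCapacity(1024 * 1024)
            .weigher(new Weigher<byte[]>() {
              public int weightOf(byte[] value) {
                return value.length;  // weight = approximate size in bytes
              }
            })
            .build();

    cache.put("a", new byte[512 * 1024]);
    cache.put("b", new byte[768 * 1024]);  // pushes the total past 1 MiB, evicting "a"
  }
}

Combined with an instrumentation-based size estimate instead of value.length, this gives an approximate memory bound rather than an entry-count bound.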


You could wrap a Map implementation and enforce the size in the put and putAll methods.
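
A minimal sketch of that wrapping approach, assuming a caller-supplied size estimator (for instance backed by the instrumentation agent sketched earlier); the byte budget and the LRU eviction policy here are illustrative choices, not part of any library:

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.ToLongFunction;

// LRU cache bounded by an approximate byte budget rather than an entry count.
public class MemoryBoundedCache<K, V> extends LinkedHashMap<K, V> {
  private final long maxBytes;
  private final ToLongFunction<V> sizeOf;  // caller-supplied size estimator
  private long usedBytes;

  public MemoryBoundedCache(long maxBytes, ToLongFunction<V> sizeOf) {
    super(16, 0.75f, true);  // access order, so iteration starts at the LRU entry
    this.maxBytes = maxBytes;
    this.sizeOf = sizeOf;
  }

  @Override
  public V put(K key, V value) {
    V previous = super.put(key, value);
    if (previous != null) {
      usedBytes -= sizeOf.applyAsLong(previous);
    }
    usedBytes += sizeOf.applyAsLong(value);
    evictUntilWithinBudget();
    return previous;
  }

  @Override
  public void putAll(Map<? extends K, ? extends V> m) {
    for (Map.Entry<? extends K, ? extends V> e : m.entrySet()) {
      put(e.getKey(), e.getValue());
    }
  }

  @Override
  public V remove(Object key) {
    V removed = super.remove(key);
    if (removed != null) {
      usedBytes -= sizeOf.applyAsLong(removed);
    }
    return removed;
  }

  // Drop least-recently-used entries until the estimate fits the budget.
  // A complete implementation would also account for clear(), compute(),
  // removals through views, and so on.
  private void evictUntilWithinBudget() {
    Iterator<Map.Entry<K, V>> it = entrySet().iterator();
    while (usedBytes > maxBytes && it.hasNext()) {
      Map.Entry<K, V> eldest = it.next();
      usedBytes -= sizeOf.applyAsLong(eldest.getValue());
      it.remove();
    }
  }
}

Used, for example, as new MemoryBoundedCache<String, byte[]>(8L * 1024 * 1024, v -> v.length).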

