How to free up memory?
We have been facing Out of Memory errors in our app server for some time. We see the used heap size increasing gradually until it finally reaches the available heap size. This happens every 3 weeks, after which a server restart is needed. Upon analysis of the heap dumps we find the problem to be objects used in JSPs.
Can JSP objects be the real cause of app server memory issues? How do we free up JSP objects (objects that are instantiated using jsp:useBean or other tags)?
We have a clustered WebSphere app server with 2 nodes and an IHS.
EDIT: The findings above are based on the heap dump and nativestderr log analysis given below, done using the IBM Support Assistant.
nativestderr log analysis:
![native stderr chart](http://saregos.com/wp-content/uploads/2010/03/chart.jpg)
Heap dump analysis:
(heap dump analysis screenshot)
Heap dump analysis showing the immediate dominators (2 levels up from the Hashtable entry in the image above):
(immediate dominators screenshot)
The last image shows that the immediate dominators are in fact objects being used in JSPs.
EDIT2: More info available at http://saregos.com/?p=43
I'd first attach a profiling tool to tell you what these "objects" are that are taking up all the memory.
Eclipse has TPTP, or there is JProfiler or JProbe.
Any of these should show the object heap creeping up and allow you to inspect it to see what is on the heap.
Then search the code base to find who is creating these.
Maybe you have a cache or tree/map object holding elements whose class only implements equals(), when it also needs to implement hashCode(). That would result in the map/cache/tree getting bigger and bigger until it falls over. This is only a guess, though.
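If that guess applies, here is a minimal sketch of the pitfall (the CacheKey class and all names are hypothetical, not from the original application):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical key class: equals() is overridden but hashCode() is not,
// so two logically equal keys almost never land in the same hash bucket.
class CacheKey {
    private final String id;

    CacheKey(String id) {
        this.id = id;
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof CacheKey && ((CacheKey) o).id.equals(this.id);
    }

    // Missing: @Override public int hashCode() { return id.hashCode(); }
}

public class LeakyCacheDemo {
    private static final Map<CacheKey, String> CACHE =
            new HashMap<CacheKey, String>();

    public static void main(String[] args) {
        // The "same" logical key is put repeatedly, but because hashCode()
        // falls back to Object's identity hash, every put adds a new entry
        // instead of replacing the previous value, so the map only grows.
        for (int i = 0; i < 100000; i++) {
            CACHE.put(new CacheKey("user-42"), "payload-" + i);
        }
        System.out.println("entries: " + CACHE.size()); // ~100000, not 1
    }
}
```

Adding the commented-out hashCode() makes the repeated puts collapse into a single entry.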
JProfiler would be my first call
JavaWorld has an example screenshot of what is in memory (source: javaworld.com), and a screenshot of the object heap building up and being cleaned up, hence the sawtooth shape (source: javaworld.com).
UPDATE:
Ok, I'd look at...
http://www-01.ibm.com/support/docview.wss?uid=swg1PK38940
Heap usage increases over time which leads to an OutOfMemory condition. Analysis of a heapdump shows that the following objects are taking up an increasing amount of space:
    40,543,128 [304]    47 class com/ibm/wsspi/rasdiag/DiagnosticConfigHome
    40,539,056 [56]      2 java/util/Hashtable 0xa8089170
    40,539,000 [2,064] 511 array of java/util/Hashtable$Entry
     6,300,888 [40]      3 java/util/Hashtable$HashtableCacheHashEntry
Triggering the garbage collection manually doesn't solve your problem - it won't free resources that are still in use.
You should use a profiling tool (like JProfiler) to find your leaks. You probably have code that stores references in lists or maps that are never released at runtime - probably static references.
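As a contrived sketch of the kind of static reference that can keep objects reachable forever (the class and method names here are made up):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical example of a static, ever-growing collection. Everything
// added here stays strongly reachable for the life of the classloader,
// so the garbage collector can never reclaim it.
public class AuditTrail {
    private static final List<String> ENTRIES = new ArrayList<String>();

    public static void record(String entry) {
        ENTRIES.add(entry); // grows on every call, never trimmed
    }
}
```

A fix is to bound such collections, evict old entries, or not hold them in memory at all.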
If you run under the Sun Java 6 JVM, strongly consider using the jvisualvm program in the JDK to get an initial overview of what actually goes on inside the program. The snapshot comparison is really good for helping you work out which objects sneak in.
If the Sun 6 JVM is not an option, investigate which profiling tools you have available. Trial versions can get you really far.
It can be something as simple as gigantic character arrays underlying substrings you are collecting in a list, e.g. for housekeeping.
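For context: on older JVMs (before Java 7u6), String.substring() shares the parent string's backing char[] array, so a tiny substring can pin a huge array in memory. A sketch of the classic workaround, with illustrative names:

```java
import java.util.ArrayList;
import java.util.List;

public class SubstringRetention {
    // Suppose each line read from a large file is several MB long and we
    // only keep a short prefix of it for housekeeping.
    private static final List<String> prefixes = new ArrayList<String>();

    static void keepPrefix(String hugeLine) {
        // On pre-7u6 JVMs this small-looking String would share (and pin)
        // the whole multi-megabyte char[] behind hugeLine:
        //   prefixes.add(hugeLine.substring(0, 16));

        // Copying the characters into a fresh String lets the large array
        // be collected once hugeLine itself is no longer referenced:
        prefixes.add(new String(hugeLine.substring(0, 16)));
    }
}
```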
I suggest reading Effective Java, chapter 2. Following it, together with a profiler, will help you identify the places where your application produces memory leaks.
Freeing up memory isn't the way to solve excessive memory consumption. Excessive memory consumption is usually the result of one of two things:
- improperly written code - the solution is to write it properly, so that it does not consume more than it needs; Effective Java will help here.
- the application simply needs this much memory. Then you should increase the VM heap using -Xmx, -Xms, -XX:MaxHeapSize, etc.
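For example, a standalone JVM might be started with the flags below (the values and main class are only illustrative; on WebSphere the equivalent heap settings are normally configured in the server's JVM settings rather than on a command line):

```
java -Xms512m -Xmx1024m com.example.Main
```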
There is no specific way to free up objects allocated in JSPs, at least as far as I know. Rather than investigating such options, I'd focus on finding the actual problem in your application code and fixing it.
Some hints that might help:
- Check the scope of your beans. Aren't you, for example, storing something user- or request-specific in "application" scope by mistake? (See the sketch after this list.)
- Check settings of web session timeout in your web application and appserver settings.
- You mentioned the heap consumption grows gradually. If that's indeed so, try to see by how much the heap size grows under various user scenarios: grab a heap dump, run a test, let the session data time out, grab another dump, and compare the two. That might give you some idea of where the objects on the heap come from.
- Check your beans for any obvious memory leaks, for sure :)
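To make the first hint concrete, here is a sketch of the bean-scope pitfall (the bean id and class are hypothetical):

```jsp
<%-- scope="application" means a single shared instance for the whole web app;
     anything this bean accumulates stays on the heap until the app restarts. --%>
<jsp:useBean id="searchResults" class="com.example.SearchResults" scope="application" />

<%-- If the data is really per visitor, declare it with "request" or "session"
     scope instead, so it can be released when the request ends or the session
     times out:

<jsp:useBean id="searchResults" class="com.example.SearchResults" scope="session" />
--%>
```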
EDIT: Checking for unreleased static resources that Daniel mentions is another worthwhile thing :)
As I understand it, the top-level memory eaters are a cache and the objects stored in it. You should probably make sure your cache frees objects when it takes up too much memory. You may want to use weak references if you only need the cache to hold objects that are still live elsewhere.
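A minimal sketch of that idea, assuming a simple map-backed cache (the WeakCache name is made up):

```java
import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;

// A cache that does not pin its entries forever: WeakHashMap drops an
// entry once its key is no longer strongly referenced anywhere else,
// so the cache cannot grow without bound the way a static HashMap can.
public class WeakCache<K, V> {
    private final Map<K, V> cache =
            Collections.synchronizedMap(new WeakHashMap<K, V>());

    public void put(K key, V value) {
        cache.put(key, value);
    }

    public V get(K key) {
        return cache.get(key);
    }
}
```

Note that this only helps when the keys themselves become unreachable; if you need eviction based on cache size or memory pressure, SoftReference values or a bounded LRU map are closer to what is being suggested here.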