
Java OutOfMemory problem - heap dump 800 MB smaller than max heap configured

I have a web application deployed in Oracle App Server 10.1.3, in an oc4j instance started with a 1 GB initial heap and a 2 GB max heap, on a 32-bit RHEL machine configured to see 32 GB of RAM. Lately I have been getting OutOfMemory errors, so I configured the app to create heap dumps on OutOfMemoryError. I now have 4-5 heap dumps, each no more than 1.2 GB in size (so 800 MB smaller than the max heap size). Also, running free on the machine at typical hours shows about 20 GB of free RAM.
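For reference, enabling heap dumps on OutOfMemoryError with the heap sizes above is typically done with standard HotSpot flags; a minimal sketch of the java options (the dump path is only an illustrative assumption, and for a managed oc4j instance these would normally go into the java-options in opmn.xml):

    # Standard HotSpot options; /tmp/oc4j-dumps is an assumed path.
    java -Xms1g -Xmx2g \
         -XX:+HeapDumpOnOutOfMemoryError \
         -XX:HeapDumpPath=/tmp/oc4j-dumps \
         ...   # rest of the oc4j start command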

Does this mean that the application tries to allocate 800 MB in one go? Or, if two or more threads try to allocate memory at the same time, do they both fail, even if there is memory for each one individually but not for the sum of both? Could there be a problem with the Linux machine, maybe it cannot give memory to Java? Could the memory be fragmented, maybe the configuration that allows the 32-bit machine to see 32 GB of RAM has a problem?

(I should mention that the application didn't change recently, but a new oc4j instance and a new application were deployed on that machine lately, and they eat 1-2 GB of RAM.)
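To illustrate the "in one go" scenario asked about above, here is a hypothetical, self-contained sketch (not taken from the application): a single large request must fit as one block within the remaining heap, so the OutOfMemoryError can fire, and the dump can be written, while the used heap is still well below -Xmx.

    // Hypothetical sketch; run with e.g. -Xmx2g -XX:+HeapDumpOnOutOfMemoryError.
    public class BigAllocation {
        public static void main(String[] args) {
            // Fill roughly 1.2 GB of the heap in 100 MB chunks.
            byte[][] chunks = new byte[12][];
            for (int i = 0; i < chunks.length; i++) {
                chunks[i] = new byte[100 * 1024 * 1024];
            }
            // The remaining ~848 MB of the 2 GB heap cannot satisfy a single
            // 900 MB request, so it fails "in one go" and the heap dump written
            // at that moment only contains the ~1.2 GB actually in use.
            byte[] big = new byte[900 * 1024 * 1024];
            System.out.println(chunks.length + " chunks, plus " + big.length + " bytes");
        }
    }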


On most 32-bit machines (including most flavors of Linux), the maximum memory your process can allocate is limited to roughly 2-3 GB of address space. If your heap is taking 1.2 GB, then in the worst case I would assume your perm gen is eating the remaining 800 MB. Try setting -XX:MaxPermSize=200M and check.
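A rough sketch of the address-space arithmetic behind this suggestion, and where the flag would go (any numbers beyond those given in the question are illustrative assumptions):

    # Approximate 32-bit process address-space budget (illustrative values):
    #   Java heap (-Xmx)              up to 2048 MB
    #   Perm gen (-XX:MaxPermSize)    e.g.   200 MB
    #   Thread stacks, JVM code, native libraries, malloc'd memory: the rest
    # The heap can only grow all the way to -Xmx if everything else still fits.
    java -Xms1g -Xmx2g -XX:MaxPermSize=200m ...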


I think your problem is that the 1-2 GB heap you allocate is for the whole App server. It consumes some memory by itself, I'm not sure how much, but if you start the App server with a 2 GB max heap you will definitely have less than 2 GB available for your webapp.
