Posted to user@geronimo.apache.org by boes <gj...@xs4all.nl> on 2010/10/01 17:59:08 UTC
Memory leak using Geronimo 2.1.4? Having 116760 instances of LifecycleEvent
I'm using Geronimo 2.1.4, and yesterday one of our production servers ran into
an OutOfMemoryError after running fine for at least a month. I had to kill the
Geronimo Java process because it no longer responded. For another customer I
have the same setup running for more than a month now. Today I made a heap dump
of that more-than-a-month-running Geronimo with jmap and analyzed the dump with
jhat. Here is the top of the 'Instance Counts for All Classes (excluding
platform)' list:
116760 instances of class org.apache.catalina.LifecycleEvent
116760 instances of class [Lorg.apache.catalina.Container;
8000 instances of class org.apache.geronimo.gbean.runtime.ReflectionMethodInvoker
6651 instances of class org.apache.geronimo.gbean.runtime.GBeanAttribute
6387 instances of class org.apache.geronimo.gbean.GAttributeInfo
5288 instances of class org.apache.geronimo.connector.outbound.connectiontracking.SharedConnectorInstanceContext
3634 instances of class org.apache.geronimo.kernel.repository.Artifact
3380 instances of class org.apache.geronimo.gbean.GOperationSignature
3233 instances of class org.apache.geronimo.gbean.runtime.GBeanOperation
3182 instances of class org.apache.geronimo.kernel.repository.Version
2726 instances of class org.apache.geronimo.gbean.GOperationInfo
2639 instances of class org.apache.tomcat.util.http.mapper.Mapper$Wrapper
2210 instances of class org.apache.geronimo.gbean.AbstractNameQuery
1719 instances of class org.apache.tomcat.util.buf.ByteChunk
1680 instances of class edu.emory.mathcs.backport.java.util.concurrent.CopyOnWriteArrayList$COWIterator
1670 instances of class org.apache.tomcat.util.buf.CharChunk
1611 instances of class org.apache.tomcat.util.buf.MessageBytes
1510 instances of class org.apache.log4j.CategoryKey
1217 instances of class org.apache.catalina.util.LifecycleSupport
1217 instances of class [Lorg.apache.catalina.LifecycleListener;
1200 instances of class [Lorg.apache.catalina.Session;
If I choose the jhat list 'Instance Counts for All Classes (including
platform)' the top shows:
195159 instances of class java.util.HashMap$Entry
170188 instances of class java.lang.String
135287 instances of class [C
116793 instances of class java.util.HashMap$ValueIterator
116760 instances of class org.apache.catalina.LifecycleEvent
116760 instances of class [Lorg.apache.catalina.Container;
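For readers who want to reproduce this, the dump-and-analyze workflow described above (jmap, then jhat) looks roughly like the following sketch; `<pid>` is a placeholder for the Geronimo Java process id, which can be found with jps:

```shell
# Write a binary heap dump of the running JVM (<pid> is a placeholder).
jmap -dump:format=b,file=geronimo-heap.hprof <pid>

# Serve the dump for browsing; the instance-count pages are then
# available at http://localhost:7000/
jhat -port 7000 geronimo-heap.hprof
```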
I need help. How can I debug this problem? Is it a Geronimo or a Tomcat issue?
What part of Geronimo or Tomcat generates all these LifecycleEvents? Could it
be the result of a programming error in our application? I looked through the
heap dump for classes from our own application, and the first one in the list
only has 208 instances.
Any ideas on what might be the cause of this huge number of instances in the
heap?
Thanks for the help.
--
View this message in context: http://apache-geronimo.328035.n3.nabble.com/Memory-leak-using-Geronimo-2-1-4-Having-116760-instances-of-LifecycleEvent-tp1615458p1615458.html
Sent from the Users mailing list archive at Nabble.com.
Re: Memory leak using Geronimo 2.1.4? Having 116760 instances of LifecycleEvent
Posted by boes <gj...@xs4all.nl>.
I used YourKit to find out that the large number of LifecycleEvent instances
is normal behaviour of Geronimo's embedded Tomcat. It shows that roughly every
2 seconds 558 such LifecycleEvents are generated, and once there are somewhere
around 200000 of them, the garbage collector removes them from the heap.
The out-of-memory problem had nothing to do with LifecycleEvents. I put some
heavy load on the server with Apache JMeter and was able to reproduce the
out-of-memory problem.
As suggested by Kevan, I used Eclipse MAT to analyze the heap. I found that a
few instances of org.apache.jasper.runtime.BodyContentImpl were responsible
for holding almost all the memory in the heap. This is a well-known Tomcat
issue: by default the buffer in this class can grow without limit. To cap the
buffer growth, a Java system property must be set:
org.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true
After that config change the JMeter test ran without any problem, so it
looks like the LIMIT_BUFFER setting did the trick.
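As a sketch of where such a setting typically goes (the exact variable, JAVA_OPTS vs. GERONIMO_OPTS, and the start script name depend on your installation, so treat these names as assumptions):

```shell
# Sketch: pass the Jasper buffer limit to the JVM at startup. Whether your
# setup reads JAVA_OPTS or GERONIMO_OPTS, and which start script applies,
# depends on the installation -- adjust accordingly.
export JAVA_OPTS="$JAVA_OPTS -Dorg.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true"
./bin/geronimo.sh run
```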
--
View this message in context: http://apache-geronimo.328035.n3.nabble.com/Memory-leak-using-Geronimo-2-1-4-Having-116760-instances-of-LifecycleEvent-tp1615458p1642725.html
Re: Memory leak using Geronimo 2.1.4? Having 116760 instances of LifecycleEvent
Posted by Kevan Miller <ke...@gmail.com>.
On Oct 1, 2010, at 11:59 AM, boes wrote:
> I need help. How can I debug this problem?
You've got a good start -- a heap dump that shows a number of LifecycleEvent and Container instances. They should definitely be investigated.
I've found YourKit to be very helpful in analyzing heap dumps. I've never used jhat, but have used Eclipse MAT, which should do the job, also...
Using the tool, you want to identify the GC roots of the LifecycleEvent and/or Container objects. By GC root, I mean the chain of references which is preventing the objects from being GC'ed.
> Is it a Geronimo or a Tomcat
> issue? What part of Geronimo or Tomcat generates all these LifecycleEvents?
> Can it be a result of some programming errors in our application? I looked
> in the heap dump for classes from our own application and the first in the
> list only has 208 instances.
I don't recall seeing a similar problem, so I can't predict where the problem is, yet. What are you doing with the server? Are you deploying/redeploying many applications? Are you stopping/starting modules?
LifecycleEvents get generated when Tomcat components change state (start, stop, etc.). I think I recall a timer-based lifecycle event, also.
Given the number of Container[] objects, I would say they must be Container events... If you can inspect the contents of the LifecycleEvents, there should be a type field which will tell you what type of events are being generated.
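To make that type field concrete, here is a minimal, self-contained mimic of how a Tomcat-style container fires lifecycle events to its listeners. These are not the real org.apache.catalina classes; the names follow the Tomcat API, but the implementation is illustrative only:

```java
import java.util.ArrayList;
import java.util.List;

public class LifecycleSketch {
    // Hypothetical stand-in for org.apache.catalina.LifecycleEvent:
    // each event carries a type string and the component that fired it.
    static class LifecycleEvent {
        final String type;   // e.g. "start", "stop", "periodic"
        final Object source;
        LifecycleEvent(Object source, String type) {
            this.source = source;
            this.type = type;
        }
    }

    // Stand-in for org.apache.catalina.LifecycleListener.
    interface LifecycleListener {
        void lifecycleEvent(LifecycleEvent event);
    }

    // Stand-in for org.apache.catalina.util.LifecycleSupport: creates one
    // short-lived event object per fire and notifies every listener.
    static class LifecycleSupport {
        private final List<LifecycleListener> listeners = new ArrayList<>();
        private final Object source;
        LifecycleSupport(Object source) { this.source = source; }
        void addLifecycleListener(LifecycleListener l) { listeners.add(l); }
        void fireLifecycleEvent(String type) {
            LifecycleEvent event = new LifecycleEvent(source, type);
            for (LifecycleListener l : listeners) {
                l.lifecycleEvent(event);
            }
        }
    }

    public static void main(String[] args) {
        LifecycleSupport support = new LifecycleSupport("container");
        support.addLifecycleListener(e -> System.out.println("type=" + e.type));
        // A periodic background check firing every few seconds would produce
        // a steady stream of short-lived events like these:
        support.fireLifecycleEvent("periodic");
        support.fireLifecycleEvent("start");
    }
}
```

Inspecting the `type` field of the accumulated LifecycleEvent instances in the heap dump should tell you which component and state change is producing them.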
--kevan