Posted to users@tomcat.apache.org by Ian Marsh <ia...@sportplan.com> on 2011/07/22 18:26:31 UTC

Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Hi,

I am in charge of running an Apache-2, Tomcat-7, Ubuntu-10.04 setup
for which we have to be PCI Compliant. We recently upgraded to
Apache-2.2.17 and Tomcat-7.0.8 (from Apache-2.0.x and Tomcat 5.0.28)
in order to comply with the requirements of the PCI Compliance checks,
and ironed out any issues to get us back to a satisfactory running
state.

We have now received a warning from our PCI Compliance monitoring
service that further updates are required to remain compliant, namely
upgrading to Apache-2.2.19 and Tomcat-7.0.14 (or higher).

When using Tomcat-7.0.8, we experience healthy memory cycling. The Old
Generation slowly increases until garbage collection runs and clears
out the Old Gen memory level, dropping considerably as expected.

So, on to the upgrading! Upgrading Apache was successful and without
problems; however, upgrading Tomcat has caused memory problems for
which, as yet, I cannot find similar reported cases or any clear
explanation of why it's happening.

I've searched the post archives and the web, looked through the Tomcat
changelog, and tried tools like jmap and the YourKit monitor to find
some answers, but, being quite inexperienced with these tools, I have
unfortunately made little progress so far.

I have tried running with...
- Tomcat-7.0.16 (being the latest available)
- Tomcat-7.0.14 (being the minimum required for PCI Compliance)
- Tomcat-7.0.10 (being the next available release after 7.0.8)

Tomcat-7.0.10 obviously has the fewest changes from Tomcat-7.0.8,
but all of these versions have shown a rather sharp rise in Old
Generation memory usage, reaching 90%+ (of the available 1.25GB) in
under an hour. Forcing garbage collection has little or no effect,
and eventually the Old Gen memory becomes full and the site stops
functioning.

There have been no changes to the JVM settings, so I doubt they are
the cause, but for your information they are as follows...

-Djava.awt.headless=true
-Xmx1536m
-Xms1536m
-XX:NewSize=256m
-XX:MaxNewSize=256M
-XX:SurvivorRatio=6
-XX:ReservedCodeCacheSize=64m
-XX:MaxPermSize=512m
-XX:PermSize=512m
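
For anyone reproducing this setup, one common way to apply flags like
these is via $CATALINA_BASE/bin/setenv.sh, which catalina.sh sources on
startup. This is only a sketch assuming a standard tarball install;
packaged installs often set options elsewhere (e.g. under /etc/default/).

```shell
# Sketch: $CATALINA_BASE/bin/setenv.sh for a standard tarball install.
# catalina.sh sources this file on startup; packaged installs may differ.
CATALINA_OPTS="$CATALINA_OPTS -Djava.awt.headless=true"
CATALINA_OPTS="$CATALINA_OPTS -Xmx1536m -Xms1536m"
CATALINA_OPTS="$CATALINA_OPTS -XX:NewSize=256m -XX:MaxNewSize=256m"
CATALINA_OPTS="$CATALINA_OPTS -XX:SurvivorRatio=6"
CATALINA_OPTS="$CATALINA_OPTS -XX:ReservedCodeCacheSize=64m"
CATALINA_OPTS="$CATALINA_OPTS -XX:MaxPermSize=512m -XX:PermSize=512m"
export CATALINA_OPTS
```

Note that with a 1536m heap and a fixed 256m young generation, the old
generation works out to roughly 1280m, matching the 1.25GB figure
mentioned above.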

I have used jmap to look into what objects are being held in memory
for the problematic versions of Tomcat. For Tomcat-7.0.10, the top 20
listed entries are...

 num     #instances         #bytes  class name
----------------------------------------------
  1:       8170940      471172552  [C
  2:       8501813      272058016  java.lang.String
  3:       5388341      215533640  javax.servlet.jsp.tagext.TagAttributeInfo
  4:        581051       35164552  [Ljava.lang.Object;
  5:        170658       34508384  <constMethodKlass>
  6:        528746       33839744  javax.servlet.jsp.tagext.TagInfo
  7:        528746       31094960  [Ljavax.servlet.jsp.tagext.TagAttributeInfo;
  8:         75231       25469312  [B
  9:        170658       23223888  <methodKlass>
 10:        395025       22121400  org.apache.jasper.compiler.Mark
 11:         11737       21889840  <constantPoolKlass>
 12:        281224       20248128  org.apache.jasper.compiler.Node$TemplateText
 13:        229740       19444304  [Ljava.util.HashMap$Entry;
 14:        594062       19009984  java.util.HashMap$Entry
 15:        220856       15973856  <symbolKlass>
 16:        110747       13909808  [Ljava.lang.String;
 17:         10561       13740840  <constantPoolCacheKlass>
 18:        400243       12807776  java.util.Stack
 19:        224826       10791648  java.util.HashMap
 20:         11737       10042552  <instanceKlassKlass>

When running under Tomcat-7.0.8 for a similar time of about an hour,
the top 20 entries are...

 num     #instances         #bytes  class name
----------------------------------------------
  1:        592439       86974504  [C
  2:         47888       73416688  [I
  3:        189957       39457512  <constMethodKlass>
  4:         88307       27806528  [B
  5:         13444       26059552  <constantPoolKlass>
  6:        189957       25848632  <methodKlass>
  7:        619859       19835488  java.lang.String
  8:        242174       19462792  <symbolKlass>
  9:         12260       16256800  <constantPoolCacheKlass>
 10:         13444       11624240  <instanceKlassKlass>
 11:        236662        9466480  java.math.BigDecimal
 12:        184843        7393720  javax.servlet.jsp.tagext.TagAttributeInfo
 13:         83238        5795576  [Ljava.lang.Object;
 14:         59168        5611728  [Ljava.util.HashMap$Entry;
 15:        132465        4238880  java.util.HashMap$Entry
 16:          5287        3630456  <methodDataKlass>
 17:         54218        2602464  java.util.HashMap
 18:         26369        2320472  java.lang.reflect.Method
 19:         39714        1588560  java.util.TreeMap$Entry
 20:         14143        1470872  java.lang.Class

It seems that the counts for the character arrays ([C),
java.lang.String and javax.servlet.jsp.tagext.TagAttributeInfo are
considerably higher in Tomcat-7.0.10 than in Tomcat-7.0.8, and I am
wondering if this could help explain the difference.
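
To put numbers on that comparison, a short script can diff two
`jmap -histo` snapshots by class. This is a sketch, not part of the
original thread; it assumes the standard four-column histogram layout
shown above.

```python
def parse_histo(text):
    """Parse `jmap -histo` output into {class_name: (instances, bytes)}."""
    counts = {}
    for line in text.splitlines():
        parts = line.split()
        # Data rows look like: "  3:   5388341  215533640  javax.servlet..."
        # Header and separator lines don't match this shape and are skipped.
        if len(parts) == 4 and parts[0].rstrip(':').isdigit():
            counts[parts[3]] = (int(parts[1]), int(parts[2]))
    return counts

def diff_histo(before, after, top=10):
    """Return the classes with the largest byte growth between snapshots."""
    a, b = parse_histo(before), parse_histo(after)
    growth = {name: nbytes - a.get(name, (0, 0))[1]
              for name, (_, nbytes) in b.items()}
    return sorted(growth.items(), key=lambda kv: -kv[1])[:top]
```

Feeding it the two histograms above would rank [C, java.lang.String
and TagAttributeInfo at the top by byte growth, at least among the
classes listed, confirming the eyeballed difference.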

Would anyone know of any changes between the two versions, possibly
linked to those memory entries, that could lead to such behaviour?

Any help or suggestions are greatly appreciated! Sorry for the long
post, but hopefully it's got the information needed to help with diagnosis.

Thanks in advance,

Ian

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@tomcat.apache.org
For additional commands, e-mail: users-help@tomcat.apache.org


Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Mark Thomas <ma...@apache.org>.
On 22/07/2011 20:17, Mark Thomas wrote:
> On 22/07/2011 17:26, Ian Marsh wrote:

>> It seems that the character arrays [C, java.lang.String and
>> javax.servlet.jsp.tagext.TagAttributeInfo entries are considerably
>> higher in Tomcat-7.0.10 than in Tomcat-7.0.8 and I am wondering if
>> this could lead to an explanation for the difference.
> 
> Maybe. What you really want to look at is the GC roots for those
> objects. That will tell you what is holding on to the references. Based
> on that data I'd start looking at the arrays of TagAttributeInfo but
> that might be completely the wrong place to look.
> 
> I've just triggered a heap dump on the ASF Jira instance (running
> 7.0.19) to see what that looks like. I'll report back what I find (once
> the 4GB heap has finished downloading - it may be some time).

The Jira heap dump looks pretty much as I'd expect it to look so
whatever is going wrong is probably related to the application.

In YourKit, view the class list and order it by retained size. That
should point you in the right direction. Filtering the classes for just
those that start with "org.apache" or "org.apache.jasper" may shed some
light, as will looking at what objects are being retained by each class.

>> Would anyone know of any changes between the two versions, possibly
>> linked to those memory entries, that could lead to such behaviour?
> 
> Nothing jumped out at me from the changelog.

Looking again at the list of classes, my guess is something related to
JSPs (due to the presence of javax.servlet.jsp.tagext.TagAttributeInfo
and org.apache.jasper.compiler.Node$TemplateText). Is it possible you
have development mode enabled for 7.0.10 but disabled for 7.0.8?

This is getting to the point where I'd really need access to the heap
dump to provide any more specific information.

Mark





Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Ian Marsh <ia...@sportplan.com>.
Just for completeness, for anyone reading this...

Setting development mode to false has vastly improved the performance, as is
no doubt already known by most people! My Old Generation memory now cycles
nicely, clearing out by 75% each time.

I also added the "checkInterval" parameter (set to anything greater than
zero) to enable background compiling of JSPs; otherwise, with development
mode set to false, changes to JSP files are not reflected on the website.
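
Putting both settings together, the "jsp" servlet definition in
$CATALINA_BASE/conf/web.xml ends up looking roughly like this (a sketch
based on the default Tomcat 7 web.xml; the two init-params are the
additions, and the 60-second checkInterval is just an example value):

```xml
<servlet>
    <servlet-name>jsp</servlet-name>
    <servlet-class>org.apache.jasper.servlet.JspServlet</servlet-class>
    <!-- Stop Jasper retaining pageNodes for improved error messages -->
    <init-param>
        <param-name>development</param-name>
        <param-value>false</param-value>
    </init-param>
    <!-- With development off, recompile changed JSPs in the background;
         the value is the check interval in seconds (must be > 0) -->
    <init-param>
        <param-name>checkInterval</param-name>
        <param-value>60</param-value>
    </init-param>
    <load-on-startup>3</load-on-startup>
</servlet>
```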

Thanks again for the help Mark... and guess what... we have since been asked
to upgrade to Tomcat-7.0.19 by our PCI auditors! Useful!

Ian


On 29 July 2011 13:09, Ian Marsh <ia...@sportplan.com> wrote:

> Bugger thanks.
>
> I looked at this but, when I did, I simply compared the two web.xml
> files between Tomcat-7.0.8 and Tomcat-7.0.10 to see if a specific
> setting for development mode was used differently, but the two files
> were exactly the same, with no development mode setting mentioned.
> Which means they are both running in development mode, but with
> different behaviours, as I have only just seen that the default
> setting for it is "true".
>
> I will try running Tomcat-7.0.10 with development mode set as false to
> see if this fixes it.
>
> Thanks again, sorry for such an oversight.
>
>
> On 29 July 2011 12:27, Mark Thomas <ma...@apache.org> wrote:
> > On 29/07/2011 09:53, Ian Marsh wrote:
> >> Ok thanks... so here's the trace of the 3 biggest
> >> org.apache.jasper.servlet.JspServletWrapper objects.
> >>
> >> I'm just showing the path of the objects that retain the biggest sizes
> >> at each nested level to save from overkill on detail. There are
> >> references to parent objects at some levels which show a larger
> >> retained size but I expect that's normal. If it would be useful to see
> >> all nested objects at each level, no problem, but here's a first
> >> look...
> >>
> >> 1)
> >> + org.apache.jasper.servlet.JspServletWrapper (1,608,752)
> >> =+ ctxt  org.apache.jasper.JspCompilationContext (1,602,384)
> >> ==+ jspCompiler  org.apache.jasper.compiler.JDTCompiler (1,601,032)
> >> ===+ pageNodes  org.apache.jasper.compiler.Node$Nodes (1,600,952)
> >
> > Which takes us back to my comment of almost a week ago. You have
> > development mode enabled which retains the pageNodes for improved error
> > messages. Disable development mode and the memory usage should fall.
> >
> > I don't recall any changes to this code between 7.0.8 and 7.0.10 but to
> > be sure, I have double checked that disabling development mode does
> > indeed have the desired effect.
> >
> > To fix this:
> > - edit $CATALINA_BASE/conf/web.xml
> > - find the definition for the servlet named "jsp"
> > - add the following init parameter
> >        <init-param>
> >            <param-name>development</param-name>
> >            <param-value>false</param-value>
> >        </init-param>
> >
> > Mark
> >
> >
> >
>

Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Ian Marsh <ia...@sportplan.com>.
Bugger thanks.

I looked at this but, when I did, I simply compared the two web.xml
files between Tomcat-7.0.8 and Tomcat-7.0.10 to see if a specific
setting for development mode was used differently; the two files
were exactly the same, with no development mode setting mentioned.
I have only just seen that the default setting for it is "true",
which means they are both running in development mode, but with
different behaviours.

I will try running Tomcat-7.0.10 with development mode set as false to
see if this fixes it.

Thanks again, sorry for such an oversight.


On 29 July 2011 12:27, Mark Thomas <ma...@apache.org> wrote:
> On 29/07/2011 09:53, Ian Marsh wrote:
>> Ok thanks... so here's the trace of the 3 biggest
>> org.apache.jasper.servlet.JspServletWrapper objects.
>>
>> I'm just showing the path of the objects that retain the biggest sizes
>> at each nested level to save from overkill on detail. There are
>> references to parent objects at some levels which show a larger
>> retained size but I expect that's normal. If it would be useful to see
>> all nested objects at each level, no problem, but here's a first
>> look...
>>
>> 1)
>> + org.apache.jasper.servlet.JspServletWrapper (1,608,752)
>> =+ ctxt  org.apache.jasper.JspCompilationContext (1,602,384)
>> ==+ jspCompiler  org.apache.jasper.compiler.JDTCompiler (1,601,032)
>> ===+ pageNodes  org.apache.jasper.compiler.Node$Nodes (1,600,952)
>
> Which takes us back to my comment of almost a week ago. You have
> development mode enabled which retains the pageNodes for improved error
> messages. Disable development mode and the memory usage should fall.
>
> I don't recall any changes to this code between 7.0.8 and 7.0.10 but to
> be sure, I have double checked that disabling development mode does
> indeed have the desired effect.
>
> To fix this:
> - edit $CATALINA_BASE/conf/web.xml
> - find the definition for the servlet named "jsp"
> - add the following init parameter
>        <init-param>
>            <param-name>development</param-name>
>            <param-value>false</param-value>
>        </init-param>
>
> Mark
>
>
>



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Mark Thomas <ma...@apache.org>.
On 29/07/2011 09:53, Ian Marsh wrote:
> Ok thanks... so here's the trace of the 3 biggest
> org.apache.jasper.servlet.JspServletWrapper objects.
> 
> I'm just showing the path of the objects that retain the biggest sizes
> at each nested level to save from overkill on detail. There are
> references to parent objects at some levels which show a larger
> retained size but I expect that's normal. If it would be useful to see
> all nested objects at each level, no problem, but here's a first
> look...
> 
> 1)
> + org.apache.jasper.servlet.JspServletWrapper (1,608,752)
> =+ ctxt  org.apache.jasper.JspCompilationContext (1,602,384)
> ==+ jspCompiler  org.apache.jasper.compiler.JDTCompiler (1,601,032)
> ===+ pageNodes  org.apache.jasper.compiler.Node$Nodes (1,600,952)

Which takes us back to my comment of almost a week ago. You have
development mode enabled which retains the pageNodes for improved error
messages. Disable development mode and the memory usage should fall.

I don't recall any changes to this code between 7.0.8 and 7.0.10 but to
be sure, I have double checked that disabling development mode does
indeed have the desired effect.

To fix this:
- edit $CATALINA_BASE/conf/web.xml
- find the definition for the servlet named "jsp"
- add the following init parameter
        <init-param>
            <param-name>development</param-name>
            <param-value>false</param-value>
        </init-param>

Mark



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Ian Marsh <ia...@sportplan.com>.
Ok thanks... so here's the trace of the 3 biggest
org.apache.jasper.servlet.JspServletWrapper objects.

I'm just showing, at each nested level, the path of the objects that
retain the biggest sizes, to avoid going overboard on detail. There are
references to parent objects at some levels which show a larger
retained size, but I expect that's normal. If it would be useful to see
all nested objects at each level, no problem, but here's a first
look...

1)
+ org.apache.jasper.servlet.JspServletWrapper (1,608,752)
=+ ctxt  org.apache.jasper.JspCompilationContext (1,602,384)
==+ jspCompiler  org.apache.jasper.compiler.JDTCompiler (1,601,032)
===+ pageNodes  org.apache.jasper.compiler.Node$Nodes (1,600,952)
====+ root  org.apache.jasper.compiler.Node$Root (1,600,840)
=====+ body  org.apache.jasper.compiler.Node$Nodes (1,501,368)
======+ list  java.util.Vector (1,501,344)
=======+ elementData  java.lang.Object[160] (1,501,312)
========+ [0]  org.apache.jasper.compiler.Node$IncludeDirective (32,464)
=========+ body  org.apache.jasper.compiler.Node$Nodes (31,776)
==========+ list  java.util.Vector (31,752)
===========+ elementData  java.lang.Object[10] (31,720)
============+ [0]  org.apache.jasper.compiler.Node$Root (31,664)
=============+ body  org.apache.jasper.compiler.Node$Nodes (27,432)
==============+ list  java.util.Vector (27,408)
===============+ elementData  java.lang.Object[80] (27,376)
================+ [0]  org.apache.jasper.compiler.Node$PageDirective (1,480)
=================+ attrs  org.apache.jasper.util.UniqueAttributesImpl (1,176)
==================+ <class>  org.apache.jasper.util.UniqueAttributesImpl (936)
==================+ data  java.lang.String[25] (464)
==================+ qNames  java.util.HashSet (304)
==================+ length = int 5  0x00000005
==================+ pageDirective = boolean true

2)
+ org.apache.jasper.servlet.JspServletWrapper (1,534,320)
=+ ctxt  org.apache.jasper.JspCompilationContext (1,406,208)
==+ jspCompiler  org.apache.jasper.compiler.JDTCompiler (1,404,944)
===+ pageNodes  org.apache.jasper.compiler.Node$Nodes (1,404,864)
====+ root  org.apache.jasper.compiler.Node$Root (1,404,752)
=====+ body  org.apache.jasper.compiler.Node$Nodes (1,303,480)
======+ list  java.util.Vector (1,303,456)
=======+ elementData  java.lang.Object[320] (1,303,424)
========+ [0]  org.apache.jasper.compiler.Node$IncludeDirective (32,464)
=========+ body  org.apache.jasper.compiler.Node$Nodes (31,776)
==========+ list  java.util.Vector (31,752)
===========+ elementData  java.lang.Object[10] (31,720)
============+ [0]  org.apache.jasper.compiler.Node$Root (31,664)
=============+ body  org.apache.jasper.compiler.Node$Nodes (27,432)
==============+ list  java.util.Vector (27,408)
===============+ elementData  java.lang.Object[80] (27,376)
================+ [0]  org.apache.jasper.compiler.Node$PageDirective (1,480)
=================+ attrs  org.apache.jasper.util.UniqueAttributesImpl (1,176)
==================+ <class>  org.apache.jasper.util.UniqueAttributesImpl (936)
==================+ data  java.lang.String[25] (464)
==================+ qNames  java.util.HashSet (304)
==================+ length = int 5  0x00000005
==================+ pageDirective = boolean true

3)
+ org.apache.jasper.servlet.JspServletWrapper (1,454,520)
=+ ctxt  org.apache.jasper.JspCompilationContext (1,381,656)
==+ jspCompiler  org.apache.jasper.compiler.JDTCompiler (1,380,136)
===+ pageNodes  org.apache.jasper.compiler.Node$Nodes (1,380,056)
====+ root  org.apache.jasper.compiler.Node$Root (1,379,944)
=====+ body  org.apache.jasper.compiler.Node$Nodes (1,220,552)
======+ list  java.util.Vector (1,220,528)
=======+ elementData  java.lang.Object[2560] (1,220,496)
========+ [2]  org.apache.jasper.compiler.Node$IncludeDirective (22,744)
=========+ body  org.apache.jasper.compiler.Node$Nodes (22,040)
==========+ list  java.util.Vector (22,016)
===========+ elementData  java.lang.Object[10] (21,984)
============+ [0]  org.apache.jasper.compiler.Node$Root (21,928)
=============+ body  org.apache.jasper.compiler.Node$Nodes (18,680)
==============+ list  java.util.Vector (18,656)
===============+ elementData  java.lang.Object[40] (18,624)
================+ [1]  org.apache.jasper.compiler.Node$TemplateText (808)
=================+ parent  org.apache.jasper.compiler.Node$Root (21,928)
=================+ <class>  org.apache.jasper.compiler.Node$TemplateText (856)
=================+ extraSmap  java.util.ArrayList (496)
=================+ startMark  org.apache.jasper.compiler.Mark (144)
=================+ text  java.lang.String "     " (96)
=================+ beginJavaLine = int 103  0x00000067
=================+ endJavaLine = int 125  0x0000007D
=================+ isDummy = boolean false

Hopefully these are good examples of what's being held. It is a very
similar story for the other objects, with varying sizes and array
lengths at the relevant levels etc.

There are 3,772 org.apache.jasper.servlet.JspServletWrapper objects in
total, the top 6 retain...
1) 1,608,752 bytes
2) 1,534,320 bytes
3) 1,454,520 bytes
4) 1,236,048 bytes
5) 1,176,048 bytes
6) 1,167,592 bytes

...the rest fall below 1,000,000 bytes and gradually drop in retained
size until YourKit says "and more..." after, I would guess, the top
500 objects.

One thing that has been apparent in the objects I have looked at is
that there is eventually a mention of either a <%@ page ... %> or <%@
taglib ... %> directive. We have a common file, included on
many other JSP pages, that looks as follows...

<%@ page language="java" errorPage="/error.jsp" pageEncoding="UTF-8"
trimDirectiveWhitespaces="true"  contentType="text/html;charset=utf-8"
%>
<%@ taglib uri="/WEB-INF/struts-html.tld" prefix="html" %>
<%@ taglib uri="/WEB-INF/struts-html-el.tld" prefix="html-el" %>
<%@ taglib uri="/WEB-INF/struts-bean.tld" prefix="bean" %>
<%@ taglib uri="/WEB-INF/struts-bean-el.tld" prefix="bean-el" %>
<%@ taglib uri="/WEB-INF/struts-logic.tld" prefix="logic" %>
<%@ taglib uri="/WEB-INF/struts-logic-el.tld" prefix="logic-el" %>
<%@ taglib uri="/WEB-INF/taglibs-random.tld" prefix="rand" %>
<%@ taglib uri="/WEB-INF/c.tld" prefix="c" %>
...

Could including this file in multiple places cause an issue?

Ian




On 28 July 2011 17:46, Mark Thomas <ma...@apache.org> wrote:
> On 28/07/2011 12:29, Ian Marsh wrote:
>> Right, I have taken a memory snapshot using YourKit of the system
>> running Tomcat-7.0.10 after about 1 hour, when the Old Gen memory was
>> beginning to reach its maximum.
>
> OK. I think a little more digging is required but this might be heading
> somewhere useful.
>
>> So I looked into the "Selected Objects" of the
>> java.util.concurrent.ConcurrentHashMap$Segment[16]. This contains...
>>
>> + java.util.concurrent.ConcurrentHashMap$Segment[16] (732,561,032)
>> =+ [14]  java.util.concurrent.ConcurrentHashMap$Segment (55,193,864)
>> ==+ table  java.util.concurrent.ConcurrentHashMap$HashEntry[512] (55,193,792)
>> ===+ [16]  java.util.concurrent.ConcurrentHashMap$HashEntry (496,712)
>> ====+ value  org.apache.jasper.servlet.JspServletWrapper (496,448)
>
> Filter for objects of org.apache.jasper.servlet.JspServletWrapper and
> trace where the bulk of the retained size is for the few largest. That
> should move things forward a little.
>
> The level of detail in here is about right. Enough to see where to go
> next, not so much it takes ages to analyse.
>
> Mark
>
>
>



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Mark Thomas <ma...@apache.org>.
On 28/07/2011 12:29, Ian Marsh wrote:
> Right, I have taken a memory snapshot using YourKit of the system
> running Tomcat-7.0.10 after about 1 hour, when the Old Gen memory was
> beginning to reach its maximum.

OK. I think a little more digging is required but this might be heading
somewhere useful.

> So I looked into the "Selected Objects" of the
> java.util.concurrent.ConcurrentHashMap$Segment[16]. This contains...
> 
> + java.util.concurrent.ConcurrentHashMap$Segment[16] (732,561,032)
> =+ [14]  java.util.concurrent.ConcurrentHashMap$Segment (55,193,864)
> ==+ table  java.util.concurrent.ConcurrentHashMap$HashEntry[512] (55,193,792)
> ===+ [16]  java.util.concurrent.ConcurrentHashMap$HashEntry (496,712)
> ====+ value  org.apache.jasper.servlet.JspServletWrapper (496,448)

Filter for objects of org.apache.jasper.servlet.JspServletWrapper and
trace where the bulk of the retained size is for the few largest. That
should move things forward a little.

The level of detail in here is about right. Enough to see where to go
next, not so much it takes ages to analyse.

Mark



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Ian Marsh <ia...@sportplan.com>.
Right, I have taken a memory snapshot using YourKit of the system
running Tomcat-7.0.10 after about 1 hour, when the Old Gen memory was
beginning to reach its maximum.

I am not completely sure what information is useful for you to know,
as I have not used YourKit before; I am working from the demos and
support docs that YourKit provides, so hopefully these are the right
things to be looking at. If anything is missing or this isn't helpful
information, I apologise in advance, as I don't mean to waste anyone's
time.

So... ordering the "Class List" by retained size, in bytes, gives the
top entries as...

java.util.concurrent.ConcurrentHashMap$Segment (retained size: 755,981,384)
java.util.concurrent.ConcurrentHashMap$HashEntry (retained size: 735,577,824)
java.util.concurrent.ConcurrentHashMap$Segment[] (retained size: 753,281,704)
...

The top entries when restricted to match "org.apache" are...
org.apache.jasper.servlet.JspServlet (retained size: 732,632,608)
org.apache.jasper.compiler.JspRuntimeContext (retained size: 732,613,072)
org.apache.jasper.servlet.JspServletWrapper (retained size: 714,078,800)
...

Looking at the "Biggest objects (dominators)" section shows a top entry of...
org.apache.jasper.servlet.JspServlet (retained size: 732,615,184)

Which is considerably bigger than the next in the list, at 12,235,872 bytes.

Expanding this top entry in "Biggest objects (dominators)" shows...
+ org.apache.jasper.servlet.JspServlet (732,615,184)
=+ org.apache.jasper.compiler.JspRuntimeContext (732,596,248)
==+ java.util.concurrent.ConcurrentHashMap (732,561,080)
===+ java.util.concurrent.ConcurrentHashMap$Segment[16] (732,561,032)
====+ java.util.concurrent.ConcurrentHashMap$Segment (55,193,864)
====+ java.util.concurrent.ConcurrentHashMap$Segment (49,947,008)
====+ .... (the rest of the 16 map entries, of similar size)

So I looked into the "Selected Objects" of the
java.util.concurrent.ConcurrentHashMap$Segment[16]. This contains...

+ java.util.concurrent.ConcurrentHashMap$Segment[16] (732,561,032)
=+ [14]  java.util.concurrent.ConcurrentHashMap$Segment (55,193,864)
==+ table  java.util.concurrent.ConcurrentHashMap$HashEntry[512] (55,193,792)
===+ [16]  java.util.concurrent.ConcurrentHashMap$HashEntry (496,712)
====+ value  org.apache.jasper.servlet.JspServletWrapper (496,448)
====+ key  java.lang.String "/directory-name/file-name.jsp" (32)
====+ hash = int -457695728  0xE4B81E10
===+ [1]  java.util.concurrent.ConcurrentHashMap$HashEntry (96,912)
===+ [13]  java.util.concurrent.ConcurrentHashMap$HashEntry (27,488)
===+ ... (the rest of the 512 map entries)
==+ sync  java.util.concurrent.locks.ReentrantLock$NonfairSync (32)
==+ count = int 248  0x000000F8
==+ loadFactor = float 0.75
==+ modCount = int 248  0x000000F8
==+ threshold = int 384  0x00000180
=+ [9]  java.util.concurrent.ConcurrentHashMap$Segment (49,947,008)
=+ [13]  java.util.concurrent.ConcurrentHashMap$Segment (49,717,568)
=+ ... (the rest of the 16 map entries)

I don't want this to become any more unreadable than it might already
be, so maybe this is a reasonable point to stop and see whether I'm
even going down the right route.

From what I can make out, there is a large HashMap (~700MB)
holding references to JspServletWrappers, indexed by the JSP page URI,
on which garbage collection has little impact. I have looked further
into the JspServletWrapper; would that information be useful to
anyone? It did not reveal much to my limited knowledge! Or is this
enough to show that it very much looks like something within our
JSPs is causing the issues, rather than a difference between the
versions of Tomcat?

Ian




On 26 July 2011 10:52, Mark Thomas <ma...@apache.org> wrote:
> On 26/07/2011 10:43, Ian Marsh wrote:
>> Unfortunately the conf changes to fork the compilation of JSPs and the
>> increased 'modificationTestInterval' value made no real improvement so
>> I am continuing to work towards replacing the language text request
>> scope variables with property files references.
>>
>> However, if this is a problem, this has made me concerned that all
>> request scope variables aren't being released, not just the ones we
>> use for language text!
>
> Again, use a profiler and follow the gc roots. Until you provide some
> evidence that Tomcat is doing something wrong, the assumption is going
> to be that the problem is with the application.
>
> Mark
>
>
>
>
>



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Mark Thomas <ma...@apache.org>.
On 26/07/2011 10:43, Ian Marsh wrote:
> Unfortunately the conf changes to fork the compilation of JSPs and the
> increased 'modificationTestInterval' value made no real improvement so
> I am continuing to work towards replacing the language text request
> scope variables with property files references.
> 
> However, if this is a problem, this has made me concerned that all
> request scope variables aren't being released, not just the ones we
> use for language text!

Again, use a profiler and follow the gc roots. Until you provide some
evidence that Tomcat is doing something wrong, the assumption is going
to be that the problem is with the application.

Mark





Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Ian Marsh <ia...@sportplan.com>.
Unfortunately the conf changes to fork the compilation of JSPs and the
increased 'modificationTestInterval' value made no real improvement,
so I am continuing to work on replacing the language-text
request-scope variables with properties file references.

However, if this is the problem, it has made me concerned that
request-scope variables in general aren't being released, not just the
ones we use for language text!

Ian


On 25 July 2011 10:45, Ian Marsh <ia...@sportplan.com> wrote:
> Good morning and thanks for the replies!
>
> Unfortunately, as far as I am aware, our bank determines who we must
> use as the PCI Auditor so our hands are tied with that one, but it
> does seem like they're just covering their backs by choosing the
> latest available version rather than actually determining a reason as
> to why it's needed.
>
> However, thanks for looking into my problem... the first thing that
> rang a bell was the mention that it could be to do with the JSPs. I
> have checked for development mode settings and neither version has it
> enabled. The bell it rang though was that, for some slightly silly
> reasons that I am planning on finally clearing up today to see if they
> are the cause, we have, in certain pages of the site, roughly 1000
> <c:set> tags of text for different languages set in request scope. I
> know these should be implemented through properties files instead but
> the excuses are for me to deal with, not you guys!
>
> Could this be linked to the cause though? Could it somehow be that the
> text variables are being held or referenced for longer than the
> request scope period and so quickly eating up memory?
>
> I also remembered some changes we made to the tomcat settings in
> web.xml to fork the compilation of JSPs to a separate JVM and we had
> set the 'modificationTestInterval' to 40 seconds. These may have had
> an impact as well so I'll change these in the later versions of Tomcat
> to see if they have any effect.
>
> Thanks again,
>
> Ian
>
>
>
>
> On 22 July 2011 22:42, Pid <pi...@pidster.com> wrote:
>> On 22/07/2011 20:17, Mark Thomas wrote:
>>> On 22/07/2011 17:26, Ian Marsh wrote:
>>>> Hi,
>>>>
>>>> I am in charge of running an Apache-2, Tomcat-7, Ubuntu-10.04 set up
>>>> for which we have to be PCI Compliant. We recently upgraded to
>>>> Apache-2.2.17 and Tomcat-7.0.8 (from Apache-2.0.x and Tomcat 5.0.28)
>>>> in order to comply with the requirements of the PCI Compliance checks
>>>> and ironed out any issues to get us back to a satisfactory running
>>>> state.
>>>
>>> Hmm. I think you need some better PCI auditors. If your app was running
>>> on Tomcat 5.0.x and you trust the app (which seems reasonable given it
>>> is doing something that requires PCI compliance) then an upgrade to
>>> 7.0.12 should be sufficient if you are using the HTTP BIO connector.
>>
>> Indeed.
>>
>> In my experience, I'd expect a QSA/PCI Auditor to be far, far more
>> conservative than to promote Tomcat 7.0.x as a 'safe' version compared
>> to 6.0.recent.
>>
>>
>> p
>>
>>
>>> Since Tomcat appears to be behind httpd, there is a strong chance you
>>> are using AJP (BIO or APR), in which case 7.0.2 should be sufficient.
>>>
>>> It appears your current auditors are blindly (and wrongly) assuming any
>>> vulnerability in Tomcat will impact your installation. Expect a demand
>>> to upgrade to 7.0.19 when they get around to reading the Tomcat security
>>> pages again.
>>>
>>> <snip/>
>>>
>>>> It seems that the character arrays [C, java.lang.String and
>>>> javax.servlet.jsp.tagext.TagAttributeInfo entries are considerably
>>>> higher in Tomcat-7.0.10 than in Tomcat-7.0.8 and I am wondering if
>>>> this could lead to an explanation for the difference.
>>>
>>> Maybe. What you really want to look at is the GC roots for those
>>> objects. That will tell you what is holding on to the references. Based
>>> on that data I'd start looking at the arrays of TagAttributeInfo but
>>> that might be completely the wrong place to look.
>>>
>>> I've just triggered a heap dump on the ASF Jira instance (running
>>> 7.0.19) to see what that looks like. I'll report back what I find (once
>>> the 4GB heap has finished downloading - it may be some time).
>>>
>>>> Would anyone know of any changes between the two versions, possibly
>>>> linked to those memory entries, that could lead to such behaviour?
>>>
>>> Nothing jumped out at me from the changelog.
>>>
>>>> Any help or suggestions are greatly appreciated! I'm sorry for a long
>>>> post, but hopefully it's got the information needed to help with diagnosis.
>>>
>>> To be honest, there isn't enough info here to diagnose the root cause
>>> but there is enough to demonstrate that there is probably a problem and
>>> maybe where to start looking. That might not seem like much but it is a
>>> heck of a lot better than most of the reports we get here. Thanks for
>>> providing such a useful problem report.
>>>
>>> Mark
>>>
>>>
>>>
>



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Ian Marsh <ia...@sportplan.com>.
Good morning and thanks for the replies!

Unfortunately, as far as I am aware, our bank determines who we must
use as the PCI Auditor so our hands are tied with that one, but it
does seem like they're just covering their backs by choosing the
latest available version rather than actually determining a reason as
to why it's needed.

However, thanks for looking into my problem... the first thing that
rang a bell was the mention that it could be to do with the JSPs. I
have checked for development mode settings and neither version has it
enabled. The bell it rang though was that, for some slightly silly
reasons that I am planning on finally clearing up today to see if they
are the cause, we have, in certain pages of the site, roughly 1000
<c:set> tags of text for different languages set in request scope. I
know these should be implemented through properties files instead but
the excuses are for me to deal with, not you guys!

Could this be linked to the cause though? Could it somehow be that the
text variables are being held or referenced for longer than the
request scope period and so quickly eating up memory?
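
For reference, the properties-file approach boils down to loading each language's strings once into a shared bundle instead of re-creating ~1000 request-scoped attributes on every request. In a JSP this is usually done with `<fmt:setBundle>`/`<fmt:message>`; the lookup underneath is plain `java.util.Properties`/`ResourceBundle`. A small sketch with hypothetical keys (not taken from the application in this thread):

```java
import java.io.StringReader;
import java.util.Properties;

public class MessagesSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical contents of a messages_en.properties file; in a webapp
        // it would live on the classpath and be resolved per locale by
        // <fmt:setBundle basename="messages"/> + <fmt:message key="..."/>.
        String en = "welcome.title=Welcome\nwelcome.body=Enjoy your stay";
        Properties messages = new Properties();
        messages.load(new StringReader(en));

        // One bundle, loaded once, replaces ~1000 <c:set scope="request">
        // attributes allocated on every single request.
        System.out.println(messages.getProperty("welcome.title"));
    }
}
```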

I also remembered some changes we made to the tomcat settings in
web.xml to fork the compilation of JSPs to a separate JVM and we had
set the 'modificationTestInterval' to 40 seconds. These may have had
an impact as well so I'll change these in the later versions of Tomcat
to see if they have any effect.
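
For anyone checking the same thing: both Jasper options live as init-params on the JSP servlet, normally in $CATALINA_BASE/conf/web.xml (or overridden per webapp). A sketch mirroring the settings described above; the values are illustrative:

```xml
<servlet>
    <servlet-name>jsp</servlet-name>
    <servlet-class>org.apache.jasper.servlet.JspServlet</servlet-class>
    <init-param>
        <!-- Compile JSPs in a separate JVM rather than in-process -->
        <param-name>fork</param-name>
        <param-value>true</param-value>
    </init-param>
    <init-param>
        <!-- Check a JSP for modification at most once every 40 seconds -->
        <param-name>modificationTestInterval</param-name>
        <param-value>40</param-value>
    </init-param>
    <load-on-startup>3</load-on-startup>
</servlet>
```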

Thanks again,

Ian




On 22 July 2011 22:42, Pid <pi...@pidster.com> wrote:
> On 22/07/2011 20:17, Mark Thomas wrote:
>> On 22/07/2011 17:26, Ian Marsh wrote:
>>> Hi,
>>>
>>> I am in charge of running an Apache-2, Tomcat-7, Ubuntu-10.04 set up
>>> for which we have to be PCI Compliant. We recently upgraded to
>>> Apache-2.2.17 and Tomcat-7.0.8 (from Apache-2.0.x and Tomcat 5.0.28)
>>> in order to comply with the requirements of the PCI Compliance checks
>>> and ironed out any issues to get us back to a satisfactory running
>>> state.
>>
>> Hmm. I think you need some better PCI auditors. If your app was running
>> on Tomcat 5.0.x and you trust the app (which seems reasonable given it
>> is doing something that requires PCI compliance) then an upgrade to
>> 7.0.12 should be sufficient if you are using the HTTP BIO connector.
>
> Indeed.
>
> In my experience, I'd expect a QSA/PCI Auditor to be far, far more
> conservative than to promote Tomcat 7.0.x as a 'safe' version compared
> to 6.0.recent.
>
>
> p
>
>
>> Since Tomcat appears to be behind httpd, there is a strong chance you
>> are using AJP (BIO or APR), in which case 7.0.2 should be sufficient.
>>
>> It appears your current auditors are blindly (and wrongly) assuming any
>> vulnerability in Tomcat will impact your installation. Expect a demand
>> to upgrade to 7.0.19 when they get around to reading the Tomcat security
>> pages again.
>>
>> <snip/>
>>
>>> It seems that the character arrays [C, java.lang.String and
>>> javax.servlet.jsp.tagext.TagAttributeInfo entries are considerably
>>> higher in Tomcat-7.0.10 than in Tomcat-7.0.8 and I am wondering if
>>> this could lead to an explanation for the difference.
>>
>> Maybe. What you really want to look at is the GC roots for those
>> objects. That will tell you what is holding on to the references. Based
>> on that data I'd start looking at the arrays of TagAttributeInfo but
>> that might be completely the wrong place to look.
>>
>> I've just triggered a heap dump on the ASF Jira instance (running
>> 7.0.19) to see what that looks like. I'll report back what I find (once
>> the 4GB heap has finished downloading - it may be some time).
>>
>>> Would anyone know of any changes between the two versions, possibly
>>> linked to those memory entries, that could lead to such behaviour?
>>
>> Nothing jumped out at me from the changelog.
>>
>>> Any help or suggestions are greatly appreciated! I'm sorry for a long
>>> post, but hopefully it's got the information needed to help with diagnosis.
>>
>> To be honest, there isn't enough info here to diagnose the root cause
>> but there is enough to demonstrate that there is probably a problem and
>> maybe where to start looking. That might not seem like much but it is a
>> heck of a lot better than most of the reports we get here. Thanks for
>> providing such a useful problem report.
>>
>> Mark
>>
>>
>>



Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Pid <pi...@pidster.com>.
On 22/07/2011 20:17, Mark Thomas wrote:
> On 22/07/2011 17:26, Ian Marsh wrote:
>> Hi,
>>
>> I am in charge of running an Apache-2, Tomcat-7, Ubuntu-10.04 set up
>> for which we have to be PCI Compliant. We recently upgraded to
>> Apache-2.2.17 and Tomcat-7.0.8 (from Apache-2.0.x and Tomcat 5.0.28)
>> in order to comply with the requirements of the PCI Compliance checks
>> and ironed out any issues to get us back to a satisfactory running
>> state.
> 
> Hmm. I think you need some better PCI auditors. If your app was running
> on Tomcat 5.0.x and you trust the app (which seems reasonable given it
> is doing something that requires PCI compliance) then an upgrade to
>> 7.0.12 should be sufficient if you are using the HTTP BIO connector.

Indeed.

In my experience, I'd expect a QSA/PCI Auditor to be far, far more
conservative than to promote Tomcat 7.0.x as a 'safe' version compared
to 6.0.recent.


p


> Since Tomcat appears to be behind httpd, there is a strong chance you
> are using AJP (BIO or APR), in which case 7.0.2 should be sufficient.
> 
> It appears your current auditors are blindly (and wrongly) assuming any
> vulnerability in Tomcat will impact your installation. Expect a demand
> to upgrade to 7.0.19 when they get around to reading the Tomcat security
> pages again.
> 
> <snip/>
> 
>> It seems that the character arrays [C, java.lang.String and
>> javax.servlet.jsp.tagext.TagAttributeInfo entries are considerably
>> higher in Tomcat-7.0.10 than in Tomcat-7.0.8 and I am wondering if
>> this could lead to an explanation for the difference.
> 
> Maybe. What you really want to look at is the GC roots for those
> objects. That will tell you what is holding on to the references. Based
> on that data I'd start looking at the arrays of TagAttributeInfo but
> that might be completely the wrong place to look.
> 
> I've just triggered a heap dump on the ASF Jira instance (running
> 7.0.19) to see what that looks like. I'll report back what I find (once
> the 4GB heap has finished downloading - it may be some time).
> 
>> Would anyone know of any changes between the two versions, possibly
>> linked to those memory entries, that could lead to such behaviour?
> 
> Nothing jumped out at me from the changelog.
> 
>> Any help or suggestions are greatly appreciated! I'm sorry for a long
>> post, but hopefully it's got the information needed to help with diagnosis.
> 
> To be honest, there isn't enough info here to diagnose the root cause
> but there is enough to demonstrate that there is probably a problem and
> maybe where to start looking. That might not seem like much but it is a
> heck of a lot better than most of the reports we get here. Thanks for
> providing such a useful problem report.
> 
> Mark
> 
> 
> 




Re: Upgrading from Tomcat 7.0.8 to 7.0.10 and higher causes Old Generation memory problems

Posted by Mark Thomas <ma...@apache.org>.
On 22/07/2011 17:26, Ian Marsh wrote:
> Hi,
> 
> I am in charge of running an Apache-2, Tomcat-7, Ubuntu-10.04 set up
> for which we have to be PCI Compliant. We recently upgraded to
> Apache-2.2.17 and Tomcat-7.0.8 (from Apache-2.0.x and Tomcat 5.0.28)
> in order to comply with the requirements of the PCI Compliance checks
> and ironed out any issues to get us back to a satisfactory running
> state.

Hmm. I think you need some better PCI auditors. If your app was running
on Tomcat 5.0.x and you trust the app (which seems reasonable given it
is doing something that requires PCI compliance) then an upgrade to
7.0.12 should be sufficient if you are using the HTTP BIO connector.

Since Tomcat appears to be behind httpd, there is a strong chance you
are using AJP (BIO or APR), in which case 7.0.2 should be sufficient.

It appears your current auditors are blindly (and wrongly) assuming any
vulnerability in Tomcat will impact your installation. Expect a demand
to upgrade to 7.0.19 when they get around to reading the Tomcat security
pages again.

<snip/>

> It seems that the character arrays [C, java.lang.String and
> javax.servlet.jsp.tagext.TagAttributeInfo entries are considerably
> higher in Tomcat-7.0.10 than in Tomcat-7.0.8 and I am wondering if
> this could lead to an explanation for the difference.

Maybe. What you really want to look at is the GC roots for those
objects. That will tell you what is holding on to the references. Based
on that data I'd start looking at the arrays of TagAttributeInfo but
that might be completely the wrong place to look.

I've just triggered a heap dump on the ASF Jira instance (running
7.0.19) to see what that looks like. I'll report back what I find (once
the 4GB heap has finished downloading - it may be some time).

> Would anyone know of any changes between the two versions, possibly
> linked to those memory entries, that could lead to such behaviour?

Nothing jumped out at me from the changelog.

> Any help or suggestions are greatly appreciated! I'm sorry for a long
> post, but hopefully it's got the information needed to help with diagnosis.

To be honest, there isn't enough info here to diagnose the root cause
but there is enough to demonstrate that there is probably a problem and
maybe where to start looking. That might not seem like much but it is a
heck of a lot better than most of the reports we get here. Thanks for
providing such a useful problem report.

Mark


