Posted to user@cayenne.apache.org by Joe Baldwin <jf...@earthlink.net> on 2009/08/13 18:22:57 UTC

Memory Management Practices

Background:

I have been attempting to do as much performance tuning as I can given
the limited visibility I have into the middleware I am using, but I am
running into severe "out of memory" errors with Tomcat on my production
server.  My current theory is that I have missed something about how to
properly manage my Cayenne data objects.

Configuration:
1. I am using the most recent version of Cayenne.
2. This is primarily a web app, using JSP, Tomcat, Cayenne and MySQL
3. Tested both on OSX and Linux
4. Essentially, it is a webstore with a lot of products
	- small number of UPDATEs
	- a very large number of searches, with read-only result sets
averaging 50-500 products
	- occasional large read-only result sets of a few thousand products
5. The design is a simple 3-tier.

After a few hours to a few days of what appears to be very light  
usage, Tomcat reports "out of memory" errors.  Since the vast majority  
of the tasks performed by the app are funneled through Cayenne, I am  
assuming that is where I am making the mistake.

Question:
What is the best way to manage the data objects in this  
configuration?   (i.e. caching, releasing memory, etc)

Note:
I was researching the Java library class ArrayList (which I believe is
the workhorse behind Cayenne result sets), and there are many
recommendations about explicitly clearing the list to release memory.
Is this possibly where I may have missed something?

Thanks,
Joe


Re: Memory Management Practices

Posted by Malcolm Edgar <ma...@gmail.com>.
With web applications I don't recommend storing the DataContext in the
user's session.  I think you are better off creating a new DataContext
for each request.  This will allow the DataContext to be GC'd after
each request.

My experience is that creating DataContexts is very cheap. I can
provide a good DataContextFilter we use in the Apache Click project
if you want.  This is possibly something we could include in Apache
Cayenne as an alternative Filter.
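
A minimal sketch of what such a filter can look like (the class name and
details below are illustrative only, not the actual Click implementation):

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

import org.apache.cayenne.access.DataContext;

public class PerRequestDataContextFilter implements Filter {

    public void init(FilterConfig filterConfig) throws ServletException {
        // nothing to set up; Cayenne loads its configuration lazily
    }

    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {

        // create a fresh DataContext for this request and bind it to the
        // current thread, so code further down the stack can find it via
        // DataContext.getThreadDataContext()
        DataContext context = DataContext.createDataContext();
        DataContext.bindThreadDataContext(context);
        try {
            chain.doFilter(request, response);
        }
        finally {
            // unbind so the context and everything registered in it
            // becomes eligible for GC once the request completes
            DataContext.bindThreadDataContext(null);
        }
    }

    public void destroy() {
    }
}

Map the filter to /* in web.xml and every request gets its own
short-lived context.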

regards Malcolm Edgar

On Fri, Aug 14, 2009 at 12:23 PM, Joe Baldwin<jf...@earthlink.net> wrote:
> Then I don't think these are viable options, my dev server uses Java 1.5.
>
> What I was hoping for is sort of a simple how-to on best practices when
> cleaning up after a large query.  Especially when there are many sessions
> anticipated.

Re: Memory Management Practices

Posted by Joe Baldwin <jf...@earthlink.net>.
Then I don't think these are viable options; my dev server uses Java
1.5.

What I was hoping for is a simple how-to on best practices for
cleaning up after a large query, especially when many sessions are
anticipated.




On Aug 13, 2009, at 4:45 PM, Michael Gentry wrote:

> FWIW, jmap -dump is only on Java 1.6, not 1.5.
>
> mrg


Re: Memory Management Practices

Posted by Michael Gentry <mg...@masslight.net>.
FWIW, jmap -dump is only on Java 1.6, not 1.5.

mrg


On Thu, Aug 13, 2009 at 3:29 PM, Tore Halset<ha...@pvv.ntnu.no> wrote:
> Hello.
>
> It is hard to tell where the memory problems are without looking at the
> actual used memory. I normally use jmap to dump memory info and then jhat on
> a different computer to analyze the dump.
>
> jmap -dump:live,file=filename pid
> jhat -J-Xmx10G filename
>
> Depending on your heap size, jhat may need a lot of memory and cpu. That is
> why I normally copy the file to a separate non-production server.
>
> If your memory is filled with Cayenne DataRows, then you should look at the
> size of the DataRow cache. Both DataRows and CayenneDataObjects use Maps that
> need quite a lot of memory, so you should not keep too many around.
>
> Regards,
>  - Tore.
>

Re: Memory Management Practices

Posted by Tore Halset <ha...@pvv.ntnu.no>.
Hello.

It is hard to tell where the memory problems are without looking at
the memory actually in use. I normally use jmap to dump the heap and
then jhat on a different computer to analyze the dump.

jmap -dump:live,file=filename pid
jhat -J-Xmx10G filename

Depending on your heap size, jhat may need a lot of memory and CPU.
That is why I normally copy the dump file to a separate non-production
server.

If your memory is filled with Cayenne DataRows, then you should look
at the size of the DataRow cache. Both DataRows and CayenneDataObjects
use Maps that need quite a lot of memory, so you should not keep too
many around.
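
For the kind of large read-only searches you describe, a rough sketch of
a memory-friendlier fetch (the "Product" entity and its "category"
property are made-up placeholders, not from your model):

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.exp.Expression;
import org.apache.cayenne.exp.ExpressionFactory;
import org.apache.cayenne.query.SelectQuery;

public class ProductSearch {

    public List findByCategory(DataContext context, String category) {
        Expression qualifier = ExpressionFactory.matchExp("category", category);
        SelectQuery query = new SelectQuery("Product", qualifier);

        // return plain DataRows (Maps) instead of registered
        // CayenneDataObjects; nothing extra is kept in the DataContext
        query.setFetchingDataRows(true);

        // resolve the result a page at a time so a few-thousand-row hit
        // is not materialized in memory all at once
        query.setPageSize(100);

        // elements of the returned list are DataRows
        return context.performQuery(query);
    }
}

Since the DataRows are not registered anywhere, they can be collected
as soon as the list goes out of scope.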

Regards,
  - Tore.
