Posted to user@cayenne.apache.org by "Chris Murphy (www.strandz.org)" <ch...@strandz.org> on 2011/12/11 07:51:03 UTC

Continual creation is a memory leak

I have a server application that continually reads data from sensors. At
set intervals the data is summarized. This summary data is used to create
Cayenne data objects of type Reading. A short transaction commits these
Reading objects to the database, after which it is not important that they
are held in memory - they were created only to be stored. After a long
period of time their continual accumulation results in an 'OutOfMemory' JVM
condition.

There are many objects of type Reading for another Cayenne data object
called SubstancePoint. And there's a whole object graph going back from
there. I basically want to keep the whole of the object graph in memory,
except for these Reading objects.

Is there a way to 'disappear' data objects from the DataContext? I think
that would solve my problem. I have tried calling
DataContext.unregisterObjects() post-commit on the Reading data objects I
want evacuated from memory, but I can see that the leak is still going on.
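
Roughly, the pattern looks like this (a much simplified sketch, not the real
job code - the setters and the summary type are just illustrative):

import java.util.ArrayList;
import java.util.List;
import org.apache.cayenne.access.DataContext;

public class ReadingStorageJob {

    // One long-lived, shared DataContext; Reading objects are committed and
    // then unregistered, yet heap usage still grows over time.
    void storeSummaries(DataContext sharedContext, SubstancePoint point,
            List<Double> summaries) {
        List<Reading> batch = new ArrayList<Reading>();
        for (Double value : summaries) {
            Reading reading = sharedContext.newObject(Reading.class);
            reading.setValue(value);            // illustrative setter
            reading.setSubstancePoint(point);   // illustrative setter
            batch.add(reading);
        }
        sharedContext.commitChanges();
        sharedContext.unregisterObjects(batch); // the attempted cleanup
    }
}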

Thank you ~ Chris Murphy

Re: Continual creation is a memory leak

Posted by "Chris Murphy (www.strandz.org)" <ch...@strandz.org>.
Hi Malcolm,

I need to do everything to ensure that this server will stay up. So yes, if
you can give me the code you used, I will add it in for logging purposes, to
hopefully see what each requested connection is being used for, and thus
whether any are not being reused, indicating they have not been returned to
the pool.

Meanwhile I can look at the log file to see if there are any stack traces
in it. My thinking is that a thread death will usually be marked by some
kind of stack trace.

Even simply knowing how many connections are in use at any one time, over
time, will give me some peace of mind. I expect to see two being used. I
have evidence that three have been needed on occasion. If I see four being
used after a longer period of the server being up, then I can predict that
five and then a "Too many connections" crash will be on the horizon - as I
have set 'max connections' to five.

Thank you.

~ Chris Murphy

On 13 December 2011 20:38, Malcolm Edgar <ma...@gmail.com> wrote:

> Hi Chris,
>
> I once had an issue with JBoss 4.0.3 in a production environment where
> connections were not being returned to the pool and eventually the
> application would crash through connection pool exhaustion.
>
> It took a long time to figure out what was happening; it appeared that
> we had some kind of thread-death-like behaviour, with some connections
> never being closed and returned to the pool. The fix for this was to
> create a connection wrapper, so we could track whether connections
> were being returned to the pool with a request filter.  If you are
> interested I can provide the code, which is plugged in with a Cayenne
> DataSourceFactory.
>
> regards Malcolm Edgar
>
> On Tue, Dec 13, 2011 at 5:22 AM, Chris Murphy <mu...@gmail.com> wrote:
> > I'm using JProfiler to look for memory leaks. I've used it to see that
> > they've been plugged since I implemented Robert's suggestion of having a
> > separate temporary DataContext that gets garbage collected, collecting
> the
> > Reading data objects with it.
> >
> > I now think that garbage collection being the reason for three
> connections
> > to be required is quite silly! If a connection is ready for garbage
> > collection then it won't be visible to the Cayenne code. The Cayenne code
> > is baulking because two connections are currently in use and a third is
> > required to execute a query (that's my understanding of the error
> message).
> > To really see the cause of the problem it would be good if the "Too many
> > connections" stack trace told us what the other two connections are being
> > used for. Alternatively, the instrumentation at the DEBUG logging level
> might
> > show connections being picked up and released, and I would be able to
> work
> > out why the other two are being held from there.
> >
> >  ~ Chris Murphy
> >
> > On 13 December 2011 00:50, Durchholz, Joachim <
> > Joachim.Durchholz@hennig-fahrzeugteile.de> wrote:
> >
> >> > Perhaps the garbage collector thinks that it's okay to leave around
> such
> >> a small amount of garbage for more than a minute.
> >> >
> >> > Am I right?? It sounds weird that a garbage collector can cause a
> >> program to crash!
> >>
> >> Probably not. Garbage collectors will kick in after a certain threshold
> >> has been reached, and that threshold is usually set at the end of the
> last
> >> cycle.
> >> In other words, unless you have a real memory leak, the collector should
> >> keep the memory footprint stable. It's part of its job description.
> >> (The actual mechanism is far more complicated, but that's the bottom
> line).
> >>
> >> If you are using Sun's Java 6, you can monitor memory usage via
> JVisualVM,
> >> which comes as part of the JDK.
> >> JVisualVM also works well for Sun's Java 5 runtime. You'll have slightly
> >> reduced functionality and you need to supply an extra -D parameter to
> the
> >> Java 5 RE, but it does give you the pretty real-time graphs. That tool
> >> helped me a ton, a year or two ago, with little hassle for set-up,
> despite
> >> being tied to Java 5 at that time. (JVisualVM will even give you
> statistics
> >> about which data types take up most memory, show you all objects of a
> type,
> >> and will even do a breadth-first search for a stack frame that's keeping
> >> any given object alive. I'm not 100% sure how much of this is available
> for
> >> Java 5, but if Java 5 fails you can still try Java 6.)
> >>
> >> HTH
> >> Jo
> >>
>

Re: Continual creation is a memory leak

Posted by Malcolm Edgar <ma...@gmail.com>.
Hi Chris,

I once had an issue with JBoss 4.0.3 in a production environment where
connections were not being returned to the pool and eventually the
application would crash through connection pool exhaustion.

It took a long time to figure out what was happening; it appeared that
we had some kind of thread-death-like behaviour, with some connections
never being closed and returned to the pool. The fix for this was to
create a connection wrapper, so we could track whether connections
were being returned to the pool with a request filter.  If you are
interested I can provide the code, which is plugged in with a Cayenne
DataSourceFactory.
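
In outline it is just a thin wrapper that counts check-outs and decrements
the count when close() returns a connection to the pool. The sketch below is
a simplified reconstruction from memory, not the exact code, and the class
name is illustrative:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.util.concurrent.atomic.AtomicInteger;

public final class ConnectionTracker {

    private static final AtomicInteger inUse = new AtomicInteger();

    // Wrap a freshly checked-out connection; log how many are currently out.
    public static Connection track(final Connection real) {
        System.out.println("connections in use: " + inUse.incrementAndGet());
        return (Connection) Proxy.newProxyInstance(
                Connection.class.getClassLoader(),
                new Class<?>[] { Connection.class },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method m, Object[] args)
                            throws Throwable {
                        if ("close".equals(m.getName())) {
                            // Connection is going back to the pool.
                            System.out.println("connections in use: "
                                    + inUse.decrementAndGet());
                        }
                        return m.invoke(real, args);
                    }
                });
    }
}

Wherever the application obtains a connection from the pool it returns
ConnectionTracker.track(dataSource.getConnection()) instead of the raw
connection, so the log shows exactly how many connections are out at any
moment and which ones never come back.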

regards Malcolm Edgar

On Tue, Dec 13, 2011 at 5:22 AM, Chris Murphy <mu...@gmail.com> wrote:
> I'm using JProfiler to look for memory leaks. I've used it to see that
> they've been plugged since I implemented Robert's suggestion of having a
> separate temporary DataContext that gets garbage collected, collecting the
> Reading data objects with it.
>
> I now think that garbage collection being the reason for three connections
> to be required is quite silly! If a connection is ready for garbage
> collection then it won't be visible to the Cayenne code. The Cayenne code
> is baulking because two connections are currently in use and a third is
> required to execute a query (that's my understanding of the error message).
> To really see the cause of the problem it would be good if the "Too many
> connections" stack trace told us what the other two connections are being
> used for. Alternatively, the instrumentation at the DEBUG logging level might
> show connections being picked up and released, and I would be able to work
> out why the other two are being held from there.
>
>  ~ Chris Murphy
>
> On 13 December 2011 00:50, Durchholz, Joachim <
> Joachim.Durchholz@hennig-fahrzeugteile.de> wrote:
>
>> > Perhaps the garbage collector thinks that it's okay to leave around such
>> a small amount of garbage for more than a minute.
>> >
>> > Am I right?? It sounds weird that a garbage collector can cause a
>> program to crash!
>>
>> Probably not. Garbage collectors will kick in after a certain threshold
>> has been reached, and that threshold is usually set at the end of the last
>> cycle.
>> In other words, unless you have a real memory leak, the collector should
>> keep the memory footprint stable. It's part of its job description.
>> (The actual mechanism is far more complicated, but that's the bottom line).
>>
>> If you are using Sun's Java 6, you can monitor memory usage via JVisualVM,
>> which comes as part of the JDK.
>> JVisualVM also works well for Sun's Java 5 runtime. You'll have slightly
>> reduced functionality and you need to supply an extra -D parameter to the
>> Java 5 RE, but it does give you the pretty real-time graphs. That tool
>> helped me a ton, a year or two ago, with little hassle for set-up, despite
>> being tied to Java 5 at that time. (JVisualVM will even give you statistics
>> about which data types take up most memory, show you all objects of a type,
>> and will even do a breadth-first search for a stack frame that's keeping
>> any given object alive. I'm not 100% sure how much of this is available for
>> Java 5, but if Java 5 fails you can still try Java 6.)
>>
>> HTH
>> Jo
>>

Re: Continual creation is a memory leak

Posted by Chris Murphy <mu...@gmail.com>.
I'm using JProfiler to look for memory leaks. I've used it to see that
they've been plugged since I implemented Robert's suggestion of having a
separate temporary DataContext that gets garbage collected, collecting the
Reading data objects with it.

I now think that garbage collection being the reason for three connections
to be required is quite silly! If a connection is ready for garbage
collection then it won't be visible to the Cayenne code. The Cayenne code
is baulking because two connections are currently in use and a third is
required to execute a query (that's my understanding of the error message).
To really see the cause of the problem it would be good if the "Too many
connections" stack trace told us what the other two connections are being
used for. Alternatively, the instrumentation at the DEBUG logging level might
show connections being picked up and released, and I would be able to work
out why the other two are being held from there.

 ~ Chris Murphy

On 13 December 2011 00:50, Durchholz, Joachim <
Joachim.Durchholz@hennig-fahrzeugteile.de> wrote:

> > Perhaps the garbage collector thinks that it's okay to leave around such
> a small amount of garbage for more than a minute.
> >
> > Am I right?? It sounds weird that a garbage collector can cause a
> program to crash!
>
> Probably not. Garbage collectors will kick in after a certain threshold
> has been reached, and that threshold is usually set at the end of the last
> cycle.
> In other words, unless you have a real memory leak, the collector should
> keep the memory footprint stable. It's part of its job description.
> (The actual mechanism is far more complicated, but that's the bottom line).
>
> If you are using Sun's Java 6, you can monitor memory usage via JVisualVM,
> which comes as part of the JDK.
> JVisualVM also works well for Sun's Java 5 runtime. You'll have slightly
> reduced functionality and you need to supply an extra -D parameter to the
> Java 5 RE, but it does give you the pretty real-time graphs. That tool
> helped me a ton, a year or two ago, with little hassle for set-up, despite
> being tied to Java 5 at that time. (JVisualVM will even give you statistics
> about which data types take up most memory, show you all objects of a type,
> and will even do a breadth-first search for a stack frame that's keeping
> any given object alive. I'm not 100% sure how much of this is available for
> Java 5, but if Java 5 fails you can still try Java 6.)
>
> HTH
> Jo
>

RE: Continual creation is a memory leak

Posted by "Durchholz, Joachim" <Jo...@hennig-fahrzeugteile.de>.
> Perhaps the garbage collector thinks that it's okay to leave around such a small amount of garbage for more than a minute.
> 
> Am I right?? It sounds weird that a garbage collector can cause a program to crash!

Probably not. Garbage collectors will kick in after a certain threshold has been reached, and that threshold is usually set at the end of the last cycle.
In other words, unless you have a real memory leak, the collector should keep the memory footprint stable. It's part of its job description.
(The actual mechanism is far more complicated, but that's the bottom line).

If you are using Sun's Java 6, you can monitor memory usage via JVisualVM, which comes as part of the JDK.
JVisualVM also works well for Sun's Java 5 runtime. You'll have slightly reduced functionality and you need to supply an extra -D parameter to the Java 5 RE, but it does give you the pretty real-time graphs. That tool helped me a ton, a year or two ago, with little hassle for set-up, despite being tied to Java 5 at that time. (JVisualVM will even give you statistics about which data types take up most memory, show you all objects of a type, and will even do a breadth-first search for a stack frame that's keeping any given object alive. I'm not 100% sure how much of this is available for Java 5, but if Java 5 fails you can still try Java 6.)
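
If attaching a GUI tool to a headless server is awkward, the same heap
numbers can also be logged from inside the process with the standard
java.lang.management API - a minimal sketch (nothing to do with JVisualVM
itself; the class name is illustrative):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;
import java.util.Timer;
import java.util.TimerTask;

public class HeapLogger {

    // Print used/committed heap every 'periodMillis' on a daemon timer thread.
    public static void start(long periodMillis) {
        final MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        new Timer("heap-logger", true).scheduleAtFixedRate(new TimerTask() {
            public void run() {
                MemoryUsage heap = memory.getHeapMemoryUsage();
                System.out.println("heap used=" + (heap.getUsed() >> 20)
                        + "MB committed=" + (heap.getCommitted() >> 20) + "MB");
            }
        }, 0, periodMillis);
    }
}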

HTH
Jo

Re: Continual creation is a memory leak

Posted by "Chris Murphy (www.strandz.org)" <ch...@strandz.org>.
Quite interestingly two was not enough. I got the error again after leaving
the server going for one and a half hours. My theory as to why two
connections were not enough is that I was relying on garbage collection to
have flushed away everything (including the connection) associated with the
old 'out of scope' DataContext before the same method was called one
minute later. Perhaps the garbage collector thinks that it's okay to leave
around such a small amount of garbage for more than a minute.

Am I right?? It sounds weird that a garbage collector can cause a program
to crash!

Anyway now I'm going to give it more garbage and leave the garbage around
for longer by lazily re-creating the DataContext every ten calls. This is
more akin to the way you were originally talking I think. Also I'm going to
increase the max number of connections to five.
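
Something along these lines - a rough sketch of what I mean, not the actual
job code (the holder class is illustrative):

import org.apache.cayenne.access.DataContext;

public class ReadingContextHolder {

    private static final int RECYCLE_EVERY = 10;

    private DataContext context;
    private int calls;

    // Reuse one DataContext for ten calls, then drop it so the Reading
    // objects registered in it become garbage together.
    public synchronized DataContext context() {
        if (context == null || ++calls >= RECYCLE_EVERY) {
            context = DataContext.createDataContext();
            calls = 0;
        }
        return context;
    }
}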

 ~ Chris Murphy

On 12 December 2011 17:08, Robert Zeigler <ro...@roxanemy.com> wrote:

> You're right about the max # of concurrent users, but the max # of
> required connections is usually < the max # of concurrent users. Cayenne
> only needs a connection to query the db or to commit to the db. Outside of
> those operations, connections are returned to the pool, so you can usually
> get by with significantly fewer connections. I think you're on the right
> track with gradually increasing the max connections as opposed to gradually
> decreasing them.
>
> Cheers,
>
> Robert
>
> On Dec 11, 2011, at 12/1111:57 PM , Chris Murphy (www.strandz.org) wrote:
>
> > I just had a look at the Modeler and at the DomainNode's JDBC
> Configuration
> > (in the Main tab). I saw that 'Max Connections' was set to '1'. So
> perhaps
> > I wasn't giving Cayenne much to play with in terms of connection
> pooling! I
> > have now set it to '2'. (I imagine the default was a bit higher than
> these
> > numbers).
> >
> > In the program at the moment there ought to be at most two DataContexts.
> > And one connection each is enough. So two should suffice until I go for
> the
> > recommended approach with multiple users of having a DataContext per
> user.
> > In that case I imagine 'Max Connections' will be related to the expected
> > maximum number of concurrent users. But multi-user configuration is for
> > another day...
> >
> > So far this server program hasn't fallen over... And I don't think
> there's
> > any reason why I would need more than 2 connections is there? I like to
> > keep everything as tight as possible - 3 connections would actually be an
> > error that I would like to know about.
> >
> > Thank you ~ Chris Murphy
> >
> > On 12 December 2011 16:15, Robert Zeigler <robert.zeigler@roxanemy.com
> >wrote:
> >
> >> Hm. Cayenne is usually pretty good about managing the connection pooling
> >> for you.
> >> How many concurrent connections is your mysql db configured to allow?
> >>
> >> Robert
> >>
> >> On Dec 11, 2011, at 12/119:58 PM , Chris Murphy (www.strandz.org)
> wrote:
> >>
> >>> Thanks Robert,
> >>>
> >>> That was the answer. The memory leak disappeared. I now create a
> >>> DataContext with 'method scope' that gets gc-ed when the method has
> >>> finished. That brought up what must be another problem with my code
> which
> >>> I'm investigating now:
> >>>
> >>> Caused by: org.apache.cayenne.CayenneRuntimeException: [v.3.0 Apr 26
> 2010
> >>> 09:59:17] Error detecting database type: Data source rejected
> >> establishment
> >>> of connection message from server: "Too many connections"
> >>>   at
> >> org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:185)
> >>>   at
> org.apache.cayenne.dba.AutoAdapter.getAdapter(AutoAdapter.java:155)
> >>>   at
> >>>
> org.apache.cayenne.dba.AutoAdapter.getExtendedTypes(AutoAdapter.java:263)
> >>>   at
> >> org.apache.cayenne.access.DataNode.performQueries(DataNode.java:243)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataDomainQueryAction.runQuery(DataDomainQueryAction.java:422)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataDomainQueryAction.access$000(DataDomainQueryAction.java:69)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataDomainQueryAction$2.transform(DataDomainQueryAction.java:395)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataDomain.runInTransaction(DataDomain.java:850)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataDomainQueryAction.runQueryInTransaction(DataDomainQueryAction.java:392)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataDomainQueryAction.execute(DataDomainQueryAction.java:121)
> >>>   at org.apache.cayenne.access.DataDomain.onQuery(DataDomain.java:743)
> >>>   at
> >>>
> >>
> org.apache.cayenne.util.ObjectContextQueryAction.runQuery(ObjectContextQueryAction.java:333)
> >>>   at
> >>>
> >>
> org.apache.cayenne.util.ObjectContextQueryAction.execute(ObjectContextQueryAction.java:96)
> >>>   at
> >> org.apache.cayenne.access.DataContext.onQuery(DataContext.java:1278)
> >>>   at
> >>>
> org.apache.cayenne.access.DataContext.performQuery(DataContext.java:1267)
> >>>   at
> >>>
> >>
> com.seasoft.store.orm.CayenneInterpretedQuery.execute(CayenneInterpretedQuery.java:265)
> >>>   at com.seasoft.store.DomainQuery.execute(DomainQuery.java:43)
> >>>   at
> >>>
> >>
> com.seasoft.store.DomainQueries.executeRetCollection(DomainQueries.java:56)
> >>>   at
> >>>
> >>
> com.cmts.business.orm.DOActivity.storeRealtimeReadings(DOActivity.java:113)
> >>>   at
> >>>
> >>
> com.cmts.business.SenderGasReceiver.storeRealtimeSamplesIntoSummaryReadings(SenderGasReceiver.java:266)
> >>>   at com.cmts.business.StoreRealtimeIntoDBJob.execut
> >>> e(StoreRealtimeIntoDBJob.java:20)
> >>>   at org.quartz.core.JobRunShell.run(JobRunShell.java:216)
> >>>
> >>> Maybe I need to recycle connections or something?? In the Cayenne
> >>> documentation I found this: "Cayenne will also periodically close
> unused
> >>> database connections if it determines there are too many that are open
> >> and
> >>> idle." Is there some way to hint to Cayenne that you've finished with a
> >>> DataContext's connection pool, something like 'dataContext.close()'?
> >>>
> >>> Here is the exception at a lower level:
> >>>
> >>> Caused by: java.sql.SQLException: Data source rejected establishment of
> >>> connection message from server: "Too many connections"
> >>>   at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
> >>>   at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1906)
> >>>   at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:2520)
> >>>   at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:817)
> >>>   at com.mysql.jdbc.Connection.createNewIO(Connection.java:1782)
> >>>   at com.mysql.jdbc.Connection.<init>(Connection.java:450)
> >>>   at
> >>>
> >>
> com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:411)
> >>>   at
> >>>
> >>
> org.apache.cayenne.conn.DriverDataSource.getConnection(DriverDataSource.java:156)
> >>>   at
> >>>
> >>
> org.apache.cayenne.conn.PooledConnectionImpl.reconnect(PooledConnectionImpl.java:83)
> >>>   at
> >>>
> >>
> org.apache.cayenne.conn.PooledConnectionImpl.getConnection(PooledConnectionImpl.java:120)
> >>>   at
> >>>
> >>
> org.apache.cayenne.conn.PoolManager.uncheckConnection(PoolManager.java:369)
> >>>   at
> >>> org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:353)
> >>>   at
> >>> org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:330)
> >>>   at
> >>>
> >>
> org.apache.cayenne.access.DataNode$TransactionDataSource.getConnection(DataNode.java:364)
> >>>   at
> >>>
> >>
> org.apache.cayenne.conf.NodeDataSource.getConnection(NodeDataSource.java:46)
> >>>   at
> >> org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:170)
> >>>
> >>> Thanks ~ Chris Murphy
> >>>
> >>> On 11 December 2011 17:57, Robert Zeigler <robert.zeigler@roxanemy.com
> >>> wrote:
> >>>
> >>>> Note that DataContext creation is cheap. What I typically do in
> >> situations
> >>>> like this is to periodically create a new data context that you can
> >> throw
> >>>> away. When it's gone, the associated objects will be gc'ed. Eg: you
> >> could
> >>>> periodically dump the "reading" data context and create a fresh one
> >> after
> >>>> every 1,000 Reading objects or whatever makes sense for your use-case.
> >> If
> >>>> the frequency of readings transactions is low, it probably makes sense
> >> to
> >>>> create a new DataContext, import whatever objects you need into it
> (via
> >>>> localObject), create your readings, commit, then discard the
> >> DataContext.
> >>>> If the readings are very frequent, then it makes sense to have a
> >> dedicated
> >>>> "readings" DataContext that you can periodically swap out.
> >>>>
> >>>> HTH,
> >>>>
> >>>> Robert
> >>>>
> >>>> On Dec 11, 2011, at 12/1112:51 AM , Chris Murphy (www.strandz.org)
> >> wrote:
> >>>>
> >>>>> I have a server application that continually reads data from sensors.
> >> At
> >>>>> set intervals the data is summarized. This summary data is used to
> >> create
> >>>>> Cayenne data objects of type Reading. A short transaction commits
> these
> >>>>> Reading objects to the database, after which it is not important that
> >>>> they
> >>>>> are held in memory - they were created only to be stored. After a
> long
> >>>>> period of time their continual collection results in an 'OutOfMemory'
> >> JVM
> >>>>> condition.
> >>>>>
> >>>>> There are many objects of type Reading for another Cayenne data
> object
> >>>>> called SubstancePoint. And there's a whole object graph going back
> from
> >>>>> there. I basically want to keep the whole of the object graph in
> >> memory,
> >>>>> except for these Reading objects.
> >>>>>
> >>>>> Is there a way to 'disappear' data objects from the DataContext? For
> >> that
> >>>>> is what I think would solve my problem. I have tried calling
> >>>>> DataContext.unregisterObjects() post-commit on the Reading data
> >> objects I
> >>>>> want evacuated from memory, but I can see that the leak is still
> going
> >>>> on.
> >>>>>
> >>>>> Thank you ~ Chris Murphy
> >>>>
> >>>>
> >>
> >>
>
>

Re: Continual creation is a memory leak

Posted by Robert Zeigler <ro...@roxanemy.com>.
You're right about the max # of concurrent users, but the max # of required connections is usually < the max # of concurrent users. Cayenne only needs a connection to query the db or to commit to the db. Outside of those operations, connections are returned to the pool, so you can usually get by with significantly fewer connections. I think you're on the right track with gradually increasing the max connections as opposed to gradually decreasing them.

Cheers,

Robert

On Dec 11, 2011, at 11:57 PM, Chris Murphy (www.strandz.org) wrote:

> I just had a look at the Modeler and at the DomainNode's JDBC Configuration
> (in the Main tab). I saw that 'Max Connections' was set to '1'. So perhaps
> I wasn't giving Cayenne much to play with in terms of connection pooling! I
> have now set it to '2'. (I imagine the default was a bit higher than these
> numbers).
> 
> In the program at the moment there ought to be at most two DataContexts.
> And one connection each is enough. So two should suffice until I go for the
> recommended approach with multiple users of having a DataContext per user.
> In that case I imagine 'Max Connections' will be related to the expected
> maximum number of concurrent users. But multi-user configuration is for
> another day...
> 
> So far this server program hasn't fallen over... And I don't think there's
> any reason why I would need more than 2 connections is there? I like to
> keep everything as tight as possible - 3 connections would actually be an
> error that I would like to know about.
> 
> Thank you ~ Chris Murphy
> 
> On 12 December 2011 16:15, Robert Zeigler <ro...@roxanemy.com>wrote:
> 
>> Hm. Cayenne is usually pretty good about managing the connection pooling
>> for you.
>> How many concurrent connections is your mysql db configured to allow?
>> 
>> Robert
>> 
>> On Dec 11, 2011, at 12/119:58 PM , Chris Murphy (www.strandz.org) wrote:
>> 
>>> Thanks Robert,
>>> 
>>> That was the answer. The memory leak disappeared. I now create a
>>> DataContext with 'method scope' that gets gc-ed when the method has
>>> finished. That brought up what must be another problem with my code which
>>> I'm investigating now:
>>> 
>>> Caused by: org.apache.cayenne.CayenneRuntimeException: [v.3.0 Apr 26 2010
>>> 09:59:17] Error detecting database type: Data source rejected
>> establishment
>>> of connection message from server: "Too many connections"
>>>   at
>> org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:185)
>>>   at org.apache.cayenne.dba.AutoAdapter.getAdapter(AutoAdapter.java:155)
>>>   at
>>> org.apache.cayenne.dba.AutoAdapter.getExtendedTypes(AutoAdapter.java:263)
>>>   at
>> org.apache.cayenne.access.DataNode.performQueries(DataNode.java:243)
>>>   at
>>> 
>> org.apache.cayenne.access.DataDomainQueryAction.runQuery(DataDomainQueryAction.java:422)
>>>   at
>>> 
>> org.apache.cayenne.access.DataDomainQueryAction.access$000(DataDomainQueryAction.java:69)
>>>   at
>>> 
>> org.apache.cayenne.access.DataDomainQueryAction$2.transform(DataDomainQueryAction.java:395)
>>>   at
>>> 
>> org.apache.cayenne.access.DataDomain.runInTransaction(DataDomain.java:850)
>>>   at
>>> 
>> org.apache.cayenne.access.DataDomainQueryAction.runQueryInTransaction(DataDomainQueryAction.java:392)
>>>   at
>>> 
>> org.apache.cayenne.access.DataDomainQueryAction.execute(DataDomainQueryAction.java:121)
>>>   at org.apache.cayenne.access.DataDomain.onQuery(DataDomain.java:743)
>>>   at
>>> 
>> org.apache.cayenne.util.ObjectContextQueryAction.runQuery(ObjectContextQueryAction.java:333)
>>>   at
>>> 
>> org.apache.cayenne.util.ObjectContextQueryAction.execute(ObjectContextQueryAction.java:96)
>>>   at
>> org.apache.cayenne.access.DataContext.onQuery(DataContext.java:1278)
>>>   at
>>> org.apache.cayenne.access.DataContext.performQuery(DataContext.java:1267)
>>>   at
>>> 
>> com.seasoft.store.orm.CayenneInterpretedQuery.execute(CayenneInterpretedQuery.java:265)
>>>   at com.seasoft.store.DomainQuery.execute(DomainQuery.java:43)
>>>   at
>>> 
>> com.seasoft.store.DomainQueries.executeRetCollection(DomainQueries.java:56)
>>>   at
>>> 
>> com.cmts.business.orm.DOActivity.storeRealtimeReadings(DOActivity.java:113)
>>>   at
>>> 
>> com.cmts.business.SenderGasReceiver.storeRealtimeSamplesIntoSummaryReadings(SenderGasReceiver.java:266)
>>>   at com.cmts.business.StoreRealtimeIntoDBJob.execut
>>> e(StoreRealtimeIntoDBJob.java:20)
>>>   at org.quartz.core.JobRunShell.run(JobRunShell.java:216)
>>> 
>>> Maybe I need to recycle connections or something?? In the Cayenne
>>> documentation I found this: "Cayenne will also periodically close unused
>>> database connections if it determines there are too many that are open
>> and
>>> idle." Is there some way to hint to Cayenne that you've finished with a
>>> DataContext's connection pool, something like 'dataContext.close()'?
>>> 
>>> Here is the exception at a lower level:
>>> 
>>> Caused by: java.sql.SQLException: Data source rejected establishment of
>>> connection message from server: "Too many connections"
>>>   at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
>>>   at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1906)
>>>   at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:2520)
>>>   at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:817)
>>>   at com.mysql.jdbc.Connection.createNewIO(Connection.java:1782)
>>>   at com.mysql.jdbc.Connection.<init>(Connection.java:450)
>>>   at
>>> 
>> com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:411)
>>>   at
>>> 
>> org.apache.cayenne.conn.DriverDataSource.getConnection(DriverDataSource.java:156)
>>>   at
>>> 
>> org.apache.cayenne.conn.PooledConnectionImpl.reconnect(PooledConnectionImpl.java:83)
>>>   at
>>> 
>> org.apache.cayenne.conn.PooledConnectionImpl.getConnection(PooledConnectionImpl.java:120)
>>>   at
>>> 
>> org.apache.cayenne.conn.PoolManager.uncheckConnection(PoolManager.java:369)
>>>   at
>>> org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:353)
>>>   at
>>> org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:330)
>>>   at
>>> 
>> org.apache.cayenne.access.DataNode$TransactionDataSource.getConnection(DataNode.java:364)
>>>   at
>>> 
>> org.apache.cayenne.conf.NodeDataSource.getConnection(NodeDataSource.java:46)
>>>   at
>> org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:170)
>>> 
>>> Thanks ~ Chris Murphy
>>> 
>>> On 11 December 2011 17:57, Robert Zeigler <robert.zeigler@roxanemy.com
>>> wrote:
>>> 
>>>> Note that DataContext creation is cheap. What I typically do in
>> situations
>>>> like this is to periodically create a new data context that you can
>> throw
>>>> away. When it's gone, the associated objects will be gc'ed. Eg: you
>> could
>>>> periodically dump the "reading" data context and create a fresh one
>> after
>>>> every 1,000 Reading objects or whatever makes sense for your use-case.
>> If
>>>> the frequency of readings transactions is low, it probably makes sense
>> to
>>>> create a new DataContext, import whatever objects you need into it (via
>>>> localObject), create your readings, commit, then discard the
>> DataContext.
>>>> If the readings are very frequent, then it makes sense to have a
>> dedicated
>>>> "readings" DataContext that you can periodically swap out.
>>>> 
>>>> HTH,
>>>> 
>>>> Robert
>>>> 
>>>> On Dec 11, 2011, at 12/1112:51 AM , Chris Murphy (www.strandz.org)
>> wrote:
>>>> 
>>>>> I have a server application that continually reads data from sensors.
>> At
>>>>> set intervals the data is summarized. This summary data is used to
>> create
>>>>> Cayenne data objects of type Reading. A short transaction commits these
>>>>> Reading objects to the database, after which it is not important that
>>>> they
>>>>> are held in memory - they were created only to be stored. After a long
>>>>> period of time their continual collection results in an 'OutOfMemory'
>> JVM
>>>>> condition.
>>>>> 
>>>>> There are many objects of type Reading for another Cayenne data object
>>>>> called SubstancePoint. And there's a whole object graph going back from
>>>>> there. I basically want to keep the whole of the object graph in
>> memory,
>>>>> except for these Reading objects.
>>>>> 
>>>>> Is there a way to 'disappear' data objects from the DataContext? For
>> that
>>>>> is what I think would solve my problem. I have tried calling
>>>>> DataContext.unregisterObjects() post-commit on the Reading data
>> objects I
>>>>> want evacuated from memory, but I can see that the leak is still going
>>>> on.
>>>>> 
>>>>> Thank you ~ Chris Murphy
>>>> 
>>>> 
>> 
>> 


Re: Continual creation is a memory leak

Posted by "Chris Murphy (www.strandz.org)" <ch...@strandz.org>.
I just had a look at the Modeler and at the DomainNode's JDBC Configuration
(in the Main tab). I saw that 'Max Connections' was set to '1'. So perhaps
I wasn't giving Cayenne much to play with in terms of connection pooling! I
have now set it to '2'. (I imagine the default was a bit higher than these
numbers).

In the program at the moment there ought to be at most two DataContexts.
And one connection each is enough. So two should suffice until I go for the
recommended approach with multiple users of having a DataContext per user.
In that case I imagine 'Max Connections' will be related to the expected
maximum number of concurrent users. But multi-user configuration is for
another day...

So far this server program hasn't fallen over... And I don't think there's
any reason why I would need more than 2 connections, is there? I like to
keep everything as tight as possible - 3 connections would actually be an
error that I would like to know about.

Thank you ~ Chris Murphy

On 12 December 2011 16:15, Robert Zeigler <ro...@roxanemy.com> wrote:

> Hm. Cayenne is usually pretty good about managing the connection pooling
> for you.
> How many concurrent connections is your mysql db configured to allow?
>
> Robert
>
> On Dec 11, 2011, at 12/119:58 PM , Chris Murphy (www.strandz.org) wrote:
>
> > Thanks Robert,
> >
> > That was the answer. The memory leak disappeared. I now create a
> > DataContext with 'method scope' that gets gc-ed when the method has
> > finished. That brought up what must be another problem with my code which
> > I'm investigating now:
> >
> > Caused by: org.apache.cayenne.CayenneRuntimeException: [v.3.0 Apr 26 2010
> > 09:59:17] Error detecting database type: Data source rejected
> establishment
> > of connection message from server: "Too many connections"
> >    at
> org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:185)
> >    at org.apache.cayenne.dba.AutoAdapter.getAdapter(AutoAdapter.java:155)
> >    at
> > org.apache.cayenne.dba.AutoAdapter.getExtendedTypes(AutoAdapter.java:263)
> >    at
> org.apache.cayenne.access.DataNode.performQueries(DataNode.java:243)
> >    at
> >
> org.apache.cayenne.access.DataDomainQueryAction.runQuery(DataDomainQueryAction.java:422)
> >    at
> >
> org.apache.cayenne.access.DataDomainQueryAction.access$000(DataDomainQueryAction.java:69)
> >    at
> >
> org.apache.cayenne.access.DataDomainQueryAction$2.transform(DataDomainQueryAction.java:395)
> >    at
> >
> org.apache.cayenne.access.DataDomain.runInTransaction(DataDomain.java:850)
> >    at
> >
> org.apache.cayenne.access.DataDomainQueryAction.runQueryInTransaction(DataDomainQueryAction.java:392)
> >    at
> >
> org.apache.cayenne.access.DataDomainQueryAction.execute(DataDomainQueryAction.java:121)
> >    at org.apache.cayenne.access.DataDomain.onQuery(DataDomain.java:743)
> >    at
> >
> org.apache.cayenne.util.ObjectContextQueryAction.runQuery(ObjectContextQueryAction.java:333)
> >    at
> >
> org.apache.cayenne.util.ObjectContextQueryAction.execute(ObjectContextQueryAction.java:96)
> >    at
> org.apache.cayenne.access.DataContext.onQuery(DataContext.java:1278)
> >    at
> > org.apache.cayenne.access.DataContext.performQuery(DataContext.java:1267)
> >    at
> >
> com.seasoft.store.orm.CayenneInterpretedQuery.execute(CayenneInterpretedQuery.java:265)
> >    at com.seasoft.store.DomainQuery.execute(DomainQuery.java:43)
> >    at
> >
> com.seasoft.store.DomainQueries.executeRetCollection(DomainQueries.java:56)
> >    at
> >
> com.cmts.business.orm.DOActivity.storeRealtimeReadings(DOActivity.java:113)
> >    at
> >
> com.cmts.business.SenderGasReceiver.storeRealtimeSamplesIntoSummaryReadings(SenderGasReceiver.java:266)
> >    at com.cmts.business.StoreRealtimeIntoDBJob.execut
> > e(StoreRealtimeIntoDBJob.java:20)
> >    at org.quartz.core.JobRunShell.run(JobRunShell.java:216)
> >
> > Maybe I need to recycle connections or something?? In the Cayenne
> > documentation I found this: "Cayenne will also periodically close unused
> > database connections if it determines there are too many that are open
> and
> > idle." Is there some way to hint to Cayenne that you've finished with a
> > DataContext's connection pool, something like 'dataContext.close()'?
> >
> > Here is the exception at a lower level:
> >
> > Caused by: java.sql.SQLException: Data source rejected establishment of
> > connection message from server: "Too many connections"
> >    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
> >    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1906)
> >    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:2520)
> >    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:817)
> >    at com.mysql.jdbc.Connection.createNewIO(Connection.java:1782)
> >    at com.mysql.jdbc.Connection.<init>(Connection.java:450)
> >    at
> >
> com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:411)
> >    at
> >
> org.apache.cayenne.conn.DriverDataSource.getConnection(DriverDataSource.java:156)
> >    at
> >
> org.apache.cayenne.conn.PooledConnectionImpl.reconnect(PooledConnectionImpl.java:83)
> >    at
> >
> org.apache.cayenne.conn.PooledConnectionImpl.getConnection(PooledConnectionImpl.java:120)
> >    at
> >
> org.apache.cayenne.conn.PoolManager.uncheckConnection(PoolManager.java:369)
> >    at
> > org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:353)
> >    at
> > org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:330)
> >    at
> >
> org.apache.cayenne.access.DataNode$TransactionDataSource.getConnection(DataNode.java:364)
> >    at
> >
> org.apache.cayenne.conf.NodeDataSource.getConnection(NodeDataSource.java:46)
> >    at
> org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:170)
> >
> > Thanks ~ Chris Murphy
> >
> > On 11 December 2011 17:57, Robert Zeigler <robert.zeigler@roxanemy.com
> >wrote:
> >
> >> Note that DataContext creation is cheap. What I typically do in
> situations
> >> like this is to periodically create a new data context that you can
> throw
> >> away. When it's gone, the associated objects will be gc'ed. Eg: you
> could
> >> periodically dump the "reading" data context and create a fresh one
> after
> >> every 1,000 Reading objects or whatever makes sense for your use-case.
> If
> >> the frequency of readings transactions is low, it probably makes sense
> to
> >> create a new DataContext, import whatever objects you need into it (via
> >> localObject), create your readings, commit, then discard the
> DataContext.
> >> If the readings are very frequent, then it makes sense to have a
> dedicated
> >> "readings" DataContext that you can periodically swap out.
> >>
> >> HTH,
> >>
> >> Robert
> >>
> >> On Dec 11, 2011, at 12/1112:51 AM , Chris Murphy (www.strandz.org)
> wrote:
> >>
> >>> I have a server application that continually reads data from sensors.
> At
> >>> set intervals the data is summarized. This summary data is used to
> create
> >>> Cayenne data objects of type Reading. A short transaction commits these
> >>> Reading objects to the database, after which it is not important that
> >> they
> >>> are held in memory - they were created only to be stored. After a long
> >>> period of time their continual collection results in an 'OutOfMemory'
> JVM
> >>> condition.
> >>>
> >>> There are many objects of type Reading for another Cayenne data object
> >>> called SubstancePoint. And there's a whole object graph going back from
> >>> there. I basically want to keep the whole of the object graph in
> memory,
> >>> except for these Reading objects.
> >>>
> >>> Is there a way to 'disappear' data objects from the DataContext? For
> that
> >>> is what I think would solve my problem. I have tried calling
> >>> DataContext.unregisterObjects() post-commit on the Reading data
> objects I
> >>> want evacuated from memory, but I can see that the leak is still going
> >> on.
> >>>
> >>> Thank you ~ Chris Murphy
> >>
> >>
>
>

Re: Continual creation is a memory leak

Posted by Robert Zeigler <ro...@roxanemy.com>.
Hm. Cayenne is usually pretty good about managing the connection pooling for you.
How many concurrent connections is your mysql db configured to allow?

Robert

On Dec 11, 2011, at 9:58 PM, Chris Murphy (www.strandz.org) wrote:

> Thanks Robert,
> 
> That was the answer. The memory leak disappeared. I now create a
> DataContext with 'method scope' that gets gc-ed when the method has
> finished. That brought up what must be another problem with my code which
> I'm investigating now:
> 
> Caused by: org.apache.cayenne.CayenneRuntimeException: [v.3.0 Apr 26 2010
> 09:59:17] Error detecting database type: Data source rejected establishment
> of connection message from server: "Too many connections"
>    at org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:185)
>    at org.apache.cayenne.dba.AutoAdapter.getAdapter(AutoAdapter.java:155)
>    at
> org.apache.cayenne.dba.AutoAdapter.getExtendedTypes(AutoAdapter.java:263)
>    at org.apache.cayenne.access.DataNode.performQueries(DataNode.java:243)
>    at
> org.apache.cayenne.access.DataDomainQueryAction.runQuery(DataDomainQueryAction.java:422)
>    at
> org.apache.cayenne.access.DataDomainQueryAction.access$000(DataDomainQueryAction.java:69)
>    at
> org.apache.cayenne.access.DataDomainQueryAction$2.transform(DataDomainQueryAction.java:395)
>    at
> org.apache.cayenne.access.DataDomain.runInTransaction(DataDomain.java:850)
>    at
> org.apache.cayenne.access.DataDomainQueryAction.runQueryInTransaction(DataDomainQueryAction.java:392)
>    at
> org.apache.cayenne.access.DataDomainQueryAction.execute(DataDomainQueryAction.java:121)
>    at org.apache.cayenne.access.DataDomain.onQuery(DataDomain.java:743)
>    at
> org.apache.cayenne.util.ObjectContextQueryAction.runQuery(ObjectContextQueryAction.java:333)
>    at
> org.apache.cayenne.util.ObjectContextQueryAction.execute(ObjectContextQueryAction.java:96)
>    at org.apache.cayenne.access.DataContext.onQuery(DataContext.java:1278)
>    at
> org.apache.cayenne.access.DataContext.performQuery(DataContext.java:1267)
>    at
> com.seasoft.store.orm.CayenneInterpretedQuery.execute(CayenneInterpretedQuery.java:265)
>    at com.seasoft.store.DomainQuery.execute(DomainQuery.java:43)
>    at
> com.seasoft.store.DomainQueries.executeRetCollection(DomainQueries.java:56)
>    at
> com.cmts.business.orm.DOActivity.storeRealtimeReadings(DOActivity.java:113)
>    at
> com.cmts.business.SenderGasReceiver.storeRealtimeSamplesIntoSummaryReadings(SenderGasReceiver.java:266)
>    at com.cmts.business.StoreRealtimeIntoDBJob.execute(StoreRealtimeIntoDBJob.java:20)
>    at org.quartz.core.JobRunShell.run(JobRunShell.java:216)
> 
> Maybe I need to recycle connections or something?? In the Cayenne
> documentation I found this: "Cayenne will also periodically close unused
> database connections if it determines there are too many that are open and
> idle." Is there some way to hint to Cayenne that you've finished with a
> DataContext's connection pool, something like 'dataContext.close()'?
> 
> Here is the exception at a lower level:
> 
> Caused by: java.sql.SQLException: Data source rejected establishment of
> connection message from server: "Too many connections"
>    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
>    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1906)
>    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:2520)
>    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:817)
>    at com.mysql.jdbc.Connection.createNewIO(Connection.java:1782)
>    at com.mysql.jdbc.Connection.<init>(Connection.java:450)
>    at
> com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:411)
>    at
> org.apache.cayenne.conn.DriverDataSource.getConnection(DriverDataSource.java:156)
>    at
> org.apache.cayenne.conn.PooledConnectionImpl.reconnect(PooledConnectionImpl.java:83)
>    at
> org.apache.cayenne.conn.PooledConnectionImpl.getConnection(PooledConnectionImpl.java:120)
>    at
> org.apache.cayenne.conn.PoolManager.uncheckConnection(PoolManager.java:369)
>    at
> org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:353)
>    at
> org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:330)
>    at
> org.apache.cayenne.access.DataNode$TransactionDataSource.getConnection(DataNode.java:364)
>    at
> org.apache.cayenne.conf.NodeDataSource.getConnection(NodeDataSource.java:46)
>    at org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:170)
> 
> Thanks ~ Chris Murphy
> 
> On 11 December 2011 17:57, Robert Zeigler <ro...@roxanemy.com>wrote:
> 
>> Note that DataContext creation is cheap. What I typically do in situations
>> like this is to periodically create a new data context that you can throw
>> away. When it's gone, the associated objects will be gc'ed. Eg: you could
>> periodically dump the "reading" data context and create a fresh one after
>> every 1,000 Reading objects or whatever makes sense for your use-case. If
>> the frequency of readings transactions is low, it probably makes sense to
>> create a new DataContext, import whatever objects you need into it (via
>> localObject), create your readings, commit, then discard the DataContext.
>> If the readings are very frequent, then it makes sense to have a dedicated
>> "readings" DataContext that you can periodically swap out.
>> 
>> HTH,
>> 
>> Robert
>> 
>> On Dec 11, 2011, at 12/1112:51 AM , Chris Murphy (www.strandz.org) wrote:
>> 
>>> I have a server application that continually reads data from sensors. At
>>> set intervals the data is summarized. This summary data is used to create
>>> Cayenne data objects of type Reading. A short transaction commits these
>>> Reading objects to the database, after which it is not important that
>> they
>>> are held in memory - they were created only to be stored. After a long
>>> period of time their continual collection results in an 'OutOfMemory' JVM
>>> condition.
>>> 
>>> There are many objects of type Reading for another Cayenne data object
>>> called SubstancePoint. And there's a whole object graph going back from
>>> there. I basically want to keep the whole of the object graph in memory,
>>> except for these Reading objects.
>>> 
>>> Is there a way to 'disappear' data objects from the DataContext? For that
>>> is what I think would solve my problem. I have tried calling
>>> DataContext.unregisterObjects() post-commit on the Reading data objects I
>>> want evacuated from memory, but I can see that the leak is still going
>> on.
>>> 
>>> Thank you ~ Chris Murphy
>> 
>> 


Re: Continual creation is a memory leak

Posted by "Chris Murphy (www.strandz.org)" <ch...@strandz.org>.
Thanks Robert,

That was the answer. The memory leak disappeared. I now create a
DataContext with 'method scope' that gets gc-ed when the method has
finished. That brought up what must be another problem with my code which
I'm investigating now:

Caused by: org.apache.cayenne.CayenneRuntimeException: [v.3.0 Apr 26 2010
09:59:17] Error detecting database type: Data source rejected establishment
of connection message from server: "Too many connections"
    at org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:185)
    at org.apache.cayenne.dba.AutoAdapter.getAdapter(AutoAdapter.java:155)
    at
org.apache.cayenne.dba.AutoAdapter.getExtendedTypes(AutoAdapter.java:263)
    at org.apache.cayenne.access.DataNode.performQueries(DataNode.java:243)
    at
org.apache.cayenne.access.DataDomainQueryAction.runQuery(DataDomainQueryAction.java:422)
    at
org.apache.cayenne.access.DataDomainQueryAction.access$000(DataDomainQueryAction.java:69)
    at
org.apache.cayenne.access.DataDomainQueryAction$2.transform(DataDomainQueryAction.java:395)
    at
org.apache.cayenne.access.DataDomain.runInTransaction(DataDomain.java:850)
    at
org.apache.cayenne.access.DataDomainQueryAction.runQueryInTransaction(DataDomainQueryAction.java:392)
    at
org.apache.cayenne.access.DataDomainQueryAction.execute(DataDomainQueryAction.java:121)
    at org.apache.cayenne.access.DataDomain.onQuery(DataDomain.java:743)
    at
org.apache.cayenne.util.ObjectContextQueryAction.runQuery(ObjectContextQueryAction.java:333)
    at
org.apache.cayenne.util.ObjectContextQueryAction.execute(ObjectContextQueryAction.java:96)
    at org.apache.cayenne.access.DataContext.onQuery(DataContext.java:1278)
    at
org.apache.cayenne.access.DataContext.performQuery(DataContext.java:1267)
    at
com.seasoft.store.orm.CayenneInterpretedQuery.execute(CayenneInterpretedQuery.java:265)
    at com.seasoft.store.DomainQuery.execute(DomainQuery.java:43)
    at
com.seasoft.store.DomainQueries.executeRetCollection(DomainQueries.java:56)
    at
com.cmts.business.orm.DOActivity.storeRealtimeReadings(DOActivity.java:113)
    at
com.cmts.business.SenderGasReceiver.storeRealtimeSamplesIntoSummaryReadings(SenderGasReceiver.java:266)
    at com.cmts.business.StoreRealtimeIntoDBJob.execute(StoreRealtimeIntoDBJob.java:20)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:216)

Maybe I need to recycle connections or something?? In the Cayenne
documentation I found this: "Cayenne will also periodically close unused
database connections if it determines there are too many that are open and
idle." Is there some way to hint to Cayenne that you've finished with a
DataContext's connection pool, something like 'dataContext.close()'?

Here is the exception at a lower level:

Caused by: java.sql.SQLException: Data source rejected establishment of
connection message from server: "Too many connections"
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1906)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:2520)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:817)
    at com.mysql.jdbc.Connection.createNewIO(Connection.java:1782)
    at com.mysql.jdbc.Connection.<init>(Connection.java:450)
    at
com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:411)
    at
org.apache.cayenne.conn.DriverDataSource.getConnection(DriverDataSource.java:156)
    at
org.apache.cayenne.conn.PooledConnectionImpl.reconnect(PooledConnectionImpl.java:83)
    at
org.apache.cayenne.conn.PooledConnectionImpl.getConnection(PooledConnectionImpl.java:120)
    at
org.apache.cayenne.conn.PoolManager.uncheckConnection(PoolManager.java:369)
    at
org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:353)
    at
org.apache.cayenne.conn.PoolManager.getConnection(PoolManager.java:330)
    at
org.apache.cayenne.access.DataNode$TransactionDataSource.getConnection(DataNode.java:364)
    at
org.apache.cayenne.conf.NodeDataSource.getConnection(NodeDataSource.java:46)
    at org.apache.cayenne.dba.AutoAdapter.loadAdapter(AutoAdapter.java:170)

Thanks ~ Chris Murphy

On 11 December 2011 17:57, Robert Zeigler <ro...@roxanemy.com> wrote:

> Note that DataContext creation is cheap. What I typically do in situations
> like this is to periodically create a new data context that you can throw
> away. When it's gone, the associated objects will be gc'ed. Eg: you could
> periodically dump the "reading" data context and create a fresh one after
> every 1,000 Reading objects or whatever makes sense for your use-case. If
> the frequency of readings transactions is low, it probably makes sense to
> create a new DataContext, import whatever objects you need into it (via
> localObject), create your readings, commit, then discard the DataContext.
>  If the readings are very frequent, then it makes sense to have a dedicated
> "readings" DataContext that you can periodically swap out.
>
> HTH,
>
> Robert
>
> On Dec 11, 2011, at 12/1112:51 AM , Chris Murphy (www.strandz.org) wrote:
>
> > I have a server application that continually reads data from sensors. At
> > set intervals the data is summarized. This summary data is used to create
> > Cayenne data objects of type Reading. A short transaction commits these
> > Reading objects to the database, after which it is not important that
> they
> > are held in memory - they were created only to be stored. After a long
> > period of time their continual collection results in an 'OutOfMemory' JVM
> > condition.
> >
> > There are many objects of type Reading for another Cayenne data object
> > called SubstancePoint. And there's a whole object graph going back from
> > there. I basically want to keep the whole of the object graph in memory,
> > except for these Reading objects.
> >
> > Is there a way to 'disappear' data objects from the DataContext? For that
> > is what I think would solve my problem. I have tried calling
> > DataContext.unregisterObjects() post-commit on the Reading data objects I
> > want evacuated from memory, but I can see that the leak is still going
> on.
> >
> > Thank you ~ Chris Murphy
>
>

Re: Continual creation is a memory leak

Posted by Robert Zeigler <ro...@roxanemy.com>.
Note that DataContext creation is cheap. What I typically do in situations like this is to periodically create a new data context that you can throw away. When it's gone, the associated objects will be gc'ed. Eg: you could periodically dump the "reading" data context and create a fresh one after every 1,000 Reading objects or whatever makes sense for your use-case. If the frequency of readings transactions is low, it probably makes sense to create a new DataContext, import whatever objects you need into it (via localObject), create your readings, commit, then discard the DataContext.  If the readings are very frequent, then it makes sense to have a dedicated "readings" DataContext that you can periodically swap out.
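
Roughly, the first variant looks like this - a sketch only, with illustrative
entity names and setters, assuming the Cayenne 3.0-style
localObject(ObjectId, Object) API:

import java.util.List;
import org.apache.cayenne.access.DataContext;

public class ReadingWriter {

    // Create a short-lived context per batch, import the long-lived
    // SubstancePoint into it, commit the new Reading objects, then let the
    // whole context (and its readings) become garbage.
    public void storeBatch(SubstancePoint point, List<Double> summaries) {
        DataContext context = DataContext.createDataContext();

        SubstancePoint localPoint =
                (SubstancePoint) context.localObject(point.getObjectId(), null);

        for (Double value : summaries) {
            Reading reading = context.newObject(Reading.class);
            reading.setValue(value);              // illustrative setter
            reading.setSubstancePoint(localPoint);
        }

        context.commitChanges();
        // No reference to 'context' is kept after this method returns.
    }
}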

HTH,

Robert

On Dec 11, 2011, at 12:51 AM, Chris Murphy (www.strandz.org) wrote:

> I have a server application that continually reads data from sensors. At
> set intervals the data is summarized. This summary data is used to create
> Cayenne data objects of type Reading. A short transaction commits these
> Reading objects to the database, after which it is not important that they
> are held in memory - they were created only to be stored. After a long
> period of time their continual collection results in an 'OutOfMemory' JVM
> condition.
> 
> There are many objects of type Reading for another Cayenne data object
> called SubstancePoint. And there's a whole object graph going back from
> there. I basically want to keep the whole of the object graph in memory,
> except for these Reading objects.
> 
> Is there a way to 'disappear' data objects from the DataContext? For that
> is what I think would solve my problem. I have tried calling
> DataContext.unregisterObjects() post-commit on the Reading data objects I
> want evacuated from memory, but I can see that the leak is still going on.
> 
> Thank you ~ Chris Murphy