Posted to dev@cayenne.apache.org by "Paul Ponec (JIRA)" <ji...@apache.org> on 2006/08/03 12:04:48 UTC

[JIRA] Created: (CAY-620) DataObject memory management

DataObject memory management
----------------------------

         Key: CAY-620
         URL: http://issues.apache.org/cayenne/browse/CAY-620
     Project: Cayenne
        Type: Bug

  Components: Cayenne Core Library  
    Versions: 2.0 [STABLE]    
 Environment: Oracle Application server was launched with -Xmx30m.
Oracle DB version: Oracle Database 10g Enterprise Edition Release 10.1.0.4.2 - Production With the Partitioning, OLAP and Data Mining options
OS: Windows XP or Linux Fedora
    Reporter: Paul Ponec
    Priority: Minor


Hi, 

I am reusing the same DataContext for new DataObject instances after calling context.commitChanges().
In this case my application throws an OutOfMemoryError after approximately 30,000 inserts on the Oracle DB,
although I expected all resources to be released after the commit.

The commit is performed for each new data object in this plain code:
public void run() {
    int MAX_LIMIT = 200 * 1000;
    DataContext context = DataContext.createDataContext();

    for (int i = 0; i <= MAX_LIMIT; i++) {
        User user = (User) context.createAndRegisterNewObject(User.class);
        user.setAttrib1("1");
        user.setAttrib2("2");
        user.setAttrib3("3");
        context.commitChanges();
    }
}

Notes:
- No stack trace is available.
- The same code appears to work correctly against a Derby DB.

I have found a workaround for the problem: create a new DataContext after the commit, for example:
context = DataContext.createDataContext();
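
For illustration, a minimal sketch of this workaround applied to the loop above (the User entity and its setters are the ones from my example; recreating the context every 1000 commits is an arbitrary choice):

public void run() {
    int MAX_LIMIT = 200 * 1000;
    DataContext context = DataContext.createDataContext();

    for (int i = 0; i <= MAX_LIMIT; i++) {
        User user = (User) context.createAndRegisterNewObject(User.class);
        user.setAttrib1("1");
        user.setAttrib2("2");
        user.setAttrib3("3");
        context.commitChanges();

        // Replace the context periodically so the previous one, together with
        // all the objects it still holds, becomes eligible for garbage collection.
        if (i % 1000 == 0) {
            context = DataContext.createDataContext();
        }
    }
}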


Regards
Paul


-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
   http://issues.apache.org/cayenne/secure/Administrators.jspa
-
For more information on JIRA, see:
   http://www.atlassian.com/software/jira


RE: [JIRA] Created: (CAY-620) DataObject memory management

Posted by "Gentry, Michael (Contractor)" <mi...@fanniemae.com>.
What, you expected me to go look at Jira?  :-)


-----Original Message-----
From: Mike Kienenberger [mailto:mkienenb@gmail.com] 
Sent: Thursday, August 03, 2006 9:46 AM
To: cayenne-dev@incubator.apache.org
Subject: Re: [JIRA] Created: (CAY-620) DataObject memory management


It's already answered and closed with a note to that effect :)

On 8/3/06, Gentry, Michael (Contractor) <mi...@fanniemae.com>
wrote:
> I don't consider this a bug.  Doing a commitChanges() does not discard
> all objects from the DataContext after the commit -- after all, you
> might need them again.  There are methods to discard (invalidate)
> objects from the DataContext if you wish, or you can create a new
> DataContext and allow the previous one to be garbage collected.
>
> /dev/mrg

Re: [JIRA] Created: (CAY-620) DataObject memory management

Posted by Mike Kienenberger <mk...@gmail.com>.
It's already answered and closed with a note to that effect :)

On 8/3/06, Gentry, Michael (Contractor) <mi...@fanniemae.com> wrote:
> I don't consider this a bug.  Doing a commitChanges() does not discard
> all objects from the DataContext after the commit -- after all, you
> might need them again.  There are methods to discard (invalidate)
> objects from the DataContext if you wish, or you can create a new
> DataContext and allow the previous one to be garbage collected.
>
> /dev/mrg

RE: [JIRA] Created: (CAY-620) DataObject memory management

Posted by "Gentry, Michael (Contractor)" <mi...@fanniemae.com>.
I don't consider this a bug.  Doing a commitChanges() does not discard
all objects from the DataContext after the commit -- after all, you
might need them again.  There are methods to discard (invalidate)
objects from the DataContext if you wish, or you can create a new
DataContext and allow the previous one to be garbage collected.
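
For illustration, a rough sketch of the first option against the reporter's loop (this assumes the unregisterObjects(Collection) method on DataContext from the 1.2/2.0 API; the User entity is the one from the report):

// Sketch only: discard each object after its commit so the context's
// object store does not keep growing across iterations.
for (int i = 0; i <= MAX_LIMIT; i++) {
    User user = (User) context.createAndRegisterNewObject(User.class);
    user.setAttrib1("1");
    user.setAttrib2("2");
    user.setAttrib3("3");
    context.commitChanges();

    // Option 1: unregister the committed object from the DataContext.
    context.unregisterObjects(java.util.Collections.singletonList(user));

    // Option 2 (alternative): replace the context entirely and let the
    // old one be garbage collected.
    // context = DataContext.createDataContext();
}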

/dev/mrg


