Posted to derby-dev@db.apache.org by "Ibrahim (JIRA)" <de...@db.apache.org> on 2006/08/17 12:25:13 UTC

[jira] Created: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0

Memory does not return to the system after shutting down Derby 10.2.1.0
------------------------------------------------------------------------

                 Key: DERBY-1713
                 URL: http://issues.apache.org/jira/browse/DERBY-1713
             Project: Derby
          Issue Type: Bug
          Components: Performance
    Affects Versions: 10.2.0.0
         Environment: Windows XP SP2
JRE 1.6 beta2
            Reporter: Ibrahim
            Priority: Critical


I face a problem when querying large tables. When I run the SQL below, it gets stuck in the query and throws a Java heap OutOfMemory exception:

SELECT count(*) FROM <table> WHERE .....

N.B. I'm using a database of more than 90,000 records (40 MB). I set the max heap to 32 MB (all other settings, such as the page cache, have their default values).

Then I shut down the database, but the memory is not returned to the system (it stays at 32 MB, the max threshold). When I increase the max heap to 128 MB it works and releases the memory, so I think the problem is that once the heap limit is reached, nothing seems to respond any more, including closing the connection or shutting down the database. How can I get rid of this? I cannot keep increasing the max heap as the database grows; I would rather get an exception and have the memory released.

I'm using this to shut down the DB:

try{DriverManager.getConnection("jdbc:derby:;shutdown=true");}
catch(SQLException ex){System.err.println("SQLException: " + ex.getMessage());}
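
A fuller version of that shutdown call might look like the sketch below. The SQLState check for "XJ015" (the state Derby documents for a successful full-engine shutdown, matching the "Derby system shutdown" message mentioned later in this thread) is added here only for illustration and is not part of the original snippet:

import java.sql.DriverManager;
import java.sql.SQLException;

public class ShutdownDerby {
    public static void main(String[] args) {
        try {
            // Shuts down the whole embedded Derby engine, not just one database.
            DriverManager.getConnection("jdbc:derby:;shutdown=true");
        } catch (SQLException ex) {
            // A successful engine shutdown is always reported as an SQLException.
            if ("XJ015".equals(ex.getSQLState())) {
                System.out.println("Derby shut down normally: " + ex.getMessage());
            } else {
                System.err.println("SQLException: " + ex.getMessage());
            }
        }
    }
}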

I'm using a memory profiler to monitor memory usage.

Thanks in advance.


[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "John H. Embretsen (JIRA)" <de...@db.apache.org>.
     [ http://issues.apache.org/jira/browse/DERBY-1713?page=all ]

John H. Embretsen updated DERBY-1713:
-------------------------------------

    Attachment: Derby1713repro.java

I have not been able to try Ibrahim's Test1.java, since I am not able to decompress the database it needs (my version of bzip2 tells me test.zip is not a valid bzip2 archive, and I'm not using Windows at the moment).

Instead, I have created a fully reproducible test case which does not require having a database in advance. See attachment 'Derby1713repro.java'. To run it, put derby.jar on your classpath and execute:

$ java Derby1713repro

If you already have a DB to test with, you can run with the "haveDB" argument:

$ java Derby1713repro haveDB

The test case inserts 40k rows into a table (same DDL as Ibrahim described in previous comments), which results in a 30 MB database (unless "haveDB" is specified, in which case the test case only performs a query against an already existing database).
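
For readers without the attachment, a rough sketch of what such a repro might do is shown below. It is a guess based on the DDL and the ORDER BY query quoted elsewhere in this thread, not the actual contents of Derby1713repro.java; the class name, row count and filler values are made up:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class Derby1713Sketch {
    public static void main(String[] args) throws Exception {
        boolean haveDB = args.length > 0 && "haveDB".equals(args[0]);
        // Embedded driver; explicit loading is needed on pre-JDBC-4 JVMs.
        Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
        Connection conn = DriverManager.getConnection("jdbc:derby:testdb;create=true");

        if (!haveDB) {
            Statement ddl = conn.createStatement();
            // DDL as quoted by the reporter elsewhere in this thread.
            ddl.executeUpdate("CREATE TABLE Contents(Code INTEGER, sheekhid INTEGER, "
                    + "Book VARCHAR(50), Seq INTEGER, Line LONG VARCHAR, Offset VARCHAR(10))");
            ddl.close();

            // Build a reasonably wide 'Line' value so the table grows to tens of MB.
            StringBuffer line = new StringBuffer();
            for (int i = 0; i < 200; i++) {
                line.append("some repeated filler text ");
            }

            conn.setAutoCommit(false);
            PreparedStatement ins = conn.prepareStatement(
                    "INSERT INTO Contents VALUES (?, ?, ?, ?, ?, ?)");
            for (int i = 0; i < 40000; i++) {
                ins.setInt(1, i % 100);                 // Code
                ins.setInt(2, i % 10);                  // sheekhid
                ins.setString(3, "Book " + (i % 50));   // Book
                ins.setInt(4, i);                       // Seq
                ins.setString(5, line.toString());      // Line
                ins.setString(6, String.valueOf(i));    // Offset
                ins.executeUpdate();
            }
            conn.commit();
            ins.close();
            conn.setAutoCommit(true);
        }

        // The kind of query that triggers the OutOfMemoryError with -Xmx32M.
        Statement s = conn.createStatement();
        ResultSet rs = s.executeQuery("SELECT * FROM Contents ORDER BY Code, Seq");
        int rows = 0;
        while (rs.next()) {
            rows++;
        }
        System.out.println("Fetched " + rows + " rows");
        rs.close();
        s.close();
        conn.close();
    }
}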

I think one important thing to do is to eliminate as many variables as possible when trying to reproduce or debug a problem. With this repro, I do get an OutOfMemoryError (OOME) if the max heap size is 32 MB (-Xmx32M). This happens without using an inner class (as is done in Test1.java), and it even happens without executing a 'SELECT count(*)' statement. I have also tried commenting out the setFetchSize(60000) call, and the OOME still happens (Sun's JVM version 1.5.0_06).

The problem seems to be the huge SELECT ... FROM ... WHERE statement. At start-up (when using the "haveDB" option), jstat reports a tenured space usage of 512 KB. When executing the query, tenured space usage increases to as much as 30272 KB, which is almost the whole heap. In addition, other parts of the heap need some space, hence we get an OutOfMemoryError. Feel free to use this repro as a basis for creating a minimal test showing the problem.
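
The comment does not say which jstat options were used; one plausible way to watch the tenured (old) generation while the repro runs, with the OC and OU columns reported in KB, would be:

$ jstat -gcold <pid> 1000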

I think Mike is right when he says that 32 MB max heap is not suitable ("safe") for this kind of database and this kind of usage. I therefore recommend lowering the Priority and Urgency of this Jira issue somewhat.

I also recommend changing the summary of this issue once the real problem has been positively identified. The fact that the JVM is not able to free memory following an OutOfMemoryError is, as far as I can tell, not a bug in Derby.




[jira] Commented: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Ibrahim (JIRA)" <de...@db.apache.org>.
    [ http://issues.apache.org/jira/browse/DERBY-1713?page=comments#action_12428932 ] 
            
Ibrahim commented on DERBY-1713:
--------------------------------

Thanks for these quick responses. It seems I now understand the problem: it is the unexpected behavior after the OutOfMemory error.

Sorry, I was not careful about which SQL statement generates the problem. It is:

SELECT * FROM Contents ORDER BY Code, Seq

where Code and Seq are column names. So the problem is with ORDER BY over two columns, since that consumes the memory; if I ORDER BY a single column, the memory consumption is negligible.

[I discovered this when I tried to simplify the program to send it]

F.Y.I

DDL (the table has ~92,000 rows):
CREATE TABLE Contents(Code INTEGER, sheekhid INTEGER, Book VARCHAR(50), Seq INTEGER, Line LONG VARCHAR, Offset VARCHAR(10))

The 'Line' column sometimes spans multiple pages.

Sorry for any inconvenience and thank you.


[jira] Commented: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0

Posted by "John H. Embretsen (JIRA)" <de...@db.apache.org>.
    [ http://issues.apache.org/jira/browse/DERBY-1713?page=comments#action_12428625 ] 
            
John H. Embretsen commented on DERBY-1713:
------------------------------------------

Once you get an OutOfMemoryError, you cannot expect your JVM to function properly. OutOfMemoryError is a subclass of java.lang.VirtualMachineError, whose JavaDoc says:

"Thrown to indicate that the Java Virtual Machine is broken or has run out of resources necessary for it to continue operating."

So the fact that memory is not freed when you try to shut down Derby after getting the OutOfMemoryError is probably not a bug. However, the fact that you do run out of memory when executing a SELECT count(*) statement sounds like a serious bug to me... 

It would probably help if you could submit a test case (small java program) reproducing the error.



[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Ibrahim (JIRA)" <de...@db.apache.org>.
     [ http://issues.apache.org/jira/browse/DERBY-1713?page=all ]

Ibrahim updated DERBY-1713:
---------------------------

    Attachment: test.zip

Database for the test case.


[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Ibrahim (JIRA)" <de...@db.apache.org>.
     [ http://issues.apache.org/jira/browse/DERBY-1713?page=all ]

Ibrahim updated DERBY-1713:
---------------------------

    Attachment: Test1.java

I have simulated the problem exactly. I attached the code file [replace the one in the previous attachment with it].
I found very strange behavior, and it occurs in a very specific situation.
The problem of not freeing the memory happens when calling a class which contains the DB connection code [if you run the statements directly in the calling class, the freeing process works]. The SQL statements also play a role. The statement that causes the problem contains a long WHERE clause that changes dynamically, but this on its own will not cause a problem unless it comes after another statement [SELECT count(*) ...].
In summary, if I make any change to the SQL text, or to where it is executed, the freeing process works perfectly; the attached case is the only one that fails.
Please run the code and check its memory behavior. I'm not sure if it will behave the same way for you as it does for me, and I'm not sure what the problem is.
Please check if this is really a bug.


[jira] Commented: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Kathey Marsden (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/DERBY-1713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12516497 ] 

Kathey Marsden commented on DERBY-1713:
---------------------------------------

Can this be closed as Invalid, since memory is not expected to be returned to the system after an out of memory error, and since the heap was set too small for the query/database in question?

An alternative would be to change it to an improvement and change the description to "Reduce memory usage with query with order by".

Thoughts?

Kathey




[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Mike Matrigali (JIRA)" <de...@db.apache.org>.
     [ http://issues.apache.org/jira/browse/DERBY-1713?page=all ]

Mike Matrigali updated DERBY-1713:
----------------------------------


I admit I don't understand why there is much of a difference at runtime between those two queries. Do you have any indexes on this data? From your DDL, Derby will pick 32K pages for that table, and a SELECT * should fill up the 1000-page cache, leading to more than 32,000,000 bytes used no matter what the ORDER BY clause is. The sort should use at most 1 MB of memory in addition; by default, once a sort grows bigger than that, an external merge sort is used, where external files are used rather than memory. I guess it could be that you are right at the very edge of running out of memory during processing, and that the sort with one extra field uses up slightly more memory.

32 MB is not a safe minimum for a database of this size and a 1000-page buffer cache. If memory is a consideration, I suggest you reduce the page cache size; if not, raise the minimum memory for the JVM.
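
One way to act on that suggestion is to shrink the page cache before the embedded engine boots. The property name (derby.storage.pageCacheSize) is the one used later in this thread; the class name and the value 200 are only examples, not a recommendation from this comment:

import java.sql.Connection;
import java.sql.DriverManager;

public class SmallCacheExample {
    public static void main(String[] args) throws Exception {
        // Hedged sketch: reduce Derby's page cache before the engine is loaded.
        // 200 pages is an arbitrary example value.
        System.setProperty("derby.storage.pageCacheSize", "200");
        Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
        Connection conn = DriverManager.getConnection("jdbc:derby:testdb");
        // ... run the application's queries with a smaller cache footprint ...
        conn.close();
    }
}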

When you say memory use is "negligible", do you mean after you successfully shut down with shutdown=true? I would assume Derby will use at least 32,000,000 bytes during runtime processing of this query (assuming that the db size on disk is 40 MB as you say).


[jira] Resolved: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Kathey Marsden (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/DERBY-1713?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kathey Marsden resolved DERBY-1713.
-----------------------------------

    Resolution: Invalid

It was discussed in this issue that memory is not expected to be returned to the system after an out of memory error, so resolving this as Invalid.



[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "John H. Embretsen (JIRA)" <de...@db.apache.org>.
     [ http://issues.apache.org/jira/browse/DERBY-1713?page=all ]

John H. Embretsen updated DERBY-1713:
-------------------------------------

    Component/s: SQL
        Urgency: Normal  (was: Urgent)
       Priority: Major  (was: Critical)

This issue is seen when executing certain kinds of queries against large tables. Two work-arounds have been identified and tested:

1. Increase max heap size (e.g. from 32 MB to 38 MB, -Xmx38M)
2. Reduce the size of Derby's page cache. With Derby1713repro.java, the following command succeeded:

$ java -Xmx32m -Dderby.storage.pageCacheSize=800 Derby1713repro haveDB

whereas it fails with the default pageCacheSize value (1000). Varying derby.storage.pageSize on its own did not seem to have a noticeable effect on memory usage.
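
The same property can presumably also be set persistently in a derby.properties file in the Derby system directory (derby.system.home) instead of on the command line, for example:

# derby.properties in the derby.system.home directory
derby.storage.pageCacheSize=800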

See Derby's tuning guide for more information about these properties ( http://db.apache.org/derby/docs/dev/tuning/ ).

The reported problem of not being able to free memory following an OutOfMemoryError is not a bug. Quoting the Java Language Specification, 3rd edition, Chapter 11 ( http://java.sun.com/docs/books/jls/third_edition/html/exceptions.html ):

"The class Error and its subclasses are exceptions from which ordinary programs are not ordinarily expected to recover. (...) recovery is typically not possible".

If neither of the reported work-arounds is acceptable, the programmer probably needs to handle such situations preemptively. There may be some useful advice in the Tuning Guide (see http://db.apache.org/derby/docs/dev/tuning/ctundepth10525.html ).
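
For example, a very rough pre-flight check along the lines of Mike's page-cache arithmetic (page cache size times page size) might look like the sketch below; the default values and the factor-of-two headroom are assumptions chosen for illustration only:

public class HeapHeadroomCheck {
    public static void main(String[] args) {
        long pageCacheSize = 1000;            // derby.storage.pageCacheSize default
        long pageSize = 32 * 1024;            // Derby may pick 32K pages for wide rows
        long cacheEstimate = pageCacheSize * pageSize;   // ~32 MB with the defaults

        long maxHeap = Runtime.getRuntime().maxMemory();
        if (maxHeap < 2 * cacheEstimate) {
            // Not enough headroom: shrink the page cache or avoid the large ORDER BY query.
            System.err.println("Max heap (" + maxHeap
                    + " bytes) looks too small for Derby's page cache");
        }
    }
}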

Lowering Priority (Major; work-arounds exist) and Urgency (Normal; "If this issue scratches the itch of any particular developer, then they should help to solve it and provide a patch").



[jira] Commented: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "John H. Embretsen (JIRA)" <de...@db.apache.org>.
    [ http://issues.apache.org/jira/browse/DERBY-1713?page=comments#action_12431902 ] 
            
John H. Embretsen commented on DERBY-1713:
------------------------------------------

After trying the 'Derby1713repro' test case some more, I have come to the preliminary conclusion that it is the "ORDER BY" part of the query, combined with the size of the database, that is causing the problem.

I reduced the huge query to a rather simple one:

SELECT Code FROM Contents ORDER BY Code

and still got an OutOfMemoryError with -Xmx32M, using the same database as before. Removing the "ORDER BY" clause drastically reduces memory usage (tenured space usage ~1800 KB). I also tried the same with a different database (fewer columns), and did not get an OOME.



[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Dag H. Wanvik (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/DERBY-1713?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dag H. Wanvik updated DERBY-1713:
---------------------------------

    Derby Categories: [Performance]



[jira] Updated: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Mike Matrigali (JIRA)" <de...@db.apache.org>.
     [ http://issues.apache.org/jira/browse/DERBY-1713?page=all ]

Mike Matrigali updated DERBY-1713:
----------------------------------

    Summary: Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event  (was: Memory does not return to the system after shutting down Derby 10.2.1.0)

Can you post a reproducible case?

If not, can you describe your db in more detail? For instance, what is the DDL for the table in question? If the table contains variable-length fields, what is the average size of those fields? Are you using anything special to set the page size for the table? What I am trying to figure out is the page size of your table; Derby may set it to 8K or 32K depending on the DDL.

If it is 32K, and the default cache size is 1000 pages, it is very likely 32 MB is not enough memory. Each entry in the cache starts with a copy of the page, and each also has a variable amount of memory associated with it that is linearly related to the number of rows on the page. The first thing I always suggest with these kinds of out-of-memory cases is to bump the page cache size down and see if it reproduces. In the past I have tested with a 40-page cache. The page cache will grow greedily up to whatever size you have it set to (default 1000 pages), and there is no code to remove pages based on memory pressure (I don't think there is reasonable Java support to do so). It should release the memory when the database is successfully shut down with the shutdown=true approach you are using, but as has been commented, once the JVM runs out of memory I have found nothing afterwards is guaranteed to work.

You say you are running with a memory profiler; can you post the top memory-consuming classes and how much memory they are using? That should make it clear what the issue is.
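
If no dedicated profiler is at hand, a class histogram from the JDK's own jmap tool gives similar information; this is a suggestion assuming a Sun JDK 6, not something requested in the comment:

$ jmap -histo <pid>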


[jira] Closed: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Kristian Waagan (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/DERBY-1713?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kristian Waagan closed DERBY-1713.
----------------------------------




[jira] Commented: (DERBY-1713) Memory does not return to the system after shutting down Derby 10.2.1.0, following an out of memory event

Posted by "Ibrahim (JIRA)" <de...@db.apache.org>.
    [ http://issues.apache.org/jira/browse/DERBY-1713?page=comments#action_12429186 ] 
            
Ibrahim commented on DERBY-1713:
--------------------------------

Thank you again.
Yes, I have indexes on Code, Seq and sheekhid.
When I run:
SELECT * FROM Contents ORDER BY Code, Seq
the memory reaches the max. When I run:
SELECT * FROM Contents ORDER BY Code
or
SELECT * FROM Contents ORDER BY Seq
the memory usage is negligible while the query is processed (less than 2 MB).
For completeness I attached the whole db (bzip2 format, so please use WinZip 10) together with a sample program that simulates the case.
I understand that I need to increase the memory, but first I want to track down all the leaks, since the db will grow soon; that is how I ran into these OutOfMemory problems where the memory is not freed. By the way, the freeing process does work in the attached sample program after shutting down the db, but that is not the case in my project (Cataloger), even though I get the confirmation exception (SQLException: Derby system shutdown). I'm trying to simulate that exactly now; I have checked it many times, but I could not reproduce it in the sample program.
