Posted to derby-user@db.apache.org by 张伟华 <xj...@hotmail.com> on 2007/06/22 16:15:45 UTC
help for derby out of memory question.
hi
My project uses Derby 10.1. Some tables hold about 100,000 records. When I add to, delete from, and query these tables frequently, Derby runs out of memory. Why isn't the memory released automatically, and how can I free it myself? Thanks a lot.
Re: help for derby out of memory question.
Posted by Bryan Pendleton <bp...@amberpoint.com>.
> My project uses Derby 10.1. Some tables hold about 100,000 records.
> When I add to, delete from, and query these tables frequently, Derby
> runs out of memory. Why isn't the memory released automatically?
Two suggestions:
1) Upgrade to 10.2.2.0 as soon as you can. It contains
many fixes, including several memory-related ones.
2) Ensure that you are closing every JDBC object that
you allocate. Verify that every ResultSet, every
Statement, and every Connection you use is closed
when you are done with it. Closing these objects
allows Derby to reclaim the associated memory.
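As a sketch of suggestion 2, the pattern below closes JDBC objects in a finally block so they are released even when the query throws. The class and method names are my own illustration, not from the thread; only the java.sql interfaces are real.

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class JdbcCleanup {

    // Null-safe close in reverse order of creation; each close is
    // attempted even if an earlier one fails.
    static void closeQuietly(ResultSet rs, Statement st, Connection conn) {
        if (rs != null) {
            try { rs.close(); } catch (SQLException ignored) { }
        }
        if (st != null) {
            try { st.close(); } catch (SQLException ignored) { }
        }
        if (conn != null) {
            try { conn.close(); } catch (SQLException ignored) { }
        }
    }

    // Typical usage: allocate in try, release in finally.
    // The table name is a placeholder.
    static long countRows(Connection conn) throws SQLException {
        Statement st = null;
        ResultSet rs = null;
        try {
            st = conn.createStatement();
            rs = st.executeQuery("SELECT COUNT(*) FROM mytable");
            rs.next();
            return rs.getLong(1);
        } finally {
            // The caller owns the Connection, so only the
            // ResultSet and Statement are closed here.
            closeQuietly(rs, st, null);
        }
    }
}
```

(On Java 7 and later, try-with-resources does the same thing more concisely, but the 10.1/10.2-era code this thread discusses predates it.)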
thanks,
bryan
Re: help for derby out of memory question.
Posted by Stanley Bradbury <St...@gmail.com>.
张伟华 wrote:
> hi
> My project uses Derby 10.1. Some tables hold about 100,000 records.
> When I add to, delete from, and query these tables frequently, Derby
> runs out of memory. Why isn't the memory released automatically, and
> how can I free it myself? Thanks a lot.
>
Definitely check for the problems Bryan points out. The amount of memory
the Derby engine requires from the JVM will stabilize once all of its
caches have filled; the exact amount depends on the database schema. I
have seen systems where the default JVM heap is not large enough to hold
the default Derby page cache, resulting in OutOfMemory exceptions. If
the problem continues after you have verified that all JDBC objects are
being closed, run tests with a larger JVM max heap or a smaller Derby
page cache until the two are balanced and you no longer experience
OutOfMemory exceptions.
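A minimal sketch of the two knobs Stanley describes, assuming an embedded Derby application started from the command line ("MyApp" and the specific values are placeholders to experiment with, not recommendations):

```shell
# Larger JVM max heap (here 256 MB) and a smaller Derby page cache
# (derby.storage.pageCacheSize counts pages; the default is 1000).
java -Xmx256m -Dderby.storage.pageCacheSize=500 MyApp
```

The same property can instead be placed in a derby.properties file in the Derby system directory as `derby.storage.pageCacheSize=500`. Adjust one knob at a time while reproducing the workload, so you can tell which change actually stopped the OutOfMemory exceptions.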