Posted to java-user@lucene.apache.org by Steve Rajavuori <St...@opin.com> on 2005/12/22 22:41:46 UTC

OutOfMemory during optimize

I am periodically getting a "Too many open files" error when searching. Currently there are over 500 files in my Lucene directory. I am attempting to run optimize() to reduce the number of files. However, optimize never finishes: whenever I run it, it quits with a Java OutOfMemoryError. I have tried using the -Xmx and -Xms switches to increase the heap size, but that has not helped.
 
Any suggestions?
 
Steve Rajavuori
OPIN Systems

Voice: 651-994-6556
Fax:   651-994-7828

2600 Eagan Woods Dr., Suite 400
Eagan, MN  55121
800-888-1804
www.opin.com

Re: OutOfMemory during optimize

Posted by "Michael D. Curtin" <mi...@curtin.com>.
Steve Rajavuori wrote:
> I am periodically getting a "Too many open files" error when searching. Currently there are over 500 files in my Lucene directory. I am attempting to run optimize() to reduce the number of files. However, optimize never finishes: whenever I run it, it quits with a Java OutOfMemoryError. I have tried using the -Xmx and -Xms switches to increase the heap size, but that has not helped.
>  
> Any suggestions?

What is mergeFactor set to in your IndexWriter?  Decreasing it makes 
optimize() take longer, but use less memory.  How much RAM does the 
machine you're running on have, compared to the size of your index's 
documents?
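
Something like this, for example (an untested sketch against the 1.4-era API;
the index path and StandardAnalyzer are just placeholders for your setup, and
later releases use writer.setMergeFactor(int) instead of the public field):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;

    public class OptimizeWithLowMergeFactor {
        public static void main(String[] args) throws Exception {
            // Open the existing index; false = do not re-create it.
            IndexWriter writer =
                new IndexWriter("/path/to/index", new StandardAnalyzer(), false);

            // Lower mergeFactor from its default of 10 so optimize()
            // merges fewer segments at a time -- slower, but it needs
            // less memory and keeps fewer files open at once.
            writer.mergeFactor = 2;

            writer.optimize();   // merge everything down to one segment
            writer.close();
        }
    }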

--MDC

---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org


RE: OutOfMemory during optimize

Posted by an...@orbita1.ru.
We increased the system limit on the maximum number of open files. To do
this, use the command ulimit -n [max number].
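
For example (4096 is only an illustrative value; the available hard limit
and exact syntax depend on your shell and OS):

    ulimit -n          # show the current per-process open-file limit
    ulimit -n 4096     # raise it for the current shell session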

-----Original Message-----
From: Steve Rajavuori [mailto:SteveR@opin.com] 
Sent: Friday, December 23, 2005 12:42 AM
To: lucene-user@jakarta.apache.org
Subject: OutOfMemory during optimize

I am periodically getting a "Too many open files" error when searching.
Currently there are over 500 files in my Lucene directory. I am attempting
to run optimize() to reduce the number of files. However, optimize never
finishes: whenever I run it, it quits with a Java OutOfMemoryError. I have
tried using the -Xmx and -Xms switches to increase the heap size, but that
has not helped.
 
Any suggestions?
 
Steve Rajavuori
OPIN Systems

Voice: 651-994-6556
Fax:   651-994-7828

2600 Eagan Woods Dr., Suite 400
Eagan, MN  55121
800-888-1804
www.opin.com



---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org