Posted to java-user@lucene.apache.org by Monique Monteiro <mo...@gmail.com> on 2010/03/05 19:38:31 UTC

OutOfMemoryError

Hi all,



  I’m new to Lucene and I’m evaluating it in a web application that looks
up strings in a huge index (about 32GB on disk). I keep a reference to a
single Searcher object for the application’s lifetime, but this object has
heavy memory requirements and holds heap consumption at around 950MB. I
already did some optimization to share some fields between two “composed”
indices, but in a web application with less than 1GB of JVM heap, an
OutOfMemoryError is eventually thrown. It seems the searcher keeps some
form of cache that is rarely released.
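
  Roughly, the setup looks like this (a simplified sketch against the
Lucene 3.0 API; the index path and class name are made up):

import java.io.File;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.store.FSDirectory;

public class SearchService {

    // One searcher, shared for the whole lifetime of the web application.
    private final IndexSearcher searcher;

    public SearchService() throws Exception {
        // Open the 32GB index once, read-only; the reader holds the term
        // index, norms, etc. in memory for as long as it stays open.
        IndexReader reader = IndexReader.open(
                FSDirectory.open(new File("/data/index")), true);
        searcher = new IndexSearcher(reader);
    }

    public IndexSearcher getSearcher() {
        return searcher;
    }
}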


  I’d like to know whether this kind of memory retention is normal
behaviour for Lucene, and whether the only available solution is giving
the JVM more heap.

Thanks in advance!

-- 
Monique Monteiro, MSc
IBM OOAD / SCJP / MCTS Web
Blog: http://moniquelouise.spaces.live.com/
Twitter: http://twitter.com/monilouise
MSN: monique_louise@msn.com
GTalk: monique.louise@gmail.com

Re: OutOfMemoryError

Posted by Monique Monteiro <mo...@gmail.com>.
Hi Otis,

  no, I don't use sorting. But I do use TopFieldCollector, which requires
a Sort object, so I instantiate one with new Sort(). The data come back
unsorted.
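
Roughly what I do (a simplified sketch; the helper class, query and hit
count are made up):

import java.io.IOException;

import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.TopFieldCollector;

public class LookupHelper {

    public static ScoreDoc[] topHits(IndexSearcher searcher, Query query)
            throws IOException {
        // TopFieldCollector.create() requires a Sort, so I pass new Sort()
        // even though I don't need any particular order.
        TopFieldCollector collector = TopFieldCollector.create(
                new Sort(), // default (relevance) sort, no field sorting
                100,        // number of hits to collect
                false,      // fillFields
                false,      // trackDocScores
                false,      // trackMaxScore
                true);      // docsScoredInOrder
        searcher.search(query, collector);
        return collector.topDocs().scoreDocs;
    }
}

As far as I understand, new Sort() just means the default (relevance)
order, so no field values should end up in the FieldCache.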

On Fri, Mar 5, 2010 at 7:38 PM, Otis Gospodnetic
<otis_gospodnetic@yahoo.com> wrote:

> Maybe it's not a leak, Monique. :)
> If you use sorting in Lucene, for example, the FieldCache will keep the
> sorted field's values in memory for as long as the underlying IndexReader
> stays open.
>
>
> Otis
> ----
> Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
> Hadoop ecosystem search :: http://search-hadoop.com/
>
>
> ----- Original Message ----
> [...original message quoted in full; snipped...]
>


-- 
Monique Monteiro, MSc
Auditora Federal de Controle Externo - Tribunal de Contas da União (TCU)
IBM OOAD / SCJP / MCTS Web
Blog: http://moniquelouise.spaces.live.com/
Twitter: http://twitter.com/monilouise
MSN: monique_louise@msn.com
GTalk: monique.louise@gmail.com

Re: OutOfMemoryError

Posted by Otis Gospodnetic <ot...@yahoo.com>.
Maybe it's not a leak, Monique. :)
If you use sorting in Lucene, for example, the FieldCache will keep the sorted field's values in memory for as long as the underlying IndexReader stays open.
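
For example, a field sort like the one below (the field name and helper
class are made up; Lucene 3.x API) un-inverts the field: all of its values
are loaded into the FieldCache on first use and stay there until the
underlying IndexReader is closed.

import java.io.IOException;

import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.SortField;
import org.apache.lucene.search.TopDocs;

public class SortedSearch {

    public static TopDocs byTitle(IndexSearcher searcher, Query query)
            throws IOException {
        // The first search with this sort fills the FieldCache with every
        // value of "title"; on a 32GB index that can easily be hundreds of
        // MB, and it is only released when the reader is closed.
        Sort sort = new Sort(new SortField("title", SortField.STRING));
        return searcher.search(query, null, 10, sort);
    }
}

If you never sort on a field, that cache stays empty, and the memory you
see could simply be the term index and norms for a 32GB index.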


Otis
----
Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Hadoop ecosystem search :: http://search-hadoop.com/



----- Original Message ----
> [...original message quoted in full; snipped...]


---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org