Posted to solr-dev@lucene.apache.org by Mike Klaas <mi...@gmail.com> on 2007/07/19 02:17:12 UTC

Re: solr index problem

On 18-Jul-07, at 2:58 PM, Yonik Seeley wrote:

> On 7/18/07, Mike Klaas <mi...@gmail.com> wrote:
>>
>> Could happen when doDeleting the pending docs too.  James: try
>> sending commit every 500k docs or so.
>
> Hmmm, right... some of the memory usage will be related to the treemap
> keeping track of deleted items, and some of it will be related to the
> reader itself (the term index, and the norms).
>
> Perhaps we should have some sort of limit on the number of deletions
> we cache so people don't have to worry about that part.

I logged this in http://issues.apache.org/jira/browse/SOLR-310

Perhaps a solrconfig.xml setting, <maxPendingDocs>? Default to 100k?
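
(Purely as a sketch of the idea, not an existing option: such a cap would
presumably sit in the <updateHandler> section of solrconfig.xml, something
like

    <updateHandler class="solr.DirectUpdateHandler2">
      <!-- hypothetical, proposed in SOLR-310: cap the number of pending
           (uncommitted) docs/deletes buffered before forcing a flush -->
      <maxPendingDocs>100000</maxPendingDocs>
    </updateHandler>

with the element name and default taken from the suggestion above.)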

I have experienced this from time to time... it is one of the reasons  
I fastidiously wipe the index and restart solr before reindexing.
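
In the meantime, a rough sketch of the "commit every 500k docs" workaround
quoted above, using nothing but the JDK to POST to a local /update handler
(the URL, interval, and loop body are placeholders for whatever your
indexing code actually does):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class PeriodicCommit {
        // assumed local Solr instance; adjust for your setup
        private static final String UPDATE_URL = "http://localhost:8983/solr/update";
        private static final int COMMIT_INTERVAL = 500000;  // docs between commits

        static void postXml(String xml) throws Exception {
            HttpURLConnection conn =
                (HttpURLConnection) new URL(UPDATE_URL).openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
            OutputStream out = conn.getOutputStream();
            out.write(xml.getBytes("UTF-8"));
            out.close();
            conn.getInputStream().close();   // read and discard the response
        }

        public static void main(String[] args) throws Exception {
            for (int i = 1; i <= 17000000; i++) {
                // postXml("<add><doc>...</doc></add>");  // index the i-th doc
                if (i % COMMIT_INTERVAL == 0) {
                    postXml("<commit/>");   // flush pending docs/deletes
                }
            }
            postXml("<commit/>");           // final commit
        }
    }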

-Mike

Re: solr index problem

Posted by James liu <li...@gmail.com>.
Correction: I am indexing 17M docs, not 1.7M, so the OutOfMemory happened
after it had indexed ~11.3M docs.

It is a new index.

I think this may be the reason:

On 7/18/07, Otis Gospodnetic <ot...@yahoo.com> wrote:
> > Why?  Too small of a Java heap. :)
> > Increase the size of the Java heap and lower the maxBufferedDocs number
> > in solrconfig.xml and then try again.
>
> If it only happens after a lot of docs, it's probably not
> maxBufferedDocs, but when a big Lucene merge is triggered.
>
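
For concreteness, the two knobs Otis mentions look roughly like this
(values are placeholders, not recommendations). The heap is raised on the
JVM command line, e.g.

    java -Xmx1024m -jar start.jar

if you run the example Jetty setup, and maxBufferedDocs lives in the
<indexDefaults> (or <mainIndex>) section of solrconfig.xml:

    <indexDefaults>
      <!-- buffer fewer docs in RAM before Lucene flushes a segment -->
      <maxBufferedDocs>1000</maxBufferedDocs>
    </indexDefaults>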


2007/7/19, Mike Klaas <mi...@gmail.com>:
>
>
> On 18-Jul-07, at 2:58 PM, Yonik Seeley wrote:
>
> > On 7/18/07, Mike Klaas <mi...@gmail.com> wrote:
> >>
> >> Could happen when doDeleting the pending docs too.  James: try
> >> sending commit every 500k docs or so.
> >
> > Hmmm, right... some of the memory usage will be related to the treemap
> > keeping track of deleted items, and some of it will be related to the
> > reader itself (the term index, and the norms).
> >
> > Perhaps we should have some sort of limit on the number of deletions
> > we cache so people don't have to worry about that part.
>
> I logged this in http://issues.apache.org/jira/browse/SOLR-310
>
> Perhaps a solrconfig.xml setting, <maxPendingDocs>? Default to 100k?
>
> I have experienced this from time to time... it is one of the reasons
> I fastidiously wipe the index and restart solr before reindexing.


Aha, me too.

> -Mike



-- 
regards
jl