Posted to solr-user@lucene.apache.org by "Narayanan, Karthikeyan" <Ka...@gs.com> on 2009/04/08 20:13:06 UTC

Exception while solr commit

Hello,
         I am calling commit for every record (document) that is added or
updated in the index.  Our index contains fewer than 50k records, and I am
getting the following exception during commit.  Is it the correct approach
to call commit for every insert/update?
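For context, the usual alternative to committing per document is to buffer documents and commit once per batch. The sketch below is illustrative only: the `BatchCommitter` class and its counters are hypothetical, not Solr API, and the place where a real client would call `SolrServer.add(...)` and `commit()` is marked with a comment.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of batched commits: buffer document ids and
// "commit" once per batch instead of once per document.
public class BatchCommitter {
    private final int batchSize;
    private final List<String> pending = new ArrayList<>();
    private int commits = 0;

    public BatchCommitter(int batchSize) {
        this.batchSize = batchSize;
    }

    public void add(String docId) {
        pending.add(docId);
        if (pending.size() >= batchSize) {
            commit();
        }
    }

    public void commit() {
        if (pending.isEmpty()) {
            return;
        }
        // In real code, this is where the SolrJ client would send the
        // buffered documents and issue a single commit for the batch.
        pending.clear();
        commits++;
    }

    public int getCommits() {
        return commits;
    }

    public static void main(String[] args) {
        BatchCommitter bc = new BatchCommitter(1000);
        for (int i = 0; i < 50000; i++) {
            bc.add("doc" + i);
        }
        bc.commit(); // flush any trailing partial batch
        System.out.println(bc.getCommits()); // prints 50
    }
}
```

With a batch size of 1000, indexing 50k records issues 50 commits instead of 50,000.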

Apr 7, 2009 4:41:23 PM org.apache.solr.handler.dataimport.SolrWriter commit
SEVERE: Exception while solr commit.
java.lang.RuntimeException: after flush: fdx size mismatch: 20096 docs vs 65536 length in bytes of _6.fdx
        at org.apache.lucene.index.StoredFieldsWriter.closeDocStore(StoredFieldsWriter.java:94)
        at org.apache.lucene.index.DocFieldConsumers.closeDocStore(DocFieldConsumers.java:83)
        at org.apache.lucene.index.DocFieldProcessor.closeDocStore(DocFieldProcessor.java:47)
        at org.apache.lucene.index.DocumentsWriter.closeDocStore(DocumentsWriter.java:367)
        at org.apache.lucene.index.IndexWriter.flushDocStores(IndexWriter.java:1774)
        at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3600)
        at org.apache.lucene.index.IndexWriter._mergeInit(IndexWriter.java:4151)
        at org.apache.lucene.index.IndexWriter.mergeInit(IndexWriter.java:4031)
        at org.apache.lucene.index.ConcurrentMergeScheduler.merge(ConcurrentMergeScheduler.java:176)
        at org.apache.lucene.index.IndexWriter.maybeMerge(IndexWriter.java:2485)
        at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2332)
        at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2280)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:355)
        at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:77)
        at org.apache.solr.handler.dataimport.SolrWriter.commit(SolrWriter.java:180)
        at org.apache.solr.handler.dataimport.DocBuilder.commit(DocBuilder.java:168)
        at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:152)
        at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
        at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:386)
        at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:377)
Apr 7, 2009 4:41:23 PM org.apache.solr.handler.dataimport.DocBuilder execute



Thanks.

Karthik

Re: Exception while solr commit

Posted by Michael McCandless <lu...@mikemccandless.com>.
This is a spooky exception.

Committing after every update will give very poor performance, but
should be "fine" (ie, not cause exceptions like this).
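One way to avoid explicit per-document commits is Solr's autoCommit feature, configured in the `<updateHandler>` section of solrconfig.xml. The thresholds below are illustrative values, not recommendations; tune them to your indexing rate.

```xml
<!-- Sketch: let Solr commit automatically instead of the client
     committing per document. maxDocs/maxTime values are examples. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>10000</maxDocs>   <!-- commit after this many pending docs -->
    <maxTime>60000</maxTime>   <!-- or after this many milliseconds -->
  </autoCommit>
</updateHandler>
```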

What filesystem are you on?  Is there any possibility that two writers
are open against the same index?  Is this easily reproduced?

Mike

On Wed, Apr 8, 2009 at 2:13 PM, Narayanan, Karthikeyan
<Ka...@gs.com> wrote:
>
> Hello,
>         I am calling commit for every record (document) that is added or
> updated in the index.  Our index contains fewer than 50k records, and I am
> getting the following exception during commit.  Is it the correct approach
> to call commit for every insert/update?