Posted to dev@lucene.apache.org by "Earwin Burrfoot (JIRA)" <ji...@apache.org> on 2010/04/07 11:03:33 UTC
[jira] Commented: (LUCENE-2376) java.lang.OutOfMemoryError: Java heap space
[ https://issues.apache.org/jira/browse/LUCENE-2376?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12854393#action_12854393 ]
Earwin Burrfoot commented on LUCENE-2376:
-----------------------------------------
This is a duplicate of LUCENE-2361.
It seems to me you have a segment with an insane number of fields. Or your index is corrupt, and this segment just parses as one having an insane number of fields.
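Both theories can be checked with Lucene's CheckIndex tool, which walks each segment, reports per-segment statistics (including field counts), and flags corruption. A minimal invocation, assuming a Lucene 2.9.1 core jar on the classpath and a placeholder index path:

```shell
# Inspect the suspect index read-only; replace the jar and index path with your own.
# A segment reporting a huge field count, or an exception while reading it,
# would confirm the diagnosis above.
java -cp lucene-core-2.9.1.jar org.apache.lucene.index.CheckIndex /path/to/index
```

If CheckIndex does report corruption, it also supports a -fix option that drops the broken segments; note that -fix loses the documents in those segments, so back up the index first.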
> java.lang.OutOfMemoryError:Java heap space
> ------------------------------------------
>
> Key: LUCENE-2376
> URL: https://issues.apache.org/jira/browse/LUCENE-2376
> Project: Lucene - Java
> Issue Type: Bug
> Components: Index
> Affects Versions: 2.9.1
> Environment: Windows
> Reporter: Shivender Devarakonda
> Attachments: InfoStreamOutput.txt
>
>
> I see an OutOfMemoryError in our product, and it happens when we have some data objects on which we built the index. The following OutOfMemoryError occurs after we call IndexWriter.optimize():
> 4/06/10 02:03:42.160 PM PDT [ERROR] [Lucene Merge Thread #12] In thread Lucene Merge Thread #12 and the message is org.apache.lucene.index.MergePolicy$MergeException: java.lang.OutOfMemoryError: Java heap space
> 4/06/10 02:03:42.207 PM PDT [VERBOSE] [Lucene Merge Thread #12] [Manager] Uncaught Exception in thread Lucene Merge Thread #12
> org.apache.lucene.index.MergePolicy$MergeException: java.lang.OutOfMemoryError: Java heap space
> at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:351)
> at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:315)
> Caused by: java.lang.OutOfMemoryError: Java heap space
> at java.util.HashMap.resize(HashMap.java:462)
> at java.util.HashMap.addEntry(HashMap.java:755)
> at java.util.HashMap.put(HashMap.java:385)
> at org.apache.lucene.index.FieldInfos.addInternal(FieldInfos.java:256)
> at org.apache.lucene.index.FieldInfos.read(FieldInfos.java:366)
> at org.apache.lucene.index.FieldInfos.<init>(FieldInfos.java:71)
> at org.apache.lucene.index.SegmentReader$CoreReaders.<init>(SegmentReader.java:116)
> at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:638)
> at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:608)
> at org.apache.lucene.index.IndexWriter$ReaderPool.get(IndexWriter.java:686)
> at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:4979)
> at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4614)
> at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:235)
> at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:291)
> 4/06/10 02:03:42.895 PM PDT [ERROR] this writer hit an OutOfMemoryError; cannot complete optimize
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.