Posted to solr-user@lucene.apache.org by "Van Tassell, Kristian" <kr...@siemens.com> on 2013/04/03 18:43:10 UTC
SolrException: Error opening new searcher
We're suddenly seeing an error when trying to do updates/commits.
This is on Solr 4.2 (Tomcat, solr war deployed to webapps, on Linux SuSE 11).
Based on some initial searching related to this issue, I have set ulimit in Linux to 'unlimited' and verified that Tomcat has enough virtual memory available to run the Solr index (which is 1.1GB in size).
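For reference, here is how I checked the limits mentioned above. (The mmap-region sysctl is a Linux-specific setting I looked at as well, since the index is opened via memory-mapped files; reading it is harmless, raising it needs root.)

```shell
# Print the per-process virtual memory limit ('unlimited' after my change)
ulimit -v

# Linux also caps how many mmap regions a single process may hold;
# a memory-mapped index with many segment files can hit this ceiling.
cat /proc/sys/vm/max_map_count 2>/dev/null || true
```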
Does anyone have any ideas?
11:25:41
SEVERE
UpdateLog
Error opening realtime searcher for deleteByQuery:org.apache.solr.common.SolrException: Error opening new searcher
11:25:39
SEVERE
UpdateLog
Replay exception: final commit.
java.io.IOException: Map failed
at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:761)
at org.apache.lucene.store.MMapDirectory.map(MMapDirectory.java:283)
at org.apache.lucene.store.MMapDirectory$MMapIndexInput.<init>(MMapDirectory.java:228)
at org.apache.lucene.store.MMapDirectory.openInput(MMapDirectory.java:195)
at org.apache.lucene.codecs.lucene41.Lucene41PostingsReader.<init>(Lucene41PostingsReader.java:81)
at org.apache.lucene.codecs.lucene41.Lucene41PostingsFormat.fieldsProducer(Lucene41PostingsFormat.java:430)
at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsReader.<init>(PerFieldPostingsFormat.java:194)
at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat.fieldsProducer(PerFieldPostingsFormat.java:233)
at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:127)
at org.apache.lucene.index.SegmentReader.<init>(SegmentReader.java:56)
at org.apache.lucene.index.ReadersAndLiveDocs.getReader(ReadersAndLiveDocs.java:121)
at org.apache.lucene.index.BufferedDeletesStream.applyDeletes(BufferedDeletesStream.java:269)
at org.apache.lucene.index.IndexWriter.applyAllDeletes(IndexWriter.java:2961)
at org.apache.lucene.index.IndexWriter.maybeApplyDeletes(IndexWriter.java:2952)
at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2692)
at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:2827)
at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:2807)
at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:541)
at org.apache.solr.update.UpdateLog$LogReplayer.doReplay(UpdateLog.java:1341)
at org.apache.solr.update.UpdateLog$LogReplayer.run(UpdateLog.java:1160)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:619)
Caused by: java.lang.OutOfMemoryError: Map failed
at sun.nio.ch.FileChannelImpl.map0(Native Method)
at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:758)
... 28 more
SolrConfig (relevant <query> section):

<query>
  <useColdSearcher>true</useColdSearcher>
  <maxBooleanClauses>1024</maxBooleanClauses>
  <filterCache class="solr.FastLRUCache"
               size="512"
               initialSize="512"
               autowarmCount="0"/>
  <queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <documentCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <queryResultWindowSize>20</queryResultWindowSize>
  <queryResultMaxDocsCached>200</queryResultMaxDocsCached>
  <maxWarmingSearchers>6</maxWarmingSearchers>
</query>