Posted to solr-user@lucene.apache.org by Arkadi Colson <ar...@smartbit.be> on 2013/04/02 10:24:31 UTC
java.lang.OutOfMemoryError: Map failed
Hi
Recently Solr crashed. I found this in the error log.
My commit settings look like this:
<autoCommit>
<maxTime>10000</maxTime>
<openSearcher>false</openSearcher>
</autoCommit>
<autoSoftCommit>
<maxTime>2000</maxTime>
</autoSoftCommit>
The machine has 10GB of memory. Tomcat is running with -Xms2048m -Xmx6144m
Versions
Solr: 4.2
Tomcat: 7.0.33
Java: 1.7
Anybody any idea?
Thx!
Arkadi
SEVERE: auto commit error...:org.apache.solr.common.SolrException: Error opening new searcher
        at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1415)
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1527)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:562)
        at org.apache.solr.update.CommitTracker.run(CommitTracker.java:216)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.IOException: Map failed
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:849)
        at org.apache.lucene.store.MMapDirectory.map(MMapDirectory.java:283)
        at org.apache.lucene.store.MMapDirectory$MMapIndexInput.<init>(MMapDirectory.java:228)
        at org.apache.lucene.store.MMapDirectory.openInput(MMapDirectory.java:195)
        at org.apache.lucene.store.NRTCachingDirectory.openInput(NRTCachingDirectory.java:232)
        at org.apache.lucene.codecs.compressing.CompressingStoredFieldsReader.<init>(CompressingStoredFieldsReader.java:96)
        at org.apache.lucene.codecs.compressing.CompressingStoredFieldsFormat.fieldsReader(CompressingStoredFieldsFormat.java:113)
        at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:147)
        at org.apache.lucene.index.SegmentReader.<init>(SegmentReader.java:56)
        at org.apache.lucene.index.ReadersAndLiveDocs.getReader(ReadersAndLiveDocs.java:121)
        at org.apache.lucene.index.BufferedDeletesStream.applyDeletes(BufferedDeletesStream.java:269)
        at org.apache.lucene.index.IndexWriter.applyAllDeletes(IndexWriter.java:2961)
        at org.apache.lucene.index.IndexWriter.maybeApplyDeletes(IndexWriter.java:2952)
        at org.apache.lucene.index.IndexWriter.getReader(IndexWriter.java:368)
        at org.apache.lucene.index.StandardDirectoryReader.doOpenFromWriter(StandardDirectoryReader.java:270)
        at org.apache.lucene.index.StandardDirectoryReader.doOpenIfChanged(StandardDirectoryReader.java:255)
        at org.apache.lucene.index.DirectoryReader.openIfChanged(DirectoryReader.java:249)
        at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1353)
        ... 11 more
Caused by: java.lang.OutOfMemoryError: Map failed
        at sun.nio.ch.FileChannelImpl.map0(Native Method)
        at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:846)
        ... 28 more

SEVERE: auto commit error...:java.lang.IllegalStateException: this writer hit an OutOfMemoryError; cannot commit
        at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2661)
        at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:2827)
        at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:2807)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:541)
        at org.apache.solr.update.CommitTracker.run(CommitTracker.java:216)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:722)
RE: AW: AW: java.lang.OutOfMemoryError: Map failed
Posted by "Van Tassell, Kristian" <kr...@siemens.com>.
I just posted about a similar error and discovered that decreasing Xmx fixed the problem for me. The "free" command, top, etc. indicated I was flying just below the threshold of my allowed memory, with swap/virtual space still available, so I'm still confused about what the actual issue is, but you could try this in your configuration to see if it helps.
-----Original Message-----
From: Per Steffensen [mailto:steff@designware.dk]
Sent: Tuesday, April 02, 2013 6:09 AM
To: solr-user@lucene.apache.org
Subject: Re: AW: AW: java.lang.OutOfMemoryError: Map failed
I have seen exactly the same on Ubuntu Server 12.04. Adding some swap space helped, but I do not understand why that should be necessary, since the OS ought to just use the memory-mapped files themselves when there is not enough room in (virtual) memory, swapping pages in and out on demand. Note that I saw this for memory-mapped files opened for read+write - not in exactly the same context as yours, where MMapDirectory is trying to map the index files.
If you find a solution/explanation, please post it here. I really want to know more about why FileChannel.map can cause an OOM. I do not think this OOM is a "real" OOM indicating the Java heap is exhausted; it is rather an exception saying that the OS has run out of memory (in some interpretation of that).
Regards, Per Steffensen
On 4/2/13 11:32 AM, Arkadi Colson wrote:
> [quoted thread snipped]
Re: AW: AW: java.lang.OutOfMemoryError: Map failed
Posted by Per Steffensen <st...@designware.dk>.
I have seen exactly the same on Ubuntu Server 12.04. Adding some swap space helped, but I do not understand why that should be necessary, since the OS ought to just use the memory-mapped files themselves when there is not enough room in (virtual) memory, swapping pages in and out on demand. Note that I saw this for memory-mapped files opened for read+write - not in exactly the same context as yours, where MMapDirectory is trying to map the index files.
If you find a solution/explanation, please post it here. I really want to know more about why FileChannel.map can cause an OOM. I do not think this OOM is a "real" OOM indicating the Java heap is exhausted; it is rather an exception saying that the OS has run out of memory (in some interpretation of that).
Regards, Per Steffensen
On 4/2/13 11:32 AM, Arkadi Colson wrote:
> [quoted thread snipped]
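Per's question about why FileChannel.map can throw an OOM is easier to reason about with a minimal, self-contained sketch (illustrative only, not Solr code; the class and method names are made up for this example). FileChannel.map asks the OS for an mmap region, and the resulting buffer lives in the process's virtual address space, outside the Java heap, so -Xmx does not budget for it:

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MapDemo {

    // Map a 1 MiB region of a temp file, write one byte through the
    // mapping, and read it back.
    static int mapAndRead() {
        try {
            Path tmp = Files.createTempFile("mapdemo", ".bin");
            try (FileChannel ch = FileChannel.open(tmp,
                    StandardOpenOption.READ, StandardOpenOption.WRITE)) {
                // The mapping is created by the OS (mmap) inside the
                // process's virtual address space; it is NOT allocated
                // on the Java heap, so -Xmx does not account for it.
                // When the native mmap call fails, the JVM surfaces it
                // as java.lang.OutOfMemoryError with the message
                // "Map failed" - the error seen in the trace above.
                MappedByteBuffer buf =
                        ch.map(FileChannel.MapMode.READ_WRITE, 0, 1 << 20);
                buf.put(0, (byte) 42);
                return buf.get(0);
            } finally {
                Files.deleteIfExists(tmp);
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(mapAndRead());
    }
}
```

This is also why the error message is so confusing: the JVM reuses OutOfMemoryError for an OS-level mmap failure, even when the heap has plenty of free space.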
Re: AW: AW: java.lang.OutOfMemoryError: Map failed
Posted by Arkadi Colson <ar...@smartbit.be>.
It is running as root:
root@solr01-dcg:~# ps aux | grep tom
root 1809 10.2 67.5 49460420 6931232 ? Sl Mar28 706:29
/usr/bin/java
-Djava.util.logging.config.file=/usr/local/tomcat/conf/logging.properties -server
-Xms2048m -Xmx6144m -XX:PermSize=64m -XX:MaxPermSize=128m -XX:+UseG1GC
-verbose:gc -Xloggc:/solr/tomcat-logs/gc.log -XX:+PrintGCTimeStamps
-XX:+PrintGCDetails -Duser.timezone=UTC -Dfile.encoding=UTF8
-Dsolr.solr.home=/opt/solr/ -Dport=8983 -Dcollection.configName=smsc
-DzkClientTimeout=20000
-DzkHost=solr01-dcg.intnet.smartbit.be:2181,solr01-gs.intnet.smartbit.be:2181,solr02-dcg.intnet.smartbit.be:2181,solr02-gs.intnet.smartbit.be:2181,solr03-dcg.intnet.smartbit.be:2181,solr03-gs.intnet.smartbit.be:2181
-Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=9999
-Dcom.sun.management.jmxremote.ssl=false
-Dcom.sun.management.jmxremote.authenticate=false
-Djava.endorsed.dirs=/usr/local/tomcat/endorsed -classpath
/usr/local/tomcat/bin/bootstrap.jar:/usr/local/tomcat/bin/tomcat-juli.jar -Dcatalina.base=/usr/local/tomcat
-Dcatalina.home=/usr/local/tomcat
-Djava.io.tmpdir=/usr/local/tomcat/temp
org.apache.catalina.startup.Bootstrap start
Arkadi
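The ps aux line above is itself informative: in ps aux output the fifth and sixth numeric columns are VSZ and RSS, both in KiB, so this JVM has about 47 GiB of virtual address space mapped while only about 6.6 GiB is resident. A large VSZ is expected with MMapDirectory, but it shows how much address space the mappings consume. A quick way to read just those two columns for a process (shown against the current shell's PID as a sketch; substitute the Tomcat JVM's PID, 1809 in the output above, on the Solr host):

```shell
# Print PID, VSZ and RSS (KiB) without headers for a given PID.
# $$ is the current shell; replace it with the JVM's PID.
ps -o pid=,vsz=,rss= -p $$
```

For the JVM above this would print roughly "1809 49460420 6931232", i.e. ~47 GiB virtual against ~6.6 GiB resident.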
On 04/02/2013 11:29 AM, André Widhani wrote:
> The output is from the root user. Are you running Solr as root?
>
> If not, please try again using the operating system user that runs Solr.
>
> André
> ________________________________________
> Von: Arkadi Colson [arkadi@smartbit.be]
> Gesendet: Dienstag, 2. April 2013 11:26
> An: solr-user@lucene.apache.org
> Cc: André Widhani
> Betreff: Re: AW: java.lang.OutOfMemoryError: Map failed
>
> Hmmm I checked it and it seems to be ok:
>
> root@solr01-dcg:~# ulimit -v
> unlimited
>
> Any other tips or do you need more debug info?
>
> BR
>
> On 04/02/2013 11:15 AM, André Widhani wrote:
>> [quoted thread snipped]
AW: AW: java.lang.OutOfMemoryError: Map failed
Posted by André Widhani <An...@digicol.de>.
The output is from the root user. Are you running Solr as root?
If not, please try again using the operating system user that runs Solr.
André
________________________________________
Von: Arkadi Colson [arkadi@smartbit.be]
Gesendet: Dienstag, 2. April 2013 11:26
An: solr-user@lucene.apache.org
Cc: André Widhani
Betreff: Re: AW: java.lang.OutOfMemoryError: Map failed
Hmmm I checked it and it seems to be ok:
root@solr01-dcg:~# ulimit -v
unlimited
Any other tips or do you need more debug info?
BR
On 04/02/2013 11:15 AM, André Widhani wrote:
> Hi Arkadi,
>
> this error usually indicates that virtual memory is not sufficient (should be "unlimited").
>
> Please see http://comments.gmane.org/gmane.comp.jakarta.lucene.solr.user/69168
>
> Regards,
> André
>
> ________________________________________
> Von: Arkadi Colson [arkadi@smartbit.be]
> Gesendet: Dienstag, 2. April 2013 10:24
> An: solr-user@lucene.apache.org
> Betreff: java.lang.OutOfMemoryError: Map failed
> [quoted original message and stack trace snipped]
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:722)
>
>
>
>
Re: AW: java.lang.OutOfMemoryError: Map failed
Posted by Arkadi Colson <ar...@smartbit.be>.
Hmmm, I checked it and it seems to be OK:
root@solr01-dcg:~# ulimit -v
unlimited
Any other tips or do you need more debug info?
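One caveat worth adding here: `ulimit -v` in an interactive root shell does not necessarily reflect the limits of the already-running Tomcat process, and mmap can also fail when the kernel's mapping count limit is exhausted rather than the address-space limit. A minimal check, assuming Linux and a standard Tomcat process (the `pgrep` pattern is an assumption; adjust it to your install):

```shell
# The interactive shell's ulimit can differ from the limits of the
# service that was started at boot. Inspect the running process itself:
TOMCAT_PID=$(pgrep -f org.apache.catalina.startup.Bootstrap | head -n1)
if [ -n "$TOMCAT_PID" ]; then
    grep "Max address space" "/proc/$TOMCAT_PID/limits"
else
    echo "Tomcat process not found; adjust the pgrep pattern"
fi

# MMapDirectory opens one mapping per index file chunk; the kernel's
# default cap (65530 on many systems) can be too low for large indexes:
cat /proc/sys/vm/max_map_count
```

If `Max address space` for the process is not `unlimited`, or `max_map_count` is at its default on a box with many/large segments, either would explain a `Map failed` error even though the interactive shell reports `unlimited`.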
BR
On 04/02/2013 11:15 AM, André Widhani wrote:
> Hi Arkadi,
>
> this error usually indicates that virtual memory is not sufficient (should be "unlimited").
>
> Please see http://comments.gmane.org/gmane.comp.jakarta.lucene.solr.user/69168
>
> Regards,
> André
>
AW: java.lang.OutOfMemoryError: Map failed
Posted by André Widhani <An...@digicol.de>.
Hi Arkadi,
this error usually indicates that virtual memory is not sufficient (should be "unlimited").
Please see http://comments.gmane.org/gmane.comp.jakarta.lucene.solr.user/69168
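For reference, the limit has to be raised in the environment that actually launches Tomcat; setting it in an interactive shell afterwards has no effect on the running JVM. A sketch, assuming a standard Tomcat layout where `bin/setenv.sh` is sourced at startup (file location is an assumption; adjust to your installation):

```shell
# Put this near the top of $CATALINA_BASE/bin/setenv.sh (or the init
# script that starts Tomcat) so the JVM inherits it:
ulimit -v unlimited || echo "cannot raise limit (needs root or a higher hard limit)"

# Verify what this shell will hand down to child processes:
ulimit -v
```

After adding it, restart Tomcat and confirm via `/proc/<pid>/limits` that the new limit took effect for the Solr process itself.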
Regards,
André