Posted to solr-user@lucene.apache.org by sivaprasad <si...@echidnainc.com> on 2011/11/09 07:00:37 UTC

Out of memory during the indexing

Hi,

I am getting the following error during indexing. I am trying to index 14 million
records, but each document is very small.

*Error:*
2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:07,910 ERROR [org.apache.coyote.http11.Http11Protocol]
(http-10.32.7.136-8180-2) Error reading request, ignored

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:53:54,961 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread) Exception in thread
"DefaultQuartzScheduler_QuartzSchedulerThread"

2011-11-08 14:54:21,780 ERROR
[org.apache.catalina.core.ContainerBase.[jboss.web].[localhost].[/solr].[jsp]]
(http-10.32.7.136-8180-9) Servlet.service() for servlet jsp threw exception

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:18,417 ERROR [org.apache.catalina.connector.CoyoteAdapter]
(http-10.32.7.136-8180-7) An exception or error occurred in the container
during the request processing

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:18,417 ERROR [org.apache.catalina.connector.CoyoteAdapter]
(http-10.32.7.136-8180-6) An exception or error occurred in the container
during the request processing

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:36,237 SEVERE [org.apache.solr.handler.dataimport.SolrWriter] (Thread-19) Exception while solr commit.
java.lang.RuntimeException: java.lang.OutOfMemoryError: GC overhead limit exceeded
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1099)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:425)
        at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
        at org.apache.solr.handler.dataimport.SolrWriter.commit(SolrWriter.java:179)
        at org.apache.solr.handler.dataimport.DocBuilder.finish(DocBuilder.java:236)
        at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:208)
        at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:359)
        at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:427)
        at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:408)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
        at java.util.Arrays.copyOfRange(Arrays.java:3209)
        at java.lang.String.<init>(String.java:215)
        at org.apache.lucene.index.TermBuffer.toTerm(TermBuffer.java:122)
        at org.apache.lucene.index.SegmentTermEnum.term(SegmentTermEnum.java:176)
        at org.apache.lucene.index.TermInfosReader.<init>(TermInfosReader.java:122)
        at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:75)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:114)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:92)
        at org.apache.lucene.index.DirectoryReader.<init>(DirectoryReader.java:235)
        at org.apache.lucene.index.ReadOnlyDirectoryReader.<init>(ReadOnlyDirectoryReader.java:34)
        at org.apache.lucene.index.DirectoryReader.doReopen(DirectoryReader.java:484)
        at org.apache.lucene.index.DirectoryReader.access$000(DirectoryReader.java:45)
        at org.apache.lucene.index.DirectoryReader$2.doBody(DirectoryReader.java:476)
        at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:750)
        at org.apache.lucene.index.DirectoryReader.doReopenNoWriter(DirectoryReader.java:471)
        at org.apache.lucene.index.DirectoryReader.doReopen(DirectoryReader.java:429)
        at org.apache.lucene.index.DirectoryReader.reopen(DirectoryReader.java:392)
        at org.apache.solr.search.SolrIndexReader.reopen(SolrIndexReader.java:414)
        at org.apache.solr.search.SolrIndexReader.reopen(SolrIndexReader.java:425)
        at org.apache.solr.search.SolrIndexReader.reopen(SolrIndexReader.java:35)
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1080)
        ... 8 more

2011-11-08 14:54:34,905 WARN 
[org.jboss.system.server.profileservice.hotdeploy.HDScanner] (HDScanner)
Scan failed

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:25,132 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread) java.lang.OutOfMemoryError:
GC overhead limit exceeded

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
java.util.TreeMap.key(TreeMap.java:1206)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
java.util.TreeMap.firstKey(TreeMap.java:267)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
java.util.TreeSet.first(TreeSet.java:377)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
org.quartz.simpl.RAMJobStore.acquireNextTrigger(RAMJobStore.java:1131)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
org.quartz.core.QuartzSchedulerThread.run(QuartzSchedulerThread.java:233)

2011-11-08 14:54:36,238 ERROR [STDERR]
(ContainerBackgroundProcessor[StandardEngine[jboss.web]]) Exception in
thread "ContainerBackgroundProcessor[StandardEngine[jboss.web]]"

2011-11-08 14:54:36,239 ERROR [STDERR]
(ContainerBackgroundProcessor[StandardEngine[jboss.web]])
java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:36,240 ERROR [STDERR] (Timer-Log4jService) Exception in
thread "Timer-Log4jService"

2011-11-08 14:54:36,240 ERROR [STDERR] (Timer-Log4jService)
java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:36,247 INFO 
[org.apache.solr.handler.dataimport.SolrWriter] (Thread-19) Read
dataimport.properties

2011-11-08 14:54:36,256 INFO 
[org.apache.solr.handler.dataimport.SolrWriter] (Thread-19) Wrote last
indexed time to
/data/solr/Proliphiq/ProliphiqSearch/ProliphiqSolr_Master/profileAutoSuggest/conf/dataimport.properties

2011-11-08 14:54:36,256 INFO 
[org.apache.solr.update.processor.UpdateRequestProcessor] (Thread-19)
{deleteByQuery=*:*,add=[17505883, 17505887, 17505891, 17505895, 17505899,
17505903, 17505907, 17505911, ... (14673008 adds)]} 0 11

2011-11-08 14:54:36,256 INFO 
[org.apache.solr.handler.dataimport.DocBuilder] (Thread-19) Time taken =
6:40:39.955
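
For context, "GC overhead limit exceeded" means the JVM is spending nearly all of its time in garbage collection while reclaiming almost nothing, so the heap is effectively exhausted. A minimal sketch of HotSpot flags that can capture evidence on the next failure (the flag names are standard; the log and dump paths are placeholders to adjust for your environment):

    # add to the JAVA_OPTS used to start the JBoss/Solr JVM (paths are examples only)
    JAVA_OPTS="$JAVA_OPTS -verbose:gc -XX:+PrintGCDetails -Xloggc:/var/log/solr-gc.log"
    JAVA_OPTS="$JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/solr-oom.hprof"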

Do I need to increase the heap size for the JVM?

My solrconfig settings are given below.

<indexDefaults>
    <useCompoundFile>false</useCompoundFile>
    <mergeFactor>25</mergeFactor>
    <maxBufferedDocs>2</maxBufferedDocs>
    <ramBufferSizeMB>1024</ramBufferSizeMB>
    <maxMergeDocs>2147483647</maxMergeDocs>
    <maxFieldLength>10000</maxFieldLength>
    <writeLockTimeout>1000</writeLockTimeout>
    <commitLockTimeout>10000</commitLockTimeout>

and the main index values are 

    <useCompoundFile>false</useCompoundFile>
    <ramBufferSizeMB>512</ramBufferSizeMB>
    <mergeFactor>10</mergeFactor>
    <maxMergeDocs>2147483647</maxMergeDocs>
    <maxFieldLength>10000</maxFieldLength>

Do I need to set ramBufferSizeMB a little higher?

Please provide your inputs.

Regards,
Siva
 

--
View this message in context: http://lucene.472066.n3.nabble.com/Out-of-memory-during-the-indexing-tp3492701p3492701.html
Sent from the Solr - User mailing list archive at Nabble.com.

Re: Out of memory during the indexing

Posted by Erick Erickson <er...@gmail.com>.
I'm going to defer to the folks who actually know the guts here.
If you've turned down the cache entries for your Solr caches,
you're pretty much left with Lucene caching, which is a mystery...

Best
Erick

On Mon, Dec 5, 2011 at 9:23 AM, Jeff Crump <je...@gmail.com> wrote:
> Yes, and without doing much in the way of queries, either.   Basically, our
> test data has large numbers of distinct terms, each of which can be large
> in themselves.   Heap usage is a straight line -- up --  75 percent of the
> heap is consumed with byte[] allocations at the leaf of an object graph
> like so:
>
> SolrCore
> SolrIndexSearcher
> DirectoryReader
> SegmentReader
> SegmentCoreReaders
> PerFieldPostingsFormat$FieldsReader
> ...
> FST
> byte[]
>
> Our application is less concerned with query performance than it is with
> making sure our index doesn't OOM.   My suspicion is that we're looking at
> just in-memory representation of the index rather than any caching (it's
> all turned down to levels suggested in other documentation); plus, we're
> not doing much querying in this test anyway.
>
> Any suggestions or places to go for further information?
>
> On 5 December 2011 08:38, Erick Erickson <er...@gmail.com> wrote:
>
>> There's no good way to say to Solr "Use only this
>> much memory for searching". You can certainly
>> limit the size somewhat by configuring your caches
>> to be small. But if you're sorting, then Lucene will
>> use up some cache space etc.
>>
>> Are you actually running into problems?
>>
>> Best
>> Erick
>>
>> On Fri, Dec 2, 2011 at 2:26 PM, Jeff Crump <je...@gmail.com>
>> wrote:
>> > Can anyone advise techniques for limiting the size of the RAM buffers to
>> > begin with?  As the index grows, I shouldn't have to keep increasing the
>> > heap.  We have a high-ingest, low-query-rate environment and I'm not as
>> > much concerned with the query-time caches as I am with the segment core
>> > readers/SolrIndexSearchers themselves.
>> >
>> > On 9 November 2011 06:10, Andre Bois-Crettez <an...@kelkoo.com>
>> wrote:
>> >
>> >> How much memory you actually allocate to the JVM ?
>> >> http://wiki.apache.org/solr/SolrPerformanceFactors#Memory_allocated_to_the_Java_VM
>> >> You need to increase the -Xmx value, otherwise your large ram buffers
>> >> won't fit in the java heap.
>> >>
>> >>
>> >>
>> >> sivaprasad wrote:
>> >>
>> >>> Hi,
>> >>>
>> >>> I am getting the following error during the indexing.I am trying to
>> index
>> >>> 14
>> >>> million records but the document size is very minimal.
>> >>>
>> >>> *Error:*
>> >>> 2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
>> >>> java.lang.OutOfMemoryError: GC overhead limit exceeded
>> >>>
>> >>>
>> >>>
>> >> [...]
>> >>
>> >>  Do i need to increase the heap size for JVM?
>> >>>
>> >>> My solrconfig settings are given below.
>> >>>
>> >>> <indexDefaults>
>> >>>    <useCompoundFile>false</useCompoundFile>
>> >>>    <mergeFactor>25</mergeFactor>
>> >>>    <maxBufferedDocs>2</maxBufferedDocs>
>> >>>    <ramBufferSizeMB>1024</ramBufferSizeMB>
>> >>>    <maxMergeDocs>2147483647</maxMergeDocs>
>> >>>    <maxFieldLength>10000</maxFieldLength>
>> >>>    <writeLockTimeout>1000</writeLockTimeout>
>> >>>    <commitLockTimeout>10000</commitLockTimeout>
>> >>>
>> >>> and the main index values are
>> >>>    <useCompoundFile>false</useCompoundFile>
>> >>>    <ramBufferSizeMB>512</ramBufferSizeMB>
>> >>>    <mergeFactor>10</mergeFactor>
>> >>>    <maxMergeDocs>2147483647</maxMergeDocs>
>> >>>    <maxFieldLength>10000</maxFieldLength>
>> >>>
>> >>> Do i need to increase the ramBufferSizeMB to a little higher?
>> >>>
>> >>> Please provide your inputs.
>> >>>
>> >>> Regards,
>> >>> Siva
>> >>>
>> >>> --
>> >>> View this message in context: http://lucene.472066.n3.nabble.com/Out-of-memory-during-the-indexing-tp3492701p3492701.html
>> >>> Sent from the Solr - User mailing list archive at Nabble.com.
>> >>>
>> >>>
>> >>>
>> >>
>> >> --
>> >> André Bois-Crettez
>> >>
>> >> Search technology, Kelkoo
>> >> http://www.kelkoo.com/
>> >>
>> >>
>>

Re: Out of memory during the indexing

Posted by Jeff Crump <je...@gmail.com>.
Yes, and without doing much in the way of queries, either.   Basically, our
test data has large numbers of distinct terms, each of which can be large
in themselves.   Heap usage is a straight line -- up --  75 percent of the
heap is consumed with byte[] allocations at the leaf of an object graph
like so:

SolrCore
SolrIndexSearcher
DirectoryReader
SegmentReader
SegmentCoreReaders
PerFieldPostingsFormat$FieldsReader
...
FST
byte[]

Our application is less concerned with query performance than it is with
making sure our index doesn't OOM.   My suspicion is that we're looking at
just in-memory representation of the index rather than any caching (it's
all turned down to levels suggested in other documentation); plus, we're
not doing much querying in this test anyway.

Any suggestions or places to go for further information?
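
If it helps, one way to confirm what is actually holding the heap (assuming a HotSpot JDK with the standard jmap tool on the path; <pid> stands in for the Solr/JBoss process id) is a class histogram, or a full dump loaded into a heap analyzer:

    # top classes by instance count and shallow size
    jmap -histo:live <pid> | head -30
    # full heap dump for offline analysis (the file path is just an example)
    jmap -dump:live,format=b,file=/tmp/solr-heap.hprof <pid>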

On 5 December 2011 08:38, Erick Erickson <er...@gmail.com> wrote:

> There's no good way to say to Solr "Use only this
> much memory for searching". You can certainly
> limit the size somewhat by configuring your caches
> to be small. But if you're sorting, then Lucene will
> use up some cache space etc.
>
> Are you actually running into problems?
>
> Best
> Erick
>
> On Fri, Dec 2, 2011 at 2:26 PM, Jeff Crump <je...@gmail.com>
> wrote:
> > Can anyone advise techniques for limiting the size of the RAM buffers to
> > begin with?  As the index grows, I shouldn't have to keep increasing the
> > heap.  We have a high-ingest, low-query-rate environment and I'm not as
> > much concerned with the query-time caches as I am with the segment core
> > readers/SolrIndexSearchers themselves.
> >
> > On 9 November 2011 06:10, Andre Bois-Crettez <an...@kelkoo.com>
> wrote:
> >
> >> How much memory you actually allocate to the JVM ?
> >> http://wiki.apache.org/solr/SolrPerformanceFactors#Memory_allocated_to_the_Java_VM
> >> You need to increase the -Xmx value, otherwise your large ram buffers
> >> won't fit in the java heap.
> >>
> >>
> >>
> >> sivaprasad wrote:
> >>
> >>> Hi,
> >>>
> >>> I am getting the following error during the indexing.I am trying to
> index
> >>> 14
> >>> million records but the document size is very minimal.
> >>>
> >>> *Error:*
> >>> 2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
> >>> java.lang.OutOfMemoryError: GC overhead limit exceeded
> >>>
> >>>
> >>>
> >> [...]
> >>
> >>  Do i need to increase the heap size for JVM?
> >>>
> >>> My solrconfig settings are given below.
> >>>
> >>> <indexDefaults>
> >>>    <useCompoundFile>false</useCompoundFile>
> >>>    <mergeFactor>25</mergeFactor>
> >>>    <maxBufferedDocs>2</maxBufferedDocs>
> >>>    <ramBufferSizeMB>1024</ramBufferSizeMB>
> >>>    <maxMergeDocs>2147483647</maxMergeDocs>
> >>>    <maxFieldLength>10000</maxFieldLength>
> >>>    <writeLockTimeout>1000</writeLockTimeout>
> >>>    <commitLockTimeout>10000</commitLockTimeout>
> >>>
> >>> and the main index values are
> >>>    <useCompoundFile>false</useCompoundFile>
> >>>    <ramBufferSizeMB>512</ramBufferSizeMB>
> >>>    <mergeFactor>10</mergeFactor>
> >>>    <maxMergeDocs>2147483647</maxMergeDocs>
> >>>    <maxFieldLength>10000</maxFieldLength>
> >>>
> >>> Do i need to increase the ramBufferSizeMB to a little higher?
> >>>
> >>> Please provide your inputs.
> >>>
> >>> Regards,
> >>> Siva
> >>>
> >>> --
> >>> View this message in context: http://lucene.472066.n3.nabble.com/Out-of-memory-during-the-indexing-tp3492701p3492701.html
> >>> Sent from the Solr - User mailing list archive at Nabble.com.
> >>>
> >>>
> >>>
> >>
> >> --
> >> André Bois-Crettez
> >>
> >> Search technology, Kelkoo
> >> http://www.kelkoo.com/
> >>
> >>
>

Re: Out of memory during the indexing

Posted by Erick Erickson <er...@gmail.com>.
There's no good way to say to Solr "Use only this
much memory for searching". You can certainly
limit the size somewhat by configuring your caches
to be small. But if you're sorting, then Lucene will
use up some cache space etc.
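
Purely as an illustration, "configuring your caches to be small" is done in the <query> section of solrconfig.xml; the element names below are the standard ones, but the sizes are made-up examples rather than recommendations:

    <query>
      <!-- illustrative sizes only; tune for your own query load -->
      <filterCache      class="solr.FastLRUCache" size="64" initialSize="64" autowarmCount="0"/>
      <queryResultCache class="solr.LRUCache"     size="64" initialSize="64" autowarmCount="0"/>
      <documentCache    class="solr.LRUCache"     size="64" initialSize="64"/>
    </query>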

Are you actually running into problems?

Best
Erick

On Fri, Dec 2, 2011 at 2:26 PM, Jeff Crump <je...@gmail.com> wrote:
> Can anyone advise techniques for limiting the size of the RAM buffers to
> begin with?  As the index grows, I shouldn't have to keep increasing the
> heap.  We have a high-ingest, low-query-rate environment and I'm not as
> much concerned with the query-time caches as I am with the segment core
> readers/SolrIndexSearchers themselves.
>
> On 9 November 2011 06:10, Andre Bois-Crettez <an...@kelkoo.com> wrote:
>
>> How much memory you actually allocate to the JVM ?
>> http://wiki.apache.org/solr/SolrPerformanceFactors#Memory_allocated_to_the_Java_VM
>> You need to increase the -Xmx value, otherwise your large ram buffers
>> won't fit in the java heap.
>>
>>
>>
>> sivaprasad wrote:
>>
>>> Hi,
>>>
>>> I am getting the following error during the indexing.I am trying to index
>>> 14
>>> million records but the document size is very minimal.
>>>
>>> *Error:*
>>> 2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
>>> java.lang.OutOfMemoryError: GC overhead limit exceeded
>>>
>>>
>>>
>> [...]
>>
>>  Do i need to increase the heap size for JVM?
>>>
>>> My solrconfig settings are given below.
>>>
>>> <indexDefaults>
>>>    <useCompoundFile>false</useCompoundFile>
>>>    <mergeFactor>25</mergeFactor>
>>>    <maxBufferedDocs>2</maxBufferedDocs>
>>>    <ramBufferSizeMB>1024</ramBufferSizeMB>
>>>    <maxMergeDocs>2147483647</maxMergeDocs>
>>>    <maxFieldLength>10000</maxFieldLength>
>>>    <writeLockTimeout>1000</writeLockTimeout>
>>>    <commitLockTimeout>10000</commitLockTimeout>
>>>
>>> and the main index values are
>>>    <useCompoundFile>false</useCompoundFile>
>>>    <ramBufferSizeMB>512</ramBufferSizeMB>
>>>    <mergeFactor>10</mergeFactor>
>>>    <maxMergeDocs>2147483647</maxMergeDocs>
>>>    <maxFieldLength>10000</maxFieldLength>
>>>
>>> Do i need to increase the ramBufferSizeMB to a little higher?
>>>
>>> Please provide your inputs.
>>>
>>> Regards,
>>> Siva
>>>
>>> --
>>> View this message in context: http://lucene.472066.n3.nabble.com/Out-of-memory-during-the-indexing-tp3492701p3492701.html
>>> Sent from the Solr - User mailing list archive at Nabble.com.
>>>
>>>
>>>
>>
>> --
>> André Bois-Crettez
>>
>> Search technology, Kelkoo
>> http://www.kelkoo.com/
>>
>>

Re: Out of memory during the indexing

Posted by Jeff Crump <je...@gmail.com>.
Can anyone advise techniques for limiting the size of the RAM buffers to
begin with?  As the index grows, I shouldn't have to keep increasing the
heap.  We have a high-ingest, low-query-rate environment and I'm not as
much concerned with the query-time caches as I am with the segment core
readers/SolrIndexSearchers themselves.
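
For what it's worth, the indexing-time buffer itself is bounded by the same settings quoted further down in this thread; a hedged sketch of a more conservative <mainIndex> section (the numbers are illustrative, not recommendations):

    <mainIndex>
      <!-- a 32 MB buffer is flushed to disk much sooner than the 512/1024 MB
           buffers discussed earlier in this thread -->
      <useCompoundFile>false</useCompoundFile>
      <ramBufferSizeMB>32</ramBufferSizeMB>
      <mergeFactor>10</mergeFactor>
    </mainIndex>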

On 9 November 2011 06:10, Andre Bois-Crettez <an...@kelkoo.com> wrote:

> How much memory you actually allocate to the JVM ?
> http://wiki.apache.org/solr/SolrPerformanceFactors#Memory_allocated_to_the_Java_VM
> You need to increase the -Xmx value, otherwise your large ram buffers
> won't fit in the java heap.
>
>
>
> sivaprasad wrote:
>
>> Hi,
>>
>> I am getting the following error during the indexing.I am trying to index
>> 14
>> million records but the document size is very minimal.
>>
>> *Error:*
>> 2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
>> java.lang.OutOfMemoryError: GC overhead limit exceeded
>>
>>
>>
> [...]
>
>  Do i need to increase the heap size for JVM?
>>
>> My solrconfig settings are given below.
>>
>> <indexDefaults>
>>    <useCompoundFile>false</useCompoundFile>
>>    <mergeFactor>25</mergeFactor>
>>    <maxBufferedDocs>2</maxBufferedDocs>
>>    <ramBufferSizeMB>1024</ramBufferSizeMB>
>>    <maxMergeDocs>2147483647</maxMergeDocs>
>>    <maxFieldLength>10000</maxFieldLength>
>>    <writeLockTimeout>1000</writeLockTimeout>
>>    <commitLockTimeout>10000</commitLockTimeout>
>>
>> and the main index values are
>>    <useCompoundFile>false</useCompoundFile>
>>    <ramBufferSizeMB>512</ramBufferSizeMB>
>>    <mergeFactor>10</mergeFactor>
>>    <maxMergeDocs>2147483647</maxMergeDocs>
>>    <maxFieldLength>10000</maxFieldLength>
>>
>> Do i need to increase the ramBufferSizeMB to a little higher?
>>
>> Please provide your inputs.
>>
>> Regards,
>> Siva
>>
>> --
>> View this message in context: http://lucene.472066.n3.nabble.com/Out-of-memory-during-the-indexing-tp3492701p3492701.html
>> Sent from the Solr - User mailing list archive at Nabble.com.
>>
>>
>>
>
> --
> André Bois-Crettez
>
> Search technology, Kelkoo
> http://www.kelkoo.com/
>
>

Re: Out of memory during the indexing

Posted by Andre Bois-Crettez <an...@kelkoo.com>.
How much memory do you actually allocate to the JVM?
http://wiki.apache.org/solr/SolrPerformanceFactors#Memory_allocated_to_the_Java_VM
You need to increase the -Xmx value, otherwise your large RAM buffers
won't fit in the Java heap.
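
As a rough sketch only (where JAVA_OPTS is set depends on how JBoss/Solr is launched, e.g. bin/run.conf for JBoss AS; the sizes below are placeholders, not recommendations):

    # give the Solr JVM an explicit, larger heap; leave RAM free for the OS disk cache
    JAVA_OPTS="$JAVA_OPTS -Xms2048m -Xmx2048m"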


sivaprasad wrote:
> Hi,
>
> I am getting the following error during the indexing.I am trying to index 14
> million records but the document size is very minimal.
>
> *Error:*
> 2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>
>   
[...]
> Do i need to increase the heap size for JVM?
>
> My solrconfig settings are given below.
>
> <indexDefaults>
>   
>     <useCompoundFile>false</useCompoundFile>
>
>     <mergeFactor>25</mergeFactor>
>     
>     <maxBufferedDocs>2</maxBufferedDocs>
>    
>     <ramBufferSizeMB>1024</ramBufferSizeMB>
>     <maxMergeDocs>2147483647</maxMergeDocs>
>     <maxFieldLength>10000</maxFieldLength>
>     <writeLockTimeout>1000</writeLockTimeout>
>     <commitLockTimeout>10000</commitLockTimeout>
>
> and the main index values are 
>
> <useCompoundFile>false</useCompoundFile>
>     <ramBufferSizeMB>512</ramBufferSizeMB>
>     <mergeFactor>10</mergeFactor>
>     <maxMergeDocs>2147483647</maxMergeDocs>
>     <maxFieldLength>10000</maxFieldLength>
>
> Do i need to increase the ramBufferSizeMB to a little higher?
>
> Please provide your inputs.
>
> Regards,
> Siva
>  
>
> --
> View this message in context: http://lucene.472066.n3.nabble.com/Out-of-memory-during-the-indexing-tp3492701p3492701.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>   

-- 
André Bois-Crettez

Search technology, Kelkoo
http://www.kelkoo.com/