Posted to solr-user@lucene.apache.org by Willie Wong <ww...@metalogic-inc.com> on 2008/10/22 03:48:14 UTC

Out of Memory Errors

Hello,

I've been having issues with out-of-memory errors on searches in Solr. I
was wondering if I'm hitting a limit with Solr or if I've configured
something seriously wrong.

Solr Setup
- 3 cores
- 3,163,615 documents each
- 10 GB index size
- approx. 10 fields
- document sizes vary from a few KB to a few MB
- no faceting is used; however, the search query can be fairly complex, with
8 or more fields being searched on at once

Environment:
- Windows 2003
- 2.8 GHz Xeon processor
- 1.5 GB memory assigned to Solr
- Jetty 6 server

Once we get to more than a few concurrent users, OOM errors start occurring and Jetty
restarts.  Would this just be a case of needing more memory, or are there certain
configuration settings that need to be changed?  We're using an out-of-the-box
Solr 1.3 beta.

A few of the things we considered that might help:
- Removing sorts on the result sets (result sets are approx. 40,000+
documents)
- Reducing cache sizes such as the queryResultMaxDocsCached setting, the
document cache, queryResultCache, filterCache, etc. (a sketch of what this
might look like follows)
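For reference, reduced cache settings in solrconfig.xml might look something like
this (the sizes below are illustrative assumptions, not recommendations):

    <!-- smaller caches trade hit rate for heap headroom -->
    <filterCache      class="solr.LRUCache" size="256" initialSize="256" autowarmCount="64"/>
    <queryResultCache class="solr.LRUCache" size="256" initialSize="256" autowarmCount="32"/>
    <documentCache    class="solr.LRUCache" size="256" initialSize="256" autowarmCount="0"/>
    <queryResultMaxDocsCached>50</queryResultMaxDocsCached>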

Am I missing anything else that should be looked at, or is it time to
simply increase the memory or start looking at distributing the indexes?  Any
help would be much appreciated.
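(For context on distributing the indexes: distributed search in Solr 1.3 is driven by
the shards request parameter; a minimal sketch, with hypothetical host names:

    http://solr1:8983/solr/select?q=title:foo&shards=solr1:8983/solr,solr2:8983/solr

where each shard holds a slice of the documents and the node receiving the request
merges the results.)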


Regards,

WW

RE: Out of Memory Errors

Posted by r....@almatech.es.
Hi Willie,

Are you using highlighting?

If the answer is yes, be aware that for each document retrieved, Solr highlighting
loads the full contents of the highlighted field into memory. If the field is very
long, you will run into memory problems.

You can work around this using the following patch:

http://mail-archives.apache.org/mod_mbox/lucene-solr-dev/200806.mbox/%3C1552
380249.1213955865069.JavaMail.jira@brutus%3E

which lets you copy the content of the field used for highlighting into another,
smaller field.
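A minimal sketch of that approach, assuming the patch above (which adds a maxChars
limit to copyField) is applied; the field names and the character cap are illustrative:

    <!-- schema.xml: keep a truncated copy of the large field just for highlighting -->
    <field name="body"    type="text" indexed="true" stored="true"/>
    <field name="body_hl" type="text" indexed="true" stored="true"/>
    <copyField source="body" dest="body_hl" maxChars="10000"/>

Highlighting is then pointed at the smaller field, e.g. hl=true&hl.fl=body_hl on the
request.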

Also be aware that 32-bit Windows limits each process to 2 GB of memory.





Re: Out of Memory Errors

Posted by Jae Joo <ja...@gmail.com>.
Here is what I do to check memory status:
1. Run the servlet container and the Solr application.
2. At a command prompt, run jstat -gc <pid> 5s (5s means a sample is taken every 5
seconds).
3. Watch the output or pipe it to a file.
4. Analyze the data gathered.
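Concretely, it might look like this (the process id and file name are illustrative;
jps ships with the JDK and lists Java process ids):

    jps -l                            # find the Jetty/Solr process id
    jstat -gc 1234 5s > solr-gc.log   # sample heap/GC statistics every 5 seconds
    jstat -gcutil 1234 5s             # alternative view: utilization percentages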

Jae


Re: Out of Memory Errors

Posted by Nick Jenkin <nj...@gmail.com>.
Have you confirmed Java's -Xmx setting (maximum heap size)?

e.g. java -Xmx2000m -jar start.jar
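If you want the JVM to leave evidence when it does run out of memory, a variant worth
trying (these are standard HotSpot flags; the sizes are illustrative):

    java -Xms512m -Xmx1536m -XX:+HeapDumpOnOutOfMemoryError -verbose:gc -jar start.jar

The heap dump can then show whether sort caches, Solr caches, or highlighting buffers
dominate the heap.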
-Nick


Re: Out of Memory Errors

Posted by Mark Miller <ma...@gmail.com>.
How much RAM is in the box in total? How many sort fields, and what types?
Sorts on each core?
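The reason these questions matter: Lucene builds a FieldCache entry per sort field per
index, sized roughly in proportion to the document count. A rough, illustrative
estimate for an index of this size:

    int or date sort field: 3,163,615 docs x 4 bytes  ~ 12 MB per field, per core
    string sort field:      3,163,615 ords plus the unique string values - easily
                            tens of MB per field, per core

With several sort fields across 3 cores, a large slice of a 1.5 GB heap can go to
sorting alone.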
