Posted to solr-user@lucene.apache.org by Jon Drukman <jd...@gmail.com> on 2012/08/06 19:48:11 UTC

Running out of memory

Hi there.  I am running Solr 1.4.1 on an Amazon EC2 box with 7.5GB of RAM.
 It was set up about 18 months ago and has been largely trouble-free.
 Unfortunately, lately it has started to run out of memory pretty much
every day.  We are seeing

SEVERE: java.lang.OutOfMemoryError: Java heap space

When that happens, a simple query like
"http://localhost:8983/solr/select?q=*:*'"
returns nothing.

I am starting Solr with the following:

/usr/lib/jvm/jre/bin/java -XX:+UseConcMarkSweepGC -Xms1G -Xmx5G -jar
start.jar

It would be vastly preferable if Solr could just exit when it gets a memory
error, because we have it running under daemontools, and that would cause
an automatic restart.  After restarting, Solr works fine for another 12-18
hours.  Not ideal but at least it wouldn't require human intervention to
get it going again.

What can I do to reduce the memory pressure?  Does Solr require the entire
index to fit in memory at all times?  The on disk size is 15GB.  There are
27.5 million documents, but they are all tiny (mostly one line forum
comments like "this game is awesome").

We're using Sun's OpenJDK 1.6, if that matters.

-jsd-

Re: Running out of memory

Posted by Amit Nithian <an...@gmail.com>.
I am debugging an out of memory error myself, and I have a few suggestions:
1) Are you looking at your search logs around the time of the memory
error? In my case, I found a few bad queries requesting a huge number
of rows (basically the whole index's worth, which I think is a bug
somewhere in our app that I still have to track down) shortly before
the OOM error was thrown.
2) Do you have Solr hooked up to something like New Relic/AppDynamics
so you can see cache usage in real time? Maybe, as was suggested,
tuning down or eliminating little-used caches could help.
3) Are you making sure that you aren't setting stored="true" on fields
that don't need it? Stored fields increase the index size, and possibly
the cache size if lazy loading isn't enabled (to be honest, I'm a bit
unclear on this part since I haven't had much experience with it
myself).
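
For point 3, a sketch of what that looks like in schema.xml (the field
names here are hypothetical, not from Jon's actual schema):

```xml
<!-- Only store fields you actually need returned in search results. -->
<field name="id"      type="string" indexed="true" stored="true"/>
<!-- Comment text: searchable but not stored, which keeps the index smaller. -->
<field name="comment" type="text"   indexed="true" stored="false"/>
```

Lazy field loading is enabled separately in solrconfig.xml with
<enableLazyFieldLoading>true</enableLazyFieldLoading>.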

Thanks
Amit

On Mon, Aug 13, 2012 at 11:37 AM, Jon Drukman <jd...@gmail.com> wrote:
> On Sun, Aug 12, 2012 at 12:31 PM, Alexey Serba <as...@gmail.com> wrote:
>
>> > It would be vastly preferable if Solr could just exit when it gets a
>> memory
>> > error, because we have it running under daemontools, and that would cause
>> > an automatic restart.
>> -XX:OnOutOfMemoryError="<cmd args>; <cmd args>"
>> Run user-defined commands when an OutOfMemoryError is first thrown.
>>
>> > Does Solr require the entire index to fit in memory at all times?
>> No.
>>
>> But it's hard to say about your particular problem without additional
>> information. How often do you commit? Do you use faceting? Do you sort
>> by Solr fields and if yes what are those fields? And you should also
>> check caches.
>>
>
> I upgraded to solr-3.6.1 and an extra large amazon instance (15GB RAM) so
> we'll see if that helps.  So far no out of memory errors.

Re: Running out of memory

Posted by Jon Drukman <jd...@gmail.com>.
On Sun, Aug 12, 2012 at 12:31 PM, Alexey Serba <as...@gmail.com> wrote:

> > It would be vastly preferable if Solr could just exit when it gets a
> memory
> > error, because we have it running under daemontools, and that would cause
> > an automatic restart.
> -XX:OnOutOfMemoryError="<cmd args>; <cmd args>"
> Run user-defined commands when an OutOfMemoryError is first thrown.
>
> > Does Solr require the entire index to fit in memory at all times?
> No.
>
> But it's hard to say about your particular problem without additional
> information. How often do you commit? Do you use faceting? Do you sort
> by Solr fields and if yes what are those fields? And you should also
> check caches.
>

I upgraded to solr-3.6.1 and an extra large amazon instance (15GB RAM) so
we'll see if that helps.  So far no out of memory errors.

Re: Running out of memory

Posted by Alexey Serba <as...@gmail.com>.
> It would be vastly preferable if Solr could just exit when it gets a memory
> error, because we have it running under daemontools, and that would cause
> an automatic restart.
-XX:OnOutOfMemoryError="<cmd args>; <cmd args>"
Run user-defined commands when an OutOfMemoryError is first thrown.
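
For example, the startup line from the original post could be extended
like this (a sketch; "kill -9 %p" is one common choice, and %p is
expanded by the JVM to its own process id):

```shell
# Kill the JVM on the first OOM so daemontools sees the exit and restarts it.
/usr/lib/jvm/jre/bin/java -XX:+UseConcMarkSweepGC -Xms1G -Xmx5G \
  -XX:OnOutOfMemoryError="kill -9 %p" \
  -jar start.jar
```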

> Does Solr require the entire index to fit in memory at all times?
No.

But it's hard to diagnose your particular problem without additional
information. How often do you commit? Do you use faceting? Do you sort
by Solr fields, and if so, which fields? You should also check your
caches.

Re: Running out of memory

Posted by Michael Della Bitta <mi...@appinions.com>.
You might want to look at turning down or eliminating your caches if
you're running out of RAM. Possibly some of them have a low hit rate,
which you can see on the Stats page. Caches with a low hit rate are
only consuming RAM and CPU cycles.
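
As a sketch, the cache entries in solrconfig.xml look like this (the
sizes below are illustrative, not recommendations; pick them based on
the hit rates you see on the Stats page):

```xml
<!-- Sizes here are illustrative; tune them against observed hit rates. -->
<filterCache      class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="128"/>
<queryResultCache class="solr.LRUCache"     size="512" initialSize="512" autowarmCount="0"/>
<documentCache    class="solr.LRUCache"     size="512" initialSize="512" autowarmCount="0"/>
```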

Also, using this JVM arg might reduce the memory footprint:
-XX:+UseCompressedOops

In the end though, the surefire solution would be to go to an instance
type with more RAM: http://www.ec2instances.info/

Michael Della Bitta

------------------------------------------------
Appinions | 18 East 41st St., Suite 1806 | New York, NY 10017
www.appinions.com
Where Influence Isn’t a Game


On Mon, Aug 6, 2012 at 1:48 PM, Jon Drukman <jd...@gmail.com> wrote:
> Hi there.  I am running Solr 1.4.1 on an Amazon EC2 box with 7.5GB of RAM.
>  It was set up about 18 months ago and has been largely trouble-free.
>  Unfortunately, lately it has started to run out of memory pretty much
> every day.  We are seeing
>
> SEVERE: java.lang.OutOfMemoryError: Java heap space
>
> When that happens, a simple query like
> "http://localhost:8983/solr/select?q=*:*'"
> returns nothing.
>
> I am starting Solr with the following:
>
> /usr/lib/jvm/jre/bin/java -XX:+UseConcMarkSweepGC -Xms1G -Xmx5G -jar
> start.jar
>
> It would be vastly preferable if Solr could just exit when it gets a memory
> error, because we have it running under daemontools, and that would cause
> an automatic restart.  After restarting, Solr works fine for another 12-18
> hours.  Not ideal but at least it wouldn't require human intervention to
> get it going again.
>
> What can I do to reduce the memory pressure?  Does Solr require the entire
> index to fit in memory at all times?  The on disk size is 15GB.  There are
> 27.5 million documents, but they are all tiny (mostly one line forum
> comments like "this game is awesome").
>
> We're using Sun's OpenJDK 1.6, if that matters.
>
> -jsd-