Posted to solr-user@lucene.apache.org by Kuchekar <ku...@gmail.com> on 2013/09/11 16:54:23 UTC

Higher Memory Usage with solr 4.4

Hi,

     We are using Solr 4.4 on Linux with 64-bit OpenJDK. We started
Solr with a 40GB heap, but we noticed that QTime is much higher than on a
comparable Solr 3.5 setup. Both the 3.5 and 4.4 configurations and schemas
are constructed similarly. During triage we also found physical memory
utilization at about 95%.

Is there any configuration we might be missing?

Looking forward to your reply.

Thanks.
Kuchekar, Nilesh

Re: Higher Memory Usage with solr 4.4

Posted by Shawn Heisey <so...@elyograg.org>.
On 9/11/2013 8:54 AM, Kuchekar wrote:
>       We are using Solr 4.4 on Linux with 64-bit OpenJDK. We started
> Solr with a 40GB heap, but we noticed that QTime is much higher than on a
> comparable Solr 3.5 setup. Both the 3.5 and 4.4 configurations and schemas
> are constructed similarly. During triage we also found physical memory
> utilization at about 95%.

A 40GB heap is *huge*.  Unless you are dealing with millions of 
super-large documents or many many millions of smaller documents, there 
should be no need for a heap that large.  Additionally, if you are 
allocating most of your system memory to Java, then you will have little 
or no RAM available for OS disk caching, which will cause major 
performance issues.
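
For example (illustrative numbers only; the right heap depends entirely on 
your index size and query load), starting the stock 4.4 example with a much 
smaller fixed heap would look like this:

    cd example
    java -Xms8g -Xmx8g -jar start.jar

On a 64GB machine that would leave roughly 56GB for the OS to cache index 
files, which usually does more for query latency than a larger Java heap.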

For most indexes, memory usage should be less after an upgrade, but 
there are exceptions.

I see that you had an earlier question about stored field compression, 
and that you talked about exporting data from your 3.5 install to index 
into 4.4, in which you had stored every field, including copyField 
destination fields.

If you have a lot of stored data, memory usage for decompression can 
become a problem.  It's usually a lot better to store minimal 
information, just enough to display a result grid/list, and some ID 
information so that when someone clicks on an individual result, you can 
retrieve the entire record from another data source, like a database or 
a filesystem.
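
As a made-up fragment of schema.xml (the field names here are 
illustrations, not from your schema), that pattern looks like:

    <field name="id"       type="string"       indexed="true" stored="true"/>
    <field name="title"    type="text_general" indexed="true" stored="true"/>
    <field name="body"     type="text_general" indexed="true" stored="false"/>
    <field name="catchall" type="text_general" indexed="true" stored="false"
           multiValued="true"/>
    <copyField source="title" dest="catchall"/>
    <copyField source="body"  dest="catchall"/>

The full record is then fetched from your primary data store by id when 
someone opens a result.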

Here's a more exhaustive list of potential performance and memory 
problems with Solr:

http://wiki.apache.org/solr/SolrPerformanceProblems

OpenJDK may be problematic, especially if it's version 6.  With Java 7, 
OpenJDK is actually the reference implementation, so if you are using 
OpenJDK 7, I would be less concerned.  With either version, Oracle Java 
tends to produce better results.
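
If you're not sure which JVM Solr is actually running on, run this as the 
same user that starts Solr (the exact banner text varies by build):

    java -version

The output will say either OpenJDK or Java HotSpot(TM), the latter being 
the Oracle JVM.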

Thanks,
Shawn


Re: Higher Memory Usage with solr 4.4

Posted by Erick Erickson <er...@gmail.com>.
There are some defaults (sorry, I don't have them listed) that are
somewhat different between 3.x and 4.x. If you took your 3.5 config and
just reused it for 4.x, it's probably worth going back over it: start
from the 4.x example config and add back in any customizations you made
for 3.5...

But in general, memory usage for 4.x should be much smaller than
for 3.5; there were some _major_ improvements in that area. So I'm
guessing you've moved over some innocent-seeming config...
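
One way to find such a setting (paths are illustrative, adjust them to 
your layout) is to diff your config against the stock 4.4 example:

    diff -u solr-4.4.0/example/solr/collection1/conf/solrconfig.xml \
        /path/to/your/conf/solrconfig.xml

Anything that exists only on your side is a candidate to re-examine.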

FWIW,
Erick

