Posted to solr-user@lucene.apache.org by Scott Yeadon <sc...@anu.edu.au> on 2019/06/20 01:15:38 UTC

8.0 upgrade issue

Hi,

I’m running Solr on Ubuntu 18.04 (32-bit) with OpenJDK 10.0.2. Up until now I have had no problems with Solr (I have been running it since 4.x); however, after upgrading from 7.x to 8.x I am getting serious memory issues.

I have a small repository of 30,000 documents, currently using Solr 7.1 for the search function (for the last two years without issue). I attempted an upgrade to 8.1.1 and tried to perform a full reindex; however, it manages about 1,000 documents and then dies from lack of memory (or so it says). I tried 8.1.0 with the same result. I then tried 8.0.0, which did successfully manage a full reindex but then died from lack of memory after a couple of search queries. I then tried 7.7.2, which worked fine. I have now gone back to my original 7.1, as I can’t risk 8.x in my production system. Has anyone else had these issues with 8.x?

Note that I did increase Xmx to 1024m (previously 512m), but that made no difference. It may be some resource other than memory, but if it is, Solr isn’t saying so, and the repository is so small that running out of memory doesn’t seem to make sense.
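For reference, a heap change like this is usually made in the Solr include file rather than on the command line; a minimal sketch, assuming a service-style install where the file lives at /etc/default/solr.in.sh (the exact path varies by install):

      # /etc/default/solr.in.sh (path is install-dependent)
      # SOLR_HEAP sets -Xms and -Xmx together; this was previously "512m"
      SOLR_HEAP="1024m"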

Scott.

Re: 8.0 upgrade issue

Posted by Scott Yeadon <sc...@anu.edu.au>.
Hi Shawn,

The GC seems to have been the issue: changing back to CMS worked, and the G1 docs do state that it really doesn’t work well for small heap sizes. We’ll be moving to a better-resourced 64-bit VM with more memory late next year with the next Ubuntu LTS release, so this should cease to be a problem after that. Thanks for the help.
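For the record, one quick way to confirm which collector the Solr JVM actually ended up with (this assumes jcmd from the JDK is on the path, and that pgrep can find the Jetty start.jar process Solr runs under):

      # list the running JVM's flags and keep only the GC-selection ones
      jcmd $(pgrep -f start.jar) VM.flags | tr ' ' '\n' | grep -E 'UseConcMarkSweepGC|UseG1GC'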

Scott.

> On 11 Jul 2019, at 12:20 pm, Shawn Heisey <ap...@elyograg.org> wrote:
> 
> [...]


Re: 8.0 upgrade issue

Posted by Shawn Heisey <ap...@elyograg.org>.
On 6/19/2019 7:15 PM, Scott Yeadon wrote:
> [...]

Solr 8 has switched the default garbage collector from CMS to G1, because CMS is deprecated in newer versions of Java and will be removed in the near future.

G1 is a more efficient collector, but it does require somewhat more memory beyond the heap than CMS does. For most users this is not a problem, but with a heap and total system memory as small as yours, it might be enough to go over the threshold.

You could try setting the old 7.x GC_TUNE settings in your include file, normally named solr.in.sh on non-Windows platforms.

       GC_TUNE=('-XX:NewRatio=3' \
         '-XX:SurvivorRatio=4' \
         '-XX:TargetSurvivorRatio=90' \
         '-XX:MaxTenuringThreshold=8' \
         '-XX:+UseConcMarkSweepGC' \
         '-XX:ConcGCThreads=4' '-XX:ParallelGCThreads=4' \
         '-XX:+CMSScavengeBeforeRemark' \
         '-XX:PretenureSizeThreshold=64m' \
         '-XX:+UseCMSInitiatingOccupancyOnly' \
         '-XX:CMSInitiatingOccupancyFraction=50' \
         '-XX:CMSMaxAbortablePrecleanTime=6000' \
         '-XX:+CMSParallelRemarkEnabled' \
         '-XX:+ParallelRefProcEnabled' \
         '-XX:-OmitStackTraceInFastThrow')
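One way to apply this and then confirm it took effect (the include-file path and service name here are assumptions about a typical package install):

      # add the GC_TUNE array above to /etc/default/solr.in.sh, then:
      sudo service solr restart
      # the CMS flag should now be on the Solr JVM's command line;
      # the [U] bracket trick stops grep from matching its own process entry
      ps -ef | grep '[U]seConcMarkSweepGC'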

I would probably also use Java 8 rather than Java 10. Java 10 is not an LTS version, and the older version might require a little bit less memory, which is a premium resource on your setup. Upgrading to Java 11, the next LTS version, would likely require even more memory.
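If you go that route on Ubuntu, something like the following should work (the openjdk-8-jdk package name is standard; the i386 JVM path is an assumption about a stock 32-bit 18.04 layout):

      sudo apt-get install openjdk-8-jdk
      sudo update-alternatives --config java    # pick the Java 8 entry
      # or pin the JDK for Solr alone, via the include file:
      SOLR_JAVA_HOME="/usr/lib/jvm/java-8-openjdk-i386"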

Why are you running a 32-bit OS with such a small memory size? It's not possible to use heap sizes much larger than 1.5 GB on a 32-bit OS. There are also some known bugs when running Lucene-based software on 32-bit Java -- and one of them is specifically related to the G1 collector.
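A quick way to check what you actually have, using stock tools:

      uname -m                     # i686 => 32-bit kernel, x86_64 => 64-bit
      java -XshowSettings:properties -version 2>&1 | grep -E 'os.arch|sun.arch.data.model'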

Thanks,
Shawn

Re: 8.0 upgrade issue

Posted by Jan Høydahl <ja...@cominvent.com>.
See the comments in issue https://issues.apache.org/jira/browse/SOLR-13617, which is now closed.

It appears you are trying to run Solr on a system with only 1 GB of physical RAM and allocating all of it to Solr. Remember that Linux also needs memory!

I’d leave 1 GB for Linux, 1 GB for Solr, and 2 GB unallocated; Linux will automatically use the unallocated memory to cache your index files for better performance. If your index grows larger, add more physical (free) RAM; if you get OOMs, increase the heap (Xmx).

So try a VM with 4 GB of RAM, give 1 GB to Solr, and iterate from there.
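To sanity-check the split on a running box, free(1) shows how much memory is left over for the OS page cache that Solr’s index reads depend on:

      free -h    # the "available" column approximates what's left for caching index files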

Jan

> On 20 Jun 2019, at 03:15, Scott Yeadon <sc...@anu.edu.au> wrote:
> 
> [...]