Posted to solr-user@lucene.apache.org by "Lewin Joy (TMS)" <le...@toyota.com> on 2017/06/20 18:36:29 UTC

Estimating CPU

** PROTECTED - Confidential (internal use only)
Hi,

Is there any way to estimate the CPU needed to set up a Solr environment?
We use pivot facets extensively, both through the JSON Facet API and in native queries.

For our 150-million-record collection, we are seeing CPU usage spike to 100% under even small loads.
If we have to increase our configuration, is there some way to estimate the CPU requirements?
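For context, a pivot-style request in the JSON Facet API looks roughly like the sketch below: a "terms" facet with a nested sub-facet is the analogue of a classic facet.pivot=cat,state request. The field names (cat, state) are hypothetical placeholders, not the actual schema:

```python
import json

# Hypothetical fields "cat" and "state"; substitute your own schema fields.
# A nested "terms" facet is the JSON Facet API analogue of facet.pivot=cat,state.
facet_request = {
    "query": "*:*",
    "limit": 0,  # facet counts only, no documents
    "facet": {
        "categories": {
            "type": "terms",
            "field": "cat",
            "limit": 10,
            "facet": {            # sub-facet = second pivot level
                "states": {
                    "type": "terms",
                    "field": "state",
                    "limit": 10
                }
            }
        }
    }
}

body = json.dumps(facet_request, indent=2)
print(body)
```

Each additional nesting level multiplies the number of buckets Solr must compute, which is one reason deep pivots get CPU-hungry on large collections.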

We have five VMs with 8 CPUs each and 32 GB RAM, of which Solr uses a 24 GB heap.

Thanks,
Lewin

Re: Estimating CPU

Posted by Erick Erickson <er...@gmail.com>.
In a word: "stress test". Here's the blog post I wrote on the topic, outlining
why it's hard to give a more definitive answer:

https://lucidworks.com/2012/07/23/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/
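A stress test can start very small: fire concurrent queries at a single node and watch latency percentiles while you ramp up the client count. The sketch below uses only the standard library; the URL and collection name are hypothetical placeholders:

```python
import concurrent.futures
import time
import urllib.request

# Hypothetical endpoint -- point this at your own node and collection.
SOLR_URL = "http://localhost:8983/solr/mycollection/select?q=*:*&rows=0"

def timed_query(url):
    """Issue one query and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    urllib.request.urlopen(url).read()
    return time.perf_counter() - start

def percentile(latencies, p):
    """Return the p-th percentile (nearest-rank style) of a latency list."""
    ranked = sorted(latencies)
    idx = min(len(ranked) - 1, int(p / 100 * len(ranked)))
    return ranked[idx]

def run(url, workers=8, total_requests=200):
    """Run total_requests queries across a thread pool and report p50/p95."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as ex:
        latencies = list(ex.map(lambda _: timed_query(url), range(total_requests)))
    print(f"p50={percentile(latencies, 50):.3f}s  p95={percentile(latencies, 95):.3f}s")
```

Calling run(SOLR_URL, workers=8) against a replica loaded with representative data, then doubling workers until p95 degrades, gives the per-node throughput ceiling you can extrapolate from.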

You might also want to explore the HyperLogLog approach, which provides
pretty good cardinality estimates without consuming as many resources.
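In the JSON Facet API this is exposed as the hll() aggregation function, an approximate distinct count that is cheaper at scale than the exact unique(). A minimal sketch, with a hypothetical field name user_id:

```python
import json

# Hypothetical field "user_id". hll() gives a HyperLogLog-based approximate
# distinct count; unique() is exact but costs more memory/CPU on large fields.
facet_request = {
    "query": "*:*",
    "limit": 0,
    "facet": {
        "approx_users": "hll(user_id)",
        "exact_users": "unique(user_id)"   # for comparison; drop under load
    }
}

print(json.dumps(facet_request, indent=2))
```

Running both once on a quiet system shows how close the estimate is for your data; in production you would keep only the hll() variant.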

Best,
Erick
