Posted to solr-user@lucene.apache.org by Vignan Malyala <ds...@gmail.com> on 2019/08/28 06:55:11 UTC

Max number of cores in solr

Hi
I'm planning to create a separate core for each of my clients in Solr.
Can I create around 500 cores in Solr? Is it a good idea?
Each client currently has around 100,000 records on average.
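
For reference, per-client cores can be created through Solr's Core
Admin API. A minimal sketch in Python, assuming a standalone Solr at
localhost:8983 and a shared configset named client_conf (both
illustrative, not from the thread):

    import requests

    SOLR = "http://localhost:8983/solr"  # assumed host; adjust to your setup

    def create_client_core(client_id: str) -> None:
        # One core per client via the Core Admin API; 'client_conf' is a
        # hypothetical shared configset placed in Solr's configsets directory.
        resp = requests.get(
            f"{SOLR}/admin/cores",
            params={
                "action": "CREATE",
                "name": f"client_{client_id}",
                "configSet": "client_conf",
            },
            timeout=30,
        )
        resp.raise_for_status()

    # e.g. provision all 500 clients
    for i in range(500):
        create_client_core(f"{i:03d}")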

How much physical memory might it consume? Please help with this.
Thank you

Re: Max number of cores in solr

Posted by Pure Host - Wolfgang Freudenberger <w....@pure-host.de>.
We run Solr with replication (n=2) and happily torture it with 1500+
cores; each core contains at least 10,000 docs, and many of them hold
over a million. It works.


We have 256 GB of RAM in the servers; the allocation for Solr is 140 GB.
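
For reference, a heap that size is normally set in Solr's include
script. A minimal sketch of solr.in.sh, assuming a Linux install (the
path varies by installation method; the 140g value is from the message
above):

    # solr.in.sh -- often under /etc/default/ on package installs
    SOLR_HEAP="140g"   # sets both -Xms and -Xmx for the Solr JVM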

Mit freundlichem Gruß / kind regards

Wolfgang Freudenberger
Pure Host IT-Services
Münsterstr. 14
48341 Altenberge
GERMANY
Tel.: (+49) 25 71 - 99 20 170
Fax: (+49) 25 71 - 99 20 171

Umsatzsteuer ID DE259181123

Find out about our full range of services at www.pure-host.de

On 28.08.2019 at 08:55, Vignan Malyala wrote:
> Hi
> I'm planning to create a separate core for each of my clients in Solr.
> Can I create around 500 cores in Solr? Is it a good idea?
> Each client currently has around 100,000 records on average.
>
> How much physical memory might it consume? Please help with this.
> Thank you
>


Re: Max number of cores in solr

Posted by Shawn Heisey <ap...@elyograg.org>.
On 8/28/2019 12:55 AM, Vignan Malyala wrote:
> I'm planning to create a separate core for each of my clients in Solr.
> Can I create around 500 cores in Solr? Is it a good idea?
> Each client currently has around 100,000 records on average.

There is no limit that I know of to the number of cores.  You're only 
limited by system resources.  That many cores will have a lot of files 
to open, and a lot of threads, so you would definitely need to increase 
the OS limits on file handles and processes.
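
On Linux those limits are commonly raised in
/etc/security/limits.conf. A sketch, assuming Solr runs as the user
'solr' (65000 is a commonly recommended floor for Solr; size it to
your core count):

    # /etc/security/limits.conf
    solr  soft  nofile  65000
    solr  hard  nofile  65000
    solr  soft  nproc   65000
    solr  hard  nproc   65000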

Solr startup with that many cores could take a very long time.  If you 
run SolrCloud, I would say that you should find a way to run fewer 
indexes -- SolrCloud begins to have scalability problems with only a few 
hundred.
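
One common way to run fewer indexes is to share a single index across
clients and isolate tenants with a filter query. A minimal sketch,
assuming a shared core named 'clients' with a 'client_id' field (both
hypothetical):

    import requests

    SOLR = "http://localhost:8983/solr"  # assumed; adjust to your deployment

    def search_for_client(client_id: str, user_query: str) -> dict:
        # One shared index; tenants are isolated by a filter query, which
        # Solr caches in its filterCache, so repeat queries stay cheap.
        resp = requests.get(
            f"{SOLR}/clients/select",
            params={"q": user_query, "fq": f"client_id:{client_id}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()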

> How much physical memory might it consume? Please help with this.
> Thank you

500 cores each with 100,000 documents is only 50 million total
documents.  This isn't very big, but you will need plenty of resources.

The most important resource for good performance will be memory.  And we 
can't tell you how much you'll need.  That will depend on exactly how 
you use Solr and the nature of your data.  I've personally run several 
cores totaling about 80 million documents on 8 GB of heap and 64 GB of 
total system memory, which left only enough memory to cache about a 
third of the total index size.  Other indexes can struggle with just a 
few million documents on the same hardware.
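
To make that arithmetic concrete, a rough sketch using the numbers
above (the index-size figure is inferred here, not stated in the
message):

    heap_gb = 8                         # JVM heap given to Solr
    total_ram_gb = 64                   # total system memory
    cache_gb = total_ram_gb - heap_gb   # roughly available as OS page cache
    cached_fraction = 1 / 3             # "about a third of the index"

    implied_index_gb = cache_gb / cached_fraction
    print(f"~{cache_gb} GB of cache over ~{implied_index_gb:.0f} GB of index")
    # -> ~56 GB of cache over ~168 GB of index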

https://cwiki.apache.org/confluence/display/solr/SolrPerformanceProblems#RAM

https://lucidworks.com/post/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/

Thanks,
Shawn