Posted to solr-user@lucene.apache.org by Harish Reddy <hr...@3clogic.com> on 2014/02/23 07:17:09 UTC

Fwd: configuration for heavy system

Hi,
We are testing Solr.
We have a document with some 100 indexes, and there are around 10 million
records. It is failing: queries either get stuck or time out.

Is this indexing job possible with Solr?
If yes, what hardware and Solr configuration should we use, and how many
nodes would be optimal?
Currently I am running Solr on four nodes with the config numShards=2, on
a 16 GB machine.

If no, how many indexes/records can Solr handle without issues?
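[For reference, one way to spread an index over more shards in SolrCloud
(as of the 4.x line current when this was posted) is the Collections API.
A sketch only: the collection name and counts below are placeholders, and
it assumes a SolrCloud cluster already running on localhost:8983.]

```shell
# Create a collection with 4 shards and 2 replicas per shard via the
# Collections API (placeholder name "records"; adjust counts to your
# cluster). maxShardsPerNode must allow the shards to fit on your nodes.
curl 'http://localhost:8983/solr/admin/collections?action=CREATE&name=records&numShards=4&replicationFactor=2&maxShardsPerNode=2'
```

More shards spread query and indexing load across nodes, at the cost of
extra per-node memory; verify the exact parameters against the reference
guide for your Solr version.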

Re: configuration for heavy system

Posted by Erick Erickson <er...@gmail.com>.
You haven't told us anything about _how_ you're
trying to index this document nor what its format
is. Nor what "100 indexes and around 10 million
records" means. 1B total records? 10M total records?

Solr easily handles tens of millions of records on a single
decent-sized node; I've seen between 50M and 300M....
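[Editor's note: since the reply asks _how_ the documents are being
indexed, here is a minimal sketch of the usual approach at this scale:
streaming documents to Solr's JSON update handler in batches rather than
one at a time. The collection name, field names, and batch size below are
hypothetical; it assumes a Solr instance reachable on localhost:8983.]

```python
import json
import urllib.request

# Hypothetical collection name; commit is left to the server's
# autoCommit settings rather than forced per request.
SOLR_URL = "http://localhost:8983/solr/records/update?commit=false"

def batches(docs, size=1000):
    """Yield successive lists of at most `size` documents."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def index_all(docs, size=1000):
    """POST each batch to Solr's JSON update handler."""
    for batch in batches(docs, size):
        req = urllib.request.Request(
            SOLR_URL,
            data=json.dumps(batch).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # raises on HTTP error

if __name__ == "__main__":
    # Hypothetical records; 10M docs would stream through the same path.
    docs = ({"id": str(i), "title_s": "record %d" % i} for i in range(5000))
    # index_all(docs)  # uncomment with a running Solr instance
    print(sum(1 for _ in batches(docs, 1000)))  # → 5
```

Batching (roughly 100-1000 docs per request) keeps request overhead low
without building one enormous payload in memory.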

Perhaps you should review:

http://wiki.apache.org/solr/UsingMailingLists

Best,
Erick


On Sat, Feb 22, 2014 at 10:17 PM, Harish Reddy <hr...@3clogic.com> wrote:

> Hi,
> We are testing Solr.
> We have a document with some 100 indexes, and there are around 10 million
> records. It is failing: queries either get stuck or time out.
>
> Is this indexing job possible with Solr?
> If yes, what hardware and Solr configuration should we use, and how many
> nodes would be optimal?
> Currently I am running Solr on four nodes with the config numShards=2, on
> a 16 GB machine.
>
> If no, how many indexes/records can Solr handle without issues?
>