Posted to solr-user@lucene.apache.org by ravicv <ra...@gmail.com> on 2012/03/21 09:10:26 UTC

Best way to index huge data quickly in solr multi core configuration

Hi

I am using Oracle Exadata as my DB. I want to index nearly 4 crore (40 million) rows. I
have tried specifying a batchsize of 10000 and also leaving batchsize unset, but both
tests take nearly the same time.

Could anyone suggest the best way to index huge data quickly?
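
To show what I mean by batchsize, here is a simplified sketch of batched indexing with
SolrJ and plain JDBC. This is not my actual code; the Solr URL, connection string, query,
and field names below are only placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class BulkIndexer {
    public static void main(String[] args) throws Exception {
        // Placeholder Solr core URL
        HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr/core0");

        // Placeholder Oracle connection details and query
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/svc", "user", "password");
        Statement st = con.createStatement();
        st.setFetchSize(10000); // rows fetched from the DB per round trip (the "batchsize" on the JDBC side)
        ResultSet rs = st.executeQuery("SELECT id, title FROM big_table");

        List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
        while (rs.next()) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", rs.getString("id"));
            doc.addField("title", rs.getString("title"));
            batch.add(doc);
            if (batch.size() == 10000) { // send documents to Solr in batches as well
                solr.add(batch);
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            solr.add(batch);
        }
        solr.commit(); // one commit at the end instead of committing per batch
        rs.close();
        st.close();
        con.close();
    }
}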

Thanks
Ravi


Re: Best way to index huge data quickly in solr multi core configuration

Posted by Erick Erickson <er...@gmail.com>.
First question: What's taking the time? The data acquisition or the
actual indexing process? Until you answer that question, you don't
know where to spend your efforts....
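
One rough way to check (an untested sketch using plain JDBC; the connection details and
query are placeholders, substitute whatever your import actually runs) is to time a
fetch-only pass that never touches Solr:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchOnlyTimer {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details and query -- use the same ones as the real import
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/svc", "user", "password");
        Statement st = con.createStatement();
        st.setFetchSize(10000);

        long start = System.currentTimeMillis();
        ResultSet rs = st.executeQuery("SELECT id, title FROM big_table");
        long rows = 0;
        while (rs.next()) {
            // Read every column the import would read, but do nothing with the data
            rs.getString("id");
            rs.getString("title");
            rows++;
        }
        System.out.println(rows + " rows fetched in "
                + (System.currentTimeMillis() - start) + " ms");
        rs.close();
        st.close();
        con.close();
    }
}

If that fetch-only run takes nearly as long as the whole import, the bottleneck is the
database/network side and Solr tuning won't buy you much; if it finishes much faster,
look at the indexing side instead.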

Best
Erick

On Wed, Mar 21, 2012 at 4:10 AM, ravicv <ra...@gmail.com> wrote:
> Hi
>
> I am using Oracle Exadata as my DB. I want to index nearly 4 crore (40 million) rows. I
> have tried specifying a batchsize of 10000 and also leaving batchsize unset, but both
> tests take nearly the same time.
>
> Could anyone suggest the best way to index huge data quickly?
>
> Thanks
> Ravi