Posted to solr-user@lucene.apache.org by mskeerthi <ms...@gmail.com> on 2014/07/01 12:57:53 UTC
Out of Memory when I download 5 million records from SQL Server to Solr
I have to import 5 million records from SQL Server into a single Solr index. I get the exception below after about 1 million records have been imported. Is there any configuration, or an alternative approach, for importing from SQL Server into Solr?
Below is the exception I am getting in Solr:
org.apache.solr.common.SolrException; auto commit error...: java.lang.IllegalStateException: this writer hit an OutOfMemoryError; cannot commit
    at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2915)
    at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3096)
    at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3063)
    at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:578)
    at org.apache.solr.update.CommitTracker.run(CommitTracker.java:216)
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(Unknown Source)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
--
View this message in context: http://lucene.472066.n3.nabble.com/Out-of-Memory-when-i-downdload-5-Million-records-from-sqlserver-to-solr-tp4144949.html
Sent from the Solr - User mailing list archive at Nabble.com.
Re: Out of Memory when I download 5 million records from SQL Server to Solr
Posted by Shawn Heisey <so...@elyograg.org>.
On 7/1/2014 4:57 AM, mskeerthi wrote:
> I have to import 5 million records from SQL Server into a single Solr index. I get the exception below after about 1 million records have been imported. Is there any configuration, or an alternative approach, for importing from SQL Server into Solr?
>
> Below is the exception I am getting in Solr:
> org.apache.solr.common.SolrException; auto commit error...: java.lang.IllegalStateException: this writer hit an OutOfMemoryError; cannot commit
JDBC has a bad habit of defaulting to a mode where it will try to load
the entire SQL result set into RAM. Different JDBC drivers have
different ways of dealing with this problem. For Microsoft SQL Server,
here's a guide:
https://wiki.apache.org/solr/DataImportHandlerFaq#I.27m_using_DataImportHandler_with_MS_SQL_Server_database_with_sqljdbc_driver._DataImportHandler_is_going_out_of_memory._I_tried_adjustng_the_batchSize_values_but_they_don.27t_seem_to_make_any_difference._How_do_I_fix_this.3F
If you have trouble with that really long URL in your mail client, just
visit the main FAQ page and click on the link for SQL Server:
https://wiki.apache.org/solr/DataImportHandlerFaq
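For reference, the fix described on that FAQ page amounts to adding the sqljdbc connection properties selectMethod=cursor and responseBuffering=adaptive to the JDBC URL in the DataImportHandler config, so the driver streams rows instead of buffering the whole result set. A minimal sketch of a data-config.xml; the host, database, credentials, entity query, and field names here are placeholders, not anything from the thread:

```
<dataConfig>
  <!-- selectMethod=cursor and responseBuffering=adaptive stop the MS JDBC
       driver from loading the entire result set into RAM (per the Solr DIH FAQ) -->
  <dataSource type="JdbcDataSource"
              driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
              url="jdbc:sqlserver://dbhost;databaseName=mydb;responseBuffering=adaptive;selectMethod=cursor"
              user="solruser"
              password="secret"/>
  <document>
    <entity name="record" query="SELECT id, title, body FROM records">
      <field column="id" name="id"/>
    </entity>
  </document>
</dataConfig>
```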
Thanks,
Shawn
Re: Out of Memory when I download 5 million records from SQL Server to Solr
Posted by IJ <ja...@gmail.com>.
We faced similar problems on our side. We found it more reliable to extract all the data from the database into a flat file first, and then use a Java program to bulk index into Solr from that file via the SolrJ API.
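The flat-file approach above can be sketched roughly as follows. The post suggests SolrJ; to keep this illustration free of the solr-solrj dependency, it instead POSTs JSON batches to Solr's /update handler, which has the same bulk-loading effect. The URL, the id/title fields, the tab-separated input format, and the 1000-document batch size are all assumptions for the sketch, not anything from the thread:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

class BulkIndexer {
    // Escape a value for inclusion in a JSON string literal.
    static String jsonEscape(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            if (c == '"' || c == '\\') sb.append('\\').append(c);
            else if (c == '\n') sb.append("\\n");
            else if (c < 0x20) sb.append(String.format("\\u%04x", (int) c));
            else sb.append(c);
        }
        return sb.toString();
    }

    // Number of HTTP requests needed for `total` docs at `batchSize` docs each.
    static int batchCount(int total, int batchSize) {
        return (total + batchSize - 1) / batchSize;
    }

    // Turn one tab-separated line (id<TAB>title) into a Solr JSON document.
    static String toJsonDoc(String line) {
        String[] f = line.split("\t", 2);
        return "{\"id\":\"" + jsonEscape(f[0]) + "\",\"title\":\"" + jsonEscape(f[1]) + "\"}";
    }

    // POST one JSON array of documents to Solr's update handler.
    static void postBatch(String solrUpdateUrl, List<String> docs) throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(solrUpdateUrl).openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "application/json");
        byte[] body = ("[" + String.join(",", docs) + "]").getBytes(StandardCharsets.UTF_8);
        con.getOutputStream().write(body);
        if (con.getResponseCode() != 200) {
            throw new IOException("Solr returned HTTP " + con.getResponseCode());
        }
        con.disconnect();
    }

    public static void main(String[] args) throws IOException {
        if (args.length == 0) {
            System.out.println("usage: java BulkIndexer <tsv-file>");
            return;
        }
        String url = "http://localhost:8983/solr/collection1/update?commit=true"; // placeholder
        List<String> batch = new ArrayList<String>();
        try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
            String line;
            while ((line = in.readLine()) != null) {
                batch.add(toJsonDoc(line));
                if (batch.size() == 1000) { postBatch(url, batch); batch.clear(); }
            }
        }
        if (!batch.isEmpty()) postBatch(url, batch);
    }
}
```

Batching keeps each request (and the memory held on both client and server) small, which is the point of going through a flat file instead of one giant SQL result set.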
Re: Out of Memory when I download 5 million records from SQL Server to Solr
Posted by Aman Tandon <am...@gmail.com>.
You can try giving some more memory to Solr.
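For the Solr 4.x of that era, started with start.jar, the heap is raised with the JVM's -Xmx option. A command-line sketch only; the 4g value is an arbitrary example to adjust for your hardware, not a recommendation:

```
# Example only: start Solr 4.x with a larger heap
java -Xms2g -Xmx4g -jar start.jar
```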
On Jul 1, 2014 4:41 PM, "mskeerthi" <ms...@gmail.com> wrote:
> I have to import 5 million records from SQL Server into a single Solr index. I get the exception below after about 1 million records have been imported. Is there any configuration, or an alternative approach, for importing from SQL Server into Solr?
>
> Below is the exception I am getting in Solr:
> org.apache.solr.common.SolrException; auto commit error...: java.lang.IllegalStateException: this writer hit an OutOfMemoryError; cannot commit