Posted to solr-user@lucene.apache.org by Manisha Rahatadkar <ma...@AnjuSoftware.com> on 2020/10/01 03:36:35 UTC

Solr 7.7 Indexing issue

Hello all

We are using Apache Solr 7.7 on the Windows platform. The data is synced to Solr in batches using a SolrNet commit. The documents are very large (~0.5 GB on average) and the total data size is ~200 GB, so Solr indexing is taking a long time. Because the Solr commit is done as part of an API call, the API calls fail when document indexing has not completed.

  1.  What is your advice on syncing such a large volume of data to Solr?
  2.  Because of the search requirements, about 8 fields are defined as text fields.
  3.  Currently SOLR_JAVA_MEM is set to 2 GB. Is that enough for such a large volume of data? (IF "%SOLR_JAVA_MEM%"=="" set SOLR_JAVA_MEM=-Xms2g -Xmx2g)
  4.  How should Solr be set up in production on Windows? Currently it is set up as a standalone instance, and the client has been asked to take a backup of the drive. Is there a better way to do this? How should we plan for disaster recovery?
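For reference on point 3, if the heap does need to be raised, we assume it would be done in solr.in.cmd rather than editing solr.cmd itself; the value below is illustrative, not a recommendation we have tested:

```shell
REM solr.in.cmd (Windows) -- overrides the default in bin\solr.cmd.
REM 8 GB is a placeholder; actual sizing would depend on workload.
set SOLR_JAVA_MEM=-Xms8g -Xmx8g
```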

Thanks in advance.

Regards
Manisha Rahatadkar


Confidentiality Notice
====================
This email message, including any attachments, is for the sole use of the intended recipient and may contain confidential and privileged information. Any unauthorized view, use, disclosure or distribution is prohibited. If you are not the intended recipient, please contact the sender by reply email and destroy all copies of the original message. Anju Software, Inc. 4500 S. Lakeshore Drive, Suite 620, Tempe, AZ USA 85282.