Posted to java-user@lucene.apache.org by Ian Lea <ia...@gmail.com> on 2010/05/18 11:15:41 UTC
Re: Deciding memory requirements for Lucene indexes proactively -- How to?
> Is there a way (perhaps a formula) to accurately
> judge the memory requirement for a Lucene index?
> (Maybe based on the number of documents, index
> size, etc.?)
The short answer is no, although you can estimate some of it from
the number of fields, unique terms and so on. Sorting will use
memory - maybe a lot, since Lucene's FieldCache holds one cached
value per document for each field you sort on.
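As a rough illustration of why sorting can dominate: the FieldCache allocates one entry per document in the index, whether or not a document matches the query. A back-of-envelope sketch follows; the document count and per-value byte sizes are assumptions for illustration, not measurements of any real index.

```java
// Back-of-envelope estimate of the memory a sort field can consume via
// Lucene's FieldCache: roughly one cached value per document in the index.
public class SortMemoryEstimate {

    // FieldCache keeps an array with one entry per document, regardless
    // of how many documents actually match the query.
    static long sortCacheBytes(long numDocs, long bytesPerValue) {
        return numDocs * bytesPerValue;
    }

    public static void main(String[] args) {
        long numDocs = 10000000L;   // assumed index size: 10M documents
        long bytesPerInt = 4;       // sorting on an int field
        long avgStringBytes = 24;   // assumed average cost per doc for a String sort field

        System.out.println("int sort field:    ~"
                + sortCacheBytes(numDocs, bytesPerInt) / (1024 * 1024) + " MB");
        System.out.println("string sort field: ~"
                + sortCacheBytes(numDocs, avgStringBytes) / (1024 * 1024) + " MB");
    }
}
```

Multiply by every field you sort (or facet) on, and it is easy to see how two webapps that were fine in separate JVMs can blow a single shared heap.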
> Reason I am asking is that we had two indexes
> running on separate Tomcat instances, and we decided
> to move both of these webapps (Solr) to a single Tomcat
> for more effective memory sharing. However, our JVM
> memory allocation was not accurate enough and the
> indexes started hitting OutOfMemoryErrors in
> our production environment.
>
> It would be very helpful if we could identify the
> resource requirements proactively.
>
> Any help on the matter much appreciated.
>
> We use: Solr 1.4, Java 1.6.0_20
You might get better answers on the Solr list.
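On the immediate OOM problem: a merged Tomcat usually needs a heap close to the sum of what the two separate instances were using. A hedged sketch of setting that via CATALINA_OPTS; the file path is standard Tomcat, but the heap values are placeholders you would tune from your own observed usage:

```shell
# Hypothetical $CATALINA_HOME/bin/setenv.sh for the merged Tomcat instance
# (create the file if it does not exist; Tomcat sources it at startup).
# The -Xms/-Xmx values below are placeholders - start near the sum of the
# two original heaps and adjust from monitoring (e.g. jstat/jmap).
export CATALINA_OPTS="-Xms1g -Xmx2g -XX:+HeapDumpOnOutOfMemoryError"
```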
--
Ian.
---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-user-help@lucene.apache.org