Posted to dev@lucene.apache.org by "Shah, Vineel" <vi...@hotjobs.com> on 2003/04/03 22:04:18 UTC

RE: Lucene stress Testing (Searchable Index of 40 MB)

My system:
My search code runs in a JSP hosted by Tomcat and calls Lucene as a library. I'm currently using a development build somewhere between 1.2 and 1.3 RC1. My index is 200 MB and roughly 270,000 records. The index is stored in a disk directory and periodically read into RAM.
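For reference, the reload looks roughly like the sketch below. It is not my exact code: the path and analyzer are placeholders, exception handling is omitted, and it uses IndexWriter.addIndexes() to copy the disk index into a RAMDirectory.

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;
    import org.apache.lucene.store.RAMDirectory;

    // Open the existing on-disk index (create=false so nothing is erased).
    Directory diskDir = FSDirectory.getDirectory("/path/to/index", false);

    // Build an in-memory copy by merging the disk index into a RAMDirectory.
    RAMDirectory ramDir = new RAMDirectory();
    IndexWriter ramWriter = new IndexWriter(ramDir, new StandardAnalyzer(), true);
    ramWriter.addIndexes(new Directory[] { diskDir });
    ramWriter.close();

    // Searchers opened on ramDir now read entirely from memory.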

1. To get it to run at all, I had to change my Java runtime options to give the VM more RAM. I'm using -Xmx512m -Xms512m, which sets both the minimum and maximum heap to 512 MB when Tomcat invokes the JVM.

2. Incremental indexing, meaning adding and deleting candidates, works fine in RAM.

3. Optimizing the index after updating doesn't work. When my dataset was small (30 MB) it was fine, but when I moved to the 200 MB set, it choked. I could probably raise the RAM allotment even more to make it work, but I have limits on that machine.

4. I have a separate, command-line process that updates the disk index. I run it with a large RAM allotment as well, and it handles the optimization just fine. Every 30 updates, I reload the index into RAM.
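Each update cycle in that command-line process looks roughly like the sketch below. Again, this is illustrative rather than my actual code: the field names ("id", "contents") and values are made up, and exception handling is omitted.

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    Directory dir = FSDirectory.getDirectory("/path/to/index", false);

    // Deletes go through IndexReader in this version of Lucene:
    // remove the old copy of the record before re-adding it.
    IndexReader reader = IndexReader.open(dir);
    reader.delete(new Term("id", "12345"));
    reader.close();

    // Re-add the updated record, then optimize. optimize() is the step
    // that needs the large heap once the index reaches 200 MB.
    IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), false);
    Document doc = new Document();
    doc.add(Field.Keyword("id", "12345"));
    doc.add(Field.Text("contents", "updated candidate text"));
    writer.addDocument(doc);
    writer.optimize();
    writer.close();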

Incidentally, I was getting 0.1 searches/sec on disk with 10 concurrent users and no delay between requests. When I searched a RAMDirectory instead, I started getting 8-9 searches/sec.
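The search code itself is the same either way; only the Directory behind the searcher changes. A stripped-down version, with a made-up field name and query, and exception handling omitted:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.queryParser.QueryParser;
    import org.apache.lucene.search.Hits;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;

    // Point the searcher at ramDir for in-memory searches, or at diskDir for disk.
    IndexSearcher searcher = new IndexSearcher(ramDir);
    Query query = QueryParser.parse("java developer", "contents", new StandardAnalyzer());
    Hits hits = searcher.search(query);
    for (int i = 0; i < hits.length(); i++) {
        System.out.println(hits.doc(i).get("id"));
    }
    searcher.close();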

I hope this helps.

Vineel Shah

-----Original Message-----
From: Amit Kapur [mailto:amitkapur@newgen.co.in]
Sent: Monday, March 31, 2003 6:21 AM
To: lucene-dev@jakarta.apache.org
Subject: Lucene stress Testing (Searchable Index of 40 MB)




hi all,

I am using Lucene to index and search documents in a very large index in our document management system (about 50,000 documents).
I am using the following code:
import java.io.IOException;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

try
{
    // strDir points to the FTS directory; create=false because I want to
    // update an existing index and not erase its contents.
    // The current size of the index in the FTS directory is about 40 MB.
    Directory oDirectory = FSDirectory.getDirectory(strDir, false);
    IndexReader oReader = IndexReader.open(oDirectory);
}
catch (IOException ioe)
{
    // ioe.getMessage() gives the following message:
    // "F:\Program Files\OmniDocs Server\FTS\segments (The system cannot find the file specified)"
}
Initially, when the size of the index in the FTS directory was about 30 MB, adding documents to the index worked fine, but ever since the index size increased, the IndexReader will not open. I have also looked through the code of FSDirectory.java but haven't been able to figure anything out.
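If it helps with diagnosis, a quick check along the lines below (reusing strDir from above, exception handling omitted) should show whether Lucene can still see a valid index, i.e. whether the segments file is actually present in that directory.

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    Directory oDirectory = FSDirectory.getDirectory(strDir, false);

    // indexExists() only returns true if the segments file is present.
    System.out.println("index exists: " + IndexReader.indexExists(oDirectory));

    // The segments file can also be checked for directly.
    System.out.println("segments present: " + oDirectory.fileExists("segments"));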

Kindly help asap.

Regards
Amit Kapur
Software Developer,
Newgen Software Technologies Ltd.
Okhla
Delhi, INDIA.