Posted to dev@lucene.apache.org by "Daniel Naber (JIRA)" <ji...@apache.org> on 2006/01/21 21:16:43 UTC
[jira] Commented: (LUCENE-488) adding docs with large (binary) fields of 5mb causes OOM regardless of heap size
[ http://issues.apache.org/jira/browse/LUCENE-488?page=comments#action_12363524 ]
Daniel Naber commented on LUCENE-488:
-------------------------------------
writer.setMaxBufferedDocs(5); solves the OOM error, at least for binary fields of 5MB. So with writer.setMaxBufferedDocs(1) you can probably add documents that are almost as big as the JVM's maximum heap size.
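The workaround trades indexing throughput for memory: the writer buffers up to maxBufferedDocs documents in RAM before flushing a segment, so the transient heap needed scales roughly with maxBufferedDocs times the stored field size (the 1.9-era default was 10 buffered documents). A minimal sketch of that arithmetic; the class and method names here are illustrative, not Lucene API:

```java
public class BufferedDocsHeapSketch {
    // Rough lower bound on transient heap: the writer holds every buffered
    // document's stored field bytes in memory until it flushes a segment.
    // (Illustrative estimate only -- real indexing adds further overhead.)
    static long minTransientHeapBytes(int maxBufferedDocs, long fieldBytes) {
        return (long) maxBufferedDocs * fieldBytes;
    }

    public static void main(String[] args) {
        long fiveMb = 5L * 1024 * 1024;
        // With the default of 10 buffered docs, 5MB binary fields need at
        // least ~50MB of headroom just for the raw field bytes.
        System.out.println(minTransientHeapBytes(10, fiveMb)); // 52428800
        // Flushing after every document caps the bound at one field's size.
        System.out.println(minTransientHeapBytes(1, fiveMb));  // 5242880
    }
}
```

Note that 52428800 matches the "totalBytes Allocated" figure in the failing 5MB run below, which is consistent with the writer dying while buffering its tenth document.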
> adding docs with large (binary) fields of 5mb causes OOM regardless of heap size
> --------------------------------------------------------------------------------
>
> Key: LUCENE-488
> URL: http://issues.apache.org/jira/browse/LUCENE-488
> Project: Lucene - Java
> Type: Bug
> Versions: 1.9
> Environment: Linux asimov 2.6.6.hoss1 #1 SMP Tue Jul 6 16:31:01 PDT 2004 i686 GNU/Linux
> Reporter: Hoss Man
> Attachments: TestBigBinary.java
>
> as reported by George Washington in a message to java-user@lucene.apache.org with subject "Storing large text or binary source documents in the index and memory usage" around 2006-01-21, there seems to be a problem with adding docs containing really large fields.
> I'll attach a test case in a moment. Note that (for me) regardless of how big I make my heap size, and regardless of what value I set MIN_MB to, once it starts trying to make documents containing 5MB of data, it can only add 9 before it rolls over and dies.
> here's the output from the code as i will attach in a moment...
> [junit] Testsuite: org.apache.lucene.document.TestBigBinary
> [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 78.656 sec
> [junit] ------------- Standard Output ---------------
> [junit] NOTE: directory will not be cleaned up automatically...
> [junit] Dir: /tmp/org.apache.lucene.document.TestBigBinary.97856146.100iters.4mb
> [junit] iters completed: 100
> [junit] totalBytes Allocated: 419430400
> [junit] NOTE: directory will not be cleaned up automatically...
> [junit] Dir: /tmp/org.apache.lucene.document.TestBigBinary.97856146.100iters.5mb
> [junit] iters completed: 9
> [junit] totalBytes Allocated: 52428800
> [junit] ------------- ---------------- ---------------
> [junit] Testcase: testBigBinaryFields(org.apache.lucene.document.TestBigBinary): Caused an ERROR
> [junit] Java heap space
> [junit] java.lang.OutOfMemoryError: Java heap space
> [junit] Test org.apache.lucene.document.TestBigBinary FAILED
--
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see:
http://www.atlassian.com/software/jira
---------------------------------------------------------------------
To unsubscribe, e-mail: java-dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: java-dev-help@lucene.apache.org