Posted to users@opennlp.apache.org by Johnson J <jo...@gmail.com> on 2011/12/10 08:11:13 UTC

Open NLP out of memory error

Hi,

I am using the Document Categorizer tool to train on a corpus of nearly
200 MB, and I get an out-of-memory error in the GISModel class; this class
creates a two-dimensional array of floats as well as a two-dimensional
array of objects. I tried increasing the heap to -Xmx1550m, since I have
4 GB of RAM. Please tell me, is there any way to solve this error for a
large training text?

Thanks in advance,
Johnson J.
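
[For context, a minimal sketch of how such a training run might be set up
against the 1.5-era OpenNLP API. This is an editorial illustration, not code
from the thread; the file names, language code, and reliance on the default
cutoff and iteration count are placeholder assumptions.]

    // Hedged sketch: training a document categorizer with OpenNLP 1.5-era
    // classes.  "train.txt" is a placeholder: one whitespace-tokenized
    // document per line, each line starting with its category label.
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import opennlp.tools.doccat.DoccatModel;
    import opennlp.tools.doccat.DocumentCategorizerME;
    import opennlp.tools.doccat.DocumentSample;
    import opennlp.tools.doccat.DocumentSampleStream;
    import opennlp.tools.util.ObjectStream;
    import opennlp.tools.util.PlainTextByLineStream;

    public class DoccatTraining {
        public static void main(String[] args) throws Exception {
            ObjectStream<String> lines =
                new PlainTextByLineStream(new FileInputStream("train.txt"), "UTF-8");
            ObjectStream<DocumentSample> samples = new DocumentSampleStream(lines);
            // Training is where the GISModel arrays are built and memory peaks.
            DoccatModel model = DocumentCategorizerME.train("en", samples);
            samples.close();
            model.serialize(new FileOutputStream("en-doccat.bin"));
        }
    }

[Run with something like "java -Xmx4096m DoccatTraining" to apply the heap
advice discussed in the replies below.]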

Re: Open NLP out of memory error

Posted by James Kosin <ja...@gmail.com>.
Johnson,

More RAM will help, yes, but it isn't strictly necessary.

(1)  The value passed to Java with -Xmx is a maximum heap size.  Java
doesn't allocate that much up front; the heap only grows as it is needed
(see the sketch after this message).
(2)  Like almost all applications, once Java exhausts physical memory, the
operating system's memory management will fall back on virtual memory and
swap to disk.  That is admittedly less efficient, but as long as you can
spare the time it shouldn't cause too much trouble.

Good Luck,
James
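
[To illustrate point (1) above, a minimal editorial sketch, not code from
the thread: the committed heap starts well below the -Xmx ceiling and only
grows on demand.]

    // Hedged sketch: run with e.g. "java -Xmx1550m HeapGrowth" and note that
    // the committed heap (totalMemory) starts far below the -Xmx ceiling
    // (maxMemory); the JVM commits more only as allocations require it.
    public class HeapGrowth {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            System.out.println("committed: " + rt.totalMemory() / (1024 * 1024)
                    + " MB, ceiling: " + rt.maxMemory() / (1024 * 1024) + " MB");
        }
    }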




Re: Open NLP out of memory error

Posted by Johnson J <jo...@gmail.com>.
Thanks James, I think I need to increase my RAM size for this.



Re: Open NLP out of memory error

Posted by Jason Baldridge <ja...@gmail.com>.
I've trained with heaps of up to 16GB.


-- 
Jason Baldridge
Associate Professor, Department of Linguistics
The University of Texas at Austin
http://www.jasonbaldridge.com
http://twitter.com/jasonbaldridge

Re: Open NLP out of memory error

Posted by James Kosin <ja...@gmail.com>.
Johnson,

I usually use -Xmx4096m when training, though you might be able to get
away with a little less.  I've seen training use up to about 2-3 GB of memory.

James
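
[A minimal editorial sketch, not code from the thread, of checking the heap
ceiling before training, so a run launched without a large enough -Xmx fails
fast instead of dying partway through. The ~3 GB threshold simply echoes
James's estimate above.]

    // Hedged sketch: verify the -Xmx ceiling before starting an expensive
    // training run.
    public class HeapCheck {
        public static void main(String[] args) {
            long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
            if (maxMb < 3 * 1024) {
                System.err.println("Heap ceiling is only " + maxMb
                        + " MB; restart with a larger -Xmx, e.g. -Xmx4096m.");
                System.exit(1);
            }
            // ... kick off training here ...
        }
    }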
