Posted to common-user@hadoop.apache.org by Peeyush Bishnoi <pe...@yahoo-inc.com> on 2008/01/31 19:22:02 UTC
Re: about the exception in mapreduce program?
You can also set the HBase heap size in the hbase-env.sh file itself,
which is located at $HBASE_HOME/conf/hbase-env.sh.
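For reference, the setting in that file is typically a single export; a minimal sketch (the 1000 MB value is only an example, not a recommendation):

```shell
# $HBASE_HOME/conf/hbase-env.sh
# Maximum heap for the HBase JVM, in MB (commented out by default).
export HBASE_HEAPSIZE=1000
```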
---
Peeyush Bishnoi
On Fri, 2008-02-01 at 10:55 +0530, Jaideep Dhok wrote:
> You can change the maximum memory used by the JVM with the -Xmx option. There
> is also a HADOOP_HEAPSIZE option in hadoop-env.sh, which you can increase.
>
> On Feb 1, 2008 10:22 AM, ma qiang <ma...@gmail.com> wrote:
>
> > Hi all:
> > I have run into the following problem:
> > My map function reads strings from a table in HBase, merges
> > several of them, and finally saves the merged strings into another
> > HBase table. The number of strings and their lengths are large.
> > After about ten minutes, Hadoop fails with an "out of memory, java
> > heap is not enough" error. When I test the program with small
> > strings there is no error, but once the number and length of the
> > strings grow, the error appears. I installed Hadoop in
> > non-distributed mode, and my computer has 2 GB of memory, which in
> > theory should be enough for this simple program.
> > Can anyone tell me why?
> > Thank you very much!
> >
> >
> > Best Wishes!
> >
>
>
>
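To make the quoted -Xmx advice concrete: HADOOP_HEAPSIZE in hadoop-env.sh sizes the Hadoop daemon JVMs, while the heap of the child JVMs that actually run map and reduce tasks is controlled by the mapred.child.java.opts property (whose default in Hadoop of this era was -Xmx200m). For an out-of-memory error inside a map function, the child-JVM setting is usually the one that matters. A sketch, assuming a stock hadoop-site.xml and an illustrative 512 MB heap:

```xml
<!-- $HADOOP_HOME/conf/hadoop-site.xml -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
```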