Posted to dev@hbase.apache.org by Mikhail Bautin <ba...@gmail.com> on 2012/02/10 21:48:55 UTC
how to increase Hadoop QA heap
Hello,
Does anyone know how to increase heap allocation for Hadoop QA runs, or at
least check the available amount of memory?
Thanks,
--Mikhail
Re: how to increase Hadoop QA heap
Posted by Ted Yu <yu...@gmail.com>.
May not be necessary. Mikhail has found the cause of the Out Of Memory error
in TestHFileBlock.
Cheers
Re: how to increase Hadoop QA heap
Posted by Stack <st...@duboce.net>.
I committed a patch which sets Hadoop QA to run with a 3 GB heap. Should I
have set MaxDirectMemorySize too?
St.Ack
Re: how to increase Hadoop QA heap
Posted by Ted Yu <yu...@gmail.com>.
I found that it was -Xmx2300m that caused the JVM error, not
-XX:MaxDirectMemorySize=200m.
The following settings allow the unit tests to run on my MacBook:
-d32 -XX:MaxDirectMemorySize=200m -enableassertions -Xmx1900m
FYI
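These numbers line up with a 32-bit JVM running out of contiguous address space: under -d32 the process cannot reserve a 2300 MB heap, while 1900 MB still fits. A quick way to see what a given flag combination actually grants is to print Runtime.maxMemory() (a local sketch, not part of any patch in this thread; the class name HeapProbe is made up):

```java
// Print the heap this JVM invocation was actually granted.
// Try it with the flags from the thread, e.g.:
//   java -d32 -Xmx1900m HeapProbe   (on a JVM that still accepts -d32)
public class HeapProbe {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("max heap granted: " + maxMb + " MB");
    }
}
```

If the VM refuses to start at all (as with -Xmx2300m above), the error appears before main() is ever reached, which is itself the diagnostic.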
Re: how to increase Hadoop QA heap
Posted by N Keywal <nk...@gmail.com>.
Hi,
If you want to check the resources available during test execution, you can
enhance org.apache.hadoop.hbase.ResourceChecker and log a message if
something looks wrong. There's a UnixOperatingSystemMXBean from which you
can get some of those numbers. This rule is executed before and after each
test method.
Cheers,
N.
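A minimal sketch of the kind of snapshot ResourceChecker could log, using the JDK-specific com.sun.management API (the class name ResourceSnapshot is illustrative, not an existing HBase class):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

import com.sun.management.UnixOperatingSystemMXBean;

// Illustrative per-process resource snapshot, along the lines of what
// ResourceChecker could log before and after each test method.
public class ResourceSnapshot {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // The Unix-specific bean is only available on Unix-like platforms.
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            System.out.println("open file descriptors: "
                + unix.getOpenFileDescriptorCount()
                + " / " + unix.getMaxFileDescriptorCount());
            System.out.println("free physical memory: "
                + unix.getFreePhysicalMemorySize() / (1024 * 1024) + " MB");
        } else {
            System.out.println("UnixOperatingSystemMXBean not available here");
        }
    }
}
```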
Re: how to increase Hadoop QA heap
Posted by Ted Yu <yu...@gmail.com>.
Mikhail:
Would this help:
http://stackoverflow.com/questions/6878883/how-do-i-determine-maxdirectmemorysize-on-a-running-jvm
?
I tried to set -XX:MaxDirectMemorySize.
According to
http://stackoverflow.com/questions/3773775/default-for-xxmaxdirectmemorysize
the default is 64 MB.
But even if I set -XX:MaxDirectMemorySize=64m, I got the following on my
MacBook:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
So some expert advice is needed :-)
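On the first question (determining MaxDirectMemorySize on a running JVM), the HotSpot diagnostic MXBean can report the effective value without reflection tricks (a sketch, Java 7+; the class name DirectMemoryCheck is made up):

```java
import java.lang.management.ManagementFactory;

import com.sun.management.HotSpotDiagnosticMXBean;

// Read the effective MaxDirectMemorySize of the current JVM.
// A value of "0" means the flag was not set on the command line and the
// JVM falls back to its internal default.
public class DirectMemoryCheck {
    public static void main(String[] args) {
        HotSpotDiagnosticMXBean hotspot =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        String value = hotspot.getVMOption("MaxDirectMemorySize").getValue();
        System.out.println("MaxDirectMemorySize = " + value + " bytes");
    }
}
```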
Re: how to increase Hadoop QA heap
Posted by Mikhail Bautin <ba...@gmail.com>.
@Ted: thanks for the suggestion.
Maybe I should have worded my question differently. I am interested in the
actual amount of memory available on the Hadoop QA machines, because I see
out-of-memory errors in native memory allocation (not part of the Java heap)
that only happen on Hadoop QA.
Perhaps we should define a "reference configuration" for the HBase test
suite. E.g., do we expect all unit tests to pass on a 2 GB box, a 4 GB box,
etc.?
Thanks,
--Mikhail
Re: how to increase Hadoop QA heap
Posted by Ted Yu <yu...@gmail.com>.
This should do:
Index: pom.xml
===================================================================
--- pom.xml (revision 1242915)
+++ pom.xml (working copy)
@@ -350,7 +350,7 @@
         <configuration>
           <forkedProcessTimeoutInSeconds>900</forkedProcessTimeoutInSeconds>
-          <argLine>-enableassertions -Xmx1900m -Djava.security.egd=file:/dev/./urandom</argLine>
+          <argLine>-d32 -enableassertions -Xmx2300m -Djava.security.egd=file:/dev/./urandom</argLine>
           <redirectTestOutputToFile>true</redirectTestOutputToFile>
         </configuration>
       </plugin>