Posted to dev@avro.apache.org by Ken Krugler <kk...@transpac.com> on 2010/06/24 19:20:25 UTC
Setting maxmemory for JUnit tests
I was running into an out-of-memory error while running the
org.apache.avro.mapred.TestWordCountSpecific and TestWordCountGeneric
tests:
[junit] java.lang.OutOfMemoryError: Java heap space
[junit]     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:781)
[junit]     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:350)
[junit]     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
[junit]     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
I added a maxmemory attribute to the junit task in the Java build.xml,
and it seemed to fix the problem:
<macrodef name="test-runner">
  <attribute name="files.location" />
  <attribute name="tests.pattern" />
  <attribute name="test.dir" default="${test.java.build.dir}" />
  <sequential>
    <junit showoutput="yes"
           printsummary="withOutAndErr"
           haltonfailure="no"
           fork="yes"
           maxmemory="256m"
I can file a Jira issue and a patch, but it seems pretty trivial...
-- Ken
--------------------------------------------
Ken Krugler
+1 530-210-6378
http://bixolabs.com
e l a s t i c w e b m i n i n g
Re: Setting maxmemory for JUnit tests
Posted by Doug Cutting <cu...@apache.org>.
It would be good to find out why it runs out of memory. It really
shouldn't take huge gobs of memory to run a LocalJobRunner and a local,
file-based word count over a few lines of text. I wonder if perhaps
something like io.sort.mb causes big buffer pre-allocations that could
be reduced for these tests with a simple setting.
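A minimal sketch of that kind of setting (assuming the tests pick up a
standard Hadoop configuration file; the 1 MB value is only an
illustration, not a verified fix):

```xml
<!-- Hypothetical test override: shrink the map-side sort buffer,
     which io.sort.mb pre-allocates (default 100 MB per map task),
     so the forked test JVM stays well under the heap limit. -->
<property>
  <name>io.sort.mb</name>
  <value>1</value>
</property>
```

Or equivalently, if the tests build their own JobConf, something like
conf.setInt("io.sort.mb", 1) before submitting the job.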
Doug
On 06/24/2010 10:20 AM, Ken Krugler wrote:
> I was running into an out-of-memory error while running the
> org.apache.avro.mapred.TestWordCountSpecific and TestWordCountGeneric
> tests: [...]