Posted to user@nutch.apache.org by Gal Nitzan <gn...@usa.net> on 2007/01/19 16:57:01 UTC
java.lang.OutOfMemoryError - trunk
Thanks Sean,
I am getting out-of-memory errors. I have set the max heap for both Nutch and Hadoop to 2000 MB each, but it doesn't seem to have any effect: the OutOfMemoryError is thrown immediately after a task starts.

Any ideas?
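One thing worth checking (a guess, not a confirmed diagnosis): the heap you set for the Nutch/Hadoop daemons (e.g. via HADOOP_HEAPSIZE in hadoop-env.sh) does not apply to the child JVMs that actually run map/reduce tasks. Those are launched with the options in the mapred.child.java.opts property, which defaults to a small heap (-Xmx200m), so a task can run out of memory regardless of the daemon heap. A sketch of the override, placed in hadoop-site.xml (property name and default as in Hadoop of that era; adjust the -Xmx value to your machines):

```
<property>
  <name>mapred.child.java.opts</name>
  <!-- JVM options passed to each spawned map/reduce task child;
       overrides the default -Xmx200m -->
  <value>-Xmx2000m</value>
</property>
```

After changing it, the new value only takes effect for tasks started after the tasktrackers pick up the updated configuration.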
java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2786)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
        at java.io.DataOutputStream.write(DataOutputStream.java:90)
        at org.apache.hadoop.io.Text.writeString(Text.java:399)
        at org.apache.nutch.parse.Outlink.write(Outlink.java:52)
        at org.apache.nutch.parse.ParseData.write(ParseData.java:163)
        at org.apache.nutch.parse.ParseImpl.write(ParseImpl.java:55)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:323)
        at org.apache.nutch.parse.ParseSegment.map(ParseSegment.java:96)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:183)
        at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1367)