Posted to user@mahout.apache.org by Mahmood Naderan <nt...@yahoo.com> on 2014/04/01 10:00:06 UTC

NegativeArraySizeException in seqdirectory

Hello,
Running seqdirectory (from Mahout 0.9) on a large input file throws the exception shown below. Any idea?

MAHOUT_LOCAL is set, running locally
14/04/01 12:15:17 INFO common.AbstractJob: Command line arguments: {--charset=[UTF-8], --chunkSize=[64], --endPhase=[2147483647], --fileFilterClass=[org.apache.mahout.text.PrefixAdditionFilter], --input=[enwiki-latest-pages-articles.xml], --keyPrefix=[], --method=[mapreduce], --output=[test-data], --startPhase=[0], --tempDir=[temp]}
14/04/01 12:15:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/04/01 12:15:17 INFO input.FileInputFormat: Total input paths to process : 1
14/04/01 12:15:17 WARN snappy.LoadSnappy: Snappy native library not loaded
14/04/01 12:15:18 INFO mapred.JobClient: Running job: job_local1962282193_0001
14/04/01 12:15:18 INFO mapred.LocalJobRunner: Waiting for map tasks
14/04/01 12:15:18 INFO mapred.LocalJobRunner: Starting task: attempt_local1962282193_0001_m_000000_0
14/04/01 12:15:18 INFO util.ProcessTree: setsid exited with exit code 0
14/04/01 12:15:18 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@26bcdb74
14/04/01 12:15:18 INFO mapred.MapTask: Processing split: Paths:/home/hadoop/enwiki-latest-pages-articles.xml:0+47078573765
14/04/01 12:15:18 INFO compress.CodecPool: Got brand-new compressor
14/04/01 12:15:18 INFO mapred.LocalJobRunner: Map task executor complete.
14/04/01 12:15:18 WARN mapred.LocalJobRunner: job_local1962282193_0001
java.lang.Exception: java.lang.NegativeArraySizeException
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:354)
Caused by: java.lang.NegativeArraySizeException
    at org.apache.mahout.text.WholeFileRecordReader.nextKeyValue(WholeFileRecordReader.java)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReader.nextKeyValue(CombineFileRecordReader.java:69)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:531)
    at org.apache.hadoop.mapreduce.MapContext.nextKeyValue(MapContext.java:67)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:223)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
14/04/01 12:15:19 INFO mapred.JobClient:  map 0% reduce 0%
14/04/01 12:15:19 INFO mapred.JobClient: Job complete: job_local1962282193_0001
14/04/01 12:15:19 INFO mapred.JobClient: Counters: 0
14/04/01 12:15:19 INFO driver.MahoutDriver: Program took 2453 ms (Minutes: 0.040883333333333334)
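
For what it's worth, the single input split is 47078573765 bytes (~44 GB), which is larger than Integer.MAX_VALUE, so my guess is that the record reader ends up sizing a byte buffer by narrowing the split length from long to int, which goes negative. A minimal sketch of that arithmetic (hypothetical, not the actual WholeFileRecordReader source):

// Hypothetical illustration of the suspected overflow, not the Mahout code itself:
// a split longer than Integer.MAX_VALUE bytes yields a negative int when narrowed.
public class SplitLengthOverflow {
    public static void main(String[] args) {
        long splitLength = 47078573765L;      // length reported for enwiki-latest-pages-articles.xml
        int bufferSize = (int) splitLength;   // narrowing conversion gives -166066491
        System.out.println("long length = " + splitLength);
        System.out.println("int cast    = " + bufferSize);
        byte[] buffer = new byte[bufferSize]; // throws java.lang.NegativeArraySizeException
    }
}

If that is what is happening, the file would need to be split into chunks smaller than 2 GB before feeding it to seqdirectory, but I may be misreading the cause.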


 
Regards,
Mahmood