Posted to mapreduce-issues@hadoop.apache.org by "diptee dalal (JIRA)" <ji...@apache.org> on 2010/07/15 11:03:51 UTC

[jira] Created: (MAPREDUCE-1944) Not able to run wordcount test example of Map/Reduce

Not able to run wordcount test example of Map/Reduce
----------------------------------------------------

                 Key: MAPREDUCE-1944
                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-1944
             Project: Hadoop Map/Reduce
          Issue Type: Test
    Affects Versions: 0.20.2
         Environment: Centos
            Reporter: diptee dalal


Not able to run the wordcount test example.

I get these logs:
 INFO mapred.FileInputFormat: Total input paths to process : 2
10/07/15 14:22:43 INFO mapred.JobClient: Running job: job_201007151211_0007
10/07/15 14:22:44 INFO mapred.JobClient:  map 0% reduce 0%
10/07/15 14:22:53 INFO mapred.JobClient:  map 66% reduce 0%
10/07/15 14:22:56 INFO mapred.JobClient:  map 100% reduce 0%
10/07/15 14:22:58 INFO mapred.JobClient: Task Id : attempt_201007151211_0007_r_000000_0, Status : FAILED
Error: java.lang.NullPointerException
        at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)

10/07/15 14:23:04 INFO mapred.JobClient: Task Id : attempt_201007151211_0007_r_000000_1, Status : FAILED
Error: java.lang.NullPointerException
        at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)

10/07/15 14:23:10 INFO mapred.JobClient: Task Id : attempt_201007151211_0007_r_000000_2, Status : FAILED
Error: java.lang.NullPointerException
        at java.util.concurrent.ConcurrentHashMap.get(Unknown Source)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)

10/07/15 14:23:19 INFO mapred.JobClient: Job complete: job_201007151211_0007
10/07/15 14:23:19 INFO mapred.JobClient: Counters: 13
10/07/15 14:23:19 INFO mapred.JobClient:   Job Counters
10/07/15 14:23:19 INFO mapred.JobClient:     Launched reduce tasks=4
10/07/15 14:23:19 INFO mapred.JobClient:     Launched map tasks=3
10/07/15 14:23:19 INFO mapred.JobClient:     Data-local map tasks=3
10/07/15 14:23:19 INFO mapred.JobClient:     Failed reduce tasks=1
10/07/15 14:23:19 INFO mapred.JobClient:   FileSystemCounters
10/07/15 14:23:19 INFO mapred.JobClient:     HDFS_BYTES_READ=41
10/07/15 14:23:19 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=190
10/07/15 14:23:19 INFO mapred.JobClient:   Map-Reduce Framework
10/07/15 14:23:19 INFO mapred.JobClient:     Combine output records=7
10/07/15 14:23:19 INFO mapred.JobClient:     Map input records=4
10/07/15 14:23:19 INFO mapred.JobClient:     Spilled Records=7
10/07/15 14:23:19 INFO mapred.JobClient:     Map output bytes=62
10/07/15 14:23:19 INFO mapred.JobClient:     Map input bytes=38
10/07/15 14:23:19 INFO mapred.JobClient:     Combine input records=7
10/07/15 14:23:19 INFO mapred.JobClient:     Map output records=7
Exception in thread "main" java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
        at org.myorg.WordCount.main(WordCount.java:55)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)


-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.