Posted to common-user@hadoop.apache.org by zr...@sina.com on 2008/03/18 06:30:39 UTC

my questions

On Solaris 10, I ran into problems running "bin/hadoop jar hadoop-0.16.1-examples.jar grep input output 'dfs[a-z.]+'":
08/03/18 13:12:29 INFO mapred.FileInputFormat: Total input paths to process : 11
08/03/18 13:12:30 INFO mapred.JobClient: Running job: job_200803181307_0001
08/03/18 13:12:31 INFO mapred.JobClient:  map 0% reduce 0%
08/03/18 13:12:32 INFO mapred.JobClient: Task Id : task_200803181307_0001_m_000000_0, Status : FAILED
Error initializing task_200803181307_0001_m_000000_0:
java.io.IOException: Login failed: Cannot run program "whoami": error=2, No such file or directory
        at org.apache.hadoop.dfs.DFSClient.createNamenode(DFSClient.java:124)
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:143)
        at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:65)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1180)
        at org.apache.hadoop.fs.FileSystem.access$400(FileSystem.java:53)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1197)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:148)
        at org.apache.hadoop.fs.FileSystem.getNamed(FileSystem.java:122)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:94)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:620)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1282)
        at org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:923)
        at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1318)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2210)
08/03/18 13:12:32 WARN mapred.JobClient: Error reading task outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000000_0&filter=stdout
08/03/18 13:12:32 WARN mapred.JobClient: Error reading task outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000000_0&filter=stderr
[the same "Login failed: Cannot run program "whoami"" error, with an identical stack trace, was then reported for task_200803181307_0001_m_000001_0 through task_200803181307_0001_m_000011_0 and for the retries task_200803181307_0001_m_000000_1 and task_200803181307_0001_m_000000_2, each followed by the same pair of "Error reading task output" warnings; repeated log output trimmed]
08/03/18 13:13:08 INFO mapred.JobClient:  map 100% reduce 100%
java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:894)
        at org.apache.hadoop.examples.Grep.run(Grep.java:69)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.Grep.main(Grep.java:93)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:52)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)


Please do reply and help me out as this is driving me crazy.



Re: my questions

Posted by Erwan Arzur <ea...@gmail.com>.
        conf.set ("hadoop.job.ugi", "hadoop,hadoop");

did the trick for me in some unit tests I am writing. I still have problems
running a MiniDFSCluster of my own.

I guess that, with 0.16.1, setting this on any Windows installation would be
mandatory.

This prevents UnixUserGroupInformation from trying to run 'whoami' and from
trying to look up group membership information.
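
For reference, here is a minimal sketch of where that one line could go when a
job is submitted programmatically. Everything around the hadoop.job.ugi call is
generic scaffolding assumed for illustration; only the property name and the
"user,group" value format come from this thread.

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class UgiWorkaround {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(UgiWorkaround.class);

            // Supply the identity directly so the client does not shell out
            // to whoami/groups. Value format: "user,group1,group2,..."
            conf.set("hadoop.job.ugi", "hadoop,hadoop");

            // ... set mapper, reducer, input and output paths as usual ...

            JobClient.runJob(conf);
        }
    }

The same property can presumably also be set in conf/hadoop-site.xml if you
want it applied node-wide rather than per job, though I have not verified that.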

I think it would be more portable to use System.getProperty("user.name")
than to execute whoami.

The group problem remains, though ... exec'ing "bash -c groups" can hardly be
called portable :-)
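
To make the idea concrete, here is a hypothetical helper (not Hadoop code, just
an illustration of the suggestion): the user name comes from the JVM's own
user.name property instead of forking whoami, and groups fall back to whatever
is configured, since there is no portable way to compute them in pure Java.

    public final class PortableUserInfo {

        private PortableUserInfo() {}

        /** User name from the JVM's own property instead of forking whoami. */
        public static String userName() {
            return System.getProperty("user.name");
        }

        /**
         * Group lookup is the unsolved part: there is no pure-Java,
         * cross-platform equivalent of exec'ing "bash -c groups", so a
         * configured value (e.g. the group list from hadoop.job.ugi) is
         * the fallback here.
         */
        public static String[] groups(String commaSeparatedGroups) {
            if (commaSeparatedGroups == null || commaSeparatedGroups.length() == 0) {
                return new String[0];
            }
            return commaSeparatedGroups.split(",");
        }
    }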

Erwan

2008/3/18 Eddie C <ed...@gmail.com>:

> Hadoop depends on having a whoami binary. I never really understood
> this; it causes problems on Windows as well, and I am not sure whether
> you can specify the user. I would suggest creating your own whoami
> shell script and making it match the Linux whoami output.
>
>
> 2008/3/18  <zr...@sina.com>:
> > [original message and error log quoted in full; identical to the log above, snipped]
> >  08/03/18 13:13:02 INFO mapred.JobClient: Task Id :
> task_200803181307_0001_m_000011_0, Status : FAILED
> >  Error initializing task_200803181307_0001_m_000011_0:
> >  java.io.IOException: Login failed: Cannot run program "whoami":
> error=2, No such file or
> directory"&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DFSClient.createNamenode(DFSClient.java:124)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:143)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DistributedFileSystem.initialize(
> DistributedFileSystem.java:65)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1180)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.access$400(FileSystem.java:53)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1197)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:148)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.getNamed(FileSystem.java:122)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:94)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:620)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1282)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:923)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1318)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2210)
> >  08/03/18 13:13:02 WARN mapred.JobClient: Error reading task
> outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000011_0&filter=stdout
> >  08/03/18 13:13:02 WARN mapred.JobClient: Error reading task
> outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000011_0&filter=stderr
> >  08/03/18 13:13:02 INFO mapred.JobClient: Task Id :
> task_200803181307_0001_m_000000_1, Status : FAILED
> >  Error initializing task_200803181307_0001_m_000000_1:
> >  java.io.IOException: Login failed: Cannot run program "whoami":
> error=2, No such file or directory"&nbsp;
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DFSClient.createNamenode(DFSClient.java:124)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:143)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DistributedFileSystem.initialize(
> DistributedFileSystem.java:65)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1180)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.access$400(FileSystem.java:53)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1197)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:148)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.getNamed(FileSystem.java:122)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:94)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:620)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1282)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:923)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1318)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2210)
> >  08/03/18 13:13:02 WARN mapred.JobClient: Error reading task
> outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000000_1&filter=stdout
> >  08/03/18 13:13:02 WARN mapred.JobClient: Error reading task
> outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000000_1&filter=stderr
> >  08/03/18 13:13:06 INFO mapred.JobClient: Task Id :
> task_200803181307_0001_m_000000_2, Status : FAILED
> >  Error initializing task_200803181307_0001_m_000000_2:
> >  java.io.IOException: Login failed: Cannot run program "whoami":
> error=2, No such file or
> directory"&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DFSClient.createNamenode(DFSClient.java:124)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:143)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.dfs.DistributedFileSystem.initialize(
> DistributedFileSystem.java:65)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1180)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.access$400(FileSystem.java:53)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1197)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:148)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.getNamed(FileSystem.java:122)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:94)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:620)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1282)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:923)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1318)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2210)
> >  08/03/18 13:13:07 WARN mapred.JobClient: Error reading task
> outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000000_2&filter=stdout
> >  08/03/18 13:13:07 WARN mapred.JobClient: Error reading task
> outputhttp://zr:50060/tasklog?plaintext=true&taskid=task_200803181307_0001_m_000000_2&filter=stderr
> >  08/03/18 13:13:08 INFO mapred.JobClient:&nbsp; map 100% reduce 100%
> >  java.io.IOException: Job failed!
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:894)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.examples.Grep.run(Grep.java:69)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.examples.Grep.main(Grep.java:93)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
> :39)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:25)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> java.lang.reflect.Method.invoke(Method.java:597)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(
> ProgramDriver.java:68)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:52)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
> :39)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:25)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> java.lang.reflect.Method.invoke(Method.java:597)
> >  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at
> org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> >
> >
> >  Please do reply and help me out as this is driving me crazy.
> >
> >
> >  -------------------------------------------------------------------
> >  惠普春季促销送大礼,直降五千优惠连连(
> http://d1.sina.com.cn/sina/limeng3/mail_zhuiyu/2008/mail_zhuiyu_20080317.html)
> >
> >  -------------------------------------------------------------------
> >  注册新浪2G免费邮箱(http://mail.sina.com.cn/)
>

Re: my questions

Posted by Eddie C <ed...@gmail.com>.
Hadoop is dependent on having a whoami binary. I never really
understood this; it causes problems on Windows as well, and I am not
sure whether you can specify the user instead. I would suggest creating
your own whoami shell script and making it match the Linux whoami
output.
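For what it's worth, here is a minimal sketch of such a wrapper -- my own
guess, not anything shipped with Hadoop. Drop it somewhere on the PATH of
the user that starts the TaskTracker (the /usr/local/bin/whoami location,
the /usr/xpg4/bin/id path and the sed fallback are assumptions about a
stock Solaris 10 install):

    #!/bin/sh
    # Hypothetical whoami stand-in for Solaris 10, where whoami normally
    # lives under /usr/ucb and may be missing from the default PATH.
    # Prints only the effective user name, matching Linux whoami output.
    if [ -x /usr/xpg4/bin/id ]; then
        # XPG4 id supports -un (print effective user name only)
        /usr/xpg4/bin/id -un
    else
        # Fall back to parsing the classic "uid=100(user) gid=1(other)" output
        id | sed 's/^uid=[0-9]*(\([^)]*\)).*/\1/'
    fi

Mark it executable (chmod +x) and check that running whoami as the user
that launches the daemons prints the plain user name. If the UCB
compatibility tools are installed, simply adding /usr/ucb to that user's
PATH might also be enough.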


2008/3/18  <zr...@sina.com>:
> On Solaris 10, it had some problems with the "bin/hadoop jar hadoop-0.16.1-examples.jar grep input output 'dfs[a-z.]+'"
>  08/03/18 13:12:29 INFO mapred.FileInputFormat: Total input paths to process : 11
>  08/03/18 13:12:30 INFO mapred.JobClient: Running job: job_200803181307_0001
>  08/03/18 13:12:31 INFO mapred.JobClient:  map 0% reduce 0%
>  08/03/18 13:12:32 INFO mapred.JobClient: Task Id : task_200803181307_0001_m_000000_0, Status : FAILED
>  Error initializing task_200803181307_0001_m_000000_0:
>  java.io.IOException: Login failed: Cannot run program "whoami": error=2, No such file or directory
>  [identical stack trace repeated for each subsequent task attempt, followed by "map 100% reduce 100%" and a final "java.io.IOException: Job failed!"; remainder of the quoted log trimmed]
>
>
>  Please do reply and help me out as this is driving me crazy.
>
>