Posted to common-user@hadoop.apache.org by "Brown, Berlin [GCG-PFS]" <Be...@Primerica.com> on 2011/08/12 21:48:59 UTC

basic usage map/reduce error

I am getting this error with a mostly out-of-the-box configuration of
version 0.20.203.0 when I try to run the wordcount example:

$ hadoop jar hadoop-examples-0.20.203.0.jar wordcount
/user/hduser/gutenberg /user/hduser/gutenberg-output6
 
2011-08-12 15:45:38,299 WARN org.apache.hadoop.mapred.TaskRunner:
attempt_201108121544_0001_m_000008_2 : Child Error
java.io.IOException: Task process exit with nonzero status of 127.
 at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
2011-08-12 15:45:38,878 WARN org.apache.hadoop.mapred.TaskLog: Failed to
retrieve stdout log for task: attempt_201108121544_0001_m_000008_1
java.io.FileNotFoundException:
E:\projects\workspace_mar11\ParseLogCriticalErrors\lib\h\logs\userlogs\j
ob_201108121544_0001\attempt_201108121544_0001_m_000008_1\log.index (The
system cannot find the file specified)
 at java.io.FileInputStream.open(Native Method)
 at java.io.FileInputStream.<init>(FileInputStream.java:106)
 at
org.apache.hadoop.io.SecureIOUtils.openForRead(SecureIOUtils.java:102)
 at
org.apache.hadoop.mapred.TaskLog.getAllLogsFileDetails(TaskLog.java:112)
...
 
The userlogs/job* directory is empty; maybe there is a permission
issue with those directories.
 
I am running on Windows under Cygwin, so I don't really know which
permissions to set.
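
For what it's worth, exit status 127 is the shell's standard "command
not found" code, which usually points at a launcher or PATH problem
rather than at the task logic itself. A quick illustration:

```shell
# Exit status 127 is the shell's code for "command not found"
# (or a script/interpreter it could not locate):
bash -c 'this_command_does_not_exist' 2>/dev/null
echo "exit status: $?"   # prints: exit status: 127
```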

RE: basic usage map/reduce error

Posted by "Brown, Berlin [GCG-PFS]" <Be...@Primerica.com>.
Back again, sorry. I found the error: the taskjvm.sh child process
doesn't run with my configuration under Cygwin.

2011-08-13 14:38:49,677 INFO
org.apache.hadoop.mapred.DefaultTaskController: >>Attempt to run command
from DefaultTaskController -
/usr/local/file/search_logs/hadoop/mapred/local/ttprivate/taskTracker/US
ER/jobcache/job_201108131437_0001/attempt_201108131437_0001_m_000001_3/t
askjvm.sh
2011-08-13 14:38:49,677 INFO
org.apache.hadoop.mapred.DefaultTaskController: >>Attempt to run command
from DefaultTaskController - working directory :
\usr\local\file\search_logs\hadoop\mapred\local\taskTracker\USER\jobcach
e\job_201108131437_0001\attempt_201108131437_0001_m_000001_3\work
2011-08-13 14:38:49,802 ERROR
org.apache.hadoop.mapred.DefaultTaskController:
org.apache.hadoop.util.Shell$ExitCodeException: bash:
/usr/local/file/search_logs/hadoop/mapred/local/ttprivate/taskTracker/US
ER/jobcache/job_201108131437_0001/attempt_201108131437_0001_m_000001_3/t
askjvm.sh: No such file or directory
 
...
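
The pieces fit together here: when bash is handed a script path it
cannot resolve, it prints exactly this "No such file or directory"
message and exits with 127, the same status TaskRunner reported as a
Child Error. A minimal reproduction, with a made-up path:

```shell
# bash invoked on a nonexistent script path prints
# "No such file or directory" on stderr and exits with status 127 --
# matching both the DefaultTaskController error and the
# "Task process exit with nonzero status of 127" warning.
bash /no/such/dir/taskjvm.sh 2>&1
echo "exit status: $?"   # prints: exit status: 127
```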
 
Everything else in Hadoop has worked under Cygwin so far, but I think
this launching of the child JVM for tasks does not.
 
Does anyone have any idea how I could get taskjvm.sh pointed at a
Cygwin path?
 
/usr/local/file/search_logs/hadoop/mapred/local/ttprivate/taskTracker/US
ER/jobcache/job_201108131437_0001/attempt_201108131437_0001_m_000001_3/t
askjvm.sh
 
It could be:
 
/cygdrive/e/usr/local ... in this case.
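
cygpath is the standard Cygwin tool for that kind of translation, but
it only exists inside a Cygwin install, so here is a rough sed sketch
of the Windows-to-POSIX mapping it performs (GNU sed's \L extension
assumed; real cygpath also handles mounts, UNC paths, and more):

```shell
# Rough sketch of the Windows -> Cygwin path translation that
# `cygpath -u` performs. Converts backslashes to forward slashes,
# then maps a leading drive letter to /cygdrive/<lowercase letter>.
win_to_cyg() {
  printf '%s\n' "$1" | sed -e 's|\\|/|g' -e 's|^\([A-Za-z]\):|/cygdrive/\L\1|'
}

win_to_cyg 'E:\usr\local'   # -> /cygdrive/e/usr/local
```

Inside Cygwin itself, `cygpath -u 'E:\usr\local'` gives the same
result, and `cygpath -w` converts back the other way.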


________________________________

From: Brown, Berlin [GCG-PFS] 
Sent: Saturday, August 13, 2011 2:00 AM
To: 'common-user@hadoop.apache.org'
Cc: 'berlin.brown@gmail.com'
Subject: RE: basic usage map/reduce error



OK, that wasn't the real error; it looks like this was it, when
working with Cygwin. I am guessing that the task failed. Does the
mapred TaskRunner launch a new JVM process? That seems to be what is
failing:


MapAttempt TASK_TYPE="SETUP" TASKID="task_201108130149_0001_m_000002"
TASK_ATTEMPT_ID="attempt_201108130149_0001_m_000002_1"
START_TIME="1313214584866"
TRACKER_NAME="tracker_USER-2\.xxxx\.com:localhost/127\.0\.0\.1:2937"
HTTP_PORT="50060" .
MapAttempt TASK_TYPE="SETUP" TASKID="task_201108130149_0001_m_000002"
TASK_ATTEMPT_ID="attempt_201108130149_0001_m_000002_1"
TASK_STATUS="FAILED" FINISH_TIME="1313214588194"
HOSTNAME="USER-2\.xxxx.com" ERROR="java\.lang\.Throwable: Child Error
 at org\.apache\.hadoop\.mapred\.TaskRunner\.run(TaskRunner\.java:271)
Caused by: java\.io\.IOException: Task process exit with nonzero status
of 127\.
 at org\.apache\.hadoop\.mapred\.TaskRunner\.run(TaskRunner\.java:258)


________________________________

From: Brown, Berlin [GCG-PFS] 
Sent: Friday, August 12, 2011 3:49 PM
To: 'common-user@hadoop.apache.org'
Cc: berlin.brown@gmail.com
Subject: basic usage map/reduce error


