Posted to common-dev@hadoop.apache.org by "Koji Noguchi (JIRA)" <ji...@apache.org> on 2008/05/14 18:18:55 UTC
[jira] Resolved: (HADOOP-3384) streaming process hang when no input to task + a large stderr debug output
[ https://issues.apache.org/jira/browse/HADOOP-3384?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Koji Noguchi resolved HADOOP-3384.
----------------------------------
Resolution: Duplicate
Release Note:
Duplicate of HADOOP-3089.
Thanks Rick!
> streaming process hang when no input to task + a large stderr debug output
> --------------------------------------------------------------------------
>
> Key: HADOOP-3384
> URL: https://issues.apache.org/jira/browse/HADOOP-3384
> Project: Hadoop Core
> Issue Type: Bug
> Components: contrib/streaming
> Affects Versions: 0.16.4
> Reporter: Koji Noguchi
>
> We've seen streaming tasks hang forever (thus hanging the job) when:
> 1) There is no input to a mapper or a reducer, and
> 2) The streaming process writes some debug statements
> (either a flush() or output large enough to trigger one).
> The PipeMapper/PipeReducer waits for the streaming process to finish,
> while the streaming process is stuck writing to stdout/stderr.
> This happens because no MROutputThread/MRErrThread is created until the first input record is passed to map() or reduce().
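The deadlock pattern described above can be sketched outside Hadoop. This is an illustrative Python snippet, not the actual PipeMapper code: a parent process that spawns a child but never drains its stderr will hang once the child fills the OS pipe buffer; draining in a background thread (the role MRErrThread plays once it is started) avoids the hang.

```python
import subprocess
import sys
import threading

# The child writes ~1 MiB to stderr -- far more than a typical 64 KiB
# OS pipe buffer, so its write() blocks unless someone reads the pipe.
CHILD = "import sys; sys.stderr.write('x' * (1 << 20))"

proc = subprocess.Popen([sys.executable, "-c", CHILD],
                        stderr=subprocess.PIPE)

drained = []

def drain(stream):
    # Analogous to MRErrThread: consume the child's stderr so it never
    # blocks on a full pipe buffer. Without this thread, proc.wait()
    # below would hang forever -- the deadlock described in the issue,
    # where the drain threads are only created after the first input
    # record arrives (and with empty input, never).
    drained.append(stream.read())
    stream.close()

t = threading.Thread(target=drain, args=(proc.stderr,))
t.start()       # start draining *before* waiting, unlike the buggy path
proc.wait()
t.join()
```

With the drain thread started before wait(), the child exits cleanly even with empty input and a large stderr burst.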
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.