Posted to mapreduce-dev@hadoop.apache.org by "Alejandro Abdelnur (JIRA)" <ji...@apache.org> on 2012/05/16 06:22:15 UTC
[jira] [Resolved] (MAPREDUCE-4250) hadoop-config.sh missing variable exports, causes Yarn jobs to fail with ClassNotFoundException MRAppMaster
[ https://issues.apache.org/jira/browse/MAPREDUCE-4250?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Alejandro Abdelnur resolved MAPREDUCE-4250.
-------------------------------------------
Resolution: Fixed
Fix Version/s: 2.0.1
Hadoop Flags: Reviewed
Thanks Patrick. Committed to trunk and branch-2.
> hadoop-config.sh missing variable exports, causes Yarn jobs to fail with ClassNotFoundException MRAppMaster
> -----------------------------------------------------------------------------------------------------------
>
> Key: MAPREDUCE-4250
> URL: https://issues.apache.org/jira/browse/MAPREDUCE-4250
> Project: Hadoop Map/Reduce
> Issue Type: Bug
> Components: nodemanager
> Reporter: Patrick Hunt
> Assignee: Patrick Hunt
> Fix For: 2.0.1
>
> Attachments: MAPREDUCE-4250.patch
>
>
> This is the MR side of HADOOP-8393.
> If you start a pseudo-distributed YARN cluster using "start-yarn.sh", you need to specify exports for HADOOP_COMMON_HOME, HADOOP_HDFS_HOME, YARN_HOME, YARN_CONF_DIR, and HADOOP_MAPRED_HOME in hadoop-env.sh (or elsewhere); otherwise the spawned node manager will be missing
> these in its environment. This is because start-yarn.sh uses yarn-daemons.sh. With this fix it is possible to start YARN (etc.) with only HADOOP_CONF_DIR specified in the environment. It took some time to track down this failure, so it seems worthwhile to fix.
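For context, the workaround the issue describes meant adding exports like the following to hadoop-env.sh before this fix landed. This is a hedged sketch, not the patch itself: the install root /usr/local/hadoop is a hypothetical location, and a real deployment may place these directories elsewhere.

```shell
# Hypothetical single-tarball install root; adjust to your deployment.
HADOOP_PREFIX=/usr/local/hadoop

# Exports previously required in hadoop-env.sh (or the daemon's
# environment) so that processes spawned via yarn-daemons.sh could
# locate the common, HDFS, MapReduce, and YARN jars:
export HADOOP_COMMON_HOME="$HADOOP_PREFIX"
export HADOOP_HDFS_HOME="$HADOOP_PREFIX"
export HADOOP_MAPRED_HOME="$HADOOP_PREFIX"
export YARN_HOME="$HADOOP_PREFIX"
export YARN_CONF_DIR="$HADOOP_PREFIX/etc/hadoop"
```

After the fix, hadoop-config.sh exports these itself, so only HADOOP_CONF_DIR needs to be set before invoking start-yarn.sh.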
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira