Posted to hdfs-user@hadoop.apache.org by RD <rd...@gmail.com> on 2012/11/30 04:49:31 UTC

Environment settings not picked up by nodemanager while running MiniMRYarnCluster

Hi all,
    My nodemanager logs complain about MRAppMaster not being available on the
classpath. The exception is
"Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/mapreduce/v2/app/MRAppMaster". Before this, it was
complaining about "/bin/java" not being found (I symlinked /bin/java to
/usr/bin/java to get around that).
To me it looks like the environment is not being set up properly for the container.

My setup is as follows:
 I'm running a MiniMRYarnCluster and a MiniDFSCluster in one JVM. Before
starting this process, I set the env vars HADOOP_CONF_DIR,
HADOOP_COMMON_HOME and YARN_HOME.

HADOOP_COMMON_HOME and YARN_HOME point to the same lib dir containing all
the hadoop jars.
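
For reference, the cluster startup looks roughly like this (a trimmed-down
sketch, not my actual test code; the class name MiniClusterHarness and the
"oozie-test" string are just placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster;

public class MiniClusterHarness {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // HDFS side: single-namenode, single-datanode mini cluster.
        MiniDFSCluster dfs = new MiniDFSCluster.Builder(conf)
                .numDataNodes(1)
                .build();
        dfs.waitActive();

        // YARN/MR side: one NodeManager is enough for the LauncherMapper.
        MiniMRYarnCluster mr = new MiniMRYarnCluster("oozie-test", 1);
        mr.init(conf);
        mr.start();

        // ... mr.getConfig() / dfs.getFileSystem() are then handed over to
        // the other JVM that runs embedded Oozie ...
    }
}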

I'm running another JVM which contains embedded Oozie and some other code
that submits jobs to Oozie. Oozie talks to the mini clusters to submit the
jobs. In the Oozie logs I see that the jobs are stuck in RUNNING state
for a while and then fail. The failure is the error mentioned above: the
LauncherMapper cannot run the job in a NodeManager container, because of
the environment issue, I guess.

Any ideas on how to set the env. correctly?
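
Is the fix something along these lines, i.e. pushing the test JVM's own
classpath into the config before starting the mini cluster? Just a guess on
my part; I'm assuming yarn.application.classpath
(YarnConfiguration.YARN_APPLICATION_CLASSPATH) is the relevant knob, and the
helper name below is only a placeholder:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class ClasspathGuess {
    // Guess: hand the launching JVM's classpath (which contains the MR jars,
    // including MRAppMaster) to the containers started by the NodeManager.
    static Configuration withTestClasspath(Configuration conf) {
        conf.set(YarnConfiguration.YARN_APPLICATION_CLASSPATH,
                 System.getProperty("java.class.path"));
        return conf;
    }
}

i.e. passing withTestClasspath(conf) into MiniMRYarnCluster.init(...) instead
of the plain conf. Is that the right approach, or is there a cleaner way?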


Appreciate the help.

Thanks,
R.