Posted to user@hadoop.apache.org by Anand Murali <an...@yahoo.com.INVALID> on 2016/05/11 07:57:03 UTC

Unable to start Daemons

Dear All:
Please advise after carefully reviewing the steps below.
1. Altered .profile after ssh to localhost and included:
    export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_91/
2. Changed into the etc/hadoop directory (cd etc/hadoop) and added the following entries in hadoop-env.sh:
     export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_91
      export HADOOP_INSTALL=/home/anand/hadoop-2.6.0
      export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
3. $ . hadoop-env.sh
4. hadoop version
anand@anand-Latitude-E5540:~/hadoop-2.6.0/etc/hadoop$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /home/anand/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar

5. anand@anand-Latitude-E5540:~/hadoop-2.6.0/sbin$ start-dfs.sh --config /home/anand/hadoop-2.6.0/sbin
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
cat: /home/anand/hadoop-2.6.0/sbin/slaves: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.


I have installed the latest JDK as above, removed all old versions of the JDK, and gone through all the update-alternatives steps, but I still get this error. I shall be thankful if somebody can help.
Regards

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)

Re: Unable to start Daemons

Posted by Arun Natva <ar...@gmail.com>.
Add the JAVA_HOME export statement inside hadoop-env.sh and retry.
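
For example, a minimal sketch of the relevant line in etc/hadoop/hadoop-env.sh, assuming the JDK path from your message:

     # Use an explicit path here rather than ${JAVA_HOME}; the daemon
     # start-up scripts run over ssh and may not see the value exported
     # in your login .profile.
     export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_91

The "cat: .../sbin/slaves" message also suggests that --config is pointing at the sbin directory; if you pass --config at all, it should point at the configuration directory, for example:

     sbin/start-dfs.sh --config /home/anand/hadoop-2.6.0/etc/hadoop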

Sent from my iPhone
