Posted to mapreduce-user@hadoop.apache.org by Mohamed Riadh Trad <Mo...@inria.fr> on 2012/01/30 22:27:56 UTC

Problem with Symlinks

Hi,

I upgraded my cluster to Hadoop 1.0.0; however, HDFS fails to start and I get the following message:

###################################

starting namenode, logging to /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/../logs/hadoop-trad-namenode-master_dfs.out
slave001: /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/..: No such file or directory.
slave002: /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/..: No such file or directory.
slave003: /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/..: No such file or directory.
master_dfs: starting secondarynamenode, logging to /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/bin/../logs/hadoop-trad-secondarynamenode-master_dfs.out

###################

Hadoop is installed as follows:

on master_dfs: /local/trad/hadoop/cluster/hadoop-1.0.0.dfs/. However, /local is actually a symlink to /home/local, so the actual path is /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/.
on slave001,slave002,slave003: /local/trad/hadoop/cluster/hadoop-1.0.0.dfs/

How can I force Hadoop 1.0.0 to bypass this redirection?
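
For illustration, here is what the symlink does to path resolution (a minimal sketch, assuming /local -> /home/local as above: the shell keeps the logical path with pwd -L but resolves the physical one with pwd -P):

$ cd -L /local/trad
$ pwd -L    # logical path: the symlink is preserved
/local/trad
$ pwd -P    # physical path: the symlink is resolved
/home/local/trad

The start scripts apparently resolve the physical path on master_dfs and then pass it to the slaves, where /home/local does not exist.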

Kind regards



Trad Mohamed Riadh, M.Sc, Ing.
PhD. student
INRIA-TELECOM PARISTECH - ENPC School of International Management

Office: 11-15
Phone: (33)-1 39 63 59 33
Fax: (33)-1 39 63 56 74
Email: riadh.trad@inria.fr
Home page: http://www-rocq.inria.fr/who/Mohamed.Trad/


Re: Problem with Symlinks

Posted by Harsh J <ha...@cloudera.com>.
Glad to know you fixed it! I reckon 1.0 changed many of the scripts and
the tarball structure, and it's probably HADOOP_PREFIX instead now.
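
If the scripts do consult it, the override would look something like this (a sketch, not verified against the 1.0 scripts; the variable may be reset internally):

# on each node, in the launching user's environment (e.g. ~/.bashrc)
export HADOOP_PREFIX=/local/trad/hadoop/cluster/hadoop-1.0.0.dfs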

If you feel this may help others as well, please also file a JIRA at
https://issues.apache.org/jira/browse/HADOOP with a patch. Thanks!

On Tue, Jan 31, 2012 at 9:57 PM, Mohamed Riadh Trad
<Mo...@inria.fr> wrote:
> HADOOP_HOME is deprecated in Hadoop 1.0.
>
> I solved my problem by changing libexec/hadoop-config.sh:
>
> this="${BASH_SOURCE-$0}"
> #echo $this
> -common_bin=$(cd -P -- "$(dirname -- "$this")" && pwd -P)
>
> to
>
> +common_bin=$(cd -L -- "$(dirname -- "$this")" && pwd -L)
>
> Bests,
>
> Trad Mohamed Riadh



-- 
Harsh J
Customer Ops. Engineer, Cloudera

Re: Problem with Symlinks

Posted by Mohamed Riadh Trad <Mo...@inria.fr>.
HADOOP_HOME is deprecated in Hadoop 1.0.

I solved my problem by changing libexec/hadoop-config.sh:

this="${BASH_SOURCE-$0}"
#echo $this
-common_bin=$(cd -P -- "$(dirname -- "$this")" && pwd -P)

to 

+common_bin=$(cd -L -- "$(dirname -- "$this")" && pwd -L)
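
The difference the -L flags make, using the paths from this cluster (just an illustration of the two lookup modes):

this=/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/libexec/hadoop-config.sh
( cd -P -- "$(dirname -- "$this")" && pwd -P )
# -> /home/local/trad/hadoop/cluster/hadoop-1.0.0.dfs/libexec (symlink resolved)
( cd -L -- "$(dirname -- "$this")" && pwd -L )
# -> /local/trad/hadoop/cluster/hadoop-1.0.0.dfs/libexec (symlink preserved)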

Bests,

Trad Mohamed Riadh, M.Sc, Ing.
PhD. student
INRIA-TELECOM PARISTECH - ENPC School of International Management

Office: 11-15
Phone: (33)-1 39 63 59 33
Fax: (33)-1 39 63 56 74
Email: riadh.trad@inria.fr
Home page: http://www-rocq.inria.fr/who/Mohamed.Trad/




On 31 January 2012, at 07:10, Harsh J wrote:

> Try exporting HADOOP_HOME in the launching user's environment to the
> right path, on every one of your slave nodes where there is no such
> symlink. Then try starting all again.


Re: Problem with Symlinks

Posted by Harsh J <ha...@cloudera.com>.
Try exporting HADOOP_HOME in the launching user's environment, pointing at the
right path, on every one of your slave nodes where there is no such
symlink. Then try starting everything again.
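
Something along these lines, assuming a login-shell environment on the slaves (the exact file depends on how you launch them):

# on slave001..slave003, e.g. in ~/.bashrc
export HADOOP_HOME=/local/trad/hadoop/cluster/hadoop-1.0.0.dfs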



-- 
Harsh J
Customer Ops. Engineer, Cloudera