Posted to common-user@hadoop.apache.org by Aman <am...@hotmail.com> on 2010/12/08 22:13:01 UTC

Re: Help with "Hadoop common not found" error when launching bin/start-dfs.sh

bin/start-dfs.sh sources bin/hadoop-config.sh, so put the following
command in bin/hadoop-config.sh:

export HADOOP_HOME=/usr/local/hadoop
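For reference, here is roughly why that placement works. This is a
paraphrased sketch of the 0.20-era start script, not a verbatim quote of
any release: start-dfs.sh resolves its own directory and sources
hadoop-config.sh before starting any daemon, so an export added there is
visible to start-dfs.sh and to everything it launches.

  # top of bin/start-dfs.sh (paraphrased sketch)
  bin=`dirname "$0"`
  bin=`cd "$bin"; pwd`
  . "$bin"/hadoop-config.sh   # exports placed in hadoop-config.sh take effect here

To check that the variable is actually being picked up, you can add a
temporary "echo HADOOP_HOME=$HADOOP_HOME" line right after the export and
run bin/start-dfs.sh again.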



On Nov 30, 2010, at 12:00 PM, "Greg Troyan" <Gr...@zecco.net> wrote:

> I am building a cluster using Michael G. Noll's instructions found here:
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-multi-node-cluster/
>
> I have set up two single node clusters and they work fine. When I change
> their configurations to behave as a single cluster (by changing
> conf/masters, conf/slaves, conf/core-site.xml, conf/mapred-site.xml and
> conf/hdfs-site.xml on each server) and then run bin/start-dfs.sh on the
> master node I receive a "Hadoop common not found" error. This happens
> with both 0.21.0 and 0.20.1.
>
> I've found two suggested solutions for this.
>
> One is regarding a bug fix for 0.21.0:
>
> https://issues.apache.org/jira/browse/HADOOP-6953
>
> that talks about adding code to "hdfs-config.sh" and "mapred-config.sh". I
> added the code at the beginning of each file, but it didn't fix it.
>
> I also tried 0.20.1. Per Michael's suggestion I'm supposed to "set the
> HADOOP_HOME variable", but in which script? I've tried adding
>
> export HADOOP_HOME=/usr/local/hadoop
> export HADOOP_COMMON_HOME=/usr/local/hadoop/common
>
> and/or
>
> export HADOOP_HOME=/usr/local/hadoop
> export HADOOP_COMMON_HOME=/usr/local/hadoop
>
> to conf/hadoop-env.sh to no avail.
>
>
> Can anyone help me fix this problem?
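
For what it's worth, the quoted error message comes from a guard near the
top of bin/hdfs-config.sh in 0.21.0. The sketch below is paraphrased from
memory of the script discussed in HADOOP-6953, not a verbatim quote, but
the shape of the check is the point:

  # paraphrased sketch of the guard in 0.21.0 bin/hdfs-config.sh
  if [ -e "${HADOOP_COMMON_HOME}/bin/hadoop-config.sh" ]; then
    . "${HADOOP_COMMON_HOME}/bin/hadoop-config.sh"
  elif [ -e "${HADOOP_HOME}/bin/hadoop-config.sh" ]; then
    . "${HADOOP_HOME}/bin/hadoop-config.sh"
  else
    echo "Hadoop common not found."
    exit
  fi

Two things follow from a check of this shape. First, HADOOP_COMMON_HOME
must point at a directory that actually contains bin/hadoop-config.sh, so
/usr/local/hadoop/common only helps if that subdirectory exists in your
install; in a plain single-tarball install, /usr/local/hadoop is the value
that satisfies the check. Second, conf/hadoop-env.sh is sourced from
hadoop-config.sh, i.e. only after a guard of this shape has already
passed, which would explain why exporting the variables in
conf/hadoop-env.sh had no effect: the variable needs to be set before the
check runs, either in your shell environment or earlier in the scripts
that start-dfs.sh sources (as in the suggestion at the top of this
thread).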
