Posted to mapreduce-user@hadoop.apache.org by Subroto <ss...@datameer.com> on 2012/06/05 15:42:41 UTC

Hadoop Dist structure doubts

Hi

Following is the Hadoop directory structure after extracting the tarball. I would like to know which folders I need to point HADOOP_MAPRED_HOME, HADOOP_HDFS_HOME, HADOOP_COMMON_HOME, YARN_HOME, etc. at, so that this cluster can be accessed both from within the cluster and from outside.

/usr/local/hadoop
		bin
		etc
			hadoop
		include
		example
		lib
		libexec
		sbin
		share
			doc
			hadoop
				common
				hdfs
				httpfs
				mapreduce
				tools
		src
Cheers,
Subroto Sanyal

Re: Hadoop Dist structure doubts

Posted by Subroto <ss...@datameer.com>.
Thanks Jagat….
The tutorial is really nice ….

Cheers,
Subroto Sanyal

On Jun 6, 2012, at 9:47 AM, Jagat wrote:

> Hello Subroto,
> 
> There are multiple ways to install and set the environment variables for the 2.x series.
> Download the latest Hadoop 2.0.x tarball to your computer and extract it to some directory, let's say HADOOP_PREFIX
> 
> Export the following environment variables on your computer:
> 
> export HADOOP_PREFIX="/home/hadoop/software/hadoop-2.0.0-alpha"
> export PATH=$PATH:$HADOOP_PREFIX/bin
> export PATH=$PATH:$HADOOP_PREFIX/sbin
> 
> export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}
> export HADOOP_COMMON_HOME=${HADOOP_PREFIX}
> export HADOOP_HDFS_HOME=${HADOOP_PREFIX}
> export YARN_HOME=${HADOOP_PREFIX}
> 
> A detailed discussion of why the project is structured this way can be found in the JIRA for Hadoop RPMs.
> 
> A few days back I wrote a tutorial on how to install the 2.x series; you can also have a look at that.
> 
> Regards,
> 
> Jagat Singh


Re: Hadoop Dist structure doubts

Posted by Jagat <ja...@gmail.com>.
Hello Subroto,

There are multiple ways to install and set the environment variables for the
2.x series.

Download the latest Hadoop 2.0.x tarball to your computer and extract it to
some directory, let's say HADOOP_PREFIX

Export the following environment variables on your computer:

export HADOOP_PREFIX="/home/hadoop/software/hadoop-2.0.0-alpha"
export PATH=$PATH:$HADOOP_PREFIX/bin
export PATH=$PATH:$HADOOP_PREFIX/sbin
export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}
export HADOOP_COMMON_HOME=${HADOOP_PREFIX}
export HADOOP_HDFS_HOME=${HADOOP_PREFIX}
export YARN_HOME=${HADOOP_PREFIX}
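The exports above can be wrapped in a small snippet that also sanity-checks
the directory layout from the question before setting the variables. This is
only a sketch, not an official script; it assumes the tarball was extracted
to /usr/local/hadoop, so adjust the prefix to your own location:

```shell
# Minimal sketch (hypothetical helper, not part of the Hadoop distribution):
# default to /usr/local/hadoop unless HADOOP_PREFIX is already set.
HADOOP_PREFIX="${HADOOP_PREFIX:-/usr/local/hadoop}"

# Warn about any expected top-level directory that is missing from the prefix.
for d in bin etc/hadoop sbin share/hadoop/common share/hadoop/hdfs \
         share/hadoop/mapreduce; do
    [ -d "$HADOOP_PREFIX/$d" ] || echo "missing: $HADOOP_PREFIX/$d" >&2
done

# Point every component HOME variable at the same unpacked tree and put
# the launch scripts on PATH.
export HADOOP_PREFIX
export PATH="$PATH:$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin"
export HADOOP_MAPRED_HOME="$HADOOP_PREFIX"
export HADOOP_COMMON_HOME="$HADOOP_PREFIX"
export HADOOP_HDFS_HOME="$HADOOP_PREFIX"
export YARN_HOME="$HADOOP_PREFIX"
```

Sourcing this once per login shell (for example from ~/.bashrc) makes the
variables visible to the launch scripts under bin/ and sbin/.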

A detailed discussion of why the project is structured this way can be found
in the JIRA for Hadoop RPMs.

A few days back I wrote a tutorial on how to install the 2.x series
<http://jugnu-life.blogspot.in/2012/05/hadoop-20-install-tutorial-023x.html>;
you can also have a look at that.

Regards,

Jagat Singh

On Tue, Jun 5, 2012 at 7:12 PM, Subroto <ss...@datameer.com> wrote:

> Hi
>
> Following is the Hadoop directory structure after extracting the tarball.
> I would like to know which folders I need to point HADOOP_MAPRED_HOME,
> HADOOP_HDFS_HOME, HADOOP_COMMON_HOME, YARN_HOME, etc. at, so that this
> cluster can be accessed both from within the cluster and from outside.
>
> /usr/local/hadoop
>                bin
>                etc
>                        hadoop
>                include
>                example
>                lib
>                libexec
>                sbin
>                share
>                        doc
>                        hadoop
>                                common
>                                hdfs
>                                httpfs
>                                mapreduce
>                                tools
>                src
> Cheers,
> Subroto Sanyal