Posted to mapreduce-user@hadoop.apache.org by Bill Bruns <bi...@yahoo.com> on 2014/01/29 07:12:04 UTC

How to install from downloaded tarball from hadoop.2.2.0.tar.gz

Hello,
I downloaded the latest stable hadoop release from the mirrors as a tarball: hadoop.2.2.0.tar.gz
Then extracted the files with Archive Manager (on Ubuntu 12.10)


There are no install docs in the top level and no documentation directory.

Then, the "Getting Started" links on http://hadoop.apache.org/docs/current/
led to http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
that has the text 

"Assuming you have installed hadoop-common/hadoop-hdf" 
That seems like a strange assumption for a "Getting Started" document.
Perhaps the pointer leads to the wrong document?


Can someone say where to find the process for how to install from the downloaded stable release tarball?

Re: How to install from downloaded tarball from hadoop.2.2.0.tar.gz

Posted by Thomas Bentsen <th...@bentzn.com>.
Also:
I was not able to access the YARN web page on port 8088, but I suppose it's a
setup thing because I am not on localhost.
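On EC2 that is most likely the instance's security group blocking the port; a common workaround, assuming you already have SSH access, is to tunnel the UI to your own machine (the user and host below are placeholders, not from this thread):

```shell
# Build an SSH tunnel command that forwards local port 8088 to the
# remote ResourceManager web UI; after running it, browse
# http://localhost:8088 on your own machine. User/host are placeholders.
remote='ec2-user@your-ec2-host'
tunnel="ssh -L 8088:localhost:8088 $remote"
echo "$tunnel"
```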



/th






On Wed, 2014-01-29 at 00:39 -0800, Sujee Maniyam wrote:
> You might find this post (mine) useful : http://hadoopilluminated.com/blog/?p=34
> (covers single node install)
> Sujee Maniyam (http://sujee.net)
> 
> 
> On Tue, Jan 28, 2014 at 10:12 PM, Bill Bruns <bi...@yahoo.com> wrote:
> > Hello,
> > I downloaded the latest stable hadoop release from the mirrors as a tarball:
> > hadoop.2.2.0.tar.gz
> > Then extracted the files with Archive Manager (on Ubuntu 12.10)
> >
> > There are no install docs in the top level and no documentation directory.
> >
> > Then, the "Getting Started" links on http://hadoop.apache.org/docs/current/
> > led to
> > http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
> > that has the text
> > "Assuming you have installed hadoop-common/hadoop-hdf"
> > That seems like a strange assumption for a "Getting Started" document.
> > Perhaps the pointer leads to the wrong document?
> >
> > Can someone say where to find the process for how to install from the
> > downloaded stable release tarball?
> >
> >



Re: How to install from downloaded tarball from hadoop.2.2.0.tar.gz

Posted by Thomas Bentsen <th...@bentzn.com>.
Thanks a lot Sujee!
You saved my week!



I followed your instructions and they worked - with a few non-essential
comments or 'quirks':



Setup:
AWS AMI Linux x86_64
Oracle JDK 1.7.0_25

------------------------------------------
Comments:
------------------------------------------

The system complains that core-site.xml and hdfs-site.xml should not contain
'&'.
Once that character is removed, everything runs smoothly.
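For comparison, a minimal core-site.xml for a single-node setup usually carries just one property; the hostname/port value below is an assumption (the one most single-node guides use), and note that any literal '&' in a value would have to be escaped as '&amp;', since these files are XML:

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```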

------------------------------------------

Full console output from starting HDFS (in case you want it, so other
users know what to expect):

>>> start-dfs.sh 
14/01/29 10:27:15 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging
to /data/system/hadoop-2.2.0/logs/hadoop-ec2-user
namenode-ip-XX.xx.XX.xx.out
localhost: starting datanode, logging
to /data/system/hadoop-2.2.0/logs/hadoop-ec2-user-datanode-ip-XX.xx.XX.xx.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
RSA key fingerprint is XX:xx:XX:xx:XX:xx:XX:xx:XX:xx:XX:xx:XX:xx.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (RSA) to the list of known
hosts.
0.0.0.0: starting secondarynamenode, logging
to /data/system/hadoop-2.2.0/logs/hadoop-ec2-user-secondarynamenode-ip-XX.xx.XX.xx.out
14/01/29 10:27:44 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where applicable

------------------------------------------

I've never used the jps command and couldn't find it on the PATH.

Instead I confirmed the HDFS daemons were running with 'ps aux | grep java'
 - output (a lot!):
3 processes with a lot of info, but containing '-Dproc_namenode',
'-Dproc_datanode' and '-Dproc_secondarynamenode'
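That check can be scripted: each Hadoop daemon tags its Java command line with a -Dproc_<name> marker, so the pipeline below pulls the daemon names out (run here against a canned sample line, since real 'ps aux' output varies from host to host):

```shell
# 'sample' stands in for the real 'ps aux | grep [j]ava' output.
sample='hadoop 1234 ... java -Dproc_namenode ...
hadoop 1235 ... java -Dproc_datanode ...
hadoop 1236 ... java -Dproc_secondarynamenode ...'

# Pull out the -Dproc_ markers and strip the prefix.
daemons=$(printf '%s\n' "$sample" | grep -o 'Dproc_[a-z]*' | sed 's/Dproc_//')
printf '%s\n' "$daemons"
# prints: namenode, datanode, secondarynamenode (one per line)
```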

------------------------------------------

Confirmed YARN up with 'ps aux | grep java' 
'-Dproc_resourcemanager' and '-Dproc_nodemanager' are now also listed

------------------------------------------

Still get that 
'WARN util.NativeCodeLoader: Unable to load native-hadoop library for
your platform... using builtin-java classes where applicable'
 - but it does not seem to matter

------------------------------------------

I get a lot of warnings about configuration names being deprecated when
running the MR job - but it works.

------------------------------------------

The command to stop HDFS is 'stop-dfs.sh' and not 'stop-hdfs.sh'

------------------------------------------


Again: Thanks a lot!!!




/th





============================================
============================================






On Wed, 2014-01-29 at 00:39 -0800, Sujee Maniyam wrote:
> You might find this post (mine) useful : http://hadoopilluminated.com/blog/?p=34
> (covers single node install)
> Sujee Maniyam (http://sujee.net)
> 
> 
> On Tue, Jan 28, 2014 at 10:12 PM, Bill Bruns <bi...@yahoo.com> wrote:
> > Hello,
> > I downloaded the latest stable hadoop release rfom the mirrors as a tarball:
> > hadoop.2.2.0.tar.gz
> > Then extracted the files with Archive Manager (on Ubuntu 12.10)
> >
> > There are no install docs in the top level and no documentation directory.
> >
> > Then, the "Getting Started" links on http://hadoop.apache.org/docs/current/
> > led to
> > http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
> > that has the text
> > "Assuming you have installed hadoop-common/hadoop-hdf"
> > That seems like a strange assumption for a "Getting Started" document.
> > Perhaps the pointer leads to the wrong document?
> >
> > Can someone say where to find the process for how to install from the
> > downloaded stable release tarball?
> >
> >





Re: How to install from downloaded tarball from hadoop.2.2.0.tar.gz

Posted by Sujee Maniyam <su...@sujee.net>.
You might find this post (mine) useful : http://hadoopilluminated.com/blog/?p=34
(covers single node install)
Sujee Maniyam (http://sujee.net)
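The single-node install that post walks through boils down to a few steps; here is a hedged sketch (the install prefix, release file name, and Java path are placeholders, not taken from the post):

```shell
# 1. Unpack the release tarball (prefix /opt is an assumption):
#      tar -xzf hadoop-2.2.0.tar.gz -C /opt
# 2. Point HADOOP_HOME at the unpacked tree and add its scripts to PATH:
HADOOP_HOME=/opt/hadoop-2.2.0
PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
export HADOOP_HOME PATH
# 3. Set JAVA_HOME in $HADOOP_HOME/etc/hadoop/hadoop-env.sh, e.g.:
#      export JAVA_HOME=/usr/lib/jvm/java-7-oracle
echo "$HADOOP_HOME"
```

After that, formatting the namenode and running start-dfs.sh / start-yarn.sh follows as in the linked post.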


On Tue, Jan 28, 2014 at 10:12 PM, Bill Bruns <bi...@yahoo.com> wrote:
> Hello,
> I downloaded the latest stable hadoop release from the mirrors as a tarball:
> hadoop.2.2.0.tar.gz
> Then extracted the files with Archive Manager (on Ubuntu 12.10)
>
> There are no install docs in the top level and no documentation directory.
>
> Then, the "Getting Started" links on http://hadoop.apache.org/docs/current/
> led to
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
> that has the text
> "Assuming you have installed hadoop-common/hadoop-hdf"
> That seems like a strange assumption for a "Getting Started" document.
> Perhaps the pointer leads to the wrong document?
>
> Can someone say where to find the process for how to install from the
> downloaded stable release tarball?
>
>
