Posted to mapreduce-user@hadoop.apache.org by Thomas Bentsen <th...@bentzn.com> on 2014/01/27 16:34:05 UTC
Starting... -help needed
Hello everyone
I have recently decided to try out the Hadoop complex.
According to the getting-started guide I am supposed to change the config in
[hadoop]/conf/*
But there is no such conf directory.
It looks a lot like I am supposed to copy the files from the tar-file
all over the OS into the folders with relatively well known names on a
Linux machine. I can try to guess where they all go... -but..? Really?
I need a very basic:
1) First you download the package
2) Then you unpack it
3) All the files in ... are copied/symlinked to ...
4) All the files in ... are copied/symlinked to ...
..
..
x) Config is done in ...
y) Starting services is done with ...
I wouldn't mind writing it myself to help others get started from
scratch but I need somewhere to start.
best regards
Thomas Bentsen
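[Archive note: the numbered steps asked for above can be sketched roughly as below for the 2.2.0 binary tarball. Paths reflect the 2.x layout discussed later in this thread; the mirror URL and exact file names are illustrative assumptions, not verified against every release. The key point: in 2.x nothing needs to be scattered across the OS - everything stays under the unpacked directory.]

```shell
# Hypothetical walkthrough of the requested steps for the 2.2.0 binary
# tarball (layout assumed from the 2.x docs, not a definitive guide).

HADOOP_VERSION=2.2.0
HADOOP_HOME="$PWD/hadoop-$HADOOP_VERSION"

# 1) download the binary package (example mirror URL, may differ):
# wget "https://archive.apache.org/dist/hadoop/common/hadoop-$HADOOP_VERSION/hadoop-$HADOOP_VERSION.tar.gz"

# 2) unpack it in place - no files are copied elsewhere on the system:
# tar -xzf "hadoop-$HADOOP_VERSION.tar.gz"

# x) config lives inside the unpacked tree (etc/hadoop), not conf/ as in 1.x:
echo "config dir: $HADOOP_HOME/etc/hadoop"

# y) services are started from scripts in the same tree:
# "$HADOOP_HOME/sbin/start-dfs.sh"
# "$HADOOP_HOME/sbin/start-yarn.sh"
```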
Re: Starting... -help needed
Posted by Thomas Bentsen <th...@bentzn.com>.
Anything on this?
I am pretty stuck here.
It is _not_ possible to install and run Hadoop 2.2.0 with the instructions
on the website. I am sure this is not how software from the Apache
Software Foundation is supposed to work: frustrating!
Where is the 'MapReduce tarball' in the binary download? It's mentioned
on the 'Single Node Setup' page - with the title 'Hadoop MapReduce Next
Generation - Setting up a Single Node Cluster'.
Where do I find the 'hadoop-common' and the 'hadoop-common/hadoop-hdfs'
path in the binary download as mentioned on the 'Single Node Setup'
page?
The 'core-default.xml' mentioned on the 'cluster setup' page is buried
in
'[hadoop_2.2.0_bin_package]/share/doc/hadoop/hadoop-project-dist/hadoop-common'
Is that _really_ where 'Read-only default configuration' is supposed to
be?
The 'conf/core-site.xml' does not exist!
However there is a
'[hadoop_2.2.0_bin_package]/etc/hadoop/core-site.xml'? Is that it?
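[Archive note: a minimal single-node core-site.xml of the kind the 2.x docs describe, for reference. The `fs.defaultFS` property name is the 2.x replacement for 1.x's `fs.default.name`; the localhost value below is the commonly shown single-node example, an assumption here rather than something from this thread.]

```xml
<configuration>
  <property>
    <!-- 2.x name for the default filesystem; single-node example value -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```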
After all this confusion I gave up on exporting any environment variables
pointing anywhere and running anything - and then I was pretty much
stuck.
As I mentioned before: I wouldn't mind writing this so other newbies can
get started without beating up the dog too much but I myself need to get
started somewhere.
best regards
Thomas Bentsen
PS -to introduce myself:
10+ years of experience architecting and programming proprietary,
high-load server systems (Java) for gaming, data acquisition and business.
On Mon, 2014-01-27 at 19:53 +0000, Thomas Bentsen wrote:
> I did try the 1.2 release and that worked right out of the box -
> positive surprise.
>
>
> For 2.2.0 I followed:
>
>
> hadoop.apache.org > 'Getting Started' > 'Learn about'
> - brings up a page that describes 'What's new in 2.2.0'
>
> > Single Node Setup (in the left menu)
> - tells me that I should do something to 'the MapReduce tarball from
> the release' ?
>
> > Cluster Setup (also left menu)
> - mentions the conf directory under 'Running Hadoop in Non-Secure Mode'
>
>
> Since I would like to at least give the cluster setup a spin before
> going any further that's where I stopped.
>
>
> I just noticed that the menu-listing - just under the Hadoop logo - is
> 'Apache > Hadoop > Apache Hadoop Project Dist POM > Apache Hadoop
> 2.1.1-beta'
>
> Maybe some links need changing?
>
> On Mon, 2014-01-27 at 14:27 -0500, Chris Mawata wrote:
> > Correct. It seems you are reading literature on version 1.x but your
> > software is 2.x. (What are
> > you using for directions?) There will be a few changes in the location
> > of files.
> >
> > On 1/27/2014 12:46 PM, Thomas Bentsen wrote:
> > > I do have the [hadoop]/etc/hadoop dir and it looks like it has more or
> > > less the same config files that are in the v. 1.2 conf dir.
> > >
> > >
> > >
> > >
> > >
> > > On Mon, 2014-01-27 at 12:19 -0500, Chris Mawata wrote:
> > >> Check if you have [hadoop]/etc/hadoop, as the configuration directory
> > >> is different in version 2.x
> > >>
> > >> On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
> > >> Hello everyone
> > >>
> > >> I have recently decided to try out the Hadoop complex.
> > >>
> > >> According to the getting started I am supposed to change the
> > >> config in
> > >> [hadoop]/conf/*
> > >> But there is no such conf directory.
> > >>
> > >> It looks a lot like I am supposed to copy the files from the
> > >> tar-file
> > >> all over the OS into the folders with relatively well known
> > >> names on a
> > >> Linux machine. I can try to guess where they all go... -but..?
> > >> Really?
> > >>
> > >> I need a very basic:
> > >> 1) First you download the package
> > >> 2) Then you unpack it
> > >> 3) All the files in ... are copied/symlinked to ...
> > >> 4) All the files in ... are copied/symlinked to ...
> > >> ..
> > >> ..
> > >> x) Config is done in ...
> > >> y) Starting services is done with ...
> > >>
> > >> I wouldn't mind writing it myself to help others get started
> > >> from
> > >> scratch but I need somewhere to start.
> > >>
> > >>
> > >>
> > >> best regards
> > >>
> > >> Thomas Bentsen
> > >>
> > >
> > >
> >
>
>
Re: Starting... -help needed
Posted by Thomas Bentsen <th...@bentzn.com>.
I did try the 1.2 release and that worked right out of the box - a
positive surprise.
For 2.2.0 I followed:
hadoop.apache.org > 'Getting Started' > 'Learn about'
- brings up a page that describes 'What's new in 2.2.0'
> Single Node Setup (in the left menu)
- tells me that I should do something to 'the MapReduce tarball from
the release' ?
> Cluster Setup (also left menu)
- mentions the conf directory under 'Running Hadoop in Non-Secure Mode'
Since I would like to at least give the cluster setup a spin before
going any further that's where I stopped.
I just noticed that the menu-listing - just under the Hadoop logo - is
'Apache > Hadoop > Apache Hadoop Project Dist POM > Apache Hadoop
2.1.1-beta'
Maybe some links need changing?
On Mon, 2014-01-27 at 14:27 -0500, Chris Mawata wrote:
> Correct. It seems you are reading literature on version 1.x but your
> software is 2.x. (What are
> you using for directions?) There will be a few changes in the location
> of files.
>
> On 1/27/2014 12:46 PM, Thomas Bentsen wrote:
> > I do have the [hadoop]/etc/hadoop dir and it looks like it has more or
> > less the same config files that are in the v. 1.2 conf dir.
> >
> >
> >
> >
> >
> > On Mon, 2014-01-27 at 12:19 -0500, Chris Mawata wrote:
> >> Check if you have [hadoop]/etc/hadoop, as the configuration directory
> >> is different in version 2.x
> >>
> >> On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
> >> Hello everyone
> >>
> >> I have recently decided to try out the Hadoop complex.
> >>
> >> According to the getting started I am supposed to change the
> >> config in
> >> [hadoop]/conf/*
> >> But there is no such conf directory.
> >>
> >> It looks a lot like I am supposed to copy the files from the
> >> tar-file
> >> all over the OS into the folders with relatively well known
> >> names on a
> >> Linux machine. I can try to guess where they all go... -but..?
> >> Really?
> >>
> >> I need a very basic:
> >> 1) First you download the package
> >> 2) Then you unpack it
> >> 3) All the files in ... are copied/symlinked to ...
> >> 4) All the files in ... are copied/symlinked to ...
> >> ..
> >> ..
> >> x) Config is done in ...
> >> y) Starting services is done with ...
> >>
> >> I wouldn't mind writing it myself to help others get started
> >> from
> >> scratch but I need somewhere to start.
> >>
> >>
> >>
> >> best regards
> >>
> >> Thomas Bentsen
> >>
> >
> >
>
Re: Starting... -help needed
Posted by Chris Mawata <ch...@gmail.com>.
Correct. It seems you are reading literature on version 1.x but your
software is 2.x. (What are
you using for directions?) There will be a few changes in the location
of files.
On 1/27/2014 12:46 PM, Thomas Bentsen wrote:
> I do have the [hadoop]/etc/hadoop dir and it looks like it has more or
> less the same config files that are in the v. 1.2 conf dir.
>
>
>
>
>
> On Mon, 2014-01-27 at 12:19 -0500, Chris Mawata wrote:
>> Check if you have [hadoop]/etc/hadoop, as the configuration directory
>> is different in version 2.x
>>
>> On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
>> Hello everyone
>>
>> I have recently decided to try out the Hadoop complex.
>>
>> According to the getting started I am supposed to change the
>> config in
>> [hadoop]/conf/*
>> But there is no such conf directory.
>>
>> It looks a lot like I am supposed to copy the files from the
>> tar-file
>> all over the OS into the folders with relatively well known
>> names on a
>> Linux machine. I can try to guess where they all go... -but..?
>> Really?
>>
>> I need a very basic:
>> 1) First you download the package
>> 2) Then you unpack it
>> 3) All the files in ... are copied/symlinked to ...
>> 4) All the files in ... are copied/symlinked to ...
>> ..
>> ..
>> x) Config is done in ...
>> y) Starting services is done with ...
>>
>> I wouldn't mind writing it myself to help others get started
>> from
>> scratch but I need somewhere to start.
>>
>>
>>
>> best regards
>>
>> Thomas Bentsen
>>
>
>
Re: Starting... -help needed
Posted by Chris Mawata <ch...@gmail.com>.
Correct. It seems you are reading literature on version 1.x but your
software is 2.x. (What are
you using for directions?) There will be a few changes in the location
of files.
On 1/27/2014 12:46 PM, Thomas Bentsen wrote:
> I do have the [hadoop]/etc/hadoop dir and it looks like it has more or
> less the same config files that are in the v. 1.2 conf dir.
>
>
>
>
>
> On Mon, 2014-01-27 at 12:19 -0500, Chris Mawata wrote:
>> Check if you have [hadoop]/etc/hadoop as the configuration directory
>> is different in version 2.x
>>
>> On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
>> Hello everyone
>>
>> I have recently decided to try out the Hadoop complex.
>>
>> According to the getting started I am supposed to change the
>> config in
>> [hadoop]/conf/*
>> But there is no such conf directory.
>>
>> It looks a lot like I am supposed to copy the files from the
>> tar-file
>> all over the OS into the folders with relatively well known
>> names on a
>> Linux machine. I can try to guess where they all go... -but..?
>> Really?
>>
>> I need a very basic:
>> 1) First you download the package
>> 2) Then you unpack it
>> 3) All the files in ... are copied/symlinked to ...
>> 4) All the files in ... are copied/symlinked to ...
>> ..
>> ..
>> x) Config is done in ...
>> y) Starting services is done with ...
>>
>> I wouldn't mind writing it myself to help others get started
>> from
>> scratch but I need somewhere to start.
>>
>>
>>
>> best regards
>>
>> Thomas Bentsen
>>
>
>
Re: Starting... -help needed
Posted by Chris Mawata <ch...@gmail.com>.
Correct. It seems you are reading literature on version 1.x but your
software is 2.x. (What are
you using for directions?) There will be a few changes in the location
of files.
On 1/27/2014 12:46 PM, Thomas Bentsen wrote:
> I do have the [hadoop]/etc/hadoop dir and it looks like it has more or
> less the same config files that are in the v. 1.2 conf dir.
>
>
>
>
>
> On Mon, 2014-01-27 at 12:19 -0500, Chris Mawata wrote:
>> Check if you have [hadoop]/etc/hadoop as the configuration directory
>> is different in version 2.x
>>
>> On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
>> Hello everyone
>>
>> I have recently decided to try out the Hadoop complex.
>>
>> According to the getting started I am supposed to change the
>> config in
>> [hadoop]/conf/*
>> But there is no such conf directory.
>>
>> It looks a lot like I am supposed to copy the files from the
>> tar-file
>> all over the OS into the folders with relatively well known
>> names on a
>> Linux machine. I can try to guess where they all go... -but..?
>> Really?
>>
>> I need a very basic:
>> 1) First you download the package
>> 2) Then you unpack it
>> 3) All the files in ... are copied/symlinked to ...
>> 4) All the files in ... are copied/symlinked to ...
>> ..
>> ..
>> x) Config is done in ...
>> y) Starting services is done with ...
>>
>> I wouldn't mind writing it myself to help others get started
>> from
>> scratch but I need somewhere to start.
>>
>>
>>
>> best regards
>>
>> Thomas Bentsen
>>
>
>
Re: Starting... -help needed
Posted by Thomas Bentsen <th...@bentzn.com>.
I do have the [hadoop]/etc/hadoop dir and it looks like it has more or
less the same config files that are in the v. 1.2 conf dir.
On Mon, 2014-01-27 at 12:19 -0500, Chris Mawata wrote:
> Check if you have [hadoop]/etc/hadoop as the configuration directory
> is different in version 2.x
>
> On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
> Hello everyone
>
> I have recently decided to try out the Hadoop complex.
>
> According to the getting started I am supposed to change the
> config in
> [hadoop]/conf/*
> But there is no such conf directory.
>
> It looks a lot like I am supposed to copy the files from the
> tar-file
> all over the OS into the folders with relatively well known
> names on a
> Linux machine. I can try to guess where they all go... -but..?
> Really?
>
> I need a very basic:
> 1) First you download the package
> 2) Then you unpack it
> 3) All the files in ... are copied/symlinked to ...
> 4) All the files in ... are copied/symlinked to ...
> ..
> ..
> x) Config is done in ...
> y) Starting services is done with ...
>
> I wouldn't mind writing it myself to help others get started
> from
> scratch but I need somewhere to start.
>
>
>
> best regards
>
> Thomas Bentsen
>
Re: Starting... -help needed
Posted by Chris Mawata <ch...@gmail.com>.
Check whether you have [hadoop]/etc/hadoop; the configuration directory is
different in version 2.x
On Jan 27, 2014 10:37 AM, "Thomas Bentsen" <th...@bentzn.com> wrote:
> Hello everyone
>
> I have recently decided to try out the Hadoop complex.
>
> According to the getting started I am supposed to change the config in
> [hadoop]/conf/*
> But there is no such conf directory.
>
> It looks a lot like I am supposed to copy the files from the tar-file
> all over the OS into the folders with relatively well known names on a
> Linux machine. I can try to guess where they all go... -but..? Really?
>
> I need a very basic:
> 1) First you download the package
> 2) Then you unpack it
> 3) All the files in ... are copied/symlinked to ...
> 4) All the files in ... are copied/symlinked to ...
> ..
> ..
> x) Config is done in ...
> y) Starting services is done with ...
>
> I wouldn't mind writing it myself to help others get started from
> scratch but I need somewhere to start.
>
>
>
> best regards
>
> Thomas Bentsen
>
>
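To make the layout difference concrete, here is a minimal shell sketch of the check suggested above. The `/tmp/hadoop-2.2.0` path is only a stand-in for wherever the tarball was unpacked, and the `mkdir` merely simulates the 2.2.0 binary layout for the purpose of the sketch:

```shell
#!/bin/sh
# Stand-in for the unpack location of the Hadoop tarball (assumption).
HADOOP_HOME=/tmp/hadoop-2.2.0
# Simulate the 2.2.0 binary layout so the sketch is self-contained.
mkdir -p "$HADOOP_HOME/etc/hadoop"

# In 2.x the config files (core-site.xml and friends) live under
# etc/hadoop; in 1.x they lived under conf.
if [ -d "$HADOOP_HOME/etc/hadoop" ]; then
    CONF_DIR="$HADOOP_HOME/etc/hadoop"   # 2.x layout
elif [ -d "$HADOOP_HOME/conf" ]; then
    CONF_DIR="$HADOOP_HOME/conf"         # 1.x layout
fi
echo "edit core-site.xml and friends in: $CONF_DIR"
```

Once the config directory is located, `bin/hdfs namenode -format` followed by `sbin/start-dfs.sh` are the usual next steps in the 2.x single-node instructions.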