Posted to common-user@hadoop.apache.org by Shahnawaz Saifi <sh...@gmail.com> on 2011/07/01 09:05:21 UTC

Re: problem regarding the hadoop

You can create multiple users on the cluster because it follows the
standard Linux security model (users, groups, and permissions). But the
Hadoop-related daemons (NN, DN, JT, TT) need to be started by a single
user, e.g. hadoop.
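
For example, here is a rough sketch of what I mean on a single node (the
group name and the extra user names like alice/bob are just placeholders,
and I am assuming the install lives in /usr/local/hadoop as in the
transcript below):

# dedicated account and group that own the Hadoop install and run the daemons
sudo groupadd hadoop
sudo useradd -m -g hadoop hadoop

# make sure the daemons can write their logs and pid files under the
# install directory (this is what the "Permission denied" errors are about)
sudo mkdir -p /usr/local/hadoop/logs
sudo chown -R hadoop:hadoop /usr/local/hadoop

# other cluster users are ordinary Linux accounts; they only submit jobs
# and never start the daemons
sudo useradd -m alice
sudo useradd -m bob

# the daemons are always started as the hadoop user (or log in as hadoop
# and run start-all.sh directly, as in the transcript below)
sudo -u hadoop /usr/local/hadoop/bin/start-all.sh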

Thanks,
Shah

On Fri, Jul 1, 2011 at 4:57 AM, Mitra Kaseebhotla <mitra.kaseebhotla@gmail.com> wrote:

> Not to divert the question here, but I would like to know how we should
> manage the cluster users (I am a newbie), instead of having one user's
> (hadoop) credentials shared?
>
> Thanks
> Mitra.
>
> On Thu, Jun 30, 2011 at 4:23 PM, Paul Rimba <pa...@gmail.com> wrote:
>
> > sudo chown -R hadoop:hadoop /usr/local/hadoop
> > That will give ownership of the directory to your hadoop account.
> >
> > > On Fri, Jul 1, 2011 at 5:07 AM, Dhruv Kumar <dk...@ecs.umass.edu> wrote:
> > >
> > > It is a permission issue. Are you sure that the account "hadoop" has
> > > read and write access to the /usr/local/* directories?
> > >
> > > The installation of Hadoop has always been effortless for me. Just
> > > follow the step-by-step instructions at:
> > > http://hadoop.apache.org/common/docs/stable/single_node_setup.html
> > >
> > >
> > >
> > >
> > > On Thu, Jun 30, 2011 at 11:08 AM, Ashish Tamrakar <as...@gmail.com> wrote:
> > >
> > > > I am having a problem starting my Hadoop. I set it up for a
> > > > single-node cluster. Please help me solve this!
> > > >
> > > > hadoop@ashishpc:~$ /usr/local/hadoop/bin/start-all.sh
> > > > starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-ashishpc.out
> > > > /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-ashishpc.out: Permission denied
> > > > head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-ashishpc.out' for reading: No such file or directory
> > > > localhost: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-ashishpc.out
> > > > localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-ashishpc.out: Permission denied
> > > > localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-ashishpc.out' for reading: No such file or directory
> > > > localhost: starting secondarynamenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ashishpc.out
> > > > localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ashishpc.out: Permission denied
> > > > localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ashishpc.out' for reading: No such file or directory
> > > > starting jobtracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-jobtracker-ashishpc.out
> > > > /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-jobtracker-ashishpc.out: Permission denied
> > > > head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-jobtracker-ashishpc.out' for reading: No such file or directory
> > > > localhost: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-tasktracker-ashishpc.out
> > > > localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-tasktracker-ashishpc.out: Permission denied
> > > > localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-tasktracker-ashishpc.out' for reading: No such file or directory
> > > >
> > >
> >
>
>
>
> --
>  ________________________________________
>
>  To Understand Recursion,
>  You must first Understand Recursion
>  ________________________________________
>

Re: problem regarding the hadoop

Posted by Shahnawaz Saifi <sh...@gmail.com>.
Harsh,

Thanks for sharing. Recently, I ran into a multi-user issue on our
cluster because we were always using the same user to run MR jobs, so I
found the workaround below for it.

I am curious to know: how do we separate the HDFS processes for
different users?



On Fri, Jul 1, 2011 at 1:39 PM, Harsh J <ha...@cloudera.com> wrote:

> Shahnawaz,
>
> If required, the MR and the HDFS processes can be user-separated too.
> It's generally a good thing to do in practice (so that the MR daemons
> and users don't get "superuser" access to HDFS files).
>
> FWIW, I usually flip open the cluster setup guide on ccp.cloudera.com
> and everything comes up right when I follow that (it's got footnotes
> too, yay!).
>
> But yes, managing local permissions is quite a pain to do manually,
> and I believe the new Bigtop incubator entrant is gonna help in
> setting up clusters painlessly as we go forward.
>
> --
> Harsh J
>



-- 
Thanks,
Shah

Re: problem regarding the hadoop

Posted by Harsh J <ha...@cloudera.com>.
Shahnawaz,

If required, the MR and the HDFS processes can be user-separated too.
It's generally a good thing to do in practice (so that the MR daemons
and users don't get "superuser" access to HDFS files).

FWIW, I usually flip open the cluster setup guide on ccp.cloudera.com
and everything comes up right when I follow that (it's got footnotes
too, yay!).

But yes, managing local permissions is quite a pain to do manually,
and I believe the new Bigtop incubator entrant is gonna help in
setting up clusters painlessly as we go forward.
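
Roughly, the separation can look like this on an 0.20-style cluster (the
hdfs/mapred account names, the /usr/local/hadoop install path and the
local directory paths are only examples, not mandated anywhere; map them
onto your own dfs.name.dir / dfs.data.dir / mapred.local.dir settings):

# one role account per daemon family
sudo useradd -r hdfs
sudo useradd -r mapred

# each role owns only its own local directories
sudo chown -R hdfs /var/lib/hadoop/dfs
sudo chown -R mapred /var/lib/hadoop/mapred

# HDFS daemons start as hdfs
sudo -u hdfs /usr/local/hadoop/bin/hadoop-daemon.sh start namenode
sudo -u hdfs /usr/local/hadoop/bin/hadoop-daemon.sh start datanode
sudo -u hdfs /usr/local/hadoop/bin/hadoop-daemon.sh start secondarynamenode

# MR daemons start as mapred
sudo -u mapred /usr/local/hadoop/bin/hadoop-daemon.sh start jobtracker
sudo -u mapred /usr/local/hadoop/bin/hadoop-daemon.sh start tasktracker

# and inside HDFS, hand the JobTracker its system directory without
# giving it superuser rights over everything else (path is an example)
sudo -u hdfs /usr/local/hadoop/bin/hadoop fs -mkdir /mapred/system
sudo -u hdfs /usr/local/hadoop/bin/hadoop fs -chown mapred /mapred/system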

On Fri, Jul 1, 2011 at 12:35 PM, Shahnawaz Saifi <sh...@gmail.com> wrote:
> You can create multiple users on the cluster because it follows the
> standard Linux security model (users, groups, and permissions). But
> the Hadoop-related daemons (NN, DN, JT, TT) need to be started by a
> single user, e.g. hadoop.
>
> Thanks,
> Shah
>



-- 
Harsh J