Posted to user@hadoop.apache.org by Ravikant Dindokar <ra...@gmail.com> on 2016/01/17 11:36:21 UTC

Sharing single hadoop installation for multiple users on cluster

Hi Hadoop user,

I have hadoop-2.6 installed on my cluster with 11 nodes. I have installed
it under one specific user. Now I want to allow other users on the cluster
to share the same Hadoop installation. What changes do I need to make in
order to allow access to other users?

Thanks
Ravikant

Re: Sharing single hadoop installation for multiple users on cluster

Posted by Ravikant Dindokar <ra...@gmail.com>.
Hi Mohit,

I am facing a new issue here. I have Hadoop installed under the user
hduser, and other users are now able to run Hadoop. But the log files
that are generated do not have read permissions for other users.
I tried

chmod -R 777 $HADOOP_HOME/logs

but every time a new folder is written for a YARN application under
$HADOOP_HOME/logs/userlogs, other users do not have access permissions to
that folder. Is there a configuration parameter that can be used to fix
this?
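One workaround that may make this stick, pending a proper configuration switch, is to open up the existing log tree and set a default ACL so that the per-application directories YARN creates later inherit world-read access. This is only a sketch, assuming Linux nodes with the POSIX ACL tools installed; it would be run on each node as the hduser that owns the installation:

```shell
# Sketch: give "other" users read access to YARN app logs, now and later.
open_userlogs() {
    # $1: the userlogs directory, e.g. "$HADOOP_HOME/logs/userlogs"
    logdir=$1
    # Open what is already there: read for files, traverse for directories.
    chmod -R o+rX "$logdir"
    # Default ACL so directories created later inherit the same access;
    # best effort, since setfacl may be missing on minimal systems.
    setfacl -R -d -m o::rX "$logdir" 2>/dev/null || true
}
```

Invoked as, for example, `open_userlogs "$HADOOP_HOME/logs/userlogs"`; note the default ACL only helps if the filesystem is mounted with ACL support.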


Thanks
Ravikant

On Tue, Jan 19, 2016 at 10:50 AM, Ravikant Dindokar <ravikant.iisc@gmail.com
> wrote:

> Hi Mohit,
>
> thanks for your response. It worked for me. For HDFS permissions,
>
> hadoop dfs -chmod -R 777 /
>
> did the trick.
>
> Thanks
> Ravikant
>
> On Mon, Jan 18, 2016 at 2:58 PM, mohit.kaushik <mo...@orkash.com>
> wrote:
>
>> You should also set the DFS permissions. Simply run
>>
>>  hadoop fs -chmod 777 /
>>
>> or set 'dfs.permissions' to 'false'.
>>
>> - Mohit Kaushik
>>
>>
>> On 01/18/2016 02:43 PM, mohit.kaushik wrote:
>>
>> If you started the Hadoop daemons as hduser, they will not be shown for
>> the user foo (or any other user), since the Hadoop daemons are just Java
>> processes owned by hduser. But you can still run your jobs as any other
>> user. Ensure that the user foo has access to the Hadoop directories. You
>> also don't have to create a directory in HDFS for the user. I hope this
>> resolves your problem.
>>
>> hduser $ start-all.sh
>> hduser $ su - other
>> other $ /home/hduser/hadoop203/bin/hadoop jar
>> /home/hduser/hadoop203/hadoop-examples*.jar pi 1 1
>>
>> -Mohit Kaushik
>>
>>
>> On 01/18/2016 11:56 AM, Ravikant Dindokar wrote:
>>
>> Hi Mohit,
>>
>> Thanks for your reply. Let me describe my problem in detail.
>> I have installed Hadoop under a user called 'hduser', and HADOOP_HOME
>> points to a folder in hduser's home directory. Now I have added another
>> user, foo, on the cluster. I changed the access permissions on the
>> following directories to 777:
>> 1. the Hadoop installation directory (pointed to by HADOOP_HOME)
>> 2. dfs.datanode.data.dir
>> 3. dfs.namenode.name.dir
>> 4. hadoop.tmp.dir
>>
>> I have also created the directory /user/foo inside HDFS.
>>
>> After starting the HDFS and YARN daemons, I am not able to see these
>> processes as the foo user, and so I am not able to submit jobs.
>>
>> Can you point out what I am missing here?
>>
>> Thanks
>> Ravikant
>>
>> On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik <mohit.kaushik@orkash.com
>> > wrote:
>>
>>> Hadoop uses the Linux system users. I think you don't have to make any
>>> changes; just create a new user on your system and give it access to
>>> Hadoop, i.e. grant permissions on the Hadoop installation and data
>>> directories.
>>>
>>> -Mohit Kaushik
>>>
>>>
>>> On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
>>>
>>> Hi Hadoop user,
>>>
>>> I have hadoop-2.6 installed on my cluster with 11 nodes. I have
>>> installed it under one specific user. Now I want to allow other users on
>>> the cluster to share the same Hadoop installation. What changes do I need
>>> to make in order to allow access to other users?
>>>
>>> Thanks
>>> Ravikant
>>>
>>>
>>
>>
>> --
>>
>> * Mohit Kaushik*
>> Software Engineer
>> A Square,Plot No. 278, Udyog Vihar, Phase 2, Gurgaon 122016, India
>> *Tel:* +91 (124) 4969352 | *Fax:* +91 (124) 4033553
>>
>>
>> *This message including the attachments, if any, is a confidential
>> business communication. If you are not the intended recipient it may be
>> unlawful for you to read, copy, distribute, disclose or otherwise use the
>> information in this e-mail. If you have received it in error or are not the
>> intended recipient, please destroy it and notify the sender immediately.
>> Thank you *
>>
>
>


Re: Sharing single hadoop installation for multiple users on cluster

Posted by Ravikant Dindokar <ra...@gmail.com>.
Hi Mohit,

thanks for your response. It worked for me. For HDFS permissions,

hadoop dfs -chmod -R 777 /

did the trick.

Thanks
Ravikant
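A narrower alternative to making the whole HDFS namespace world-writable is to give each additional account its own HDFS home directory. The following is a sketch, using `foo`, the example user from this thread; it guards for machines without an `hdfs` client on the PATH:

```shell
# Sketch: per-user HDFS home directory instead of 'chmod -R 777 /'.
NEWUSER=foo   # example user from this thread
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p "/user/$NEWUSER"          # create the home dir
    hdfs dfs -chown "$NEWUSER" "/user/$NEWUSER"  # hand ownership to the user
    hdfs dfs -chmod 755 "/user/$NEWUSER"         # readable, not world-writable
else
    echo "hdfs client not found; nothing to do"
fi
DONE=1
```

Run as the HDFS superuser (hduser here); jobs submitted by foo then write under /user/foo without the rest of the namespace being opened up.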

On Mon, Jan 18, 2016 at 2:58 PM, mohit.kaushik <mo...@orkash.com>
wrote:

> You should also set the DFS permissions. Simply run
>
>  hadoop fs -chmod 777 /
>
> or set 'dfs.permissions' to 'false'.
>
> - Mohit Kaushik
>
>
> On 01/18/2016 02:43 PM, mohit.kaushik wrote:
>
> If you started the Hadoop daemons as hduser, they will not be shown for
> the user foo (or any other user), since the Hadoop daemons are just Java
> processes owned by hduser. But you can still run your jobs as any other
> user. Ensure that the user foo has access to the Hadoop directories. You
> also don't have to create a directory in HDFS for the user. I hope this
> resolves your problem.
>
> hduser $ start-all.sh
> hduser $ su - other
> other $ /home/hduser/hadoop203/bin/hadoop jar
> /home/hduser/hadoop203/hadoop-examples*.jar pi 1 1
>
> -Mohit Kaushik
>
>
> On 01/18/2016 11:56 AM, Ravikant Dindokar wrote:
>
> Hi Mohit,
>
> Thanks for your reply. Let me describe my problem in detail.
> I have installed Hadoop under a user called 'hduser', and HADOOP_HOME
> points to a folder in hduser's home directory. Now I have added another
> user, foo, on the cluster. I changed the access permissions on the
> following directories to 777:
> 1. the Hadoop installation directory (pointed to by HADOOP_HOME)
> 2. dfs.datanode.data.dir
> 3. dfs.namenode.name.dir
> 4. hadoop.tmp.dir
>
> I have also created the directory /user/foo inside HDFS.
>
> After starting the HDFS and YARN daemons, I am not able to see these
> processes as the foo user, and so I am not able to submit jobs.
>
> Can you point out what I am missing here?
>
> Thanks
> Ravikant
>
> On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik <mo...@orkash.com>
> wrote:
>
>> Hadoop uses the Linux system users. I think you don't have to make any
>> changes; just create a new user on your system and give it access to
>> Hadoop, i.e. grant permissions on the Hadoop installation and data
>> directories.
>>
>> -Mohit Kaushik
>>
>>
>> On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
>>
>> Hi Hadoop user,
>>
>> I have hadoop-2.6 installed on my cluster with 11 nodes. I have installed
>> it under one specific user. Now I want to allow other users on the cluster
>> to share the same Hadoop installation. What changes do I need to make in
>> order to allow access to other users?
>>
>> Thanks
>> Ravikant
>>
>>
>
>


Re: Sharing single hadoop installation for multiple users on cluster

Posted by "mohit.kaushik" <mo...@orkash.com>.
You should also set the DFS permissions. Simply run

  hadoop fs -chmod 777 /

or set 'dfs.permissions' to 'false'.

- Mohit Kaushik
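For reference, that switch lives in hdfs-site.xml. In Hadoop 2.x the property is spelled dfs.permissions.enabled (dfs.permissions is the older, deprecated name), and it disables HDFS permission checking cluster-wide, which is rarely advisable outside a trusted test cluster:

```xml
<!-- hdfs-site.xml: turns off HDFS permission checking cluster-wide.
     dfs.permissions.enabled is the Hadoop 2.x name; the older
     dfs.permissions still works but is deprecated. -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```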

On 01/18/2016 02:43 PM, mohit.kaushik wrote:
> If you started the Hadoop daemons as hduser, they will not be shown for
> the user foo (or any other user), since the Hadoop daemons are just Java
> processes owned by hduser. But you can still run your jobs as any other
> user. Ensure that the user foo has access to the Hadoop directories. You
> also don't have to create a directory in HDFS for the user. I hope this
> resolves your problem.
> hduser $ start-all.sh
> hduser $ su - other
> other $ /home/hduser/hadoop203/bin/hadoop jar
> /home/hduser/hadoop203/hadoop-examples*.jar pi 1 1
> -Mohit Kaushik
>
>
> On 01/18/2016 11:56 AM, Ravikant Dindokar wrote:
>> Hi Mohit,
>>
>> Thanks for your reply.  Let me elaborate my problem in detail.
>> I have installed hadoop with user called 'hduser' and the HADOOP_HOME 
>> points to one folder in hduser's home directory . Now I have added 
>> another user foo in the cluster. I modified the access permissions 
>> for following directories to 777:
>> 1. Hadoop installation directory ( pointed by HADOOP_HOME)
>> 2. dfs.datanode.data.dir
>> 3. dfs.namenode.name.dir
>> 4. hadoop.tmp.dir
>>
>> I have also created directory /user/foo inside hdfs
>>
>> After starting hdfs and yarn daemons, I am not able to view these 
>> processes in foo user and so not able to submit jobs.
>>
>> Can you point out what I am missing here?
>>
>> Thanks
>> Ravikant
>>
>> On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik 
>> <mohit.kaushik@orkash.com <ma...@orkash.com>> wrote:
>>
>>     Hadoop uses the linux system users. I think, You don't have to
>>     make any changes, Just create a new user in your system and give
>>     it access to hadoop ie. provide permissions to hadoop
>>     installation and data directories.
>>
>>     -Mohit Kaushik
>>
>>
>>     On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
>>>     Hi Hadoop user,
>>>
>>>     I have hadoop-2.6 installed on my cluster with 11 nodes. I have
>>>     installed it under one specific user. Now I want  to allow other
>>>     users on the cluster to share the same hadoop installation. What
>>>     changes I need to do in order to allow access to other users?
>>>
>>>     Thanks
>>>     Ravikant
>>
>>


-- 
Signature

*Mohit Kaushik*
Software Engineer
A Square,Plot No. 278, Udyog Vihar, Phase 2, Gurgaon 122016, India
*Tel:*+91 (124) 4969352 | *Fax:*+91 (124) 4033553

<http://politicomapper.orkash.com>interactive social intelligence at work...

<https://www.facebook.com/Orkash2012> 
<http://www.linkedin.com/company/orkash-services-private-limited> 
<https://twitter.com/Orkash> <http://www.orkash.com/blog/> 
<http://www.orkash.com>
<http://www.orkash.com> ... ensuring Assurance in complexity and uncertainty

/This message including the attachments, if any, is a confidential 
business communication. If you are not the intended recipient it may be 
unlawful for you to read, copy, distribute, disclose or otherwise use 
the information in this e-mail. If you have received it in error or are 
not the intended recipient, please destroy it and notify the sender 
immediately. Thank you /


Re: Sharing single hadoop installation for multiple users on cluster

Posted by "mohit.kaushik" <mo...@orkash.com>.
If you started the Hadoop daemons as hduser, they will not be shown for the 
user foo (or any other user), since the Hadoop daemons are just Java 
processes owned by hduser. You can still run your jobs as any other user. 
Ensure that the user foo has access to the Hadoop directories. You also 
don't have to create a directory in HDFS for the user. I hope this resolves your problem.

hduser $ start-all.sh
hduser $ su - other
other  $ /home/hduser/hadoop203/bin/hadoop jar \
         /home/hduser/hadoop203/hadoop-examples*.jar pi 1 1

-Mohit Kaushik


On 01/18/2016 11:56 AM, Ravikant Dindokar wrote:
> Hi Mohit,
>
> Thanks for your reply.  Let me elaborate my problem in detail.
> I have installed hadoop with user called 'hduser' and the HADOOP_HOME 
> points to one folder in hduser's home directory . Now I have added 
> another user foo in the cluster. I modified the access permissions for 
> following directories to 777:
> 1. Hadoop installation directory ( pointed by HADOOP_HOME)
> 2. dfs.datanode.data.dir
> 3. dfs.namenode.name.dir
> 4. hadoop.tmp.dir
>
> I have also created directory /user/foo inside hdfs
>
> After starting hdfs and yarn daemons, I am not able to view these 
> processes in foo user and so not able to submit jobs.
>
> Can you point out what I am missing here?
>
> Thanks
> Ravikant
>
> On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik 
> <mohit.kaushik@orkash.com <ma...@orkash.com>> wrote:
>
>     Hadoop uses the linux system users. I think, You don't have to
>     make any changes, Just create a new user in your system and give
>     it access to hadoop ie. provide permissions to hadoop installation
>     and data directories.
>
>     -Mohit Kaushik
>
>
>     On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
>>     Hi Hadoop user,
>>
>>     I have hadoop-2.6 installed on my cluster with 11 nodes. I have
>>     installed it under one specific user. Now I want  to allow other
>>     users on the cluster to share the same hadoop installation. What
>>     changes I need to do in order to allow access to other users?
>>
>>     Thanks
>>     Ravikant
>
>

Re: Sharing single hadoop installation for multiple users on cluster

Posted by Ravikant Dindokar <ra...@gmail.com>.
Hi Mohit,

Thanks for your reply. Let me elaborate my problem in detail.
I have installed Hadoop as a user called 'hduser', and HADOOP_HOME points
to a folder in hduser's home directory. Now I have added another user, foo,
on the cluster. I changed the access permissions of the following
directories to 777:
1. Hadoop installation directory (pointed to by HADOOP_HOME)
2. dfs.datanode.data.dir
3. dfs.namenode.name.dir
4. hadoop.tmp.dir

I have also created directory /user/foo inside hdfs
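The permission changes above can be sketched as a small script. The directories below are local stand-ins for illustration only; the real paths depend on your HADOOP_HOME and on the values of dfs.datanode.data.dir, dfs.namenode.name.dir and hadoop.tmp.dir in your config files:

```shell
#!/bin/sh
# Sketch of the permission changes described in this thread, applied to
# stand-in local directories (real cluster paths would differ per node).
set -e
BASE="${TMPDIR:-/tmp}/hadoop-perms-demo"
rm -rf "$BASE"
for d in hadoop-install dfs-data dfs-name hadoop-tmp; do
  mkdir -p "$BASE/$d"
  chmod 777 "$BASE/$d"   # world read/write/execute, as described above
done
ls -ld "$BASE"/*         # each line should start with drwxrwxrwx
```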

After starting the HDFS and YARN daemons, I am not able to see these
processes as the foo user, and so I am not able to submit jobs.

Can you point out what I am missing here?

Thanks
Ravikant

On Mon, Jan 18, 2016 at 10:39 AM, mohit.kaushik <mo...@orkash.com>
wrote:

> Hadoop uses the linux system users. I think, You don't have to make any
> changes, Just create a new user in your system and give it access to hadoop
> ie. provide permissions to hadoop installation and data directories.
>
> -Mohit Kaushik
>
>
> On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
>
> Hi Hadoop user,
>
> I have hadoop-2.6 installed on my cluster with 11 nodes. I have installed
> it under one specific user. Now I want  to allow other users on the cluster
> to share the same hadoop installation. What changes I need to do in order
> to allow access to other users?
>
> Thanks
> Ravikant
>
>

Re: Sharing single hadoop installation for multiple users on cluster

Posted by "mohit.kaushik" <mo...@orkash.com>.
Hadoop uses the Linux system users. I think you don't have to make any 
changes; just create a new user on your system and give it access to 
Hadoop, i.e. grant permissions on the Hadoop installation and data directories.
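A sketch of that advice; the directory below is a made-up local stand-in, and a shared group with g+rX is a less drastic way to grant access than chmod 777. On a real cluster you would target $HADOOP_HOME and the data directories and add the new user to the group (for example with usermod -aG):

```shell
#!/bin/sh
# Group-based access sketch: give a shared group read/traverse rights on the
# install and data directories, demonstrated on a stand-in local directory.
set -e
DEMO="${TMPDIR:-/tmp}/hadoop-group-demo"
rm -rf "$DEMO"
mkdir -p "$DEMO/bin"
chgrp -R "$(id -gn)" "$DEMO"   # shared group (here: the current user's group)
chmod -R u+rwX,g+rX,o-rwx "$DEMO"  # group may read/traverse; others get nothing
ls -ld "$DEMO"                 # should show drwxr-x---
```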

-Mohit Kaushik

On 01/17/2016 04:06 PM, Ravikant Dindokar wrote:
> Hi Hadoop user,
>
> I have hadoop-2.6 installed on my cluster with 11 nodes. I have 
> installed it under one specific user. Now I want  to allow other users 
> on the cluster to share the same hadoop installation. What changes I 
> need to do in order to allow access to other users?
>
> Thanks
> Ravikant
