Posted to mapreduce-user@hadoop.apache.org by Vivek Mishra <vi...@impetus.co.in> on 2016/03/12 16:39:36 UTC

Kerberos Hadoop access

Hi,
Can anyone point me to a reference for running a MapReduce job or creating an HDFS file on a Kerberos-secured HDFS cluster (from a remote client machine)?
I have spent the entire day trying different tweaks with UserGroupInformation and SecurityUtil.




RE: Kerberos Hadoop access

Posted by Vivek Mishra <vi...@impetus.co.in>.
Hi,
I am now getting:

Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "IMPETUS-NL146/192.168.56.1"; destination host is: "sandbox.hortonworks.com":8020;

I tried using cacheTicket and also created a proxy user.
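
A rough sketch of this kind of proxy-user client is below; all principals, keytab paths and host names are placeholders, and one common cause of the "Client cannot authenticate via:[TOKEN, KERBEROS]" error is a client Configuration that is never switched to Kerberos before UserGroupInformation is initialized:

import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class ProxyUserSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020");
        // Must be set before any UserGroupInformation call; otherwise the client
        // negotiates SIMPLE auth and the NameNode rejects it with the error above.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in the trusted service principal from its keytab (placeholder values).
        UserGroupInformation realUser = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "service@EXAMPLE.COM", "/etc/security/keytabs/service.keytab");

        // Impersonate the end user; the cluster must allow this via hadoop.proxyuser.* settings.
        UserGroupInformation proxyUser = UserGroupInformation.createProxyUser("vivek", realUser);

        proxyUser.doAs((PrivilegedExceptionAction<Void>) () -> {
            FileSystem fs = FileSystem.get(conf);
            fs.create(new Path("/tmp/proxy-user-test.txt")).close();
            return null;
        });
    }
}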

-Vivek

From: Vivek Mishra [mailto:vivek.mishra@impetus.co.in]
Sent: 13 March 2016 15:01
To: Daniel Schulz <da...@hotmail.com>
Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
Subject: RE: Kerberos Hadoop access

Hi Daniel,
Thanks so much. Let me give it a try and grab another opportunity to thank you again. ☺

Sincerely,
-Vivek

From: Daniel Schulz [mailto:danielschulz2005@hotmail.com]
Sent: 13 March 2016 14:59
To: Vivek Mishra <vi...@impetus.co.in>
Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

Sounds to me like you have this right. The "callback" part sounds a bit odd, though; there is no callback in that sense. You get a ticket for your session and can then work on Kerberized Hadoop with it. These are two separate steps on the command line, not one workflow connected via a callback.

What I described was the manual workflow to get a ticket. If you need to run batch jobs, for example from cron, it is best to use keytab files for that. The Java API is likewise best used with keytab files.
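
As a minimal sketch of that keytab-based route from Java (the principal, keytab path and NameNode address are placeholders; on a real client the cluster's core-site.xml and hdfs-site.xml would normally supply most of these properties):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLoginSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020"); // placeholder
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // The keytab replaces the interactive kinit password, so this also works from cron.
        UserGroupInformation.loginUserFromKeytab("vivek@EXAMPLE.COM", "/home/vivek/vivek.keytab");
        System.out.println("Logged in as " + UserGroupInformation.getCurrentUser());

        try (FileSystem fs = FileSystem.get(conf)) {
            fs.mkdirs(new Path("/tmp/keytab-login-test"));
        }
    }
}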

Kind regards, Daniel.

On 13 Mar 2016, at 09:44, Vivek Mishra <vi...@impetus.co.in> wrote:
Hi Daniel,
Thanks for the prompt response. So I need a standalone Kerberos Java client first for the kinit step, and then in a callback I can use UserGroupInformation.loginUserFromKeytab and run the MapReduce job?

Please let me know if I understood correctly.

PS: Via the kinit command line I am able to access the cluster and run a MapReduce job, but I am struggling with the Java API.

Sincerely,
-Vivek

From: Daniel Schulz [mailto:danielschulz2005@hotmail.com]
Sent: 13 March 2016 14:10
To: Vivek Mishra <vi...@impetus.co.in>
Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

Benoy is right: when you log into your secured Hadoop cluster machine Y (a plain OS login first), you need a Kerberos ticket from the KDC on machine X. Your OS user therefore needs a configured Kerberos client: it obtains a ticket from X using kinit, and that ticket is attached to your session on Y. klist then shows your ticket and its validity on the command line, and for that time span you can work on Hadoop as your user. Once the ticket expires, or at your next login, you obtain a new one the same way.

To destroy your Kerberos ticket, simply issue kdestroy on the command line. If you have further questions, please feel free to reach out to us any time.
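
If that manual kinit workflow is kept even for the Java client, the Hadoop client can pick the ticket up from the default credential cache; a rough sketch, assuming kinit has already been run in the same session and using placeholder host names:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class TicketCacheSketch {
    public static void main(String[] args) throws Exception {
        // Assumes something like `kinit vivek@EXAMPLE.COM` was run beforehand,
        // so a TGT already sits in the default ticket cache shown by klist.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020"); // placeholder
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // getLoginUser() resolves the Kerberos identity from the ticket cache.
        System.out.println("Acting as " + UserGroupInformation.getLoginUser());

        try (FileSystem fs = FileSystem.get(conf)) {
            for (FileStatus status : fs.listStatus(new Path("/tmp"))) {
                System.out.println(status.getPath());
            }
        }
    }
}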

Kind regards, Daniel.

On 13 Mar 2016, at 09:26, Vivek Mishra <vi...@impetus.co.in> wrote:
Hi Benoy,
Thanks for your response. Regarding this:

You can also obtain Kerberos tickets programmatically using a keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab

Would it also work from a remote client machine?

Shouldn't I need to connect to the remote KDC server first for kinit? In my case, the KDC is on machine X and the secured Hadoop cluster is on machine Y.
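
One way a remote JVM client locates the KDC on machine X is through the standard Kerberos system properties, or by pointing the JVM at a krb5.conf that names that KDC; a small sketch, where the realm and host are placeholders:

public class RemoteKdcSketch {
    public static void main(String[] args) {
        // Option 1: set realm and KDC directly (both must be set together).
        System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");          // placeholder realm
        System.setProperty("java.security.krb5.kdc", "kdc.machine-x.example");  // placeholder KDC host (machine X)

        // Option 2: ship a krb5.conf that points at machine X and reference it instead:
        // System.setProperty("java.security.krb5.conf", "/path/to/krb5.conf");

        // ...then configure UserGroupInformation and log in as in the other sketches.
    }
}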

Please suggest.

Sincerely,
-Vivek

From: Benoy Antony [mailto:bantony@gmail.com]
Sent: 13 March 2016 02:43
To: Vivek Mishra <vi...@impetus.co.in>
Cc: user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

You need a Kerberos ticket to interact with a secure Hadoop cluster. To obtain a Kerberos ticket, do a kinit. More Kerberos commands are here: http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
You can also obtain Kerberos tickets programmatically using a keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
Other than fetching a ticket, you do not need to change anything.
A few useful "How Tos" for a secure Hadoop cluster are here: http://hadoopsecurity.org/wiki/How%20Tos
Let me know if it solves your problem.

Thanks,
Benoy
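
Putting Benoy's points together for the original question of submitting a MapReduce job from a remote client, a rough end-to-end sketch might look like this; every host name, port, principal and path is a placeholder, and on a real cluster the *-site.xml files (including the NameNode and ResourceManager Kerberos principals) would normally sit on the client classpath instead of being set by hand:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.map.TokenCounterMapper;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.reduce.IntSumReducer;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureJobSubmitSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020");           // placeholder
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.address", "sandbox.hortonworks.com:8050");  // placeholder
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("vivek@EXAMPLE.COM", "/home/vivek/vivek.keytab");

        // The job definition itself is unchanged by Kerberos; only the login above is extra.
        Job job = Job.getInstance(conf, "secure-wordcount");
        job.setJarByClass(SecureJobSubmitSketch.class);
        job.setMapperClass(TokenCounterMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/tmp/input"));
        FileOutputFormat.setOutputPath(job, new Path("/tmp/output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}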



On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in> wrote:
Hi,
Can anyone point me to a reference for running a MapReduce job or creating an HDFS file on a Kerberos-secured HDFS cluster (from a remote client machine)?
I have spent the entire day trying different tweaks with UserGroupInformation and SecurityUtil.




RE: Kerberos Hadoop access

Posted by Vivek Mishra <vi...@impetus.co.in>.
Hi Daniel,
Thanks so much. Let me give it a try and grab another opportunity to thank you again. ☺

Sincerely,
-Vivek

From: Daniel Schulz [mailto:danielschulz2005@hotmail.com]
Sent: 13 March 2016 14:59
To: Vivek Mishra <vi...@impetus.co.in>
Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

Sounds to me like you have this right. The "callback" part sounds a bit odd, though; there is no callback in that sense. You get a ticket for your session and can then work on Kerberized Hadoop with it. These are two separate steps on the command line, not one workflow connected via a callback.

What I described was the manual workflow to get a ticket. If you need to run batch jobs, for example from cron, it is best to use keytab files for that. The Java API is likewise best used with keytab files.

Kind regards, Daniel.

On 13 Mar 2016, at 09:44, Vivek Mishra <vi...@impetus.co.in> wrote:
Hi Daniel,
Thanks for the prompt response. So I need a standalone Kerberos Java client first for the kinit step, and then in a callback I can use UserGroupInformation.loginUserFromKeytab and run the MapReduce job?

Please let me know if I understood correctly.

PS: Via the kinit command line I am able to access the cluster and run a MapReduce job, but I am struggling with the Java API.

Sincerely,
-Vivek

From: Daniel Schulz [mailto:danielschulz2005@hotmail.com]
Sent: 13 March 2016 14:10
To: Vivek Mishra <vi...@impetus.co.in>
Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

Benoy is right: when you log into your secured Hadoop cluster machine Y (a plain OS login first), you need a Kerberos ticket from the KDC on machine X. Your OS user therefore needs a configured Kerberos client: it obtains a ticket from X using kinit, and that ticket is attached to your session on Y. klist then shows your ticket and its validity on the command line, and for that time span you can work on Hadoop as your user. Once the ticket expires, or at your next login, you obtain a new one the same way.

To destroy your Kerberos ticket, simply issue kdestroy on the command line. If you have further questions, please feel free to reach out to us any time.

Kind regards, Daniel.

On 13 Mar 2016, at 09:26, Vivek Mishra <vi...@impetus.co.in> wrote:
Hi Benoy,
Thanks for your response. Regarding this:

You can also obtain Kerberos tickets programmatically using a keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab

Would it also work from a remote client machine?

Shouldn't I need to connect to the remote KDC server first for kinit? In my case, the KDC is on machine X and the secured Hadoop cluster is on machine Y.

Please suggest.

Sincerely,
-Vivek

From: Benoy Antony [mailto:bantony@gmail.com]
Sent: 13 March 2016 02:43
To: Vivek Mishra <vi...@impetus.co.in>
Cc: user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

You need a Kerberos ticket to interact with a secure Hadoop cluster. To obtain a Kerberos ticket, do a kinit. More Kerberos commands are here: http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
You can also obtain Kerberos tickets programmatically using a keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
Other than fetching a ticket, you do not need to change anything.
A few useful "How Tos" for a secure Hadoop cluster are here: http://hadoopsecurity.org/wiki/How%20Tos
Let me know if it solves your problem.

Thanks,
Benoy



On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in> wrote:
Hi,
Can anyone point me to a reference for running a MapReduce job or creating an HDFS file on a Kerberos-secured HDFS cluster (from a remote client machine)?
I have spent the entire day trying different tweaks with UserGroupInformation and SecurityUtil.




Re: Kerberos Hadoop access

Posted by Daniel Schulz <da...@hotmail.com>.
Hi Vivek,

Sounds to me like you have this right. The "callback" part sounds a bit odd, though; there is no callback in that sense. You get a ticket for your session and can then work on Kerberized Hadoop with it. These are two separate steps on the command line, not one workflow connected via a callback.

What I described was the manual workflow to get a ticket. If you need to run batch jobs, for example from cron, it is best to use keytab files for that. The Java API is likewise best used with keytab files.

Kind regards, Daniel.

> On 13 Mar 2016, at 09:44, Vivek Mishra <vi...@impetus.co.in> wrote:
> 
> Hi Daniel,
> Thanks for the prompt response. So I need a standalone Kerberos Java client first for the kinit step, and then in a callback I can use UserGroupInformation.loginUserFromKeytab and run the MapReduce job?
>  
> Please let me know if I understood correctly.
>  
> PS: Via the kinit command line I am able to access the cluster and run a MapReduce job, but I am struggling with the Java API.
>  
> Sincerely,
> -Vivek
>  
> From: Daniel Schulz [mailto:danielschulz2005@hotmail.com] 
> Sent: 13 March 2016 14:10
> To: Vivek Mishra <vi...@impetus.co.in>
> Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
> Subject: Re: Kerberos Hadoop access
>  
> Hi Vivek,
>  
> Benoy is right: when you log into your secured Hadoop cluster machine Y (a plain OS login first), you need a Kerberos ticket from the KDC on machine X. Your OS user therefore needs a configured Kerberos client: it obtains a ticket from X using kinit, and that ticket is attached to your session on Y. klist then shows your ticket and its validity on the command line, and for that time span you can work on Hadoop as your user. Once the ticket expires, or at your next login, you obtain a new one the same way.
>  
> To destroy your Kerberos ticket, simply issue kdestroy on the command line. If you have further questions, please feel free to reach out to us any time.
>  
> Kind regards, Daniel.
> 
> On 13 Mar 2016, at 09:26, Vivek Mishra <vi...@impetus.co.in> wrote:
> 
> Hi Benoy,
> Thanks for your response. Regarding this:
>  
> You can also obtain Kerberos tickets programmatically using a keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
> 
> Would it also work from a remote client machine?
>  
> Shouldn't I need to connect to the remote KDC server first for kinit? In my case, the KDC is on machine X and the secured Hadoop cluster is on machine Y.
>  
> Please suggest.
>  
> Sincerely,
> -Vivek
>  
> From: Benoy Antony [mailto:bantony@gmail.com] 
> Sent: 13 March 2016 02:43
> To: Vivek Mishra <vi...@impetus.co.in>
> Cc: user@hadoop.apache.org
> Subject: Re: Kerberos Hadoop access
>  
> Hi Vivek, 
> 
> You need a kerberos ticket to  interact with a secure Hadoop Cluster. To obtain kerberos ticket , do a kinit. More kerberos command are here : http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
> 
> You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
> 
> Other than fetching a ticket, you do not need to change anything.
> 
> A few useful "How Tos" for a secure Hadoop Cluster are here : http://hadoopsecurity.org/wiki/How%20Tos
> 
> Let me know if it solves your problem.
>  
> thanks ,
> Benoy
>  
> 
>  
>  
> On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in> wrote:
> Hi,
> Can anyone point me to a reference for running map reduce job or HDFS file creation over Kerberos secured HDFS cluster( From remote client machine)?
> Spent entire day with different tweaks using UserGroupInformation and SecurityUtil.

RE: Kerberos Hadoop access

Posted by Vivek Mishra <vi...@impetus.co.in>.
Hi Daniel,
Thanks for the prompt response. So I need a standalone Kerberos Java client first for the kinit step, and then in a callback I can use UserGroupInformation.loginUserFromKeytab and run the MapReduce job?

Please let me know if I understood correctly.

PS: Via kinit on the command line I am able to access the cluster and run a MapReduce job, but I am struggling with the Java API.

Sincerely,
-Vivek
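
For reference, the usual pattern for the approach asked about here, sketched with made-up principal and paths: loginUserFromKeytabAndReturnUGI reads the keytab directly, and work against the secure cluster is then wrapped in the returned UGI's doAs(). No standalone Kerberos client and no separate callback are involved.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabDoAsSketch {
    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // One call replaces the manual kinit: it reads the keytab directly.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "vivek@EXAMPLE.COM", "/etc/security/keytabs/vivek.keytab");

        // Everything that touches the secure cluster runs inside doAs(),
        // i.e. with this principal's Kerberos credentials.
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            @Override
            public Void run() throws Exception {
                FileSystem fs = FileSystem.get(conf);
                for (FileStatus status : fs.listStatus(new Path("/"))) {
                    System.out.println(status.getPath());
                }
                return null;
            }
        });
    }
}

Scoping the cluster access inside doAs() keeps the credentials tied to that block, which is convenient when the same JVM also does unauthenticated work.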

From: Daniel Schulz [mailto:danielschulz2005@hotmail.com]
Sent: 13 March 2016 14:10
To: Vivek Mishra <vi...@impetus.co.in>
Cc: Benoy Antony <ba...@gmail.com>; user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

Benoy is right: when you log into your secured Hadoop cluster machine Y (raw OS first) you need a Kerberos ticket from KDC from machine X first. Therefore, your OS user needs a configured Kerberos client, gets a ticket using kinit from X and assigns it to your session on Y. klist then displays your ticket information and its validity on the commandline. Then, you are able to work on Hadoop with your user for this time span. After it or when doing the next login, you need a new Kerberos ticket doing the same thing.

To destroy your Kerberos ticket simply issue kdestroy on the commandline. In case of further questions, please feel free to reach out to us any time.

Kind regards, Daniel.

On 13 Mar 2016, at 09:26, Vivek Mishra <vi...@impetus.co.in>> wrote:
Hi Benoy,
Thanks for your response. Would

You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
it also work from remote client machine?

Shouldn’t it be like need to connect with remote KDC server first for kinit?  Here in my case, KDC is on machine X and secured hadoop cluster is on machine Y.

Please suggest.

Sincerely,
-Vivek

From: Benoy Antony [mailto:bantony@gmail.com]
Sent: 13 March 2016 02:43
To: Vivek Mishra <vi...@impetus.co.in>>
Cc: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: Re: Kerberos Hadoop access

Hi Vivek,

You need a kerberos ticket to  interact with a secure Hadoop Cluster. To obtain kerberos ticket , do a kinit. More kerberos command are here : http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
Other than fetching a ticket, you do not need to change anything.
A few useful "How Tos" for a secure Hadoop Cluster are here : http://hadoopsecurity.org/wiki/How%20Tos
Let me know if it solves your problem.

thanks ,
Benoy



On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in>> wrote:
Hi,
Can anyone point me to a reference for running map reduce job or HDFS file creation over Kerberos secured HDFS cluster( From remote client machine)?
Spent entire day with different tweaks using UserGroupInformation and SecurityUtil.

Re: Kerberos Hadoop access

Posted by Daniel Schulz <da...@hotmail.com>.
Hi Vivek,

Benoy is right: when you log into your secured Hadoop cluster machine Y (plain OS login first), you need a Kerberos ticket from the KDC on machine X. Your OS user therefore needs a configured Kerberos client, obtains a ticket using kinit against X, and that ticket is attached to your session on Y. klist then shows your ticket information and its validity on the command line. For that time span you can work on Hadoop as your user. After it expires, or at the next login, you need to obtain a new Kerberos ticket the same way.

To destroy your Kerberos ticket, simply issue kdestroy on the command line. In case of further questions, please feel free to reach out any time.

Kind regards, Daniel.
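
As a quick way to confirm from Java that the ticket obtained with kinit is actually visible to the Hadoop client, a small check along these lines can help. This is a sketch: it assumes kinit has already been run in the same session and that the cluster configuration files are on the classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class TicketCacheCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Without an explicit keytab login, getLoginUser() falls back to the
        // Kerberos ticket cache that kinit populated for this session.
        UserGroupInformation login = UserGroupInformation.getLoginUser();
        System.out.println("Login user:            " + login.getUserName());
        System.out.println("Has Kerberos ticket:   " + login.hasKerberosCredentials());
        System.out.println("Authentication method: " + login.getAuthenticationMethod());
    }
}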

> On 13 Mar 2016, at 09:26, Vivek Mishra <vi...@impetus.co.in> wrote:
> 
> Hi Benoy,
> Thanks for your response. Would
>  
> You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
> 
> it also work from remote client machine?
>  
> Shouldn’t it be like need to connect with remote KDC server first for kinit?  Here in my case, KDC is on machine X and secured hadoop cluster is on machine Y.
>  
> Please suggest.
>  
> Sincerely,
> -Vivek
>  
> From: Benoy Antony [mailto:bantony@gmail.com] 
> Sent: 13 March 2016 02:43
> To: Vivek Mishra <vi...@impetus.co.in>
> Cc: user@hadoop.apache.org
> Subject: Re: Kerberos Hadoop access
>  
> Hi Vivek, 
> 
> You need a kerberos ticket to  interact with a secure Hadoop Cluster. To obtain kerberos ticket , do a kinit. More kerberos command are here : http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
> 
> You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
> 
> Other than fetching a ticket, you do not need to change anything.
> 
> A few useful "How Tos" for a secure Hadoop Cluster are here : http://hadoopsecurity.org/wiki/How%20Tos
> 
> Let me know if it solves your problem.
>  
> thanks ,
> Benoy
>  
> 
>  
>  
> On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in> wrote:
> Hi,
> Can anyone point me to a reference for running map reduce job or HDFS file creation over Kerberos secured HDFS cluster( From remote client machine)?
> Spent entire day with different tweaks using UserGroupInformation and SecurityUtil.

RE: Kerberos Hadoop access

Posted by Vivek Mishra <vi...@impetus.co.in>.
Hi Benoy,
Thanks for your response. Would

You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
it also work from a remote client machine?

Shouldn't I need to connect to the remote KDC server first for kinit? Here in my case, the KDC is on machine X and the secured Hadoop cluster is on machine Y.

Please suggest.

Sincerely,
-Vivek
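
On the remote-client question: the client JVM only needs to be able to reach the KDC on machine X and the cluster services on machine Y, and to know where that KDC is, which is normally done through a krb5.conf on the client (or the corresponding JVM system properties). A sketch with made-up host names and paths, not values confirmed anywhere in this thread:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class RemoteKdcClientSketch {
    public static void main(String[] args) throws Exception {
        // Point the JVM's Kerberos layer at the remote KDC (machine X),
        // either via a client-side krb5.conf...
        System.setProperty("java.security.krb5.conf", "/path/to/client/krb5.conf");
        // ...or by naming realm and KDC host directly:
        // System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
        // System.setProperty("java.security.krb5.kdc", "kdc.machine-x.example.com");

        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        // HDFS endpoint on machine Y; normally this comes from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://machine-y.example.com:8020");
        UserGroupInformation.setConfiguration(conf);

        // Keytab login then works the same way from the remote client machine.
        UserGroupInformation.loginUserFromKeytab(
                "vivek@EXAMPLE.COM", "/path/to/vivek.keytab");
        System.out.println("Logged in as "
                + UserGroupInformation.getLoginUser().getUserName());
    }
}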

From: Benoy Antony [mailto:bantony@gmail.com]
Sent: 13 March 2016 02:43
To: Vivek Mishra <vi...@impetus.co.in>
Cc: user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

You need a kerberos ticket to  interact with a secure Hadoop Cluster. To obtain kerberos ticket , do a kinit. More kerberos command are here : http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
Other than fetching a ticket, you do not need to change anything.
A few useful "How Tos" for a secure Hadoop Cluster are here : http://hadoopsecurity.org/wiki/How%20Tos
Let me know if it solves your problem.

thanks ,
Benoy



On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in>> wrote:
Hi,
Can anyone point me to a reference for running map reduce job or HDFS file creation over Kerberos secured HDFS cluster( From remote client machine)?
Spent entire day with different tweaks using UserGroupInformation and SecurityUtil.



________________________________






NOTE: This message may contain information that is confidential, proprietary, privileged or otherwise protected by law. The message is intended solely for the named addressee. If received in error, please destroy and notify the sender. Any use of this email is prohibited when received in error. Impetus does not represent, warrant and/or guarantee, that the integrity of this communication has been maintained nor that the communication is free of errors, virus, interception or interference.


________________________________






NOTE: This message may contain information that is confidential, proprietary, privileged or otherwise protected by law. The message is intended solely for the named addressee. If received in error, please destroy and notify the sender. Any use of this email is prohibited when received in error. Impetus does not represent, warrant and/or guarantee, that the integrity of this communication has been maintained nor that the communication is free of errors, virus, interception or interference.

RE: Kerberos Hadoop access

Posted by Vivek Mishra <vi...@impetus.co.in>.
Hi Benoy,
Thanks for your response. Would

You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
it also work from remote client machine?

Shouldn’t it be like need to connect with remote KDC server first for kinit?  Here in my case, KDC is on machine X and secured hadoop cluster is on machine Y.

Please suggest.

Sincerely,
-Vivek

From: Benoy Antony [mailto:bantony@gmail.com]
Sent: 13 March 2016 02:43
To: Vivek Mishra <vi...@impetus.co.in>
Cc: user@hadoop.apache.org
Subject: Re: Kerberos Hadoop access

Hi Vivek,

You need a kerberos ticket to  interact with a secure Hadoop Cluster. To obtain kerberos ticket , do a kinit. More kerberos command are here : http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
You can also obtain kerberos tickets programatically using keytab. See http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
Other than fetching a ticket, you do not need to change anything.
A few useful "How Tos" for a secure Hadoop Cluster are here : http://hadoopsecurity.org/wiki/How%20Tos
Let me know if it solves your problem.

thanks ,
Benoy





Re: Kerberos Hadoop access

Posted by Benoy Antony <ba...@gmail.com>.
Hi Vivek,

You need a Kerberos ticket to interact with a secure Hadoop cluster. To obtain a
Kerberos ticket, do a kinit. More Kerberos commands are here:
http://hadoopsecurity.org/wiki/Useful%20Kerberos%20Commands%20for%20a%20Hadoop%20User
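
For example, a minimal sketch of using that ticket from Java, assuming kinit has already been run on the client and the cluster's core-site.xml and hdfs-site.xml are on the classpath (the NameNode address below is a placeholder):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class TicketCacheExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        // With security enabled, getLoginUser() picks up the Kerberos credentials
        // from the local ticket cache that kinit created.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation ugi = UserGroupInformation.getLoginUser();
        System.out.println("Logged in as: " + ugi.getUserName());

        // Any HDFS access after this point is authenticated with that ticket.
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}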

You can also obtain Kerberos tickets programmatically using a keytab. See
http://hadoopsecurity.org/wiki/How%20to%20access%20secure%20Hadoop%20cluster%20programmatically%20using%20keytab
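
A minimal sketch of that keytab-based login (the principal, keytab path and HDFS path below are placeholders, and the cluster configuration files are assumed to be on the classpath):

import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLoginExample {
    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");

        // Log in from the keytab and get a UGI for that principal.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "service-user@EXAMPLE.COM", "/etc/security/keytabs/service-user.keytab");

        // Run the HDFS work as the keytab principal.
        boolean exists = ugi.doAs(new PrivilegedExceptionAction<Boolean>() {
            @Override
            public Boolean run() throws Exception {
                FileSystem fs = FileSystem.get(conf);
                return fs.exists(new Path("/user/service-user"));
            }
        });
        System.out.println("/user/service-user exists: " + exists);
    }
}

This is also the pattern to use for unattended jobs, since the keytab removes the need for an interactive kinit.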

Other than fetching a ticket, you do not need to change anything.

A few useful "How Tos" for a secure Hadoop cluster are here:
http://hadoopsecurity.org/wiki/How%20Tos

Let me know if it solves your problem.

thanks,
Benoy




On Sat, Mar 12, 2016 at 7:39 AM, Vivek Mishra <vi...@impetus.co.in>
wrote:

> Hi,
>
> Can anyone point me to a reference for running a MapReduce job or creating an
> HDFS file over a Kerberos-secured HDFS cluster (from a remote client machine)?
>
> I spent the entire day with different tweaks using UserGroupInformation and
> SecurityUtil.