Posted to user@giraph.apache.org by Prajakta Kalmegh <pk...@gmail.com> on 2012/10/29 23:29:56 UTC

Running Giraph with secure Hadoop

Hi

I am trying to configure Hadoop with Kerberos for running Giraph as given
in <https://cwiki.apache.org/confluence/display/GIRAPH/Quick+Start+-+Running+Giraph+with+Secure+Hadoop>

I am using Ubuntu 12.04 and installed krb5-kdc and krb5-admin-server. I had
to change principals.sh to use "kinit $NORMAL_USER@$REALM" for the ticket
granting to work, and I also added the passwords for all services in the
addprinc commands. When I try to start my Hadoop 1.0.3 namenode, it fails
with the connection refused error shown below.
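
For reference, the principals.sh changes described above boil down to
something like the following; "mypassword" is a placeholder and the host
part of each principal is whatever the script derives for the machine:

addprinc -pw mypassword hdfs/<hostname>@HADOOP.LOCALDOMAIN   # run via kadmin.local, once per service (hdfs, mapred, host, zookeeper)
kinit $NORMAL_USER@$REALM                                    # request the TGT with the realm spelled out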


---------------

12/10/29 15:23:31 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = sap-OptiPlex-755/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.3
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1335192; compiled by 'hortonfo' on Tue May  8 20:31:25 UTC 2012
************************************************************/
12/10/29 15:23:31 INFO impl.MetricsConfig: loaded properties from
hadoop-metrics2.properties
12/10/29 15:23:31 INFO impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
12/10/29 15:23:31 INFO impl.MetricsSystemImpl: Scheduled snapshot period at
10 second(s).
12/10/29 15:23:31 INFO impl.MetricsSystemImpl: NameNode metrics system
started
12/10/29 15:23:31 INFO impl.MetricsSourceAdapter: MBean for source ugi
registered.
12/10/29 15:23:32 ERROR namenode.NameNode: java.io.IOException: Login failure for hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN from keytab /home/saplabs/kerb-setup/services.keytab
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:630)
    at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:298)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:264)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:496)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1279)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)
Caused by: javax.security.auth.login.LoginException: Connection refused
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:700)
    at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:542)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:769)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:186)
    at javax.security.auth.login.LoginContext$5.run(LoginContext.java:706)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:703)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:575)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:621)
    ... 5 more
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
    at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
    at java.net.Socket.connect(Socket.java:529)
    at sun.security.krb5.internal.TCPClient.<init>(TCPClient.java:46)
    at sun.security.krb5.KrbKdcReq$KdcCommunication.run(KrbKdcReq.java:343)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.security.krb5.KrbKdcReq.send(KrbKdcReq.java:296)
    at sun.security.krb5.KrbKdcReq.send(KrbKdcReq.java:202)
    at sun.security.krb5.KrbKdcReq.send(KrbKdcReq.java:175)
    at sun.security.krb5.KrbAsReq.send(KrbAsReq.java:431)
    at sun.security.krb5.Credentials.sendASRequest(Credentials.java:400)
    at sun.security.krb5.Credentials.acquireTGT(Credentials.java:350)
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:672)
------------------------------------------------


Any idea what could be going wrong here?
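
(The "Connection refused" above is thrown from sun.security.krb5.KrbKdcReq,
so the JVM never managed to open a TCP connection to the KDC named in
/etc/krb5.conf. A minimal reachability check, with <kdc-host> as a
placeholder for whatever the kdc = line in the [realms] section points at,
and 88 being the default KDC port:)

grep -A 3 HADOOP.LOCALDOMAIN /etc/krb5.conf   # shows the kdc = and admin_server = lines for the realm
nc -vz <kdc-host> 88                          # should report the port open if krb5kdc is running there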

Regards,
Prajakta

Re: Running Giraph with secure Hadoop

Posted by Prajakta Kalmegh <pk...@gmail.com>.
Hi Eugene

In addition to the details attached in my previous mail, here is some more
data about the principals being added to the keytab file:
------------------
pkalmegh@pkalmegh-home:~/kerb-setup$ . principals.sh
using hostname: pkalmegh-home for instance component of
   server principals (service/instance@REALM).
Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  delprinc -force host/pkalmegh-home@HADOOP.LOCALDOMAIN
delete_principal: Principal does not exist while deleting principal
"host/pkalmegh-home@HADOOP.LOCALDOMAIN"
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  addprinc -pw mypassword host/pkalmegh-home@HADOOP.LOCALDOMAIN
WARNING: no policy specified for host/pkalmegh-home@HADOOP.LOCALDOMAIN;
defaulting to no policy
Principal "host/pkalmegh-home@HADOOP.LOCALDOMAIN" created.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  ktadd -k /home/pkalmegh/kerb-setup/services.keytab
host/pkalmegh-home@HADOOP.LOCALDOMAIN
Entry for principal host/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type aes256-cts-hmac-sha1-96 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal host/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type arcfour-hmac added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal host/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des3-cbc-sha1 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal host/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des-cbc-crc added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  delprinc -force zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
delete_principal: Principal does not exist while deleting principal
"zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN"
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  addprinc -pw mypassword
zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
WARNING: no policy specified for zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN;
defaulting to no policy
Principal "zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN" created.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  ktadd -k /home/pkalmegh/kerb-setup/services.keytab
zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
Entry for principal zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type aes256-cts-hmac-sha1-96 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type arcfour-hmac added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des3-cbc-sha1 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des-cbc-crc added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  delprinc -force hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
delete_principal: Principal does not exist while deleting principal
"hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN"
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  addprinc -pw mypassword hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
WARNING: no policy specified for hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN;
defaulting to no policy
Principal "hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN" created.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  ktadd -k /home/pkalmegh/kerb-setup/services.keytab
hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
Entry for principal hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type aes256-cts-hmac-sha1-96 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type arcfour-hmac added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des3-cbc-sha1 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des-cbc-crc added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  delprinc -force mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
delete_principal: Principal does not exist while deleting principal
"mapred/pkalmegh-home@HADOOP.LOCALDOMAIN"
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  addprinc -pw mypassword
mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
WARNING: no policy specified for mapred/pkalmegh-home@HADOOP.LOCALDOMAIN;
defaulting to no policy
Principal "mapred/pkalmegh-home@HADOOP.LOCALDOMAIN" created.
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  ktadd -k /home/pkalmegh/kerb-setup/services.keytab
mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
Entry for principal mapred/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type aes256-cts-hmac-sha1-96 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal mapred/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type arcfour-hmac added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal mapred/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des3-cbc-sha1 added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
Entry for principal mapred/pkalmegh-home@HADOOP.LOCALDOMAIN with kvno 2,
encryption type des-cbc-crc added to keytab
WRFILE:/home/pkalmegh/kerb-setup/services.keytab.
kadmin.local:
Choose password for principal pkalmegh:
Repeat password for principal pkalmegh:
Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  delprinc -force pkalmegh@HADOOP.LOCALDOMAIN
delete_principal: Principal does not exist while deleting principal
"pkalmegh@HADOOP.LOCALDOMAIN"
kadmin.local:  Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.
kadmin.local:  addprinc -pw mypassword pkalmegh@HADOOP.LOCALDOMAIN
WARNING: no policy specified for pkalmegh@HADOOP.LOCALDOMAIN; defaulting to
no policy
Principal "pkalmegh@HADOOP.LOCALDOMAIN" created.
kadmin.local:   * Restarting Kerberos KDC krb5kdc
                                                            [ OK ]
Now we will obtain a ticket-granting ticket and put it in your ticket
cache. You should be asked for a password. Type the password you just chose
in the last step.
Password for pkalmegh@HADOOP.LOCALDOMAIN:
Obtained and cached ticket successfully. Now attempting to renew your
ticket..ok.

pkalmegh@pkalmegh-home:~/kerb-setup$ ktutil
ktutil:  read_kt /home/pkalmegh/kerb-setup/services.keytab
ktutil:  l
slot KVNO Principal
---- ---- ---------------------------------------------------------------------
   1    2    host/pkalmegh-home@HADOOP.LOCALDOMAIN
   2    2    host/pkalmegh-home@HADOOP.LOCALDOMAIN
   3    2    host/pkalmegh-home@HADOOP.LOCALDOMAIN
   4    2    host/pkalmegh-home@HADOOP.LOCALDOMAIN
   5    2 zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
   6    2 zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
   7    2 zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
   8    2 zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
   9    2    hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
  10    2    hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
  11    2    hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
  12    2    hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
  13    2  mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
  14    2  mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
  15    2  mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
  16    2  mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
ktutil:  q

pkalmegh@pkalmegh-home:~/kerb-setup$ kadmin.local
Authenticating as principal pkalmegh/admin@HADOOP.LOCALDOMAIN with password.

kadmin.local:  getprinc host/pkalmegh-home@HADOOP.LOCALDOMAIN
Principal: host/pkalmegh-home@HADOOP.LOCALDOMAIN
Expiration date: [never]
Last password change: Tue Oct 30 17:06:46 PDT 2012
Password expiration date: [none]
Maximum ticket life: 0 days 10:00:00
Maximum renewable life: 7 days 00:00:00
Last modified: Tue Oct 30 17:06:46 PDT 2012
(pkalmegh/admin@HADOOP.LOCALDOMAIN)
Last successful authentication: [never]
Last failed authentication: [never]
Failed password attempts: 0
Number of keys: 4
Key: vno 2, aes256-cts-hmac-sha1-96, no salt
Key: vno 2, arcfour-hmac, no salt
Key: vno 2, des3-cbc-sha1, no salt
Key: vno 2, des-cbc-crc, no salt
MKey: vno 1
Attributes: REQUIRES_PRE_AUTH
Policy: [none]

kadmin.local:  getprincs
K/M@HADOOP.LOCALDOMAIN
hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN
host/pkalmegh-home@HADOOP.LOCALDOMAIN
kadmin/admin@HADOOP.LOCALDOMAIN
kadmin/changepw@HADOOP.LOCALDOMAIN
kadmin/pkalmegh-home@HADOOP.LOCALDOMAIN
krbtgt/HADOOP.LOCALDOMAIN@HADOOP.LOCALDOMAIN
mapred/pkalmegh-home@HADOOP.LOCALDOMAIN
pkalmegh@HADOOP.LOCALDOMAIN
zookeeper/pkalmegh-home@HADOOP.LOCALDOMAIN
kadmin.local

------------------------------

Any idea why each principal has 4 keys? Please let me know if you can
figure out how to set this up correctly. I really need to get Giraph
working soon :(
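
(On the four keys: ktadd writes one keytab entry per encryption type the KDC
is configured to support -- here aes256-cts-hmac-sha1-96, arcfour-hmac,
des3-cbc-sha1 and des-cbc-crc, matching the getprinc output above -- so four
entries per principal is expected and not itself a problem. If a single
enctype is ever wanted, ktadd can be restricted, e.g.:)

ktadd -k /home/pkalmegh/kerb-setup/services.keytab -e aes256-cts-hmac-sha1-96:normal hdfs/pkalmegh-home@HADOOP.LOCALDOMAIN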

Regards,
Prajakta



On Tue, Oct 30, 2012 at 12:16 PM, Prajakta Kalmegh <pk...@gmail.com> wrote:

> Hi Eugene
>
> Please find attached the principals.sh file and the output with "set -x"
> on. Please let me know if you can see the problem in it.
>
> Regards,
> Prajakta
>
>
>
> On Mon, Oct 29, 2012 at 7:48 PM, Eugene Koontz <ek...@hiro-tan.org> wrote:
>
>>  Hi Prajakta,
>>    I'm confused about the exact details of the script you're using, can
>> you post it in its entirety? Also add a "set -x" at the top of the script
>> (in a new line below the "#!/bin/sh" line), and run it and post the output
>> so I can see what your computer is doing with the script?
>>
>> Thanks,
>> Eugene
>>
>>
>> On 10/29/12 9:35 PM, Prajakta Kalmegh wrote:
>>
>> Hi Eugene
>>
>>  Thanks for replying. Yes, the list given by ktutil is empty. But when I
>> execute the principals.sh, it does not give any error on adding the
>> principals. I have edited principals.sh to even add passwords with the
>> addprinc command.
>>
>>  I just have a pseudo-distributed Hadoop 1.0.3 setup. So not sure if the
>> instructions in the other link you gave will be useful or not. I am using
>> principals.sh from this link <
>> https://github.com/ekoontz/kerb-setup/blob/master/principals.sh>
>>
>>  I also edited KADMIN_LOCAL to remove the sudo and have changed
>> permissions appropriately to run kadmin.local as normal user. The reason
>> for doing this was the principals get added with root/admin authentication
>> otherwise. Also, the call to kinit and kinit -R (to renew tickets) need the
>> $NORMAL_USER@$REALM; else it does not grant ticket.
>>
>>  Regards,
>> Prajakta
>>
>>
>>
>>
>> On Mon, Oct 29, 2012 at 4:28 PM, Eugene Koontz <ek...@hiro-tan.org> wrote:
>>
>>>  On 10/29/12 6:29 PM, Prajakta Kalmegh wrote:
>>>
>>> Hi
>>>
>>>  I am trying to configure Hadoop with Kerberos for running Giraph as
>>> given in <
>>> https://cwiki.apache.org/confluence/display/GIRAPH/Quick+Start+-+Running+Giraph+with+Secure+Hadoop
>>> >
>>>
>>>  I am using Ubuntu 12.04 and installed krb5-kdc and krb5-admin-server.
>>> Had to change principals.sh to use "kinit $NORMAL_USER@$REALM" for the
>>> ticket granting to work. I also added the passwords for all services in
>>> the addprinc commands. When I try to start my Hadoop 1.0.3 namenode, it
>>> gives me the following connection refused error.
>>>
>>>
>>>   Hi Prajakta,
>>>     At some point, I need to go through that page and make sure
>>> everything works as expected.
>>> You might also look at:
>>> https://github.com/ekoontz/hadoop-conf/blob/master/README
>>>
>>> From your output below, it looks like the keytab does not have
>>> credentials for the principal 'hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN
>>> '.
>>>
>>>
>>> Can you try:
>>>
>>> ktutil -k /home/saplabs/kerb-setup/services.keytab l
>>>
>>> (last character is a lowercase 'L')
>>>
>>> -Eugene
>>>
>>>  [NameNode startup log and stack trace from the original post snipped]
>>>
>>>
>>>  Any idea what could be going wrong here?
>>>
>>>  Regards,
>>> Prajakta
>>>
>>>
>>>
>>>
>>
>>
>

Re: Running Giraph with secure Hadoop

Posted by Eugene Koontz <ek...@hiro-tan.org>.
Hi Prajakta,
    I'm confused about the exact details of the script you're using; can 
you post it in its entirety? Also add a "set -x" at the top of the 
script (on a new line below the "#!/bin/sh" line), then run it and post 
the output so I can see what your computer is doing with the script?
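
So the top of the script would look like this (just these two lines; the
rest of principals.sh stays as it is):

#!/bin/sh
set -x   # trace every command as the script runs it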

Thanks,
Eugene

On 10/29/12 9:35 PM, Prajakta Kalmegh wrote:
> Hi Eugene
>
> Thanks for replying. Yes, the list given by ktutil is empty. But when 
> I execute the principals.sh, it does not give any error on adding the 
> principals. I have edited principals.sh to even add passwords with the 
> addprinc command.
>
> I just have a pseudo-distributed Hadoop 1.0.3 setup. So not sure if 
> the instructions in the other link you gave will be useful or not. I 
> am using principals.sh from this link 
> <https://github.com/ekoontz/kerb-setup/blob/master/principals.sh>
>
> I also edited KADMIN_LOCAL to remove the sudo and have changed 
> permissions appropriately to run kadmin.local as normal user. The 
> reason for doing this was the principals get added with root/admin 
> authentication otherwise. Also, the call to kinit and kinit -R (to 
> renew tickets) need the $NORMAL_USER@$REALM; else it does not grant 
> ticket.
>
> Regards,
> Prajakta
>
>
>
>
> On Mon, Oct 29, 2012 at 4:28 PM, Eugene Koontz <ekoontz@hiro-tan.org> wrote:
>
>     On 10/29/12 6:29 PM, Prajakta Kalmegh wrote:
>>     Hi
>>
>>     I am trying to configure Hadoop with Kerberos for running Giraph
>>     as given in
>>     <https://cwiki.apache.org/confluence/display/GIRAPH/Quick+Start+-+Running+Giraph+with+Secure+Hadoop>
>>
>>     I am using Ubuntu 12.04 and installed krb5-kdc and
>>     krb5-admin-server. Had to change principals.sh to use "kinit
>>     $NORMAL_USER@$REALM" for the ticket granting to work. I also
>>     added the passwords for all services in the addprinc commands.
>>     When I try to start my Hadoop 1.0.3 namenode, it gives me the
>>     following connection refused error.
>>
>>
>     Hi Prajakta,
>         At some point, I need to go through that page and make
>     sure everything works as expected.
>     You might also look at:
>     https://github.com/ekoontz/hadoop-conf/blob/master/README
>
>     From your output below, it looks like the keytab does not have
>     credentials for the principal
>     'hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN'.
>
>
>     Can you try:
>
>     ktutil -k /home/saplabs/kerb-setup/services.keytab l
>
>     (last character is a lowercase 'L')
>
>     -Eugene
>
>>     [NameNode startup log and stack trace from the original post snipped]
>>
>>
>>     Any idea what could be going wrong here?
>>
>>     Regards,
>>     Prajakta
>>
>>
>
>


Re: Running Giraph with secure Hadoop

Posted by Prajakta Kalmegh <pk...@gmail.com>.
Hi Eugene

Thanks for replying. Yes, the list given by ktutil is empty, but when I
execute principals.sh it does not give any error while adding the
principals. I have also edited principals.sh to add passwords with the
addprinc command.

I just have a pseudo-distributed Hadoop 1.0.3 setup, so I am not sure whether
the instructions in the other link you gave will be useful. I am using
principals.sh from this link:
<https://github.com/ekoontz/kerb-setup/blob/master/principals.sh>

I also edited KADMIN_LOCAL to remove the sudo and changed permissions
appropriately so that kadmin.local runs as a normal user; otherwise the
principals get added with root/admin authentication. Also, the calls to
kinit and kinit -R (to renew tickets) need the $NORMAL_USER@$REALM
argument; otherwise no ticket is granted.
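
(A sanity check that is independent of the namenode: attempt the same keytab
login the namenode does at startup, using the path and principal from the
startup error:)

kinit -k -t /home/saplabs/kerb-setup/services.keytab hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN
klist   # should then show a krbtgt/HADOOP.LOCALDOMAIN ticket for the hdfs principal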

Regards,
Prajakta




On Mon, Oct 29, 2012 at 4:28 PM, Eugene Koontz <ek...@hiro-tan.org> wrote:

>  On 10/29/12 6:29 PM, Prajakta Kalmegh wrote:
>
> Hi
>
>  I am trying to configure Hadoop with Kerberos for running Giraph as
> given in <
> https://cwiki.apache.org/confluence/display/GIRAPH/Quick+Start+-+Running+Giraph+with+Secure+Hadoop
> >
>
>  I am using Ubuntu 12.04 and installed krb5-kdc and krb5-admin-server.
> Had to change principals.sh to use "kinit $NORMAL_USER@$REALM" for the
> ticket granting to work. I also added the passwords for all services in
> the addprinc commands. When I try to start my Hadoop 1.0.3 namenode, it
> gives me the following connection refused error.
>
>
>  Hi Prajakta,
>     At some point, I need to go through that page and make sure
> everything works as expected.
> You might also look at:
> https://github.com/ekoontz/hadoop-conf/blob/master/README
>
> From your output below, it looks like the keytab does not have credentials
> for the principal 'hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN'.
>
>
> Can you try:
>
> ktutil -k /home/saplabs/kerb-setup/services.keytab l
>
> (last character is a lowercase 'L')
>
> -Eugene
>
>  [NameNode startup log and stack trace from the original post snipped]
>
>
>  Any idea what could be going wrong here?
>
>  Regards,
> Prajakta
>
>
>
>

Re: Running Giraph with secure Hadoop

Posted by Eugene Koontz <ek...@hiro-tan.org>.
On 10/29/12 6:29 PM, Prajakta Kalmegh wrote:
> Hi
>
> I am trying to configure Hadoop with Kerberos for running Giraph as 
> given in
> <https://cwiki.apache.org/confluence/display/GIRAPH/Quick+Start+-+Running+Giraph+with+Secure+Hadoop>
>
> I am using Ubuntu 12.04 and installed krb5-kdc and krb5-admin-server. 
> Had to change principals.sh to use "kinit $NORMAL_USER@$REALM" for the 
> ticket granting to work. I also added the passwords for all services 
> in the addprinc commands. When I try to start my Hadoop 1.0.3 
> namenode, it gives me the following connection refused error.
>
>
Hi Prajakta,
     At some point, I need to go through that page and make sure
everything works as expected.
You might also look at:
https://github.com/ekoontz/hadoop-conf/blob/master/README

From your output below, it looks like the keytab does not have 
credentials for the principal 'hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN'.
credentials for the principal 'hdfs/sap-optiplex-755@HADOOP.LOCALDOMAIN'.


Can you try:

ktutil -k /home/saplabs/kerb-setup/services.keytab l

(last character is a lowercase 'L')
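
If that ktutil build does not take a keytab on the command line (MIT's
ktutil is interactive-only; the -k form above is Heimdal syntax), the
equivalent checks are:

klist -k -t -e /home/saplabs/kerb-setup/services.keytab

or, inside ktutil:

ktutil:  read_kt /home/saplabs/kerb-setup/services.keytab
ktutil:  l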

-Eugene
> [NameNode startup log and stack trace from the original post snipped]
>
>
> Any idea what could be going wrong here?
>
> Regards,
> Prajakta
>
>