Posted to user@phoenix.apache.org by anil gupta <an...@gmail.com> on 2014/07/01 19:16:42 UTC

Re: Phoenix and Kerberos (No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt))

Hi Giuseppe,

I am curious to know whether you were able to connect to a secure HBase 0.98
cluster?

Thanks,
Anil Gupta


On Tue, Jun 24, 2014 at 12:18 AM, anil gupta <an...@gmail.com> wrote:

> Hi Giuseppe,
>
> Since you are using HBase 0.98, you don't need to try out different
> permutations and combinations. Just take the standard phoenix-client.jar from
> the nightly build and replace the phoenix-client.jar of your phoenix-4.0.0
> installation. Make sure all of your bin directory scripts are also the
> default scripts. You don't need to modify any script as described in
> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
> because PHOENIX-19 handles connecting to a secure cluster more gracefully.
> Also, you don't need to do kinit anymore; Phoenix handles all of the
> Kerberos work within the Java client.
>
> Then try running sqlline like this:
> sqlline.py (or sqlline.sh) zk:port:hbase_root_dir:keytab_file:principal
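>
> For example, as a sketch only (the quorum hosts, keytab path, and principal
> below are placeholders, and the argument order simply follows the pattern
> above):
>
> sqlline.py zk1.mydomain,zk2.mydomain,zk3.mydomain:2181:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab:myuser@MYREALM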
>
> If you run into problems, please share the logs. It would also be helpful
> if you could turn on debug logging.
>
> Thanks,
> Anil Gupta
>
>
>
> On Mon, Jun 23, 2014 at 1:48 AM, Giuseppe Reina <g....@gmail.com> wrote:
>
>> Hi Justin,
>>   sorry, but I tried all the relevant permutations of the jar files on the
>> classpath, listing the path of each jar explicitly, and I still have the
>> same problem.
>> Any other ideas?
>>
>> Kind Regards,
>>
>>
>> On Thu, Jun 19, 2014 at 4:09 PM, Justin Workman <justinjworkman@gmail.com
>> > wrote:
>>
>>> I have had success with HBase 0.96 from Cloudera CDH 4.3.0. The order of
>>> jars and configs on the classpath played a big part in getting this to work
>>> for me. Here is the order we had to use in order to connect and
>>> authenticate (a command-line sketch follows below):
>>>
>>> 1) Path to Hadoop and HBase configuration files
>>> 2) HBase security jar
>>> 3) Hadoop common jar
>>> 4) Hadoop auth jar
>>> 5) Zookeeper jar
>>> 6) Phoenix client jar
>>> 7) All other needed libraries
>>>
>>> For 2-6, list the jars explicitly, not just the path to the containing
>>> directories.
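>>>
>>> As a sketch of what that ordering can look like on the command line (the
>>> directories and jar names below are illustrative only; the exact file names
>>> depend on your distribution and versions):
>>>
>>> java -cp "/etc/hadoop/conf:/etc/hbase/conf:\
>>> /usr/lib/hbase/hbase-security.jar:\
>>> /usr/lib/hadoop/hadoop-common.jar:\
>>> /usr/lib/hadoop/hadoop-auth.jar:\
>>> /usr/lib/zookeeper/zookeeper.jar:\
>>> /usr/lib/phoenix/phoenix-client.jar:\
>>> /usr/lib/phoenix/lib/*" \
>>>   sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver \
>>>   -u jdbc:phoenix:zk1.mydomain:/hbase-secure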
>>>
>>> Hope this helps
>>> Justin
>>>
>>>
>>> On Thu, Jun 19, 2014 at 8:39 AM, Giuseppe Reina <g....@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>   unfortunately I'm still having the same problem. I'm using Hadoop 2
>>>> and HBase 0.98 so I couldn't use Phoenix 3.0.
>>>> I downloaded the sources from the git repository
>>>> <https://git-wip-us.apache.org/repos/asf/phoenix.git> and I compiled
>>>> the jars.
>>>> I had to change the previous java command because I was receiving the
>>>> following exception:
>>>>
>>>> 14/06/19 14:29:23 WARN util.DynamicClassLoader: Failed to identify the
>>>>> fs of dir hdfs://zk1.mydomain:8020/apps/hbase/data/lib, ignored
>>>>> org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot
>>>>> communicate with client version 4
>>>>
>>>>
>>>> So now I launch Phoenix with:
>>>>
>>>> java -cp
>>>>> ".:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-minimal.jar:/usr/lib/hbase/lib/htrace-core-2.04.jar:/usr/lib/hbase/lib/hbase-server.jar:jline-2.11.jar:sqlline-1.1.2.jar:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-without-hbase.jar:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>> -n none -p none  --fastConnect=false --verbose=true
>>>>> --isolation=TRANSACTION_READ_COMMITTED
>>>>
>>>>
>>>> But I still get:
>>>>
>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>> issuing: !connect
>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>> none none org.apache.phoenix.jdbc.PhoenixDriver
>>>>> Connecting to
>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>
>>>>> Java config name: null
>>>>> Native config name: /etc/krb5.conf
>>>>> Loaded from native config
>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>> >>>DEBUG <CCacheInputStream> server principal is krbtgt/MYREALM@MYREALM
>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>> >>>DEBUG <CCacheInputStream> auth time: Thu Jun 19 10:19:17 UTC 2014
>>>>> >>>DEBUG <CCacheInputStream> start time: Thu Jun 19 10:19:20 UTC 2014
>>>>> >>>DEBUG <CCacheInputStream> end time: Fri Jun 20 10:19:20 UTC 2014
>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Thu Jun 26 10:19:17 UTC
>>>>> 2014
>>>>>
>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>
>>>>> Service ticket not found in the subject
>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>> >>> KdcAccessibility: reset
>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>> number of retries =3, #bytes=737
>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>> Krb5Context setting mySeqNumber to: 583967039
>>>>>
>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>> Created InitSecContextToken:
>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>  ..n..j0..f......
>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>  ................
>>>>> [...]
>>>>> 0260: AC 98 ED E8 03 97 D4 D7   97 DE C6 0B 22 02 6C 10
>>>>>  ............".l.
>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>> 01 04 00 ff ff ff ff b0 33 06 7a d1 cb e3 4b 83 f9 e1 2e d0 92 6c fc 80 74
>>>>> 55 f0 14 8a 99 74 da b0 33 4b d2 e1 cc 31 c2 2a 75 2f 01 01 00 00 04 04 04
>>>>> 04 ]
>>>>>
>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54
>>>>> 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02 01
>>>>> 04 00 ff ff ff ff 3c d5 18 05 ab db d2 68 8b 35 a6 72 8a 9c 80 82 84 c3 9c
>>>>> 31 b0 df 01 18 e0 6d ea c4 a5 db ff 65 ba 01 c8 71 01 01 00 00 68 64 70 2d
>>>>> 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>> Found ticket for myuser@MYREALM to go to
>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Fri Jun 20 10:19:20 UTC
>>>>> 2014
>>>>>
>>>>> Service ticket not found in the subject
>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>> number of retries =3, #bytes=737
>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>> Krb5Context setting mySeqNumber to: 594904455
>>>>>
>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>> Created InitSecContextToken:
>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>  ..n..j0..f......
>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>  ................
>>>>> [...]
>>>>> 0260: B7 1F A6 E4 47 38 2E 21   AE 4E 25 A6 8D 57 19 CD
>>>>>  ....G8.!.N%..W..
>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>> 01 04 00 ff ff ff ff 09 ed fd 9f ca d8 97 05 3a 2a 64 5c e4 c3 3b e0 71 43
>>>>> 01 4e ab 16 2f 5a 7b 31 8b 32 9f 20 9a 47 8a d1 70 0a 01 01 00 00 04 04 04
>>>>> 04 ]
>>>>>
>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54
>>>>> 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02 01
>>>>> 04 00 ff ff ff ff e9 97 40 26 a5 de ee 05 d5 ac 03 42 03 c4 f7 66 11 76 f4
>>>>> 9e 2b e8 f2 a0 f7 d4 62 68 21 2d da 47 b4 92 c5 12 01 01 00 00 68 64 70 2d
>>>>> 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>> 14/06/19 14:15:44 WARN ipc.RpcClient: Exception encountered while
>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>> level: Failed to find any Kerberos tgt)]
>>>>> 14/06/19 14:15:44 FATAL ipc.RpcClient: SASL authentication failed. The
>>>>> most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>
>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>> find any Kerberos tgt)]
>>>>>  at
>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>> at
>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>> at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>  at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>> at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>  at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>> at
>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>  at
>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>
>>>>> at
>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>  at
>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>> at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>  at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>> at
>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>  at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>> at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>  at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>> at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>  at
>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>> at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>
>>>>>  at
>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>> at
>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>> at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>  at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>> at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>> level: Failed to find any Kerberos tgt)
>>>>>  at
>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>> at
>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>  at
>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>> at
>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>  at
>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>> at
>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>  at
>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>> ... 54 more
>>>>> 14/06/19 14:15:46 WARN ipc.RpcClient: Exception encountered while
>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>> level: Failed to find any Kerberos tgt)]
>>>>> 14/06/19 14:15:46 FATAL ipc.RpcClient: SASL authentication failed. The
>>>>> most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>
>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>> find any Kerberos tgt)]
>>>>>  at
>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>> at
>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>> at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>  at
>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>  at
>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>> at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>  at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>> at
>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>  at
>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>
>>>>> at
>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>  at
>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>> at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>  at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>> at
>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>  at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>> at
>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>  at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>> at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>  at
>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>> at
>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>
>>>>>  at
>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>> at
>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>> at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>  at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>> at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>> at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>> level: Failed to find any Kerberos tgt)
>>>>>  at
>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>> at
>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>  at
>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>> at
>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>  at
>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>> at
>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>  at
>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>> ... 54 more
>>>>
>>>>
>>>>
>>>> I have my keys in the cache (also renewed with kinit -R). All the other
>>>> clients work; only Phoenix is failing. I'm probably missing something.
>>>> Any ideas? Do any of you have any insights?
>>>>
>>>>
>>>> Kind Regards
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Tue, Jun 17, 2014 at 3:54 PM, Giuseppe Reina <g....@gmail.com>
>>>> wrote:
>>>>
>>>>> Thank you. I'll try that!
>>>>>
>>>>>
>>>>> On Tue, Jun 17, 2014 at 3:50 PM, anil gupta <an...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi Giuseppe,
>>>>>>
>>>>>> The latest nightly builds of Phoenix have the patch for
>>>>>> https://issues.apache.org/jira/browse/PHOENIX-19. If you pick up the
>>>>>> latest nightly, it will be easier to connect to a secure cluster. You
>>>>>> can find the nightly for 3.0 here:
>>>>>>
>>>>>> https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/
>>>>>>
>>>>>> If you are using HBase 0.98, try out the Phoenix 4.0 nightly. Let me
>>>>>> know if you need further help.
>>>>>>
>>>>>> Thanks,
>>>>>> Anil Gupta
>>>>>>
>>>>>>
>>>>>> On Tue, Jun 17, 2014 at 6:35 AM, Giuseppe Reina <g....@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi all,
>>>>>>>   I'm trying to make Phoenix work with HBase and Kerberos, but so far
>>>>>>> I've had no luck. I'm currently using HDP 2.1 on CentOS 6.5 and following
>>>>>>> this guide as a reference:
>>>>>>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>>>>>> I'm able to use nearly all of the Hadoop services (MapReduce,
>>>>>>> Zookeeper, HBase, ...) with my user, but not Phoenix (note that I granted
>>>>>>> RWCA permissions to my user in HBase).
>>>>>>>
>>>>>>> I don't see any problems with my TGT
>>>>>>>
>>>>>>> [myuser@zk1.mydomain ~]$ klist -fae
>>>>>>>> Ticket cache: FILE:/tmp/krb5cc_501
>>>>>>>> Default principal: myuser@MYREALM
>>>>>>>> Valid starting     Expires            Service principal
>>>>>>>> 06/17/14 10:58:50  06/18/14 10:58:50  krbtgt/MYREALM@MYREALM
>>>>>>>>  renew until 06/24/14 10:58:33, Flags: FRIT
>>>>>>>> Etype (skey, tkt): des3-cbc-sha1, des3-cbc-sha1
>>>>>>>> Addresses: (none)
>>>>>>>
>>>>>>>
>>>>>>> But when I launch Phoenix sqlline with krb5 debugging enabled, using the
>>>>>>> following command:
>>>>>>>
>>>>>>>> [myuser@zk1.mydomain ~]$ java -cp
>>>>>>>> ".:/usr/lib/phoenix/*:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure -n myuser
>>>>>>>>  --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED
>>>>>>>
>>>>>>>
>>>>>>> I get the following error:
>>>>>>>
>>>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>>>> issuing: !connect
>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure myuser ''
>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver
>>>>>>>> Connecting to
>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure
>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>> SLF4J: Found binding in
>>>>>>>> [jar:file:/usr/lib/phoenix/phoenix-4.0.0.2.1.2.0-402-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: Found binding in
>>>>>>>> [jar:file:/usr/lib/phoenix/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: Found binding in
>>>>>>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
>>>>>>>> an explanation.
>>>>>>>> Java config name: null
>>>>>>>> Native config name: /etc/krb5.conf
>>>>>>>> Loaded from native config
>>>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>>>> krbtgt/MYREALM@MYREALM
>>>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>>>> >>>DEBUG <CCacheInputStream> auth time: Tue Jun 17 10:58:33 UTC 2014
>>>>>>>> >>>DEBUG <CCacheInputStream> start time: Tue Jun 17 10:58:50 UTC
>>>>>>>> 2014
>>>>>>>> >>>DEBUG <CCacheInputStream> end time: Wed Jun 18 10:58:50 UTC 2014
>>>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Tue Jun 24 10:58:33
>>>>>>>> UTC 2014
>>>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>> Service ticket not found in the subject
>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KdcAccessibility: reset
>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>> number of retries =3, #bytes=737
>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> Krb5Context setting mySeqNumber to: 595457406
>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>> Created InitSecContextToken:
>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>  ..n..j0..f......
>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>  ................
>>>>>>>> [...]
>>>>>>>> 0260: 7A 66 8D 83 5C 76 84 2E   09 6B E4 7E 3C 6C 7A 3A
>>>>>>>>  zf..\v...k..<lz:
>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>> 02 01 04 00 ff ff ff ff 7b 14 41 17 41 6a d6 72 f2 55 a5 2f d1 95 c3 99 30
>>>>>>>> 8f 00 95 9e 1a 23 b6 4b b5 5d 89 6e f5 b4 e6 5a 50 1d d3 01 01 00 00 04 04
>>>>>>>> 04 04 ]
>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>>> 01 04 00 ff ff ff ff 17 ec 99 7b 96 4e 05 41 26 5e 0b b5 b9 c6 5e c8 52 9b
>>>>>>>> 14 69 d1 43 7a fa bc 4b 75 fe 49 61 2b 99 52 13 c7 9d 01 01 00 00 68 64 70
>>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Wed Jun 18 10:58:50 UTC
>>>>>>>> 2014
>>>>>>>> Service ticket not found in the subject
>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>> number of retries =3, #bytes=737
>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> Krb5Context setting mySeqNumber to: 284225265
>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>> Created InitSecContextToken:
>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>  ..n..j0..f......
>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>  ................
>>>>>>>> [...]
>>>>>>>> 0260: 76 30 32 5D 70 32 BA 6F   1F E0 C7 8F 9C B4 24 73
>>>>>>>>  v02]p2.o......$s
>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>> 02 01 04 00 ff ff ff ff 24 63 c2 63 f1 09 e4 a1 d9 8e 56 77 35 a4 c3 76 11
>>>>>>>> 77 d7 30 a9 6b 15 4d ee d7 2d 5c 80 e8 28 1d 2a 75 ac 1c 01 01 00 00 04 04
>>>>>>>> 04 04 ]
>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>>> 01 04 00 ff ff ff ff 55 15 d9 ab fb 0f ac 4e a1 1f 2e 0b 89 ca 61 a0 5a d3
>>>>>>>> 4e f6 af 30 4f 6d 8f ad 2d 0c 9d b7 c4 be a7 b2 ac b5 01 01 00 00 68 64 70
>>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>>> 14/06/17 12:25:26 WARN ipc.RpcClient: Exception encountered while
>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>> 14/06/17 12:25:26 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>> find any Kerberos tgt)]
>>>>>>>>  at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>  at
>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at
>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>  at
>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>  at
>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>> at
>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>> at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>> at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>> at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>  ... 47 more
>>>>>>>> 14/06/17 12:25:28 WARN ipc.RpcClient: Exception encountered while
>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>> 14/06/17 12:25:28 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>> find any Kerberos tgt)]
>>>>>>>>  at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>  at
>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at
>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>  at
>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>  at
>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>> at
>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>> at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>> at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>> at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>  ... 47 more
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Can any of you help me with this problem?
>>>>>>>
>>>>>>> Kind Regards
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Thanks & Regards,
>>>>>> Anil Gupta
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>



-- 
Thanks & Regards,
Anil Gupta

Re: Phoenix and Kerberos (No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt))

Posted by Giuseppe Reina <g....@gmail.com>.
Yes,
  I compiled the latest version of Phoenix and provided the principal and
keytab in the connection string. It works with or without using kinit to
cache the credentials.
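
A quick way to see that the keytab in the connection string is what
authenticates, sketched with placeholder hosts, keytab path, and principal
(argument order as Anil described earlier in the thread):

  kdestroy    # clear any cached TGT first
  sqlline.py zk1.mydomain,zk2.mydomain,zk3.mydomain:2181:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab:myuser@MYREALM
  # the connection should still come up, since the client logs in from the keytab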

Kind Regards


On Wed, Jul 2, 2014 at 4:57 PM, anil gupta <an...@gmail.com> wrote:

> Nice, thanks for the update. I am assuming you didn't do kinit; rather, you
> provided the keytab and principal in the Phoenix connection string.
>
>
> On Wed, Jul 2, 2014 at 1:22 AM, Giuseppe Reina <g....@gmail.com> wrote:
>
>> Hi Anil,
>>   yes, in the end Justin was right: I solved it by reordering the classpath.
>> Essentially, I tried again to find the right permutation, but with the
>> phoenix-*-client-without-hbase.jar library instead of
>> phoenix-*-client-minimal.jar, and including as many jars as possible from
>> the HBase, HDFS, and Hadoop deployments.
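>>
>> Roughly, the shape of invocation that ends up working is something like the
>> sketch below (the jar names and paths are illustrative only and will differ
>> between distributions):
>>
>> java -cp "/etc/hadoop/conf:/etc/hbase/conf:\
>> /usr/lib/hbase/lib/*:/usr/lib/hadoop/client/*:/usr/lib/hadoop-hdfs/*:\
>> /usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-without-hbase.jar:\
>> /usr/lib/phoenix/lib/*" \
>>   -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf \
>>   sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver \
>>   -u jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab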
>>
>> Kind Regards,
>>
>>
>> On Tue, Jul 1, 2014 at 6:16 PM, anil gupta <an...@gmail.com> wrote:
>>
>>> Hi Giuseppe,
>>>
>>> I am curious to know whether you were able to connect to a secure
>>> HBase 0.98 cluster?
>>>
>>> Thanks,
>>> Anil Gupta
>>>
>>>
>>> On Tue, Jun 24, 2014 at 12:18 AM, anil gupta <an...@gmail.com>
>>> wrote:
>>>
>>>> Hi Giuseppe,
>>>>
>>>> Since you are using HBase 0.98, you don't need to try out different
>>>> permutations and combinations. Just take the standard phoenix-client.jar from
>>>> the nightly build and replace the phoenix-client.jar of your phoenix-4.0.0
>>>> installation. Make sure all of your bin directory scripts are also the
>>>> default scripts. You don't need to modify any script as described in
>>>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>>> because PHOENIX-19 handles connecting to a secure cluster more gracefully.
>>>> Also, you don't need to do kinit anymore; Phoenix handles all of the
>>>> Kerberos work within the Java client.
>>>>
>>>> Then try running sqlline like this:
>>>> sqlline.py (or sqlline.sh) zk:port:hbase_root_dir:keytab_file:principal
>>>>
>>>> If you run into problems, please share the logs. It would also be
>>>> helpful if you could turn on debug logging.
>>>>
>>>> Thanks,
>>>> Anil Gupta
>>>>
>>>>
>>>>
>>>> On Mon, Jun 23, 2014 at 1:48 AM, Giuseppe Reina <g....@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi Justin,
>>>>>   sorry, but I tried all the relevant permutations of the jar files on
>>>>> the classpath, listing the path of each jar explicitly, and I still have
>>>>> the same problem.
>>>>> Any other ideas?
>>>>>
>>>>> Kind Regards,
>>>>>
>>>>>
>>>>> On Thu, Jun 19, 2014 at 4:09 PM, Justin Workman <
>>>>> justinjworkman@gmail.com> wrote:
>>>>>
>>>>>> I have had success with HBase 0.96 from Cloudera CDH 4.3.0. The order
>>>>>> of jars and configs on the classpath played a big part in getting this to
>>>>>> work for me. Here is the order we had to use in order to connect and
>>>>>> authenticate:
>>>>>>
>>>>>> 1) Path to Hadoop and HBase configuration files
>>>>>> 2) HBase security jar
>>>>>> 3) Hadoop common jar
>>>>>> 4) Hadoop auth jar
>>>>>> 5) Zookeeper jar
>>>>>> 6) Phoenix client jar
>>>>>> 7) All other needed libraries
>>>>>>
>>>>>> For 2-6, list the jars explicitly, not just the path to the
>>>>>> containing directories.
>>>>>>
>>>>>> Hope this helps
>>>>>> Justin
>>>>>>
>>>>>>
>>>>>> On Thu, Jun 19, 2014 at 8:39 AM, Giuseppe Reina <g....@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>   unfortunately I'm still having the same problem. I'm using Hadoop
>>>>>>> 2 and HBase 0.98 so I couldn't use Phoenix 3.0.
>>>>>>> I downloaded the sources from the git repository
>>>>>>> <https://git-wip-us.apache.org/repos/asf/phoenix.git> and I
>>>>>>> compiled the jars.
>>>>>>> I had to change the previous java command because I was receiving
>>>>>>> the following exception:
>>>>>>>
>>>>>>> 14/06/19 14:29:23 WARN util.DynamicClassLoader: Failed to identify
>>>>>>>> the fs of dir hdfs://zk1.mydomain:8020/apps/hbase/data/lib, ignored
>>>>>>>> org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot
>>>>>>>> communicate with client version 4
>>>>>>>
>>>>>>>
>>>>>>> So now I launch Phoenix with:
>>>>>>>
>>>>>>> java -cp
>>>>>>>> ".:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-minimal.jar:/usr/lib/hbase/lib/htrace-core-2.04.jar:/usr/lib/hbase/lib/hbase-server.jar:jline-2.11.jar:sqlline-1.1.2.jar:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-without-hbase.jar:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>>> -n none -p none  --fastConnect=false --verbose=true
>>>>>>>> --isolation=TRANSACTION_READ_COMMITTED
>>>>>>>
>>>>>>>
>>>>>>> But I still get:
>>>>>>>
>>>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>>>> issuing: !connect
>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>>> none none org.apache.phoenix.jdbc.PhoenixDriver
>>>>>>>> Connecting to
>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>>>
>>>>>>>> Java config name: null
>>>>>>>> Native config name: /etc/krb5.conf
>>>>>>>> Loaded from native config
>>>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>>>> krbtgt/MYREALM@MYREALM
>>>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>>>> >>>DEBUG <CCacheInputStream> auth time: Thu Jun 19 10:19:17 UTC 2014
>>>>>>>> >>>DEBUG <CCacheInputStream> start time: Thu Jun 19 10:19:20 UTC
>>>>>>>> 2014
>>>>>>>> >>>DEBUG <CCacheInputStream> end time: Fri Jun 20 10:19:20 UTC 2014
>>>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Thu Jun 26 10:19:17
>>>>>>>> UTC 2014
>>>>>>>>
>>>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>>>
>>>>>>>> Service ticket not found in the subject
>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KdcAccessibility: reset
>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>> number of retries =3, #bytes=737
>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> Krb5Context setting mySeqNumber to: 583967039
>>>>>>>>
>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>> Created InitSecContextToken:
>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>  ..n..j0..f......
>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>  ................
>>>>>>>> [...]
>>>>>>>> 0260: AC 98 ED E8 03 97 D4 D7   97 DE C6 0B 22 02 6C 10
>>>>>>>>  ............".l.
>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>> 02 01 04 00 ff ff ff ff b0 33 06 7a d1 cb e3 4b 83 f9 e1 2e d0 92 6c fc 80
>>>>>>>> 74 55 f0 14 8a 99 74 da b0 33 4b d2 e1 cc 31 c2 2a 75 2f 01 01 00 00 04 04
>>>>>>>> 04 04 ]
>>>>>>>>
>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>>> 01 04 00 ff ff ff ff 3c d5 18 05 ab db d2 68 8b 35 a6 72 8a 9c 80 82 84 c3
>>>>>>>> 9c 31 b0 df 01 18 e0 6d ea c4 a5 db ff 65 ba 01 c8 71 01 01 00 00 68 64 70
>>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Fri Jun 20 10:19:20 UTC
>>>>>>>> 2014
>>>>>>>>
>>>>>>>> Service ticket not found in the subject
>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>> number of retries =3, #bytes=737
>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>> Krb5Context setting mySeqNumber to: 594904455
>>>>>>>>
>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>> Created InitSecContextToken:
>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>  ..n..j0..f......
>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>  ................
>>>>>>>> [...]
>>>>>>>> 0260: B7 1F A6 E4 47 38 2E 21   AE 4E 25 A6 8D 57 19 CD
>>>>>>>>  ....G8.!.N%..W..
>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>> 02 01 04 00 ff ff ff ff 09 ed fd 9f ca d8 97 05 3a 2a 64 5c e4 c3 3b e0 71
>>>>>>>> 43 01 4e ab 16 2f 5a 7b 31 8b 32 9f 20 9a 47 8a d1 70 0a 01 01 00 00 04 04
>>>>>>>> 04 04 ]
>>>>>>>>
>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>>> 01 04 00 ff ff ff ff e9 97 40 26 a5 de ee 05 d5 ac 03 42 03 c4 f7 66 11 76
>>>>>>>> f4 9e 2b e8 f2 a0 f7 d4 62 68 21 2d da 47 b4 92 c5 12 01 01 00 00 68 64 70
>>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>>> 14/06/19 14:15:44 WARN ipc.RpcClient: Exception encountered while
>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>> 14/06/19 14:15:44 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>>
>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>> find any Kerberos tgt)]
>>>>>>>>  at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>>>>
>>>>>>>> at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>>>>
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>> at
>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>  at
>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>> at
>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>> at
>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>> at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>> at
>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>> at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>  at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>> ... 54 more
>>>>>>>> 14/06/19 14:15:46 WARN ipc.RpcClient: Exception encountered while
>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>> 14/06/19 14:15:46 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>>
>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>> find any Kerberos tgt)]
>>>>>>>>  at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>> at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>  at
>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>>>>
>>>>>>>> at
>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>>>>
>>>>>>>>  at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>> at
>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>> at
>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>  at
>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>> at
>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>> at
>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>> at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>> at
>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>  at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>> at
>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>  at
>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>> ... 54 more
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I have my keys in the cache (also renewed with kinit -R). All the
>>>>>>> other clients work, just Phoenix is not working. I'm probably missing
>>>>>>> something.
>>>>>>> Any ideas? Does any of you have any insights?
>>>>>>>
>>>>>>>
>>>>>>> Kind Regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Jun 17, 2014 at 3:54 PM, Giuseppe Reina <g....@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Thank you. I'll try that!
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Jun 17, 2014 at 3:50 PM, anil gupta <an...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> Hi Giuseppe,
>>>>>>>>>
>>>>>>>>> The latest nightly builds of phoenix have the patch for
>>>>>>>>> https://issues.apache.org/jira/browse/PHOENIX-19. If you pick up
>>>>>>>>> the latest nighly then it would be easier to connect to a secure cluster.
>>>>>>>>> You can find the nighly for 3.0 here:
>>>>>>>>>
>>>>>>>>> https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/
>>>>>>>>>
>>>>>>>>> If you are using HBase098 then try out Phoenix4.0 nightly. Let me
>>>>>>>>> know if you need further help.
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Anil Gupta
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Jun 17, 2014 at 6:35 AM, Giuseppe Reina <g.reina@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> Hi all,
>>>>>>>>>>   I'm trying to make Phoenix work with HBase and Kerberos but so
>>>>>>>>>> far I got no luck. I'm currently using HDP 2.1 on Centos 6.5 and following
>>>>>>>>>> this guide as reference:
>>>>>>>>>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>>>>>>>>> I'm able to use mainly all the Hadoop services (MapReduce,
>>>>>>>>>> Zookeeper, HBase,...) using my user but not Phoenix (note I granted RWCA
>>>>>>>>>> permissions to my user on hbase).
>>>>>>>>>>
>>>>>>>>>> I don't see any problems with my TGT
>>>>>>>>>>
>>>>>>>>>> [myuser@zk1.mydomain ~]$ klist -fae
>>>>>>>>>>> Ticket cache: FILE:/tmp/krb5cc_501
>>>>>>>>>>> Default principal: myuser@MYREALM
>>>>>>>>>>> Valid starting     Expires            Service principal
>>>>>>>>>>> 06/17/14 10:58:50  06/18/14 10:58:50  krbtgt/MYREALM@MYREALM
>>>>>>>>>>>  renew until 06/24/14 10:58:33, Flags: FRIT
>>>>>>>>>>> Etype (skey, tkt): des3-cbc-sha1, des3-cbc-sha1
>>>>>>>>>>> Addresses: (none)
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> But when I launch Phoenix sqlline with the krb5 debug on using
>>>>>>>>>> the following command:
>>>>>>>>>>
>>>>>>>>>>> [myuser@zk1.mydomain ~]$ java -cp
>>>>>>>>>>> ".:/usr/lib/phoenix/*:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure -n myuser
>>>>>>>>>>>  --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> I get the following error:
>>>>>>>>>>
>>>>>>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>>>>>>> issuing: !connect
>>>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure myuser ''
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver
>>>>>>>>>>> Connecting to
>>>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure
>>>>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>>>>> SLF4J: Found binding in
>>>>>>>>>>> [jar:file:/usr/lib/phoenix/phoenix-4.0.0.2.1.2.0-402-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>>>> SLF4J: Found binding in
>>>>>>>>>>> [jar:file:/usr/lib/phoenix/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>>>> SLF4J: Found binding in
>>>>>>>>>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
>>>>>>>>>>> an explanation.
>>>>>>>>>>> Java config name: null
>>>>>>>>>>> Native config name: /etc/krb5.conf
>>>>>>>>>>> Loaded from native config
>>>>>>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>>>>>>> krbtgt/MYREALM@MYREALM
>>>>>>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>>>>>>> >>>DEBUG <CCacheInputStream> auth time: Tue Jun 17 10:58:33 UTC
>>>>>>>>>>> 2014
>>>>>>>>>>> >>>DEBUG <CCacheInputStream> start time: Tue Jun 17 10:58:50 UTC
>>>>>>>>>>> 2014
>>>>>>>>>>> >>>DEBUG <CCacheInputStream> end time: Wed Jun 18 10:58:50 UTC
>>>>>>>>>>> 2014
>>>>>>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Tue Jun 24
>>>>>>>>>>> 10:58:33 UTC 2014
>>>>>>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE;
>>>>>>>>>>> INITIAL;
>>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>>> Service ticket not found in the subject
>>>>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>>>>> >>> EType:
>>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>>> >>> KdcAccessibility: reset
>>>>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>>>>> number of retries =3, #bytes=737
>>>>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>>>>> >>> EType:
>>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>>>>> >>> EType:
>>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>>> Krb5Context setting mySeqNumber to: 595457406
>>>>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>>>>> Created InitSecContextToken:
>>>>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>>>>  ..n..j0..f......
>>>>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>>>>  ................
>>>>>>>>>>> [...]
>>>>>>>>>>> 0260: 7A 66 8D 83 5C 76 84 2E   09 6B E4 7E 3C 6C 7A 3A
>>>>>>>>>>>  zf..\v...k..<lz:
>>>>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02
>>>>>>>>>>> 02 02 01 04 00 ff ff ff ff 7b 14 41 17 41 6a d6 72 f2 55 a5 2f d1 95 c3 99
>>>>>>>>>>> 30 8f 00 95 9e 1a 23 b6 4b b5 5d 89 6e f5 b4 e6 5a 50 1d d3 01 01 00 00 04
>>>>>>>>>>> 04 04 04 ]
>>>>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40
>>>>>>>>>>> 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>>>> 02 01 04 00 ff ff ff ff 17 ec 99 7b 96 4e 05 41 26 5e 0b b5 b9 c6 5e c8 52
>>>>>>>>>>> 9b 14 69 d1 43 7a fa bc 4b 75 fe 49 61 2b 99 52 13 c7 9d 01 01 00 00 68 64
>>>>>>>>>>> 70 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03
>>>>>>>>>>> 03 ]
>>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Wed Jun 18 10:58:50
>>>>>>>>>>> UTC 2014
>>>>>>>>>>> Service ticket not found in the subject
>>>>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>>>>> >>> EType:
>>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>>>>> number of retries =3, #bytes=737
>>>>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>>>>> >>> EType:
>>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>>>>> >>> EType:
>>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>>> Krb5Context setting mySeqNumber to: 284225265
>>>>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>>>>> Created InitSecContextToken:
>>>>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>>>>  ..n..j0..f......
>>>>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>>>>  ................
>>>>>>>>>>> [...]
>>>>>>>>>>> 0260: 76 30 32 5D 70 32 BA 6F   1F E0 C7 8F 9C B4 24 73
>>>>>>>>>>>  v02]p2.o......$s
>>>>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02
>>>>>>>>>>> 02 02 01 04 00 ff ff ff ff 24 63 c2 63 f1 09 e4 a1 d9 8e 56 77 35 a4 c3 76
>>>>>>>>>>> 11 77 d7 30 a9 6b 15 4d ee d7 2d 5c 80 e8 28 1d 2a 75 ac 1c 01 01 00 00 04
>>>>>>>>>>> 04 04 04 ]
>>>>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40
>>>>>>>>>>> 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>>>> 02 01 04 00 ff ff ff ff 55 15 d9 ab fb 0f ac 4e a1 1f 2e 0b 89 ca 61 a0 5a
>>>>>>>>>>> d3 4e f6 af 30 4f 6d 8f ad 2d 0c 9d b7 c4 be a7 b2 ac b5 01 01 00 00 68 64
>>>>>>>>>>> 70 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03
>>>>>>>>>>> 03 ]
>>>>>>>>>>> 14/06/17 12:25:26 WARN ipc.RpcClient: Exception encountered
>>>>>>>>>>> while connecting to the server : javax.security.sasl.SaslException: GSS
>>>>>>>>>>> initiate failed [Caused by GSSException: No valid credentials provided
>>>>>>>>>>> (Mechanism level: Failed to find any Kerberos tgt)]
>>>>>>>>>>> 14/06/17 12:25:26 FATAL ipc.RpcClient: SASL authentication
>>>>>>>>>>> failed. The most likely cause is missing or invalid credentials. Consider
>>>>>>>>>>> 'kinit'.
>>>>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused
>>>>>>>>>>> by GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>>>>> find any Kerberos tgt)]
>>>>>>>>>>>  at
>>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>>>>  at
>>>>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>>> at
>>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>>>  at
>>>>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>>>>> Caused by: GSSException: No valid credentials provided
>>>>>>>>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>>>>>>>>> at
>>>>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>>>> at
>>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>>>> at
>>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>>>> at
>>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>>>>  ... 47 more
>>>>>>>>>>> 14/06/17 12:25:28 WARN ipc.RpcClient: Exception encountered
>>>>>>>>>>> while connecting to the server : javax.security.sasl.SaslException: GSS
>>>>>>>>>>> initiate failed [Caused by GSSException: No valid credentials provided
>>>>>>>>>>> (Mechanism level: Failed to find any Kerberos tgt)]
>>>>>>>>>>> 14/06/17 12:25:28 FATAL ipc.RpcClient: SASL authentication
>>>>>>>>>>> failed. The most likely cause is missing or invalid credentials. Consider
>>>>>>>>>>> 'kinit'.
>>>>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused
>>>>>>>>>>> by GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>>>>> find any Kerberos tgt)]
>>>>>>>>>>>  at
>>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>>>>> at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>>>>  at
>>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>>>>  at
>>>>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>>> at
>>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>>>  at
>>>>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>>>>> Caused by: GSSException: No valid credentials provided
>>>>>>>>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>>>>>>>>> at
>>>>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>>>> at
>>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>>>> at
>>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>>>>  at
>>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>>>> at
>>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>>>>  ... 47 more
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Can any of you help me with this problem?
>>>>>>>>>>
>>>>>>>>>> Kind Regards
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Thanks & Regards,
>>>>>>>>> Anil Gupta
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Thanks & Regards,
>>>> Anil Gupta
>>>>
>>>
>>>
>>>
>>> --
>>> Thanks & Regards,
>>> Anil Gupta
>>>
>>
>>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>

Re: Phoenix and Kerberos (No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt))

Posted by anil gupta <an...@gmail.com>.
Nice, thanks for the update. I am assuming you didn't do kinit; rather, you
provided the keytab and principal in the Phoenix connection string.
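
For reference, a keytab-based sqlline invocation of that form might look
roughly like this (a minimal sketch only: the hosts, realm, and keytab path
are placeholders, and the relative order of the principal and keytab parts of
the URL should be verified against the Phoenix build in use):

  # placeholder values; check the principal/keytab ordering for your Phoenix version
  bin/sqlline.py "zk1.mydomain,zk2.mydomain,zk3.mydomain:2181:/hbase-secure:myuser@MYREALM:/etc/security/keytabs/myuser.headless.keytab"

With the PHOENIX-19 patch in place the client logs in from the keytab itself,
so no prior kinit is required.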


On Wed, Jul 2, 2014 at 1:22 AM, Giuseppe Reina <g....@gmail.com> wrote:

> Hi Anil,
>   yes, in the end Justin was right: I solved it by reordering the classpath.
> Essentially, I tried the permutations again, but with the
> phoenix-*-client-without-hbase.jar library instead of
> phoenix-*-client-minimal.jar, and including as many jars as possible from
> the hbase, hdfs, and hadoop deployments, roughly in the order sketched below.
>
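> A classpath of that shape might look roughly like the following (a sketch
> only, following the ordering Justin described, with the same -D flags as in
> the earlier command; the jar names, versions, and paths are placeholders
> from an HDP-style layout and will differ per installation):
>
> # placeholder paths; keep the config dirs first and the Phoenix client jar
> # after the HBase client, Hadoop common/auth, and ZooKeeper jars
> java -cp "/etc/hbase/conf:/etc/hadoop/conf:\
> /usr/lib/hbase/lib/hbase-client.jar:\
> /usr/lib/hadoop/hadoop-common.jar:\
> /usr/lib/hadoop/hadoop-auth.jar:\
> /usr/lib/zookeeper/zookeeper.jar:\
> /usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-without-hbase.jar:\
> /usr/lib/phoenix/lib/*:/usr/lib/hbase/lib/*" \
> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver \
> -u jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure
>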
> Kind Regards,
>
>
> On Tue, Jul 1, 2014 at 6:16 PM, anil gupta <an...@gmail.com> wrote:
>
>> Hi Guiseppe,
>>
>> I am curious to know whether you were able to connect to  secure
>> HBase0.98 cluster?
>>
>> Thanks,
>> Anil Gupta
>>
>>
>> On Tue, Jun 24, 2014 at 12:18 AM, anil gupta <an...@gmail.com>
>> wrote:
>>
>>> Hi Guiseppe,
>>>
>>> Since you are using HBase0.98 you dont need to try out different
>>> permutation and combination.Just take the standard phoenix-client.jar from
>>> nighlty build and replace the phoenix-client.jar of phoenix-4.0.0
>>> installation. Make sure your all your bin directory scripts are also
>>> defaults scripts. You dont need modification of any script as per this
>>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>> because Phoenix-19 handles connecting to a secure cluster more gracefully.
>>> Also you dont need to do kinit now. Phoenix will handle all the stuff
>>> related to kerberos within Java client.
>>>
>>> Then try running sqlline like this:
>>> sqlline.py( or sqlline.sh) zk:port:hbase_root_dir:keytab_file:principal
>>>
>>> If you run into problems then please share the logs. Also, it would be
>>> helpful if you can turn on debug logs.
>>>
>>> Thanks,
>>> Anil Gupta
>>>
>>>
>>>
>>> On Mon, Jun 23, 2014 at 1:48 AM, Giuseppe Reina <g....@gmail.com>
>>> wrote:
>>>
>>>> Hi Justin,
>>>>   sorry but I tried all the relevant permutations of the jar files in
>>>> the classpath listing explicitly the path of the jars, and still I have the
>>>> same problem.
>>>> Any other ideas?
>>>>
>>>> Kind Regards,
>>>>
>>>>
>>>> On Thu, Jun 19, 2014 at 4:09 PM, Justin Workman <
>>>> justinjworkman@gmail.com> wrote:
>>>>
>>>>> I have had success with Hbase 0.96 from cloudera CDH4.3.0. The order
>>>>> of jars and configs on the classpath played a big part in getting this to
>>>>> work for me. Here is the order we have to use in order to connect and
>>>>> authenticate
>>>>>
>>>>> 1) Path to Hadoop and Hbase configuration files
>>>>> 2) Hbase security jar
>>>>> 3) Hadoop common jar
>>>>> 4) Hadoop auth jar
>>>>> 5) Zookeeper jar
>>>>> 6) Phoenix client jar
>>>>> 7) All other needed libraries
>>>>>
>>>>> For 2-6, list the jars explicitly, not just the path to the containing
>>>>> directories.
>>>>>
>>>>> Hope this helps
>>>>> Justin
>>>>>
>>>>>
>>>>> On Thu, Jun 19, 2014 at 8:39 AM, Giuseppe Reina <g....@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>   unfortunately I'm still having the same problem. I'm using Hadoop 2
>>>>>> and HBase 0.98 so I couldn't use Phoenix 3.0.
>>>>>> I downloaded the sources from the git repository
>>>>>> <https://git-wip-us.apache.org/repos/asf/phoenix.git> and I compiled
>>>>>> the jars.
>>>>>> I had to change the previous java command because I was receiving the
>>>>>> following exception:
>>>>>>
>>>>>> 14/06/19 14:29:23 WARN util.DynamicClassLoader: Failed to identify
>>>>>>> the fs of dir hdfs://zk1.mydomain:8020/apps/hbase/data/lib, ignored
>>>>>>> org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot
>>>>>>> communicate with client version 4
>>>>>>
>>>>>>
>>>>>> So now I launch Phoenix with:
>>>>>>
>>>>>> java -cp
>>>>>>> ".:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-minimal.jar:/usr/lib/hbase/lib/htrace-core-2.04.jar:/usr/lib/hbase/lib/hbase-server.jar:jline-2.11.jar:sqlline-1.1.2.jar:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-without-hbase.jar:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>> -n none -p none  --fastConnect=false --verbose=true
>>>>>>> --isolation=TRANSACTION_READ_COMMITTED
>>>>>>
>>>>>>
>>>>>> But I still get:
>>>>>>
>>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>>> issuing: !connect
>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>> none none org.apache.phoenix.jdbc.PhoenixDriver
>>>>>>> Connecting to
>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>>
>>>>>>> Java config name: null
>>>>>>> Native config name: /etc/krb5.conf
>>>>>>> Loaded from native config
>>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>>> krbtgt/MYREALM@MYREALM
>>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>>> >>>DEBUG <CCacheInputStream> auth time: Thu Jun 19 10:19:17 UTC 2014
>>>>>>> >>>DEBUG <CCacheInputStream> start time: Thu Jun 19 10:19:20 UTC 2014
>>>>>>> >>>DEBUG <CCacheInputStream> end time: Fri Jun 20 10:19:20 UTC 2014
>>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Thu Jun 26 10:19:17
>>>>>>> UTC 2014
>>>>>>>
>>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>>
>>>>>>> Service ticket not found in the subject
>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>> >>> KdcAccessibility: reset
>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>> number of retries =3, #bytes=737
>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>> Krb5Context setting mySeqNumber to: 583967039
>>>>>>>
>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>> Created InitSecContextToken:
>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>  ..n..j0..f......
>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>  ................
>>>>>>> [...]
>>>>>>> 0260: AC 98 ED E8 03 97 D4 D7   97 DE C6 0B 22 02 6C 10
>>>>>>>  ............".l.
>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>> 01 04 00 ff ff ff ff b0 33 06 7a d1 cb e3 4b 83 f9 e1 2e d0 92 6c fc 80 74
>>>>>>> 55 f0 14 8a 99 74 da b0 33 4b d2 e1 cc 31 c2 2a 75 2f 01 01 00 00 04 04 04
>>>>>>> 04 ]
>>>>>>>
>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54
>>>>>>> 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>> 01 04 00 ff ff ff ff 3c d5 18 05 ab db d2 68 8b 35 a6 72 8a 9c 80 82 84 c3
>>>>>>> 9c 31 b0 df 01 18 e0 6d ea c4 a5 db ff 65 ba 01 c8 71 01 01 00 00 68 64 70
>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Fri Jun 20 10:19:20 UTC
>>>>>>> 2014
>>>>>>>
>>>>>>> Service ticket not found in the subject
>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>> number of retries =3, #bytes=737
>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>> Krb5Context setting mySeqNumber to: 594904455
>>>>>>>
>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>> Created InitSecContextToken:
>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>  ..n..j0..f......
>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>  ................
>>>>>>> [...]
>>>>>>> 0260: B7 1F A6 E4 47 38 2E 21   AE 4E 25 A6 8D 57 19 CD
>>>>>>>  ....G8.!.N%..W..
>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>> 01 04 00 ff ff ff ff 09 ed fd 9f ca d8 97 05 3a 2a 64 5c e4 c3 3b e0 71 43
>>>>>>> 01 4e ab 16 2f 5a 7b 31 8b 32 9f 20 9a 47 8a d1 70 0a 01 01 00 00 04 04 04
>>>>>>> 04 ]
>>>>>>>
>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54
>>>>>>> 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>> 01 04 00 ff ff ff ff e9 97 40 26 a5 de ee 05 d5 ac 03 42 03 c4 f7 66 11 76
>>>>>>> f4 9e 2b e8 f2 a0 f7 d4 62 68 21 2d da 47 b4 92 c5 12 01 01 00 00 68 64 70
>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>> 14/06/19 14:15:44 WARN ipc.RpcClient: Exception encountered while
>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>> 14/06/19 14:15:44 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>
>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>> find any Kerberos tgt)]
>>>>>>>  at
>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>> at
>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>>>  at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>>>  at
>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>  at
>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>>>  at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>>>> at
>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>>  at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>>>  at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>>>  at
>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>>>
>>>>>>>  at
>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>> at
>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>  at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>> at
>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>  at
>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>> at
>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>  at
>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>> at
>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>  at
>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>> at
>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>  at
>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>> ... 54 more
>>>>>>> 14/06/19 14:15:46 WARN ipc.RpcClient: Exception encountered while
>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>> 14/06/19 14:15:46 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>
>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>> find any Kerberos tgt)]
>>>>>>>  at
>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>> at
>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>> at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>>>  at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>>>  at
>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>  at
>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>>>  at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>>>> at
>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>>  at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>>>  at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>>>  at
>>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>> at
>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>>>
>>>>>>>  at
>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>> at
>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>> at
>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>  at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>> at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>> at
>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>  at
>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>> at
>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>  at
>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>> at
>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>  at
>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>> at
>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>  at
>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>> ... 54 more
>>>>>>
>>>>>>
>>>>>>
>>>>>> I have my keys in the cache (also renewed with kinit -R). All the
>>>>>> other clients work, just Phoenix is not working. I'm probably missing
>>>>>> something.
>>>>>> Any ideas? Does any of you have any insights?
>>>>>>
>>>>>>
>>>>>> Kind Regards
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Tue, Jun 17, 2014 at 3:54 PM, Giuseppe Reina <g....@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Thank you. I'll try that!
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Jun 17, 2014 at 3:50 PM, anil gupta <an...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi Giuseppe,
>>>>>>>>
>>>>>>>> The latest nightly builds of phoenix have the patch for
>>>>>>>> https://issues.apache.org/jira/browse/PHOENIX-19. If you pick up
>>>>>>>> the latest nighly then it would be easier to connect to a secure cluster.
>>>>>>>> You can find the nighly for 3.0 here:
>>>>>>>>
>>>>>>>> https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/
>>>>>>>>
>>>>>>>> If you are using HBase098 then try out Phoenix4.0 nightly. Let me
>>>>>>>> know if you need further help.
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Anil Gupta
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Jun 17, 2014 at 6:35 AM, Giuseppe Reina <g....@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> Hi all,
>>>>>>>>>   I'm trying to make Phoenix work with HBase and Kerberos but so
>>>>>>>>> far I got no luck. I'm currently using HDP 2.1 on Centos 6.5 and following
>>>>>>>>> this guide as reference:
>>>>>>>>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>>>>>>>> I'm able to use mainly all the Hadoop services (MapReduce,
>>>>>>>>> Zookeeper, HBase,...) using my user but not Phoenix (note I granted RWCA
>>>>>>>>> permissions to my user on hbase).
>>>>>>>>>
>>>>>>>>> I don't see any problems with my TGT
>>>>>>>>>
>>>>>>>>> [myuser@zk1.mydomain ~]$ klist -fae
>>>>>>>>>> Ticket cache: FILE:/tmp/krb5cc_501
>>>>>>>>>> Default principal: myuser@MYREALM
>>>>>>>>>> Valid starting     Expires            Service principal
>>>>>>>>>> 06/17/14 10:58:50  06/18/14 10:58:50  krbtgt/MYREALM@MYREALM
>>>>>>>>>>  renew until 06/24/14 10:58:33, Flags: FRIT
>>>>>>>>>> Etype (skey, tkt): des3-cbc-sha1, des3-cbc-sha1
>>>>>>>>>> Addresses: (none)
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> But when I launch Phoenix sqlline with the krb5 debug on using the
>>>>>>>>> following command:
>>>>>>>>>
>>>>>>>>>> [myuser@zk1.mydomain ~]$ java -cp
>>>>>>>>>> ".:/usr/lib/phoenix/*:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure -n myuser
>>>>>>>>>>  --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> I get the following error:
>>>>>>>>>
>>>>>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>>>>>> issuing: !connect
>>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure myuser ''
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver
>>>>>>>>>> Connecting to
>>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure
>>>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>>>> SLF4J: Found binding in
>>>>>>>>>> [jar:file:/usr/lib/phoenix/phoenix-4.0.0.2.1.2.0-402-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>>> SLF4J: Found binding in
>>>>>>>>>> [jar:file:/usr/lib/phoenix/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>>> SLF4J: Found binding in
>>>>>>>>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
>>>>>>>>>> an explanation.
>>>>>>>>>> Java config name: null
>>>>>>>>>> Native config name: /etc/krb5.conf
>>>>>>>>>> Loaded from native config
>>>>>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>>>>>> krbtgt/MYREALM@MYREALM
>>>>>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>>>>>> >>>DEBUG <CCacheInputStream> auth time: Tue Jun 17 10:58:33 UTC
>>>>>>>>>> 2014
>>>>>>>>>> >>>DEBUG <CCacheInputStream> start time: Tue Jun 17 10:58:50 UTC
>>>>>>>>>> 2014
>>>>>>>>>> >>>DEBUG <CCacheInputStream> end time: Wed Jun 18 10:58:50 UTC
>>>>>>>>>> 2014
>>>>>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Tue Jun 24 10:58:33
>>>>>>>>>> UTC 2014
>>>>>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE;
>>>>>>>>>> INITIAL;
>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>> Service ticket not found in the subject
>>>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>>>> >>> EType:
>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>> >>> KdcAccessibility: reset
>>>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>>>> number of retries =3, #bytes=737
>>>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>>>> >>> EType:
>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>>>> >>> EType:
>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>> Krb5Context setting mySeqNumber to: 595457406
>>>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>>>> Created InitSecContextToken:
>>>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>>>  ..n..j0..f......
>>>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>>>  ................
>>>>>>>>>> [...]
>>>>>>>>>> 0260: 7A 66 8D 83 5C 76 84 2E   09 6B E4 7E 3C 6C 7A 3A
>>>>>>>>>>  zf..\v...k..<lz:
>>>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>>> 02 01 04 00 ff ff ff ff 7b 14 41 17 41 6a d6 72 f2 55 a5 2f d1 95 c3 99 30
>>>>>>>>>> 8f 00 95 9e 1a 23 b6 4b b5 5d 89 6e f5 b4 e6 5a 50 1d d3 01 01 00 00 04 04
>>>>>>>>>> 04 04 ]
>>>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>>> 02 01 04 00 ff ff ff ff 17 ec 99 7b 96 4e 05 41 26 5e 0b b5 b9 c6 5e c8 52
>>>>>>>>>> 9b 14 69 d1 43 7a fa bc 4b 75 fe 49 61 2b 99 52 13 c7 9d 01 01 00 00 68 64
>>>>>>>>>> 70 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03
>>>>>>>>>> 03 ]
>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Wed Jun 18 10:58:50
>>>>>>>>>> UTC 2014
>>>>>>>>>> Service ticket not found in the subject
>>>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>>>> >>> EType:
>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>>>> number of retries =3, #bytes=737
>>>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>>>> >>> EType:
>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>>>> >>> EType:
>>>>>>>>>> sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>>> Krb5Context setting mySeqNumber to: 284225265
>>>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>>>> Created InitSecContextToken:
>>>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>>>  ..n..j0..f......
>>>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>>>  ................
>>>>>>>>>> [...]
>>>>>>>>>> 0260: 76 30 32 5D 70 32 BA 6F   1F E0 C7 8F 9C B4 24 73
>>>>>>>>>>  v02]p2.o......$s
>>>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>>> 02 01 04 00 ff ff ff ff 24 63 c2 63 f1 09 e4 a1 d9 8e 56 77 35 a4 c3 76 11
>>>>>>>>>> 77 d7 30 a9 6b 15 4d ee d7 2d 5c 80 e8 28 1d 2a 75 ac 1c 01 01 00 00 04 04
>>>>>>>>>> 04 04 ]
>>>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>>> 02 01 04 00 ff ff ff ff 55 15 d9 ab fb 0f ac 4e a1 1f 2e 0b 89 ca 61 a0 5a
>>>>>>>>>> d3 4e f6 af 30 4f 6d 8f ad 2d 0c 9d b7 c4 be a7 b2 ac b5 01 01 00 00 68 64
>>>>>>>>>> 70 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03
>>>>>>>>>> 03 ]
>>>>>>>>>> 14/06/17 12:25:26 WARN ipc.RpcClient: Exception encountered while
>>>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>>>> 14/06/17 12:25:26 FATAL ipc.RpcClient: SASL authentication
>>>>>>>>>> failed. The most likely cause is missing or invalid credentials. Consider
>>>>>>>>>> 'kinit'.
>>>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>>>> find any Kerberos tgt)]
>>>>>>>>>>  at
>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>>>  at
>>>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>>  at
>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>>  at
>>>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>>>> at
>>>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>>>  at
>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>>> at
>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>>>  at
>>>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>>> at
>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>>>  at
>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>>> at
>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>>>  ... 47 more
>>>>>>>>>> 14/06/17 12:25:28 WARN ipc.RpcClient: Exception encountered while
>>>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>>>> 14/06/17 12:25:28 FATAL ipc.RpcClient: SASL authentication
>>>>>>>>>> failed. The most likely cause is missing or invalid credentials. Consider
>>>>>>>>>> 'kinit'.
>>>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>>>> find any Kerberos tgt)]
>>>>>>>>>>  at
>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>>>> at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>>>  at
>>>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>> at
>>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>>  at
>>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>>  at
>>>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>>>> at
>>>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>>>  at
>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>>> at
>>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>>>  at
>>>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>>> at
>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>>>  at
>>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>>> at
>>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>>>  ... 47 more
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Can any of you help me with this problem?
>>>>>>>>>
>>>>>>>>> Kind Regards
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Thanks & Regards,
>>>>>>>> Anil Gupta
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Thanks & Regards,
>>> Anil Gupta
>>>
>>
>>
>>
>> --
>> Thanks & Regards,
>> Anil Gupta
>>
>
>


-- 
Thanks & Regards,
Anil Gupta

Re: Phoenix and Kerberos (No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt))

Posted by Giuseppe Reina <g....@gmail.com>.
Hi Anil,
  yes, in the end Justin was right: I solved it by reordering the classpath.
Essentially, I tried the permutations again, but this time with the
phoenix-*-client-without-hbase.jar library instead of the
phoenix-*-client-minimal.jar, and including as many jars as possible from
the hbase, hdfs, and hadoop deployments.
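
For reference, here is a minimal sketch of a launch command built along those
lines, following the classpath order Justin described earlier in the thread
(configuration directories first, then the HBase/Hadoop security-related jars
listed explicitly, then the Phoenix client jar, then everything else) and the
keytab-style JDBC URL quoted later in this thread. The jar names, versions and
paths below are assumptions for an HDP 2.1-style layout and will differ per
installation:

# Sketch only: exact jar names/versions and paths are assumptions.
# Classpath order: config dirs, then individual security-related jars
# listed explicitly, then the Phoenix client jar, then the rest.
CP=".:/etc/hadoop/conf:/etc/hbase/conf"
CP="$CP:/usr/lib/hbase/lib/hbase-client.jar"   # HBase client/security classes
CP="$CP:/usr/lib/hbase/lib/hbase-server.jar"
CP="$CP:/usr/lib/hadoop/hadoop-common.jar"     # Hadoop common
CP="$CP:/usr/lib/hadoop/hadoop-auth.jar"       # Hadoop auth (Kerberos/SASL)
CP="$CP:/usr/lib/zookeeper/zookeeper.jar"      # ZooKeeper client
CP="$CP:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-without-hbase.jar"
CP="$CP:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/usr/lib/hbase/lib/*"

java -cp "$CP" \
  -Dsun.security.krb5.debug=true \
  -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf \
  -Djava.library.path=/usr/lib/hadoop/lib/native/ \
  -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties \
  sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver \
  -u "jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab" \
  -n none -p none --fastConnect=false --verbose=true \
  --isolation=TRANSACTION_READ_COMMITTED

The important part is the relative order: the cluster configuration and the
Hadoop/HBase security jars come before the Phoenix client jar, which is what
appears to have mattered here.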

Kind Regards,


On Tue, Jul 1, 2014 at 6:16 PM, anil gupta <an...@gmail.com> wrote:

> Hi Guiseppe,
>
> I am curious to know whether you were able to connect to  secure HBase0.98
> cluster?
>
> Thanks,
> Anil Gupta
>
>
> On Tue, Jun 24, 2014 at 12:18 AM, anil gupta <an...@gmail.com>
> wrote:
>
>> Hi Guiseppe,
>>
>> Since you are using HBase0.98 you dont need to try out different
>> permutation and combination.Just take the standard phoenix-client.jar from
>> nighlty build and replace the phoenix-client.jar of phoenix-4.0.0
>> installation. Make sure your all your bin directory scripts are also
>> defaults scripts. You dont need modification of any script as per this
>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>> because Phoenix-19 handles connecting to a secure cluster more gracefully.
>> Also you dont need to do kinit now. Phoenix will handle all the stuff
>> related to kerberos within Java client.
>>
>> Then try running sqlline like this:
>> sqlline.py( or sqlline.sh) zk:port:hbase_root_dir:keytab_file:principal
>>
>> If you run into problems then please share the logs. Also, it would be
>> helpful if you can turn on debug logs.
>>
>> Thanks,
>> Anil Gupta
>>
>>
>>
>> On Mon, Jun 23, 2014 at 1:48 AM, Giuseppe Reina <g....@gmail.com>
>> wrote:
>>
>>> Hi Justin,
>>>   sorry but I tried all the relevant permutations of the jar files in
>>> the classpath listing explicitly the path of the jars, and still I have the
>>> same problem.
>>> Any other ideas?
>>>
>>> Kind Regards,
>>>
>>>
>>> On Thu, Jun 19, 2014 at 4:09 PM, Justin Workman <
>>> justinjworkman@gmail.com> wrote:
>>>
>>>> I have had success with Hbase 0.96 from cloudera CDH4.3.0. The order of
>>>> jars and configs on the classpath played a big part in getting this to work
>>>> for me. Here is the order we have to use in order to connect and
>>>> authenticate
>>>>
>>>> 1) Path to Hadoop and Hbase configuration files
>>>> 2) Hbase security jar
>>>> 3) Hadoop common jar
>>>> 4) Hadoop auth jar
>>>> 5) Zookeeper jar
>>>> 6) Phoenix client jar
>>>> 7) All other needed libraries
>>>>
>>>> For 2-6, list the jars explicitly, not just the path to the containing
>>>> directories.
>>>>
>>>> Hope this helps
>>>> Justin
>>>>
>>>>
>>>> On Thu, Jun 19, 2014 at 8:39 AM, Giuseppe Reina <g....@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi,
>>>>>   unfortunately I'm still having the same problem. I'm using Hadoop 2
>>>>> and HBase 0.98 so I couldn't use Phoenix 3.0.
>>>>> I downloaded the sources from the git repository
>>>>> <https://git-wip-us.apache.org/repos/asf/phoenix.git> and I compiled
>>>>> the jars.
>>>>> I had to change the previous java command because I was receiving the
>>>>> following exception:
>>>>>
>>>>> 14/06/19 14:29:23 WARN util.DynamicClassLoader: Failed to identify the
>>>>>> fs of dir hdfs://zk1.mydomain:8020/apps/hbase/data/lib, ignored
>>>>>> org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot
>>>>>> communicate with client version 4
>>>>>
>>>>>
>>>>> So now I launch Phoenix with:
>>>>>
>>>>> java -cp
>>>>>> ".:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-client-minimal.jar:/usr/lib/hbase/lib/htrace-core-2.04.jar:/usr/lib/hbase/lib/hbase-server.jar:jline-2.11.jar:sqlline-1.1.2.jar:/usr/lib/phoenix/phoenix-5.0.0-SNAPSHOT-without-hbase.jar:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>> -n none -p none  --fastConnect=false --verbose=true
>>>>>> --isolation=TRANSACTION_READ_COMMITTED
>>>>>
>>>>>
>>>>> But I still get:
>>>>>
>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>> issuing: !connect
>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>> none none org.apache.phoenix.jdbc.PhoenixDriver
>>>>>> Connecting to
>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure:/etc/security/keytabs/myuser.headless.keytab
>>>>>>
>>>>>> Java config name: null
>>>>>> Native config name: /etc/krb5.conf
>>>>>> Loaded from native config
>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>> krbtgt/MYREALM@MYREALM
>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>> >>>DEBUG <CCacheInputStream> auth time: Thu Jun 19 10:19:17 UTC 2014
>>>>>> >>>DEBUG <CCacheInputStream> start time: Thu Jun 19 10:19:20 UTC 2014
>>>>>> >>>DEBUG <CCacheInputStream> end time: Fri Jun 20 10:19:20 UTC 2014
>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Thu Jun 26 10:19:17 UTC
>>>>>> 2014
>>>>>>
>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>>
>>>>>> Service ticket not found in the subject
>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>> >>> KdcAccessibility: reset
>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>> number of retries =3, #bytes=737
>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>> Krb5Context setting mySeqNumber to: 583967039
>>>>>>
>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>> Created InitSecContextToken:
>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>  ..n..j0..f......
>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>  ................
>>>>>> [...]
>>>>>> 0260: AC 98 ED E8 03 97 D4 D7   97 DE C6 0B 22 02 6C 10
>>>>>>  ............".l.
>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>> 01 04 00 ff ff ff ff b0 33 06 7a d1 cb e3 4b 83 f9 e1 2e d0 92 6c fc 80 74
>>>>>> 55 f0 14 8a 99 74 da b0 33 4b d2 e1 cc 31 c2 2a 75 2f 01 01 00 00 04 04 04
>>>>>> 04 ]
>>>>>>
>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54
>>>>>> 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02 01
>>>>>> 04 00 ff ff ff ff 3c d5 18 05 ab db d2 68 8b 35 a6 72 8a 9c 80 82 84 c3 9c
>>>>>> 31 b0 df 01 18 e0 6d ea c4 a5 db ff 65 ba 01 c8 71 01 01 00 00 68 64 70 2d
>>>>>> 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>> expiring on Fri Jun 20 10:19:20 UTC 2014
>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Fri Jun 20 10:19:20 UTC
>>>>>> 2014
>>>>>>
>>>>>> Service ticket not found in the subject
>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>> number of retries =3, #bytes=737
>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>> Krb5Context setting mySeqNumber to: 594904455
>>>>>>
>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>> Created InitSecContextToken:
>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>  ..n..j0..f......
>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>  ................
>>>>>> [...]
>>>>>> 0260: B7 1F A6 E4 47 38 2E 21   AE 4E 25 A6 8D 57 19 CD
>>>>>>  ....G8.!.N%..W..
>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>> 01 04 00 ff ff ff ff 09 ed fd 9f ca d8 97 05 3a 2a 64 5c e4 c3 3b e0 71 43
>>>>>> 01 4e ab 16 2f 5a 7b 31 8b 32 9f 20 9a 47 8a d1 70 0a 01 01 00 00 04 04 04
>>>>>> 04 ]
>>>>>>
>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53 54
>>>>>> 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02 01
>>>>>> 04 00 ff ff ff ff e9 97 40 26 a5 de ee 05 d5 ac 03 42 03 c4 f7 66 11 76 f4
>>>>>> 9e 2b e8 f2 a0 f7 d4 62 68 21 2d da 47 b4 92 c5 12 01 01 00 00 68 64 70 2d
>>>>>> 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>> 14/06/19 14:15:44 WARN ipc.RpcClient: Exception encountered while
>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>> 14/06/19 14:15:44 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>
>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>> find any Kerberos tgt)]
>>>>>>  at
>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>> at
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>> at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>>  at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>>> at
>>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>>  at
>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>>
>>>>>> at
>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>  at
>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>> at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>>  at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>>> at
>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>  at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>>> at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>>  at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>>> at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>>  at
>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>> at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>>
>>>>>>  at
>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>> at
>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>> at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>  at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>> at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>  at
>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>> at
>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>  at
>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>> at
>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>  at
>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>> at
>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>  at
>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>> ... 54 more
>>>>>> 14/06/19 14:15:46 WARN ipc.RpcClient: Exception encountered while
>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>> 14/06/19 14:15:46 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>
>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>> find any Kerberos tgt)]
>>>>>>  at
>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>> at
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>> at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>  at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>> at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:784)
>>>>>>  at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1068)
>>>>>> at
>>>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:114)
>>>>>>  at
>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1170)
>>>>>>
>>>>>> at
>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>  at
>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>> at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:246)
>>>>>>  at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:237)
>>>>>> at
>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>>  at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:236)
>>>>>> at
>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:941)
>>>>>>  at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1470)
>>>>>> at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$9.call(ConnectionQueryServicesImpl.java:1436)
>>>>>>  at
>>>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
>>>>>> at
>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1436)
>>>>>>
>>>>>>  at
>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>> at
>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>  at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>> at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>  at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>  at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>> at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>  at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>> at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>  at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>> at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>  at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>  at
>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>> at
>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>  at
>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>> at
>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>  at
>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>> at
>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>  at
>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>> ... 54 more
>>>>>
>>>>>
>>>>>
>>>>> I have my keys in the cache (also renewed with kinit -R). All the
>>>>> other clients work, just Phoenix is not working. I'm probably missing
>>>>> something.
>>>>> Any ideas? Does any of you have any insights?
>>>>>
>>>>>
>>>>> Kind Regards
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Tue, Jun 17, 2014 at 3:54 PM, Giuseppe Reina <g....@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Thank you. I'll try that!
>>>>>>
>>>>>>
>>>>>> On Tue, Jun 17, 2014 at 3:50 PM, anil gupta <an...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi Giuseppe,
>>>>>>>
>>>>>>> The latest nightly builds of Phoenix include the patch for
>>>>>>> https://issues.apache.org/jira/browse/PHOENIX-19. If you pick up
>>>>>>> the latest nightly, it will be easier to connect to a secure cluster.
>>>>>>> You can find the nightly for 3.0 here:
>>>>>>>
>>>>>>> https://builds.apache.org/job/Phoenix-3.0-hadoop1/lastSuccessfulBuild/artifact/
>>>>>>>
>>>>>>> If you are using HBase 0.98, try the Phoenix 4.0 nightly instead. Let me
>>>>>>> know if you need further help.
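>>>>>>>
>>>>>>> As a rough illustration of what PHOENIX-19 enables: the principal and
>>>>>>> keytab can be supplied to the JDBC driver so that it performs the
>>>>>>> Kerberos login itself. The sketch below is only indicative; the hosts,
>>>>>>> principal and keytab path are placeholders, and the exact form of the
>>>>>>> extra URL segments may differ between builds, so check the docs for
>>>>>>> the build you pick up.
>>>>>>>
>>>>>>> import java.sql.Connection;
>>>>>>> import java.sql.DriverManager;
>>>>>>> import java.sql.ResultSet;
>>>>>>> import java.sql.Statement;
>>>>>>>
>>>>>>> public class SecurePhoenixCheck {
>>>>>>>     public static void main(String[] args) throws Exception {
>>>>>>>         // Optional with a JDBC 4 driver jar, but harmless:
>>>>>>>         Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
>>>>>>>         // Placeholder quorum, principal and keytab (replace with your own).
>>>>>>>         String url = "jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain"
>>>>>>>                 + ":2181:/hbase-secure"
>>>>>>>                 + ":myuser@MYREALM:/etc/security/keytabs/myuser.keytab";
>>>>>>>         try (Connection conn = DriverManager.getConnection(url);
>>>>>>>              Statement stmt = conn.createStatement();
>>>>>>>              // Reading SYSTEM.CATALOG is a cheap smoke test once connected.
>>>>>>>              ResultSet rs = stmt.executeQuery(
>>>>>>>                      "SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 1")) {
>>>>>>>             while (rs.next()) {
>>>>>>>                 System.out.println("Connected, saw table: " + rs.getString(1));
>>>>>>>             }
>>>>>>>         }
>>>>>>>     }
>>>>>>> }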
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Anil Gupta
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Jun 17, 2014 at 6:35 AM, Giuseppe Reina <g....@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi all,
>>>>>>>>   I'm trying to make Phoenix work with HBase and Kerberos, but so
>>>>>>>> far I've had no luck. I'm currently using HDP 2.1 on CentOS 6.5 and following
>>>>>>>> this guide as a reference:
>>>>>>>> http://bigdatanoob.blogspot.co.uk/2013/09/connect-phoenix-to-secure-hbase-cluster.html
>>>>>>>> I'm able to use nearly all of the Hadoop services (MapReduce,
>>>>>>>> Zookeeper, HBase, ...) with my user, but not Phoenix (note that I granted my
>>>>>>>> user RWCA permissions in HBase).
>>>>>>>>
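>>>>>>>> For reference, the grant was done from the hbase shell with a command
>>>>>>>> along these lines ('myuser' stands in for the actual account name):
>>>>>>>>
>>>>>>>> hbase> grant 'myuser', 'RWCA'
>>>>>>>>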
>>>>>>>> I don't see any problems with my TGT:
>>>>>>>>
>>>>>>>> [myuser@zk1.mydomain ~]$ klist -fae
>>>>>>>>> Ticket cache: FILE:/tmp/krb5cc_501
>>>>>>>>> Default principal: myuser@MYREALM
>>>>>>>>> Valid starting     Expires            Service principal
>>>>>>>>> 06/17/14 10:58:50  06/18/14 10:58:50  krbtgt/MYREALM@MYREALM
>>>>>>>>>  renew until 06/24/14 10:58:33, Flags: FRIT
>>>>>>>>> Etype (skey, tkt): des3-cbc-sha1, des3-cbc-sha1
>>>>>>>>> Addresses: (none)
>>>>>>>>
>>>>>>>>
>>>>>>>> But when I launch Phoenix's sqlline with krb5 debugging enabled, using the
>>>>>>>> following command:
>>>>>>>>
>>>>>>>>> [myuser@zk1.mydomain ~]$ java -cp
>>>>>>>>> ".:/usr/lib/phoenix/*:/usr/lib/phoenix/lib/*:/usr/lib/hadoop/client/*:/etc/hbase/conf.dist/:/etc/hbase/conf/:/etc/hadoop/conf.dist/:/etc/hbase/conf/:/usr/lib/hbase/*"
>>>>>>>>> -Djavax.net.debug=ssl -Dsun.security.krb5.debug=true
>>>>>>>>> -Djava.security.auth.login.config=/etc/hbase/conf/hbase_client_jaas.conf
>>>>>>>>> -Djava.library.path=/usr/lib/hadoop/lib/native/
>>>>>>>>> -Dlog4j.configuration=file:/usr/lib/phoenix/bin/log4j.properties
>>>>>>>>> sqlline.SqlLine -d org.apache.phoenix.jdbc.PhoenixDriver -u
>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure -n myuser
>>>>>>>>>  --fastConnect=false --verbose=true --isolation=TRANSACTION_READ_COMMITTED
>>>>>>>>
>>>>>>>>
>>>>>>>> I get the following error:
>>>>>>>>
>>>>>>>> Setting property: [isolation, TRANSACTION_READ_COMMITTED]
>>>>>>>>> issuing: !connect
>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure myuser ''
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver
>>>>>>>>> Connecting to
>>>>>>>>> jdbc:phoenix:zk1.mydomain,zk2.mydomain,zk3.mydomain:/hbase-secure
>>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>>> SLF4J: Found binding in
>>>>>>>>> [jar:file:/usr/lib/phoenix/phoenix-4.0.0.2.1.2.0-402-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>> SLF4J: Found binding in
>>>>>>>>> [jar:file:/usr/lib/phoenix/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>> SLF4J: Found binding in
>>>>>>>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
>>>>>>>>> an explanation.
>>>>>>>>> Java config name: null
>>>>>>>>> Native config name: /etc/krb5.conf
>>>>>>>>> Loaded from native config
>>>>>>>>> >>>KinitOptions cache name is /tmp/krb5cc_501
>>>>>>>>> >>>DEBUG <CCacheInputStream>  client principal is myuser@MYREALM
>>>>>>>>> >>>DEBUG <CCacheInputStream> server principal is
>>>>>>>>> krbtgt/MYREALM@MYREALM
>>>>>>>>> >>>DEBUG <CCacheInputStream> key type: 16
>>>>>>>>> >>>DEBUG <CCacheInputStream> auth time: Tue Jun 17 10:58:33 UTC
>>>>>>>>> 2014
>>>>>>>>> >>>DEBUG <CCacheInputStream> start time: Tue Jun 17 10:58:50 UTC
>>>>>>>>> 2014
>>>>>>>>> >>>DEBUG <CCacheInputStream> end time: Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>> >>>DEBUG <CCacheInputStream> renew_till time: Tue Jun 24 10:58:33
>>>>>>>>> UTC 2014
>>>>>>>>> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE;
>>>>>>>>> INITIAL;
>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>> Service ticket not found in the subject
>>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>> >>> KdcAccessibility: reset
>>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>>> number of retries =3, #bytes=737
>>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>> Krb5Context setting mySeqNumber to: 595457406
>>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>>> Created InitSecContextToken:
>>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>>  ..n..j0..f......
>>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>>  ................
>>>>>>>>> [...]
>>>>>>>>> 0260: 7A 66 8D 83 5C 76 84 2E   09 6B E4 7E 3C 6C 7A 3A
>>>>>>>>>  zf..\v...k..<lz:
>>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>> 02 01 04 00 ff ff ff ff 7b 14 41 17 41 6a d6 72 f2 55 a5 2f d1 95 c3 99 30
>>>>>>>>> 8f 00 95 9e 1a 23 b6 4b b5 5d 89 6e f5 b4 e6 5a 50 1d d3 01 01 00 00 04 04
>>>>>>>>> 04 04 ]
>>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>>>> 01 04 00 ff ff ff ff 17 ec 99 7b 96 4e 05 41 26 5e 0b b5 b9 c6 5e c8 52 9b
>>>>>>>>> 14 69 d1 43 7a fa bc 4b 75 fe 49 61 2b 99 52 13 c7 9d 01 01 00 00 68 64 70
>>>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>> Entered Krb5Context.initSecContext with state=STATE_NEW
>>>>>>>>> Found ticket for myuser@MYREALM to go to krbtgt/MYREALM@MYREALM
>>>>>>>>> expiring on Wed Jun 18 10:58:50 UTC 2014
>>>>>>>>> Found ticket for myuser@MYREALM to go to
>>>>>>>>> zookeeper/zk1.mydomain@MYREALM expiring on Wed Jun 18 10:58:50
>>>>>>>>> UTC 2014
>>>>>>>>> Service ticket not found in the subject
>>>>>>>>> >>> Credentials acquireServiceCreds: same realm
>>>>>>>>> default etypes for default_tgs_enctypes: 16 1 3.
>>>>>>>>> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>> >>> KrbKdcReq send: kdc=kerberos.mydomain UDP:88, timeout=30000,
>>>>>>>>> number of retries =3, #bytes=737
>>>>>>>>> >>> KDCCommunication: kdc=kerberos.mydomain UDP:88,
>>>>>>>>> timeout=30000,Attempt =1, #bytes=737
>>>>>>>>> >>> KrbKdcReq send: #bytes read=714
>>>>>>>>> >>> KdcAccessibility: remove kerberos.mydomain
>>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>> >>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>>>>>>>> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>>>>>>>> Krb5Context setting mySeqNumber to: 284225265
>>>>>>>>> Krb5Context setting peerSeqNumber to: 0
>>>>>>>>> Created InitSecContextToken:
>>>>>>>>> 0000: 01 00 6E 82 02 6A 30 82   02 66 A0 03 02 01 05 A1
>>>>>>>>>  ..n..j0..f......
>>>>>>>>> 0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 82 01
>>>>>>>>>  ................
>>>>>>>>> [...]
>>>>>>>>> 0260: 76 30 32 5D 70 32 BA 6F   1F E0 C7 8F 9C B4 24 73
>>>>>>>>>  v02]p2.o......$s
>>>>>>>>> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02
>>>>>>>>> 02 01 04 00 ff ff ff ff 24 63 c2 63 f1 09 e4 a1 d9 8e 56 77 35 a4 c3 76 11
>>>>>>>>> 77 d7 30 a9 6b 15 4d ee d7 2d 5c 80 e8 28 1d 2a 75 ac 1c 01 01 00 00 04 04
>>>>>>>>> 04 04 ]
>>>>>>>>> Krb5Context.unwrap: data=[01 01 00 00 ]
>>>>>>>>> Krb5Context.wrap: data=[01 01 00 00 68 64 70 2d 75 73 65 72 40 53
>>>>>>>>> 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c ]
>>>>>>>>> Krb5Context.wrap: token=[60 57 06 09 2a 86 48 86 f7 12 01 02 02 02
>>>>>>>>> 01 04 00 ff ff ff ff 55 15 d9 ab fb 0f ac 4e a1 1f 2e 0b 89 ca 61 a0 5a d3
>>>>>>>>> 4e f6 af 30 4f 6d 8f ad 2d 0c 9d b7 c4 be a7 b2 ac b5 01 01 00 00 68 64 70
>>>>>>>>> 2d 75 73 65 72 40 53 54 2d 50 4f 43 31 2e 53 50 2e 4c 4f 43 41 4c 03 03 03 ]
>>>>>>>>> 14/06/17 12:25:26 WARN ipc.RpcClient: Exception encountered while
>>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>>> 14/06/17 12:25:26 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>>> find any Kerberos tgt)]
>>>>>>>>>  at
>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>>  at
>>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at
>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>  at
>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>  at
>>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>>> at
>>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>>  at
>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>> at
>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>>  at
>>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>> at
>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>>  at
>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>> at
>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>>  ... 47 more
>>>>>>>>> 14/06/17 12:25:28 WARN ipc.RpcClient: Exception encountered while
>>>>>>>>> connecting to the server : javax.security.sasl.SaslException: GSS initiate
>>>>>>>>> failed [Caused by GSSException: No valid credentials provided (Mechanism
>>>>>>>>> level: Failed to find any Kerberos tgt)]
>>>>>>>>> 14/06/17 12:25:28 FATAL ipc.RpcClient: SASL authentication failed.
>>>>>>>>> The most likely cause is missing or invalid credentials. Consider 'kinit'.
>>>>>>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>>>>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>>>>>>> find any Kerberos tgt)]
>>>>>>>>>  at
>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
>>>>>>>>>  at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
>>>>>>>>> at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:40216)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1644)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1553)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1579)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1633)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1841)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getHTableDescriptor(ConnectionManager.java:2559)
>>>>>>>>> at
>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:396)
>>>>>>>>>  at
>>>>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:401)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:773)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
>>>>>>>>> at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
>>>>>>>>>  at
>>>>>>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
>>>>>>>>> at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
>>>>>>>>>  at
>>>>>>>>> sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
>>>>>>>>> at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
>>>>>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at
>>>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>  at
>>>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>  at
>>>>>>>>> sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
>>>>>>>>> at sqlline.SqlLine.dispatch(SqlLine.java:817)
>>>>>>>>> at sqlline.SqlLine.initArgs(SqlLine.java:633)
>>>>>>>>>  at sqlline.SqlLine.begin(SqlLine.java:680)
>>>>>>>>> at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
>>>>>>>>> at sqlline.SqlLine.main(SqlLine.java:424)
>>>>>>>>> Caused by: GSSException: No valid credentials provided (Mechanism
>>>>>>>>> level: Failed to find any Kerberos tgt)
>>>>>>>>> at
>>>>>>>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>>>>>>>  at
>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>>>>>>> at
>>>>>>>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>>>>>>>  at
>>>>>>>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>>>>>>> at
>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>>>>>>>  at
>>>>>>>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>>>>>>> at
>>>>>>>>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>>>>>>>  ... 47 more
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Can any of you help me with this problem?
>>>>>>>>
>>>>>>>> Kind Regards
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Thanks & Regards,
>>>>>>> Anil Gupta
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Thanks & Regards,
>> Anil Gupta
>>
>
>
>
> --
> Thanks & Regards,
> Anil Gupta
>