Posted to user@hive.apache.org by Maria <li...@126.com> on 2016/07/02 09:52:51 UTC

How to access linux kerberosed hive from windows eclipse workspace?

Hi, all:
     Recently, I attempted to access a Kerberized Hadoop cluster by launching Java applications from a Windows workstation. I have configured Kerberos on my Windows 7 machine and can successfully access HDFS (port 50070). But when I use JDBC from Windows to connect to the remote HiveServer2, this error occurred:
java.sql.SQLException:could not open client transport with JDBC Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS initiate failed
     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
     at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
     at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
     at java.sql.DriverManager.getConnection(Unknown Source)
     at java.sql.DriverManager.getConnection(Unknown Source)
     at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)
Caused by: org.apache.thrift.transport.TTransportException:GSS initiate failed
     at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
     at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
     at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Unknown Source)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
     at  org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
... 5 more 
------------------------------------------------------------------------------
Below is my test code:

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosTest {
    public static void main(String[] args) throws Exception {
        String principal = "hive/hm@HADOOM.COM";
        String keytab = "E:\\Program Files (x86)\\java\\jre7\\lib\\security\\hive.keytab";
        String url = "jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM";

        // Load the cluster configuration files from the working directory.
        Configuration conf = new Configuration();
        conf.addResource(new File("hdfs-site.xml").toURI().toURL());
        conf.addResource(new File("core-site.xml").toURI().toURL());
        conf.addResource(new File("yarn-site.xml").toURI().toURL());
        conf.addResource(new File("hive-site.xml").toURI().toURL());

        // Log in to Kerberos from the keytab before opening the JDBC connection.
        conf.set("hadoop.security.authentication", "Kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(principal, keytab);

        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(url);

        Statement stmt = conn.createStatement();
        String sql = "select * from testkerberos";
        ResultSet rs = stmt.executeQuery(sql);
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
    }
}
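A note on the Windows client setup (not stated in the thread, but visible in the debug output below): the Oracle JRE looks for the Kerberos configuration at <java-home>\lib\security\krb5.conf by default, and the location, along with Kerberos debug logging, can also be set explicitly via VM arguments (paths illustrative, matching the keytab location used above):

```
-Djava.security.krb5.conf=E:\Program Files (x86)\java\jre7\lib\security\krb5.conf
-Dsun.security.krb5.debug=true
```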

Has anyone had the same problem, or does anyone know how to solve it?

Thanks in advance.

Maria.

Re: Re:Re: Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Aviral Agarwal <av...@gmail.com>.
Glad to help :)
On 5 Jul 2016 8:26 a.m., "Vivek Shrivastava" <vi...@gmail.com>
wrote:

> Good to know Maria.
>
> On Mon, Jul 4, 2016 at 10:46 PM, Maria <li...@126.com> wrote:
>
>>
>> I did it! "KrbException: Clock skew too great (37) - PROCESS_TGS" means
>> my Windows clock was not synchronized with the Kerberos server clock.
>> After I synchronized the clocks between Windows and the Linux Kerberos
>> server, everything works well.
>>
>> I am so grateful to you two. (^_^)
>>
>> Maria.
>>
>> At 2016-07-05 09:59:04, "Maria" <li...@126.com> wrote:
>> >
>> >Yup, yesterday I started to realize that the renewal is a principal-level
>> setting. I have fixed the renew time in the KDC's kdc.conf. Doing as Aviral
>> said, I enabled Kerberos logging with
>> >    "-Dsun.security.krb5.debug=true", and more error info was printed out:
>>
>> >------------------------------------------------------------------------------------------------
>> >Java config name: E:\Program Files (x86)\Java\jre7\lib\security\krb5.conf
>> >Loaded from Java config
>> >Java config name: E:\Program Files (x86)\Java\jre7\lib\security\krb5.conf
>> >Loaded from Java config
>> >>>> KdcAccessibility: reset
>> >>>> KdcAccessibility: reset
>> >>>> KeyTabInputStream, readName(): HADOOP.COM
>> >>>> KeyTabInputStream, readName(): hive
>> >>>> KeyTabInputStream, readName(): hm
>> >>>> KeyTab: load() entry length: 69; type: 18
>> >>>> KeyTabInputStream, readName(): HADOOP.COM
>> >>>> KeyTabInputStream, readName(): hive
>> >>>> KeyTabInputStream, readName(): hm
>> >>>> KeyTab: load() entry length: 53; type: 17
>> >>>> KeyTabInputStream, readName(): HADOOP.COM
>> >>>> KeyTabInputStream, readName(): hive
>> >>>> KeyTabInputStream, readName(): hm
>> >>>> KeyTab: load() entry length: 61; type: 16
>> >>>> KeyTabInputStream, readName(): HADOOP.COM
>> >>>> KeyTabInputStream, readName(): hive
>> >>>> KeyTabInputStream, readName(): hm
>> >>>> KeyTab: load() entry length: 53; type: 23
>> >>>> KeyTabInputStream, readName(): HADOOP.COM
>> >>>> KeyTabInputStream, readName(): hive
>> >>>> KeyTabInputStream, readName(): hm
>> >>>> KeyTab: load() entry length: 45; type: 8
>> >>>> KeyTabInputStream, readName(): HADOOP.COM
>> >>>> KeyTabInputStream, readName(): hive
>> >>>> KeyTabInputStream, readName(): hm
>> >>>> KeyTab: load() entry length: 45; type: 3
>> >Added key: 3version: 1
>> >Found unsupported keytype (8) for hive/hm@HADOOP.COM
>> >Added key: 23version: 1
>> >Added key: 16version: 1
>> >Added key: 17version: 1
>> >Found unsupported keytype (18) for hive/hm@HADOOP.COM
>> >Ordering keys wrt default_tkt_enctypes list
>> >Using builtin default etypes for default_tkt_enctypes
>> >default etypes for default_tkt_enctypes: 17 16 23 1 3.
>> >Added key: 3version: 1
>> >Found unsupported keytype (8) for hive/hm@HADOOP.COM
>> >Added key: 23version: 1
>> >Added key: 16version: 1
>> >Added key: 17version: 1
>> >Found unsupported keytype (18) for hive/hm@HADOOP.COM
>> >Ordering keys wrt default_tkt_enctypes list
>> >Using builtin default etypes for default_tkt_enctypes
>> >default etypes for default_tkt_enctypes: 17 16 23 1 3.
>> >Using builtin default etypes for default_tkt_enctypes
>> >default etypes for default_tkt_enctypes: 17 16 23 1 3.
>> >>>> KrbAsReq creating message
>> >>>> KrbKdcReq send: kdc=hm UDP:88, timeout=30000, number of retries =3,
>> #bytes=145
>> >>>> KDCCommunication: kdc=hm UDP:88, timeout=30000,Attempt =1, #bytes=145
>> >>>> KrbKdcReq send: #bytes read=598
>> >>>> KdcAccessibility: remove hm
>> >Added key: 3version: 1
>> >Found unsupported keytype (8) for hive/hm@HADOOP.COM
>> >Added key: 23version: 1
>> >Added key: 16version: 1
>> >Added key: 17version: 1
>> >Found unsupported keytype (18) for hive/hm@HADOOP.COM
>> >Ordering keys wrt default_tkt_enctypes list
>> >Using builtin default etypes for default_tkt_enctypes
>> >default etypes for default_tkt_enctypes: 17 16 23 1 3.
>> >>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>> >>>> KrbAsRep cons in KrbAsReq.getReply hive/hm
>> >Added key: 3version: 1
>> >Found unsupported keytype (8) for hive/hm@HADOOP.COM
>> >Added key: 23version: 1
>> >Added key: 16version: 1
>> >Added key: 17version: 1
>> >Found unsupported keytype (18) for hive/hm@HADOOP.COM
>> >Ordering keys wrt default_tkt_enctypes list
>> >Using builtin default etypes for default_tkt_enctypes
>> >default etypes for default_tkt_enctypes: 17 16 23 1 3.
>> >start connect hiveserver..
>> >Found ticket for hive/hm@HADOOP.COM to go to krbtgt/
>> HADOOP.COM@HADOOP.COM expiring on Wed Jul 06 09:29:15 CST 2016
>> >Entered Krb5Context.initSecContext with state=STATE_NEW
>> >Found ticket for hive/hm@HADOOP.COM to go to krbtgt/
>> HADOOP.COM@HADOOP.COM expiring on Wed Jul 06 09:29:15 CST 2016
>> >Service ticket not found in the subject
>> >>>> Credentials acquireServiceCreds: same realm
>> >Using builtin default etypes for default_tgs_enctypes
>> >default etypes for default_tgs_enctypes: 17 16 23 1 3.
>> >>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>> >>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>> >>>> KrbKdcReq send: kdc=hm UDP:88, timeout=30000, number of retries =3,
>> #bytes=619
>> >>>> KDCCommunication: kdc=hm UDP:88, timeout=30000,Attempt =1, #bytes=619
>> >>>> KrbKdcReq send: #bytes read=116
>> >>>> KdcAccessibility: remove hm
>> >>>> KDCRep: init() encoding tag is 126 req type is 13
>> >>>>KRBError:
>> >        cTime is Wed Jul 04 22:58:32 CST 1984 457801112000
>> >        sTime is Tue Jul 05 09:29:15 CST 2016 1467682155000
>> >        suSec is 944361
>> >        error code is 37
>> >        error Message is Clock skew too great
>> >        realm is HADOOP.COM
>> >        sname is hive/hm
>> >        msgType is 30
>> >KrbException: Clock skew too great (37) - PROCESS_TGS
>> >       at sun.security.krb5.KrbTgsRep.<init>(Unknown Source)
>> >       at sun.security.krb5.KrbTgsReq.getReply(Unknown Source)
>> >       at sun.security.krb5.KrbTgsReq.sendAndGetCreds(Unknown Source)
>> >       at
>> sun.security.krb5.internal.CredentialsUtil.serviceCreds(Unknown Source)
>> >       at
>> sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(Unknown
>> Source)
>> >       at sun.security.krb5.Credentials.acquireServiceCreds(Unknown
>> Source)
>> >       at sun.security.jgss.krb5.Krb5Context.initSecContext(Unknown
>> Source)
>> >       at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source)
>> >       at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source)
>> >       at
>> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(Unknown
>> Source)
>> >       at
>> org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
>> >       at
>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>> >       at
>> org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>> >       at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>> >       at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>> >       at java.security.AccessController.doPrivileged(Native Method)
>> >       at javax.security.auth.Subject.doAs(Unknown Source)
>> >       at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>> >       at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>> >       at
>> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>> >       at
>> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>> >       at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>> >       at java.sql.DriverManager.getConnection(Unknown Source)
>> >       at java.sql.DriverManager.getConnection(Unknown Source)
>> >       at
>> org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:50)
>> >Caused by: KrbException: Identifier doesn't match expected value (906)
>> >       at sun.security.krb5.internal.KDCRep.init(Unknown Source)
>> >       at sun.security.krb5.internal.TGSRep.init(Unknown Source)
>> >       at sun.security.krb5.internal.TGSRep.<init>(Unknown Source)
>> >       ... 25 more
>> >java.sql.SQLException: Could not open client transport with JDBC Uri:
>> jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOP.COM: GSS initiate
>> failed
>> >       at
>> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
>> >       at
>> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>> >       at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>> >       at java.sql.DriverManager.getConnection(Unknown Source)
>> >       at java.sql.DriverManager.getConnection(Unknown Source)
>> >       at
>> org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:50)
>> >Caused by: org.apache.thrift.transport.TTransportException: GSS initiate
>> failed
>> >       at
>> org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>> >       at
>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
>> >       at
>> org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>> >       at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>> >       at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>> >       at java.security.AccessController.doPrivileged(Native Method)
>> >       at javax.security.auth.Subject.doAs(Unknown Source)
>> >       at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>> >       at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>> >       at
>> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>> >       ... 5 more
>> >
>> >It seems as if the Kerberos configuration is incorrect...
>> >
>> >
>> >At 2016-07-04 21:26:53, "Vivek Shrivastava" <vi...@gmail.com>
>> wrote:
>> >
>> >
>> >The renewal lifetime at the client krb5.conf level does not make any
>> difference. The renewal time period is defined at the KDC in kdc.conf; the
>> client cannot override it. Renewal is also a property set at the principal
>> level, and both settings (renewal_lifetime, +renewal) dictate whether a
>> ticket can be renewed. I don't think your problem has anything to do with
>> that.
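For reference, the two layers described here live in different files. A hypothetical sketch (realm name taken from the thread; values illustrative):

```ini
; Client side: /etc/krb5.conf (krb5.ini/krb5.conf on Windows).
; This is only a request; the KDC may grant less.
[libdefaults]
    renew_lifetime = 7d

; KDC side: kdc.conf -- the actual ceiling. The principal itself must
; also allow renewal for "kinit -R" to succeed.
[realms]
    HADOOP.COM = {
        max_renewable_life = 7d
    }
```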
>> >
>> >
>> >Something basic seems to be missing in your environment. I would
>> run the same piece of code in the Unix environment and ensure that there is
>> no error. Enabling Kerberos debug logging as suggested in the previous
>> post will also help you compare the sequence of execution.
>> >
>> >
>> >On Mon, Jul 4, 2016 at 7:52 AM, Aviral Agarwal <av...@gmail.com>
>> wrote:
>> >
>> >
>> >Hi,
>> >Could you enable kerberos logs with
>> >
>> >    -Dsun.security.krb5.debug=true
>> >
>> >
>> >and paste the output ?
>> >
>> >
>> >
>> >
>> >On Mon, Jul 4, 2016 at 3:47 PM, Maria <li...@126.com> wrote:
>> >
>> >The question "kinit: Ticket expired while renewing credentials" has
>> been solved. I can successfully execute "kinit -R",
>> >
>> >but the error “java.lang.RuntimeException:
>> org.apache.thrift.transport.TTransportException: Peer indicated failure:
>> GSS initiate failed”
>> >
>> >is still there...
>> >
>> >
>> >
>> >
>> >
>> >At 2016-07-04 14:39:04, "Maria" <li...@126.com> wrote:
>> >
>> >>I saw a mail named "HCatalog Security". His or her problem was similar
>> to mine, and the reply answer was:
>> >
>> >>"This issue goes away after doing a kinit -R".
>> >
>> >>
>> >
>> >>So I did the same operation, but it failed:
>> >
>> >>kinit: Ticket expired while renewing credentials
>> >
>> >>
>> >
>> >>But in my /etc/krb5.conf, I have configured this item:
>> >
>> >>renew_lifetime=7d
>> >
>> >>
>> >
>> >>So, can anybody give me some suggestions, please? Thank you.
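For context on the "kinit -R" failure discussed here: a TGT can only be renewed if it was requested as renewable in the first place, and only within the renewable lifetime the KDC granted. A sketch with the MIT Kerberos command-line tools (principal and keytab names taken from the thread; flags are standard kinit options):

```shell
# Request a TGT with a 7-day renewable lifetime (honored only if the KDC
# and the principal both allow renewal).
kinit -r 7d -k -t hive.keytab hive/hm@HADOOP.COM

# Later, renew it. This fails with "Ticket expired while renewing
# credentials" if the ticket was not renewable or the renewable
# lifetime has already passed.
kinit -R
klist
```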
>> >
>> >>
>> >
>> >>At 2016-07-04 11:32:30, "Maria" <li...@126.com> wrote:
>> >
>> >>>
>> >
>> >>>
>> >
>> >>>And I can successfully access hiveserver2 from beeline.
>> >
>> >>>
>> >
>> >>>
>> >
>> >>>I was so confused by this error"Peer indicated failure: GSS initiate
>> failed".
>> >
>> >>>
>> >
>> >>> Can anybody please help me? Any reply will be much appreciated.
>> >
>> >>>
>> >
>> >>>At 2016-07-04 11:26:53, "Maria" <li...@126.com> wrote:
>> >
>> >>>>Yup,my  hiveserver2 log errors are:
>> >
>> >>>>
>> >
>> >>>>ERROR [Hiveserver2-Handler-Pool:
>> Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) -
>> error occurred during processing of message.
>> >
>> >>>>java.lang.RuntimeException:
>> org.apache.thrift.transport.TTransportException: Peer indicated failure:
>> GSS initiate failed
>> >
>> >>>>    at
>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>> >
>> >>>>    at
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
>> >
>> >>>>    at
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
>> >
>> >>>>    at java.security.AccessController.doPrivileged(Native Method)
>> >
>> >>>>    at javax.security.auth.Subject.doAs(Subject.java:356)
>> >
>> >>>>    at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
>> >
>> >>>>    at
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
>> >
>> >>>>    at
>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
>> >
>> >>>>    at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> >
>> >>>>    at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> >
>> >>>>    at java.lang.Thread.run(Thread.java:745)
>> >
>> >>>>Caused by: org.apache.thrift.transport.TTransportException:Peer
>> indicated failure: GSS initiate failed
>> >
>> >>>>    at
>> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
>> >
>> >>>>    at
>> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
>> >
>> >>>>    at
>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>> >
>> >>>>    at
>> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>> >
>> >>>>    at
>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>> >
>> >>>> ... 10 more
>> >
>> >>>>================================================
>> >
>> >>>>So it seems the Windows Hive JDBC client can communicate with the
>> hiveserver2, can't it?
>> >
>> >>>>
>> >
>> >>>>I checked everything I could:
>> >
>> >>>>(1) On the hiveserver2 node, I executed the command "klist"; the results are:
>> >
>> >>>>Ticket cache: FILE:/tmp/krb5cc_0
>> >
>> >>>>Default principal: hive/hm@HADOOP.COM
>> >
>> >>>>
>> >
>> >>>>Valid starting    Expires                     Service principal
>> >
>> >>>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/
>> HADOOP.COM@HADOOP.COM
>> >
>> >>>>                 renew until 07/04/16 10:28:14
>> >
>> >>>>(2) In a Windows DOS cmd, I executed the command "klist"; the results are:
>> >
>> >>>>Ticket cache:API: 1
>> >
>> >>>>Default principal: hive/hm@HADOOP.COM
>> >
>> >>>>
>> >
>> >>>>Valid starting    Expires                     Service principal
>> >
>> >>>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/
>> HADOOP.COM@HADOOP.COM
>> >
>> >>>>                 renew until 07/04/16 10:24:32
>> >
>> >>>>
>> >
>> >>>> Is there anything else I have to add or set for hiveserver2?
>> >
>> >>>>
>> >
>> >>>>Thanks in advance.
>> >
>> >>>>
>> >
>> >>>>
>> >
>> >>>>Maria.
>> >
>> >>>>
>> >
>> >>>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com>
>> wrote:
>> >
>> >>>>
>> >
>> >>>>
>> >
>> >>>>Please look at the hiveserver2 log, it will have better error
>> information. You can paste error from the logs if you need help.
>> >
>> >>>>
>> >
>> >>>>
>> >
>> >>>>Regards,
>> >
>> >>>>
>> >
>> >>>>
>> >
>> >>>>Vivek
>> >
>> >>>>
>> >
>> >>>>
>> >
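The root cause Maria found can be checked mechanically: the KRBError in the debug output above carries both the client time (cTime) and the KDC time (sTime), and MIT Kerberos tolerates at most 300 seconds of skew by default (the clockskew setting, not something configured in this thread). A small self-contained sketch:

```java
import java.time.Duration;
import java.time.Instant;

public class ClockSkewCheck {
    // Default maximum tolerated clock skew in MIT Kerberos: 300 seconds.
    // Beyond it the KDC returns error 37, "Clock skew too great".
    static final long MAX_SKEW_SECONDS = 300;

    static boolean withinSkew(Instant clientTime, Instant kdcTime) {
        long skewSeconds = Math.abs(Duration.between(clientTime, kdcTime).getSeconds());
        return skewSeconds <= MAX_SKEW_SECONDS;
    }

    public static void main(String[] args) {
        // cTime and sTime from the KRBError above (CST = UTC+8).
        Instant client = Instant.parse("1984-07-04T14:58:32Z"); // cTime (client clock)
        Instant kdc = Instant.parse("2016-07-05T01:29:15Z");    // sTime (KDC clock)
        System.out.println(withinSkew(client, kdc)); // prints false: skew is decades, not seconds
    }
}
```

Once the Windows clock was synchronized with the KDC so that the skew fell inside this window, the TGS exchange succeeded.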

> >       at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> >       at
> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
> >       ... 5 more
> >
> >As if kerberos configuration is incorrect ....
> >
> >
> >At 2016-07-04 21:26:53, "Vivek Shrivastava" <vi...@gmail.com>
> wrote:
> >
> >
> >The renewal lifetime at client krb5.conf level does make any difference.
> The renewal time period is defined at  kdc in kdc.conf. Client can not
> override it. The renewal is also a property set at the principal level,
> both the settings ( renewal_lifetime, +renewal ) dictate if a ticket can be
> renewed. I don't think your problem has anything to do with that.
> >
> >
> >Seems something basic is missing in your environment. I would probably,
> run the same piece of code in the unix environment and ensure that there is
> no error. Enabling Kerberos debugging logging as suggested in the previous
> post will also help you compare the sequence of execution.
> >
> >
> >On Mon, Jul 4, 2016 at 7:52 AM, Aviral Agarwal <av...@gmail.com>
> wrote:
> >
> >
> >Hi,
> >Could you enable kerberos logs with
> >
> >    -Dsun.security.krb5.debug=true
> >
> >
> >and paste the output ?
> >
> >
> >
> >
> >On Mon, Jul 4, 2016 at 3:47 PM, Maria <li...@126.com> wrote:
> >
> >The qestion "kinit: Ticket expired while renewing credentials" has
> been solved. I can successfully execute "kinit -R",
> >
> >but the error “java.lang.RuntimeException:
> org.apache.thrift.transport.TTransportException: Peer indicated failure:
> GSS initiate failed”
> >
> >is still there..
> >
> >
> >
> >
> >
> >At 2016-07-04 14:39:04, "Maria" <li...@126.com> wrote:
> >
> >>I saw a  mail named "HCatalog Security",His or her problem was similar
> to mine,and the reply answer were:
> >
> >>"This issue goes away after doing a kinit -R".
> >
> >>
> >
> >>So I did the same operation.while it is failed:
> >
> >>kinit: Ticket expired while renewing credentials
> >
> >>
> >
> >>But in my /etc/krb5.conf, I have configed this item:
> >
> >>renew_lifetime=7d
> >
> >>
> >
> >>So, Can anybody give me some suggestions, please? Thankyou.
> >
> >>
> >
> >>At 2016-07-04 11:32:30, "Maria" <li...@126.com> wrote:
> >
> >>>
> >
> >>>
> >
> >>>And  I can suucessfully access hiveserver2 from beeline.
> >
> >>>
> >
> >>>
> >
> >>>I was so confused by this error"Peer indicated failure: GSS initiate
> failed".
> >
> >>>
> >
> >>> Can you anybody please help me? Any reply will be much appreciated.
> >
> >>>
> >
> >>>At 2016-07-04 11:26:53, "Maria" <li...@126.com> wrote:
> >
> >>>>Yup,my  hiveserver2 log errors are:
> >
> >>>>
> >
> >>>>ERROR [Hiveserver2-Handler-Pool:
> Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) -
> error occurred during processing of message.
> >
> >>>>java.lang.RuntimeException:
> org.apache.thrift.transport.TTransportException: Peer indicated failure:
> GSS initiate failed
> >
> >>>>    at
> org.apache.thrift.transport.TSaslServerTransport$FactorygetTransport(TSaslServerTransport.java:219)
> >
> >>>>    at
> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
> >
> >>>>    at
> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
> >
> >>>>    at java.security.AccessController.doPrivileged(Native Method)
> >
> >>>>    at javax.security.auth.Subject.doAs(Subject.java:356)
> >
> >>>>    at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
> >
> >>>>    at
> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
> >
> >>>>    at
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
> >
> >>>>    at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >
> >>>>    at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >
> >>>>    at java.lang.Thread.run(Thread.java:745)
> >
> >>>>Caused by: org.apache.thrift.transport.TTransportException:Peer
> indicated failure: GSS initiate failed
> >
> >>>>    at
> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
> >
> >>>>    at
> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
> >
> >>>>    at
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> >
> >>>>    at
> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
> >
> >>>>    at
> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
> >
> >>>> ... 10 more
> >
> >>>>================================================
> >
> >>>>As if the windows  hive JDBC client can communicate with the
> hiveserver2,isn't it?
> >
> >>>>
> >
> >>>>while I checked everything I can :
> >
> >>>>(1)in hiveserver2 node, I execute command "klist",the results are:
> >
> >>>>Ticket cache: FILE:/tmp/krb5cc_0
> >
> >>>>Default principal: hive/hm@HADOOP.COM
> >
> >>>>
> >
> >>>>Valid starting    Expires                     Service principal
> >
> >>>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/
> HADOOP.COM@HADOOP.COM
> >
> >>>>                 renew until 07/04/16 10:28:14
> >
> >>>>(2)in windows dos cmd,I execute command "klist",the results are:
> >
> >>>>Ticket cache:API: 1
> >
> >>>>Default principal: hive/hm@HADOOP.COM
> >
> >>>>
> >
> >>>>Valid starting    Expires                     Service principal
> >
> >>>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/
> HADOOP.COM@HADOOP.COM
> >
> >>>>                 renew until 07/04/16 10:24:32
> >
> >>>>
> >
> >>>> Is there any thing else I have to add or set for hiveserver2?
> >
> >>>>
> >
> >>>>Thanks in advance.
> >
> >>>>
> >
> >>>>
> >
> >>>>Maria.
> >
> >>>>
> >
> >>>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com>
> wrote:
> >
> >>>>
> >
> >>>>
> >
> >>>>Please look at the hiveserver2 log, it will have better error
> information. You can paste error from the logs if you need help.
> >
> >>>>
> >
> >>>>
> >
> >>>>Regards,
> >
> >>>>
> >
> >>>>
> >
> >>>>Vivek
> >
> >>>>
> >
> >>>>
> >
> >>>>On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>Hi,all:
> >
> >>>>
> >
> >>>>     recently,I  attempted to access Kerberized hadoop cluster by
> launching JAVA applications from Windows workstations. And I hava
> configured kerberos in my windows7, and can successfully access hdfs50070.
> But when I launch JDBC from windows to connection remote hiveserver,errors
> accured:
> >
> >>>>
> >
> >>>>java.sql.SQLException:could not open client transport with JDBC
> Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS
> initiate failed
> >
> >>>>
> >
> >>>>     at
> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
> >
> >>>>
> >
> >>>>     at
> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
> >
> >>>>
> >
> >>>>     at org.apache.hive.jdbc.HiveDriver.connection(HiveDriver.java:105)
> >
> >>>>
> >
> >>>>     at java.sql.DriverManager.getConnection(Unknown Source)
> >
> >>>>
> >
> >>>>     at java.sql.DriverManager.getConnection(Unknown Source)
> >
> >>>>
> >
> >>>>     at
> org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)
> >
> >>>>
> >
> >>>>Caused by: org.apache.thrift.transport.TTransportException:GSS
> initiate failed
> >
> >>>>
> >
> >>>>     at
> org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
> >
> >>>>
> >
> >>>>     at
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
> >
> >>>>
> >
> >>>>     at
> org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> >
> >>>>
> >
> >>>>     at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
> >
> >>>>
> >
> >>>>     at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
> >
> >>>>
> >
> >>>>     at java.security.AccessController.doPrivileged(Native Method)
> >
> >>>>
> >
> >>>>     at javax.security.auth.Subject.doAs(Unknow source)
> >
> >>>>
> >
> >>>>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> >
> >>>>
> >
> >>>>     at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> >
> >>>>
> >
> >>>>     at
> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
> >
> >>>>
> >
> >>>>... 5 more
> >
> >>>>
> >
>
> >>>>------------------------------------------------------------------------------
> >
> >>>>
> >
> >>>>below are my test codes:
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>public static void main(String[] args) {
> >
> >>>>
> >
> >>>>    String principal = "hive/hm@HADOOM.COM";
> >
> >>>>
> >
> >>>>    String keytab = "E:\\Program Files
> (x86)\\java\\jre7\\lib\\security\\hive.keytab";
> >
> >>>>
> >
> >>>>    String url = "jdbc:hive2://hm:10000/default;principal=hive/
> hm@HADOOM.COM";
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>    conf.addResource(new File("hdfs-site.xml").toURI().toURL());
> >
> >>>>
> >
> >>>>    conf.addResource(new File("core-site.xml").toURI().toURL());
> >
> >>>>
> >
> >>>>    conf.addResource(new File("yarn-site.xml").toURI().toURL());
> >
> >>>>
> >
> >>>>    conf.addResource(new File("hive-site.xml").toURI().toURL());
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>    conf.set("hadoop.security.authentication", "Kerberos");
> >
> >>>>
> >
> >>>>    UserGroupInformation.setConfiguration(conf);
> >
> >>>>
> >
> >>>>    UserGroupInformation.loginUserFromKeytab(principal, keytab);
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>    Class.forName("org.apache.hive.,jdbc.HiveDriver");
> >
> >>>>
> >
> >>>>    Connection conn =DriverManager.getConnection(url);
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>    Statement stmt = conn.createStatement();
> >
> >>>>
> >
> >>>>    String sql = "select * from testkerberos";
> >
> >>>>
> >
> >>>>    ResultSet rs = stmt.executeQuery(sql);
> >
> >>>>
> >
> >>>>    while (rs.next()) {
> >
> >>>>
> >
> >>>>       system.out.println(rs.getString(1));
> >
> >>>>
> >
> >>>>    }
> >
> >>>>
> >
> >>>>}
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>Does anyone had the same problem? Or know how to solve it ?
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>Thanks in advance.
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >>>>Maria.
> >
> >>>>
> >
> >>>>
> >
> >>>>
> >
> >
> >
> >
> >
>

Re:Re:Re: Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Maria <li...@126.com>.
That was it! "KrbException: Clock skew too great (37) - PROCESS_TGS" meant my Windows clock was not synchronized with the Kerberos server's clock.
After synchronizing the Windows clock with the Linux Kerberos server, everything works.

I am so grateful to you both. (^_^)

Maria.
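For anyone else hitting the same generic "GSS initiate failed": below is a minimal sketch (a hypothetical helper, not part of my test code) that walks the exception's cause chain and surfaces the clock-skew cause, which otherwise only shows up with Kerberos debug logging enabled.

```java
// Hypothetical helper (illustrative only): walk a Throwable's cause chain
// and flag the Kerberos clock-skew error hiding behind "GSS initiate failed".
public class KrbErrorHints {
    public static String hint(Throwable t) {
        for (Throwable cause = t; cause != null; cause = cause.getCause()) {
            String msg = String.valueOf(cause.getMessage());
            if (msg.contains("Clock skew too great")) {
                // Kerberos rejects requests when the client and KDC clocks
                // differ by more than the KDC's tolerance (commonly 5 minutes).
                return "Clock skew: synchronize this machine's clock with the KDC.";
            }
        }
        return "No clock-skew signature found; check the hiveserver2 log.";
    }

    public static void main(String[] args) {
        // Reproduce the shape of the exception from this thread.
        Exception top = new java.sql.SQLException(
                "Could not open client transport: GSS initiate failed");
        top.initCause(new RuntimeException(
                "KrbException: Clock skew too great (37) - PROCESS_TGS"));
        System.out.println(hint(top));
    }
}
```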

At 2016-07-05 09:59:04, "Maria" <li...@126.com> wrote:
>
>Yup, yesterday I started to realize that renewal is a principal-level setting, and I have fixed the renew time in the KDC's kdc.conf. As Aviral suggested, I enabled Kerberos debug logging with
>    "-Dsun.security.krb5.debug=true", and more error info was printed out:
>------------------------------------------------------------------------------------------------
>Java config name: E:\Program Files (x86)\Java\jre7\lib\security\krb5.conf
>Loaded from Java config
>Java config name: E:\Program Files (x86)\Java\jre7\lib\security\krb5.conf
>Loaded from Java config
>>>> KdcAccessibility: reset
>>>> KdcAccessibility: reset
>>>> KeyTabInputStream, readName(): HADOOP.COM
>>>> KeyTabInputStream, readName(): hive
>>>> KeyTabInputStream, readName(): hm
>>>> KeyTab: load() entry length: 69; type: 18
>>>> KeyTabInputStream, readName(): HADOOP.COM
>>>> KeyTabInputStream, readName(): hive
>>>> KeyTabInputStream, readName(): hm
>>>> KeyTab: load() entry length: 53; type: 17
>>>> KeyTabInputStream, readName(): HADOOP.COM
>>>> KeyTabInputStream, readName(): hive
>>>> KeyTabInputStream, readName(): hm
>>>> KeyTab: load() entry length: 61; type: 16
>>>> KeyTabInputStream, readName(): HADOOP.COM
>>>> KeyTabInputStream, readName(): hive
>>>> KeyTabInputStream, readName(): hm
>>>> KeyTab: load() entry length: 53; type: 23
>>>> KeyTabInputStream, readName(): HADOOP.COM
>>>> KeyTabInputStream, readName(): hive
>>>> KeyTabInputStream, readName(): hm
>>>> KeyTab: load() entry length: 45; type: 8
>>>> KeyTabInputStream, readName(): HADOOP.COM
>>>> KeyTabInputStream, readName(): hive
>>>> KeyTabInputStream, readName(): hm
>>>> KeyTab: load() entry length: 45; type: 3
>Added key: 3version: 1
>Found unsupported keytype (8) for hive/hm@HADOOP.COM
>Added key: 23version: 1
>Added key: 16version: 1
>Added key: 17version: 1
>Found unsupported keytype (18) for hive/hm@HADOOP.COM
>Ordering keys wrt default_tkt_enctypes list
>Using builtin default etypes for default_tkt_enctypes
>default etypes for default_tkt_enctypes: 17 16 23 1 3.
>Added key: 3version: 1
>Found unsupported keytype (8) for hive/hm@HADOOP.COM
>Added key: 23version: 1
>Added key: 16version: 1
>Added key: 17version: 1
>Found unsupported keytype (18) for hive/hm@HADOOP.COM
>Ordering keys wrt default_tkt_enctypes list
>Using builtin default etypes for default_tkt_enctypes
>default etypes for default_tkt_enctypes: 17 16 23 1 3.
>Using builtin default etypes for default_tkt_enctypes
>default etypes for default_tkt_enctypes: 17 16 23 1 3.
>>>> KrbAsReq creating message
>>>> KrbKdcReq send: kdc=hm UDP:88, timeout=30000, number of retries =3, #bytes=145
>>>> KDCCommunication: kdc=hm UDP:88, timeout=30000,Attempt =1, #bytes=145
>>>> KrbKdcReq send: #bytes read=598
>>>> KdcAccessibility: remove hm
>Added key: 3version: 1
>Found unsupported keytype (8) for hive/hm@HADOOP.COM
>Added key: 23version: 1
>Added key: 16version: 1
>Added key: 17version: 1
>Found unsupported keytype (18) for hive/hm@HADOOP.COM
>Ordering keys wrt default_tkt_enctypes list
>Using builtin default etypes for default_tkt_enctypes
>default etypes for default_tkt_enctypes: 17 16 23 1 3.
>>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>>> KrbAsRep cons in KrbAsReq.getReply hive/hm
>Added key: 3version: 1
>Found unsupported keytype (8) for hive/hm@HADOOP.COM
>Added key: 23version: 1
>Added key: 16version: 1
>Added key: 17version: 1
>Found unsupported keytype (18) for hive/hm@HADOOP.COM
>Ordering keys wrt default_tkt_enctypes list
>Using builtin default etypes for default_tkt_enctypes
>default etypes for default_tkt_enctypes: 17 16 23 1 3.
>start connect hiveserver..
>Found ticket for hive/hm@HADOOP.COM to go to krbtgt/HADOOP.COM@HADOOP.COM expiring on Wed Jul 06 09:29:15 CST 2016
>Entered Krb5Context.initSecContext with state=STATE_NEW
>Found ticket for hive/hm@HADOOP.COM to go to krbtgt/HADOOP.COM@HADOOP.COM expiring on Wed Jul 06 09:29:15 CST 2016
>Service ticket not found in the subject
>>>> Credentials acquireServiceCreds: same realm
>Using builtin default etypes for default_tgs_enctypes
>default etypes for default_tgs_enctypes: 17 16 23 1 3.
>>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>>> KrbKdcReq send: kdc=hm UDP:88, timeout=30000, number of retries =3, #bytes=619
>>>> KDCCommunication: kdc=hm UDP:88, timeout=30000,Attempt =1, #bytes=619
>>>> KrbKdcReq send: #bytes read=116
>>>> KdcAccessibility: remove hm
>>>> KDCRep: init() encoding tag is 126 req type is 13
>>>>KRBError:
>	 cTime is Wed Jul 04 22:58:32 CST 1984 457801112000
>	 sTime is Tue Jul 05 09:29:15 CST 2016 1467682155000
>	 suSec is 944361
>	 error code is 37
>	 error Message is Clock skew too great
>	 realm is HADOOP.COM
>	 sname is hive/hm
>	 msgType is 30
>KrbException: Clock skew too great (37) - PROCESS_TGS
>	at sun.security.krb5.KrbTgsRep.<init>(Unknown Source)
>	at sun.security.krb5.KrbTgsReq.getReply(Unknown Source)
>	at sun.security.krb5.KrbTgsReq.sendAndGetCreds(Unknown Source)
>	at sun.security.krb5.internal.CredentialsUtil.serviceCreds(Unknown Source)
>	at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(Unknown Source)
>	at sun.security.krb5.Credentials.acquireServiceCreds(Unknown Source)
>	at sun.security.jgss.krb5.Krb5Context.initSecContext(Unknown Source)
>	at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source)
>	at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source)
>	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(Unknown Source)
>	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
>	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>	at java.security.AccessController.doPrivileged(Native Method)
>	at javax.security.auth.Subject.doAs(Unknown Source)
>	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>	at java.sql.DriverManager.getConnection(Unknown Source)
>	at java.sql.DriverManager.getConnection(Unknown Source)
>	at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:50)
>Caused by: KrbException: Identifier doesn't match expected value (906)
>	at sun.security.krb5.internal.KDCRep.init(Unknown Source)
>	at sun.security.krb5.internal.TGSRep.init(Unknown Source)
>	at sun.security.krb5.internal.TGSRep.<init>(Unknown Source)
>	... 25 more
>java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOP.COM: GSS initiate failed
>	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
>	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>	at java.sql.DriverManager.getConnection(Unknown Source)
>	at java.sql.DriverManager.getConnection(Unknown Source)
>	at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:50)
>Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
>	at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
>	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>	at java.security.AccessController.doPrivileged(Native Method)
>	at javax.security.auth.Subject.doAs(Unknown Source)
>	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>	... 5 more
>
>It looks as if the Kerberos configuration is incorrect ...
>
>
>At 2016-07-04 21:26:53, "Vivek Shrivastava" <vi...@gmail.com> wrote:
> 
>
>The renewal lifetime at the client krb5.conf level does not make any difference. The renewal time period is defined at the KDC in kdc.conf; the client cannot override it. Renewal is also a property set at the principal level, and both settings (renewal_lifetime, +renewal) dictate whether a ticket can be renewed. I don't think your problem has anything to do with that.
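For reference, a sketch of where each setting lives (names per MIT Kerberos; the 7d values are just examples):

```
# Client /etc/krb5.conf, [libdefaults] -- a request only; the KDC can cap it
renew_lifetime = 7d

# KDC kdc.conf, [realms] section -- the actual upper bound
max_renewable_life = 7d
```

The principal itself must also allow renewal (e.g. a nonzero maxrenewlife, set via kadmin's modprinc).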
>
>
>Something basic seems to be missing in your environment. I would run the same piece of code in a Unix environment and ensure that there is no error. Enabling Kerberos debug logging, as suggested in the previous post, will also help you compare the sequence of execution.
>
>
>On Mon, Jul 4, 2016 at 7:52 AM, Aviral Agarwal <av...@gmail.com> wrote:
>
>
>Hi,
>Could you enable kerberos logs with 
>    
>    -Dsun.security.krb5.debug=true
>
>
>and paste the output ?
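A minimal sketch, in case adding JVM arguments in Eclipse is inconvenient: the same debug switch can be set from code, as long as it runs before the first Kerberos class is loaded. (The property name is the standard JDK one; the class name here is made up for illustration.)

```java
// Illustrative only: set the Kerberos debug flag from code instead of the
// -D JVM argument. Must run before any sun.security.krb5 class loads.
public class EnableKrbDebug {
    public static void main(String[] args) {
        System.setProperty("sun.security.krb5.debug", "true");
        System.out.println(System.getProperty("sun.security.krb5.debug")); // prints "true"
    }
}
```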
>
>
>
>
>On Mon, Jul 4, 2016 at 3:47 PM, Maria <li...@126.com> wrote:
>
>The question "kinit: Ticket expired while renewing credentials" has been solved; I can now successfully execute "kinit -R",
>
>but the error "java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed"
>
>is still there.
>
>
>
>
>
>At 2016-07-04 14:39:04, "Maria" <li...@126.com> wrote:
>
>>I saw a mail titled "HCatalog Security". His or her problem was similar to mine, and the reply was:
>
>>"This issue goes away after doing a kinit -R".
>
>>
>
>>So I did the same operation, but it failed:
>
>>kinit: Ticket expired while renewing credentials
>
>>
>
>>But in my /etc/krb5.conf, I have configured this item:
>
>>renew_lifetime=7d
>
>>
>
>>So, can anybody give me some suggestions, please? Thank you.
>
>>
>
>>At 2016-07-04 11:32:30, "Maria" <li...@126.com> wrote:
>
>>>
>
>>>
>
>>>And I can successfully access hiveserver2 from beeline.
>
>>>
>
>>>
>
>>>I was so confused by this error: "Peer indicated failure: GSS initiate failed".
>
>>>
>
>>>Can anybody please help me? Any reply will be much appreciated.
>
>>>
>
>>>At 2016-07-04 11:26:53, "Maria" <li...@126.com> wrote:
>
>>>>Yup, my hiveserver2 log errors are:
>
>>>>
>
>>>>ERROR [Hiveserver2-Handler-Pool: Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) - error occurred during processing of message.
>
>>>>java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
>
>>>>    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>
>>>>    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
>
>>>>    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
>
>>>>    at java.security.AccessController.doPrivileged(Native Method)
>
>>>>    at javax.security.auth.Subject.doAs(Subject.java:356)
>
>>>>    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
>
>>>>    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
>
>>>>    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
>
>>>>    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>
>>>>    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>
>>>>    at java.lang.Thread.run(Thread.java:745)
>
>>>>Caused by: org.apache.thrift.transport.TTransportException:Peer indicated failure: GSS initiate failed
>
>>>>    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
>
>>>>    at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
>
>>>>    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>
>>>>    at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>
>>>>    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>
>>>> ... 10 more
>
>>>>================================================
>
>>>>So the Windows Hive JDBC client can communicate with hiveserver2, can't it?
>
>>>>
>
>>>>I checked everything I could:
>
>>>>(1)in hiveserver2 node, I execute command "klist",the results are:
>
>>>>Ticket cache: FILE:/tmp/krb5cc_0
>
>>>>Default principal: hive/hm@HADOOP.COM
>
>>>>
>
>>>>Valid starting    Expires                     Service principal
>
>>>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/HADOOP.COM@HADOOP.COM
>
>>>>                 renew until 07/04/16 10:28:14
>
>>>>(2)in windows dos cmd,I execute command "klist",the results are:
>
>>>>Ticket cache:API: 1
>
>>>>Default principal: hive/hm@HADOOP.COM
>
>>>>
>
>>>>Valid starting    Expires                     Service principal
>
>>>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/HADOOP.COM@HADOOP.COM
>
>>>>                 renew until 07/04/16 10:24:32
>
>>>>
>
>>>>Is there anything else I have to add or set for hiveserver2?
>
>>>>
>
>>>>Thanks in advance.
>
>>>>
>
>>>>
>
>>>>Maria.
>
>>>>
>
>>>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com> wrote:
>
>>>>
>
>>>>
>
>>>>Please look at the hiveserver2 log; it will have better error information. You can paste the error from the logs if you need help.
>
>>>>
>
>>>>
>
>>>>Regards,
>
>>>>
>
>>>>
>
>>>>Vivek
>
>>>>
>
>>>>
>
>>>>On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:
>
>>>>
>
>>>>
>
>>>>
>
>>>>Hi,all:
>
>>>>
>
>>>>     Recently, I attempted to access a Kerberized Hadoop cluster by launching Java applications from Windows workstations. I have configured Kerberos on my Windows 7 machine and can successfully access HDFS (port 50070). But when I use JDBC from Windows to connect to the remote hiveserver, errors occurred:
>
>>>>
>
>>>>java.sql.SQLException:could not open client transport with JDBC Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS initiate failed
>
>>>>
>
>>>>     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
>
>>>>
>
>>>>     at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>
>>>>
>
>>>>     at org.apache.hive.jdbc.HiveDriver.connection(HiveDriver.java:105)
>
>>>>
>
>>>>     at java.sql.DriverManager.getConnection(Unknown Source)
>
>>>>
>
>>>>     at java.sql.DriverManager.getConnection(Unknown Source)
>
>>>>
>
>>>>     at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)
>
>>>>
>
>>>>Caused by: org.apache.thrift.transport.TTransportException:GSS initiate failed
>
>>>>
>
>>>>     at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>
>>>>
>
>>>>     at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
>
>>>>
>
>>>>     at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>
>>>>
>
>>>>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>
>>>>
>
>>>>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>
>>>>
>
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>
>>>>
>
>>>>     at javax.security.auth.Subject.doAs(Unknown Source)
>
>>>>
>
>>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>
>>>>
>
>>>>     at  org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>
>>>>
>
>>>>     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>
>>>>
>
>>>>... 5 more
>
>>>>
>
>>>>------------------------------------------------------------------------------
>
>>>>
>
>>>>below are my test codes:
>
>>>>
>
>>>>
>
>>>>
>
>>>>public static void main(String[] args) throws Exception {
>
>>>>    String principal = "hive/hm@HADOOP.COM";
>>>>    String keytab = "E:\\Program Files (x86)\\java\\jre7\\lib\\security\\hive.keytab";
>>>>    String url = "jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOP.COM";
>
>>>>    conf.addResource(new File("hdfs-site.xml").toURI().toURL());
>>>>    conf.addResource(new File("core-site.xml").toURI().toURL());
>>>>    conf.addResource(new File("yarn-site.xml").toURI().toURL());
>>>>    conf.addResource(new File("hive-site.xml").toURI().toURL());
>
>>>>    conf.set("hadoop.security.authentication", "Kerberos");
>>>>    UserGroupInformation.setConfiguration(conf);
>>>>    UserGroupInformation.loginUserFromKeytab(principal, keytab);
>
>>>>    Class.forName("org.apache.hive.jdbc.HiveDriver");
>>>>    Connection conn = DriverManager.getConnection(url);
>
>>>>    Statement stmt = conn.createStatement();
>>>>    String sql = "select * from testkerberos";
>>>>    ResultSet rs = stmt.executeQuery(sql);
>>>>    while (rs.next()) {
>>>>        System.out.println(rs.getString(1));
>>>>    }
>>>>}
>
>>>>
>
>>>>
>
>>>>
>
>>>>Does anyone had the same problem? Or know how to solve it ?
>
>>>>
>
>>>>
>
>>>>
>
>>>>Thanks in advance.
>
>>>>
>
>>>>
>
>>>>
>
>>>>Maria.
>
>>>>
>
>>>>
>
>>>>
>
>
>
>
>

Re:Re: Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Maria <li...@126.com>.
Yup, yesterday I started to realize that renewal is a principal-level setting, and I have fixed the renew time in the KDC's kdc.conf. As Aviral suggested, I enabled Kerberos debug logging with
    "-Dsun.security.krb5.debug=true", and more error info was printed out:
------------------------------------------------------------------------------------------------
Java config name: E:\Program Files (x86)\Java\jre7\lib\security\krb5.conf
Loaded from Java config
Java config name: E:\Program Files (x86)\Java\jre7\lib\security\krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): hive
>>> KeyTabInputStream, readName(): hm
>>> KeyTab: load() entry length: 69; type: 18
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): hive
>>> KeyTabInputStream, readName(): hm
>>> KeyTab: load() entry length: 53; type: 17
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): hive
>>> KeyTabInputStream, readName(): hm
>>> KeyTab: load() entry length: 61; type: 16
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): hive
>>> KeyTabInputStream, readName(): hm
>>> KeyTab: load() entry length: 53; type: 23
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): hive
>>> KeyTabInputStream, readName(): hm
>>> KeyTab: load() entry length: 45; type: 8
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): hive
>>> KeyTabInputStream, readName(): hm
>>> KeyTab: load() entry length: 45; type: 3
Added key: 3version: 1
Found unsupported keytype (8) for hive/hm@HADOOP.COM
Added key: 23version: 1
Added key: 16version: 1
Added key: 17version: 1
Found unsupported keytype (18) for hive/hm@HADOOP.COM
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 17 16 23 1 3.
Added key: 3version: 1
Found unsupported keytype (8) for hive/hm@HADOOP.COM
Added key: 23version: 1
Added key: 16version: 1
Added key: 17version: 1
Found unsupported keytype (18) for hive/hm@HADOOP.COM
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 17 16 23 1 3.
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 17 16 23 1 3.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=hm UDP:88, timeout=30000, number of retries =3, #bytes=145
>>> KDCCommunication: kdc=hm UDP:88, timeout=30000,Attempt =1, #bytes=145
>>> KrbKdcReq send: #bytes read=598
>>> KdcAccessibility: remove hm
Added key: 3version: 1
Found unsupported keytype (8) for hive/hm@HADOOP.COM
Added key: 23version: 1
Added key: 16version: 1
Added key: 17version: 1
Found unsupported keytype (18) for hive/hm@HADOOP.COM
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 17 16 23 1 3.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply hive/hm
Added key: 3version: 1
Found unsupported keytype (8) for hive/hm@HADOOP.COM
Added key: 23version: 1
Added key: 16version: 1
Added key: 17version: 1
Found unsupported keytype (18) for hive/hm@HADOOP.COM
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 17 16 23 1 3.
start connect hiveserver..
Found ticket for hive/hm@HADOOP.COM to go to krbtgt/HADOOP.COM@HADOOP.COM expiring on Wed Jul 06 09:29:15 CST 2016
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for hive/hm@HADOOP.COM to go to krbtgt/HADOOP.COM@HADOOP.COM expiring on Wed Jul 06 09:29:15 CST 2016
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 17 16 23 1 3.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=hm UDP:88, timeout=30000, number of retries =3, #bytes=619
>>> KDCCommunication: kdc=hm UDP:88, timeout=30000,Attempt =1, #bytes=619
>>> KrbKdcReq send: #bytes read=116
>>> KdcAccessibility: remove hm
>>> KDCRep: init() encoding tag is 126 req type is 13
>>>KRBError:
	 cTime is Wed Jul 04 22:58:32 CST 1984 457801112000
	 sTime is Tue Jul 05 09:29:15 CST 2016 1467682155000
	 suSec is 944361
	 error code is 37
	 error Message is Clock skew too great
	 realm is HADOOP.COM
	 sname is hive/hm
	 msgType is 30
KrbException: Clock skew too great (37) - PROCESS_TGS
	at sun.security.krb5.KrbTgsRep.<init>(Unknown Source)
	at sun.security.krb5.KrbTgsReq.getReply(Unknown Source)
	at sun.security.krb5.KrbTgsReq.sendAndGetCreds(Unknown Source)
	at sun.security.krb5.internal.CredentialsUtil.serviceCreds(Unknown Source)
	at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(Unknown Source)
	at sun.security.krb5.Credentials.acquireServiceCreds(Unknown Source)
	at sun.security.jgss.krb5.Krb5Context.initSecContext(Unknown Source)
	at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source)
	at sun.security.jgss.GSSContextImpl.initSecContext(Unknown Source)
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(Unknown Source)
	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Unknown Source)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
	at java.sql.DriverManager.getConnection(Unknown Source)
	at java.sql.DriverManager.getConnection(Unknown Source)
	at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:50)
Caused by: KrbException: Identifier doesn't match expected value (906)
	at sun.security.krb5.internal.KDCRep.init(Unknown Source)
	at sun.security.krb5.internal.TGSRep.init(Unknown Source)
	at sun.security.krb5.internal.TGSRep.<init>(Unknown Source)
	... 25 more
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOP.COM: GSS initiate failed
	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
	at java.sql.DriverManager.getConnection(Unknown Source)
	at java.sql.DriverManager.getConnection(Unknown Source)
	at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:50)
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
	at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Unknown Source)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
	... 5 more

It looks as if the Kerberos configuration is incorrect ....
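The decisive lines in the trace above are "error code is 37" / "Clock skew too great": the client's cTime (July 1984) and the KDC's sTime (July 2016) differ by decades, so the TGS exchange is rejected regardless of keytabs or principals. A minimal sketch of the check the KDC performs, using the two epoch-millisecond values printed in the KRBError block; the 300-second tolerance is MIT Kerberos' default `clockskew` and is an assumption about this KDC's configuration:

```java
import java.time.Duration;
import java.time.Instant;

public class ClockSkewCheck {
    // MIT Kerberos' default clockskew tolerance (assumption: not overridden in krb5.conf)
    static final Duration DEFAULT_CLOCKSKEW = Duration.ofSeconds(300);

    // True when the client and KDC clocks agree within the tolerance
    static boolean withinTolerance(Instant client, Instant kdc) {
        return Duration.between(client, kdc).abs().compareTo(DEFAULT_CLOCKSKEW) <= 0;
    }

    public static void main(String[] args) {
        Instant cTime = Instant.ofEpochMilli(457801112000L);   // client time from the KRBError: Jul 04 1984
        Instant sTime = Instant.ofEpochMilli(1467682155000L);  // KDC time from the KRBError:    Jul 05 2016
        System.out.println("skew = " + Duration.between(cTime, sTime));
        System.out.println("within tolerance: " + withinTolerance(cTime, sTime));
    }
}
```

Kerberos rejects any request outside this window, so the practical fix is to synchronize the Windows workstation's clock with the KDC host (e.g. via NTP or the Windows time service) and then kinit again.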


At 2016-07-04 21:26:53, "Vivek Shrivastava" <vi...@gmail.com> wrote:
 

The renewal lifetime at the client krb5.conf level does not make any difference. The renewal time period is defined at the KDC in kdc.conf; the client cannot override it. The renewal is also a property set at the principal level, and both settings (renew_lifetime, +renewable) dictate whether a ticket can be renewed. I don't think your problem has anything to do with that.


Seems something basic is missing in your environment. I would probably run the same piece of code in the Unix environment and ensure that there is no error. Enabling Kerberos debug logging as suggested in the previous post will also help you compare the sequence of execution.
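One quick check on the renewal question: in the klist outputs quoted later in this thread, "renew until" equals "Valid starting", which means the tickets were issued non-renewable; that alone explains the earlier "kinit: Ticket expired while renewing credentials". A minimal sketch of that comparison (the MM/dd/yy HH:mm:ss layout is an assumption matching the klist output shown in this thread, not a general klist parser):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class RenewableCheck {
    // Timestamp layout as printed by klist in this thread, e.g. "07/04/16 10:28:14"
    static final DateTimeFormatter KLIST = DateTimeFormatter.ofPattern("MM/dd/yy HH:mm:ss");

    // A ticket is renewable only if "renew until" extends past "Valid starting"
    static boolean isRenewable(String validStarting, String renewUntil) {
        return LocalDateTime.parse(renewUntil, KLIST)
                .isAfter(LocalDateTime.parse(validStarting, KLIST));
    }

    public static void main(String[] args) {
        // Values from the hiveserver2-node klist output quoted below: not renewable
        System.out.println(isRenewable("07/04/16 10:28:14", "07/04/16 10:28:14"));
    }
}
```

If the check reports non-renewable, the fix is on the KDC side (the maximum renewable life for the realm and the principal, followed by a fresh kinit), not in the client's krb5.conf.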


On Mon, Jul 4, 2016 at 7:52 AM, Aviral Agarwal <av...@gmail.com> wrote:


Hi,
Could you enable kerberos logs with 
    
    -Dsun.security.krb5.debug=true


and paste the output ?




On Mon, Jul 4, 2016 at 3:47 PM, Maria <li...@126.com> wrote:

The question "kinit: Ticket expired while renewing credentials" has been solved. I can successfully execute "kinit -R",

but the error “java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed”

is still there..





At 2016-07-04 14:39:04, "Maria" <li...@126.com> wrote:

>I saw a mail named "HCatalog Security". His or her problem was similar to mine, and the reply was:

>"This issue goes away after doing a kinit -R".

>

>So I did the same operation, but it failed:

>kinit: Ticket expired while renewing credentials

>

>But in my /etc/krb5.conf, I have configed this item:

>renew_lifetime=7d

>

>So, can anybody give me some suggestions, please? Thank you.

>

>At 2016-07-04 11:32:30, "Maria" <li...@126.com> wrote:

>>

>>

>>And I can successfully access hiveserver2 from beeline.

>>

>>

>>I was so confused by this error "Peer indicated failure: GSS initiate failed".

>>

>> Can anybody please help me? Any reply will be much appreciated.

>>

>>At 2016-07-04 11:26:53, "Maria" <li...@126.com> wrote:

>>>Yup, my hiveserver2 log errors are:

>>>

>>>ERROR [Hiveserver2-Handler-Pool: Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) - error occurred during processing of message.

>>>java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed

>>>    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)

>>>    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)

>>>    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)

>>>    at java.security.AccessController.doPrivileged(Native Method)

>>>    at javax.security.auth.Subject.doAs(Subject.java:356)

>>>    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)

>>>    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)

>>>    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)

>>>    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

>>>    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

>>>    at java.lang.Thread.run(Thread.java:745)

>>>Caused by: org.apache.thrift.transport.TTransportException:Peer indicated failure: GSS initiate failed

>>>    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)

>>>    at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)

>>>    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)

>>>    at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)

>>>    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)

>>> ... 10 more

>>>================================================

>>>It seems the Windows Hive JDBC client can communicate with the hiveserver2, doesn't it?

>>>

>>>Meanwhile, I checked everything I could:

>>>(1) On the hiveserver2 node, I executed the command "klist"; the results are:

>>>Ticket cache: FILE:/tmp/krb5cc_0

>>>Default principal: hive/hm@HADOOP.COM

>>>

>>>Valid starting    Expires                     Service principal

>>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/HADOOP.COM@HADOOP.COM

>>>                 renew until 07/04/16 10:28:14

>>>(2) In a Windows cmd window, I executed the command "klist"; the results are:

>>>Ticket cache:API: 1

>>>Default principal: hive/hm@HADOOP.COM

>>>

>>>Valid starting    Expires                     Service principal

>>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/HADOOP.COM@HADOOP.COM

>>>                 renew until 07/04/16 10:24:32

>>>

>>> Is there anything else I have to add or set for hiveserver2?

>>>

>>>Thanks in advance.

>>>

>>>

>>>Maria.

>>>

>>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com> wrote:

>>>

>>>

>>>Please look at the hiveserver2 log, it will have better error information. You can paste error from the logs if you need help. 

>>>

>>>

>>>Regards,

>>>

>>>

>>>Vivek

>>>

>>>

>>>On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:

>>>

>>>

>>>

>>>Hi,all:

>>>

>>>     recently, I attempted to access a Kerberized Hadoop cluster by launching Java applications from Windows workstations. I have configured Kerberos on my Windows 7 machine and can successfully access the HDFS web UI on port 50070. But when I launched JDBC from Windows to connect to the remote hiveserver, errors occurred:

>>>

>>>java.sql.SQLException:could not open client transport with JDBC Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS initiate failed

>>>

>>>     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)

>>>

>>>     at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)

>>>

>>>     at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)

>>>

>>>     at java.sql.DriverManager.getConnection(Unknown Source)

>>>

>>>     at java.sql.DriverManager.getConnection(Unknown Source)

>>>

>>>     at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)

>>>

>>>Caused by: org.apache.thrift.transport.TTransportException:GSS initiate failed

>>>

>>>     at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)

>>>

>>>     at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)

>>>

>>>     at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)

>>>

>>>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)

>>>

>>>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)

>>>

>>>     at java.security.AccessController.doPrivileged(Native Method)

>>>

>>>     at javax.security.auth.Subject.doAs(Unknown Source)

>>>

>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)

>>>

>>>     at  org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)

>>>

>>>     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)

>>>

>>>... 5 more

>>>

>>>------------------------------------------------------------------------------

>>>

>>>below are my test codes:

>>>

>>>

>>>

>>>public static void main(String[] args) {

>>>

>>>    String principal = "hive/hm@HADOOM.COM";

>>>

>>>    String keytab = "E:\\Program Files (x86)\\java\\jre7\\lib\\security\\hive.keytab";

>>>

>>>    String url = "jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM";

>>>

>>>

>>>

>>>    conf.addResource(new File("hdfs-site.xml").toURI().toURL());

>>>

>>>    conf.addResource(new File("core-site.xml").toURI().toURL());

>>>

>>>    conf.addResource(new File("yarn-site.xml").toURI().toURL());

>>>

>>>    conf.addResource(new File("hive-site.xml").toURI().toURL());

>>>

>>>

>>>

>>>    conf.set("hadoop.security.authentication", "Kerberos");

>>>

>>>    UserGroupInformation.setConfiguration(conf);

>>>

>>>    UserGroupInformation.loginUserFromKeytab(principal, keytab);

>>>

>>>

>>>

>>>    Class.forName("org.apache.hive.jdbc.HiveDriver");

>>>

>>>    Connection conn =DriverManager.getConnection(url);

>>>

>>>

>>>

>>>    Statement stmt = conn.createStatement();

>>>

>>>    String sql = "select * from testkerberos";

>>>

>>>    ResultSet rs = stmt.executeQuery(sql);

>>>

>>>    while (rs.next()) {

>>>

>>>       System.out.println(rs.getString(1));

>>>

>>>    }

>>>

>>>}

>>>

>>>

>>>

>>>Has anyone had the same problem? Or does anyone know how to solve it?

>>>

>>>

>>>

>>>Thanks in advance.

>>>

>>>

>>>

>>>Maria.

>>>

>>>

>>>






Re: Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Vivek Shrivastava <vi...@gmail.com>.
The renewal lifetime at the client krb5.conf level does not make any
difference. The renewal time period is defined at the KDC in kdc.conf; the
client cannot override it. The renewal is also a property set at the
principal level, and both settings (renew_lifetime, +renewable) dictate
whether a ticket can be renewed. I don't think your problem has anything to do with that.

Seems something basic is missing in your environment. I would probably run
the same piece of code in the Unix environment and ensure that there is no
error. Enabling Kerberos debug logging as suggested in the previous post
will also help you compare the sequence of execution.

On Mon, Jul 4, 2016 at 7:52 AM, Aviral Agarwal <av...@gmail.com>
wrote:

> Hi,
> Could you enable kerberos logs with
> -Dsun.security.krb5.debug=true
>
> and paste the output ?
>
> On Mon, Jul 4, 2016 at 3:47 PM, Maria <li...@126.com> wrote:
>
>> The question "kinit: Ticket expired while renewing credentials" has
>> been solved. I can successfully execute "kinit -R",
>> but the error “java.lang.RuntimeException:
>> org.apache.thrift.transport.TTransportException: Peer indicated failure:
>> GSS initiate failed”
>> is still there..
>>
>> At 2016-07-04 14:39:04, "Maria" <li...@126.com> wrote:
>> >I saw a mail named "HCatalog Security". His or her problem was similar
>> to mine, and the reply was:
>> >"This issue goes away after doing a kinit -R".
>> >
>> >So I did the same operation, but it failed:
>> >kinit: Ticket expired while renewing credentials
>> >
>> >But in my /etc/krb5.conf, I have configed this item:
>> >renew_lifetime=7d
>> >
>> >So, Can anybody give me some suggestions, please? Thankyou.
>> >
>> >At 2016-07-04 11:32:30, "Maria" <li...@126.com> wrote:
>> >>
>> >>
>> >>And I can successfully access hiveserver2 from beeline.
>> >>
>> >>
>> >>I was so confused by this error"Peer indicated failure: GSS initiate
>> failed".
>> >>
>> >> Can anybody please help me? Any reply will be much appreciated.
>> >>
>> >>At 2016-07-04 11:26:53, "Maria" <li...@126.com> wrote:
>> >>>Yup,my  hiveserver2 log errors are:
>> >>>
>> >>>ERROR [Hiveserver2-Handler-Pool:
>> Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) -
>> error occurred during processing of message.
>> >>>java.lang.RuntimeException:
>> org.apache.thrift.transport.TTransportException: Peer indicated failure:
>> GSS initiate failed
>> >>>    at
>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>> >>>    at
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
>> >>>    at
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
>> >>>    at java.security.AccessController.doPrivileged(Native Method)
>> >>>    at javax.security.auth.Subject.doAs(Subject.java:356)
>> >>>    at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
>> >>>    at
>> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
>> >>>    at
>> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
>> >>>    at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> >>>    at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> >>>    at java.lang.Thread.run(Thread.java:745)
>> >>>Caused by: org.apache.thrift.transport.TTransportException:Peer
>> indicated failure: GSS initiate failed
>> >>>    at
>> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
>> >>>    at
>> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
>> >>>    at
>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>> >>>    at
>> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>> >>>    at
>> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>> >>> ... 10 more
>> >>>================================================
>> >>>It seems the Windows Hive JDBC client can communicate with the
>> hiveserver2, doesn't it?
>> >>>
>> >>>while I checked everything I can :
>> >>>(1)in hiveserver2 node, I execute command "klist",the results are:
>> >>>Ticket cache: FILE:/tmp/krb5cc_0
>> >>>Default principal: hive/hm@HADOOP.COM
>> >>>
>> >>>Valid starting    Expires                     Service principal
>> >>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/
>> HADOOP.COM@HADOOP.COM
>> >>>                 renew until 07/04/16 10:28:14
>> >>>(2)in windows dos cmd,I execute command "klist",the results are:
>> >>>Ticket cache:API: 1
>> >>>Default principal: hive/hm@HADOOP.COM
>> >>>
>> >>>Valid starting    Expires                     Service principal
>> >>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/
>> HADOOP.COM@HADOOP.COM
>> >>>                 renew until 07/04/16 10:24:32
>> >>>
>> >>> Is there any thing else I have to add or set for hiveserver2?
>> >>>
>> >>>Thanks in advance.
>> >>>
>> >>>
>> >>>Maria.
>> >>>
>> >>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com>
>> wrote:
>> >>>
>> >>>
>> >>>Please look at the hiveserver2 log, it will have better error
>> information. You can paste error from the logs if you need help.
>> >>>
>> >>>
>> >>>Regards,
>> >>>
>> >>>
>> >>>Vivek
>> >>>
>> >>>
>> >>>On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:
>> >>>
>> >>>
>> >>>
>> >>>Hi,all:
>> >>>
>> >>>     recently, I attempted to access a Kerberized Hadoop cluster by
>> launching Java applications from Windows workstations. I have
>> configured Kerberos on my Windows 7 machine and can successfully access the HDFS web UI on port 50070.
>> But when I launched JDBC from Windows to connect to the remote hiveserver, errors
>> occurred:
>> >>>
>> >>>java.sql.SQLException:could not open client transport with JDBC
>> Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS
>> initiate failed
>> >>>
>> >>>     at
>> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
>> >>>
>> >>>     at
>> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>> >>>
>> >>>     at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>> >>>
>> >>>     at java.sql.DriverManager.getConnection(Unknown Source)
>> >>>
>> >>>     at java.sql.DriverManager.getConnection(Unknown Source)
>> >>>
>> >>>     at
>> org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)
>> >>>
>> >>>Caused by: org.apache.thrift.transport.TTransportException:GSS
>> initiate failed
>> >>>
>> >>>     at
>> org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>> >>>
>> >>>     at
>> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
>> >>>
>> >>>     at
>> org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>> >>>
>> >>>     at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>> >>>
>> >>>     at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>> >>>
>> >>>     at java.security.AccessController.doPrivileged(Native Method)
>> >>>
>> >>>     at javax.security.auth.Subject.doAs(Unknow source)
>> >>>
>> >>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>> >>>
>> >>>     at
>> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>> >>>
>> >>>     at
>> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
>> >>>
>> >>>... 5 more
>> >>>
>>
>> >>>------------------------------------------------------------------------------
>> >>>
>> >>>below are my test codes:
>> >>>
>> >>>
>> >>>
>> >>>public static void main(String[] args) {
>> >>>
>> >>>    String principal = "hive/hm@HADOOM.COM";
>> >>>
>> >>>    String keytab = "E:\\Program Files
>> (x86)\\java\\jre7\\lib\\security\\hive.keytab";
>> >>>
>> >>>    String url = "jdbc:hive2://hm:10000/default;principal=hive/
>> hm@HADOOM.COM";
>> >>>
>> >>>
>> >>>
>> >>>    conf.addResource(new File("hdfs-site.xml").toURI().toURL());
>> >>>
>> >>>    conf.addResource(new File("core-site.xml").toURI().toURL());
>> >>>
>> >>>    conf.addResource(new File("yarn-site.xml").toURI().toURL());
>> >>>
>> >>>    conf.addResource(new File("hive-site.xml").toURI().toURL());
>> >>>
>> >>>
>> >>>
>> >>>    conf.set("hadoop.security.authentication", "Kerberos");
>> >>>
>> >>>    UserGroupInformation.setConfiguration(conf);
>> >>>
>> >>>    UserGroupInformation.loginUserFromKeytab(principal, keytab);
>> >>>
>> >>>
>> >>>
>> >>>    Class.forName("org.apache.hive.jdbc.HiveDriver");
>> >>>
>> >>>    Connection conn =DriverManager.getConnection(url);
>> >>>
>> >>>
>> >>>
>> >>>    Statement stmt = conn.createStatement();
>> >>>
>> >>>    String sql = "select * from testkerberos";
>> >>>
>> >>>    ResultSet rs = stmt.executeQuery(sql);
>> >>>
>> >>>    while (rs.next()) {
>> >>>
>> >>>       System.out.println(rs.getString(1));
>> >>>
>> >>>    }
>> >>>
>> >>>}
>> >>>
>> >>>
>> >>>
>> >>>Has anyone had the same problem? Or does anyone know how to solve it?
>> >>>
>> >>>
>> >>>
>> >>>Thanks in advance.
>> >>>
>> >>>
>> >>>
>> >>>Maria.
>> >>>
>> >>>
>> >>>
>>
>
>

Re: Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Aviral Agarwal <av...@gmail.com>.
Hi,
Could you enable kerberos logs with
-Dsun.security.krb5.debug=true

and paste the output ?

On Mon, Jul 4, 2016 at 3:47 PM, Maria <li...@126.com> wrote:

> The question "kinit: Ticket expired while renewing credentials" has
> been solved. I can successfully execute "kinit -R",
> but the error “java.lang.RuntimeException:
> org.apache.thrift.transport.TTransportException: Peer indicated failure:
> GSS initiate failed”
> is still there..
>
> At 2016-07-04 14:39:04, "Maria" <li...@126.com> wrote:
> >I saw a  mail named "HCatalog Security",His or her problem was similar to
> mine,and the reply answer were:
> >"This issue goes away after doing a kinit -R".
> >
> >So I did the same operation.while it is failed:
> >kinit: Ticket expired while renewing credentials
> >
> >But in my /etc/krb5.conf, I have configed this item:
> >renew_lifetime=7d
> >
> >So, Can anybody give me some suggestions, please? Thankyou.
> >
> >At 2016-07-04 11:32:30, "Maria" <li...@126.com> wrote:
> >>
> >>
> >>And I can successfully access hiveserver2 from beeline.
> >>
> >>
> >>I was so confused by this error"Peer indicated failure: GSS initiate
> failed".
> >>
> >> Can you anybody please help me? Any reply will be much appreciated.
> >>
> >>At 2016-07-04 11:26:53, "Maria" <li...@126.com> wrote:
> >>>Yup,my  hiveserver2 log errors are:
> >>>
> >>>ERROR [Hiveserver2-Handler-Pool:
> Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) -
> error occurred during processing of message.
> >>>java.lang.RuntimeException:
> org.apache.thrift.transport.TTransportException: Peer indicated failure:
> GSS initiate failed
> >>>    at
> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
> >>>    at
> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
> >>>    at
> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
> >>>    at java.security.AccessController.doPrivileged(Native Method)
> >>>    at javax.security.auth.Subject.doAs(Subject.java:356)
> >>>    at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
> >>>    at
> org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
> >>>    at
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
> >>>    at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >>>    at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >>>    at java.lang.Thread.run(Thread.java:745)
> >>>Caused by: org.apache.thrift.transport.TTransportException:Peer
> indicated failure: GSS initiate failed
> >>>    at
> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
> >>>    at
> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
> >>>    at
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> >>>    at
> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
> >>>    at
> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
> >>> ... 10 more
> >>>================================================
> >>>It seems the Windows Hive JDBC client can communicate with the
> hiveserver2, doesn't it?
> >>>
> >>>while I checked everything I can :
> >>>(1)in hiveserver2 node, I execute command "klist",the results are:
> >>>Ticket cache: FILE:/tmp/krb5cc_0
> >>>Default principal: hive/hm@HADOOP.COM
> >>>
> >>>Valid starting    Expires                     Service principal
> >>>07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/HADOOP.COM@HADOOP.COM
> >>>                 renew until 07/04/16 10:28:14
> >>>(2)in windows dos cmd,I execute command "klist",the results are:
> >>>Ticket cache:API: 1
> >>>Default principal: hive/hm@HADOOP.COM
> >>>
> >>>Valid starting    Expires                     Service principal
> >>>07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/HADOOP.COM@HADOOP.COM
> >>>                 renew until 07/04/16 10:24:32
> >>>
> >>> Is there any thing else I have to add or set for hiveserver2?
> >>>
> >>>Thanks in advance.
> >>>
> >>>
> >>>Maria.
> >>>
> >>>At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com>
> wrote:
> >>>
> >>>
> >>>Please look at the hiveserver2 log, it will have better error
> information. You can paste error from the logs if you need help.
> >>>
> >>>
> >>>Regards,
> >>>
> >>>
> >>>Vivek
> >>>
> >>>
> >>>On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:
> >>>
> >>>
> >>>
> >>>Hi,all:
> >>>
> >>>     recently, I attempted to access a Kerberized Hadoop cluster by
> launching Java applications from Windows workstations. I have
> configured Kerberos on my Windows 7 machine and can successfully access the HDFS web UI on port 50070.
> But when I launched JDBC from Windows to connect to the remote hiveserver, errors
> occurred:
> >>>
> >>>java.sql.SQLException:could not open client transport with JDBC
> Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS
> initiate failed
> >>>
> >>>     at
> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
> >>>
> >>>     at
> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
> >>>
> >>>     at org.apache.hive.jdbc.HiveDriver.connection(HiveDriver.java:105)
> >>>
> >>>     at java.sql.DriverManager.getConnection(Unknown Source)
> >>>
> >>>     at java.sql.DriverManager.getConnection(Unknown Source)
> >>>
> >>>     at
> org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)
> >>>
> >>>Caused by: org.apache.thrift.transport.TTransportException:GSS initiate
> failed
> >>>
> >>>     at
> org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
> >>>
> >>>     at
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
> >>>
> >>>     at
> org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> >>>
> >>>     at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
> >>>
> >>>     at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
> >>>
> >>>     at java.security.AccessController.doPrivileged(Native Method)
> >>>
> >>>     at javax.security.auth.Subject.doAs(Unknow source)
> >>>
> >>>     at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> >>>
> >>>     at
> org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> >>>
> >>>     at
> org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
> >>>
> >>>... 5 more
> >>>
>
> >>>------------------------------------------------------------------------------
> >>>
> >>>below are my test codes:
> >>>
> >>>
> >>>
> >>>public static void main(String[] args) {
> >>>
> >>>    String principal = "hive/hm@HADOOM.COM";
> >>>
> >>>    String keytab = "E:\\Program Files
> (x86)\\java\\jre7\\lib\\security\\hive.keytab";
> >>>
> >>>    String url = "jdbc:hive2://hm:10000/default;principal=hive/
> hm@HADOOM.COM";
> >>>
> >>>
> >>>
> >>>    conf.addResource(new File("hdfs-site.xml").toURI().toURL());
> >>>
> >>>    conf.addResource(new File("core-site.xml").toURI().toURL());
> >>>
> >>>    conf.addResource(new File("yarn-site.xml").toURI().toURL());
> >>>
> >>>    conf.addResource(new File("hive-site.xml").toURI().toURL());
> >>>
> >>>
> >>>
> >>>    conf.set("hadoop.security.authentication", "Kerberos");
> >>>
> >>>    UserGroupInformation.setConfiguration(conf);
> >>>
> >>>    UserGroupInformation.loginUserFromKeytab(principal, keytab);
> >>>
> >>>
> >>>
> >>>    Class.forName("org.apache.hive.,jdbc.HiveDriver");
> >>>
> >>>    Connection conn =DriverManager.getConnection(url);
> >>>
> >>>
> >>>
> >>>    Statement stmt = conn.createStatement();
> >>>
> >>>    String sql = "select * from testkerberos";
> >>>
> >>>    ResultSet rs = stmt.executeQuery(sql);
> >>>
> >>>    while (rs.next()) {
> >>>
> >>>       system.out.println(rs.getString(1));
> >>>
> >>>    }
> >>>
> >>>}
> >>>
> >>>
> >>>
> >>>Does anyone had the same problem? Or know how to solve it ?
> >>>
> >>>
> >>>
> >>>Thanks in advance.
> >>>
> >>>
> >>>
> >>>Maria.
> >>>
> >>>
> >>>
>

Re:Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Maria <li...@126.com>.
The question "kinit: Ticket expired while renewing credentials" has been solved: I can now execute "kinit -R" successfully.
But the error "java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed"
is still there.
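[Editor's note: a hedged debugging suggestion, not from the thread itself. To get past the generic "GSS initiate failed" message, the JDBC test class can be rerun with the JDK's standard Kerberos debug flag, which prints the full ticket negotiation to stdout. The class name below is the one from this thread; the krb5.ini path is a common Windows default and may differ on your machine.]

```shell
# Assemble JVM options for Kerberos debugging; both properties are
# standard Oracle/OpenJDK system properties.
KRB_DEBUG_OPTS="-Dsun.security.krb5.debug=true"
KRB_DEBUG_OPTS="$KRB_DEBUG_OPTS -Djava.security.krb5.conf=C:/Windows/krb5.ini"
# Print the command line to run (in Eclipse, paste the options into the
# run configuration's "VM arguments" box instead).
echo java $KRB_DEBUG_OPTS org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest
```

The debug output shows which krb5 config file was loaded, which KDC was contacted, and which encryption types were negotiated, any of which can explain a GSS failure.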


Re:Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Maria <li...@126.com>.
I saw a mail titled "HCatalog Security". That person's problem was similar to mine, and the reply was:
"This issue goes away after doing a kinit -R".

So I tried the same operation, but it failed:
kinit: Ticket expired while renewing credentials

Yet in my /etc/krb5.conf I have configured this item:
renew_lifetime = 7d

So, can anybody give me some suggestions, please? Thank you.
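[Editor's note: a hedged explanation of why "kinit -R" can fail despite renew_lifetime = 7d. The setting only affects tickets requested after it is in place, and the ticket must have been issued as renewable in the first place; the klist output earlier in this thread shows "renew until" equal to the ticket's start time, which marks it non-renewable. A [libdefaults] fragment that requests renewable tickets could look like the following; the values are illustrative, not taken from the poster's config.]

```ini
# /etc/krb5.conf (client side) -- illustrative values
[libdefaults]
    default_realm = HADOOP.COM
    ticket_lifetime = 24h
    # Request renewable tickets; the effective limit is also capped by
    # the KDC's max_renewable_life for this principal.
    renew_lifetime = 7d
```

After editing, a fresh ticket must be obtained with kinit (or explicitly "kinit -r 7d"); "kinit -R" cannot retroactively make an already-issued non-renewable ticket renewable, and the principal's maximum renewable life on the KDC must be non-zero as well.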


Re:Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Maria <li...@126.com>.

And I can successfully access hiveserver2 from beeline.

I was so confused by this error: "Peer indicated failure: GSS initiate failed".

Can anybody please help me? Any reply will be much appreciated.
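[Editor's note: since beeline works but the Java client does not, one low-tech check is to compare, character by character, the JDBC URL given to beeline with the one the Java code builds. This is an observation on the snippets in this thread, not a confirmed diagnosis: the beeline URL below assumes the realm shown by klist (HADOOP.COM), while the test code earlier in the thread uses HADOOM.COM.]

```shell
# URLs as they appear (or are implied) in this thread.
BEELINE_URL="jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOP.COM"
JAVA_URL="jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM"

# A single-character realm typo is enough to produce "GSS initiate failed",
# because the client then requests a service ticket for a principal the
# KDC does not know.
if [ "$BEELINE_URL" = "$JAVA_URL" ]; then
  echo "URLs match"
else
  echo "URLs differ"
fi
```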

Re:Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Maria <li...@126.com>.
Yes, my hiveserver2 log errors are:

ERROR [Hiveserver2-Handler-Pool: Thread-48]:server.TThreadPoolServer(TThreadPoolServer.java:run(296)) - error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:356)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1608)
    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException:Peer indicated failure: GSS initiate failed
    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
    at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
    at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
 ... 10 more
================================================
So it seems the Windows Hive JDBC client can at least communicate with the hiveserver2, doesn't it?

Meanwhile, I have checked everything I can:
(1)in hiveserver2 node, I execute command "klist",the results are:
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hive/hm@HADOOP.COM

Valid starting    Expires                     Service principal
07/04/16 10:28:14    07/05/16 10:28:14     krbtgt/HADOOP.COM@HADOOP.COM
                 renew until 07/04/16 10:28:14
(2)in windows dos cmd,I execute command "klist",the results are:
Ticket cache:API: 1
Default principal: hive/hm@HADOOP.COM

Valid starting    Expires                     Service principal
07/04/16 10:24:32    07/05/16 10:24:32     krbtgt/HADOOP.COM@HADOOP.COM
                 renew until 07/04/16 10:24:32
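[Editor's note: one detail in both klist outputs above, offered as an observation rather than a confirmed root cause: "renew until" equals "Valid starting", which means the tickets were issued non-renewable. That is exactly why a later "kinit -R" reports "Ticket expired while renewing credentials". A tiny sketch of the check:]

```shell
# Values copied from the hiveserver2 node's klist output above.
VALID_STARTING="07/04/16 10:28:14"
RENEW_UNTIL="07/04/16 10:28:14"

# When the renewable window ends at the ticket's start time, the ticket
# cannot be renewed at all.
if [ "$VALID_STARTING" = "$RENEW_UNTIL" ]; then
  echo "ticket is non-renewable"
fi
```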

 Is there anything else I have to add or set for hiveserver2?

Thanks in advance.


Maria.

At 2016-07-03 04:39:31, "Vivek Shrivastava" <vi...@gmail.com> wrote:
 

Please look at the hiveserver2 log; it will have better error information. You can paste the error from the logs if you need help.


Regards,


Vivek


On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:



Hi,all:

     Recently I attempted to access a Kerberized Hadoop cluster by launching Java applications from a Windows workstation. I have configured Kerberos on my Windows 7 machine, and can successfully access the HDFS web UI on port 50070. But when I use JDBC from Windows to connect to the remote hiveserver, errors occur:

java.sql.SQLException:could not open client transport with JDBC Uri:jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM: GSS initiate failed

     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)

     at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)

     at org.apache.hive.jdbc.HiveDriver.connection(HiveDriver.java:105)

     at java.sql.DriverManager.getConnection(Unknown Source)

     at java.sql.DriverManager.getConnection(Unknown Source)

     at org.apache.hadoop.hive.ql.security.authorization.plugin.KerberosTest.main(KerberosTest.java:41)

Caused by: org.apache.thrift.transport.TTransportException:GSS initiate failed

     at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)

     at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)

     at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)

     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)

     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)

     at java.security.AccessController.doPrivileged(Native Method)

     at javax.security.auth.Subject.doAs(Unknown Source)

     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)

     at  org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)

     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)

... 5 more

------------------------------------------------------------------------------

Below is my test code:



public static void main(String[] args) throws Exception {

    String principal = "hive/hm@HADOOM.COM";

    String keytab = "E:\\Program Files (x86)\\java\\jre7\\lib\\security\\hive.keytab";

    String url = "jdbc:hive2://hm:10000/default;principal=hive/hm@HADOOM.COM";



    Configuration conf = new Configuration();  // not declared in the original snippet
    conf.addResource(new File("hdfs-site.xml").toURI().toURL());

    conf.addResource(new File("core-site.xml").toURI().toURL());

    conf.addResource(new File("yarn-site.xml").toURI().toURL());

    conf.addResource(new File("hive-site.xml").toURI().toURL());



    conf.set("hadoop.security.authentication", "Kerberos");

    UserGroupInformation.setConfiguration(conf);

    UserGroupInformation.loginUserFromKeytab(principal, keytab);



    Class.forName("org.apache.hive.jdbc.HiveDriver");

    Connection conn =DriverManager.getConnection(url);



    Statement stmt = conn.createStatement();

    String sql = "select * from testkerberos";

    ResultSet rs = stmt.executeQuery(sql);

    while (rs.next()) {

       System.out.println(rs.getString(1));

    }

}



Does anyone have the same problem, or know how to solve it?



Thanks in advance.



Maria.




Re: How to access linux kerberosed hive from windows eclipse workspace?

Posted by Vivek Shrivastava <vi...@gmail.com>.
Please look at the hiveserver2 log; it will have better error information.
You can paste the error from the logs if you need help.

Regards,

Vivek

On Sat, Jul 2, 2016 at 5:52 AM, Maria <li...@126.com> wrote:
