Posted to user@hive.apache.org by Ricardo Fajardo <ri...@autodesk.com> on 2017/01/26 06:36:42 UTC

Pls Help me - Hive Kerberos Issue

Hello,


Please, I need your help with Kerberos authentication in Hive.


I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1


But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a ticket with kinit for my user. I also created a keytab file for my user with my password. Please tell me if that is OK.

On the other hand, when I am debugging the Hive code, the operating system user is the one that gets authenticated, but I need to authenticate my Kerberos user. Can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive is searching for the tickets and what Hive is reading?
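
For reference, a minimal sketch of how a JDBC client can log in from a keytab before connecting. The keytab path is a placeholder, and the principals and URL are the ones that appear later in this thread; this is an assumption about the setup, not a confirmed fix:

import java.sql.Connection;
import java.sql.DriverManager;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHiveClient {
    public static void main(String[] args) throws Exception {
        // Switch Hadoop's login machinery to Kerberos; otherwise UGI
        // falls back to SIMPLE auth and uses the OS user.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in from the keytab (the path is a placeholder).
        UserGroupInformation.loginUserFromKeytab(
                "t_fajar@ADS.AUTODESK.COM", "/path/to/t_fajar.keytab");

        // The HiveServer2 principal goes in the JDBC URL.
        String url = "jdbc:hive2://localhost:10000/default;"
                + "principal=hive/_HOST@ADS.AUTODESK.COM";
        try (Connection con = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !con.isClosed());
        }
    }
}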

Thanks so much for your help.

Best regards,
Ricardo.



Re: Pls Help me - Hive Kerberos Issue

Posted by Vivek Shrivastava <vi...@gmail.com>.
The attached file does not look like hive-site.xml. What is the value of
hive.server2.authentication in hive-site.xml? Also, your sasl.qop value
should be one of the three allowed values, not all three.
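
For reference, the relevant hive-site.xml entries usually look like the sketch below; the keytab path is a placeholder, and the principal is the one appearing elsewhere in this thread:

<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@ADS.AUTODESK.COM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/path/to/hive.keytab</value>
</property>
<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>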

On Mon, Jan 30, 2017 at 4:28 PM, Ricardo Fajardo <
ricardo.fajardo@autodesk.com> wrote:

> Attached the hive-site.xml configuration file.
> ------------------------------
> From: Vivek Shrivastava <vi...@gmail.com>
> Sent: Monday, January 30, 2017 4:10:42 PM
>
> To: user@hive.apache.org
> Subject: Re: Pls Help me - Hive Kerberos Issue
>
> If this is working then your Kerberos setup is OK. I suspect the
> HiveServer2 configuration. What is the authentication and security setup in
> your Hive config? Please see if you can attach it.
>
> On Mon, Jan 30, 2017 at 2:33 PM, Ricardo Fajardo <
> ricardo.fajardo@autodesk.com> wrote:
>
>> [cloudera@quickstart bin]$
>> [cloudera@quickstart bin]$ hadoop fs -ls
>> Java config name: null
>> Native config name: /etc/krb5.conf
>> Loaded from native config
>> Found 20 items
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-13 17:51 checkpoint
>> -rw-r--r--   1 cloudera cloudera       3249 2016-05-11 16:19 hadoop.txt
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:15 hadoop2.txt
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:30 hadoop3.txt
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:37 gives
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:06 out1
>> -rw-r--r--   1 cloudera cloudera       3868 2016-06-15 08:39 post.small0.xml
>> drwxr-xr-x   - cloudera cloudera          0 2016-07-14 17:01 tCount1
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 15:57 test1
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:57 test10
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 17:33 test12
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:02 test2
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:24 test3
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:27 test4
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:32 test5
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:37 test6
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:49 test7
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:51 test8
>> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:54 test9
>> -rw-r--r--   1 cloudera cloudera    8481022 2016-06-08 21:51 train.tsv
>> [cloudera@quickstart bin]$
>> [cloudera@quickstart bin]$
>> [cloudera@quickstart bin]$
>> [cloudera@quickstart bin]$ echo $HADOOP_OPTS
>> -Dsun.security.krb5.debug=true
>> [cloudera@quickstart bin]$
>>
>> ------------------------------
>> From: Vivek Shrivastava <vi...@gmail.com>
>> Sent: Monday, January 30, 2017 2:28:53 PM
>>
>> To: user@hive.apache.org
>> Subject: Re: Pls Help me - Hive Kerberos Issue
>>
>> If you are using AES256, then please do update the Java unlimited-strength
>> JCE policy jar files. What is the output of the hadoop ls command after
>> exporting the environment variable below?
>>
>> export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
>> hadoop fs -ls /
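>>
>> (For reference, a quick way to check whether the unlimited-strength policy
>> is in effect, using the standard JCE API; the class name is just for this
>> sketch.)
>>
>> import javax.crypto.Cipher;
>>
>> public class JcePolicyCheck {
>>     public static void main(String[] args) throws Exception {
>>         // Prints 2147483647 when the unlimited-strength policy files are
>>         // installed; 128 under the default restricted policy, in which
>>         // case AES256 Kerberos tickets cannot be decrypted.
>>         System.out.println("Max AES key length: "
>>                 + Cipher.getMaxAllowedKeyLength("AES"));
>>     }
>> }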
>>
>> On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo <
>> ricardo.fajardo@autodesk.com> wrote:
>>
>>> I made the changes, but I am getting the same error.
>>>
>>> Klist:
>>>
>>> [cloudera@quickstart bin]$ klist -fe
>>> Ticket cache: FILE:/tmp/krb5cc_501
>>> Default principal: t_fajar@ADS.AUTODESK.COM
>>>
>>> Valid starting     Expires            Service principal
>>> 01/30/17 11:56:20  01/30/17 21:56:24  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
>>> renew until 01/31/17 11:56:20, Flags: FPRIA
>>> Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac
>>>
>>>
>>> Log:
>>>
>>> [cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
>>> [cloudera@quickstart bin]$
>>> [cloudera@quickstart bin]$
>>> [cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar"
>>> /home/cloudera/workspace/hive/bin/hive: line 99: [: /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar: binary operator expected
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>> Java config name: null
>>> Native config name: /etc/krb5.conf
>>> Loaded from native config
>>> 17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
>>> javax.security.sasl.SaslException: GSS initiate failed
>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_73]
>>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_73]
>>> at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73]
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_73]
>>> at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_73]
>>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
>>> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
>>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
>>> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
>>> at org.apache.hadoop.util.RunJar.run(RunJar.java:221) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:136) [benchmarks.jar:2.2.0-SNAPSHOT]
>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.8.0_73]
>>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_73]
>>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.8.0_73]
>>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_73]
>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_73]
>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_73]
>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_73]
>>> ... 35 more
>>> 17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
>>> HS2 may be unavailable, check server status
>>> Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
>>> Beeline version 2.2.0-SNAPSHOT by Apache Hive
>>> beeline>
>>>
>>>
>>> ------------------------------
>>> From: Vivek Shrivastava <vi...@gmail.com>
>>> Sent: Monday, January 30, 2017 11:34:27 AM
>>>
>>> To: user@hive.apache.org
>>> Subject: Re: Pls Help me - Hive Kerberos Issue
>>>
>>> You can comment both default_tkt_enctypes and default_tgs_enctypes out;
>>> the default value will become aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
>>> des3-cbc-sha1 arcfour-hmac-md5 camellia256-cts-cmac camellia128-cts-cmac
>>> des-cbc-crc des-cbc-md5 des-cbc-md4.
>>> Then do
>>> kdestroy
>>> kinit
>>> klist -fev
>>> your beeline command
>>>
>>> if it still does not work, then paste the output of
>>>
>>> export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
>>> hadoop fs -ls /
>>>
>>>
>>>
>>> On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <
>>> ricardo.fajardo@autodesk.com> wrote:
>>>
>>>> I don't have any particular reason for selecting the arcfour encryption
>>>> type. If I need to change it to make it work, I can do that.
>>>>
>>>> Values from krb5.conf:
>>>>
>>>> [libdefaults]
>>>>         default_realm = ADS.AUTODESK.COM
>>>>         krb4_config = /etc/krb.conf
>>>>         krb4_realms = /etc/krb.realms
>>>>         kdc_timesync = 1
>>>>         ccache_type = 4
>>>>         forwardable = true
>>>>         proxiable = true
>>>>         v4_instance_resolve = false
>>>>         v4_name_convert = {
>>>>                 host = {
>>>>                         rcmd = host
>>>>                         ftp = ftp
>>>>                 }
>>>>                 plain = {
>>>>                         something = something-else
>>>>                 }
>>>>         }
>>>>         fcc-mit-ticketflags = true
>>>>         default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>>>         default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>>>
>>>> [realms]
>>>>
>>>>         ADS.AUTODESK.COM = {
>>>>                 kdc = krb.ads.autodesk.com:88
>>>>                 admin_server = krb.ads.autodesk.com
>>>>                 default_domain = ads.autodesk.com
>>>>                 database_module = openldap_ldapconf
>>>>                 master_key_type = aes256-cts
>>>>                 supported_enctypes = aes256-cts:normal
>>>> aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal
>>>> des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
>>>>                 default_principal_flags = +preauth
>>>>         }
>>>>
>>>> Thanks so much for your help,
>>>> Richard.
>>>> ------------------------------
>>>> From: Vivek Shrivastava <vi...@gmail.com>
>>>> Sent: Monday, January 30, 2017 11:01:24 AM
>>>>
>>>> To: user@hive.apache.org
>>>> Subject: Re: Pls Help me - Hive Kerberos Issue
>>>>
>>>> Any particular reason for selecting the arcfour encryption type? Could you
>>>> please post the default (e.g. enctype) values from krb5.conf?
>>>>
>>>> On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <
>>>> ricardo.fajardo@autodesk.com> wrote:
>>>>
>>>>>
>>>>> 1. klist -fe
>>>>>
>>>>> [cloudera@quickstart bin]$ klist -fe
>>>>> Ticket cache: FILE:/tmp/krb5cc_501
>>>>> Default principal: t_fajar@ADS.AUTODESK.COM
>>>>>
>>>>> Valid starting     Expires            Service principal
>>>>> 01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
>>>>> renew until 01/31/17 10:52:37, Flags: FPRIA
>>>>> Etype (skey, tkt): arcfour-hmac, arcfour-hmac
>>>>> [cloudera@quickstart bin]$
>>>>>
>>>>> 2. relevant entries from HiveServer2 log
>>>>>
>>>>>
>>>>> beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>>>> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>>>> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
>>>>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>>>> 17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
>>>>> 17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
>>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
>>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
>>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
>>>>> 17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
>>>>> 17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
>>>>> 17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
>>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
>>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
>>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
>>>>> 17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>>>> 17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
>>>>> 17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
>>>>> 17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
>>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
>>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
>>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: cloudera
>>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: cloudera" with name cloudera
>>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
>>>>> 17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera (auth:SIMPLE)
>>>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
>>>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in authMethod of kerberos != current.
>>>>> 17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
>>>>> 17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>>>>> 17/01/30 10:55:02 DEBUG TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@1119f7c5
>>>>> 17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
>>>>> javax.security.sasl.SaslException: GSS initiate failed
>>>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[?:1.7.0_67]
>>>>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
>>>>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [libthrift-0.9.3.jar:0.9.3]
>>>>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [libthrift-0.9.3.jar:0.9.3]
>>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [classes/:?]
>>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1) [classes/:?]
>>>>> at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
>>>>> at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
>>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
>>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [classes/:?]
>>>>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [classes/:?]
>>>>> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [classes/:?]
>>>>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
>>>>> at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
>>>>> at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
>>>>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [classes/:?]
>>>>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [classes/:?]
>>>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
>>>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
>>>>> at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
>>>>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [classes/:?]
>>>>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [classes/:?]
>>>>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
>>>>> at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
>>>>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
>>>>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [classes/:?]
>>>>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
>>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>>>>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.7.0_67]
>>>>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[?:1.7.0_67]
>>>>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.7.0_67]
>>>>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) ~[?:1.7.0_67]
>>>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.7.0_67]
>>>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.7.0_67]
>>>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[?:1.7.0_67]
>>>>> ... 29 more
>>>>> 17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
>>>>> 17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
>>>>> HS2 may be unavailable, check server status
>>>>> Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
>>>>> beeline>
>>>>>
>>>>> ------------------------------
>>>>> From: Vivek Shrivastava <vi...@gmail.com>
>>>>> Sent: Monday, January 30, 2017 10:48:35 AM
>>>>> To: user@hive.apache.org
>>>>> Subject: Re: Pls Help me - Hive Kerberos Issue
>>>>>
>>>>> Please paste the output of
>>>>> 1. klist -fe
>>>>> 2. relevant entries from HiveServer2 log
>>>>>
>>>>> On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <
>>>>> ricardo.fajardo@autodesk.com> wrote:
>>>>>
>>>>>> I could not resolve the problem.
>>>>>>
>>>>>>
>>>>>> I have debugged the code and I found out that:
>>>>>>
>>>>>>
>>>>>> 1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class,
>>>>>> at line 208:
>>>>>>
>>>>>> ....
>>>>>>
>>>>>> UserGroupInformation.getCurrentUser().doAs(....
>>>>>>
>>>>>> ..
>>>>>>
>>>>>> This method always returns the user of the operating system, but I need
>>>>>> to authenticate the user set in the property hive.server2.proxy.user=yourid,
>>>>>> because I have a ticket for that one.
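>>>>>>
>>>>>> (For reference, a minimal sketch of the point above: UserGroupInformation
>>>>>> only resolves a Kerberos user when Hadoop security is configured for
>>>>>> kerberos; otherwise it falls back to the OS user. The property name is
>>>>>> standard Hadoop; the rest is an assumption about this setup, not a
>>>>>> confirmed fix.)
>>>>>>
>>>>>> import org.apache.hadoop.conf.Configuration;
>>>>>> import org.apache.hadoop.security.UserGroupInformation;
>>>>>>
>>>>>> public class UgiKerberosCheck {
>>>>>>     public static void main(String[] args) throws Exception {
>>>>>>         Configuration conf = new Configuration();
>>>>>>         // Without this, UGI uses SIMPLE auth and getCurrentUser()
>>>>>>         // returns the operating system user (e.g. "cloudera").
>>>>>>         conf.set("hadoop.security.authentication", "kerberos");
>>>>>>         UserGroupInformation.setConfiguration(conf);
>>>>>>         // With kerberos configured, the login user is resolved from
>>>>>>         // the ticket cache created by kinit.
>>>>>>         System.out.println(UserGroupInformation.getLoginUser());
>>>>>>     }
>>>>>> }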
>>>>>>
>>>>>>
>>>>>> 2. I have found out that hive.server2.proxy.user is implemented in the
>>>>>> org.apache.hive.jdbc.HiveConnection class, in the method openSession(),
>>>>>> but this code is never executed.
>>>>>>
>>>>>>
>>>>>> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
>>>>>> this code in the method getAuthTransFactory():
>>>>>>
>>>>>> ....
>>>>>>
>>>>>>       if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName()))
>>>>>> {
>>>>>>         // no-op
>>>>>> ....
>>>>>>
>>>>>> Does this mean that Kerberos authentication is not implemented?
>>>>>>
>>>>>>
>>>>>>
>>>>>> Please, can anyone help me?
>>>>>>
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Richard.
>>>>>> ------------------------------
>>>>>> From: Dulam, Naresh <na...@bankofamerica.com>
>>>>>> Sent: Thursday, January 26, 2017 8:41:48 AM
>>>>>> To: user@hive.apache.org
>>>>>> Subject: RE: Pls Help me - Hive Kerberos Issue
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> kinit -k -t your.keytab yourid@MY-REALM.COM
>>>>>>
>>>>>>
>>>>>>
>>>>>> # Connect using the following JDBC connection string
>>>>>>
>>>>>> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
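>>>>>>
>>>>>> (For reference, a sketch of the same connection from Java once kinit has
>>>>>> run; the host, realm, and user are the placeholders from the template
>>>>>> above, not real values.)
>>>>>>
>>>>>> import java.sql.Connection;
>>>>>> import java.sql.DriverManager;
>>>>>>
>>>>>> public class ProxyUserConnect {
>>>>>>     public static void main(String[] args) throws Exception {
>>>>>>         // Same URL shape as the beeline examples in this thread; relies
>>>>>>         // on the ticket cache created by kinit for the GSSAPI handshake.
>>>>>>         String url = "jdbc:hive2://myHost.myOrg.com:10000/default;"
>>>>>>                 + "principal=hive/_HOST@MY-REALM.COM;"
>>>>>>                 + "hive.server2.proxy.user=yourid";
>>>>>>         try (Connection con = DriverManager.getConnection(url)) {
>>>>>>             System.out.println("Connected: " + !con.isClosed());
>>>>>>         }
>>>>>>     }
>>>>>> }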
>>>>>>
>>>>>> From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
>>>>>> Sent: Thursday, January 26, 2017 1:37 AM
>>>>>> To: user@hive.apache.org
>>>>>> Subject: Pls Help me - Hive Kerberos Issue
>>>>>>
>>>>>>
>>>>>>
>>>>>> Hello,
>>>>>>
>>>>>>
>>>>>>
>>>>>> Please, I need your help with Kerberos authentication in Hive.
>>>>>>
>>>>>>
>>>>>>
>>>>>> I am following this guide:
>>>>>>
>>>>>> https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1
>>>>>>
>>>>>> But I am getting this error:
>>>>>>
>>>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>>>>
>>>>>>
>>>>>>
>>>>>> I have a remote Kerberos server and I can generate a ticket with kinit
>>>>>> for my user. I also created a keytab file for my user with my password.
>>>>>> Please tell me if that is OK.
>>>>>>
>>>>>>
>>>>>>
>>>>>> On the other hand, when I am debugging the Hive code, the operating
>>>>>> system user is the one that gets authenticated, but I need to authenticate
>>>>>> my Kerberos user. Can you tell me how I can achieve that? How can I store
>>>>>> my tickets where Hive can load them? Or how can I verify where Hive is
>>>>>> searching for the tickets and what Hive is reading?
>>>>>>
>>>>>>
>>>>>>
>>>>>> Thanks so much for your help.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Richard.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Pls Help me - Hive Kerberos Issue

Posted by Ricardo Fajardo <ri...@autodesk.com>.
Attached the hive-site.xml configuration file.

Re: Pls Help me - Hive Kerberos Issue

Posted by Vivek Shrivastava <vi...@gmail.com>.
If this is working then your Kerberos setup is OK. I suspect the
HiveServer2 configuration. What is the authentication and security setup in
your Hive config? Please see if you can attach it.

On Mon, Jan 30, 2017 at 2:33 PM, Ricardo Fajardo <
ricardo.fajardo@autodesk.com> wrote:

> [cloudera@quickstart bin]$
> [cloudera@quickstart bin]$ hadoop fs -ls
> Java config name: null
> Native config name: /etc/krb5.conf
> Loaded from native config
> Found 20 items
> drwxr-xr-x   - cloudera cloudera          0 2016-06-13 17:51 checkpoint
> -rw-r--r--   1 cloudera cloudera       3249 2016-05-11 16:19 hadoop.txt
> drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:15 hadoop2.txt
> drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:30 hadoop3.txt
> drwxr-xr-x - cloudera cloudera 0 2016-06-16 16:37 gives
> drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:06 out1
> -rw-r--r--   1 cloudera cloudera       3868 2016-06-15 08:39
> post.small0.xml
> drwxr-xr-x   - cloudera cloudera          0 2016-07-14 17:01 tCount1
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 15:57 test1
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:57 test10
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 17:33 test12
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:02 test2
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:24 test3
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:27 test4
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:32 test5
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:37 test6
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:49 test7
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:51 test8
> drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:54 test9
> -rw-r--r--   1 cloudera cloudera    8481022 2016-06-08 21:51 train.tsv
> [cloudera@quickstart bin]$
> [cloudera@quickstart bin]$
> [cloudera@quickstart bin]$
> [cloudera@quickstart bin]$ echo $HADOOP_OPTS
> -Dsun.security.krb5.debug=true
> [cloudera@quickstart bin]$
>
> ------------------------------
> *From:* Vivek Shrivastava <vi...@gmail.com>
> *Sent:* Monday, January 30, 2017 2:28:53 PM
>
> *To:* user@hive.apache.org
> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>
> If you are using AES256, then please do update the Java (JCE) unlimited
> strength policy jar files. What is the output of the hadoop ls command after
> exporting the environment variable below?
>
> export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
> hadoop fs -ls /
>
> On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo <
> ricardo.fajardo@autodesk.com> wrote:
>
>> I made the changes, but I am getting the same error.
>>
>> Klist:
>>
>> [cloudera@quickstart bin]$ klist -fe
>> Ticket cache: FILE:/tmp/krb5cc_501
>> Default principal: t_fajar@ADS.AUTODESK.COM
>>
>> Valid starting     Expires            Service principal
>> 01/30/17 11:56:20  01/30/17 21:56:24  krbtgt/ADS.AUTODESK.COM@ADS.A
>> UTODESK.COM
>> renew until 01/31/17 11:56:20, Flags: FPRIA
>> Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac
>>
>>
>> Log:
>>
>> [cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.kr
>> b5.debug=true"
>> [cloudera@quickstart bin]$
>> [cloudera@quickstart bin]$
>> [cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/
>> default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.p
>> roxy.user=t_fajar"
>> /home/cloudera/workspace/hive/bin/hive: line 99: [:
>> /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar:
>> binary operator expected
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/cloudera/works
>> pace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/cloudera/works
>> pace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/
>> slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/cloudera/works
>> pace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/
>> slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/cloudera/works
>> pace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/
>> slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/l
>> ib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_
>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>> Java config name: null
>> Native config name: /etc/krb5.conf
>> Loaded from native config
>> 17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL
>> negotiation failure
>> javax.security.sasl.SaslException: GSS initiate failed
>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
>> ~[?:1.8.0_73]
>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslS
>> tartMessage(TSaslClientTransport.java:94) ~[benchmarks.jar:2.2.0-SNAPSHO
>> T]
>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>> [benchmarks.jar:2.2.0-SNAPSHOT]
>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>> [benchmarks.jar:2.2.0-SNAPSHOT]
>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>> .run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT]
>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>> .run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
>> at java.security.AccessController.doPrivileged(Native Method)
>> ~[?:1.8.0_73]
>> at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73]
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>> [benchmarks.jar:2.2.0-SNAPSHOT]
>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.o
>> pen(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227)
>> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182)
>> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
>> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at java.sql.DriverManager.getConnection(DriverManager.java:664)
>> [?:1.8.0_73]
>> at java.sql.DriverManager.getConnection(DriverManager.java:208)
>> [?:1.8.0_73]
>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> ~[?:1.8.0_73]
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> ~[?:1.8.0_73]
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> ~[?:1.8.0_73]
>> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494)
>> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> ~[?:1.8.0_73]
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> ~[?:1.8.0_73]
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> ~[?:1.8.0_73]
>> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
>> at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>> [benchmarks.jar:2.2.0-SNAPSHOT]
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>> [benchmarks.jar:2.2.0-SNAPSHOT]
>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>> (Mechanism level: Failed to find any Kerberos tgt)
>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>> ~[?:1.8.0_73]
>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
>> ~[?:1.8.0_73]
>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>> ~[?:1.8.0_73]
>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
>> ~[?:1.8.0_73]
>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>> ~[?:1.8.0_73]
>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>> ~[?:1.8.0_73]
>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
>> ~[?:1.8.0_73]
>> ... 35 more
>> 17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to
>> localhost:10000
>> HS2 may be unavailable, check server status
>> Error: Could not open client transport with JDBC Uri:
>> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD
>> S.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed
>> (state=08S01,code=0)
>> Beeline version 2.2.0-SNAPSHOT by Apache Hive
>> beeline>
>>
>>
>> ------------------------------
>> *From:* Vivek Shrivastava <vi...@gmail.com>
>> *Sent:* Monday, January 30, 2017 11:34:27 AM
>>
>> *To:* user@hive.apache.org
>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>
> You can comment both default_tkt_enctypes and default_tgs_enctypes out,
> the default value will become aes256-cts-hmac-sha1-96
> aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5
> camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5 des-cbc-md4.
>>
>> Then do
>> kdestroy
>> kinit
>> klist -fev
>> your beeline command
>>
> if it still does not work, then paste the output of
>>
>> export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
>> hadoop fs -ls /
>>
>>
>>
>> On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <
>> ricardo.fajardo@autodesk.com> wrote:
>>
>>> I don't have any particular reason for selecting the arcfour encryption
>>> type. If changing it will make this work, I can do that.
>>>
>>> Values from krb5.conf:
>>>
>>> [libdefaults]
>>>         default_realm = ADS.AUTODESK.COM
>>>         krb4_config = /etc/krb.conf
>>>         krb4_realms = /etc/krb.realms
>>>         kdc_timesync = 1
>>>         ccache_type = 4
>>>         forwardable = true
>>>         proxiable = true
>>>         v4_instance_resolve = false
>>>         v4_name_convert = {
>>>                 host = {
>>>                         rcmd = host
>>>                         ftp = ftp
>>>                 }
>>>                 plain = {
>>>                         something = something-else
>>>                 }
>>>         }
>>>         fcc-mit-ticketflags = true
>>>         default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>>         default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5
>>> AES256-CTS
>>>
>>> [realms]
>>>
>>>         ADS.AUTODESK.COM = {
>>>                 kdc = krb.ads.autodesk.com:88
>>>                 admin_server = krb.ads.autodesk.com
>>>                 default_domain = ads.autodesk.com
>>>                 database_module = openldap_ldapconf
>>>                 master_key_type = aes256-cts
>>>                 supported_enctypes = aes256-cts:normal aes128-cts:normal
>>> des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal
>>> des-cbc-md5:normal des-cbc-crc:normal
>>>                 default_principal_flags = +preauth
>>>         }
>>>
>>> Thanks so much for your help,
>>> Ricardo.
>>> ------------------------------
>>> *From:* Vivek Shrivastava <vi...@gmail.com>
>>> *Sent:* Monday, January 30, 2017 11:01:24 AM
>>>
>>> *To:* user@hive.apache.org
>>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>>
>>> Any particular reason for selecting the arcfour encryption type? Could you
>>> please post the default (e.g. enctype) values from krb5.conf?
>>>
>>> On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <
>>> ricardo.fajardo@autodesk.com> wrote:
>>>
>>>>
>>>> 1. klist -fe
>>>>
>>>> [cloudera@quickstart bin]$ klist -fe
>>>> Ticket cache: FILE:/tmp/krb5cc_501
>>>> Default principal: t_fajar@ADS.AUTODESK.COM
>>>>
>>>> Valid starting     Expires            Service principal
>>>> 01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.A
>>>> UTODESK.COM
>>>> renew until 01/31/17 10:52:37, Flags: FPRIA
>>>> Etype (skey, tkt): arcfour-hmac, arcfour-hmac
>>>> [cloudera@quickstart bin]$
>>>>
>>>> 2. relevant entries from HiveServer2 log
>>>>
>>>>
>>>> beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_
>>>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>>> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD
>>>> S.
>>>> AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>>>> epository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/lo
>>>> g4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>>>> epository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.
>>>> jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>>>> epository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.1
>>>> 0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>> explanation.
>>>> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4
>>>> jLoggerFactory]
>>>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_
>>>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>>> 17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
>>>> 17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>>>> org.apache.hadoop.metrics2.lib.MutableRate
>>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
>>>> with annotation @org.apache.hadoop.metrics2.an
>>>> notation.Metric(valueName=Time, value=[Rate of successful kerberos
>>>> logins and latency (milliseconds)], about=, type=DEFAULT, always=false,
>>>> sampleName=Ops)
>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>>>> org.apache.hadoop.metrics2.lib.MutableRate
>>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
>>>> with annotation @org.apache.hadoop.metrics2.an
>>>> notation.Metric(valueName=Time, value=[Rate of failed kerberos logins
>>>> and latency (milliseconds)], about=, type=DEFAULT, always=false,
>>>> sampleName=Ops)
>>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>>>> org.apache.hadoop.metrics2.lib.MutableRate
>>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups
>>>> with annotation @org.apache.hadoop.metrics2.an
>>>> notation.Metric(valueName=Time, value=[GetGroups], about=,
>>>> type=DEFAULT, always=false, sampleName=Ops)
>>>> 17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group
>>>> related metrics
>>>> 17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
>>>> 17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the
>>>> custom-built native-hadoop library...
>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop
>>>> with error: java.lang.UnsatisfiedLinkError: no hadoop in
>>>> java.library.path
>>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/pa
>>>> ckages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
>>>> 17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop
>>>> library for your platform... using builtin-java classes where applicable
>>>> 17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
>>>> 17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
>>>> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
>>>> 17/01/27 16:16:38 DEBUG Groups: Group mapping
>>>> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
>>>> cacheTimeout=300000; warningDeltaMs=5000
>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: using local
>>>> user:UnixPrincipal: cloudera
>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: Using user:
>>>> "UnixPrincipal: cloudera" with name cloudera
>>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
>>>> 17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera
>>>> (auth:SIMPLE)
>>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod =
>>>> SIMPLE
>>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as
>>>> passed-in authMethod of kerberos != current.
>>>> 17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction
>>>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th
>>>> rift.HadoopThriftAuthBridge$Client.createClientTransport(Had
>>>> oopThriftAuthBridge.java:208)
>>>> 17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction
>>>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th
>>>> rift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>>>> 17/01/30 10:55:02 DEBUG TSaslTransport: opening transport
>>>> org.apache.thrift.transport.TSaslClientTransport@1119f7c5
>>>> 17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
>>>> javax.security.sasl.SaslException: GSS initiate failed
>>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>>> ~[?:1.7.0_67]
>>>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslS
>>>> tartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
>>>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>>>> [libthrift-0.9.3.jar:0.9.3]
>>>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>>>> [libthrift-0.9.3.jar:0.9.3]
>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>>>> .run(TUGIAssumingTransport.java:52) [classes/:?]
>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>>>> .run(TUGIAssumingTransport.java:1) [classes/:?]
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> ~[?:1.7.0_67]
>>>> at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>>>> [hadoop-common-2.7.2.jar:?]
>>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.o
>>>> pen(TUGIAssumingTransport.java:49) [classes/:?]
>>>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227)
>>>> [classes/:?]
>>>> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182)
>>>> [classes/:?]
>>>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
>>>> [classes/:?]
>>>> at java.sql.DriverManager.getConnection(DriverManager.java:571)
>>>> [?:1.7.0_67]
>>>> at java.sql.DriverManager.getConnection(DriverManager.java:187)
>>>> [?:1.7.0_67]
>>>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419)
>>>> [classes/:?]
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> ~[?:1.7.0_67]
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> ~[?:1.7.0_67]
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> ~[?:1.7.0_67]
>>>> at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
>>>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
>>>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511)
>>>> [classes/:?]
>>>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>>> ~[?:1.7.0_67]
>>>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>>> ~[?:1.7.0_67]
>>>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>>> ~[?:1.7.0_67]
>>>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>>> ~[?:1.7.0_67]
>>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>>> ~[?:1.7.0_67]
>>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>>> ~[?:1.7.0_67]
>>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>>> ~[?:1.7.0_67]
>>>> ... 29 more
>>>> 17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with
>>>> status BAD and payload length 19
>>>> 17/01/30 10:55:02 WARN HiveConnection: Failed to connect to
>>>> localhost:10000
>>>> HS2 may be unavailable, check server status
>>>> Error: Could not open client transport with JDBC Uri:
>>>> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD
>>>> S.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed
>>>> (state=08S01,code=0)
>>>> beeline>
>>>>
>>>> ------------------------------
>>>> *From:* Vivek Shrivastava <vi...@gmail.com>
>>>> *Sent:* Monday, January 30, 2017 10:48:35 AM
>>>> *To:* user@hive.apache.org
>>>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>>>
>>>> Please paste the output of
>>>> 1. klist -fe
>>>> 2. relevant entries from HiveServer2 log
>>>>
>>>> On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <
>>>> ricardo.fajardo@autodesk.com> wrote:
>>>>
>>>>> I could not resolve the problem.
>>>>>
>>>>>
>>>>> I have debugged the code and I found out that:
>>>>>
>>>>>
>>>>> 1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at
>>>>> line 208:
>>>>>
>>>>> ....
>>>>>
>>>>> UserGroupInformation.getCurrentUser().doAs(....
>>>>>
>>>>> ..
>>>>>
>>>>> This method always returns the operating system user, but I need to
>>>>> authenticate the user set in the property hive.server2.proxy.user=yourid,
>>>>> because I have a ticket for that one.
>>>>>
>>>>>
>>>>> 2. I have found that hive.server2.proxy.user is handled in the
>>>>> org.apache.hive.jdbc.HiveConnection class, in the method openSession(),
>>>>> but this code is never executed.
>>>>>
>>>>>
>>>>> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
>>>>> this code in the method getAuthTransFactory():
>>>>>
>>>>> ....
>>>>>
>>>>>       if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName()))
>>>>> {
>>>>>         // no-op
>>>>> ....
>>>>>
>>>>> Does this mean that Kerberos authentication is not implemented?
>>>>>
>>>>>
>>>>>
>>>>> Can anyone help me, please?
>>>>>
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Ricardo.
>>>>> ------------------------------
>>>>> *From:* Dulam, Naresh <na...@bankofamerica.com>
>>>>> *Sent:* Thursday, January 26, 2017 8:41:48 AM
>>>>> *To:* user@hive.apache.org
>>>>> *Subject:* RE: Pls Help me - Hive Kerberos Issue
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> kinit -k -t your.keytab yourid@MY-REALM.COM
>>>>>
>>>>>
>>>>>
>>>>> # Connect using the following JDBC connection string
>>>>>
>>>>> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_
>>>>> HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> *From:* Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
>>>>> *Sent:* Thursday, January 26, 2017 1:37 AM
>>>>> *To:* user@hive.apache.org
>>>>> *Subject:* Pls Help me - Hive Kerberos Issue
>>>>>
>>>>>
>>>>>
>>>>> Hello,
>>>>>
>>>>>
>>>>>
>>>>> Please, I need your help with Kerberos authentication in Hive.
>>>>>
>>>>>
>>>>>
>>>>> I am following this guide:
>>>>>
>>>>> https://www.cloudera.com/documentation/enterprise/5-4-x/topi
>>>>> cs/cdh_sg_hiveserver2_security.html#topic_9_1_1
>>>>>
>>>>> But I am getting this error:
>>>>>
>>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>>>
>>>>>
>>>>>
>>>>> I have a remote Kerberos server and I can generate a ticket with kinit
>>>>> for my user. I created a keytab file with my password for my user. Please
>>>>> tell me if that is correct.
>>>>>
>>>>>
>>>>>
>>>>> On the other hand, when I debug the Hive code, the operating system
>>>>> user is the one authenticated, but I need to authenticate my Kerberos
>>>>> user. Can you tell me how I can achieve that? How can I store my tickets
>>>>> where Hive can load them? Or how can I verify where Hive searches for the
>>>>> tickets and what Hive is reading?
>>>>>
>>>>>
>>>>>
>>>>> Thanks so much for your help.
>>>>>
>>>>>
>>>>>
>>>>> Best regards,
>>>>>
>>>>> Ricardo.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>

Re: Pls Help me - Hive Kerberos Issue

Posted by Ricardo Fajardo <ri...@autodesk.com>.
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ hadoop fs -ls
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Found 20 items
drwxr-xr-x   - cloudera cloudera          0 2016-06-13 17:51 checkpoint
-rw-r--r--   1 cloudera cloudera       3249 2016-05-11 16:19 hadoop.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:15 hadoop2.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:30 hadoop3.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:37 gives
drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:06 out1
-rw-r--r--   1 cloudera cloudera       3868 2016-06-15 08:39 post.small0.xml
drwxr-xr-x   - cloudera cloudera          0 2016-07-14 17:01 tCount1
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 15:57 test1
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:57 test10
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 17:33 test12
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:02 test2
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:24 test3
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:27 test4
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:32 test5
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:37 test6
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:49 test7
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:51 test8
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:54 test9
-rw-r--r--   1 cloudera cloudera    8481022 2016-06-08 21:51 train.tsv
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ echo $HADOOP_OPTS
-Dsun.security.krb5.debug=true
[cloudera@quickstart bin]$


________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 2:28:53 PM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

If you are using AES256, then please do update the Java (JCE) unlimited strength policy jar files. What is the output of the hadoop ls command after exporting the environment variable below?

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /

On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo <ri...@autodesk.com> wrote:

I made the changes, but I am getting the same error.

Klist:

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 11:56:20  01/30/17 21:56:24  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
renew until 01/31/17 11:56:20, Flags: FPRIA
Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac


Log:

[cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar"
/home/cloudera/workspace/hive/bin/hive: line 99: [: /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar: binary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_73]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_73]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_73]
at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_73]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) [benchmarks.jar:2.2.0-SNAPSHOT]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.8.0_73]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_73]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.8.0_73]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_73]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_73]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_73]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_73]
... 35 more
17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
Beeline version 2.2.0-SNAPSHOT by Apache Hive
beeline>



________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 11:34:27 AM

To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

You can comment both default_tkt_enctypes and default_tgs_enctypes out; the default value will then become aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5 camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5 des-cbc-md4 (see the sketch after the commands below).
Then do
kdestroy
kinit
klist -fev
your beeline command

if it still does not work, then paste the output of

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /
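For reference, a minimal sketch of what the [libdefaults] section would look
like after that change, assuming the rest of the file stays as posted; the
two enctype lines are simply commented out so the library falls back to its
built-in list:

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        forwardable = true
        proxiable = true
        # commented out so the built-in enctype defaults apply
        # default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
        # default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS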



On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <ri...@autodesk.com> wrote:

I don't have any particular reason for selecting the arcfour encryption type. If changing it will make this work, I can do that.

Values from krb5.conf:

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        krb4_config = /etc/krb.conf
        krb4_realms = /etc/krb.realms
        kdc_timesync = 1
        ccache_type = 4
        forwardable = true
        proxiable = true
        v4_instance_resolve = false
        v4_name_convert = {
                host = {
                        rcmd = host
                        ftp = ftp
                }
                plain = {
                        something = something-else
                }
        }
        fcc-mit-ticketflags = true
        default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
        default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS

[realms]

        ADS.AUTODESK.COM = {
                kdc = krb.ads.autodesk.com:88
                admin_server = krb.ads.autodesk.com
                default_domain = ads.autodesk.com
                database_module = openldap_ldapconf
                master_key_type = aes256-cts
                supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
                default_principal_flags = +preauth
        }

Thanks so much for your help,
Ricardo.
________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 11:01:24 AM

To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Any particular reason for selecting the arcfour encryption type? Could you please post the default (e.g. enctype) values from krb5.conf?

On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <ri...@autodesk.com> wrote:

1. klist -fe

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
renew until 01/31/17 10:52:37, Flags: FPRIA
Etype (skey, tkt): arcfour-hmac, arcfour-hmac
[cloudera@quickstart bin]$

2. relevant entries from HiveServer2 log


beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
!connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.
AUTODESK.COM;hive.server2.proxy.user=t_fajar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: cloudera" with name cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera (auth:SIMPLE)
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in authMethod of kerberos != current.
17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/01/30 10:55:02 DEBUG TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@1119f7c5
17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[?:1.7.0_67]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [libthrift-0.9.3.jar:0.9.3]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [classes/:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1) [classes/:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [classes/:?]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [classes/:?]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [classes/:?]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [classes/:?]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [classes/:?]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.7.0_67]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.7.0_67]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[?:1.7.0_67]
... 29 more
17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
beeline>


________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 10:48:35 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Please paste the output of
1. klist -fe
2. relevant entries from HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <ri...@autodesk.com> wrote:

I could not resolve the problem.


I have debugged the code and I found out that:


1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at line 208:

....

UserGroupInformation.getCurrentUser().doAs(....

..

This method always returns the operating system user, but I need to authenticate the user set in the property hive.server2.proxy.user=yourid, because I have a ticket for that one.


2. I have found that hive.server2.proxy.user is handled in the org.apache.hive.jdbc.HiveConnection class, in the method openSession(), but this code is never executed.


3. In the org.apache.hive.service.auth.HiveAuthFactory class there is this code in the method getAuthTransFactory():

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....


Does this mean that Kerberos authentication is not implemented?



Can anyone help me, please?


Thanks,

Ricardo.
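
As a minimal sketch of point 1 above (not code from the Hive tree): a
client-side JVM can be made to log in from a keytab before opening the
connection, so that UserGroupInformation.getCurrentUser() carries a Kerberos
identity instead of the local OS (SIMPLE) one. The principal and keytab path
are placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLoginSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Switch the Hadoop client libraries from SIMPLE to Kerberos.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Log in from the keytab; afterwards getCurrentUser() returns the
        // Kerberos principal rather than the local OS account.
        UserGroupInformation.loginUserFromKeytab(
                "yourid@MY-REALM.COM", "/path/to/your.keytab");
        UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
        System.out.println(ugi.getUserName() + " via "
                + ugi.getAuthenticationMethod());
    }
}

For beeline itself, the usual equivalent is running kinit first so that the
SASL/GSSAPI layer can find a TGT in the ticket cache.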

________________________________
From: Dulam, Naresh <na...@bankofamerica.com>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue


kinit -k -t your.keytab yourid@MY-REALM.COM

# Connect using the following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid






From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please, I need your help with Kerberos authentication in Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a ticket with kinit for my user. I created a keytab file with my password for my user. Please tell me if that is correct.

On the other hand, when I debug the Hive code, the operating system user is the one authenticated, but I need to authenticate my Kerberos user. Can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive searches for the tickets and what Hive is reading?

Thanks so much for your help.

Best regards,
Ricardo.







Re: Pls Help me - Hive Kerberos Issue

Posted by Vivek Shrivastava <vi...@gmail.com>.
If you are using AES256, then please do update the Java (JCE) unlimited
strength policy jar files. What is the output of the hadoop ls command after
exporting the environment variable below?

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /

On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo <
ricardo.fajardo@autodesk.com> wrote:

> I made the changes, but I am getting the same error.
>
> Klist:
>
> [cloudera@quickstart bin]$ klist -fe
> Ticket cache: FILE:/tmp/krb5cc_501
> Default principal: t_fajar@ADS.AUTODESK.COM
>
> Valid starting     Expires            Service principal
> 01/30/17 11:56:20  01/30/17 21:56:24  krbtgt/ADS.AUTODESK.COM@ADS.
> AUTODESK.COM
> renew until 01/31/17 11:56:20, Flags: FPRIA
> Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac
>
>
> Log:
>
> [cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.
> krb5.debug=true"
> [cloudera@quickstart bin]$
> [cloudera@quickstart bin]$
> [cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/
> default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.
> proxy.user=t_fajar"
> /home/cloudera/workspace/hive/bin/hive: line 99: [:
> /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar:
> binary operator expected
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/cloudera/
> workspace/hive/lib/benchmarks.jar!/org/slf4j/impl/
> StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/cloudera/
> workspace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!
> /org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-
> assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-
> examples-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/
> lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@
> ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
> Java config name: null
> Native config name: /etc/krb5.conf
> Loaded from native config
> 17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL negotiation
> failure
> javax.security.sasl.SaslException: GSS initiate failed
> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
> ~[?:1.8.0_73]
> at org.apache.thrift.transport.TSaslClientTransport.
> handleSaslStartMessage(TSaslClientTransport.java:94)
> ~[benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> [benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> [benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$
> 1.run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$
> 1.run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
> at java.security.AccessController.doPrivileged(Native Method)
> ~[?:1.8.0_73]
> at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73]
> at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1657) [benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.
> open(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227)
> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182)
> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
> [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at java.sql.DriverManager.getConnection(DriverManager.java:664)
> [?:1.8.0_73]
> at java.sql.DriverManager.getConnection(DriverManager.java:208)
> [?:1.8.0_73]
> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.Commands.connect(Commands.java:1524)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.Commands.connect(Commands.java:1419)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[?:1.8.0_73]
> at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(
> ReflectiveCommandHandler.java:56) [hive-beeline-2.2.0-SNAPSHOT.
> jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494)
> [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[?:1.8.0_73]
> at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
> at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
> at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> [benchmarks.jar:2.2.0-SNAPSHOT]
> at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> [benchmarks.jar:2.2.0-SNAPSHOT]
> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
> (Mechanism level: Failed to find any Kerberos tgt)
> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> ~[?:1.8.0_73]
> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
> ~[?:1.8.0_73]
> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> ~[?:1.8.0_73]
> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
> ~[?:1.8.0_73]
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> ~[?:1.8.0_73]
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> ~[?:1.8.0_73]
> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
> ~[?:1.8.0_73]
> ... 35 more
> 17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to
> localhost:10000
> HS2 may be unavailable, check server status
> Error: Could not open client transport with JDBC Uri:
> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM
> ;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
> Beeline version 2.2.0-SNAPSHOT by Apache Hive
> beeline>
>
>
> ------------------------------
> *From:* Vivek Shrivastava <vi...@gmail.com>
> *Sent:* Monday, January 30, 2017 11:34:27 AM
>
> *To:* user@hive.apache.org
> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>
> You can comment both default_tkt_enctypes and default_tgs_enctypes out;
> the default value will become aes256-cts-hmac-sha1-96
> aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5
> camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5 des-cbc-md4.
>
> Then do
> kdestroy
> kinit
> klist -fev
> your beeline command
>
> If it still does not work, then paste the output of
>
> export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
> hadoop fs -ls /
>
>
>
> On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <
> ricardo.fajardo@autodesk.com> wrote:
>
>> I don't have any particular reason for selecting the arcfour encryption
>> type. If changing it will make this work, I can do that.
>>
>> Values from krb5.conf:
>>
>> [libdefaults]
>>         default_realm = ADS.AUTODESK.COM
>>         krb4_config = /etc/krb.conf
>>         krb4_realms = /etc/krb.realms
>>         kdc_timesync = 1
>>         ccache_type = 4
>>         forwardable = true
>>         proxiable = true
>>         v4_instance_resolve = false
>>         v4_name_convert = {
>>                 host = {
>>                         rcmd = host
>>                         ftp = ftp
>>                 }
>>                 plain = {
>>                         something = something-else
>>                 }
>>         }
>>         fcc-mit-ticketflags = true
>>         default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>         default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>>
>> [realms]
>>
>>         ADS.AUTODESK.COM = {
>>                 kdc = krb.ads.autodesk.com:88
>>                 admin_server = krb.ads.autodesk.com
>>                 default_domain = ads.autodesk.com
>>                 database_module = openldap_ldapconf
>>                 master_key_type = aes256-cts
>>                 supported_enctypes = aes256-cts:normal aes128-cts:normal
>> des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal
>> des-cbc-md5:normal des-cbc-crc:normal
>>                 default_principal_flags = +preauth
>>         }
>>
>> Thanks so much for your help,
>> Ricardo.
>> ------------------------------
>> *From:* Vivek Shrivastava <vi...@gmail.com>
>> *Sent:* Monday, January 30, 2017 11:01:24 AM
>>
>> *To:* user@hive.apache.org
>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>
>> Any particular reason for selecting arcfour encryption type? Could you
>> please post defaults (e.g enc_type) values from krb5.conf
>>
>> On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <
>> ricardo.fajardo@autodesk.com> wrote:
>>
>>>
>>> 1. klist -fe
>>>
>>> [cloudera@quickstart bin]$ klist -fe
>>> Ticket cache: FILE:/tmp/krb5cc_501
>>> Default principal: t_fajar@ADS.AUTODESK.COM
>>>
>>> Valid starting     Expires            Service principal
>>> 01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.A
>>> UTODESK.COM
>>> renew until 01/31/17 10:52:37, Flags: FPRIA
>>> Etype (skey, tkt): arcfour-hmac, arcfour-hmac
>>> [cloudera@quickstart bin]$
>>>
>>> 2. relevant entries from HiveServer2 log
>>>
>>>
>>> beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_
>>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.
>>> AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>>> epository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/lo
>>> g4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>>> epository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.
>>> jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>>> epository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.1
>>> 0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4
>>> jLoggerFactory]
>>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_
>>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>>> 17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
>>> 17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>>> org.apache.hadoop.metrics2.lib.MutableRate
>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
>>> with annotation @org.apache.hadoop.metrics2.an
>>> notation.Metric(valueName=Time, value=[Rate of successful kerberos
>>> logins and latency (milliseconds)], about=, type=DEFAULT, always=false,
>>> sampleName=Ops)
>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>>> org.apache.hadoop.metrics2.lib.MutableRate
>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
>>> with annotation @org.apache.hadoop.metrics2.an
>>> notation.Metric(valueName=Time, value=[Rate of failed kerberos logins
>>> and latency (milliseconds)], about=, type=DEFAULT, always=false,
>>> sampleName=Ops)
>>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>>> org.apache.hadoop.metrics2.lib.MutableRate
>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups
>>> with annotation @org.apache.hadoop.metrics2.an
>>> notation.Metric(valueName=Time, value=[GetGroups], about=,
>>> type=DEFAULT, always=false, sampleName=Ops)
>>> 17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group
>>> related metrics
>>> 17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
>>> 17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the
>>> custom-built native-hadoop library...
>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop
>>> with error: java.lang.UnsatisfiedLinkError: no hadoop in
>>> java.library.path
>>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/pa
>>> ckages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
>>> 17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop
>>> library for your platform... using builtin-java classes where applicable
>>> 17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
>>> 17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
>>> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
>>> 17/01/27 16:16:38 DEBUG Groups: Group mapping
>>> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
>>> cacheTimeout=300000; warningDeltaMs=5000
>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: using local
>>> user:UnixPrincipal: cloudera
>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: Using user:
>>> "UnixPrincipal: cloudera" with name cloudera
>>> 17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
>>> 17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera
>>> (auth:SIMPLE)
>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod =
>>> SIMPLE
>>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as
>>> passed-in authMethod of kerberos != current.
>>> 17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction
>>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th
>>> rift.HadoopThriftAuthBridge$Client.createClientTransport(Had
>>> oopThriftAuthBridge.java:208)
>>> 17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction
>>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th
>>> rift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>>> 17/01/30 10:55:02 DEBUG TSaslTransport: opening transport
>>> org.apache.thrift.transport.TSaslClientTransport@1119f7c5
>>> 17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
>>> javax.security.sasl.SaslException: GSS initiate failed
>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>>> ~[?:1.7.0_67]
>>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslS
>>> tartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
>>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>>> [libthrift-0.9.3.jar:0.9.3]
>>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>>> [libthrift-0.9.3.jar:0.9.3]
>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>>> .run(TUGIAssumingTransport.java:52) [classes/:?]
>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>>> .run(TUGIAssumingTransport.java:1) [classes/:?]
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> ~[?:1.7.0_67]
>>> at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>>> [hadoop-common-2.7.2.jar:?]
>>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.o
>>> pen(TUGIAssumingTransport.java:49) [classes/:?]
>>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227)
>>> [classes/:?]
>>> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182)
>>> [classes/:?]
>>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
>>> [classes/:?]
>>> at java.sql.DriverManager.getConnection(DriverManager.java:571)
>>> [?:1.7.0_67]
>>> at java.sql.DriverManager.getConnection(DriverManager.java:187)
>>> [?:1.7.0_67]
>>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
>>> [classes/:?]
>>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
>>> [classes/:?]
>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524)
>>> [classes/:?]
>>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419)
>>> [classes/:?]
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> ~[?:1.7.0_67]
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> ~[?:1.7.0_67]
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> ~[?:1.7.0_67]
>>> at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
>>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
>>> [classes/:?]
>>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127)
>>> [classes/:?]
>>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166)
>>> [classes/:?]
>>> at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999)
>>> [classes/:?]
>>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
>>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511)
>>> [classes/:?]
>>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>> (Mechanism level: Failed to find any Kerberos tgt)
>>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>> ~[?:1.7.0_67]
>>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>>> ~[?:1.7.0_67]
>>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>> ~[?:1.7.0_67]
>>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>>> ~[?:1.7.0_67]
>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>> ~[?:1.7.0_67]
>>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>> ~[?:1.7.0_67]
>>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>>> ~[?:1.7.0_67]
>>> ... 29 more
>>> 17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with
>>> status BAD and payload length 19
>>> 17/01/30 10:55:02 WARN HiveConnection: Failed to connect to
>>> localhost:10000
>>> HS2 may be unavailable, check server status
>>> Error: Could not open client transport with JDBC Uri:
>>> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD
>>> S.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed
>>> (state=08S01,code=0)
>>> beeline>
>>>
>>> ------------------------------
>>> *From:* Vivek Shrivastava <vi...@gmail.com>
>>> *Sent:* Monday, January 30, 2017 10:48:35 AM
>>> *To:* user@hive.apache.org
>>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>>
>>> Please paste the output of
>>> 1. klist -fe
>>> 2. relevant entries from HiveServer2 log
>>>
>>> On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <
>>> ricardo.fajardo@autodesk.com> wrote:
>>>
>>>> I could not resolve the problem.
>>>>
>>>>
>>>> I have debugged the code and I found out that:
>>>>
>>>>
>>>> 1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at
>>>> line 208:
>>>>
>>>> ....
>>>>
>>>> UserGroupInformation.getCurrentUser().doAs(....
>>>>
>>>> ..
>>>>
>>>> This method always returns the operating-system user, but I need to
>>>> authenticate the user set in the property hive.server2.proxy.user=yourid,
>>>> because I have a ticket for that one.
>>>>
>>>>
>>>> 2. I have found out that hive.server2.proxy.user is handled in the
>>>> openSession() method of the org.apache.hive.jdbc.HiveConnection class,
>>>> but this code is never executed.
>>>>
>>>>
>>>> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
>>>> this code in the getAuthTransFactory() method:
>>>>
>>>> ....
>>>>
>>>>       if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName()))
>>>> {
>>>>         // no-op
>>>> ....
>>>>
>>>> Does this mean that Kerberos authentication is not implemented?
>>>>
>>>>
>>>>
>>>> Please, can anyone help me?
>>>>
>>>>
>>>> Thanks,
>>>>
>>>> Ricardo.
>>>> ------------------------------
>>>> *From:* Dulam, Naresh <na...@bankofamerica.com>
>>>> *Sent:* Thursday, January 26, 2017 8:41:48 AM
>>>> *To:* user@hive.apache.org
>>>> *Subject:* RE: Pls Help me - Hive Kerberos Issue
>>>>
>>>>
>>>>
>>>>
>>>> kinit -k -t your.keytab yourid@MY-REALM.COM
>>>>
>>>>
>>>>
>>>> # Connect using following JDBC connection string
>>>>
>>>> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_
>>>> HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> *From:* Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
>>>> *Sent:* Thursday, January 26, 2017 1:37 AM
>>>> *To:* user@hive.apache.org
>>>> *Subject:* Pls Help me - Hive Kerberos Issue
>>>>
>>>>
>>>>
>>>> Hello,
>>>>
>>>>
>>>>
>>>> Please I need your help with the Kerberos authentication with Hive.
>>>>
>>>>
>>>>
>>>> I am following this guide:
>>>>
>>>> https://www.cloudera.com/documentation/enterprise/5-4-x/topi
>>>> cs/cdh_sg_hiveserver2_security.html#topic_9_1_1
>>>>
>>>> But I am getting this error:
>>>>
>>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>>
>>>>
>>>>
>>>> I have a remote Kerberos server and I can generate a token with kinit
>>>> for my user. I created a keytab file with my passwd for my user. Please
>>>> tell me if it is ok.
>>>>
>>>>
>>>>
>>>> On the another hand when I am debugging the hive code the operative
>>>> system user is authenticated but I need authenticate my Kerberos user, can
>>>> you tell me how I can achieve that? How can I store my tickets where Hive
>>>> can load it?? or How can I verify where Hive is searching the tickets and
>>>> what Hive is reading??
>>>>
>>>>
>>>>
>>>> Thanks so much for your help.
>>>>
>>>>
>>>>
>>>> Best regards,
>>>>
>>>> Ricardo.
>>>>
>>>>
>>>>
>>>>
>>>> ------------------------------
>>>> This message, and any attachments, is for the intended recipient(s)
>>>> only, may contain information that is privileged, confidential and/or
>>>> proprietary and subject to important terms and conditions available at
>>>> http://www.bankofamerica.com/emaildisclaimer. If you are not the
>>>> intended recipient, please delete this message.
>>>>
>>>
>>>
>>
>

Re: Pls Help me - Hive Kerberos Issue

Posted by Ricardo Fajardo <ri...@autodesk.com>.
I did the changes but I am getting the same error.

Klist:

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 11:56:20  01/30/17 21:56:24  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
renew until 01/31/17 11:56:20, Flags: FPRIA
Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac


Log:

[cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ ./beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar"
/home/cloudera/workspace/hive/bin/hive: line 99: [: /home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar: binary operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_73]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_73]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_73]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_73]
at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_73]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_73]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_73]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) [benchmarks.jar:2.2.0-SNAPSHOT]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.8.0_73]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_73]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.8.0_73]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_73]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_73]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_73]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_73]
... 35 more
17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
Beeline version 2.2.0-SNAPSHOT by Apache Hive
beeline>
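
A detail worth noting: the DEBUG log posted earlier in this thread showed "UGI
loginUser:cloudera (auth:SIMPLE)", i.e. the client-side Hadoop configuration
is still in SIMPLE mode, so the JVM never attempts a Kerberos login in the
first place. As a sketch only (assuming the client reads core-site.xml from
HADOOP_CONF_DIR; the property name is standard Hadoop, but your deployment may
manage it elsewhere), the client configuration would need:

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>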



________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 11:34:27 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

You can comment both default_tkt_enctypes and default_tgs_enctypes out; the default value will become aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5 camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5 des-cbc-md4.
Then do
kdestroy
kinit
klist -fev
your beeline command

If it still does not work, then paste the output of

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /



On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <ri...@autodesk.com> wrote:

I don't have any particular reason for selecting the arcfour encryption type. If changing it will make this work, I can do that.

Values from krb5.conf:

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        krb4_config = /etc/krb.conf
        krb4_realms = /etc/krb.realms
        kdc_timesync = 1
        ccache_type = 4
        forwardable = true
        proxiable = true
        v4_instance_resolve = false
        v4_name_convert = {
                host = {
                        rcmd = host
                        ftp = ftp
                }
                plain = {
                        something = something-else
                }
        }
        fcc-mit-ticketflags = true
        default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
        default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS

[realms]

        ADS.AUTODESK.COM = {
                kdc = krb.ads.autodesk.com:88
                admin_server = krb.ads.autodesk.com
                default_domain = ads.autodesk.com
                database_module = openldap_ldapconf
                master_key_type = aes256-cts
                supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
                default_principal_flags = +preauth
        }

Thanks so much for your help,
Ricardo.
________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 11:01:24 AM

To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Any particular reason for selecting arcfour encryption type? Could you please post defaults (e.g enc_type) values from krb5.conf

On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <ri...@autodesk.com> wrote:

1. klist -fe

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
renew until 01/31/17 10:52:37, Flags: FPRIA
Etype (skey, tkt): arcfour-hmac, arcfour-hmac
[cloudera@quickstart bin]$

2. relevant entries from HiveServer2 log


beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
!connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: cloudera" with name cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera (auth:SIMPLE)
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in authMethod of kerberos != current.
17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/01/30 10:55:02 DEBUG TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@1119f7c5
17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[?:1.7.0_67]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [libthrift-0.9.3.jar:0.9.3]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [classes/:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1) [classes/:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [classes/:?]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [classes/:?]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [classes/:?]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [classes/:?]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [classes/:?]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.7.0_67]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.7.0_67]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[?:1.7.0_67]
... 29 more
17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
beeline>


________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 10:48:35 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Please paste the output of
1. klist -fe
2. relevant entries from HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <ri...@autodesk.com> wrote:

I could not resolve the problem.


I have debugged the code and I found out that:


1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at line 208:

....

UserGroupInformation.getCurrentUser().doAs(....

..

This method always returns the operating-system user, but I need to authenticate the user set in the property hive.server2.proxy.user=yourid, because I have a ticket for that one.


2. I have found out that hive.server2.proxy.user is handled in the openSession() method of the org.apache.hive.jdbc.HiveConnection class, but this code is never executed.


3. In the org.apache.hive.service.auth.HiveAuthFactory class there is this code in the getAuthTransFactory() method:

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....


Does this mean that Kerberos authentication is not implemented?



Please, can anyone help me?


Thanks,

Ricardo.
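
For what it is worth, the behaviour described in point 1 can be worked around
from plain Java: if the process performs an explicit Kerberos login before the
JDBC connection is opened, UserGroupInformation.getCurrentUser() is the
Kerberos principal rather than the OS user. A minimal sketch, not a confirmed
fix for this thread; the principal, keytab path and JDBC URL are placeholders:

import java.security.PrivilegedExceptionAction;
import java.sql.Connection;
import java.sql.DriverManager;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHiveClient {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Without this, UGI stays in SIMPLE mode and never looks for a TGT.
    conf.set("hadoop.security.authentication", "kerberos");
    UserGroupInformation.setConfiguration(conf);
    // Log in from a keytab so the login user is the Kerberos principal.
    UserGroupInformation.loginUserFromKeytab(
        "t_fajar@ADS.AUTODESK.COM", "/home/cloudera/t_fajar.keytab");
    // Open the connection inside doAs so the SASL/GSSAPI layer runs under
    // the Kerberos login subject instead of the UnixPrincipal.
    UserGroupInformation.getLoginUser().doAs(
        (PrivilegedExceptionAction<Void>) () -> {
          try (Connection c = DriverManager.getConnection(
              "jdbc:hive2://localhost:10000/default;"
                  + "principal=hive/_HOST@ADS.AUTODESK.COM")) {
            System.out.println("Connected: " + !c.isClosed());
          }
          return null;
        });
  }
}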

________________________________
From: Dulam, Naresh <na...@bankofamerica.com>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue


kinit -k -t your.keytab yourid@MY-REALM.COM

# Connect using following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
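
If the keytab itself is in doubt, one way to create and verify one with the
MIT Kerberos client tools is sketched below; the key version number (-k 1) and
the enctype are placeholders and must match what the KDC issued:

ktutil
ktutil:  addent -password -p yourid@MY-REALM.COM -k 1 -e aes256-cts
ktutil:  wkt yourid.keytab
ktutil:  quit

kinit -kt yourid.keytab yourid@MY-REALM.COM
klist -fe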






From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please I need your help with the Kerberos authentication with Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a token with kinit for my user. I created a keytab file with my passwd for my user. Please tell me if it is ok.

On the another hand when I am debugging the hive code the operative system user is authenticated but I need authenticate my Kerberos user, can you tell me how I can achieve that? How can I store my tickets where Hive can load it?? or How can I verify where Hive is searching the tickets and what Hive is reading??

Thanks so much for your help.

Best regards,
Ricardo.


________________________________
This message, and any attachments, is for the intended recipient(s) only, may contain information that is privileged, confidential and/or proprietary and subject to important terms and conditions available at http://www.bankofamerica.com/emaildisclaimer. If you are not the intended recipient, please delete this message.




Re: Pls Help me - Hive Kerberos Issue

Posted by Vivek Shrivastava <vi...@gmail.com>.
You can comment both default_tkt_enctypes and default_tgs_enctypes out; the
default value will become aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
des3-cbc-sha1 arcfour-hmac-md5 camellia256-cts-cmac camellia128-cts-cmac
des-cbc-crc des-cbc-md5 des-cbc-md4.
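
In krb5.conf that change is just commenting out the two lines in
[libdefaults], e.g. (a sketch; the rest of the section stays as it is):

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        # default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
        # default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
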
Then do
kdestroy
kinit
klist -fev
your beeline command

If it still does not work, then paste the output of

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /
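
If you also want to see exactly which ticket cache the JVM is reading, you can
pin the cache location and rerun beeline with the krb5 debug flag; a sketch,
assuming a bash shell and the cache file shown by klist (the JDK's Kerberos
provider honours KRB5CCNAME):

export KRB5CCNAME=FILE:/tmp/krb5cc_501
export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
./beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM"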



On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo <
ricardo.fajardo@autodesk.com> wrote:

> I don't have any particular reason for selecting the arcfour encryption
> type. If changing it will make this work, I can do that.
>
> Values from krb5.conf:
>
> [libdefaults]
>         default_realm = ADS.AUTODESK.COM
>         krb4_config = /etc/krb.conf
>         krb4_realms = /etc/krb.realms
>         kdc_timesync = 1
>         ccache_type = 4
>         forwardable = true
>         proxiable = true
>         v4_instance_resolve = false
>         v4_name_convert = {
>                 host = {
>                         rcmd = host
>                         ftp = ftp
>                 }
>                 plain = {
>                         something = something-else
>                 }
>         }
>         fcc-mit-ticketflags = true
>         default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>         default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
>
> [realms]
>
>         ADS.AUTODESK.COM = {
>                 kdc = krb.ads.autodesk.com:88
>                 admin_server = krb.ads.autodesk.com
>                 default_domain = ads.autodesk.com
>                 database_module = openldap_ldapconf
>                 master_key_type = aes256-cts
>                 supported_enctypes = aes256-cts:normal aes128-cts:normal
> des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal
> des-cbc-md5:normal des-cbc-crc:normal
>                 default_principal_flags = +preauth
>         }
>
> Thanks so much for your help,
> Ricardo.
> ------------------------------
> *From:* Vivek Shrivastava <vi...@gmail.com>
> *Sent:* Monday, January 30, 2017 11:01:24 AM
>
> *To:* user@hive.apache.org
> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>
> Any particular reason for selecting arcfour encryption type? Could you
> please post defaults (e.g enc_type) values from krb5.conf
>
> On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <
> ricardo.fajardo@autodesk.com> wrote:
>
>>
>> 1. klist -fe
>>
>> [cloudera@quickstart bin]$ klist -fe
>> Ticket cache: FILE:/tmp/krb5cc_501
>> Default principal: t_fajar@ADS.AUTODESK.COM
>>
>> Valid starting     Expires            Service principal
>> 01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.A
>> UTODESK.COM
>> renew until 01/31/17 10:52:37, Flags: FPRIA
>> Etype (skey, tkt): arcfour-hmac, arcfour-hmac
>> [cloudera@quickstart bin]$
>>
>> 2. relevant entries from HiveServer2 log
>>
>>
>> beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_
>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.
>> AUTODESK.COM;hive.server2.proxy.user=t_fajar
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>> epository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/lo
>> g4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>> epository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.
>> jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/r
>> epository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.
>> 10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4
>> jLoggerFactory]
>> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_
>> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
>> 17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
>> 17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.Use
>> rGroupInformation$UgiMetrics.loginSuccess with annotation @
>> org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
>> of successful kerberos logins and latency (milliseconds)], about=,
>> type=DEFAULT, always=false, sampleName=Ops)
>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.Use
>> rGroupInformation$UgiMetrics.loginFailure with annotation @
>> org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
>> of failed kerberos logins and latency (milliseconds)], about=,
>> type=DEFAULT, always=false, sampleName=Ops)
>> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
>> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.Use
>> rGroupInformation$UgiMetrics.getGroups with annotation @
>> org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
>> value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
>> 17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group
>> related metrics
>> 17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
>> 17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built
>> native-hadoop library...
>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop
>> with error: java.lang.UnsatisfiedLinkError: no hadoop in
>> java.library.path
>> 17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/pa
>> ckages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
>> 17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop
>> library for your platform... using builtin-java classes where applicable
>> 17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
>> 17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
>> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
>> 17/01/27 16:16:38 DEBUG Groups: Group mapping
>> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
>> cacheTimeout=300000; warningDeltaMs=5000
>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
>> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
>> 17/01/27 16:16:38 DEBUG UserGroupInformation: using local
>> user:UnixPrincipal: cloudera
>> 17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal:
>> cloudera" with name cloudera
>> 17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
>> 17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera
>> (auth:SIMPLE)
>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod =
>> SIMPLE
>> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as
>> passed-in authMethod of kerberos != current.
>> 17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction
>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th
>> rift.HadoopThriftAuthBridge$Client.createClientTransport(Had
>> oopThriftAuthBridge.java:208)
>> 17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction
>> as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.th
>> rift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>> 17/01/30 10:55:02 DEBUG TSaslTransport: opening transport
>> org.apache.thrift.transport.TSaslClientTransport@1119f7c5
>> 17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
>> javax.security.sasl.SaslException: GSS initiate failed
>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>> ~[?:1.7.0_67]
>> at org.apache.thrift.transport.TSaslClientTransport.handleSaslS
>> tartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
>> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>> [libthrift-0.9.3.jar:0.9.3]
>> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>> [libthrift-0.9.3.jar:0.9.3]
>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>> .run(TUGIAssumingTransport.java:52) [classes/:?]
>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1
>> .run(TUGIAssumingTransport.java:1) [classes/:?]
>> at java.security.AccessController.doPrivileged(Native Method)
>> ~[?:1.7.0_67]
>> at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>> [hadoop-common-2.7.2.jar:?]
>> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.o
>> pen(TUGIAssumingTransport.java:49) [classes/:?]
>> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227)
>> [classes/:?]
>> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182)
>> [classes/:?]
>> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
>> [classes/:?]
>> at java.sql.DriverManager.getConnection(DriverManager.java:571)
>> [?:1.7.0_67]
>> at java.sql.DriverManager.getConnection(DriverManager.java:187)
>> [?:1.7.0_67]
>> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
>> [classes/:?]
>> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
>> [classes/:?]
>> at org.apache.hive.beeline.Commands.connect(Commands.java:1524)
>> [classes/:?]
>> at org.apache.hive.beeline.Commands.connect(Commands.java:1419)
>> [classes/:?]
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> ~[?:1.7.0_67]
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> ~[?:1.7.0_67]
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> ~[?:1.7.0_67]
>> at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
>> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
>> [classes/:?]
>> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127)
>> [classes/:?]
>> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166)
>> [classes/:?]
>> at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
>> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
>> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511)
>> [classes/:?]
>> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>> (Mechanism level: Failed to find any Kerberos tgt)
>> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>> ~[?:1.7.0_67]
>> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>> ~[?:1.7.0_67]
>> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>> ~[?:1.7.0_67]
>> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>> ~[?:1.7.0_67]
>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>> ~[?:1.7.0_67]
>> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>> ~[?:1.7.0_67]
>> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>> ~[?:1.7.0_67]
>> ... 29 more
>> 17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with
>> status BAD and payload length 19
>> 17/01/30 10:55:02 WARN HiveConnection: Failed to connect to
>> localhost:10000
>> HS2 may be unavailable, check server status
>> Error: Could not open client transport with JDBC Uri:
>> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@AD
>> S.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed
>> (state=08S01,code=0)
>> beeline>
>>
>> ------------------------------
>> *From:* Vivek Shrivastava <vi...@gmail.com>
>> *Sent:* Monday, January 30, 2017 10:48:35 AM
>> *To:* user@hive.apache.org
>> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>>
>> Please paste the output of
>> 1. klist -fe
>> 2. relevant entries from HiveServer2 log
>>
>> On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <
>> ricardo.fajardo@autodesk.com> wrote:
>>
>>> I could not resolve the problem.
>>>
>>>
>>> I have debugged the code and I found out that:
>>>
>>>
>>> 1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at
>>> line 208:
>>>
>>> ....
>>>
>>> UserGroupInformation.getCurrentUser().doAs(....
>>>
>>> ..
>>>
>>> This method always returns the operating-system user, but I need to
>>> authenticate the user set in the property hive.server2.proxy.user=yourid,
>>> because I have a ticket for that one.
>>>
>>>
>>> 2. I have found out that hive.server2.proxy.user is handled in the
>>> openSession() method of the org.apache.hive.jdbc.HiveConnection class,
>>> but this code is never executed.
>>>
>>>
>>> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
>>> this code in the getAuthTransFactory() method:
>>>
>>> ....
>>>
>>>       if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName()))
>>> {
>>>         // no-op
>>> ....
>>>
>>> Does this mean that Kerberos authentication is not implemented?
>>>
>>>
>>>
>>> Please, can anyone help me?
>>>
>>>
>>> Thanks,
>>>
>>> Ricardo.
>>> ------------------------------
>>> *From:* Dulam, Naresh <na...@bankofamerica.com>
>>> *Sent:* Thursday, January 26, 2017 8:41:48 AM
>>> *To:* user@hive.apache.org
>>> *Subject:* RE: Pls Help me - Hive Kerberos Issue
>>>
>>>
>>>
>>>
>>> kinit -k -t your.keytab yourid@MY-REALM.COM
>>>
>>>
>>>
>>> # Connect using following JDBC connection string
>>>
>>> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_
>>> HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> *From:* Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
>>> *Sent:* Thursday, January 26, 2017 1:37 AM
>>> *To:* user@hive.apache.org
>>> *Subject:* Pls Help me - Hive Kerberos Issue
>>>
>>>
>>>
>>> Hello,
>>>
>>>
>>>
>>> Please I need your help with the Kerberos authentication with Hive.
>>>
>>>
>>>
>>> I am following this guide:
>>>
>>> https://www.cloudera.com/documentation/enterprise/5-4-x/topi
>>> cs/cdh_sg_hiveserver2_security.html#topic_9_1_1
>>>
>>> But I am getting this error:
>>>
>>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>> (Mechanism level: Failed to find any Kerberos tgt)
>>>
>>>
>>>
>>> I have a remote Kerberos server and I can generate a token with kinit
>>> for my user. I created a keytab file with my passwd for my user. Please
>>> tell me if it is ok.
>>>
>>>
>>>
>>> On the another hand when I am debugging the hive code the operative
>>> system user is authenticated but I need authenticate my Kerberos user, can
>>> you tell me how I can achieve that? How can I store my tickets where Hive
>>> can load it?? or How can I verify where Hive is searching the tickets and
>>> what Hive is reading??
>>>
>>>
>>>
>>> Thanks so much for your help.
>>>
>>>
>>>
>>> Best regards,
>>>
>>> Ricardo.
>>>
>>>
>>>
>>>
>>> ------------------------------
>>> This message, and any attachments, is for the intended recipient(s)
>>> only, may contain information that is privileged, confidential and/or
>>> proprietary and subject to important terms and conditions available at
>>> http://www.bankofamerica.com/emaildisclaimer. If you are not the
>>> intended recipient, please delete this message.
>>>
>>
>>
>

Re: Pls Help me - Hive Kerberos Issue

Posted by Ricardo Fajardo <ri...@autodesk.com>.
I don't have any particular reason for selecting the arcfour encryption type. If changing it will make this work, I can do that.

Values from krb5.conf:

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        krb4_config = /etc/krb.conf
        krb4_realms = /etc/krb.realms
        kdc_timesync = 1
        ccache_type = 4
        forwardable = true
        proxiable = true
        v4_instance_resolve = false
        v4_name_convert = {
                host = {
                        rcmd = host
                        ftp = ftp
                }
                plain = {
                        something = something-else
                }
        }
        fcc-mit-ticketflags = true
        default_tkt_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS
        default_tgs_enctypes = RC4-HMAC des-cbc-crc des-cbc-md5 AES256-CTS

[realms]

        ADS.AUTODESK.COM = {
                kdc = krb.ads.autodesk.com:88
                admin_server = krb.ads.autodesk.com
                default_domain = ads.autodesk.com
                database_module = openldap_ldapconf
                master_key_type = aes256-cts
                supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
                default_principal_flags = +preauth
        }

Thanks so much for your help,
Ricardo.
________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 11:01:24 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Any particular reason for selecting the arcfour encryption type? Could you please post the defaults (e.g. enctype) values from krb5.conf?

On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <ri...@autodesk.com>> wrote:

1. klist -fe

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
renew until 01/31/17 10:52:37, Flags: FPRIA
Etype (skey, tkt): arcfour-hmac, arcfour-hmac
[cloudera@quickstart bin]$

2. relevant entries from HiveServer2 log


beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
!connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: cloudera" with name cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera (auth:SIMPLE)
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in authMethod of kerberos != current.
17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/01/30 10:55:02 DEBUG TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@1119f7c5
17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[?:1.7.0_67]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [libthrift-0.9.3.jar:0.9.3]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [classes/:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1) [classes/:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [classes/:?]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [classes/:?]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [classes/:?]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [classes/:?]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [classes/:?]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.7.0_67]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.7.0_67]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[?:1.7.0_67]
... 29 more
17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
beeline>


________________________________
From: Vivek Shrivastava <vi...@gmail.com>>
Sent: Monday, January 30, 2017 10:48:35 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Please paste the output of
1. klist -fe
2. relevant entries from HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <ri...@autodesk.com>> wrote:

I could not resolve the problem.


I have debugged the code and I found out that:


1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, line 208:

....

UserGroupInformation.getCurrentUser().doAs(....

..

This method always returns the operating-system user, but I need to authenticate the user set in the property hive.server2.proxy.user=yourid, because I have a ticket for this one.


2. I have found out that hive.server2.proxy.user is handled in the org.apache.hive.jdbc.HiveConnection method openSession(), but this code is never executed.


3. In the org.apache.hive.service.auth.HiveAuthFactory class there is this code in the method getAuthTransFactory():

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....


Does this mean that Kerberos authentication is not implemented?



Can anyone help me, please?


Thanks,

Ricardo.

________________________________
From: Dulam, Naresh <na...@bankofamerica.com>>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue


kinit -k -t your.keytab yourid@MY-REALM.COM

# Connect using following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid






From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please, I need your help with Kerberos authentication in Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a ticket with kinit for my user. I created a keytab file with my password for my user. Please tell me if that is ok.

On the other hand, when I am debugging the Hive code the operating-system user is authenticated, but I need to authenticate my Kerberos user; can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive is searching for the tickets and what Hive is reading?

Thanks so much for your help.

Best regards,
Ricardo.





Re: Pls Help me - Hive Kerberos Issue

Posted by Vivek Shrivastava <vi...@gmail.com>.
Any particular reason for selecting the arcfour encryption type? Could you
please post the defaults (e.g. enctype) values from krb5.conf?
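
Editor's note: one client-side cause worth ruling out whenever enctypes are
under discussion is the JVM's JCE policy (the session in this thread runs
JDK 1.7.0_67). A stock JDK 7 ships with the limited-strength policy and
cannot decrypt AES256-CTS Kerberos tickets. A minimal, self-contained check
(the class name is illustrative):

import javax.crypto.Cipher;

public class JcePolicyCheck {
    public static void main(String[] args) throws Exception {
        // Prints 128 under the default (limited) JCE policy; the
        // unlimited-strength policy reports Integer.MAX_VALUE and is
        // required before the JVM can use AES256-CTS Kerberos tickets.
        System.out.println("Max AES key length: "
                + Cipher.getMaxAllowedKeyLength("AES"));
    }
}

Since klist below shows arcfour-hmac tickets this is not necessarily the
fault here, but it is a common cause of "Failed to find any Kerberos tgt"
when AES enctypes are involved.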

On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo <
ricardo.fajardo@autodesk.com> wrote:

>
> 1. klist -fe
>
> [cloudera@quickstart bin]$ klist -fe
> Ticket cache: FILE:/tmp/krb5cc_501
> Default principal: t_fajar@ADS.AUTODESK.COM
>
> Valid starting     Expires            Service principal
> 01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.
> AUTODESK.COM
> renew until 01/31/17 10:52:37, Flags: FPRIA
> Etype (skey, tkt): arcfour-hmac, arcfour-hmac
> [cloudera@quickstart bin]$
>
> 2. relevant entries from HiveServer2 log
>
>
> beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_
> HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.
> AUTODESK.COM;hive.server2.proxy.user=t_fajar
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/
> repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/
> log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/
> repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.
> 6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/cloudera/.m2/
> repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-
> 1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.apache.logging.slf4j.
> Log4jLoggerFactory]
> Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@
> ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
> 17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
> 17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.
> UserGroupInformation$UgiMetrics.loginSuccess with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
> of successful kerberos logins and latency (milliseconds)], about=,
> type=DEFAULT, always=false, sampleName=Ops)
> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.
> UserGroupInformation$UgiMetrics.loginFailure with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
> of failed kerberos logins and latency (milliseconds)], about=,
> type=DEFAULT, always=false, sampleName=Ops)
> 17/01/27 16:16:36 DEBUG MutableMetricsFactory: field
> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.
> UserGroupInformation$UgiMetrics.getGroups with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
> value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
> 17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group
> related metrics
> 17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
> 17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built
> native-hadoop library...
> 17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop
> with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
> 17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/
> packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
> 17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
> 17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
> 17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.
> security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000;
> warningDeltaMs=5000
> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
> 17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
> 17/01/27 16:16:38 DEBUG UserGroupInformation: using local
> user:UnixPrincipal: cloudera
> 17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal:
> cloudera" with name cloudera
> 17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
> 17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera
> (auth:SIMPLE)
> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
> 17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as
> passed-in authMethod of kerberos != current.
> 17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera
> (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$
> Client.createClientTransport(HadoopThriftAuthBridge.java:208)
> 17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera
> (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.
> TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> 17/01/30 10:55:02 DEBUG TSaslTransport: opening transport
> org.apache.thrift.transport.TSaslClientTransport@1119f7c5
> 17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
> javax.security.sasl.SaslException: GSS initiate failed
> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> ~[?:1.7.0_67]
> at org.apache.thrift.transport.TSaslClientTransport.
> handleSaslStartMessage(TSaslClientTransport.java:94)
> ~[libthrift-0.9.3.jar:0.9.3]
> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> [libthrift-0.9.3.jar:0.9.3]
> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> [libthrift-0.9.3.jar:0.9.3]
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$
> 1.run(TUGIAssumingTransport.java:52) [classes/:?]
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$
> 1.run(TUGIAssumingTransport.java:1) [classes/:?]
> at java.security.AccessController.doPrivileged(Native Method)
> ~[?:1.7.0_67]
> at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
> at org.apache.hadoop.security.UserGroupInformation.doAs(
> UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.
> open(TUGIAssumingTransport.java:49) [classes/:?]
> at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227)
> [classes/:?]
> at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182)
> [classes/:?]
> at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
> [classes/:?]
> at java.sql.DriverManager.getConnection(DriverManager.java:571)
> [?:1.7.0_67]
> at java.sql.DriverManager.getConnection(DriverManager.java:187)
> [?:1.7.0_67]
> at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
> [classes/:?]
> at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
> [classes/:?]
> at org.apache.hive.beeline.Commands.connect(Commands.java:1524)
> [classes/:?]
> at org.apache.hive.beeline.Commands.connect(Commands.java:1419)
> [classes/:?]
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[?:1.7.0_67]
> at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
> at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
> at org.apache.hive.beeline.ReflectiveCommandHandler.execute(
> ReflectiveCommandHandler.java:56) [classes/:?]
> at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127)
> [classes/:?]
> at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166)
> [classes/:?]
> at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
> at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
> at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511)
> [classes/:?]
> at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
> (Mechanism level: Failed to find any Kerberos tgt)
> at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> ~[?:1.7.0_67]
> at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> ~[?:1.7.0_67]
> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> ~[?:1.7.0_67]
> at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> ~[?:1.7.0_67]
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> ~[?:1.7.0_67]
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> ~[?:1.7.0_67]
> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
> ~[?:1.7.0_67]
> ... 29 more
> 17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with
> status BAD and payload length 19
> 17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
> HS2 may be unavailable, check server status
> Error: Could not open client transport with JDBC Uri:
> jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM
> ;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
> beeline>
>
> ------------------------------
> *From:* Vivek Shrivastava <vi...@gmail.com>
> *Sent:* Monday, January 30, 2017 10:48:35 AM
> *To:* user@hive.apache.org
> *Subject:* Re: Pls Help me - Hive Kerberos Issue
>
> Please paste the output of
> 1. klist -fe
> 2. relevant entries from HiveServer2 log
>
> On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <
> ricardo.fajardo@autodesk.com> wrote:
>
>> I could not resolve the problem.
>>
>>
>> I have debugged the code and I found out that:
>>
>>
>> 1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, line
>> 208:
>>
>> ....
>>
>> UserGroupInformation.getCurrentUser().doAs(....
>>
>> ..
>>
>> This method always returns the operating-system user, but I need to
>> authenticate the user set in the property hive.server2.proxy.user=yourid,
>> because I have a ticket for this one.
>>
>>
>> 2. I have found out that hive.server2.proxy.user is handled in the
>> org.apache.hive.jdbc.HiveConnection method openSession(), but this code
>> is never executed.
>>
>>
>> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
>> this code in the method getAuthTransFactory():
>>
>> ....
>>
>>       if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName()))
>> {
>>         // no-op
>> ....
>>
>> Does this mean that Kerberos authentication is not implemented?
>>
>>
>>
>> Can anyone help me, please?
>>
>>
>> Thanks,
>>
>> Ricardo.
>> ------------------------------
>> *From:* Dulam, Naresh <na...@bankofamerica.com>
>> *Sent:* Thursday, January 26, 2017 8:41:48 AM
>> *To:* user@hive.apache.org
>> *Subject:* RE: Pls Help me - Hive Kerberos Issue
>>
>>
>>
>>
>> kinit -k -t your.keytab yourid@MY-REALM.COM
>>
>>
>>
>> # Connect using following JDBC connection string
>>
>> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_
>> HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> *From:* Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
>> *Sent:* Thursday, January 26, 2017 1:37 AM
>> *To:* user@hive.apache.org
>> *Subject:* Pls Help me - Hive Kerberos Issue
>>
>>
>>
>> Hello,
>>
>>
>>
>> Please, I need your help with Kerberos authentication in Hive.
>>
>>
>>
>> I am following this guide:
>>
>> https://www.cloudera.com/documentation/enterprise/5-4-x/
>> topics/cdh_sg_hiveserver2_security.html#topic_9_1_1
>>
>> But I am getting this error:
>>
>> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>> (Mechanism level: Failed to find any Kerberos tgt)
>>
>>
>>
>> I have a remote Kerberos server and I can generate a ticket with kinit for
>> my user. I created a keytab file with my password for my user. Please tell
>> me if that is ok.
>>
>>
>>
>> On the other hand, when I am debugging the Hive code the operating-system
>> user is authenticated, but I need to authenticate my Kerberos user; can you
>> tell me how I can achieve that? How can I store my tickets where Hive can
>> load them? Or how can I verify where Hive is searching for the tickets and
>> what Hive is reading?
>>
>>
>>
>> Thanks so much for your help.
>>
>>
>>
>> Best regards,
>>
>> Ricardo.
>>
>>
>>
>>
>>
>
>

Re: Pls Help me - Hive Kerberos Issue

Posted by Ricardo Fajardo <ri...@autodesk.com>.
1. klist -fe

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fajar@ADS.AUTODESK.COM

Valid starting     Expires            Service principal
01/30/17 10:52:37  01/30/17 20:52:43  krbtgt/ADS.AUTODESK.COM@ADS.AUTODESK.COM
renew until 01/31/17 10:52:37, Flags: FPRIA
Etype (skey, tkt): arcfour-hmac, arcfour-hmac
[cloudera@quickstart bin]$

2. relevant entries from HiveServer2 log


beeline> !connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
!connect jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/27 16:16:37 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/27 16:16:38 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: cloudera" with name cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera (auth:SIMPLE)
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in authMethod of kerberos != current.
17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/01/30 10:55:02 DEBUG TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@1119f7c5
17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[?:1.7.0_67]
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [libthrift-0.9.3.jar:0.9.3]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [classes/:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1) [classes/:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
at javax.security.auth.Subject.doAs(Subject.java:415) [?:1.7.0_67]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) [hadoop-common-2.7.2.jar:?]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) [classes/:?]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) [classes/:?]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [classes/:?]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_67]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_67]
at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [classes/:?]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) [classes/:?]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) [classes/:?]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[?:1.7.0_67]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.7.0_67]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.7.0_67]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[?:1.7.0_67]
... 29 more
17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar: GSS initiate failed (state=08S01,code=0)
beeline>
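
Editor's note: the log above already answers one of the original questions:
the client never performs a Kerberos login (UGI loginUser:cloudera
(auth:SIMPLE)), even though a valid ticket sits in /tmp/krb5cc_501. Two
standard JVM properties make the JDK show exactly which krb5.conf, ticket
cache and enctypes it is using; a minimal sketch of setting them
programmatically before connecting (the class name is illustrative; with
beeline the same -D flags can be passed through HADOOP_OPTS):

public class KrbDebugSettings {
    public static void main(String[] args) {
        // Make the JDK print its Kerberos processing to stdout: which
        // config file it reads, which KDC it contacts, which enctypes it
        // considers, and where it looks for a ticket cache.
        System.setProperty("sun.security.krb5.debug", "true");

        // Let JGSS fall back to the ticket cache created by kinit instead
        // of requiring credentials already on the JAAS Subject; a frequent
        // fix for "Failed to find any Kerberos tgt".
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        // Optionally pin the configuration file the JDK should read.
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");

        // ... open the HiveServer2 JDBC connection after this point ...
    }
}

The KRB5CCNAME environment variable controls which cache file the JDK (and
kinit/klist) use, so it is also worth confirming that the account running
the JVM is the one that ran kinit.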


________________________________
From: Vivek Shrivastava <vi...@gmail.com>
Sent: Monday, January 30, 2017 10:48:35 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Please paste the output of
1. klist -fe
2. relevant entries from HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <ri...@autodesk.com>> wrote:

I could not resolve the problem.


I have debugged the code and I found out that:


1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, line 208:

....

UserGroupInformation.getCurrentUser().doAs(....

..

This method always returns the operating-system user, but I need to authenticate the user set in the property hive.server2.proxy.user=yourid, because I have a ticket for this one.


2. I have found out that hive.server2.proxy.user is handled in the org.apache.hive.jdbc.HiveConnection method openSession(), but this code is never executed.


3. In the org.apache.hive.service.auth.HiveAuthFactory class there is this code in the method getAuthTransFactory():

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....


Does this mean that Kerberos authentication is not implemented?



Can anyone help me, please?


Thanks,

Ricardo.

________________________________
From: Dulam, Naresh <na...@bankofamerica.com>>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue


kinit -k -t your.keytab yourid@MY-REALM.COM

# Connect using following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid






From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please, I need your help with Kerberos authentication in Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a ticket with kinit for my user. I created a keytab file with my password for my user. Please tell me if that is ok.

On the other hand, when I am debugging the Hive code the operating-system user is authenticated, but I need to authenticate my Kerberos user; can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive is searching for the tickets and what Hive is reading?

Thanks so much for your help.

Best regards,
Ricardo.




Re: Pls Help me - Hive Kerberos Issue

Posted by Vivek Shrivastava <vi...@gmail.com>.
Please paste the output of
1. klist -fe
2. relevant entries from HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo <
ricardo.fajardo@autodesk.com> wrote:

> I could not resolve the problem.
>
>
> I have debugged the code and I found out that:
>
>
> 1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, line
> 208:
>
> ....
>
> UserGroupInformation.getCurrentUser().doAs(....
>
> ..
>
> This method always returns the operating-system user, but I need to
> authenticate the user set in the property hive.server2.proxy.user=yourid,
> because I have a ticket for this one.
>
>
> 2. I have found out that hive.server2.proxy.user is handled in the
> org.apache.hive.jdbc.HiveConnection method openSession(), but this code
> is never executed.
>
>
> 3. In the org.apache.hive.service.auth.HiveAuthFactory class there is
> this code in the method getAuthTransFactory():
>
> ....
>
>       if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName()))
> {
>         // no-op
> ....
>
> Does this mean that Kerberos authentication is not implemented?
>
>
>
> Can anyone help me, please?
>
>
> Thanks,
>
> Ricardo.
> ------------------------------
> *From:* Dulam, Naresh <na...@bankofamerica.com>
> *Sent:* Thursday, January 26, 2017 8:41:48 AM
> *To:* user@hive.apache.org
> *Subject:* RE: Pls Help me - Hive Kerberos Issue
>
>
>
>
> kinit -k -t your.keytab yourid@MY-REALM.COM
>
>
>
> # Connect using following JDBC connection string
>
> # jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_
> HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
>
>
>
>
>
>
>
>
>
>
>
>
>
> *From:* Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
> *Sent:* Thursday, January 26, 2017 1:37 AM
> *To:* user@hive.apache.org
> *Subject:* Pls Help me - Hive Kerberos Issue
>
>
>
> Hello,
>
>
>
> Please, I need your help with Kerberos authentication in Hive.
>
>
>
> I am following this guide:
>
> https://www.cloudera.com/documentation/enterprise/5-4-
> x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1
>
> But I am getting this error:
>
> Caused by: org.ietf.jgss.GSSException: No valid credentials provided
> (Mechanism level: Failed to find any Kerberos tgt)
>
>
>
> I have a remote Kerberos server and I can generate a ticket with kinit for
> my user. I created a keytab file with my password for my user. Please tell
> me if that is ok.
>
>
>
> On the other hand, when I am debugging the Hive code the operating-system
> user is authenticated, but I need to authenticate my Kerberos user; can you
> tell me how I can achieve that? How can I store my tickets where Hive can
> load them? Or how can I verify where Hive is searching for the tickets and
> what Hive is reading?
>
>
>
> Thanks so much for your help.
>
>
>
> Best regards,
>
> Ricardo.
>
>
>
>
>

Re: Pls Help me - Hive Kerberos Issue

Posted by Ricardo Fajardo <ri...@autodesk.com>.
I could not resolve the problem.


I have debugged the code and I found out that:


1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, line 208:

....

UserGroupInformation.getCurrentUser().doAs(....

..

This method always returns the operating-system user, but I need to authenticate the user set in the property hive.server2.proxy.user=yourid, because I have a ticket for this one.


2. I have found out that hive.server2.proxy.user is handled in the org.apache.hive.jdbc.HiveConnection method openSession(), but this code is never executed.


3. In the org.apache.hive.service.auth.HiveAuthFactory class there is this code in the method getAuthTransFactory():

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....


Does this mean that Kerberos authentication is not implemented?



Can anyone help me, please?


Thanks,

Ricardo.
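
Editor's note: findings 1 and 2 above have a common explanation. The Hadoop
UserGroupInformation (UGI) layer only carries a Kerberos identity if the
process performs a Kerberos login; otherwise it falls back to the OS user
(auth:SIMPLE), which is exactly what the debug output in this thread shows.
And openSession() is only reached after the Thrift transport opens, so when
the GSS handshake fails in openTransport() the proxy-user code in
openSession() never runs. A minimal sketch of an explicit keytab login
wrapped around the JDBC call follows; the class name, principal, keytab
path and URL are illustrative placeholders, not values from this cluster:

import java.security.PrivilegedExceptionAction;
import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHiveLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Without this, UGI stays in SIMPLE mode and getCurrentUser()
        // keeps returning the OS user.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Explicit Kerberos login from a keytab (illustrative values).
        UserGroupInformation ugi = UserGroupInformation
                .loginUserFromKeytabAndReturnUGI(
                        "yourid@MY-REALM.COM", "/path/to/your.keytab");

        // Open the connection as the logged-in Kerberos user so the
        // Thrift SASL transport sees these credentials, not the OS user's.
        final String url = "jdbc:hive2://myHost.myOrg.com:10000/default;"
                + "principal=hive/_HOST@MY-REALM.COM";
        Connection con = ugi.doAs(new PrivilegedExceptionAction<Connection>() {
            public Connection run() throws Exception {
                return DriverManager.getConnection(url);
            }
        });
        System.out.println("Connected: " + !con.isClosed());
        con.close();
    }
}

On finding 3: the empty branch in getAuthTransFactory() does not appear to
mean Kerberos is unimplemented; earlier in that method the transport
factory for KERBEROS is obtained from the Hadoop Thrift SASL bridge, so
there is simply nothing left to do in that if-branch.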

________________________________
From: Dulam, Naresh <na...@bankofamerica.com>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue


kinit -k -t your.keytab yourid@MY-REALM.COM

# Connect using following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid






From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please, I need your help with Kerberos authentication in Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a ticket with kinit for my user. I created a keytab file with my password for my user. Please tell me if that is ok.

On the other hand, when I am debugging the Hive code the operating-system user is authenticated, but I need to authenticate my Kerberos user; can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive is searching for the tickets and what Hive is reading?

Thanks so much for your help.

Best regards,
Ricardo.



RE: Pls Help me - Hive Kerberos Issue

Posted by "Dulam, Naresh" <na...@bankofamerica.com>.
kinit -k -t your.keytab yourid@MY-REALM.COM

# Connect using following JDBC connection string
# jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid
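
Editor's note: the same recipe from Java instead of beeline, assuming kinit
has already populated the ticket cache; the class name is illustrative and
the host, realm and proxy user are the placeholders from the connection
string above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcProxyUserExample {
    public static void main(String[] args) throws Exception {
        // Explicit registration; harmless with JDBC4 auto-loading.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // principal= names the HiveServer2 service principal;
        // hive.server2.proxy.user asks HS2 to run the session as that
        // user (the authenticated caller needs Hadoop proxy-user rights).
        String url = "jdbc:hive2://myHost.myOrg.com:10000/default;"
                + "principal=hive/_HOST@MY-REALM.COM;"
                + "hive.server2.proxy.user=yourid";

        Connection con = DriverManager.getConnection(url);
        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SHOW DATABASES");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        con.close();
    }
}

Note that the Kerberos identity still comes from the ticket cache of the
account running the JVM; the proxy-user property only changes whom
HiveServer2 impersonates after authentication succeeds.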






From: Ricardo Fajardo [mailto:ricardo.fajardo@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please, I need your help with Kerberos authentication in Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)


I have a remote Kerberos server and I can generate a ticket with kinit for my user. I created a keytab file with my password for my user. Please tell me if that is ok.

On the other hand, when I am debugging the Hive code the operating-system user is authenticated, but I need to authenticate my Kerberos user; can you tell me how I can achieve that? How can I store my tickets where Hive can load them? Or how can I verify where Hive is searching for the tickets and what Hive is reading?

Thanks so much for your help.

Best regards,
Ricardo.


----------------------------------------------------------------------
This message, and any attachments, is for the intended recipient(s) only, may contain information that is privileged, confidential and/or proprietary and subject to important terms and conditions available at http://www.bankofamerica.com/emaildisclaimer.   If you are not the intended recipient, please delete this message.