Posted to user-zh@flink.apache.org by sunfulin <su...@163.com> on 2020/02/21 10:18:57 UTC

Kerberos authentication exception when connecting Flink 1.10 to Hive

Hi,
I am integrating Flink 1.10 with Hive. When connecting to the metastore, the CDH cluster that Hive belongs to has Kerberos authentication enabled, and the exception below is thrown. How should this be configured or resolved?


999  [main] INFO  hive.metastore  - Trying to connect to metastore with URI thrift://namenode01.htsc.com:9083
1175 [main] ERROR org.apache.thrift.transport.TSaslTransport  - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
  at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
  at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
  at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
  at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
  at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.flink.table.catalog.hive.client.HiveShimV200.getHiveMetastoreClient(HiveShimV200.java:43)
  at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240)
  at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71)
  at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35)
  at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:188)
  at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:102)
  at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:235)
  at com.htsc.crm_realtime.fatjob.Jobs.hive.HiveMetaJob.doJob(HiveMetaJob.java:44)
  at com.htsc.crm_realtime.fatjob.Jobs.JobEntryBase.run(JobEntryBase.java:50)
  at com.htsc.crm_realtime.fatjob.Jobs.hive.HiveMetaJob.main(HiveMetaJob.java:23)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
  at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
  at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
  at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
  at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
  at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
  at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
  at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
  ... 34 more
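
For context, a minimal sketch of the kind of Flink 1.10 Table API code that exercises this path; as the trace shows, HiveCatalog.open() is invoked from TableEnvironment.registerCatalog(). The catalog name, default database, and Hive conf dir below are illustrative, since HiveMetaJob itself is not shown:

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;
    import org.apache.flink.table.catalog.hive.HiveCatalog;

    public class HiveCatalogSketch {
        public static void main(String[] args) {
            EnvironmentSettings settings = EnvironmentSettings.newInstance()
                    .useBlinkPlanner()
                    .inBatchMode()
                    .build();
            TableEnvironment tableEnv = TableEnvironment.create(settings);

            // Illustrative values; point the third argument at the directory
            // containing hive-site.xml for the Kerberized cluster.
            HiveCatalog hive = new HiveCatalog("myhive", "default", "/etc/hive/conf");

            // registerCatalog() calls HiveCatalog.open(), which connects to the
            // metastore; this is where the SASL/Kerberos negotiation above fails.
            tableEnv.registerCatalog("myhive", hive);
            tableEnv.useCatalog("myhive");
        }
    }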

Re: Kerberos authentication exception when connecting Flink 1.10 to Hive

Posted by Rui Li <li...@gmail.com>.
My understanding is also that once a keytab is specified, no password should be required. On the Hive side there are some related settings, such as
hive.metastore.kerberos.keytab.file
hive.metastore.kerberos.principal
It is worth confirming whether Flink can actually read these settings.
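
A quick way to check is to load the same hive-site.xml that the Flink HiveCatalog points at and print those values; a minimal sketch, with an illustrative conf path:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hive.conf.HiveConf;

    public class PrintMetastoreKerberosConf {
        public static void main(String[] args) {
            HiveConf conf = new HiveConf();
            // Illustrative path: use the hive-site.xml the HiveCatalog is configured with.
            conf.addResource(new Path("/etc/hive/conf/hive-site.xml"));
            System.out.println(conf.get("hive.metastore.sasl.enabled"));
            System.out.println(conf.get("hive.metastore.kerberos.principal"));
            System.out.println(conf.get("hive.metastore.kerberos.keytab.file"));
        }
    }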

On Tue, Feb 25, 2020 at 10:21 PM godfrey he <go...@gmail.com> wrote:

> cc @Rui Li <li...@gmail.com>
>
> sunfulin <su...@163.com> wrote on Tue, Feb 25, 2020 at 2:32 PM:
>
>>
>>
>> Hi,
>>
>> While configuring Flink to connect to Hive, since the cluster has Kerberos authentication enabled, after some exploration the exception is gone. But now I am prompted for a Kerberos username and password when connecting. My understanding is that once the keytab file path is specified, no username or password should be needed, right? Could anyone advise on what might be misconfigured?
>>
>>
>> security.kerberos.login.use-ticket-cache: false
>> security.kerberos.login.keytab:
>> /app/flink/flink-1.10.10/kerberos/flink_test.keytab
>> security.kerberos.login.principal: flink_test@HADOOP.HTSC.COM
>>
>>
>> At 2020-02-21 18:18:57, "sunfulin" <su...@163.com> wrote:
>>
>> [quoted original message and stack trace snipped; see the first message in this thread]
>
>

-- 
Best regards!
Rui Li

Re: Kerberos authentication exception when connecting Flink 1.10 to Hive

Posted by godfrey he <go...@gmail.com>.
cc @Rui Li <li...@gmail.com>

sunfulin <su...@163.com> wrote on Tue, Feb 25, 2020 at 2:32 PM:

>
>
> Hi,
>
> While configuring Flink to connect to Hive, since the cluster has Kerberos authentication enabled, after some exploration the exception is gone. But now I am prompted for a Kerberos username and password when connecting. My understanding is that once the keytab file path is specified, no username or password should be needed, right? Could anyone advise on what might be misconfigured?
>
>
> security.kerberos.login.use-ticket-cache: false
> security.kerberos.login.keytab:
> /app/flink/flink-1.10.10/kerberos/flink_test.keytab
> security.kerberos.login.principal: flink_test@HADOOP.HTSC.COM
>
>
> At 2020-02-21 18:18:57, "sunfulin" <su...@163.com> wrote:
>
> [quoted original message and stack trace snipped; see the first message in this thread]
>

Re: Kerberos authentication exception when connecting Flink 1.10 to Hive

Posted by sunfulin <su...@163.com>.

Hi,
While configuring Flink to connect to Hive, since the cluster has Kerberos authentication enabled, after some exploration the exception is gone. But now I am prompted for a Kerberos username and password when connecting. My understanding is that once the keytab file path is specified, no username or password should be needed, right? Could anyone advise on what might be misconfigured?


security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /app/flink/flink-1.10.10/kerberos/flink_test.keytab
security.kerberos.login.principal: flink_test@HADOOP.HTSC.COM
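
One way to confirm that the keytab alone works without any password prompt is a standalone UserGroupInformation login outside of Flink; a minimal sketch reusing the principal and keytab path from the config above:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KeytabLoginCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // UGI only uses the keytab when Kerberos is the configured auth method.
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            // Should succeed with no interactive username/password prompt.
            UserGroupInformation.loginUserFromKeytab(
                    "flink_test@HADOOP.HTSC.COM",
                    "/app/flink/flink-1.10.10/kerberos/flink_test.keytab");
            System.out.println("Logged in as " + UserGroupInformation.getLoginUser());
        }
    }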

At 2020-02-21 18:18:57, "sunfulin" <su...@163.com> wrote:

[quoted original message and stack trace snipped; see the first message in this thread]