Posted to issues@spark.apache.org by "Reid Chan (JIRA)" <ji...@apache.org> on 2017/11/28 10:45:00 UTC
[jira] [Commented] (SPARK-19250) In security cluster, spark beeline connect to hive metastore failed
[ https://issues.apache.org/jira/browse/SPARK-19250?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16268545#comment-16268545 ]
Reid Chan commented on SPARK-19250:
-----------------------------------
I encountered the same problem. Spark version is 2.2.1, Hive metastore version is 0.14.0, and the command is pretty simple:
{code:sql}
create table zepdb.test_tablename (
id int,
query string,
name string
);
{code}
> In security cluster, spark beeline connect to hive metastore failed
> -------------------------------------------------------------------
>
> Key: SPARK-19250
> URL: https://issues.apache.org/jira/browse/SPARK-19250
> Project: Spark
> Issue Type: Bug
> Reporter: meiyoula
> Labels: security-issue
>
> 1. Start the thriftserver in security mode, with hive.metastore.uris set to the Hive metastore URI; Hive is also running in security mode.
> 2. When beeline is used to create a table, it cannot connect to the Hive metastore, failing with "Failed to find any Kerberos tgt".
> {quote}
> 2017-01-17 16:25:53,618 | ERROR | [pool-25-thread-1] | SASL negotiation failure | org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:315)
> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
> at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
> at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
> at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1738)
> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:513)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:249)
> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1533)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3119)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3138)
> at org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:791)
> at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:755)
> at org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:1461)
> at org.apache.hadoop.hive.ql.session.SessionState.getUserFromAuthenticator(SessionState.java:1014)
> at org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:177)
> at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:119)
> at org.apache.spark.sql.hive.client.HiveClientImpl.org$apache$spark$sql$hive$client$HiveClientImpl$$toHiveTable(HiveClientImpl.scala:803)
> at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply$mcV$sp(HiveClientImpl.scala:430)
> at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:430)
> at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createTable$1.apply(HiveClientImpl.scala:430)
> at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:284)
> at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:231)
> at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:230)
> at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:273)
> at org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:429)
> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply$mcV$sp(HiveExternalCatalog.scala:229)
> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:191)
> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:191)
> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:95)
> at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:191)
> {quote}
> Reason:
> When opening the Hive metastore client, it first checks whether it has a delegation token. Because hive.metastore.uris has been set to a local metastore, no token is obtained, so the client falls back to KERBEROS authentication. However, the current user is a proxy user with no TGT available, so the server cannot authenticate it and opening the metastore client fails.
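
The failure path described in the Reason above can be sketched as a minimal model: a delegation token selects the token-based SASL mechanism, otherwise Kerberos (GSSAPI) is used, which requires a valid TGT. The function and mechanism names below are illustrative only, not the actual Hive/Hadoop APIs.

```python
# Hypothetical model of the metastore client's authentication selection,
# matching the behaviour described above (not real Hive/Hadoop code).

def choose_sasl_mechanism(has_delegation_token: bool) -> str:
    """With a delegation token, a token-based mechanism (DIGEST-MD5) is
    used; without one, the client falls back to Kerberos (GSSAPI)."""
    return "DIGEST-MD5" if has_delegation_token else "GSSAPI"

def open_metastore_client(has_delegation_token: bool, has_tgt: bool) -> str:
    mech = choose_sasl_mechanism(has_delegation_token)
    if mech == "GSSAPI" and not has_tgt:
        # Corresponds to the reported error: GSS initiate failed,
        # "Failed to find any Kerberos tgt".
        raise RuntimeError("Failed to find any Kerberos tgt")
    return mech

# The proxy user in this report has neither a delegation token nor a TGT,
# so the open fails:
try:
    open_metastore_client(has_delegation_token=False, has_tgt=False)
except RuntimeError as e:
    print(e)  # Failed to find any Kerberos tgt
```

This is only a sketch of the decision logic; the real code path runs through HiveMetaStoreClient.open and TUGIAssumingTransport as shown in the stack trace above.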
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org