Posted to user@spark.apache.org by Pralabh Kumar <pr...@gmail.com> on 2021/08/20 13:50:06 UTC
Spark Thrift Server fails when submitting commands from beeline
Hi Dev
Environment details
Hadoop 3.2
Hive 3.1
Spark 3.0.3
Cluster: Kerberized.
1) Hive server is running fine
2) spark-sql, spark-shell, and spark-submit are all working as expected.
3) Connecting to Hive through beeline works fine (after kinit):
beeline -u "jdbc:hive2://<hostname>:<port>/default;principal=<three part principal>"
Now I launched the Spark Thrift Server (STS) and tried to connect to it through beeline.
The beeline client connects to STS successfully.
4) beeline -u "jdbc:hive2://<hostname>:<port>/default;principal=<three part principal>"
a) The log says connected to:
Spark SQL
Driver: Hive JDBC
Now when I run any command ("show tables") it fails. The STS log says:
21/08/19 19:30:12 DEBUG UserGroupInformation: PrivilegedAction as:<userid> (auth:PROXY) via <three part principal> (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
21/08/19 19:30:12 DEBUG UserGroupInformation: PrivilegedAction as:<userid> (auth:PROXY) via <three part principal> (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
21/08/19 19:30:12 DEBUG TSaslTransport: opening transport
org.apache.thrift.transport.TSaslClientTransport@f43fd2f
21/08/19 19:30:12 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:480)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:247)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1556)
at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1545)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$databaseExists$1(HiveClientImpl.scala:384)
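[Editor's note: "Failed to find any Kerberos tgt" usually means the process opening the connection (here the STS JVM) has no valid ticket in the credential cache it is looking at. A quick sanity check on the STS host, using standard MIT Kerberos tools (the keytab path and principal below are placeholders, not values from this thread):]

```shell
# klist -s is silent and exits non-zero when the credential cache
# is missing, empty, or holds only expired tickets.
if klist -s 2>/dev/null; then
  echo "TGT present"
else
  echo "no TGT - kinit with the service keytab, e.g.:"
  echo "  kinit -kt /path/to/spark.keytab <spark service principal>"
fi
```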
My guess is that proxy-user authorization is not working.
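[Editor's note: the (auth:PROXY) entries in the log show STS impersonating the end user when it opens the metastore connection. On a Kerberized Hadoop cluster that impersonation must be authorized via proxyuser settings in core-site.xml on the metastore side. A generic sketch, assuming STS runs as a "spark" service user (the user name and the wildcard values are assumptions, not from this thread):]

```xml
<!-- Allow the user running STS (assumed "spark" here) to impersonate
     other users; restrict hosts/groups in a real deployment. -->
<property>
  <name>hadoop.proxyuser.spark.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.spark.groups</name>
  <value>*</value>
</property>
```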
Please help
Regards
Pralabh Kumar
Re: Spark Thrift Server fails when submitting commands from beeline
Posted by Artemis User <ar...@dtechspace.com>.
Looks like your problem is related to not setting up a hive-site.xml file
properly. The standard Spark distribution doesn't include a hive-site.xml
template in the conf directory; you will have to create one yourself.
Please refer to the Spark user doc and the Hive metastore config guide
for details...
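[Editor's note: a minimal sketch of a hive-site.xml for a Kerberized metastore, placed in $SPARK_HOME/conf/. These are real Hive property names, but every value below is a placeholder to be taken from the working Hive client's own configuration:]

```xml
<configuration>
  <!-- Point STS at the secured Hive metastore (values are placeholders). -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://<metastore-host>:9083</value>
  </property>
  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.metastore.kerberos.principal</name>
    <value><three part principal></value>
  </property>
</configuration>
```

In practice it is usually simpler to copy the cluster's existing hive-site.xml from the Hive client configuration directory into $SPARK_HOME/conf/ than to write one by hand.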
-- ND
On 8/20/21 9:50 AM, Pralabh Kumar wrote:
> [original message quoted in full; snipped]