Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2018/08/21 17:23:00 UTC

[jira] [Resolved] (SPARK-25172) Kerberos issue when using JDBC to connect to HiveServer2 on executors

     [ https://issues.apache.org/jira/browse/SPARK-25172?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-25172.
------------------------------------
    Resolution: Invalid

Please use the mailing list for questions:
http://spark.apache.org/community.html

> Kerberos issue when using JDBC to connect to HiveServer2 on executors
> ---------------------------------------------------------------------
>
>                 Key: SPARK-25172
>                 URL: https://issues.apache.org/jira/browse/SPARK-25172
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>            Reporter: sun zhiwei
>            Priority: Major
>
> Hi All,
>        For certain reasons, I must use JDBC to connect to HiveServer2 from the executors, but when I test my program, I get the following logs:
>       javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>     at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
>     at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
>     at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
>     at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
>     at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203)
>     at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:168)
>     at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>     at java.sql.DriverManager.getConnection(DriverManager.java:571)
>     at java.sql.DriverManager.getConnection(DriverManager.java:187)
>     at com.hypers.bigdata.util.SparkHandler$.handleTable2(SparkHandler.scala:204)
>     at com.hypers.bigdata.spark.AutoSpark$$anonfun$setupSsc$1$$anonfun$apply$2.apply(AutoSpark.scala:196)
>     at com.hypers.bigdata.spark.AutoSpark$$anonfun$setupSsc$1$$anonfun$apply$2.apply(AutoSpark.scala:107)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>     at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
>     at com.hypers.bigdata.spark.AutoSpark$$anonfun$setupSsc$1.apply(AutoSpark.scala:107)
>     at com.hypers.bigdata.spark.AutoSpark$$anonfun$setupSsc$1.apply(AutoSpark.scala:106)
>     at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
>     at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1888)
>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1888)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>     at org.apache.spark.scheduler.Task.run(Task.scala:89)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>     at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>     at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
>     at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>     at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>     ... 31 more
>  
>      I think this might be because the executors cannot pass Kerberos authentication, so here is my question:
>      how can I make the executors pass Kerberos authentication?
>      Any suggestions would be appreciated, thanks.
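A common workaround for this class of error (not discussed in the thread itself, so treat it as a sketch) is to log in from a keytab inside each executor before opening the JDBC connection, since the driver's Kerberos ticket is not propagated to executor JVMs. The principal names, keytab file name, and host below are placeholders; the keytab would typically be distributed with `--files` and resolved via `SparkFiles.get`:

```scala
import java.sql.DriverManager
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.SparkFiles

// Inside foreachPartition, i.e. code running on an executor.
rdd.foreachPartition { partition =>
  // Tell the Hadoop security layer to use Kerberos, then log in
  // from a keytab shipped to the executor with --files.
  val hadoopConf = new Configuration()
  hadoopConf.set("hadoop.security.authentication", "kerberos")
  UserGroupInformation.setConfiguration(hadoopConf)
  UserGroupInformation.loginUserFromKeytab(
    "hive_user@EXAMPLE.COM",                 // placeholder principal
    SparkFiles.get("hive_user.keytab"))      // placeholder keytab name

  // The HiveServer2 principal must be included in the JDBC URL for a
  // kerberized connection; _HOST is expanded to the server's hostname.
  val conn = DriverManager.getConnection(
    "jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM")
  try {
    partition.foreach { row =>
      // use conn here
    }
  } finally {
    conn.close()
  }
}
```

The corresponding submit command would pass the keytab along, e.g. `spark-submit --files /path/to/hive_user.keytab ...`. Note that keytabs distributed this way land on worker local disks, which may or may not be acceptable in a given security environment.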



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org