Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/05/22 15:00:00 UTC

[jira] [Assigned] (SPARK-24349) obtainDelegationTokens() exits JVM if Driver uses JDBC instead of the metastore

     [ https://issues.apache.org/jira/browse/SPARK-24349?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-24349:
------------------------------------

    Assignee: Apache Spark

> obtainDelegationTokens() exits JVM if Driver uses JDBC instead of the metastore
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-24349
>                 URL: https://issues.apache.org/jira/browse/SPARK-24349
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Lantao Jin
>            Assignee: Apache Spark
>            Priority: Major
>
> Since [SPARK-23639|https://issues.apache.org/jira/browse/SPARK-23639], using --proxy-user to impersonate another user invokes obtainDelegationTokens(). When the current Driver uses JDBC instead of the metastore, this call fails and exits the JVM (see the sketch after the stack trace below):
> {code}
> WARN HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
> Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Hive metastore uri undefined
>         at scala.Predef$.require(Predef.scala:224)
>         at org.apache.spark.sql.hive.thriftserver.HiveCredentialProvider.obtainCredentials(HiveCredentialProvider.scala:73)
>         at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:56)
>         at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:288)
>         at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:137)
>         at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
>         at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:169)
>         at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:167)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 18/05/22 05:24:16 INFO ShutdownHookManager: Shutdown hook called
> 18/05/22 05:24:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-b63ad788-1a47-4326-9972-c4fde1dc19c3
> {code}
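> A minimal sketch of the kind of guard that would avoid the hard failure, assuming a hypothetical helper named HiveTokenCheck (this is not the actual Spark credential-provider code): check whether hive.metastore.uris is configured before attempting to obtain Hive delegation tokens, and skip that step when it is not.
> {code}
> import org.apache.hadoop.conf.Configuration
>
> // Hypothetical helper, not the real Spark code path: decide up front
> // whether Hive delegation tokens can be obtained at all.
> object HiveTokenCheck {
>   // A driver that talks to Hive over JDBC typically leaves
>   // hive.metastore.uris empty; in that case token acquisition should be
>   // skipped rather than failing a require() and exiting the JVM.
>   def metastoreTokensAvailable(hadoopConf: Configuration): Boolean = {
>     val uris = hadoopConf.getTrimmed("hive.metastore.uris", "")
>     uris.nonEmpty
>   }
>
>   def main(args: Array[String]): Unit = {
>     val conf = new Configuration()
>     // With no hive.metastore.uris set, the check reports false and the
>     // caller can skip obtainDelegationTokens() instead of throwing.
>     println(s"can obtain Hive delegation tokens: ${metastoreTokensAvailable(conf)}")
>   }
> }
> {code}
> With a pre-check like this, the --proxy-user path could simply skip Hive token acquisition on JDBC-only drivers instead of terminating the driver JVM.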



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org