Posted to issues@spark.apache.org by "Yuming Wang (Jira)" <ji...@apache.org> on 2019/09/12 10:52:00 UTC
[jira] [Resolved] (SPARK-28954) For SparkCLI, start up with conf of
HIVEAUXJARS, we add jar with SessionStateResourceLoader's addJar() API
[ https://issues.apache.org/jira/browse/SPARK-28954?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang resolved SPARK-28954.
---------------------------------
Resolution: Duplicate
> For SparkCLI, start up with conf of HIVEAUXJARS, we add jar with SessionStateResourceLoader's addJar() API
> ----------------------------------------------------------------------------------------------------------
>
> Key: SPARK-28954
> URL: https://issues.apache.org/jira/browse/SPARK-28954
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.4.0, 3.0.0
> Reporter: angerszhu
> Priority: Major
>
> When starting the Spark SQL CLI:
> For extra jars passed through the Hive conf {{HiveConf.ConfVars.HIVEAUXJARS}}, we don't need the complex shim APIs that work around differences between Hive versions; we can simply handle them through Spark's SessionResourceLoader API, which adds the jars to both Spark and the SparkSession's running environment.
> *SessionResourceLoader api* :
> {code:java}
> val resourceLoader = SparkSQLEnv.sqlContext.sessionState.resourceLoader
> StringUtils.split(auxJars, ",").foreach(resourceLoader.addJar(_))
> {code}
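Outside of Spark, the comma-splitting step above can be sketched in plain Scala (this is a hypothetical stand-in for Hadoop's `StringUtils.split`; the jar paths are made up), showing how each non-empty entry would be fed to `resourceLoader.addJar`:

```scala
object AuxJarSplit {
  // Split a comma-separated aux-jar list into individual paths,
  // trimming whitespace and dropping empty entries, so each path
  // can be passed to an addJar-style API one at a time.
  def splitAuxJars(auxJars: String): Seq[String] =
    Option(auxJars).toSeq
      .flatMap(_.split(","))
      .map(_.trim)
      .filter(_.nonEmpty)

  def main(args: Array[String]): Unit = {
    val jars = splitAuxJars("/tmp/a.jar, /tmp/b.jar,,")
    jars.foreach(println)
  }
}
```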
> *v1.2.1 ThriftServerShimUtils*:
> {code:java}
> private[thriftserver] def addToClassPath(
>     loader: ClassLoader,
>     auxJars: Array[String]): ClassLoader = {
>   Utilities.addToClassPath(loader, auxJars)
> }
> {code}
> *v2.3.5 ThriftServerShimUtils*:
> {code:java}
> private[thriftserver] def addToClassPath(
>     loader: ClassLoader,
>     auxJars: Array[String]): ClassLoader = {
>   val addAction = new AddToClassPathAction(loader, auxJars.toList.asJava)
>   AccessController.doPrivileged(addAction)
> }
> {code}
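The `doPrivileged` pattern in the 2.3.5 shim can be sketched standalone. `AddToClassPathAction` is a Hive class, so this uses a hypothetical `PrivilegedAction` that layers a `URLClassLoader` over the given loader instead:

```scala
import java.net.URLClassLoader
import java.security.{AccessController, PrivilegedAction}

object ClassPathSketch {
  // Hypothetical stand-in for Hive's AddToClassPathAction: builds a
  // URLClassLoader over the given jar paths inside doPrivileged, so the
  // new loader is created under a controlled security context.
  def addToClassPath(parent: ClassLoader, jars: Array[String]): ClassLoader = {
    val action = new PrivilegedAction[ClassLoader] {
      override def run(): ClassLoader =
        new URLClassLoader(jars.map(p => new java.io.File(p).toURI.toURL), parent)
    }
    AccessController.doPrivileged(action)
  }
}
```

Note that `AccessController.doPrivileged` is deprecated for removal as of JDK 17 (JEP 411), which is one more reason to isolate such calls in per-Hive-version shims rather than spreading them through the CLI startup path.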
--
This message was sent by Atlassian Jira
(v8.3.2#803003)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org