Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2015/06/11 21:25:01 UTC

[jira] [Commented] (SPARK-8306) AddJar command needs to set the new class loader to the HiveConf inside executionHive.state.

    [ https://issues.apache.org/jira/browse/SPARK-8306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14582405#comment-14582405 ] 

Yin Huai commented on SPARK-8306:
---------------------------------

In ClientWrapper, we have the following code
{code}
/** Returns the configuration for the current session. */
def conf: HiveConf = SessionState.get().getConf

// TODO: should this be a def?
private val client = Hive.get(conf)
{code}

So, when we create a ClientWrapper, the conf inside {{client}} is the conf of the SessionState that was current on the constructing thread (SessionState is a thread-local variable). If we later call the HiveContext from another thread, the conf held by {{client}} will not match the conf in the SessionState returned by SessionState.get, which returns the SessionState for the current thread.
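
To make the capture problem concrete, here is a self-contained analogy in plain Scala (hypothetical demo code, not the actual Hive/Spark classes): a thread-local stands in for SessionState, and a val captured at construction time stands in for {{private val client = Hive.get(conf)}}.

{code}
object ThreadLocalCaptureDemo {
  // Stand-in for SessionState: a thread-local whose value depends on the
  // thread that reads it.
  val session = new ThreadLocal[String] {
    override def initialValue(): String =
      "conf-of-" + Thread.currentThread().getName
  }

  class Client {
    // Evaluated once, on the constructing thread --
    // mirrors `private val client = Hive.get(conf)`.
    val capturedConf: String = session.get()
  }

  def main(args: Array[String]): Unit = {
    val client = new Client() // constructed on the "main" thread

    val worker = new Thread(new Runnable {
      override def run(): Unit = {
        // The thread-local lookup sees this thread's value...
        println("session.get() on this thread: " + session.get())   // conf-of-worker
        // ...but the client still holds the value captured on the main thread.
        println("conf captured by client:      " + client.capturedConf) // conf-of-main
      }
    }, "worker")
    worker.start()
    worker.join()
  }
}
{code}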

> AddJar command needs to set the new class loader to the HiveConf inside executionHive.state.
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8306
>                 URL: https://issues.apache.org/jira/browse/SPARK-8306
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>
> In {{AddJar}} command, we are using {{org.apache.hadoop.hive.ql.metadata.Hive.get().getConf().setClassLoader(newClassLoader)}}. However, the conf returned by {{Hive.get().getConf()}} is not necessarily the one set in {{executionHive.state}}. Thus, we may fail to set the correct class loader to {{executionHive}} in some cases.
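
The issue title suggests the direction of a fix: set the new loader on the HiveConf that {{executionHive.state}} actually holds, rather than on whatever conf {{Hive.get().getConf()}} returns for the current thread. A minimal sketch of that direction (hypothetical, not the merged patch; it assumes {{executionHive.state}} is reachable from the caller, and {{jarPath}} stands in for the jar being added):

{code}
import java.io.File
import java.net.URLClassLoader
import org.apache.spark.sql.hive.HiveContext

// Hypothetical sketch: point the class loader change at the HiveConf
// inside executionHive.state, per the issue title. Assumes executionHive
// is accessible from this scope.
def setLoaderOnExecutionHive(hiveContext: HiveContext, jarPath: String): Unit = {
  val newClassLoader = new URLClassLoader(
    Array(new File(jarPath).toURI.toURL),
    Thread.currentThread().getContextClassLoader)

  // SessionState.getConf returns the HiveConf this state was built with,
  // so the loader lands on the conf executionHive actually uses.
  hiveContext.executionHive.state.getConf.setClassLoader(newClassLoader)
  Thread.currentThread().setContextClassLoader(newClassLoader)
}
{code}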



