Posted to issues@hive.apache.org by "Xuefu Zhang (JIRA)" <ji...@apache.org> on 2016/01/23 02:43:39 UTC

[jira] [Issue Comment Deleted] (HIVE-12880) spark-assembly causes Hive class version problems

     [ https://issues.apache.org/jira/browse/HIVE-12880?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xuefu Zhang updated HIVE-12880:
-------------------------------
    Comment: was deleted

(was: [~sershe], thanks for working on this. I think the logic of copying spark-assembly.jar doesn't belong in the script. This process should be a one-time thing, while the script is executed over and over again. Ideally, this should be part of packaging. Therefore, I think a better way is to undo the original JIRA that introduced the logic.)

> spark-assembly causes Hive class version problems
> -------------------------------------------------
>
>                 Key: HIVE-12880
>                 URL: https://issues.apache.org/jira/browse/HIVE-12880
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Hui Zheng
>            Assignee: Sergey Shelukhin
>         Attachments: HIVE-12880.patch
>
>
> It looks like spark-assembly contains versions of Hive classes (e.g. HiveConf), and these sometimes (always?) come from older versions of Hive.
> We've seen problems where, depending on classpath perturbations, NoSuchFieldError may be thrown for recently added ConfVars because the HiveConf class comes from spark-assembly.
> Would making sure spark-assembly comes last in the classpath solve the problem?
> Otherwise, can we depend on something that does not package older Hive classes?
> Currently, HIVE-12179 provides a workaround (in non-Spark use case, at least; I am assuming this issue can also affect Hive-on-Spark).
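The ordering question above comes down to the JVM's first-match-wins rule: when two classpath entries contain the same fully qualified class, the loader takes it from whichever entry appears first. A minimal sketch of that behavior, using a resource entry as a stand-in for a duplicated .class file (all file names here are hypothetical; the `.marker` entry is invented for illustration and is not a real Hive or Spark artifact):

```java
import java.io.*;
import java.net.*;
import java.util.jar.*;

// Sketch: classpath lookup is first-match-wins. Two jars carry an entry
// at the same path (standing in for a duplicated HiveConf class); whichever
// jar comes first on the classpath shadows the other.
public class ClasspathOrderDemo {
    static final String ENTRY = "org/apache/hadoop/hive/conf/HiveConf.marker";

    // Build a throwaway jar containing a single entry with the given body.
    static File makeJar(String prefix, String body) throws IOException {
        File jar = File.createTempFile(prefix, ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry(ENTRY));
            out.write(body.getBytes("UTF-8"));
            out.closeEntry();
        }
        return jar;
    }

    // Resolve ENTRY through a classloader whose "classpath" is the given jars,
    // in order, and return the body of the copy that won.
    static String firstMatch(File... jars) throws Exception {
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        try (URLClassLoader cl = new URLClassLoader(urls, null);
             BufferedReader r = new BufferedReader(new InputStreamReader(
                     cl.getResourceAsStream(ENTRY), "UTF-8"))) {
            return r.readLine();
        }
    }

    public static void main(String[] args) throws Exception {
        File hive  = makeJar("hive-common", "new-hive");
        File spark = makeJar("spark-assembly", "old-hive");
        // spark-assembly first: its stale copy shadows Hive's own classes.
        System.out.println(firstMatch(spark, hive)); // prints old-hive
        // spark-assembly last: Hive's own classes win, as suggested above.
        System.out.println(firstMatch(hive, spark)); // prints new-hive
    }
}
```

This is why moving spark-assembly to the end of the classpath would mask the stale bundled classes, while the more robust fix is to avoid depending on an artifact that repackages older Hive classes at all.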



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)