Posted to issues@hive.apache.org by "Rui Li (JIRA)" <ji...@apache.org> on 2016/11/08 06:25:58 UTC

[jira] [Updated] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

     [ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rui Li updated HIVE-14825:
--------------------------
    Issue Type: Task  (was: Bug)

> Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0
> ---------------------------------------------------------------------------------------------
>
>                 Key: HIVE-14825
>                 URL: https://issues.apache.org/jira/browse/HIVE-14825
>             Project: Hive
>          Issue Type: Task
>          Components: Documentation
>            Reporter: Ferdinand Xu
>
> Since Spark 2.0.0 no longer ships an assembly jar, we should figure out the minimum set of jars required for HoS to work after bumping up to Spark 2.0.0. That way, users can decide whether to add just the required jars or, for convenience, all the jars under Spark's jar directory.
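
A minimal sketch of what "adding just the required jars" could look like. The candidate jar prefixes below (scala-library, spark-core, spark-network-common) are an assumption, not a confirmed answer; determining the real minimum set is the point of this JIRA. It also assumes SPARK_HOME and HIVE_HOME are set in the environment:

#!/usr/bin/env python
# Hypothetical sketch: copy a candidate minimal set of Spark jars
# into Hive's lib directory. The prefix list is an assumption.
import glob
import os
import shutil

SPARK_HOME = os.environ["SPARK_HOME"]  # e.g. /opt/spark-2.0.0
HIVE_HOME = os.environ["HIVE_HOME"]    # e.g. /opt/apache-hive

# Candidate prefixes only -- NOT the confirmed minimum set.
CANDIDATE_PREFIXES = ["scala-library", "spark-core", "spark-network-common"]

for prefix in CANDIDATE_PREFIXES:
    # Spark 2.0.0+ ships individual jars under $SPARK_HOME/jars
    # instead of a single spark-assembly jar.
    matches = glob.glob(os.path.join(SPARK_HOME, "jars", prefix + "*.jar"))
    if not matches:
        print("missing candidate jar: %s" % prefix)
        continue
    for jar in matches:
        shutil.copy(jar, os.path.join(HIVE_HOME, "lib"))
        print("copied %s" % os.path.basename(jar))

Reading from $SPARK_HOME/jars reflects the Spark 2.0.0 layout, where the former assembly was split into individual jars under that directory.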



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)