Posted to issues@hive.apache.org by "Rui Li (JIRA)" <ji...@apache.org> on 2016/11/07 08:53:58 UTC

[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

    [ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15643542#comment-15643542 ] 

Rui Li commented on HIVE-14825:
-------------------------------

For YARN, I can run some simple queries with the following jars:
{noformat}
scala-library
spark-core
spark-network-common
{noformat}
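To sanity-check such a minimal classpath, one rough approach is to probe a representative class from each jar, along these lines (the class names are my assumptions about what each artifact ships, not taken from the Hive build):
{code:java}
// Rough classpath sanity check: try to load one representative class per jar.
// The probe class names are assumptions about each artifact's contents.
public class MinimalClasspathCheck {
  public static void main(String[] args) {
    String[] probes = {
        "scala.Option",                             // scala-library
        "org.apache.spark.SparkConf",               // spark-core
        "org.apache.spark.network.TransportContext" // spark-network-common
    };
    for (String cls : probes) {
      try {
        Class.forName(cls);
        System.out.println("OK      " + cls);
      } catch (ClassNotFoundException e) {
        System.out.println("MISSING " + cls);
      }
    }
  }
}
{code}
Running it with only those jars on the classpath, anything reported MISSING points at a jar that is still required.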
For local mode, these extra jars are needed:
{noformat}
chill-java
chill
jackson-module-paranamer
jackson-module-scala
jersey-container-servlet-core
jersey-server
json4s-ast
kryo-shaded
minlog
scala-xml
spark-launcher
spark-network-shuffle
spark-unsafe
xbean-asm5-shaded
{noformat}
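As an aside, spark-launcher in that list is the artifact that provides Spark's programmatic submission API (spark-core depends on it). A minimal sketch of that API, only to show what the jar contains; every value below is a placeholder:
{code:java}
import org.apache.spark.launcher.SparkLauncher;

public class LauncherSketch {
  public static void main(String[] args) throws Exception {
    // Placeholder values throughout; only the SparkLauncher API itself is real.
    // launch() also expects SPARK_HOME to be set (or setSparkHome(...) called).
    Process spark = new SparkLauncher()
        .setMaster("local")
        .setAppName("hos-smoke-test")
        .setAppResource("/path/to/app.jar")
        .setMainClass("example.Main")
        .launch();
    spark.waitFor();
  }
}
{code}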

> Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0
> ---------------------------------------------------------------------------------------------
>
>                 Key: HIVE-14825
>                 URL: https://issues.apache.org/jira/browse/HIVE-14825
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Ferdinand Xu
>
> Considering that there is no assembly jar for Spark since 2.0.0, we should figure out the minimum set of jars required for HoS to work after bumping up to Spark 2.0.0. That way, users can decide whether to add just the required jars, or all the jars under Spark's directory for convenience.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)