Posted to issues@hive.apache.org by "Rui Li (JIRA)" <ji...@apache.org> on 2017/02/16 08:17:41 UTC
[jira] [Commented] (HIVE-15926) Hive 2.1.1 is not supporting any version of Spark
[ https://issues.apache.org/jira/browse/HIVE-15926?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15869485#comment-15869485 ]
Rui Li commented on HIVE-15926:
-------------------------------
Hi [~hrishidypim], Hive-2.1.1 doesn't support Spark-2.0.0 or later. Regarding the issue you hit with Spark-1.6.x, is your Spark built with Hive? If so, you'll need to install a Spark built w/o Hive. You can refer to this wiki: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
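A minimal sketch of the steps the linked wiki describes, assuming a Spark 1.6.x source tree and a standard Hive setup (the profile names, Hadoop version, and paths below are illustrative and should be adjusted to your environment):

```shell
# Build a Spark 1.6.x distribution WITHOUT the Hive profile:
# a Spark assembly that bundles its own Hive classes conflicts with
# the Hive installation that launches it (e.g. the
# NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS seen in this issue).
# Run from the Spark source root; flags here follow the wiki example.
./make-distribution.sh --name "hadoop2-without-hive" --tgz \
  "-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"

# Then point Hive at that Spark build, e.g. in hive-site.xml or
# interactively in the Hive CLI (paths are placeholders):
#   set spark.home=/path/to/spark-without-hive;
#   set hive.execution.engine=spark;
```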
> Hive 2.1.1 is not supporting any version of Spark
> --------------------------------------------------
>
> Key: HIVE-15926
> URL: https://issues.apache.org/jira/browse/HIVE-15926
> Project: Hive
> Issue Type: Bug
> Components: CLI
> Affects Versions: 2.1.1
> Reporter: Hrishieksh
>
> I have Hive 2.1.1 and I am trying to integrate it with Spark so that I can use Spark instead of MR as the execution engine.
> I opened the Hive CLI and ran a SQL query.
> When using Spark versions 1.6.1 and 1.6.3, I am getting the error:
> Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
> When using Spark version 2.0.0 or 2.1.0, I am getting an org/apache/spark/JavaSparkListener class not found exception.
> Please tell me how to resolve this problem.
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)