Posted to issues@spark.apache.org by "Oleksiy Sayankin (JIRA)" <ji...@apache.org> on 2016/09/16 12:40:20 UTC
[jira] [Comment Edited] (SPARK-17563) Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X
[ https://issues.apache.org/jira/browse/SPARK-17563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15496232#comment-15496232 ]
Oleksiy Sayankin edited comment on SPARK-17563 at 9/16/16 12:40 PM:
--------------------------------------------------------------------
{quote}JobMetricsListener is not part of Spark, right?{quote}
Yes, it belongs to Hive.
I can change JavaSparkListener --> SparkListener and
<spark.version>1.6.1</spark.version> --> <spark.version>2.0.0</spark.version>
in the Hive-2.X.X pom.xml. I expect this to work.
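For illustration, a minimal sketch of what that change could look like on the Hive side (a hypothetical excerpt; the real JobMetricsListener overrides more callbacks and keeps its metric bookkeeping):
{code}
package org.apache.hadoop.hive.ql.exec.spark.status.impl;

import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerJobStart;

// In Spark 2.0.0, SparkListener is an abstract class with no-op default
// implementations, so a Java subclass only overrides the callbacks it needs.
public class JobMetricsListener extends SparkListener {

  @Override
  public void onJobStart(SparkListenerJobStart jobStart) {
    // collect per-job metrics here, as the existing Hive listener already does
  }
}
{code}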
was (Author: osayankin):
Well I can change JavaSparkListener --> SparkListener and
<spark.version>1.6.1</spark.version> --> <spark.version>2.0.0</spark.version>
in pom.xml. I guess this will work.
> Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X
> -------------------------------------------------------------------------------
>
> Key: SPARK-17563
> URL: https://issues.apache.org/jira/browse/SPARK-17563
> Project: Spark
> Issue Type: Bug
> Reporter: Oleksiy Sayankin
>
> According to https://issues.apache.org/jira/browse/SPARK-14358, JavaSparkListener was removed in Spark-2.0.0, but Hive-2.X.X still uses it:
> {code}
> package org.apache.hadoop.hive.ql.exec.spark.status.impl;
> import ...
> public class JobMetricsListener extends JavaSparkListener {
> {code}
> Configuring Hive-2.X.X on Spark-2.0.0 therefore fails with an exception:
> {code}
> 2016-09-16T11:20:57,474 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(593)) - java.lang.NoClassDefFoundError: org/apache/spark/JavaSparkListener
> {code}
> Please add JavaSparkListener back into Spark-2.0.0.
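> A hypothetical sketch of such a compatibility class (an assumption about its shape, not the original Spark 1.x source; it could simply extend the abstract SparkListener that Spark-2.0.0 ships):
> {code}
> package org.apache.spark;
>
> import org.apache.spark.scheduler.SparkListener;
>
> // Hypothetical shim, kept only so existing subclasses such as Hive's
> // JobMetricsListener still compile and load against Spark-2.0.0.
> public class JavaSparkListener extends SparkListener {
>   // SparkListener already provides no-op defaults for all callbacks,
>   // so nothing needs to be added here.
> }
> {code}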