Posted to dev@oozie.apache.org by "Satish Subhashrao Saley (JIRA)" <ji...@apache.org> on 2017/09/19 23:42:00 UTC
[jira] [Commented] (OOZIE-2885) Running Spark actions should not need Hive on the classpath
[ https://issues.apache.org/jira/browse/OOZIE-2885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16172503#comment-16172503 ]
Satish Subhashrao Saley commented on OOZIE-2885:
------------------------------------------------
OOZIE-2845 removed the reflection-based code along with its try-catch block [click here for diff |https://github.com/apache/oozie/commit/6bac84745b9c62907e8cc6a16bad6c76ac3eb9c6#diff-f6e9d67b7671f1c2eb168642bd79789aL596].
OOZIE-2799 added some properties to hive-site.xml for users accessing Hive via Spark.
I think we should put the try-catch block back rather than include the Hive dependencies in the Spark sharelib. What do you think [~gezapeti]?
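To illustrate the idea: the try-catch approach loads HiveConf reflectively, so hive-common stays a runtime-optional dependency instead of a compile-time one. The sketch below is not the actual OOZIE-2845 code; the class name, method, and property key are illustrative, though org.apache.hadoop.hive.conf.HiveConf is the real fully-qualified class name that SparkMain references.

```java
// Hypothetical sketch of reflection-based, optional HiveConf access.
// No compile-time import of hive-common; if the jar is absent, we
// catch the error and skip Hive-specific setup instead of failing.
public class HiveConfLookup {

    // Returns the value of a Hive property if HiveConf is on the
    // classpath, or null when hive-common is not available.
    static String hiveProperty(String key) {
        try {
            Class<?> hiveConfClass =
                Class.forName("org.apache.hadoop.hive.conf.HiveConf");
            Object hiveConf = hiveConfClass.getDeclaredConstructor().newInstance();
            // HiveConf extends Hadoop Configuration, which has get(String).
            Object value = hiveConfClass.getMethod("get", String.class)
                                        .invoke(hiveConf, key);
            return (String) value;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            // hive-common not on the classpath: not a Hive job, carry on.
            return null;
        } catch (ReflectiveOperationException e) {
            // HiveConf present but unusable; treat the same as absent here.
            return null;
        }
    }

    public static void main(String[] args) {
        // On a classpath without hive-common this prints "null" rather
        // than throwing NoClassDefFoundError.
        System.out.println(hiveProperty("hive.metastore.uris"));
    }
}
```

With this pattern, a plain Spark action that never touches Hive runs fine without hive-common, while jobs that do ship Hive jars still pick up the configuration.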
> Running Spark actions should not need Hive on the classpath
> -----------------------------------------------------------
>
> Key: OOZIE-2885
> URL: https://issues.apache.org/jira/browse/OOZIE-2885
> Project: Oozie
> Issue Type: Bug
> Reporter: Peter Cseh
>
> SparkMain references HiveConf, so we need the hive-common jar on the classpath even if the Spark job has nothing to do with Hive.
> This shouldn't be the case.
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)