Posted to issues@hive.apache.org by "Sergey Shelukhin (JIRA)" <ji...@apache.org> on 2016/01/30 00:50:39 UTC
[jira] [Comment Edited] (HIVE-9608) Define SPARK_HOME if not defined automagically
[ https://issues.apache.org/jira/browse/HIVE-9608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15120535#comment-15120535 ]
Sergey Shelukhin edited comment on HIVE-9608 at 1/29/16 11:50 PM:
------------------------------------------------------------------
-FYI - revert of this JIRA is proposed in HIVE-12880, since it may add the default spark-assembly when it's not needed and cause dependency problems due to two versions of Hive being present-. Nevermind
was (Author: sershe):
FYI - revert of this JIRA is proposed in HIVE-12880, since it may add the default spark-assembly when it's not needed and cause dependency problems due to two versions of Hive being present.
> Define SPARK_HOME if not defined automagically
> ----------------------------------------------
>
> Key: HIVE-9608
> URL: https://issues.apache.org/jira/browse/HIVE-9608
> Project: Hive
> Issue Type: Improvement
> Reporter: Brock Noland
> Assignee: Brock Noland
> Priority: Minor
> Fix For: 1.1.0
>
> Attachments: HIVE-9608.patch, HIVE-9608.patch
>
>
> many hadoop installs are in {{dir/\{spark,hive,hadoop,..\}}}. We can infer {{SPARK_HOME}} in these cases.
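The inference described in the issue could be sketched as follows. This is a hypothetical illustration, not the actual HIVE-9608 patch: it assumes Hive's own location is known (here via a `HIVE_HOME` variable) and that Spark lives in a sibling `spark` directory, per the `dir/{spark,hive,hadoop,..}` layout above.

```shell
#!/bin/sh
# Illustrative sketch only: if SPARK_HOME is unset but HIVE_HOME is known,
# look for a sibling "spark" directory next to the Hive install and use it.
# HIVE_HOME and the "spark" directory name are assumptions for this example.
if [ -z "${SPARK_HOME}" ] && [ -n "${HIVE_HOME}" ]; then
  candidate="$(dirname "${HIVE_HOME}")/spark"
  if [ -d "${candidate}" ]; then
    export SPARK_HOME="${candidate}"
  fi
fi
echo "SPARK_HOME=${SPARK_HOME}"
```

As the later comments note, inferring a default this way can silently pick up an unintended spark-assembly, which is what HIVE-12880 initially proposed reverting.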
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)