Posted to issues@bigtop.apache.org by "Amir Sanjar (JIRA)" <ji...@apache.org> on 2017/01/02 21:02:58 UTC

[jira] [Commented] (BIGTOP-2654) spark binaries need either SPARK_HOME or non existing find-spark-home exe

    [ https://issues.apache.org/jira/browse/BIGTOP-2654?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15793493#comment-15793493 ] 

Amir Sanjar commented on BIGTOP-2654:
-------------------------------------

Hmm, I have not seen this problem on Ubuntu 16.04. Can you reproduce it when executing pyspark, or when running a job with spark-submit?
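
For reference, a quick way to check on a stock install (assuming the usual hardcoded /usr/lib/spark layout) is to clear the variable and call the launchers directly:

{code}
# make sure nothing in the environment masks the problem
unset SPARK_HOME

# each of these should fail at startup if the launcher really
# needs find-spark-home or an externally set SPARK_HOME
/usr/lib/spark/bin/spark-shell --version
/usr/lib/spark/bin/pyspark --help
/usr/lib/spark/bin/spark-submit --version
{code}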

> spark binaries need either SPARK_HOME or non existing find-spark-home exe
> -------------------------------------------------------------------------
>
>                 Key: BIGTOP-2654
>                 URL: https://issues.apache.org/jira/browse/BIGTOP-2654
>             Project: Bigtop
>          Issue Type: Bug
>    Affects Versions: 1.1.0
>            Reporter: Olaf Flebbe
>             Fix For: 1.2.0
>
>
> spark-shell and other executables need either the {{SPARK_HOME}} or the {{find-spark-home}} executable.
> The {{find-spark-home}} script is not packaged (which makes sense, since we use a hardcoded /usr/lib/spark when packaging Spark).
> Without the {{SPARK_HOME}} environment variable set, the executables do not run.
> I would prefer not to fix this with a Puppet script that drops a file into /etc/profile.d.
> I tend to patch the executables instead, so that {{SPARK_HOME}} is set within each executable itself (as spark-env.sh does):
> {code}
> export SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}
> {code}
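> Concretely, the patched header of e.g. {{bin/spark-shell}} could look like this (a sketch only; the exact launcher contents vary between Spark versions):
> {code}
> #!/usr/bin/env bash
> # Fall back to the hardcoded packaging prefix instead of sourcing
> # the (unpackaged) find-spark-home helper.
> export SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}
> exec "${SPARK_HOME}/bin/spark-submit" --class org.apache.spark.repl.Main --name "Spark shell" "$@"
> {code}
>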
> Comments?
> Or am I missing something important? (Maybe a Debian-specific problem?)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)