Posted to issues@bigtop.apache.org by "Olaf Flebbe (JIRA)" <ji...@apache.org> on 2017/01/02 20:46:58 UTC

[jira] [Updated] (BIGTOP-2654) spark binaries need either SPARK_HOME or non-existent find-spark-home exe

     [ https://issues.apache.org/jira/browse/BIGTOP-2654?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Olaf Flebbe updated BIGTOP-2654:
--------------------------------
    Description: 
spark-shell and the other Spark executables need either the {{SPARK_HOME}} environment variable or the {{find-spark-home}} executable.

The {{find-spark-home}} script is not packaged (which makes sense, since we use the hardcoded path /usr/lib/spark when packaging Spark).

As a result, the executables do not run unless the {{SPARK_HOME}} environment variable is set.
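
For reference, the stock Spark launchers locate the installation roughly like this (paraphrased from the upstream bin scripts; the exact lines vary by Spark version):

{code}
# Paraphrased prologue of the upstream bin/spark-shell and friends:
# when SPARK_HOME is unset, they source the find-spark-home helper
# sitting next to the script -- which our packages do not ship.
if [ -z "${SPARK_HOME}" ]; then
  source "$(dirname "$0")"/find-spark-home
fi
{code}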

I would prefer not to add a Puppet script that creates an /etc/profile.d snippet to fix the situation.
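
For comparison, the /etc/profile.d variant I would like to avoid would look something like this (file name hypothetical):

{code}
# /etc/profile.d/spark.sh (hypothetical name): sets SPARK_HOME for
# login shells only -- daemons and cron jobs would still not see it.
export SPARK_HOME=/usr/lib/spark
{code}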

Instead, I tend to patch the executables so that SPARK_HOME is set within each of them (similar to what spark-env.sh does):

{code}
export SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}
{code}
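
With that line patched in near the top of each launcher, before the first use of {{SPARK_HOME}}, a quick sanity check could look like this (hypothetical session):

{code}
# Sanity check after patching (hypothetical session):
unset SPARK_HOME
/usr/lib/spark/bin/spark-shell --version   # should now start without the variable
{code}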
Comments?

Or am I missing something important? (Maybe a Debian-specific problem?)


> spark binaries need either SPARK_HOME or non-existent find-spark-home exe
> -------------------------------------------------------------------------
>
>                 Key: BIGTOP-2654
>                 URL: https://issues.apache.org/jira/browse/BIGTOP-2654
>             Project: Bigtop
>          Issue Type: Bug
>    Affects Versions: 1.1.0
>            Reporter: Olaf Flebbe
>             Fix For: 1.2.0
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)