Posted to issues@spark.apache.org by "Rafael Pecin Ferreira (JIRA)" <ji...@apache.org> on 2016/03/15 20:48:33 UTC

[jira] [Created] (SPARK-13915) Allow bin/spark-submit to be called via symbolic link

Rafael Pecin Ferreira created SPARK-13915:
---------------------------------------------

             Summary: Allow bin/spark-submit to be called via symbolic link
                 Key: SPARK-13915
                 URL: https://issues.apache.org/jira/browse/SPARK-13915
             Project: Spark
          Issue Type: Improvement
          Components: Spark Submit
         Environment: CentOS 6.6
Tarball Spark distribution and CDH-5.x.x Spark version (both)
            Reporter: Rafael Pecin Ferreira
            Priority: Minor


We have a CDH-5 cluster that ships with spark-1.5.0, and we needed to use spark-1.5.1 for its bug fixes.

When I registered the new Spark (from the tarball, outside the CDH packaging) with the system alternatives mechanism, it created a chain of symbolic links to the target Spark installation.
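
For illustration, a setup along these lines (the paths and priority here are hypothetical) produces such a chain of links:

[hdfs@server01 ~]$ sudo alternatives --install /usr/bin/spark-submit spark-submit /opt/spark-1.5.1/bin/spark-submit 10
[hdfs@server01 ~]$ readlink /usr/bin/spark-submit
/etc/alternatives/spark-submit
[hdfs@server01 ~]$ readlink /etc/alternatives/spark-submit
/opt/spark-1.5.1/bin/spark-submit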

When I tried to run spark-submit, bash invoked the target with "$0" set to /usr/bin/spark-submit, but the script uses "$0" to locate its dependencies, so I was facing these messages:
[hdfs@server01 ~]$ env spark-submit
ls: cannot access /usr/assembly/target/scala-2.10: No such file or directory
Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
You need to build Spark before running this program.
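
The underlying behavior is easy to reproduce with a toy script (the paths below are made up for illustration): "$0" keeps the symlink path, so any directory derived from it points at the wrong tree:

[hdfs@server01 ~]$ cat /opt/tool/bin/show-home
#!/usr/bin/env bash
# Guess the install root from "$0", the same way spark-submit does.
echo "\$0 is $0"
echo "home guessed as $(cd "$(dirname "$0")/.." && pwd)"
[hdfs@server01 ~]$ sudo ln -s /opt/tool/bin/show-home /usr/bin/show-home
[hdfs@server01 ~]$ show-home
$0 is /usr/bin/show-home
home guessed as /usr

That is exactly why the error above looks in /usr/assembly/target/scala-2.10: the dirname of /usr/bin/spark-submit is /usr/bin, and its parent is /usr.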

I fixed the spark-submit script by adding these lines:
# Resolve any chain of symbolic links so that SPARK_HOME is derived from
# the real script location rather than from the alternatives link.
if [ -h "$0" ]; then
    checklink="$0"
    while [ -h "$checklink" ]; do
        link="$(readlink "$checklink")"
        case "$link" in
            /*) checklink="$link" ;;                         # absolute target
            *)  checklink="$(dirname "$checklink")/$link" ;; # target relative to the link's directory
        esac
    done
    SPARK_HOME="$(cd "$(dirname "$checklink")/.." && pwd)"
else
    SPARK_HOME="$(cd "$(dirname "$0")/.." && pwd)"
fi

It would be very nice if this piece of code were put into the spark-submit script, to allow us to have multiple Spark alternatives on the system.
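
For what it is worth, on systems with GNU coreutils the whole loop can be collapsed into one line; this is only a sketch, and "readlink -f" is not portable to every platform Spark supports (it is missing on older BSD/OS X userlands, for example):

SPARK_HOME="$(cd "$(dirname "$(readlink -f "$0")")/.." && pwd)"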



