Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/03/15 21:10:33 UTC

[jira] [Commented] (SPARK-13915) Allow bin/spark-submit to be called via symbolic link

    [ https://issues.apache.org/jira/browse/SPARK-13915?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15196113#comment-15196113 ] 

Sean Owen commented on SPARK-13915:
-----------------------------------

I'm not clear whether you're suggesting a change to spark-submit as deployed by this project. It already works when symlinked because of {{SPARK_HOME}}. Did you not set or update that variable?
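
For example, setting {{SPARK_HOME}} explicitly before invoking the symlink is usually enough; a minimal sketch, with an illustrative installation path:

    # Hypothetical path; point this at your actual Spark 1.5.1 installation.
    export SPARK_HOME=/opt/spark-1.5.1
    /usr/bin/spark-submit --version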

Also please read https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark

> Allow bin/spark-submit to be called via symbolic link
> -----------------------------------------------------
>
>                 Key: SPARK-13915
>                 URL: https://issues.apache.org/jira/browse/SPARK-13915
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>         Environment: CentOS 6.6
> Tarball Spark distribution and CDH-5.x.x Spark version (both)
>            Reporter: Rafael Pecin Ferreira
>            Priority: Minor
>
> We have a CDH-5 cluster that comes with spark-1.5.0, and we needed to use spark-1.5.1 for bug fixes.
> When I registered the Spark installation (outside the CDH box) with the system alternatives, it created a chain of symbolic links to the target Spark installation.
> When I tried to run spark-submit, the bash process called the target with "$0" set to /usr/bin/spark-submit, but the script uses "$0" to locate its dependencies, so I was seeing these messages:
> [hdfs@server01 ~]$ env spark-submit
> ls: cannot access /usr/assembly/target/scala-2.10: No such file or directory
> Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
> You need to build Spark before running this program.
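> This happens because the script derives its home directory from "$0" without resolving the symlink, roughly like this (a simplified sketch, not the exact upstream code):
>     # With $0 = /usr/bin/spark-submit (a symlink), SPARK_HOME resolves
>     # to /usr instead of the real Spark installation directory.
>     SPARK_HOME="$(cd "$(dirname "$0")/.."; pwd)"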
> I fixed the spark-submit script by adding these lines:
> if [ -h "$0" ]; then
>     checklink="$0"
>     # Follow the symlink chain to the real script location.
>     while [ -h "$checklink" ]; do
>         checklink="$(readlink "$checklink")"
>     done
>     SPARK_HOME="$(cd "$(dirname "$checklink")/.."; pwd)"
> else
>     SPARK_HOME="$(cd "$(dirname "$0")/.."; pwd)"
> fi
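> Alternatively, on GNU systems (CentOS ships GNU coreutils) the loop can be collapsed with readlink -f, which canonicalizes nested and relative symlinks in one call (not portable to all BSDs):
>     SPARK_HOME="$(cd "$(dirname "$(readlink -f "$0")")/.."; pwd)"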
> It would be very nice if this piece of code were put into the spark-submit script, to allow multiple Spark alternatives on the same system.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org