Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/05/25 23:06:01 UTC

[jira] [Resolved] (SPARK-1894) The default ec2 set-up ignores --driver-class-path

     [ https://issues.apache.org/jira/browse/SPARK-1894?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-1894.
------------------------------------

       Resolution: Fixed
    Fix Version/s:     (was: 1.0.1)
                   1.0.0
         Assignee: Andrew Or

This was fixed in the spark-ec2 repo:
https://github.com/mesos/spark-ec2/pull/51

> The default ec2 set-up ignores --driver-class-path
> --------------------------------------------------
>
>                 Key: SPARK-1894
>                 URL: https://issues.apache.org/jira/browse/SPARK-1894
>             Project: Spark
>          Issue Type: Task
>          Components: EC2
>    Affects Versions: 1.0.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>             Fix For: 1.0.0
>
>
> If the user sets up an ec2 cluster, the scripts automatically add the following lines to conf/spark-env.sh
> {code}
> export SPARK_SUBMIT_LIBRARY_PATH="/root/ephemeral-hdfs/lib/native/"
> export SPARK_SUBMIT_CLASSPATH="$SPARK_CLASSPATH:/root/ephemeral-hdfs/conf"
> {code}
> Unfortunately, these variables are exported after spark-submit parses the --driver-* flags, which also set these variables. As a result, all values set via the --driver-* flags get overridden.
> The simple fix is to append to these variables instead of overwriting them.
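A minimal sketch of the append-style fix described above, assuming the same spark-env.sh variables (the exact lines in the spark-ec2 patch may differ; see the linked pull request):

```shell
# Append the ec2 paths to whatever spark-submit's --driver-* flags already
# set, rather than overwriting those values. The ${VAR:+$VAR:} expansion
# emits "$VAR:" only when VAR is non-empty, avoiding a stray leading colon.
export SPARK_SUBMIT_LIBRARY_PATH="${SPARK_SUBMIT_LIBRARY_PATH:+$SPARK_SUBMIT_LIBRARY_PATH:}/root/ephemeral-hdfs/lib/native/"
export SPARK_SUBMIT_CLASSPATH="${SPARK_SUBMIT_CLASSPATH:+$SPARK_SUBMIT_CLASSPATH:}$SPARK_CLASSPATH:/root/ephemeral-hdfs/conf"
```

With this form, a user-supplied --driver-library-path or --driver-class-path survives in the final value instead of being silently discarded.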



--
This message was sent by Atlassian JIRA
(v6.2#6252)