Posted to issues@spark.apache.org by "Gerard Maas (JIRA)" <ji...@apache.org> on 2014/06/01 21:48:01 UTC

[jira] [Created] (SPARK-1985) SPARK_HOME shouldn't be required when spark.executor.uri is provided

Gerard Maas created SPARK-1985:
----------------------------------

             Summary: SPARK_HOME shouldn't be required when spark.executor.uri is provided
                 Key: SPARK-1985
                 URL: https://issues.apache.org/jira/browse/SPARK-1985
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.0.0
         Environment: MESOS
            Reporter: Gerard Maas


When trying to run the simple example from the mailing-list thread [1] on a Mesos installation, I get an error that SPARK_HOME is not set. A local Spark installation should not be required to run a job on Mesos. All that's needed is the executor package, i.e. the assembly tar.gz, at a reachable location (HDFS/S3/HTTP).
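
For reference, a minimal sketch of the intended configuration, assuming Spark 1.0.0 against Mesos (the master URL and assembly location below are placeholders, not real endpoints):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("mesos://zk://host:2181/mesos")  // placeholder Mesos master URL
      .setAppName("ExecutorUriExample")
      // Point executors at a prebuilt package; no local installation should be needed:
      .set("spark.executor.uri", "hdfs://namenode/path/spark-assembly-1.0.0.tar.gz")
    val sc = new SparkContext(conf)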

I went looking into the code, and there is indeed a check on SPARK_HOME [2] regardless of whether the assembly is provided, even though SPARK_HOME is only used when the assembly is missing (as a best-effort fallback strategy).

Current flow:

if (!SPARK_HOME) fail("No SPARK_HOME") 
else if (assembly) { use assembly) }
else { try use SPARK_HOME to build spark_executor } 

Should be:
sparkExecutor = if (assembly) { assembly }
                else if (SPARK_HOME) { try to use SPARK_HOME to build spark_executor }
                else { fail("No executor found. Please provide spark.executor.uri (preferred) or spark.home") }
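A hedged sketch of how that reordering might look inside MesosSchedulerBackend.createExecutorInfo (the names approximate the Spark 1.0 code and should be read as illustrative, not as the final patch):

    import java.io.File
    import org.apache.mesos.Protos.CommandInfo
    import org.apache.spark.SparkException

    val command = CommandInfo.newBuilder()
    sc.conf.getOption("spark.executor.uri") match {
      case Some(uri) =>
        // Preferred path: Mesos fetches the prebuilt package; SPARK_HOME is irrelevant.
        val basename = uri.split('/').last.split('.').head
        command.setValue(s"cd $basename*; ./sbin/spark-executor")
        command.addUris(CommandInfo.URI.newBuilder().setValue(uri))
      case None => sc.getSparkHome() match {
        case Some(sparkHome) =>
          // Fallback: build the executor command from a local installation.
          command.setValue(new File(sparkHome, "/sbin/spark-executor").getCanonicalPath)
        case None =>
          throw new SparkException(
            "No executor found. Please provide spark.executor.uri (preferred) or spark.home")
      }
    }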


[1] http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-with-Spark-Mesos-spark-shell-works-fine-td6165.html

[2] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala#L89


