Posted to issues@spark.apache.org by "Matt Molek (JIRA)" <ji...@apache.org> on 2018/11/03 19:21:00 UTC

[jira] [Created] (SPARK-25934) Mesos: SPARK_CONF_DIR should not be propagated by spark-submit

Matt Molek created SPARK-25934:
----------------------------------

             Summary: Mesos: SPARK_CONF_DIR should not be propagated by spark-submit
                 Key: SPARK-25934
                 URL: https://issues.apache.org/jira/browse/SPARK-25934
             Project: Spark
          Issue Type: Bug
          Components: Mesos
    Affects Versions: 2.3.2
            Reporter: Matt Molek


This is very similar to how SPARK_HOME caused problems for Spark on Mesos in SPARK-12345.

The `spark-submit` command sets spark.mesos.driverEnv.SPARK_CONF_DIR to whatever SPARK_CONF_DIR was in the environment of the command that submitted the job.

This doesn't make sense in most Mesos setups, and it broke Spark for my team when we upgraded from 2.2.0 to 2.3.2. I haven't tested 2.4.0, but I expect it has the same issue.

It prevents spark-env.sh from running, because SPARK_CONF_DIR now points to a non-existent directory instead of to the conf directory of the Spark binary unpacked in the Mesos sandbox, as it should.

I'm not that familiar with the Spark code base, but I think this could be fixed by simply adding a `&& k != "SPARK_CONF_DIR"` clause to this filter statement: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala#L421
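
For reference, here is a minimal, runnable sketch of what that change might look like. To be clear about assumptions: the filter is modeled on the 2.3-era RestSubmissionClient.filterSystemEnvironment, the object name EnvFilterSketch and the sample environment values are invented for illustration, and nothing here is verified against current master.

    // Sketch only: modeled on the 2.3-era
    // RestSubmissionClient.filterSystemEnvironment; names and paths
    // below are hypothetical.
    object EnvFilterSketch {
      // Forward SPARK_*/MESOS_* variables to the driver, minus the ones
      // that describe the submitter's local layout and would be wrong
      // inside the Mesos sandbox.
      def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
        env.filter { case (k, _) =>
          (k.startsWith("SPARK_") &&
            k != "SPARK_ENV_LOADED" &&
            k != "SPARK_HOME" &&
            k != "SPARK_CONF_DIR") ||  // <- the proposed new exclusion
          k.startsWith("MESOS_")
        }
      }

      def main(args: Array[String]): Unit = {
        val submitterEnv = Map(
          "SPARK_CONF_DIR" -> "/etc/spark/conf", // exists on the submitter only
          "SPARK_USER"     -> "someuser",
          "PATH"           -> "/usr/bin")
        // SPARK_CONF_DIR is dropped, so the driver would fall back to the
        // conf directory of the Spark binary unpacked in its sandbox:
        println(filterSystemEnvironment(submitterEnv))
        // prints: Map(SPARK_USER -> someuser)
      }
    }

With that exclusion in place, spark-env.sh should run again, assuming the driver's startup scripts default SPARK_CONF_DIR to the sandbox-local conf directory when it isn't set.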


