Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2018/11/16 16:02:00 UTC
[jira] [Resolved] (SPARK-25934) Mesos: SPARK_CONF_DIR should not be propagated by spark submit
[ https://issues.apache.org/jira/browse/SPARK-25934?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-25934.
-------------------------------
Resolution: Fixed
Fix Version/s: 2.3.3, 2.4.1, 3.0.0
Issue resolved by pull request 22937
[https://github.com/apache/spark/pull/22937]
> Mesos: SPARK_CONF_DIR should not be propagated by spark submit
> --------------------------------------------------------------
>
> Key: SPARK-25934
> URL: https://issues.apache.org/jira/browse/SPARK-25934
> Project: Spark
> Issue Type: Bug
> Components: Mesos
> Affects Versions: 2.3.2
> Reporter: Matt Molek
> Assignee: Matt Molek
> Priority: Major
> Fix For: 2.3.3, 3.0.0, 2.4.1
>
>
> This is very similar to how SPARK_HOME caused problems for Spark on Mesos in SPARK-12345.
> The `spark-submit` command sets spark.mesos.driverEnv.SPARK_CONF_DIR to whatever SPARK_CONF_DIR was in the environment of the command that submitted the job.
> This doesn't make sense for most Mesos deployments, and it broke Spark for my team when we upgraded from 2.2.0 to 2.3.2. I haven't tested 2.4.0, but I expect it has the same issue.
> It prevents spark-env.sh from running, because SPARK_CONF_DIR now points to a non-existent directory instead of the conf directory of the unpacked Spark distribution in the Mesos sandbox, as it should.
> I'm not that familiar with the Spark code base, but I think this could be fixed by simply adding a `&& k != "SPARK_CONF_DIR"` clause to this filter statement (a sketch follows below): https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala#L421
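
For illustration, here is a minimal, self-contained Scala sketch of the proposed exclusion. The method name filterSystemEnvironment, the object name EnvFilterSketch, and the existing SPARK_ENV_LOADED / SPARK_HOME exclusions are assumptions about the shape of the real filter in RestSubmissionClient.scala; the actual merged code in pull request 22937 may differ in names and detail.

    // Hypothetical sketch of the proposed fix; the real filter lives in
    // RestSubmissionClient.scala and may be structured differently.
    object EnvFilterSketch {
      // Forward only Spark/Mesos variables to the driver, dropping the ones
      // that describe the submitter's installation rather than the driver's.
      def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
        env.filter { case (k, _) =>
          (k.startsWith("SPARK_") &&
            k != "SPARK_ENV_LOADED" &&
            k != "SPARK_HOME" &&      // excluded since SPARK-12345
            k != "SPARK_CONF_DIR") || // the exclusion proposed here
          k.startsWith("MESOS_")
        }
      }

      def main(args: Array[String]): Unit = {
        val submitterEnv = Map(
          "SPARK_CONF_DIR" -> "/etc/spark/conf", // only valid on the submitting host
          "SPARK_USER" -> "alice",
          "MESOS_NATIVE_JAVA_LIBRARY" -> "/usr/lib/libmesos.so",
          "PATH" -> "/usr/bin")
        // Prints: Map(SPARK_USER -> alice, MESOS_NATIVE_JAVA_LIBRARY -> /usr/lib/libmesos.so)
        println(filterSystemEnvironment(submitterEnv))
      }
    }

With SPARK_CONF_DIR left out of the forwarded environment, the driver should fall back to the conf directory inside the sandbox's unpacked distribution, so spark-env.sh is found again.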
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org