Posted to issues@spark.apache.org by "Pascal GILLET (JIRA)" <ji...@apache.org> on 2017/10/17 14:38:00 UTC

[jira] [Commented] (SPARK-19606) Support constraints in spark-dispatcher

    [ https://issues.apache.org/jira/browse/SPARK-19606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16207722#comment-16207722 ] 

Pascal GILLET commented on SPARK-19606:
---------------------------------------

* _If "spark.mesos.constraints" is passed with the job then it will wind up overriding the value specified in the "driverDefault" property._: False. "spark.mesos.constraints" still applies to executors only, while the "driverDefault" value will apply to the driver.
* _If "spark.mesos.constraints" is not passed with the job, then the value specified in the "driverDefault" property will get passed to the executors - which we definitely don't want._: True.

OK then, let's add the "spark.mesos.constraints.driver" property.
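For illustration, a submission under that scheme could look like the sketch below. The "spark.mesos.constraints.driver" name is the property proposed above and not yet part of Spark; the host name, attribute values, and jar name are placeholders.

```shell
# Hypothetical sketch: distinct placement constraints for the driver vs the
# executors of a job submitted through the Mesos dispatcher in cluster mode.
# "spark.mesos.constraints" applies to executors only; the
# "spark.mesos.constraints.driver" property is the one proposed in this thread.
spark-submit \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.constraints="rack:spark-rack" \
  --conf spark.mesos.constraints.driver="zone:east" \
  my-app.jar
```

With this split, the dispatcher would match the driver constraint against agent attributes when placing the driver, while the executor constraint keeps its existing behavior untouched.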

> Support constraints in spark-dispatcher
> ---------------------------------------
>
>                 Key: SPARK-19606
>                 URL: https://issues.apache.org/jira/browse/SPARK-19606
>             Project: Spark
>          Issue Type: New Feature
>          Components: Mesos
>    Affects Versions: 2.1.0
>            Reporter: Philipp Hoffmann
>
> The `spark.mesos.constraints` configuration is ignored by the spark-dispatcher. The constraints need to be passed in the Framework information when registering with Mesos.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
