Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/12/19 09:45:00 UTC
[jira] [Commented] (SPARK-41585) The Spark exclude node functionality for YARN should work independently of dynamic allocation
[ https://issues.apache.org/jira/browse/SPARK-41585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17649233#comment-17649233 ]
Apache Spark commented on SPARK-41585:
--------------------------------------
User 'LucaCanali' has created a pull request for this issue:
https://github.com/apache/spark/pull/39127
> The Spark exclude node functionality for YARN should work independently of dynamic allocation
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-41585
> URL: https://issues.apache.org/jira/browse/SPARK-41585
> Project: Spark
> Issue Type: Bug
> Components: YARN
> Affects Versions: 3.3.1
> Reporter: Luca Canali
> Priority: Minor
>
> The Spark exclude node functionality for Spark on YARN, introduced in SPARK-26688, allows users to specify a list of node names that are excluded from resource allocation. This is done using the configuration parameter: {{spark.yarn.exclude.nodes}}
> The feature currently works only for executors allocated via dynamic allocation. To use it on Spark 3.3.1, for example, one may also need to configure spark.dynamicAllocation.minExecutors=0 and spark.executor.instances=0, thereby relying on dynamic allocation alone for executor resource allocation (see the configuration sketch after the quoted description below).
> This issue proposes extending the Spark exclude node functionality for YARN beyond dynamic allocation, which I believe would also make it more consistent with what the documentation states for this feature/configuration parameter.
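
For illustration only (not part of the original issue): a minimal Scala sketch of the configuration described above, assuming the Spark 3.3.1 workaround of relying on dynamic allocation alone. The application name and node names are hypothetical placeholders; the same settings could equally be passed as --conf options to spark-submit.

    import org.apache.spark.sql.SparkSession

    // Minimal sketch: exclude specific YARN nodes from executor allocation.
    // The app name and node names below are hypothetical placeholders.
    val spark = SparkSession.builder()
      .appName("exclude-nodes-sketch")
      // Comma-separated list of YARN node names to exclude from allocation.
      .config("spark.yarn.exclude.nodes", "badnode1.example.com,badnode2.example.com")
      // Workaround on Spark 3.3.1: rely on dynamic allocation only, since the
      // exclusion currently applies to dynamically allocated executors.
      .config("spark.dynamicAllocation.enabled", "true")
      .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
      .config("spark.dynamicAllocation.minExecutors", "0")
      .config("spark.executor.instances", "0")
      .getOrCreate()

If the proposed change is adopted, the dynamic-allocation settings above should no longer be required for the exclusion to take effect.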
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org