Posted to issues@spark.apache.org by "Nicolas Fraison (JIRA)" <ji...@apache.org> on 2018/12/11 14:10:00 UTC

[jira] [Created] (SPARK-26340) Ensure cores per executor is greater than cpu per task

Nicolas Fraison created SPARK-26340:
---------------------------------------

             Summary: Ensure cores per executor is greater than cpu per task
                 Key: SPARK-26340
                 URL: https://issues.apache.org/jira/browse/SPARK-26340
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.2, 2.2.2
            Reporter: Nicolas Fraison


No check is performed to ensure that spark.task.cpus is lower than or equal to spark.executor.cores, which can leave a job unable to schedule any task without an understandable error message. The check is currently performed only when dynamic allocation is used, in ExecutorAllocationManager.

Adding the check in TaskSchedulerImpl ensures that an error is raised to the driver at startup.
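The intended validation can be sketched as follows. This is an illustrative Python sketch of the check described above, not the actual Scala patch to TaskSchedulerImpl; the function name and the plain dict of configuration values are assumptions made for the example.

```python
def validate_task_cpus(conf):
    """Illustrative sketch: fail fast if spark.task.cpus exceeds
    spark.executor.cores, mirroring the check SPARK-26340 proposes.
    `conf` is assumed to be a dict of Spark configuration strings."""
    # Both settings default to 1 in Spark when unset.
    task_cpus = int(conf.get("spark.task.cpus", "1"))
    executor_cores = int(conf.get("spark.executor.cores", "1"))
    if task_cpus > executor_cores:
        # Without this check, no executor can ever offer enough CPUs
        # for a single task, so the job hangs with no clear error.
        raise ValueError(
            "spark.task.cpus (%d) must be <= spark.executor.cores (%d)"
            % (task_cpus, executor_cores)
        )
```

With a configuration such as {"spark.task.cpus": "4", "spark.executor.cores": "2"}, the check raises immediately on the driver instead of letting the scheduler wait forever for a suitable resource offer.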



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org