Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/04/05 18:57:00 UTC

[jira] [Resolved] (SPARK-27192) spark.task.cpus should be less than or equal to spark.executor.cores when using static executor allocation

     [ https://issues.apache.org/jira/browse/SPARK-27192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-27192.
-------------------------------
    Resolution: Fixed

Issue resolved by pull request 24261
[https://github.com/apache/spark/pull/24261]

> spark.task.cpus should be less than or equal to spark.executor.cores when using static executor allocation
> -----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-27192
>                 URL: https://issues.apache.org/jira/browse/SPARK-27192
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.2.0, 2.3.0, 2.4.0
>            Reporter: Lijia Liu
>            Assignee: Lijia Liu
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> When dynamic executor allocation is used, setting spark.executor.cores smaller than spark.task.cpus throws an exception:
> '''spark.executor.cores must not be < spark.task.cpus'''
> But if dynamic executor allocation is not enabled, Spark hangs when a new job is submitted, because TaskSchedulerImpl will never schedule a task on an executor whose available cores are fewer than
> spark.task.cpus. See [https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala#L351]
> So spark.task.cpus should be checked when the task scheduler starts (see the sketch after this report).
> Reproduce:
> $SPARK_HOME/bin/spark-shell --conf spark.task.cpus=2  --master local[1]
> scala> sc.parallelize(1 to 9).collect
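A minimal sketch, in Scala, of the kind of startup check the report asks for. This is not the actual change from pull request 24261; the object TaskCpusValidation, the method validate, and the parameter coresPerExecutor below are hypothetical names introduced for illustration, and only the SparkConf accessor getInt is real Spark API.

    import org.apache.spark.SparkConf

    // Hypothetical helper: fail fast while the task scheduler is starting,
    // instead of hanging because no executor can ever offer spark.task.cpus cores.
    object TaskCpusValidation {
      // coresPerExecutor: the fixed executor size under static allocation
      // (spark.executor.cores, or N for a local[N] master).
      def validate(conf: SparkConf, coresPerExecutor: Int): Unit = {
        val taskCpus = conf.getInt("spark.task.cpus", 1)
        require(taskCpus <= coresPerExecutor,
          s"spark.task.cpus ($taskCpus) must be <= the cores available per executor ($coresPerExecutor)")
      }
    }

In the reproduce case above, calling a check along these lines with coresPerExecutor = 1 (from local[1]) would fail immediately with an explicit error instead of letting sc.parallelize(1 to 9).collect hang.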


