Posted to issues@spark.apache.org by "liuxian (JIRA)" <ji...@apache.org> on 2017/07/22 08:15:01 UTC
[jira] [Updated] (SPARK-21506) The description of "spark.executor.cores" may not be correct
[ https://issues.apache.org/jira/browse/SPARK-21506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
liuxian updated SPARK-21506:
----------------------------
Description:
The description for "spark.executor.cores" reads: "The number of cores assigned to each executor is configurable. When this is not explicitly set, only one executor per application will run on the same worker."
I think this is not correct, because if an application is not assigned enough cores in the first `schedule()` pass, another executor may be launched on the same worker in a later pass.
was:
This description for "spark.executor.cores" reads: "The number of cores assigned to each executor is configurable. When this is not explicitly set, only one executor per application will run on the same worker."
I think this is not correct, because if an application is not assigned enough cores in the first `schedule()` pass, another executor may be launched on the same worker in a later pass.
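
For reference, a minimal sketch of how an application would set this property in standalone mode (the master URL, app name, and core counts below are hypothetical, not taken from this issue):

import org.apache.spark.{SparkConf, SparkContext}

object ExecutorCoresDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical standalone-mode setup. With spark.executor.cores = 2
    // and spark.cores.max = 8, the master may launch up to four 2-core
    // executors for this application, possibly several on the same
    // worker if it has enough free cores and memory.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077") // hypothetical master URL
      .setAppName("executor-cores-demo")
      .set("spark.executor.cores", "2")
      .set("spark.cores.max", "8")
    val sc = new SparkContext(conf)
    sc.stop()
  }
}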
> The description of "spark.executor.cores" may not be correct
> -------------------------------------------------------------
>
> Key: SPARK-21506
> URL: https://issues.apache.org/jira/browse/SPARK-21506
> Project: Spark
> Issue Type: Bug
> Components: Documentation
> Affects Versions: 2.3.0
> Reporter: liuxian
>
> The description for "spark.executor.cores" reads: "The number of cores assigned to each executor is configurable. When this is not explicitly set, only one executor per application will run on the same worker."
> I think this is not correct, because if an application is not assigned enough cores in the first `schedule()` pass, another executor may be launched on the same worker in a later pass.
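
To make the scheduling claim concrete, here is a toy, self-contained sketch (not Spark's actual Master code; all names here are made up) of a greedy per-pass scheduler that never checks whether an application already has an executor on a worker, so a later pass can place a second executor on the same worker even though the per-executor core count was never set:

import scala.collection.mutable

object SchedulePassSketch {
  case class Executor(workerId: String, cores: Int)
  class Worker(val id: String, var freeCores: Int)

  // One greedy scheduling pass: on every worker with free cores, launch a
  // new executor that grabs all the free cores the app still wants, without
  // checking whether the app already has an executor there.
  // Returns the number of cores the app still wants.
  def schedule(executors: mutable.Buffer[Executor],
               coresWanted: Int,
               workers: Seq[Worker]): Int = {
    var wanted = coresWanted
    for (w <- workers if wanted > 0 && w.freeCores > 0) {
      val grabbed = math.min(w.freeCores, wanted)
      executors += Executor(w.id, grabbed)
      w.freeCores -= grabbed
      wanted -= grabbed
    }
    wanted
  }

  def main(args: Array[String]): Unit = {
    val worker = new Worker("worker-1", freeCores = 2)
    val executors = mutable.Buffer.empty[Executor]

    // Pass 1: the app wants 4 cores but only 2 are free, so it gets one
    // 2-core executor and stays 2 cores short.
    var remaining = schedule(executors, coresWanted = 4, workers = Seq(worker))

    // Some other application finishes and frees 2 cores on the same worker.
    worker.freeCores += 2

    // Pass 2: a SECOND executor for the same app lands on the SAME worker.
    remaining = schedule(executors, remaining, Seq(worker))
    println(executors) // ArrayBuffer(Executor(worker-1,2), Executor(worker-1,2))
  }
}

Spark's real `schedule()` in the standalone Master is more involved than this sketch, but if the reporter's reading is right, the quoted one-executor-per-worker wording would only hold within a single scheduling pass, not across passes.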