Posted to issues@spark.apache.org by "zengrui (Jira)" <ji...@apache.org> on 2019/11/07 14:38:00 UTC
[jira] [Created] (SPARK-29791) Add a spark config to allow user to use executor cores virtually.
zengrui created SPARK-29791:
-------------------------------
Summary: Add a spark config to allow user to use executor cores virtually.
Key: SPARK-29791
URL: https://issues.apache.org/jira/browse/SPARK-29791
Project: Spark
Issue Type: Improvement
Components: Scheduler
Affects Versions: 2.1.0
Reporter: zengrui
We can configure the number of executor cores with "spark.executor.cores". For example, if we configure 8 cores for an executor, the driver can schedule at most 8 tasks on that executor concurrently. In practice, a task does not always occupy a full core (or more): much of its time is often spent waiting on disk or network IO. So we could let the driver schedule more than 8 tasks on this executor concurrently, which would make the whole job finish more quickly.
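To illustrate the proposal, here is a minimal sketch of how the per-executor task-slot calculation could be scaled by an oversubscription factor. The `virtual_factor` parameter stands in for the proposed (not yet existing) config key and is purely hypothetical; today Spark derives slots from the physical core count and `spark.task.cpus` only.

```python
def max_concurrent_tasks(executor_cores: int,
                         task_cpus: int = 1,
                         virtual_factor: float = 1.0) -> int:
    """Number of tasks the driver may run on one executor at once.

    Current Spark behaviour corresponds to virtual_factor = 1.0
    (slots = executor_cores // task_cpus). The proposal would scale
    the physical core count by an oversubscription factor so that
    IO-bound tasks can overlap on the same cores.
    NOTE: virtual_factor models a HYPOTHETICAL config; it is not a
    real Spark setting.
    """
    return int(executor_cores * virtual_factor) // task_cpus

# Current behaviour: 8 cores -> 8 concurrent tasks
print(max_concurrent_tasks(8))                       # 8
# Proposed: oversubscribe 2x for an IO-heavy job -> 16 concurrent tasks
print(max_concurrent_tasks(8, virtual_factor=2.0))   # 16
```

A factor below 1.0 would conversely under-subscribe an executor, which can be useful for CPU-heavy tasks that each saturate more than one core.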
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org