Posted to issues@spark.apache.org by "Noritaka Sekiyama (Jira)" <ji...@apache.org> on 2020/06/27 00:19:00 UTC
[jira] [Created] (SPARK-32112) Add a method to calculate the number of parallel tasks that Spark can process at the same time
Noritaka Sekiyama created SPARK-32112:
-----------------------------------------
Summary: Add a method to calculate the number of parallel tasks that Spark can process at the same time
Key: SPARK-32112
URL: https://issues.apache.org/jira/browse/SPARK-32112
Project: Spark
Issue Type: Improvement
Components: Spark Core
Affects Versions: 3.0.0
Reporter: Noritaka Sekiyama
Repartition/coalesce is very important for optimizing a Spark application's performance; however, many users struggle to determine the right number of partitions.
This issue proposes adding a method to calculate the number of parallel tasks that Spark can process at the same time.
It would help Spark users determine the optimal number of partitions.
Expected use-cases:
- Repartition with the calculated number of parallel tasks (see the sketch below)
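
For illustration only, here is a minimal sketch (in Scala) of the kind of calculation this ticket asks for. The helper name maxConcurrentTasks and the repartition factor of 2 are assumptions, not part of the proposal; it approximates the number of concurrently runnable tasks via SparkContext.defaultParallelism and uses the result to pick a partition count:

// Minimal sketch, not the API proposed in this ticket: the helper name
// maxConcurrentTasks is assumed here purely for illustration.
import org.apache.spark.sql.SparkSession

object ParallelismEstimate {
  // Approximate the number of tasks Spark can run at the same time.
  // On coarse-grained cluster managers, SparkContext.defaultParallelism
  // reflects the total cores available across registered executors.
  def maxConcurrentTasks(spark: SparkSession): Int =
    spark.sparkContext.defaultParallelism

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("parallelism-estimate").getOrCreate()

    val slots = maxConcurrentTasks(spark)
    val df = spark.range(0L, 1000000L)

    // Repartition to a small multiple of the available task slots so each
    // scheduling wave keeps every slot busy; the factor of 2 is arbitrary.
    val repartitioned = df.repartition(slots * 2)
    println(s"Task slots: $slots, partitions: ${repartitioned.rdd.getNumPartitions}")

    spark.stop()
  }
}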