Posted to issues@spark.apache.org by "Sital Kedia (JIRA)" <ji...@apache.org> on 2017/03/25 01:50:42 UTC
[jira] [Created] (SPARK-20091) DagScheduler should allow running
concurrent attempts of a stage in case of multiple fetch failure
Sital Kedia created SPARK-20091:
-----------------------------------
Summary: DagScheduler should allow running concurrent attempts of a stage in case of multiple fetch failure
Key: SPARK-20091
URL: https://issues.apache.org/jira/browse/SPARK-20091
Project: Spark
Issue Type: Improvement
Components: Scheduler
Affects Versions: 2.0.1
Reporter: Sital Kedia
Currently, the DAG scheduler does not allow running concurrent attempts of a stage. As a result, when multiple fetch failures are detected, the failed map stage is re-executed serially, one attempt after another, which delays the job run significantly.
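The delay described above can be illustrated with a toy cost model. This is a hypothetical sketch, not Spark's actual DAGScheduler code: the class name, the fixed per-attempt cost, and both policy functions are assumptions made purely to contrast serial re-attempts (the current behavior per this ticket) with overlapping concurrent re-attempts (the proposed improvement).

```java
// Hypothetical toy model (NOT Spark's real DAGScheduler): contrasts the
// wall-clock delay of re-running a failed map stage serially vs concurrently
// after multiple fetch failures. All names and costs here are assumed.
public class FetchFailureRetryModel {

    // Assumed cost, in seconds, of one map-stage re-attempt.
    static final int ATTEMPT_COST_SEC = 10;

    // Serial policy (current behavior per the ticket): each re-attempt
    // starts only after the previous attempt has finished, so delays add up.
    static int serialDelaySec(int fetchFailures) {
        return fetchFailures * ATTEMPT_COST_SEC;
    }

    // Concurrent policy (proposed): re-attempts triggered by the separate
    // fetch failures overlap, so the total delay is bounded by one attempt.
    static int concurrentDelaySec(int fetchFailures) {
        return fetchFailures > 0 ? ATTEMPT_COST_SEC : 0;
    }

    public static void main(String[] args) {
        int failures = 4;
        System.out.println("serial:     " + serialDelaySec(failures) + "s");
        System.out.println("concurrent: " + concurrentDelaySec(failures) + "s");
    }
}
```

With four fetch failures, the serial policy pays four attempt costs while the concurrent policy pays roughly one, which is the saving the ticket is after.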
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)