Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/09/10 07:15:46 UTC

[jira] [Commented] (SPARK-10530) Kill other task attempts when one task attempt belonging to the same task succeeds in speculation

    [ https://issues.apache.org/jira/browse/SPARK-10530?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14738180#comment-14738180 ] 

Apache Spark commented on SPARK-10530:
--------------------------------------

User 'zjffdu' has created a pull request for this issue:
https://github.com/apache/spark/pull/8683

> Kill other task attempts when one task attempt belonging to the same task succeeds in speculation
> --------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10530
>                 URL: https://issues.apache.org/jira/browse/SPARK-10530
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler, Spark Core
>            Reporter: Jeff Zhang
>
> Currently, when speculation is enabled, the other task attempts are not killed once one attempt of the same task has succeeded. This wastes resources; it would be better to kill the remaining running attempts as soon as one attempt of the task succeeds.
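
For readers skimming the archive, the intent of the change can be sketched roughly as follows. This is a minimal, standalone Scala sketch of the proposed behaviour under speculation, not the code in the pull request above; the names TaskAttempt, SpeculativeTaskSet, launch, and handleSuccessfulAttempt are hypothetical stand-ins for Spark's scheduler internals.

    import scala.collection.mutable

    // Hypothetical model of one launched copy of a task (original or speculative).
    final case class TaskAttempt(taskIndex: Int, attemptId: Int) {
      @volatile var running: Boolean = true
      def kill(reason: String): Unit = {
        running = false
        println(s"Killed task $taskIndex.$attemptId: $reason")
      }
    }

    // Hypothetical stand-in for the piece of the scheduler that tracks attempts per task.
    final class SpeculativeTaskSet {
      // All attempts launched for each task index, including speculative copies.
      private val attemptsByTask = mutable.Map.empty[Int, mutable.Buffer[TaskAttempt]]

      def launch(taskIndex: Int): TaskAttempt = {
        val attempts = attemptsByTask.getOrElseUpdate(taskIndex, mutable.Buffer.empty)
        val attempt = TaskAttempt(taskIndex, attempts.size)
        attempts += attempt
        attempt
      }

      // Called when one attempt finishes successfully: mark it done and kill
      // every other attempt of the same task that is still running, so its
      // resources are freed instead of computing a result that will be ignored.
      def handleSuccessfulAttempt(winner: TaskAttempt): Unit = {
        winner.running = false
        for {
          attempts <- attemptsByTask.get(winner.taskIndex)
          other    <- attempts
          if other.attemptId != winner.attemptId && other.running
        } other.kill(s"attempt ${winner.attemptId} of the same task already succeeded")
      }
    }

    object SpeculationDemo extends App {
      val taskSet     = new SpeculativeTaskSet
      val original    = taskSet.launch(taskIndex = 3)
      val speculative = taskSet.launch(taskIndex = 3) // speculative copy of task 3
      taskSet.handleSuccessfulAttempt(speculative)    // the original attempt gets killed
    }

The actual change in Spark would live in the scheduler's bookkeeping for task sets and would ask the backend to kill the redundant attempts; see the linked pull request for the real implementation.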



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org