Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/01/28 00:07:39 UTC

[jira] [Commented] (SPARK-13054) Always post TaskEnd event for tasks in cancelled stages

    [ https://issues.apache.org/jira/browse/SPARK-13054?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15120362#comment-15120362 ] 

Apache Spark commented on SPARK-13054:
--------------------------------------

User 'andrewor14' has created a pull request for this issue:
https://github.com/apache/spark/pull/10958

> Always post TaskEnd event for tasks in cancelled stages
> -------------------------------------------------------
>
>                 Key: SPARK-13054
>                 URL: https://issues.apache.org/jira/browse/SPARK-13054
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler
>    Affects Versions: 1.0.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>
> {code}
>     // The success case is dealt with separately below.
>     // TODO: Why post it only for failed tasks in cancelled stages? Clarify semantics here.
>     if (event.reason != Success) {
>       val attemptId = task.stageAttemptId
>       listenerBus.post(SparkListenerTaskEnd(
>         stageId, attemptId, taskType, event.reason, event.taskInfo, taskMetrics))
>     }
> {code}
> Today we only post task end events for tasks in cancelled stages if the task failed. There is no reason not to post the event for all tasks, including the ones that succeeded. Doing so would also let us simplify another branch in the DAGScheduler, which badly needs simplification.
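
For illustration only, here is a minimal sketch of how the failure-only guard could be dropped so the event is posted for every task in a cancelled stage. This is an assumption about the shape of the fix, not the contents of the pull request linked above; it reuses only the identifiers from the snippet quoted in the issue.

{code}
    // Sketch (assumed change, not the merged patch): post SparkListenerTaskEnd
    // unconditionally, so listeners also see tasks that succeeded in a
    // cancelled stage rather than only the ones that failed.
    val attemptId = task.stageAttemptId
    listenerBus.post(SparkListenerTaskEnd(
      stageId, attemptId, taskType, event.reason, event.taskInfo, taskMetrics))
{code}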



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org