Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/06/15 14:33:09 UTC

[jira] [Assigned] (SPARK-15963) `TaskKilledException` is not correctly caught in `Executor.TaskRunner`

     [ https://issues.apache.org/jira/browse/SPARK-15963?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-15963:
------------------------------------

    Assignee: Apache Spark

> `TaskKilledException` is not correctly caught in `Executor.TaskRunner`
> ----------------------------------------------------------------------
>
>                 Key: SPARK-15963
>                 URL: https://issues.apache.org/jira/browse/SPARK-15963
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.0, 2.1.0
>            Reporter: Liwei Lin
>            Assignee: Apache Spark
>
> Currently in {{Executor.TaskRunner}}, we:
> {code}
> try {...}
> catch {
>   case _: TaskKilledException | _: InterruptedException if task.killed =>
>   ...
> }
> {code}
> What we intended was:
> - {{TaskKilledException}} OR ({{InterruptedException}} AND {{task.killed}})
> But the actual semantics are:
> - ({{TaskKilledException}} OR {{InterruptedException}}) AND {{task.killed}}
> As a consequence, we sometimes cannot catch {{TaskKilledException}} and will incorrectly report the task status as {{FAILED}} (when it should really be {{KILLED}}).
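> A minimal, self-contained Scala sketch of the guard semantics (the names {{classify}} / {{classifyFixed}} and the local {{TaskKilledException}} class are illustrative only, not Spark's actual code): a guard attached to an alternative pattern applies to all alternatives, so one way to express the intended behavior is to split the cases.
> {code}
> class TaskKilledException extends RuntimeException
>
> // Buggy shape: the `killed` guard covers BOTH alternatives, so a
> // TaskKilledException thrown while killed == false is NOT caught here.
> def classify(e: Throwable, killed: Boolean): String =
>   try throw e
>   catch {
>     case _: TaskKilledException | _: InterruptedException if killed => "KILLED"
>     case _: Throwable => "FAILED"
>   }
>
> // classify(new TaskKilledException, killed = false) == "FAILED"  // the bug
>
> // Splitting the cases expresses the intended
> // "TaskKilledException OR (InterruptedException AND killed)" semantics:
> def classifyFixed(e: Throwable, killed: Boolean): String =
>   try throw e
>   catch {
>     case _: TaskKilledException => "KILLED"
>     case _: InterruptedException if killed => "KILLED"
>     case _: Throwable => "FAILED"
>   }
> {code}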



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org