Posted to issues@spark.apache.org by "coneyliu (JIRA)" <ji...@apache.org> on 2017/06/21 04:59:00 UTC

[jira] [Commented] (SPARK-19293) Spark 2.1.x unstable with spark.speculation=true

    [ https://issues.apache.org/jira/browse/SPARK-19293?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16056992#comment-16056992 ] 

coneyliu commented on SPARK-19293:
----------------------------------

When a task finishes, the `DAGScheduler` will try to kill the speculative backup copy of that task, so there can be exceptions about killing the backup task. Does this exception affect your application?
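
For anyone debugging this, here is a minimal sketch (not from the ticket) of a SparkListener that logs how each task attempt ended, so you can check whether the losing speculative copies are recorded as killed or as failed. The class name TaskEndLogger is hypothetical; the listener API and the TaskInfo.speculative flag are standard Spark.

    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    // Hypothetical helper: logs the end reason of every task attempt.
    class TaskEndLogger extends SparkListener {
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        // The winning attempt ends with Success; a backup that lost the race
        // should end with a TaskKilled-style reason rather than a failure.
        println(s"stage=${taskEnd.stageId} task=${taskEnd.taskInfo.taskId} " +
          s"attempt=${taskEnd.taskInfo.attemptNumber} " +
          s"speculative=${taskEnd.taskInfo.speculative} reason=${taskEnd.reason}")
      }
    }

    // Register on an existing SparkContext (here called `sc`):
    // sc.addSparkListener(new TaskEndLogger)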

> Spark 2.1.x unstable with spark.speculation=true
> ------------------------------------------------
>
>                 Key: SPARK-19293
>                 URL: https://issues.apache.org/jira/browse/SPARK-19293
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0, 2.1.1
>            Reporter: Damian Momot
>            Priority: Critical
>
> After upgrading from Spark 2.0.2 to 2.1.0, we've observed that jobs often fail when speculative mode is enabled.
> In 2.0.2, speculative tasks were simply skipped if their result was not used (i.e., another instance finished earlier), and it was clearly visible in the UI that those tasks were not counted as failures.
> In 2.1.0, many tasks are marked failed/killed when speculative tasks start to run (that is, at the end of a stage, when there are spare executors to use), which also leads to entire stage/job failures.
> Disabling spark.speculation avoids the failures, but speculative mode is very useful, especially when different executors run on machines with varying load (for example, on YARN).
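
For reference, a minimal sketch of the settings involved, assuming a cluster deployment (Spark's scheduler skips the speculation thread in local mode). The app name is hypothetical; the spark.speculation* keys are standard Spark configuration.

    import org.apache.spark.sql.SparkSession

    // Master/deploy mode are assumed to be supplied via spark-submit.
    val spark = SparkSession.builder()
      .appName("speculation-demo")  // hypothetical
      .config("spark.speculation", "true")            // "false" works around the failures above
      .config("spark.speculation.interval", "100ms")  // how often to check for stragglers
      .config("spark.speculation.quantile", "0.75")   // fraction of tasks that must finish first
      .config("spark.speculation.multiplier", "1.5")  // how much slower than the median before a backup launches
      .getOrCreate()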



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org