Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2016/02/16 19:50:18 UTC

[jira] [Created] (SPARK-13343) speculative tasks that didn't commit shouldn't be marked as success

Thomas Graves created SPARK-13343:
-------------------------------------

             Summary: speculative tasks that didn't commit shouldn't be marked as success
                 Key: SPARK-13343
                 URL: https://issues.apache.org/jira/browse/SPARK-13343
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 1.6.0
            Reporter: Thomas Graves


Currently, speculative tasks that didn't commit can show up as either SUCCESS or FAILED, depending on the timing of the commit. This is confusing because such a task didn't really succeed: it didn't write anything.
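
For context, speculative attempts only exist when speculation is turned on. A minimal sketch of enabling it, assuming a standard Spark 1.6 setup and using the documented spark.speculation* settings (the values shown are illustrative, not a recommendation):

    // Enable speculative execution so slow tasks get duplicate attempts;
    // only one attempt is allowed to commit its output.
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("speculation-example")
      .set("spark.speculation", "true")            // launch speculative copies of slow tasks
      .set("spark.speculation.interval", "100ms")  // how often to check for tasks to speculate
      .set("spark.speculation.multiplier", "1.5")  // how much slower than the median before speculating
      .set("spark.speculation.quantile", "0.75")   // fraction of tasks that must finish before checking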

I think these tasks should be marked as KILLED, or something that makes it more obvious to the user exactly what happened. If a task happens to hit the timing where it gets a commit-denied exception, it shows up as FAILED and counts against your task failures. It shouldn't count against task failures, since that failure doesn't really matter.
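
To make the proposal concrete, here is a hypothetical sketch in plain Scala (not the actual Spark scheduler code; TaskOutcome and the helpers below are made-up names) of the classification being suggested: a speculative attempt that loses the commit race is reported as KILLED and excluded from the failure count:

    // Hypothetical model of task outcomes; not Spark's internal types.
    sealed trait TaskOutcome
    case object Committed extends TaskOutcome              // the attempt that won the commit
    case object CommitDenied extends TaskOutcome           // speculative attempt denied the commit
    final case class Errored(cause: Throwable) extends TaskOutcome

    // Desired UI/state mapping: a denied commit is neither a success nor a failure.
    def finalState(outcome: TaskOutcome): String = outcome match {
      case Committed    => "SUCCESS"
      case CommitDenied => "KILLED"
      case Errored(_)   => "FAILED"
    }

    // Desired failure accounting: a denied commit should not count toward
    // spark.task.maxFailures, since nothing actually went wrong.
    def countsTowardTaskFailures(outcome: TaskOutcome): Boolean = outcome match {
      case Errored(_) => true
      case _          => false
    }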

MapReduce handles this situation, so perhaps we can look there for a model.



