Posted to issues@spark.apache.org by "Hieu Tri Huynh (JIRA)" <ji...@apache.org> on 2018/06/20 02:26:00 UTC

[jira] [Commented] (SPARK-10781) Allow certain number of failed tasks and allow job to succeed

    [ https://issues.apache.org/jira/browse/SPARK-10781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16517716#comment-16517716 ] 

Hieu Tri Huynh commented on SPARK-10781:
----------------------------------------

I attached a proposed solution for this Jira. I hope to receive opinions from all of you. Thank you.

[^SPARK_10781_Proposed_Solution.pdf]
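
For illustration, here is a minimal sketch of how a driver program could opt in to such a threshold. The key spark.task.maxFailedTasksPercent is hypothetical and only stands in for whatever name the final change would use; spark.task.maxFailures is the existing per-task retry limit, and today a single task that exhausts its retries fails the whole job.

import org.apache.spark.{SparkConf, SparkContext}

object PartialFailureDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("partial-failure-demo")
      .setMaster("local[*]")                          // for local testing only
      .set("spark.task.maxFailures", "4")             // existing: retry limit per task
      .set("spark.task.maxFailedTasksPercent", "10")  // HYPOTHETICAL key: tolerate up to 10% failed tasks

    val sc = new SparkContext(conf)
    try {
      // Under the proposed behavior, this job would still succeed even if up to
      // 10% of its tasks exhausted their retries, instead of aborting the stage.
      val sum = sc.parallelize(1L to 1000000L, numSlices = 100).reduce(_ + _)
      println(s"sum = $sum")
    } finally {
      sc.stop()
    }
  }
}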

> Allow certain number of failed tasks and allow job to succeed
> -------------------------------------------------------------
>
>                 Key: SPARK-10781
>                 URL: https://issues.apache.org/jira/browse/SPARK-10781
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.5.0
>            Reporter: Thomas Graves
>            Priority: Major
>         Attachments: SPARK_10781_Proposed_Solution.pdf
>
>
> MapReduce has the configs mapreduce.map.failures.maxpercent and mapreduce.reduce.failures.maxpercent, which allow a certain percentage of map and reduce tasks to fail while the job still succeeds.
> This could also be a useful feature in Spark, for jobs that do not need every task to succeed.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org