Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/10 04:18:11 UTC

[jira] [Commented] (SPARK-6735) Provide options to make maximum executor failure count (which kills the application) relative to a window duration or disable it.

    [ https://issues.apache.org/jira/browse/SPARK-6735?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15049950#comment-15049950 ] 

Apache Spark commented on SPARK-6735:
-------------------------------------

User 'jerryshao' has created a pull request for this issue:
https://github.com/apache/spark/pull/10241

> Provide options to make maximum executor failure count (which kills the application) relative to a window duration or disable it.
> ---------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6735
>                 URL: https://issues.apache.org/jira/browse/SPARK-6735
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit, YARN
>    Affects Versions: 1.2.0, 1.2.1, 1.3.0
>            Reporter: Twinkle Sachdeva
>
> Currently there is a setting (spark.yarn.max.executor.failures) which specifies the maximum number of executor failures after which the application fails.
> For long-running applications, users may want the application never to be killed for this reason, or may want the failure count to be evaluated relative to a window duration. This improvement is to provide options to make the maximum executor failure count (which kills the application) relative to a window duration, or to disable it.
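A minimal sketch of how these settings could be supplied from application code, assuming Spark's Scala SparkConf API. spark.yarn.max.executor.failures is the existing setting described above; spark.yarn.executor.failuresValidityInterval is assumed here as the window-based key the linked pull request proposes, so treat its name and value format as illustrative:

    import org.apache.spark.SparkConf

    // Existing setting: fail the application after this many executor
    // failures over the application's whole lifetime.
    val conf = new SparkConf()
      .setAppName("long-running-app")
      .set("spark.yarn.max.executor.failures", "10")
      // Assumed window-based variant: only executor failures within the
      // last hour would count toward the limit (key name taken from the
      // linked pull request, not verified here).
      .set("spark.yarn.executor.failuresValidityInterval", "1h")

The same keys could equally be passed with --conf on spark-submit instead of being set programmatically.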



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org