Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/01/09 09:16:34 UTC

[jira] [Commented] (SPARK-5169) fetch the correct max attempts

    [ https://issues.apache.org/jira/browse/SPARK-5169?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14270743#comment-14270743 ] 

Apache Spark commented on SPARK-5169:
-------------------------------------

User 'WangTaoTheTonic' has created a pull request for this issue:
https://github.com/apache/spark/pull/3942

> fetch the correct max attempts
> ------------------------------
>
>                 Key: SPARK-5169
>                 URL: https://issues.apache.org/jira/browse/SPARK-5169
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>            Reporter: WangTaoTheTonic
>
> If we set spark.yarn.maxAppAttempts to a value larger than yarn.resourcemanager.am.max-attempts on the YARN side, it will be overridden. So we should consult both the Spark conf and the YARN conf to get the correct retry number, which is used to judge whether the current attempt is the last one.
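
The logic the description calls for can be sketched as follows. This is a hedged illustration, not the actual pull request: the `effectiveMaxAttempts` method and the plain `Map` stand-ins for Spark's `SparkConf` and Hadoop's `YarnConfiguration` are hypothetical; only the two property names and YARN's default of 2 attempts come from the real systems.

```java
import java.util.Map;

public class MaxAttempts {
    // YARN's documented default for yarn.resourcemanager.am.max-attempts.
    static final int YARN_DEFAULT_MAX_ATTEMPTS = 2;

    // sparkConf and yarnConf are simplified stand-ins (hypothetical) for
    // SparkConf and YarnConfiguration.
    static int effectiveMaxAttempts(Map<String, String> sparkConf,
                                    Map<String, String> yarnConf) {
        int yarnMax = Integer.parseInt(
            yarnConf.getOrDefault("yarn.resourcemanager.am.max-attempts",
                                  String.valueOf(YARN_DEFAULT_MAX_ATTEMPTS)));
        String sparkMax = sparkConf.get("spark.yarn.maxAppAttempts");
        if (sparkMax == null) {
            return yarnMax; // no Spark-side override: YARN's limit applies
        }
        // YARN silently caps any larger Spark value, so take the minimum
        // to know the real retry budget.
        return Math.min(Integer.parseInt(sparkMax), yarnMax);
    }

    public static void main(String[] args) {
        Map<String, String> spark = Map.of("spark.yarn.maxAppAttempts", "5");
        Map<String, String> yarn =
            Map.of("yarn.resourcemanager.am.max-attempts", "3");
        // Spark asks for 5 attempts, YARN allows only 3: the effective
        // limit used to decide "is this the last attempt?" must be 3.
        System.out.println(effectiveMaxAttempts(spark, yarn)); // prints 3
    }
}
```

Taking the minimum of the two settings is what lets the ApplicationMaster correctly decide whether the current attempt is its last, instead of trusting a Spark value that YARN will never honor.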



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org