Posted to issues@spark.apache.org by "Henry Kim (JIRA)" <ji...@apache.org> on 2017/01/04 19:16:58 UTC
[jira] [Commented] (SPARK-15984) WARN message "o.a.h.y.s.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 8 is invalid" when starting application on YARN
[ https://issues.apache.org/jira/browse/SPARK-15984?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15799087#comment-15799087 ]
Henry Kim commented on SPARK-15984:
-----------------------------------
The workaround Saisai described is to add the following configuration to your application:
--conf spark.yarn.maxAppAttempts=2
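For context, a sketch of how the workaround fits into the full spark-shell invocation from the issue description (the YARN_CONF_DIR path and the other flags mirror that description; adjust for your setup):

```shell
# Launch spark-shell on YARN with the workaround applied.
# spark.yarn.maxAppAttempts must fall within YARN's allowed range
# [1, yarn.resourcemanager.am.max-attempts] (default upper bound: 2),
# which avoids the "specific max attempts: 0 ... is invalid" warning.
YARN_CONF_DIR=hadoop-conf ./bin/spark-shell \
  --master yarn \
  --deploy-mode client \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.yarn.maxAppAttempts=2
```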
> WARN message "o.a.h.y.s.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 8 is invalid" when starting application on YARN
> ------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-15984
> URL: https://issues.apache.org/jira/browse/SPARK-15984
> Project: Spark
> Issue Type: Improvement
> Components: YARN
> Affects Versions: 2.0.0
> Reporter: Jacek Laskowski
> Priority: Minor
>
> When executing {{spark-shell}} on Spark on YARN 2.7.2 on Mac OS as follows:
> {code}
> YARN_CONF_DIR=hadoop-conf ./bin/spark-shell --master yarn -c spark.shuffle.service.enabled=true --deploy-mode client -c spark.scheduler.mode=FAIR
> {code}
> it ends up with the following WARN in the logs:
> {code}
> 2016-06-16 08:33:05,308 INFO org.apache.hadoop.yarn.server.resourcemanager.ClientRMService: Allocated new applicationId: 8
> 2016-06-16 08:33:07,305 WARN org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 8 is invalid, because it is out of the range [1, 2]. Use the global max attempts instead.
> {code}