Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/12 21:35:00 UTC

[jira] [Resolved] (SPARK-15955) Failed Spark application returns with exitcode equals to zero

     [ https://issues.apache.org/jira/browse/SPARK-15955?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-15955.
------------------------------------
    Resolution: Cannot Reproduce

I'm pretty sure this has worked reliably for a while. If there's still some edge case, please provide more information.

> Failed Spark application returns with exitcode equals to zero
> -------------------------------------------------------------
>
>                 Key: SPARK-15955
>                 URL: https://issues.apache.org/jira/browse/SPARK-15955
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.6.1
>            Reporter: Yesha Vora
>            Priority: Major
>
> Scenario:
> * Set up cluster with wire-encryption enabled.
> * Set 'spark.authenticate.enableSaslEncryption' = 'false' and 'spark.shuffle.service.enabled' = 'true'.
> * Run the SparkPi application.
> {code}
> client token: Token { kind: YARN_CLIENT_TOKEN, service:  }
> diagnostics: Max number of executor failures (3) reached
> ApplicationMaster host: xx.xx.xx.xxx
> ApplicationMaster RPC port: 0
> queue: default
> start time: 1465941051976
> final status: FAILED
> tracking URL: https://xx.xx.xx.xxx:8090/proxy/application_1465925772890_0016/
> user: hrt_qa
> Exception in thread "main" org.apache.spark.SparkException: Application application_1465925772890_0016 finished with failed status
> at org.apache.spark.deploy.yarn.Client.run(Client.scala:1092)
> at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1139)
> at org.apache.spark.deploy.yarn.Client.main(Client.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> INFO ShutdownHookManager: Shutdown hook called{code}
> This Spark application exits with exit code 0. A failed application should not return exit code 0.
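
The complaint above is that the driver's failure is not propagated to the calling shell. Since an uncaught SparkException escaping main() makes the JVM exit with status 1, a wrapper script should be able to detect the failure. A minimal sketch of that check, using a stub function in place of the real spark-submit invocation (the stub is an assumption for illustration, not part of the original report):

```shell
#!/bin/sh
# Stand-in for the real launcher; a spark-submit command line would go here.
# Returning 1 simulates the JVM exiting after the SparkException that is
# thrown when the YARN final status is FAILED.
run_app() {
  return 1
}

run_app
status=$?
if [ "$status" -ne 0 ]; then
  echo "application failed with exit code $status"
fi
```

With the fix in place, `$?` after spark-submit reflects the application's final status, so automation (CI jobs, Oozie/cron wrappers) can branch on it instead of parsing the driver log.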



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org