Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/12 21:36:00 UTC

[jira] [Resolved] (SPARK-18898) Exception not failing Scala applications (in yarn)

     [ https://issues.apache.org/jira/browse/SPARK-18898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-18898.
------------------------------------
    Resolution: Cannot Reproduce

I'm very sure this works in cluster mode.

In client mode this is expected. The driver is not running inside YARN and thus cannot affect the final status of the YARN application.
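
For illustration, a minimal sketch (class name and message are made up): in cluster mode the user's main method runs inside the YARN ApplicationMaster, so an exception that escapes it should cause YARN to report a FAILED final status.

{code:title=FailingJob.scala|borderStyle=solid}
object FailingJob {
  def main(args: Array[String]): Unit = {
    // Submitted with --master yarn --deploy-mode cluster.
    // An uncaught exception escaping main() is seen by the
    // ApplicationMaster, which reports FAILED back to YARN.
    if (args.isEmpty) {
      throw new IllegalStateException("no input supplied, failing the job")
    }
    // ... normal Spark job logic would go here ...
  }
}
{code}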

> Exception not failing Scala applications (in yarn)
> --------------------------------------------------
>
>                 Key: SPARK-18898
>                 URL: https://issues.apache.org/jira/browse/SPARK-18898
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit, YARN
>    Affects Versions: 2.0.2
>            Reporter: Raphael
>            Priority: Major
>
> I am submitting my Scala applications with SparkLauncher, which uses spark-submit under the hood.
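> Roughly how I launch it; the jar path and main class below are just placeholders:
> {code:title=Launcher.scala|borderStyle=solid}
> import org.apache.spark.launcher.SparkLauncher
>
> object Launcher {
>   def main(args: Array[String]): Unit = {
>     val spark = new SparkLauncher()
>       .setAppResource("/path/to/my-app.jar") // placeholder path
>       .setMainClass("com.example.MyJob")     // placeholder class
>       .setMaster("yarn")
>       .setDeployMode("client")
>       .launch()
>     val exitCode = spark.waitFor()
>     println(s"spark-submit exited with code $exitCode")
>   }
> }
> {code}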
> When I throw an error in my Spark job, the final status of the job in YARN is still FINISHED. Looking at the source code of SparkSubmit:
> {code:title=SparkSubmit.scala|borderStyle=solid}
> ...
> try {
>       mainMethod.invoke(null, childArgs.toArray)
>     } catch {
>       case t: Throwable =>
>         findCause(t) match {
>           case SparkUserAppException(exitCode) =>
>             System.exit(exitCode)
>           case t: Throwable =>
>             throw t
>         }
>     }
> ...
> {code}
>  
> It seems our code would have to throw SparkUserAppException, but this exception is a private case class defined in SparkException.scala.
> Please make this case class available, or give us a way to raise errors from inside our applications.
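> The closest workaround I can see (just a sketch, not verified across versions) is to exit the driver with a non-zero code and have the launcher check the child process exit code instead of the YARN final status:
> {code:title=MyJob.scala|borderStyle=solid}
> object MyJob {
>   def main(args: Array[String]): Unit = {
>     try {
>       // ... run the Spark job ...
>     } catch {
>       case e: Exception =>
>         // SparkUserAppException is private, so signal failure through the
>         // process exit code and let the launcher side inspect it.
>         System.err.println(s"Job failed: ${e.getMessage}")
>         System.exit(1)
>     }
>   }
> }
> {code}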
> More details at: http://stackoverflow.com/questions/41184158/how-to-throw-an-exception-in-spark
> In the past, the same issue with pyspark was opened here:
> https://issues.apache.org/jira/browse/SPARK-7736
> And resolved here:
> https://github.com/apache/spark/pull/8258
> Best Regards 
> Raphael.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org