Posted to issues@spark.apache.org by "Raphael (JIRA)" <ji...@apache.org> on 2016/12/16 16:56:58 UTC
[jira] [Created] (SPARK-18898) Exception not failing Scala applications (in yarn)
Raphael created SPARK-18898:
-------------------------------
Summary: Exception not failing Scala applications (in yarn)
Key: SPARK-18898
URL: https://issues.apache.org/jira/browse/SPARK-18898
Project: Spark
Issue Type: Bug
Components: Spark Submit, YARN
Affects Versions: 2.0.2
Reporter: Raphael
I am submitting my Scala applications with SparkLauncher, which uses spark-submit under the hood.
When I throw an error in my Spark job, the final status of the job in YARN is FINISHED. After looking at the source code of SparkSubmit:
{code:title=SparkSubmit.scala|borderStyle=solid}
...
try {
  mainMethod.invoke(null, childArgs.toArray)
} catch {
  case t: Throwable =>
    findCause(t) match {
      case SparkUserAppException(exitCode) =>
        System.exit(exitCode)
      case t: Throwable =>
        throw t
    }
}
...
{code}
It seems we would have to throw SparkUserAppException from our own code to get a non-zero exit, but that exception is a private case class (declared alongside SparkException), so user applications cannot instantiate it.
More details at: http://stackoverflow.com/questions/41184158/how-to-throw-an-exception-in-spark
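Since SparkUserAppException is private, one workaround is to catch failures at the top of main and call System.exit with a non-zero code ourselves, so the driver JVM exits abnormally. This is only a sketch under that assumption (FailFast, exitCodeFor, and the thrown RuntimeException are hypothetical names, not Spark API); whether YARN then reports FAILED may still depend on the deploy mode:

```scala
object FailFast {
  // Run the job body and translate any thrown exception into a process
  // exit code, mimicking what SparkSubmit does for SparkUserAppException.
  def exitCodeFor(body: => Unit): Int =
    try { body; 0 } catch {
      case t: Throwable =>
        t.printStackTrace()
        1
    }

  def main(args: Array[String]): Unit = {
    // Hypothetical job body; replace with real Spark logic.
    val code = exitCodeFor { throw new RuntimeException("job failed") }
    // Exit non-zero so spark-submit's JVM terminates abnormally.
    if (code != 0) System.exit(code)
  }
}
```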
In the past, the same issue with pyspark was opened here:
https://issues.apache.org/jira/browse/SPARK-7736
And resolved here:
https://github.com/apache/spark/pull/8258
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)