Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/12/10 17:11:10 UTC

[jira] [Commented] (SPARK-12265) Spark calls System.exit inside driver instead of throwing exception

    [ https://issues.apache.org/jira/browse/SPARK-12265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15051157#comment-15051157 ] 

Sean Owen commented on SPARK-12265:
-----------------------------------

Yeah, there are a number of places where {{System.exit}} happens (and many related JIRAs); some are legitimate, some are not. This change looks good: I think the intent is that we shouldn't hard-kill the user app, even if that may be fine for processes managed only by Spark, like executors.
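For concreteness, a minimal sketch of that direction (illustrative only, assuming the check stays where it is in MesosSchedulerUtils.scala; the message text here is made up, not the final patch):

{code}
val ret = mesosDriver.run()
logInfo("driver.run() returned with code " + ret)
if (ret != null && ret.equals(Status.DRIVER_ABORTED)) {
  // Surface the failure to the caller (and to any test harness)
  // instead of hard-killing the whole JVM.
  throw new SparkException("Mesos driver aborted: " + ret)
}
{code}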

> Spark calls System.exit inside driver instead of throwing exception
> -------------------------------------------------------------------
>
>                 Key: SPARK-12265
>                 URL: https://issues.apache.org/jira/browse/SPARK-12265
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.6.0
>            Reporter: Iulian Dragos
>
> Spark may call {{System.exit}} if Mesos sends an error code back to the MesosSchedulerDriver. This makes Spark very hard to test, since it effectively kills the driver application under test. Such tests may run under ScalaTest, which then never gets a chance to collect a result and populate a report.
> Relevant code is in MesosSchedulerUtils.scala:
> {code}
> val ret = mesosDriver.run()
> logInfo("driver.run() returned with code " + ret)
> if (ret != null && ret.equals(Status.DRIVER_ABORTED)) {
>   System.exit(1)
> }
> {code}
> Errors should be signaled with a {{SparkException}} in the correct thread.
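For illustration, a rough sketch of what such a fix would enable on the testing side (the suite and the stub below are hypothetical, standing in for the real scheduler startup path quoted above):

{code}
import org.apache.spark.SparkException
import org.scalatest.FunSuite

class MesosDriverAbortSuite extends FunSuite {
  // Hypothetical stand-in for the startup path in MesosSchedulerUtils.scala,
  // assuming it throws instead of calling System.exit(1) on DRIVER_ABORTED.
  private def startDriverThatAborts(): Unit =
    throw new SparkException("Mesos driver aborted: DRIVER_ABORTED")

  test("an aborted Mesos driver surfaces as an exception, not a JVM exit") {
    val e = intercept[SparkException] { startDriverThatAborts() }
    assert(e.getMessage.contains("DRIVER_ABORTED"))
  }
}
{code}

With the current {{System.exit(1)}}, the assertion half of this never runs: the whole test JVM dies before ScalaTest can record anything.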


