Posted to issues@spark.apache.org by "Sasi (JIRA)" <ji...@apache.org> on 2015/12/07 10:19:11 UTC
[jira] [Commented] (SPARK-12175) Add new flag to Spark that
identifies if the driver runs on application servers.
[ https://issues.apache.org/jira/browse/SPARK-12175?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15044624#comment-15044624 ]
Sasi commented on SPARK-12175:
------------------------------
Are you planning to fix this, or to add an implementation for such a case?
> Add new flag to Spark that identifies if the driver runs on application servers.
> --------------------------------------------------------------------------------
>
> Key: SPARK-12175
> URL: https://issues.apache.org/jira/browse/SPARK-12175
> Project: Spark
> Issue Type: Bug
> Reporter: Sasi
>
> Hi,
> I'm running my driver on JBoss, and I have noticed a case where the application tries to create a new SparkContext while the Spark master is down. An unhandled exception is then thrown, and [SparkUncaughtExceptionHandler|https://github.com/apache/spark/blob/3bd77b213a9cd177c3ea3c61d37e5098e55f75a5/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala]
> calls System.exit, which kills JBoss.
> I think a flag should be added that tells Spark which environment the driver is running in.
> For example,
> if driverRunOnAppServer = true, then no System.exit will occur.
> The developer will then have to handle the cases where the Spark master is up or down (a sketch of this idea follows the quoted description).
> Best regards,
> Sasi
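For illustration, here is a minimal sketch in Scala of the guard the report proposes, written as a standalone handler rather than a patch to Spark's actual SparkUncaughtExceptionHandler. The handler name and the property spark.driver.runOnAppServer are hypothetical (modeled on the reporter's driverRunOnAppServer); exit code 50 matches Spark's SparkExitCode.UNCAUGHT_EXCEPTION.

    // Hypothetical sketch: gate the JVM-killing System.exit behind a flag
    // for drivers hosted inside an application server such as JBoss.
    // "spark.driver.runOnAppServer" is the reporter's proposed flag, not an
    // existing Spark configuration property.
    object AppServerAwareExceptionHandler extends Thread.UncaughtExceptionHandler {
      override def uncaughtException(thread: Thread, exception: Throwable): Unit = {
        System.err.println(s"Uncaught exception in thread ${thread.getName}: $exception")
        val runOnAppServer =
          sys.props.get("spark.driver.runOnAppServer").exists(_.toBoolean)
        if (!runOnAppServer) {
          // Spark's current behaviour: exit the whole JVM
          // (50 == SparkExitCode.UNCAUGHT_EXCEPTION).
          System.exit(50)
        }
        // Otherwise keep the JVM alive so the application server can report
        // the failure and keep serving its other deployments.
      }
    }

A standalone application could install such a handler with Thread.setDefaultUncaughtExceptionHandler before starting Spark, but since Spark routes failures on its internal threads to SparkUncaughtExceptionHandler (which is what kills JBoss here), the conditional would in practice have to live inside that handler, as the report suggests.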
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org