Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2016/10/18 23:13:58 UTC
[jira] [Commented] (SPARK-17630) jvm-exit-on-fatal-error handler for spark.rpc.netty like there is available for akka
[ https://issues.apache.org/jira/browse/SPARK-17630?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15586979#comment-15586979 ]
Shixiong Zhu commented on SPARK-17630:
--------------------------------------
Yeah, I think we can set up SparkUncaughtExceptionHandler for R and Python users.
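For context, the suggestion amounts to installing a JVM-wide default uncaught exception handler that halts the process when a thread dies from a fatal error, which is roughly what Spark's SparkUncaughtExceptionHandler does for executors. A minimal sketch in Scala (the object name, the `isFatalError` classification, and the exit code are illustrative, not Spark's actual constants):

```scala
object FatalErrorHandler {
  // Classify VM-level errors as fatal, in the spirit of
  // akka.jvm-exit-on-fatal-error. This classification is illustrative.
  def isFatalError(t: Throwable): Boolean = t match {
    case _: VirtualMachineError => true // includes OutOfMemoryError
    case _: LinkageError        => true
    case _                      => false
  }

  // Install a JVM-wide handler so that any thread dying from a fatal
  // error halts the process, letting callers such as Py4J observe the
  // exit instead of hanging against a half-dead JVM.
  def install(): Unit = {
    Thread.setDefaultUncaughtExceptionHandler(
      new Thread.UncaughtExceptionHandler {
        override def uncaughtException(thread: Thread, t: Throwable): Unit = {
          System.err.println(s"Uncaught exception in ${thread.getName}: $t")
          if (isFatalError(t)) {
            // Runtime.halt skips shutdown hooks, which may themselves
            // fail to run after an OOM.
            Runtime.getRuntime.halt(1)
          }
        }
      })
  }
}
```

Calling `FatalErrorHandler.install()` early in JVM startup would cover threads spawned by the netty RPC layer as well, since the default handler applies to any thread without its own handler.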
> jvm-exit-on-fatal-error handler for spark.rpc.netty like there is available for akka
> ------------------------------------------------------------------------------------
>
> Key: SPARK-17630
> URL: https://issues.apache.org/jira/browse/SPARK-17630
> Project: Spark
> Issue Type: Question
> Components: Spark Core
> Affects Versions: 1.6.0
> Reporter: Mario Briggs
> Attachments: SecondCodePath.txt, firstCodepath.txt
>
>
> Hi,
> I have two code paths in my app that result in a JVM OOM.
> In the first code path, 'akka.jvm-exit-on-fatal-error' kicks in and shuts down the JVM, so the caller (Py4J) gets notified with a proper stack trace. Stack-trace file attached (firstCodepath.txt).
> In the second code path (rpc.netty), no such handler kicks in to shut down the JVM, so the caller does not get notified.
> Stack-trace file attached (SecondCodepath.txt).
> Is it possible to have a JVM exit handler for the rpc.netty path?
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org