Posted to issues@spark.apache.org by "Mario Briggs (JIRA)" <ji...@apache.org> on 2016/09/22 05:43:20 UTC

[jira] [Created] (SPARK-17630) jvm-exit-on-fatal-error for spark.rpc.netty like there is available for akka

Mario Briggs created SPARK-17630:
------------------------------------

             Summary: jvm-exit-on-fatal-error for spark.rpc.netty like there is available for akka
                 Key: SPARK-17630
                 URL: https://issues.apache.org/jira/browse/SPARK-17630
             Project: Spark
          Issue Type: Question
          Components: Spark Core
    Affects Versions: 1.6.0
            Reporter: Mario Briggs


Hi,

I have 2 code paths from my app that result in a JVM OOM.

In the first code path, 'akka.jvm-exit-on-fatal-error' kicks in and shuts down the JVM, so the caller (Py4J) gets notified with a proper stack trace.
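
For reference, this is the Akka setting involved. A minimal sketch (assuming Akka and Typesafe Config on the classpath; the system name is arbitrary) of setting it explicitly when building an ActorSystem:

    import akka.actor.ActorSystem
    import com.typesafe.config.ConfigFactory

    // "on" is Akka's default: a fatal error (e.g. OutOfMemoryError) thrown on an
    // actor thread makes Akka terminate the whole JVM instead of leaving it hung.
    val conf = ConfigFactory
      .parseString("akka.jvm-exit-on-fatal-error = on")
      .withFallback(ConfigFactory.load())

    val system = ActorSystem("example-rpc", conf)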

In the 2nd code path (rpc.netty), no such handler kicks in to shut down the JVM, so the caller does not get notified.

Is it possible to have a JVM exit handler for the rpc.netty path?
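
For illustration, something along these lines is what I mean by an exit handler. This is only a sketch; the object name, exit codes and the use of a default uncaught-exception handler are my own, not necessarily how Spark would implement it:

    import scala.util.control.NonFatal

    // Process-wide handler: if a fatal error such as OutOfMemoryError escapes any
    // thread, halt the JVM so the parent process (Py4J here) sees it die instead
    // of hanging. Runtime.halt is used because shutdown hooks may themselves fail
    // under OOM. The exit codes are arbitrary.
    object ExitOnFatalError {
      def install(): Unit = {
        Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler {
          override def uncaughtException(t: Thread, e: Throwable): Unit = e match {
            case _: OutOfMemoryError   => Runtime.getRuntime.halt(52)
            case err if !NonFatal(err) => Runtime.getRuntime.halt(50)
            case other =>
              System.err.println(s"Uncaught exception in thread ${t.getName}: $other")
          }
        })
      }
    }

Installed early during driver/executor startup, such a handler would make an OOM on the rpc.netty path behave like the akka path does today.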

First code path trace
 



