Posted to issues@spark.apache.org by "Dhruve Ashar (JIRA)" <ji...@apache.org> on 2017/06/22 16:24:00 UTC

[jira] [Created] (SPARK-21181) Suppress memory leak errors reported by netty

Dhruve Ashar created SPARK-21181:
------------------------------------

             Summary: Suppress memory leak errors reported by netty
                 Key: SPARK-21181
                 URL: https://issues.apache.org/jira/browse/SPARK-21181
             Project: Spark
          Issue Type: Bug
          Components: Input/Output
    Affects Versions: 2.1.0
            Reporter: Dhruve Ashar
            Priority: Minor


We are seeing netty report memory leak errors like the one below after switching to 2.1.

{code}
ERROR ResourceLeakDetector: LEAK: ByteBuf.release() was not called before it's garbage-collected. Enable advanced leak reporting to find out where the leak occurred. To enable advanced leak reporting, specify the JVM option '-Dio.netty.leakDetection.level=advanced' or call ResourceLeakDetector.setLevel() See http://netty.io/wiki/reference-counted-objects.html for more information.
{code}

On closer inspection, Spark is not actually leaking any memory here, but the error message in the driver logs is confusing for users.

After enabling '-Dio.netty.leakDetection.level=advanced', netty identifies SparkSaslServer as the source of these leaks.

Sample trace: https://gist.github.com/dhruve/b299ebc35aa0a185c244a0468927daf1
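
As an interim workaround (not the fix proposed by this issue), the message can be silenced by raising the log level for netty's leak detector in log4j.properties. The logger name below is an assumption based on the reporting class, io.netty.util.ResourceLeakDetector:

{code}
# Assumed logger name, derived from netty's io.netty.util.ResourceLeakDetector class.
# ERROR-level leak reports are suppressed; set back to ERROR when investigating real leaks.
log4j.logger.io.netty.util.ResourceLeakDetector=OFF
{code}

Note this hides all leak reports from the detector, including genuine ones, so it is only suitable once the reports have been confirmed to be false positives as described above.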



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org