Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/03/30 13:39:25 UTC
[jira] [Commented] (SPARK-14261) Memory leak in Spark Thrift Server
[ https://issues.apache.org/jira/browse/SPARK-14261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15217824#comment-15217824 ]
Sean Owen commented on SPARK-14261:
-----------------------------------
I don't think this shows that there's a memory leak. You should take a heap dump to see what the allocated memory consists of. It could be a build-up of cached objects, for example, which aren't cleared unless memory is actually low.
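For reference, a heap dump of the Thrift server's driver JVM can be captured with the standard JDK jmap tool; this is only a sketch, and the pid and output file name below are placeholders:

    jmap -dump:live,format=b,file=thriftserver-heap.hprof <thriftserver-pid>

The live option triggers a full GC before dumping, so only reachable objects appear in the output; the resulting .hprof file can then be opened in a tool such as Eclipse MAT or VisualVM to see which classes retain the memory.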
> Memory leak in Spark Thrift Server
> ----------------------------------
>
> Key: SPARK-14261
> URL: https://issues.apache.org/jira/browse/SPARK-14261
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.0
> Reporter: Xiaochun Liang
> Attachments: MemorySnapshot.PNG
>
>
> I am running the Spark Thrift server on Windows Server 2012. The Spark Thrift server is launched in YARN client mode. Its memory usage grows gradually as queries come in, so I am wondering whether there is a memory leak in the Spark Thrift server.
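> One way to check whether this is genuine heap growth (a sketch; it assumes a standard JDK is available on the box, and the pid below is a placeholder) is to sample GC statistics while queries run:
>
>     jstat -gcutil <thriftserver-pid> 5000
>
> This prints per-generation heap utilization every 5000 ms; if the old-generation column (O) keeps climbing even after full GCs (FGC), objects are being retained and a heap dump is worth inspecting, whereas a regular sawtooth pattern is normal allocation behavior.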
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org