Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/11/23 08:01:15 UTC

[jira] [Commented] (SPARK-18557) Downgrade the memory leak warning message

    [ https://issues.apache.org/jira/browse/SPARK-18557?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15689271#comment-15689271 ] 

Apache Spark commented on SPARK-18557:
--------------------------------------

User 'rxin' has created a pull request for this issue:
https://github.com/apache/spark/pull/15989

> Downgrade the memory leak warning message
> -----------------------------------------
>
>                 Key: SPARK-18557
>                 URL: https://issues.apache.org/jira/browse/SPARK-18557
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Reynold Xin
>            Assignee: Reynold Xin
>
> TaskMemoryManager has a memory leak detector that runs in the task-completion callback and checks whether any memory has not been released. Any memory still held when the callback is invoked is released by TaskMemoryManager itself.
> The current warning message looks like the following:
> {noformat}
> WARN  [Executor task launch worker-0]
> org.apache.spark.memory.TaskMemoryManager - leak 16.3 MB memory from
> org.apache.spark.unsafe.map.BytesToBytesMap@33fb6a15
> {noformat}
> In practice, there are multiple reasons why these can be triggered in the normal code path (e.g. limit, or task failures), and the fact that these messages are logged means the "leak" has already been cleaned up by TaskMemoryManager.
> To avoid confusing users, we should downgrade the message from warning to debug level, and avoid using the word "leak", since it is not actually a leak.
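
The cleanup-and-downgrade behavior described above can be sketched as follows. This is an illustrative stand-in, not Spark's actual TaskMemoryManager code: the Consumer class, the DEBUG_ENABLED flag, and the exact message wording are assumptions for the sketch, though the general shape (iterate over memory consumers at task completion, free anything outstanding, report at debug level without calling it a leak) follows the proposal.

```java
import java.util.ArrayList;
import java.util.List;

public class TaskCleanupSketch {
    // Hypothetical stand-in for a memory consumer such as BytesToBytesMap.
    static class Consumer {
        final String name;
        long bytesHeld;
        Consumer(String name, long bytesHeld) {
            this.name = name;
            this.bytesHeld = bytesHeld;
        }
        void free() { bytesHeld = 0; }
    }

    // Stands in for log.isDebugEnabled() on a real logger.
    static final boolean DEBUG_ENABLED = true;

    // Called from the task-completion callback: frees any memory still
    // held by consumers and returns the total number of bytes released.
    static long cleanUpAllAllocatedMemory(List<Consumer> consumers) {
        long released = 0;
        for (Consumer c : consumers) {
            if (c.bytesHeld > 0) {
                if (DEBUG_ENABLED) {
                    // Debug-level message; says "unreleased", not "leak".
                    System.out.println("DEBUG unreleased " + c.bytesHeld
                        + " bytes of memory from " + c.name
                        + ", released by TaskMemoryManager");
                }
                released += c.bytesHeld;
                c.free();
            }
        }
        return released;
    }

    public static void main(String[] args) {
        List<Consumer> consumers = new ArrayList<>();
        consumers.add(new Consumer("BytesToBytesMap", 17091788L));
        consumers.add(new Consumer("Sorter", 0L));
        System.out.println("released=" + cleanUpAllAllocatedMemory(consumers));
    }
}
```

The point of the change is visible in the guard: the message only appears when debug logging is enabled, so a normal run (e.g. a query with a limit, where operators legitimately stop early without freeing everything themselves) produces no alarming output.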



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org