Posted to issues@spark.apache.org by "Damian Momot (JIRA)" <ji...@apache.org> on 2016/03/21 12:37:25 UTC

[jira] [Comment Edited] (SPARK-11293) Spillable collections leak shuffle memory

    [ https://issues.apache.org/jira/browse/SPARK-11293?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15204064#comment-15204064 ] 

Damian Momot edited comment on SPARK-11293 at 3/21/16 11:36 AM:
----------------------------------------------------------------

It looks like we are getting the same errors on 1.6.0 too - any plans to address this?


was (Author: daimon):
It looks like we are getting the same errors on 1.6.0 too

> Spillable collections leak shuffle memory
> -----------------------------------------
>
>                 Key: SPARK-11293
>                 URL: https://issues.apache.org/jira/browse/SPARK-11293
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.1, 1.4.1, 1.5.1, 1.6.0
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>            Priority: Critical
>
> I discovered multiple leaks of shuffle memory while working on my memory manager consolidation patch, which added the ability to do strict memory leak detection for the bookkeeping that used to be performed by the ShuffleMemoryManager. This uncovered a handful of places where tasks can acquire execution/shuffle memory but never release it, starving themselves of memory.
> Problems that I found:
> * {{ExternalSorter.stop()}} should release the sorter's shuffle/execution memory.
> * {{BlockStoreShuffleReader}} should call {{ExternalSorter.stop()}} using a {{CompletionIterator}} (see the sketch after this list).
> * {{ExternalAppendOnlyMap}} exposes no equivalent of {{stop()}} for freeing its resources.
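> For context, a minimal Scala sketch of the wrap-and-release pattern from the second bullet. This is an illustration, not Spark's actual {{CompletionIterator}}, and the {{sorter}} usage in the comments is hypothetical: the returned iterator delegates to the underlying one and runs a cleanup callback exactly once, after the last element has been consumed, so the sorter's execution memory is released even though the caller never sees the sorter itself.
> {code}
> // Illustrative completion iterator: delegates to `sub` and invokes
> // `completion` exactly once, after `sub` has been fully consumed.
> class CompletionIterator[A](sub: Iterator[A], completion: () => Unit) extends Iterator[A] {
>   private var completed = false
>
>   override def hasNext: Boolean = {
>     val r = sub.hasNext
>     if (!r && !completed) {
>       completed = true
>       completion()  // e.g. release the sorter's shuffle/execution memory
>     }
>     r
>   }
>
>   override def next(): A = sub.next()
> }
>
> // Hypothetical usage mirroring the proposed fix: the shuffle reader wraps
> // the sorter's output so that stop() runs when iteration completes.
> // val sorter = new ExternalSorter[K, V, C](...)
> // new CompletionIterator(sorter.iterator, () => sorter.stop())
> {code}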


