Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/10/05 18:14:26 UTC

[jira] [Assigned] (SPARK-10926) Refactor ContextCleaner to allow weak reference cleaning to be done outside of the driver

     [ https://issues.apache.org/jira/browse/SPARK-10926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-10926:
------------------------------------

    Assignee: Apache Spark

> Refactor ContextCleaner to allow weak reference cleaning to be done outside of the driver
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-10926
>                 URL: https://issues.apache.org/jira/browse/SPARK-10926
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.5.1
>            Reporter: Matt Cheah
>            Assignee: Apache Spark
>            Priority: Minor
>
> This is primarily a dependency for SPARK-10250, specifically this PR: https://github.com/apache/spark/pull/8438
> I want to do extra cleanup of objects when their weak references are released by the garbage collector. Those objects won't live on the driver, so putting the cleanup logic on the driver doesn't work. Weak reference cleaning seems like a useful extensible component, where each implementation knows how to clean up a certain kind of object.
> The idea is a little hard to explain in prose; I'll post a PR that makes it clearer.
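The pattern described above (register an object together with a handler that knows how to clean up that kind of object, then run the handler once the garbage collector releases the object) can be sketched as follows. This is a hypothetical illustration in Python, not Spark's actual API; the names `CleanerRegistry`, `register`, and `Broadcast` are made up for the example.

```python
import weakref

class CleanerRegistry:
    """Hypothetical sketch (not Spark's API): pair each tracked object with a
    handler that knows how to clean up that kind of object, and run the
    handler once the garbage collector releases the object."""

    def __init__(self):
        self.cleaned = []  # resource ids whose cleanup has already run

    def register(self, obj, resource_id, handler):
        # weakref.finalize invokes the callback after obj is collected,
        # without keeping obj itself alive.
        weakref.finalize(obj, self._run_cleanup, resource_id, handler)

    def _run_cleanup(self, resource_id, handler):
        handler(resource_id)
        self.cleaned.append(resource_id)

class Broadcast:
    """Stand-in for a tracked object such as a broadcast variable."""

reg = CleanerRegistry()
b = Broadcast()
reg.register(b, "broadcast_0", lambda rid: print(f"cleaning {rid}"))
del b  # in CPython the refcount drops to zero and the finalizer fires
# reg.cleaned is now ["broadcast_0"]
```

Because the registry only holds weak references, it never keeps the tracked objects alive itself, and the cleanup logic is decoupled from where the objects live, which is the property the refactoring is after.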



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org