Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2014/10/09 23:26:34 UTC

[jira] [Created] (SPARK-3885) Provide mechanism to remove accumulators once they are no longer used

Josh Rosen created SPARK-3885:
---------------------------------

             Summary: Provide mechanism to remove accumulators once they are no longer used
                 Key: SPARK-3885
                 URL: https://issues.apache.org/jira/browse/SPARK-3885
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.1.0, 1.0.2, 1.2.0
            Reporter: Josh Rosen


Spark does not currently provide any mechanism to delete accumulators after they are no longer used. This can lead to out-of-memory errors (OOMs) for long-lived SparkContexts that create many large accumulators.
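
To make the failure mode concrete, here is a minimal sketch (the object and variable names are illustrative, not from any real workload) of a long-lived driver that creates a fresh accumulator per batch; each one stays registered even after the local reference goes out of scope:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._   // implicit AccumulatorParam instances in 1.x

// Illustrative only: a long-lived driver that creates one accumulator per batch.
// Each call to sc.accumulator registers a new entry in the global registry,
// and nothing ever removes it, so driver memory grows with the number of batches.
object AccumulatorLeakSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("accumulator-leak-sketch"))
    val data = sc.parallelize(1 to 1000)
    for (batch <- 1 to 1000000) {   // e.g. one iteration per incoming batch, for days
      val counter = sc.accumulator(0L)
      data.foreach(_ => counter += 1L)
      // `counter` goes out of scope here, but the registry still references it,
      // so the accumulator (and whatever it holds) is never garbage collected.
    }
    sc.stop()
  }
}
{code}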

Part of the problem is that accumulators are registered in a global {{Accumulators}} registry. The fix might be as simple as holding weak references in that registry so that accumulators can be garbage-collected once they are no longer reachable from user code.
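
As a rough illustration of what such a fix could look like (this is a sketch, not the actual {{Accumulators}} code; the object and method names are made up), the registry could hold each accumulator through a {{WeakReference}} and prune entries whose referents have been collected:

{code:scala}
import java.lang.ref.WeakReference
import scala.collection.mutable

// Sketch only -- not Spark's real Accumulators object. The idea: hold registered
// accumulators via WeakReference so that an accumulator nothing else references
// can be garbage collected even though it was never explicitly unregistered.
object WeakAccumulatorRegistry {
  private val registry = mutable.Map[Long, WeakReference[AnyRef]]()

  def register(id: Long, acc: AnyRef): Unit = synchronized {
    registry(id) = new WeakReference(acc)
  }

  def get(id: Long): Option[AnyRef] = synchronized {
    registry.get(id) match {
      case Some(ref) =>
        val acc = ref.get()
        if (acc == null) {
          registry -= id   // referent already GC'd; drop the stale entry
          None
        } else {
          Some(acc)
        }
      case None => None
    }
  }
}
{code}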

In the meantime, here's a workaround that users can try:

Accumulators have a public {{setValue()}} method that can be called (only by the driver) to change an accumulator's value. You might be able to use this to reset accumulators' values to smaller objects (e.g. the "zero" object of whatever your accumulator type is, or {{null}} if you're sure that the accumulator will never be accessed again).
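
For example (a sketch against the 1.x API; the batch-processing function and its names are hypothetical), a driver that accumulates a potentially large collection per batch could read the result and then reset the accumulator to an empty value:

{code:scala}
import scala.collection.mutable
import org.apache.spark.SparkContext

// Workaround sketch: after reading an accumulator's final value on the driver,
// replace the large accumulated object with an empty one via setValue so the
// old value can be reclaimed (the registry entry itself still exists).
def processBatch(sc: SparkContext, lines: Seq[String]): Unit = {
  val badRecords = sc.accumulableCollection(mutable.ArrayBuffer[String]())
  sc.parallelize(lines).foreach { line =>
    if (line.trim.isEmpty) badRecords += line   // collect problem records
  }
  println("bad records this batch: " + badRecords.value.size)
  // Driver-only: reset to an empty buffer so the big one is no longer referenced.
  badRecords.setValue(mutable.ArrayBuffer[String]())
}
{code}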

This issue was originally reported by [~nkronenfeld] on the dev mailing list: http://apache-spark-developers-list.1001551.n3.nabble.com/Fwd-Accumulator-question-td8709.html



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org