Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/01/06 23:23:39 UTC
[jira] [Updated] (SPARK-7689) Remove TTL-based metadata cleaning (spark.cleaner.ttl)
[ https://issues.apache.org/jira/browse/SPARK-7689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen updated SPARK-7689:
------------------------------
Labels: releasenotes (was: )
> Remove TTL-based metadata cleaning (spark.cleaner.ttl)
> ------------------------------------------------------
>
> Key: SPARK-7689
> URL: https://issues.apache.org/jira/browse/SPARK-7689
> Project: Spark
> Issue Type: Sub-task
> Components: Spark Core
> Reporter: Josh Rosen
> Assignee: Apache Spark
> Labels: releasenotes
>
> With the introduction of ContextCleaner, I think there's no longer any reason for most users to enable MetadataCleaner / {{spark.cleaner.ttl}} (except perhaps for very long-lived Spark REPLs, where you might worry about RDDs or broadcast variables orphaned in your REPL history never getting cleaned up, although that seems like an uncommon use case). This property used to be relevant for Spark Streaming jobs, but that no longer appears to be the case: the latest Streaming docs have removed all mentions of {{spark.cleaner.ttl}} (see https://github.com/apache/spark/pull/4956/files#diff-dbee746abf610b52d8a7cb65bf9ea765L1817, for example).
> See http://apache-spark-user-list.1001560.n3.nabble.com/is-spark-cleaner-ttl-safe-td2557.html for an old, related discussion. Also, see https://github.com/apache/spark/pull/126, the PR that introduced the new ContextCleaner mechanism.
> For Spark 2.0, I think we should remove {{spark.cleaner.ttl}} and the associated TTL-based metadata cleaning code.
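To make the distinction concrete, here is a minimal sketch (in Python, not Spark code; all class and field names are hypothetical) of the two cleanup strategies: TTL-based cleaning evicts metadata purely by age, even for objects the driver program still uses, whereas reference-tracking cleaning (the ContextCleaner approach, approximated here with weak references) evicts metadata only once the user program has dropped its reference.

```python
import weakref

class RDDMeta:
    """Stands in for per-RDD metadata tracked on the driver (hypothetical)."""
    def __init__(self, rdd_id):
        self.rdd_id = rdd_id

class TTLCleaner:
    """TTL-based cleaning (the spark.cleaner.ttl model): evict by age."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.entries = {}          # rdd_id -> registration timestamp

    def register(self, meta, now):
        self.entries[meta.rdd_id] = now

    def clean(self, now):
        expired = [k for k, t in self.entries.items() if now - t > self.ttl]
        for k in expired:
            del self.entries[k]    # may drop metadata for a still-live RDD!
        return expired

class RefCleaner:
    """Reference-tracking cleaning (the ContextCleaner model): evict only
    entries whose metadata object is no longer referenced anywhere."""
    def __init__(self):
        self.entries = {}          # rdd_id -> weak reference to the metadata

    def register(self, meta):
        self.entries[meta.rdd_id] = weakref.ref(meta)

    def clean(self):
        dead = [k for k, r in self.entries.items() if r() is None]
        for k in dead:
            del self.entries[k]
        return dead

ttl = TTLCleaner(ttl_seconds=60)
live = RDDMeta(1)
ttl.register(live, now=0)
print(ttl.clean(now=120))   # [1]  -- evicted by age although still referenced

ref = RefCleaner()
held = RDDMeta(2)
gone = RDDMeta(3)
ref.register(held)
ref.register(gone)
del gone                    # user program drops its reference
print(ref.clean())          # [3]  -- only the unreferenced entry is cleaned
```

The sketch shows why the TTL knob became unnecessary and unsafe once reference tracking existed: the age-based cleaner can delete state for data the application still holds, while the reference-tracking cleaner never does.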
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org