Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2015/06/17 21:47:00 UTC
[jira] [Updated] (SPARK-7689) Deprecate spark.cleaner.ttl
[ https://issues.apache.org/jira/browse/SPARK-7689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Patrick Wendell updated SPARK-7689:
-----------------------------------
Target Version/s: 1.4.1 (was: 1.4.0)
> Deprecate spark.cleaner.ttl
> ---------------------------
>
> Key: SPARK-7689
> URL: https://issues.apache.org/jira/browse/SPARK-7689
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Reporter: Josh Rosen
> Assignee: Josh Rosen
>
> With the introduction of ContextCleaner, there's no longer any reason for most users to enable the MetadataCleaner / {{spark.cleaner.ttl}} (except perhaps for super-long-lived Spark REPLs, where you might worry about orphaned RDDs or broadcast variables in your REPL history never getting cleaned up, although that seems like an uncommon use-case). This property used to be relevant for Spark Streaming jobs, but that no longer appears to be the case: the latest Streaming docs have removed all mentions of {{spark.cleaner.ttl}} (see https://github.com/apache/spark/pull/4956/files#diff-dbee746abf610b52d8a7cb65bf9ea765L1817, for example).
> See http://apache-spark-user-list.1001560.n3.nabble.com/is-spark-cleaner-ttl-safe-td2557.html for an old, related discussion. Also, see https://github.com/apache/spark/pull/126, the PR that introduced the new ContextCleaner mechanism.
> We should probably add a deprecation warning for {{spark.cleaner.ttl}} that advises users against setting it, since it's an unsafe configuration option that can lead to confusing behavior (such as metadata being deleted while still in use) if it's misused.
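As a rough illustration of the proposed warning (a minimal sketch, not Spark's actual SparkConf deprecation machinery; the class, method, and message wording below are all hypothetical), one could check the config for the deprecated key and build a warning message:

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical helper: detects the deprecated key in a plain config map and
// produces a deprecation warning message for the caller to log.
public class CleanerTtlDeprecation {
    static final String DEPRECATED_KEY = "spark.cleaner.ttl";

    // Returns a warning message if the deprecated key is set, empty otherwise.
    public static Optional<String> deprecationWarning(Map<String, String> conf) {
        return Optional.ofNullable(conf.get(DEPRECATED_KEY)).map(value ->
            "The configuration '" + DEPRECATED_KEY + "' (set to '" + value + "') is deprecated: "
            + "ContextCleaner now cleans up RDDs, shuffles, and broadcast variables "
            + "automatically, and timer-based TTL cleanup can delete metadata that is "
            + "still in use.");
    }
}
```

For example, a config containing {{spark.cleaner.ttl=3600}} would yield a warning, while a config without the key would yield none.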
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org