Posted to issues@spark.apache.org by "Nicholas Chammas (Jira)" <ji...@apache.org> on 2021/03/10 20:38:00 UTC

[jira] [Updated] (SPARK-33000) cleanCheckpoints config does not clean all checkpointed RDDs on shutdown

     [ https://issues.apache.org/jira/browse/SPARK-33000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nicholas Chammas updated SPARK-33000:
-------------------------------------
    Description: 
Maybe it's just that the documentation needs to be updated, but I found this surprising:
{code:python}
$ pyspark
...
>>> spark.conf.set('spark.cleaner.referenceTracking.cleanCheckpoints', 'true')
>>> spark.sparkContext.setCheckpointDir('/tmp/spark/checkpoint/')
>>> a = spark.range(10)
>>> a.checkpoint()
DataFrame[id: bigint]                                                           
>>> exit(){code}
The checkpoint data is left behind in {{/tmp/spark/checkpoint/}}. I expected Spark to clean it up on shutdown.
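To make the leftover state concrete, here's a minimal sketch (assuming the local-filesystem checkpoint path from the repro above) that lists what survives the session:
{code:python}
# Run after exiting the REPL session above. Assumes the checkpoint
# directory passed to setCheckpointDir() is on the local filesystem.
import os

for root, _, files in os.walk('/tmp/spark/checkpoint/'):
    for name in files:
        # Leftover RDD checkpoint parts show up here even though the
        # cleaner config was set to true.
        print(os.path.join(root, name))
{code}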

The documentation for {{spark.cleaner.referenceTracking.cleanCheckpoints}} says:
{quote}Controls whether to clean checkpoint files if the reference is out of scope.
{quote}
When Spark shuts down, everything goes out of scope, so I'd expect all checkpointed RDDs to be cleaned up.
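As a workaround, I currently have to delete the directory myself. A minimal sketch of that manual cleanup (not built-in Spark behavior), assuming the checkpoint directory lives on the local filesystem; note {{getCheckpointDir()}} only exists in newer PySpark, so older versions would have to track the path themselves:
{code:python}
# Manual cleanup sketch -- not built-in Spark behavior. Assumes the
# checkpoint directory is on the local filesystem (shutil cannot
# delete paths on HDFS or S3).
import shutil

checkpoint_dir = spark.sparkContext.getCheckpointDir()  # PySpark 3.1+
spark.stop()
if checkpoint_dir is not None:
    shutil.rmtree(checkpoint_dir, ignore_errors=True)
{code}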

For the record, I see the same behavior in both the Scala and Python REPLs.
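One variable worth ruling out: if I'm reading ContextCleaner correctly, it reads this flag from the SparkContext's conf when the context starts, so setting it via {{spark.conf.set()}} afterwards may never reach the cleaner. A sketch that sets it before the context exists instead:
{code:python}
# Sketch: pass the cleaner flag at session construction rather than via
# spark.conf.set(), so it is in place before the SparkContext (and its
# ContextCleaner) are created.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config('spark.cleaner.referenceTracking.cleanCheckpoints', 'true')
    .getOrCreate()
)
{code}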

Evidence that the current behavior is confusing:
 * [https://stackoverflow.com/q/52630858/877069]
 * [https://stackoverflow.com/q/60009856/877069]
 * [https://stackoverflow.com/q/61454740/877069]

> cleanCheckpoints config does not clean all checkpointed RDDs on shutdown
> ------------------------------------------------------------------------
>
>                 Key: SPARK-33000
>                 URL: https://issues.apache.org/jira/browse/SPARK-33000
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.6
>            Reporter: Nicholas Chammas
>            Priority: Minor
>


