Posted to issues@spark.apache.org by "Weizhong (JIRA)" <ji...@apache.org> on 2015/06/03 08:20:50 UTC

[jira] [Comment Edited] (SPARK-7917) Spark doesn't clean up Application Directories (local dirs)

    [ https://issues.apache.org/jira/browse/SPARK-7917?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14570339#comment-14570339 ] 

Weizhong edited comment on SPARK-7917 at 6/3/15 6:20 AM:
---------------------------------------------------------

In standalone mode, one tmp dir is kept on each executor as the local root dir and is reused each time we submit an application; the tmp dir will be deleted when the Worker is stopped.
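
As a rough illustration of the behaviour described above, here is a minimal Scala sketch of a worker-local tmp dir being removed only when the Worker process (JVM) stops. The object and method names and the use of a plain shutdown hook are assumptions for illustration, not Spark's actual implementation:

{code:scala}
import java.io.File

// Hypothetical sketch only: the worker-local root dir is reused across
// applications while the Worker is alive, and removed when the JVM
// (i.e. the Worker process) shuts down.
object WorkerTmpDirCleanup {

  // Recursively delete a directory tree.
  private def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) {
      Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    }
    f.delete()
  }

  // Register a shutdown hook so the local root dir is removed on Worker stop.
  def registerCleanup(localRootDir: File): Unit = {
    sys.addShutdownHook {
      deleteRecursively(localRootDir)
    }
  }
}
{code}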


was (Author: sephiroth-lin):
In standalone mode, one tmp dir is kept on each executor and is reused each time we submit an application; the tmp dir will be deleted when the Worker is stopped.

> Spark doesn't clean up Application Directories (local dirs) 
> ------------------------------------------------------------
>
>                 Key: SPARK-7917
>                 URL: https://issues.apache.org/jira/browse/SPARK-7917
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Zach Fry
>
> Similar to SPARK-4834. 
> Spark does clean up the cache and lock files in the local dirs; however, it doesn't clean up the directories themselves. 
> We have to write custom scripts to go back through the local dirs, find directories that don't contain any files, and clear those out. 
> It's a pretty simple repro: 
> Run a job that does some shuffling, wait for the shuffle files to get cleaned up, then look on disk at spark.local.dir and notice that the directories are still there, even though there are no files left in them.
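
As a hedged illustration of the "custom scripts" workaround mentioned in the description above, the following Scala sketch walks a spark.local.dir root, finds application directories that no longer contain any files, and deletes them. The root path, directory layout, and object name are assumptions for illustration, not taken from Spark's sources:

{code:scala}
import java.io.File

// Hypothetical cleanup sketch: remove empty application directories left
// behind under spark.local.dir after shuffle files have been cleaned up.
object LocalDirCleanup {

  // A directory is "empty" if it contains no regular files anywhere below it.
  private def containsNoFiles(dir: File): Boolean = {
    val children = Option(dir.listFiles()).getOrElse(Array.empty[File])
    children.forall(c => c.isDirectory && containsNoFiles(c))
  }

  // Recursively delete a directory tree.
  private def deleteRecursively(f: File): Unit = {
    Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    f.delete()
  }

  def main(args: Array[String]): Unit = {
    // Assumed default path; pass the real spark.local.dir as the first argument.
    val localRoot = new File(args.headOption.getOrElse("/tmp/spark-local"))
    val appDirs = Option(localRoot.listFiles()).getOrElse(Array.empty[File]).filter(_.isDirectory)
    appDirs.filter(containsNoFiles).foreach { dir =>
      println(s"Removing empty application dir: ${dir.getAbsolutePath}")
      deleteRecursively(dir)
    }
  }
}
{code}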



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org