Posted to issues@spark.apache.org by "Prasanna Gautam (JIRA)" <ji...@apache.org> on 2015/05/08 23:19:01 UTC
[jira] [Commented] (SPARK-2572) Can't delete local dir on executor
automatically when running spark over Mesos.
[ https://issues.apache.org/jira/browse/SPARK-2572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14535557#comment-14535557 ]
Prasanna Gautam commented on SPARK-2572:
----------------------------------------
This is still happening as of Spark 1.3.0 with PySpark: when the context is closed the files aren't deleted, and sc.clearFiles() doesn't remove the /tmp/spark-* directories either.
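Until the executors clean up after themselves, a driver-side sweep can remove the leftover scratch directories once the context is stopped. This is only an illustrative workaround, not part of Spark: the function name, its defaults, and the dry_run flag are assumptions, and in a real deployment you would need to run it on each executor host, not just the driver.

```python
import glob
import os
import shutil

def cleanup_spark_local_dirs(base_dir="/tmp", pattern="spark-*", dry_run=True):
    """Hypothetical workaround: sweep leftover Spark scratch directories.

    Scans base_dir for directories matching pattern (e.g. the
    /tmp/spark-local-20140718114058-834c dirs from the ticket) and
    removes them unless dry_run is True. Returns the matched paths.
    """
    removed = []
    for path in glob.glob(os.path.join(base_dir, pattern)):
        if os.path.isdir(path):
            if not dry_run:
                # ignore_errors avoids failing on files an executor
                # may still be writing
                shutil.rmtree(path, ignore_errors=True)
            removed.append(path)
    return removed
```

Running it with dry_run=True first lists what would be deleted, which matters here because matching on a name pattern alone cannot distinguish directories belonging to a still-running application from orphaned ones.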
> Can't delete local dir on executor automatically when running spark over Mesos.
> -------------------------------------------------------------------------------
>
> Key: SPARK-2572
> URL: https://issues.apache.org/jira/browse/SPARK-2572
> Project: Spark
> Issue Type: Bug
> Components: Mesos
> Affects Versions: 1.0.0
> Reporter: Yadong Qi
> Priority: Minor
>
> When running Spark over Mesos in "fine-grained" or "coarse-grained" mode, the local dir (/tmp/spark-local-20140718114058-834c) on the executor is not deleted automatically after the application finishes.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org