Posted to issues@spark.apache.org by "Fede Bar (JIRA)" <ji...@apache.org> on 2016/01/18 21:08:39 UTC

[jira] [Commented] (SPARK-10975) Shuffle files left behind on Mesos without dynamic allocation

    [ https://issues.apache.org/jira/browse/SPARK-10975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15105769#comment-15105769 ] 

Fede Bar commented on SPARK-10975:
----------------------------------

Hi, this issue is actually not fully resolved; please see [SPARK-12430|https://issues.apache.org/jira/browse/SPARK-12430].
The clean-up deletes only the spark-<UUID> folder, not the blockmgr-<UUID> folder, even though those directories are now created under the Mesos temporary path.
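Until that is fixed, a minimal sketch of a manual workaround (not Spark internals): a sweep that can be run on each Mesos agent to remove leftover scratch directories. The /tmp root and the blockmgr-/spark- prefixes here are assumptions taken from the report below and may differ if spark.local.dir is set; note that as written it would also delete directories of still-running applications, so in practice you would filter by modification time first.

{code:scala}
// Sketch only: sweep a temp root for leftover Spark scratch directories.
import java.nio.file.{Files, Path, Paths}
import java.util.Comparator
import scala.collection.JavaConverters._

object StaleScratchSweep {
  // Delete a directory tree, children before parents.
  private def deleteRecursively(dir: Path): Unit =
    Files.walk(dir)
      .sorted(Comparator.reverseOrder[Path]())
      .iterator().asScala
      .foreach(p => Files.deleteIfExists(p))

  def main(args: Array[String]): Unit = {
    // Assumed default root; pass a different one if spark.local.dir points elsewhere.
    val root = Paths.get(args.headOption.getOrElse("/tmp"))
    Files.list(root).iterator().asScala
      .filter(Files.isDirectory(_))
      .filter { p =>
        val name = p.getFileName.toString
        name.startsWith("blockmgr-") || name.startsWith("spark-")
      }
      .foreach(deleteRecursively)
  }
}
{code}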




> Shuffle files left behind on Mesos without dynamic allocation
> -------------------------------------------------------------
>
>                 Key: SPARK-10975
>                 URL: https://issues.apache.org/jira/browse/SPARK-10975
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.5.1
>            Reporter: Iulian Dragos
>
> (from mailing list)
> Running on Mesos in coarse-grained mode, with no dynamic allocation or shuffle service.
> I see two types of temporary directories under /tmp associated with every executor: /tmp/spark-<UUID> and /tmp/blockmgr-<UUID>. When the job finishes, /tmp/spark-<UUID> is gone, but the blockmgr directory is left behind with gigabytes of shuffle data in it.
> The reason is that the clean-up logic is only enabled when the shuffle service is running; see https://github.com/apache/spark/pull/7820
> Unless the shuffle service is enabled, the shuffle files should instead be placed in the Mesos sandbox or under the spark-<UUID> directory, which does get cleaned up (a configuration sketch follows below).
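> A minimal sketch of the two settings involved, for illustration only: spark.shuffle.service.enabled opts into the cleanup path from the PR above (it assumes the external shuffle service is already running on each agent), and pointing spark.local.dir at the Mesos sandbox is an assumed workaround that relies on MESOS_SANDBOX being set for the process that creates the scratch directories.
>
> {code:scala}
> import org.apache.spark.{SparkConf, SparkContext}
>
> object ScratchDirExample {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf()
>       .setAppName("scratch-dir-example")
>       // Opts into the cleanup path added by the PR above; requires the
>       // external shuffle service to be running on each agent.
>       .set("spark.shuffle.service.enabled", "true")
>       // Assumed workaround: keep spark-<UUID>/blockmgr-<UUID> inside the
>       // Mesos sandbox so they disappear when the sandbox is garbage collected.
>       .set("spark.local.dir", sys.env.getOrElse("MESOS_SANDBOX", "/tmp"))
>     val sc = new SparkContext(conf)
>     try {
>       // Run one shuffle stage so blockmgr-<UUID> files are actually written.
>       sc.parallelize(1 to 1000).map(i => (i % 10, i)).reduceByKey(_ + _).count()
>     } finally {
>       sc.stop()
>     }
>   }
> }
> {code}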



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org