Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/07/20 09:28:04 UTC

[jira] [Commented] (SPARK-5423) ExternalAppendOnlyMap won't delete the temp spilled file if an exception happens while using it

    [ https://issues.apache.org/jira/browse/SPARK-5423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14633117#comment-14633117 ] 

Apache Spark commented on SPARK-5423:
-------------------------------------

User 'zsxwing' has created a pull request for this issue:
https://github.com/apache/spark/pull/7529

> ExternalAppendOnlyMap won't delete the temp spilled file if an exception happens while using it
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5423
>                 URL: https://issues.apache.org/jira/browse/SPARK-5423
>             Project: Spark
>          Issue Type: Improvement
>          Components: Shuffle
>    Affects Versions: 1.0.0
>            Reporter: Shixiong Zhu
>            Assignee: Shixiong Zhu
>
> ExternalAppendOnlyMap won't delete the temp spilled file if an exception happens while using it.
> There is already a TODO comment in the code acknowledging this:
> {code}
>     // TODO: Ensure this gets called even if the iterator isn't drained.
>     private def cleanup() {
>       batchIndex = batchOffsets.length  // Prevent reading any other batch
>       val ds = deserializeStream
>       deserializeStream = null
>       fileStream = null
>       ds.close()
>       file.delete()
>     }
> {code}
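
To make the TODO concrete, here is a minimal sketch of one way cleanup could be guaranteed even when the iterator is not fully drained. The class name CleanupIterator and the callback wiring are illustrative assumptions for this sketch, not the actual change in the pull request above:

{code}
import java.io.File
import java.util.concurrent.atomic.AtomicBoolean

// Sketch: wrap the spill-reading iterator so that cleanup runs both when the
// iterator is fully drained and when the caller aborts early (for example
// because the task failed with an exception).
class CleanupIterator[A](underlying: Iterator[A], doCleanup: () => Unit)
    extends Iterator[A] {

  private val cleaned = new AtomicBoolean(false)

  // Run the cleanup exactly once, no matter how many times it is triggered.
  def cleanup(): Unit = {
    if (cleaned.compareAndSet(false, true)) {
      doCleanup()
    }
  }

  override def hasNext: Boolean = {
    val more = underlying.hasNext
    if (!more) cleanup()   // normal path: iterator drained
    more
  }

  override def next(): A = underlying.next()
}

object CleanupIteratorExample {
  def main(args: Array[String]): Unit = {
    val spillFile = File.createTempFile("spill", ".tmp")
    val iter = new CleanupIterator[Int](
      Iterator(1, 2, 3),
      () => { spillFile.delete(); println(s"deleted ${spillFile.getName}") })

    try {
      iter.foreach(println)   // drains the iterator, cleanup fires in hasNext
    } finally {
      iter.cleanup()          // abnormal path: also safe if already drained
    }
  }
}
{code}

Inside a Spark task, a similar effect could be obtained by registering the cleanup with the task's completion callback instead of a try/finally at every call site; in either case the cleanup must be idempotent, since the normal and abnormal paths can both fire.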


