Posted to reviews@spark.apache.org by srowen <gi...@git.apache.org> on 2018/01/29 12:08:13 UTC

[GitHub] spark issue #20422: [SPARK-23253][Core][Shuffle]Only write shuffle temporary...

Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/20422
  
    Is this just trying to reuse a file that should have been cleaned up after a prior failure? If so, is that cleanup possible as a more direct solution? I wonder if there aren't corner cases here where the file exists and is still being written to by another process; that could result in corruption. But I am not familiar with this mechanism.
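
    To make the trade-off concrete, here is a minimal sketch of the two approaches being weighed. This is not Spark's actual shuffle code (it is not IndexShuffleBlockResolver); the object and method names below are hypothetical, and it only illustrates why reusing a leftover temp file is riskier than deleting it and writing fresh:

        import java.io.File
        import java.nio.file.{Files, StandardCopyOption}

        object ShuffleTempFileSketch {

          // What the PR title suggests: skip the write when a temp file already
          // exists. If that file was left by a prior *failed* attempt this saves
          // a write, but if another attempt still has it open and is writing,
          // the two writers can interleave and the contents end up corrupted.
          def reuseExistingTemp(tmp: File)(write: File => Unit): File = {
            if (!tmp.exists()) write(tmp)
            tmp
          }

          // The "more direct solution" floated above: treat any leftover temp
          // file as garbage from a failed attempt, delete it, write a fresh
          // one, then publish it with an atomic rename (rename(2) on POSIX).
          def deleteAndRewrite(tmp: File, dest: File)(write: File => Unit): File = {
            Files.deleteIfExists(tmp.toPath)
            write(tmp)
            Files.move(tmp.toPath, dest.toPath, StandardCopyOption.ATOMIC_MOVE)
            dest
          }
        }

    Neither sketch by itself handles the corner case raised above, where a second writer is still mid-write when the existence check runs; closing that gap would need some form of coordination, e.g. a lock around the check-and-commit or letting each attempt write its own temp file and publish via atomic rename.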


---
