Posted to reviews@spark.apache.org by rezasafi <gi...@git.apache.org> on 2017/10/26 02:31:14 UTC

[GitHub] spark issue #19388: [SPARK-22162] Executors and the driver should use consis...

Github user rezasafi commented on the issue:

    https://github.com/apache/spark/pull/19388
  
    Sorry for the delay. It seems that to be able to commit the same RDD in different stages, we need to use the stageId. So the jobId and the other configuration in the write method of SparkHadoopWriter should be based on the stageId of the stage writing the RDD, not on the rddId. I have a hacky solution for this, but I am working on a better one and will update this PR ASAP.
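    The idea above can be illustrated with a minimal sketch (this is not the actual SparkHadoopWriter code; the helper name and id format are hypothetical). It shows why keying the Hadoop-style job identity on the stageId rather than the rddId matters: when the same RDD is written in two different stages, stage-based ids stay distinct, while rdd-based ids would collide.

    ```scala
    // Illustrative sketch only, not Spark's real implementation.
    object JobIdSketch {
      // Hypothetical helper: derive a Hadoop-style job id string from a stage.
      def jobIdForStage(stageId: Int): String = f"job_sketch_$stageId%04d"

      // Hypothetical rdd-based variant, shown only to contrast the two schemes.
      def jobIdForRdd(rddId: Int): String = f"job_sketch_$rddId%04d"

      def main(args: Array[String]): Unit = {
        val rddId = 7
        // The same RDD written in stages 0 and 1:
        val stageBased = Seq(0, 1).map(jobIdForStage)
        val rddBased   = Seq(0, 1).map(_ => jobIdForRdd(rddId))

        // Stage-based ids are unique per stage, so the two commits
        // cannot clobber each other's output; rdd-based ids collide.
        println(stageBased.distinct.size) // one id per stage
        println(rddBased.distinct.size)   // a single shared id
      }
    }
    ```

    The contrast is the whole point of the fix sketched in the comment: the commit protocol's identifiers must be unique per writing stage, not per dataset.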


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org