Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/13 23:58:20 UTC

[GitHub] viirya commented on a change in pull request #23563: [SPARK-26634]Do not allow task of FetchFailureStage commit in OutputCommitCoordinator

URL: https://github.com/apache/spark/pull/23563#discussion_r256641340
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/scheduler/OutputCommitCoordinator.scala
 ##########
 @@ -114,13 +115,16 @@ private[spark] class OutputCommitCoordinator(conf: SparkConf, isDriver: Boolean)
    * yet been initialized.
    *
    * @param stage the stage id.
+   * @param stageAttemptNumber the stage attempt number.
    * @param maxPartitionId the maximum partition id that could appear in this stage's tasks (i.e.
    *                       the maximum possible value of `context.partitionId`).
    */
-  private[scheduler] def stageStart(stage: Int, maxPartitionId: Int): Unit = synchronized {
+  private[scheduler] def stageStart(
+    stage: Int, stageAttemptNumber: Int, maxPartitionId: Int): Unit = synchronized {
     stageStates.get(stage) match {
       case Some(state) =>
         require(state.authorizedCommitters.length == maxPartitionId + 1)
+        state.latestStageAttempt = stageAttemptNumber
         logInfo(s"Reusing state from previous attempt of stage $stage.")
 
       case _ =>
 
 Review comment:
   It is better to assign `stageAttemptNumber` to `latestStageAttempt` of a newly created `StageState` too.
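
   The suggestion can be sketched as follows. This is a minimal, hypothetical reconstruction, not the real `OutputCommitCoordinator.scala`: the pared-down `StageState`, the `latestAttempt` accessor, and the sentinel value `-1` are assumptions made to keep the sketch self-contained; only the `stageStart` signature and the `latestStageAttempt` assignment in the `Some(state)` branch come from the diff above.

   ```scala
   import scala.collection.mutable

   // Hypothetical stand-in for the real StageState; the actual class tracks
   // authorized committers per partition and (per this PR) the latest attempt.
   class StageState(numPartitions: Int) {
     val authorizedCommitters: Array[Int] = Array.fill(numPartitions)(-1)
     var latestStageAttempt: Int = -1 // -1 = no attempt recorded yet (assumed sentinel)
   }

   object CoordinatorSketch {
     private val stageStates = mutable.Map[Int, StageState]()

     def stageStart(stage: Int, stageAttemptNumber: Int, maxPartitionId: Int): Unit =
       synchronized {
         stageStates.get(stage) match {
           case Some(state) =>
             // Reusing state from a previous attempt of this stage.
             require(state.authorizedCommitters.length == maxPartitionId + 1)
             state.latestStageAttempt = stageAttemptNumber
           case _ =>
             val state = new StageState(maxPartitionId + 1)
             // Reviewer's point: record the attempt number on the fresh state too,
             // so a brand-new StageState does not keep the default sentinel.
             state.latestStageAttempt = stageAttemptNumber
             stageStates(stage) = state
         }
       }

     // Helper added for the sketch only, to observe the recorded attempt.
     def latestAttempt(stage: Int): Option[Int] =
       synchronized { stageStates.get(stage).map(_.latestStageAttempt) }
   }
   ```

   With this change, both branches leave `latestStageAttempt` equal to the attempt number passed to `stageStart`, whether the stage state is new or reused.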

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org