Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/17 06:43:18 UTC

[GitHub] [spark] Ngone51 commented on a change in pull request #34018: [SPARK-36772] FinalizeShuffleMerge fails with an exception due to attempt id not matching

Ngone51 commented on a change in pull request #34018:
URL: https://github.com/apache/spark/pull/34018#discussion_r710793117



##########
File path: core/src/main/scala/org/apache/spark/SparkEnv.scala
##########
@@ -355,6 +355,11 @@ object SparkEnv extends Logging {
       None
     }
 
+    // Set the application attemptId in the ExternalShuffleClient, if applicable.
+    // If there is no attemptId assigned, set the attemptId to -1.
+    externalShuffleClient.foreach(
+      shuffleClient => shuffleClient.setAppAttemptId(conf.getInt(config.APP_ATTEMPT_ID.key, -1)))

Review comment:
       I know the executor can get the right attemptId. My concern is that keeping this setting here is wrong from the driver's point of view, and people could be confused by it being set twice. Besides, setting it after SparkEnv is created for the executor makes the behavior consistent.
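   For context, the pattern under discussion can be sketched in isolation. This is a minimal, self-contained sketch with hypothetical stand-in names (ShuffleClientStub, resolveAttemptId are not real Spark classes or methods); it only mirrors the shape of the diff above: look up the attempt id from configuration with -1 as the "no attempt id" sentinel, then push it into the optional shuffle client via foreach.

```scala
// Hypothetical sketch of the pattern in the diff above; the names here
// (ShuffleClientStub, resolveAttemptId) are stand-ins, not Spark APIs.
object AttemptIdSketch {
  // Stand-in for ExternalShuffleClient; only the setter matters here.
  final class ShuffleClientStub {
    var appAttemptId: Int = -1
    def setAppAttemptId(id: Int): Unit = appAttemptId = id
  }

  // Behaves like conf.getInt(config.APP_ATTEMPT_ID.key, -1):
  // return the configured attempt id, or the -1 sentinel when unset.
  def resolveAttemptId(conf: Map[String, String]): Int =
    conf.get("spark.app.attempt.id").map(_.toInt).getOrElse(-1)

  def main(args: Array[String]): Unit = {
    val externalShuffleClient: Option[ShuffleClientStub] = Some(new ShuffleClientStub)
    // Mirrors: externalShuffleClient.foreach(_.setAppAttemptId(...))
    externalShuffleClient.foreach(
      client => client.setAppAttemptId(resolveAttemptId(Map("spark.app.attempt.id" -> "2"))))
    println(externalShuffleClient.map(_.appAttemptId).getOrElse(-1))
  }
}
```

   Using Option.foreach keeps the assignment a no-op when no external shuffle client exists, which is why the diff does not need an explicit None check.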
   
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org