Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/01 18:16:15 UTC

[GitHub] [spark] squito commented on a change in pull request #24485: [SPARK-27590][CORE] do not consider skipped tasks when scheduling speculative tasks

URL: https://github.com/apache/spark/pull/24485#discussion_r280161614
 
 

 File path: core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
 @@ -1038,7 +1035,11 @@ private[spark] class TaskSetManager(
     val minFinishedForSpeculation = (speculationQuantile * numTasks).floor.toInt
     logDebug("Checking for speculative tasks: minFinished = " + minFinishedForSpeculation)
 
-    if (tasksSuccessful >= minFinishedForSpeculation && tasksSuccessful > 0) {
+    // It's possible that a task is marked as completed by the scheduler, in which case the size
+    // of `successfulTaskDurations` may not equal `tasksSuccessful`. Here we should only count
+    // the tasks that were submitted by this `TaskSetManager` and completed successfully.
+    val numSuccessfulTasks = successfulTaskDurations.size()
+    if (numSuccessfulTasks >= minFinishedForSpeculation && numSuccessfulTasks > 0) {
 
 Review comment:
   unrelated to your change, but a potential cleanup here -- why not compute `minFinishedForSpeculation` once in the constructor, and set it to `math.max(1, (speculationQuantile * numTasks).floor.toInt)`?  Then you also don't need the extra `numSuccessfulTasks > 0` check here.
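   
   For illustration, a minimal self-contained sketch of that cleanup. The class shell and the `speculationEligible` helper are hypothetical scaffolding, not the actual `TaskSetManager`; only `minFinishedForSpeculation`, `speculationQuantile`, `numTasks`, and the `math.max(1, ...)` expression come from the discussion above.
   
       // Hypothetical sketch of the suggested cleanup, not the real TaskSetManager.
       class SpeculationThresholdSketch(numTasks: Int, speculationQuantile: Double) {
   
         // Computed once at construction. Clamping to at least 1 means that
         // `numSuccessfulTasks >= minFinishedForSpeculation` already implies
         // `numSuccessfulTasks > 0`, so the separate guard becomes unnecessary.
         private val minFinishedForSpeculation: Int =
           math.max(1, (speculationQuantile * numTasks).floor.toInt)
   
         def speculationEligible(numSuccessfulTasks: Int): Boolean =
           numSuccessfulTasks >= minFinishedForSpeculation
       }
   
   For example, with `numTasks = 10` and `speculationQuantile = 0.75` this gives `minFinishedForSpeculation = 7`; with `speculationQuantile = 0.0` the threshold is clamped to 1 rather than 0, which is exactly what makes the `numSuccessfulTasks > 0` check redundant.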
