Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/02 18:46:32 UTC

[GitHub] [spark] vanzin commented on a change in pull request #24245: [SPARK-13704][CORE][YARN] Reduce rack resolution time

vanzin commented on a change in pull request #24245: [SPARK-13704][CORE][YARN] Reduce rack resolution time
URL: https://github.com/apache/spark/pull/24245#discussion_r271444178
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
 ##########
 @@ -186,8 +186,23 @@ private[spark] class TaskSetManager(
 
   // Add all our tasks to the pending lists. We do this in reverse order
   // of task index so that tasks with low indices get launched first.
-  for (i <- (0 until numTasks).reverse) {
-    addPendingTask(i)
+  addPendingTasks()
+
+  private def addPendingTasks(): Unit = {
+    val (_, duration) = Utils.timeTakenMs {
+      for (i <- (0 until numTasks).reverse) {
+        addPendingTask(i, resolveRacks = false)
+      }
+      // Resolve the rack for each host. This can be slow, so de-dupe the list of hosts,
+      // and assign the rack to all relevant task indices.
+      val racks = sched.getRacksForHosts(pendingTasksForHost.keySet.toSeq)
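
For context on the wrapper above, `Utils.timeTakenMs` runs a block and returns its result paired with the elapsed wall-clock time in milliseconds. A minimal stand-in with the same shape (a sketch only, not Spark's actual implementation):

    import java.util.concurrent.TimeUnit

    // Run a block and return its result together with the elapsed
    // wall-clock time in milliseconds.
    def timeTakenMs[T](body: => T): (T, Long) = {
      val start = System.nanoTime()
      val result = body
      (result, TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start))
    }

    // Usage mirroring the diff: run the setup work, keep only the duration.
    val (_, duration) = timeTakenMs {
      (0 until 1000).sum  // placeholder for the addPendingTask loop
    }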
 
 Review comment:
   There's an implicit assumption here that `map.keySet` and `map.values` iterate in the same order. I'm not sure that's guaranteed; then again, I don't see why it wouldn't hold, but I wanted to point it out.
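
   Whether or not the collections library guarantees that alignment, one way to sidestep the question is to traverse the map once into aligned (host, indices) pairs and zip those with the resolved racks. A minimal sketch with simplified stand-in types (the names mirror the diff, but `assignRacks` and these signatures are hypothetical, not Spark's actual code):

       import scala.collection.mutable

       def assignRacks(
           pendingTasksForHost: mutable.Map[String, mutable.ArrayBuffer[Int]],
           pendingTasksForRack: mutable.Map[String, mutable.ArrayBuffer[Int]],
           getRacksForHosts: Seq[String] => Seq[Option[String]]): Unit = {
         // One traversal yields host and index sequences that are aligned by
         // construction, so nothing relies on keySet and values iterating in
         // the same order.
         val (hosts, indicesForHosts) = pendingTasksForHost.toSeq.unzip
         val racks = getRacksForHosts(hosts)
         for ((rackOpt, indices) <- racks.zip(indicesForHosts); rack <- rackOpt) {
           pendingTasksForRack.getOrElseUpdate(rack, new mutable.ArrayBuffer[Int]) ++= indices
         }
       }

   Snapshotting into pairs costs one extra pass over the map, but it makes the host-to-indices correspondence explicit instead of depending on iteration-order behavior.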

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org