Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/03/04 05:41:48 UTC

[GitHub] [spark] Ngone51 commented on a change in pull request #35719: [SPARK-38401][SQL][CORE] Unify get preferred locations for shuffle in AQE

Ngone51 commented on a change in pull request #35719:
URL: https://github.com/apache/spark/pull/35719#discussion_r819279830



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/ShuffledRowRDD.scala
##########
@@ -177,19 +177,36 @@ class ShuffledRowRDD(
     val tracker = SparkEnv.get.mapOutputTracker.asInstanceOf[MapOutputTrackerMaster]
     partition.asInstanceOf[ShuffledRowRDDPartition].spec match {
       case CoalescedPartitionSpec(startReducerIndex, endReducerIndex, _) =>
-        // TODO order by partition size.
-        startReducerIndex.until(endReducerIndex).flatMap { reducerIndex =>
-          tracker.getPreferredLocationsForShuffle(dependency, reducerIndex)
-        }
+        tracker.getPreferredLocationsForShuffle(
+          dependency,
+          0,
+          dependency.rdd.getNumPartitions,
+          startReducerIndex,
+          endReducerIndex)
 
-      case PartialReducerPartitionSpec(_, startMapIndex, endMapIndex, _) =>
-        tracker.getMapLocation(dependency, startMapIndex, endMapIndex)
+      case PartialReducerPartitionSpec(reducerIndex, startMapIndex, endMapIndex, _) =>
+        tracker.getPreferredLocationsForShuffle(

Review comment:
       Wouldn't this change the behavior? Previously, `getMapLocation` returned the locations unconditionally, but `getPreferredLocationsForShuffle` seems to return a location only when it holds a large enough share of the shuffle data.
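
The behavioral difference the reviewer is pointing at can be sketched with a small, self-contained model. This is a hypothetical illustration, not Spark's actual implementation: the names `mapLocations`/`preferredLocations` and the 0.2 fraction threshold are assumptions modeled on how `MapOutputTrackerMaster.getMapLocation` (return every host that produced output) and `getPreferredLocationsForShuffle` (return only hosts holding at least a fraction of a reducer's total input) are generally understood to differ.

```scala
// Hypothetical sketch of the two lookup strategies; NOT Spark's real code.
object LocationSketch {
  // Bytes of shuffle output each map host contributes to one reducer.
  type SizesByHost = Map[String, Long]

  // getMapLocation-style: every host with any output is a candidate.
  def mapLocations(sizes: SizesByHost): Seq[String] =
    sizes.filter(_._2 > 0).keys.toSeq.sorted

  // getPreferredLocationsForShuffle-style: only hosts holding at least
  // `fraction` of the reducer's total input (0.2 is an assumed threshold).
  def preferredLocations(sizes: SizesByHost, fraction: Double = 0.2): Seq[String] = {
    val total = sizes.values.sum
    sizes.collect {
      case (host, bytes) if total > 0 && bytes.toDouble / total >= fraction => host
    }.toSeq.sorted
  }

  def main(args: Array[String]): Unit = {
    val sizes = Map("hostA" -> 90L, "hostB" -> 9L, "hostC" -> 1L)
    println(mapLocations(sizes))       // List(hostA, hostB, hostC)
    println(preferredLocations(sizes)) // List(hostA) -- small hosts dropped
  }
}
```

Under this model, switching from the first strategy to the second silently drops locations for reducers whose input is spread thinly across hosts, which is exactly the behavior change being questioned.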




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


