Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/03/07 16:30:37 UTC

[GitHub] [spark] attilapiros commented on a change in pull request #23986: [SPARK-27070] Fix performance bug in DefaultPartitionCoalescer

URL: https://github.com/apache/spark/pull/23986#discussion_r263460077
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/rdd/CoalescedRDD.scala
 ##########
 @@ -213,15 +210,16 @@ private class DefaultPartitionCoalescer(val balanceSlack: Double = 0.10)
   }
 
   /**
-   * Sorts and gets the least element of the list associated with key in groupHash
+   * Gets the least element of the list associated with key in groupHash
    * The returned PartitionGroup is the least loaded of all groups that represent the machine "key"
    *
    * @param key string representing a partitioned group on preferred machine key
    * @return Option of [[PartitionGroup]] that has least elements for key
    */
-  def getLeastGroupHash(key: String): Option[PartitionGroup] = {
-    groupHash.get(key).map(_.sortWith(compare).head)
-  }
+  def getLeastGroupHash(key: String): Option[PartitionGroup] =
+    groupHash
 
 Review comment:
  What about avoiding the `if`, `flatMap`, and `Some()`, like:
  ```scala
      groupHash.get(key).filter(_.nonEmpty).map(_.minBy(_.numPartitions))
  ```
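
  To illustrate why this shape helps (a standalone sketch using a simplified `PartitionGroup` stand-in, not Spark's actual class): `minBy` finds the least-loaded group in a single O(n) pass instead of sorting the whole list (O(n log n)) only to take its head, and `filter(_.nonEmpty)` guards against the exception `minBy` throws on an empty collection:
  ```scala
  import scala.collection.mutable

  // Hypothetical stand-in for Spark's PartitionGroup, which exposes
  // a numPartitions count used as the load metric.
  case class PartitionGroup(numPartitions: Int)

  val groupHash = mutable.Map(
    "host-a" -> mutable.ArrayBuffer(
      PartitionGroup(3), PartitionGroup(1), PartitionGroup(2)),
    "host-b" -> mutable.ArrayBuffer.empty[PartitionGroup]
  )

  // One linear scan per lookup; None for missing keys and empty buffers alike.
  def getLeastGroupHash(key: String): Option[PartitionGroup] =
    groupHash.get(key).filter(_.nonEmpty).map(_.minBy(_.numPartitions))

  println(getLeastGroupHash("host-a")) // Some(PartitionGroup(1))
  println(getLeastGroupHash("host-b")) // None (empty buffer filtered out)
  println(getLeastGroupHash("host-c")) // None (key absent)
  ```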

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org