Posted to commits@spark.apache.org by sr...@apache.org on 2015/05/03 00:05:56 UTC

spark git commit: [SPARK-7323] [SPARK CORE] Use insertAll instead of insert while merging combiners in reducer

Repository: spark
Updated Branches:
  refs/heads/master 856a571ef -> da303526e


[SPARK-7323] [SPARK CORE] Use insertAll instead of insert while merging combiners in reducer

Author: Mridul Muralidharan <mr...@yahoo-inc.com>

Closes #5862 from mridulm/optimize_aggregator and squashes the following commits:

61cf43a [Mridul Muralidharan] Use insertAll instead of insert - much more expensive to do it per tuple


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/da303526
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/da303526
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/da303526

Branch: refs/heads/master
Commit: da303526e54e9a0adfedb49417f383cde7870a69
Parents: 856a571
Author: Mridul Muralidharan <mr...@yahoo-inc.com>
Authored: Sat May 2 23:05:51 2015 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Sat May 2 23:05:51 2015 +0100

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/Aggregator.scala | 5 +----
 1 file changed, 1 insertion(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/da303526/core/src/main/scala/org/apache/spark/Aggregator.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/Aggregator.scala b/core/src/main/scala/org/apache/spark/Aggregator.scala
index 3b684bb..af9765d 100644
--- a/core/src/main/scala/org/apache/spark/Aggregator.scala
+++ b/core/src/main/scala/org/apache/spark/Aggregator.scala
@@ -88,10 +88,7 @@ case class Aggregator[K, V, C] (
       combiners.iterator
     } else {
       val combiners = new ExternalAppendOnlyMap[K, C, C](identity, mergeCombiners, mergeCombiners)
-      while (iter.hasNext) {
-        val pair = iter.next()
-        combiners.insert(pair._1, pair._2)
-      }
+      combiners.insertAll(iter)
       // Update task metrics if context is not null
       // TODO: Make context non-optional in a future release
       Option(context).foreach { c =>
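
The saving comes from moving the per-call overhead out of the tuple loop. The sketch below is a hypothetical stand-in, not Spark's actual ExternalAppendOnlyMap; it only illustrates the cost model implied by the commit message, assuming the per-tuple insert delegates to the batch path by wrapping each pair in a fresh single-element iterator:

```scala
import scala.collection.mutable

// Hypothetical stand-in for ExternalAppendOnlyMap, to illustrate the cost model.
class CombinerMap[K, C](mergeCombiners: (C, C) => C) {
  private val map = mutable.Map.empty[K, C]
  var batches = 0 // how many times the batch entry point (and its setup) ran

  // Batch path: setup cost paid once, then a tight loop over the iterator.
  def insertAll(entries: Iterator[(K, C)]): Unit = {
    batches += 1 // stands in for per-call setup (allocation, bookkeeping)
    entries.foreach { case (k, c) =>
      map(k) = map.get(k).map(mergeCombiners(_, c)).getOrElse(c)
    }
  }

  // Per-tuple path: each call wraps the pair in a single-element iterator,
  // so the per-call setup is paid once per element instead of once per batch.
  def insert(k: K, c: C): Unit = insertAll(Iterator((k, c)))

  def result: Map[K, C] = map.toMap
}

val data = Seq("a" -> 1, "b" -> 2, "a" -> 3)

val slow = new CombinerMap[String, Int](_ + _)
data.foreach { case (k, c) => slow.insert(k, c) } // batches == 3

val fast = new CombinerMap[String, Int](_ + _)
fast.insertAll(data.iterator)                     // batches == 1

assert(slow.result == fast.result)                // same combined result either way
```

Both paths produce identical combined output; the patch simply hands the reducer's whole iterator to insertAll in one call, matching the "fast" variant above.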


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org