Posted to commits@spark.apache.org by rx...@apache.org on 2014/01/11 21:08:04 UTC

[3/3] git commit: Merge pull request #359 from ScrapCodes/clone-writables

Merge pull request #359 from ScrapCodes/clone-writables

We clone Hadoop keys and values by default, and reuse objects only if explicitly asked to.

We handle cloning for the most common Writable types directly and fall back to WritableUtils.clone otherwise. The intention is to optimize: for NullWritable no copy is needed at all, and for Long, Int, and String writables, creating a new object with the value set should be faster than a generic copy of the object.
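
For concreteness, below is a minimal sketch of what such per-type cloning can look like. The helper name and the exact set of fast-path types are illustrative assumptions, not necessarily the code this commit adds to Utils.scala:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io._

object WritableCloneSketch {
  // Fast paths for common Writables; fall back to the generic
  // serialize/deserialize copy in WritableUtils.clone otherwise.
  // (Sketch only; names here are hypothetical.)
  def cloneWritable[T <: Writable](value: T, conf: Configuration): T = value match {
    case w: NullWritable => w.asInstanceOf[T]             // singleton, nothing to copy
    case w: LongWritable => new LongWritable(w.get).asInstanceOf[T]
    case w: IntWritable  => new IntWritable(w.get).asInstanceOf[T]
    case w: Text         => new Text(w).asInstanceOf[T]   // Text copy constructor
    case w               => WritableUtils.clone(w, conf)  // generic fallback
  }
}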

Another way to do this PR would be to let the caller specify separately, for the key and for the value, whether to clone it or not. I could not think of a use case for that except when one of them is a NullWritable, which is already handled, so that option seemed unnecessary.


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/ee6e7f9b
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/ee6e7f9b
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/ee6e7f9b

Branch: refs/heads/master
Commit: ee6e7f9b8cc56985787546882fba291cf9ad7667
Parents: 4216178 59b03e0
Author: Reynold Xin <rx...@apache.org>
Authored: Sat Jan 11 12:07:55 2014 -0800
Committer: Reynold Xin <rx...@apache.org>
Committed: Sat Jan 11 12:07:55 2014 -0800

----------------------------------------------------------------------
 .../scala/org/apache/spark/SparkContext.scala   | 78 +++++++++++---------
 .../scala/org/apache/spark/rdd/HadoopRDD.scala  | 29 +++++---
 .../org/apache/spark/rdd/NewHadoopRDD.scala     | 20 ++++-
 .../scala/org/apache/spark/util/Utils.scala     | 28 ++++++-
 4 files changed, 106 insertions(+), 49 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/ee6e7f9b/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/ee6e7f9b/core/src/main/scala/org/apache/spark/rdd/NewHadoopRDD.scala
----------------------------------------------------------------------