Posted to commits@spark.apache.org by sr...@apache.org on 2017/02/09 11:18:07 UTC

spark git commit: [MINOR][CORE] Fix incorrect documentation of WritableConverter

Repository: spark
Updated Branches:
  refs/heads/master 9d9d67c79 -> 1a09cd634


[MINOR][CORE] Fix incorrect documentation of WritableConverter

## What changes were proposed in this pull request?

`WritableConverter` and `WritableFactory` work in opposite directions. However, both of them are documented with the same description:

> A class encapsulating how to convert some type T to Writable. It stores both the Writable class corresponding to T (e.g. IntWritable for Int) and a function for doing the conversion.

This error is a result of commit 2604939. As a note, `WritableFactory` was added in commit d37978d, which resolved [SPARK-4795](https://issues.apache.org/jira/browse/SPARK-4795) with the correct description.

This PR fixes the documentation of `WritableConverter`, along with some improvements to the type descriptions.
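
For reference, a minimal sketch of the two classes and their opposite directions (signatures abridged from `SparkContext.scala`; the `IntWritable` example instances are illustrative only, not code from this commit):

```scala
import scala.reflect.ClassTag
import org.apache.hadoop.io.{IntWritable, Writable}

// Simplified stand-ins for the two private[spark] classes.
// WritableConverter: Writable -> T (used when reading sequence files).
class WritableConverter[T](
    val writableClass: ClassTag[T] => Class[_ <: Writable],
    val convert: Writable => T)

// WritableFactory: T -> Writable (used by SequenceFileRDDFunctions when writing).
class WritableFactory[T](
    val writableClass: ClassTag[T] => Class[_ <: Writable],
    val convert: T => Writable)

// Example for Int <-> IntWritable:
val intConverter = new WritableConverter[Int](
  _ => classOf[IntWritable], w => w.asInstanceOf[IntWritable].get)
val intFactory = new WritableFactory[Int](
  _ => classOf[IntWritable], i => new IntWritable(i))
```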

## How was this patch tested?

`build/mvn clean checkstyle:checkstyle`

Author: Lee Dongjin <do...@apache.org>

Closes #16830 from dongjinleekr/feature/fix-writableconverter-doc.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/1a09cd63
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/1a09cd63
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/1a09cd63

Branch: refs/heads/master
Commit: 1a09cd634610329e85ff212c71cf67c697da5f84
Parents: 9d9d67c
Author: Lee Dongjin <do...@apache.org>
Authored: Thu Feb 9 11:18:02 2017 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Thu Feb 9 11:18:02 2017 +0000

----------------------------------------------------------------------
 .../scala/org/apache/spark/SparkContext.scala     | 18 ++++++++++--------
 1 file changed, 10 insertions(+), 8 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/1a09cd63/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 869c5d7..40189a2 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -2745,11 +2745,12 @@ private object SparkMasterRegex {
 }
 
 /**
- * A class encapsulating how to convert some type T to Writable. It stores both the Writable class
- * corresponding to T (e.g. IntWritable for Int) and a function for doing the conversion.
- * The getter for the writable class takes a ClassTag[T] in case this is a generic object
- * that doesn't know the type of T when it is created. This sounds strange but is necessary to
- * support converting subclasses of Writable to themselves (writableWritableConverter).
+ * A class encapsulating how to convert some type `T` from `Writable`. It stores both the `Writable`
+ * class corresponding to `T` (e.g. `IntWritable` for `Int`) and a function for doing the
+ * conversion.
+ * The getter for the writable class takes a `ClassTag[T]` in case this is a generic object
+ * that doesn't know the type of `T` when it is created. This sounds strange but is necessary to
+ * support converting subclasses of `Writable` to themselves (`writableWritableConverter()`).
  */
 private[spark] class WritableConverter[T](
     val writableClass: ClassTag[T] => Class[_ <: Writable],
@@ -2800,9 +2801,10 @@ object WritableConverter {
 }
 
 /**
- * A class encapsulating how to convert some type T to Writable. It stores both the Writable class
- * corresponding to T (e.g. IntWritable for Int) and a function for doing the conversion.
- * The Writable class will be used in `SequenceFileRDDFunctions`.
+ * A class encapsulating how to convert some type `T` to `Writable`. It stores both the `Writable`
+ * class corresponding to `T` (e.g. `IntWritable` for `Int`) and a function for doing the
+ * conversion.
+ * The `Writable` class will be used in `SequenceFileRDDFunctions`.
  */
 private[spark] class WritableFactory[T](
     val writableClass: ClassTag[T] => Class[_ <: Writable],

