Posted to commits@spark.apache.org by an...@apache.org on 2015/10/15 23:47:15 UTC

spark git commit: fix typo bellow -> below

Repository: spark
Updated Branches:
  refs/heads/master a5719804c -> 723aa75a9


fix typo bellow -> below

Author: Britta Weber <br...@elasticsearch.com>

Closes #9136 from brwe/typo-bellow.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/723aa75a
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/723aa75a
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/723aa75a

Branch: refs/heads/master
Commit: 723aa75a9d566c698aa49597f4f655396fef77bd
Parents: a571980
Author: Britta Weber <br...@elasticsearch.com>
Authored: Thu Oct 15 14:47:11 2015 -0700
Committer: Andrew Or <an...@databricks.com>
Committed: Thu Oct 15 14:47:11 2015 -0700

----------------------------------------------------------------------
 docs/mllib-collaborative-filtering.md | 2 +-
 docs/mllib-linear-methods.md          | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/723aa75a/docs/mllib-collaborative-filtering.md
----------------------------------------------------------------------
diff --git a/docs/mllib-collaborative-filtering.md b/docs/mllib-collaborative-filtering.md
index b3fd51d..1ad5212 100644
--- a/docs/mllib-collaborative-filtering.md
+++ b/docs/mllib-collaborative-filtering.md
@@ -119,7 +119,7 @@ All of MLlib's methods use Java-friendly types, so you can import and call them
 way you do in Scala. The only caveat is that the methods take Scala RDD objects, while the
 Spark Java API uses a separate `JavaRDD` class. You can convert a Java RDD to a Scala one by
 calling `.rdd()` on your `JavaRDD` object. A self-contained application example
-that is equivalent to the provided example in Scala is given bellow:
+that is equivalent to the provided example in Scala is given below:
 
 Refer to the [`ALS` Java docs](api/java/org/apache/spark/mllib/recommendation/ALS.html) for details on the API.
 

http://git-wip-us.apache.org/repos/asf/spark/blob/723aa75a/docs/mllib-linear-methods.md
----------------------------------------------------------------------
diff --git a/docs/mllib-linear-methods.md b/docs/mllib-linear-methods.md
index a3e1620..0c76e6e 100644
--- a/docs/mllib-linear-methods.md
+++ b/docs/mllib-linear-methods.md
@@ -230,7 +230,7 @@ All of MLlib's methods use Java-friendly types, so you can import and call them
 way you do in Scala. The only caveat is that the methods take Scala RDD objects, while the
 Spark Java API uses a separate `JavaRDD` class. You can convert a Java RDD to a Scala one by
 calling `.rdd()` on your `JavaRDD` object. A self-contained application example
-that is equivalent to the provided example in Scala is given bellow:
+that is equivalent to the provided example in Scala is given below:
 
 Refer to the [`SVMWithSGD` Java docs](api/java/org/apache/spark/mllib/classification/SVMWithSGD.html) and [`SVMModel` Java docs](api/java/org/apache/spark/mllib/classification/SVMModel.html) for details on the API.
 
@@ -612,7 +612,7 @@ All of MLlib's methods use Java-friendly types, so you can import and call them
 way you do in Scala. The only caveat is that the methods take Scala RDD objects, while the
 Spark Java API uses a separate `JavaRDD` class. You can convert a Java RDD to a Scala one by
 calling `.rdd()` on your `JavaRDD` object. The corresponding Java example to
-the Scala snippet provided, is presented bellow:
+the Scala snippet provided, is presented below:
 
 Refer to the [`LinearRegressionWithSGD` Java docs](api/java/org/apache/spark/mllib/regression/LinearRegressionWithSGD.html) and [`LinearRegressionModel` Java docs](api/java/org/apache/spark/mllib/regression/LinearRegressionModel.html) for details on the API.
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org