Posted to commits@spark.apache.org by pw...@apache.org on 2014/01/14 08:08:38 UTC

[5/8] git commit: Update Python required version to 2.7, and mention MLlib support

Update Python required version to 2.7, and mention MLlib support


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/224f1a75
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/224f1a75
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/224f1a75

Branch: refs/heads/master
Commit: 224f1a754a636de6b31d636524826d8d59e90379
Parents: 5741078
Author: Matei Zaharia <ma...@databricks.com>
Authored: Sun Jan 12 00:15:34 2014 -0800
Committer: Matei Zaharia <ma...@databricks.com>
Committed: Sun Jan 12 00:15:34 2014 -0800

----------------------------------------------------------------------
 docs/python-programming-guide.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/224f1a75/docs/python-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index c4236f8..b07899c 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -52,7 +52,7 @@ In addition, PySpark fully supports interactive use---simply run `./bin/pyspark`
 
 # Installing and Configuring PySpark
 
-PySpark requires Python 2.6 or higher.
+PySpark requires Python 2.7 or higher.
 PySpark applications are executed using a standard CPython interpreter in order to support Python modules that use C extensions.
 We have not tested PySpark with Python 3 or with alternative Python interpreters, such as [PyPy](http://pypy.org/) or [Jython](http://www.jython.org/).
 
@@ -149,6 +149,12 @@ sc = SparkContext(conf = conf)
 [API documentation](api/pyspark/index.html) for PySpark is available as Epydoc.
 Many of the methods also contain [doctests](http://docs.python.org/2/library/doctest.html) that provide additional usage examples.
 
+# Libraries
+
+[MLlib](mllib-guide.html) is also available in PySpark. To use it, you'll need
+[NumPy](http://www.numpy.org) version 1.7 or newer. The [MLlib guide](mllib-guide.html) contains
+some example applications.
+
 # Where to Go from Here
 
 PySpark also includes several sample programs in the [`python/examples` folder](https://github.com/apache/incubator-spark/tree/master/python/examples).