Posted to commits@spark.apache.org by me...@apache.org on 2014/11/11 07:39:13 UTC
spark git commit: [branch-1.1][SPARK-3990] add a note on ALS usage
Repository: spark
Updated Branches:
refs/heads/branch-1.1 11798d00f -> b2cb357d7
[branch-1.1][SPARK-3990] add a note on ALS usage
Because we switched back to Kryo in #3187, we need to leave a note about the workaround.
Author: Xiangrui Meng <me...@databricks.com>
Closes #3190 from mengxr/SPARK-3990-1.1 and squashes the following commits:
d4818f3 [Xiangrui Meng] fix python style
53725b0 [Xiangrui Meng] add a note about SPARK-3990
56ad70e [Xiangrui Meng] add a note about SPARK-3990
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b2cb357d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/b2cb357d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/b2cb357d
Branch: refs/heads/branch-1.1
Commit: b2cb357d7d5bdbdaa00db2b56d7a2651caece93f
Parents: 11798d0
Author: Xiangrui Meng <me...@databricks.com>
Authored: Mon Nov 10 22:39:09 2014 -0800
Committer: Xiangrui Meng <me...@databricks.com>
Committed: Mon Nov 10 22:39:09 2014 -0800
----------------------------------------------------------------------
python/pyspark/mllib/recommendation.py | 12 ++++++++++++
1 file changed, 12 insertions(+)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/b2cb357d/python/pyspark/mllib/recommendation.py
----------------------------------------------------------------------
diff --git a/python/pyspark/mllib/recommendation.py b/python/pyspark/mllib/recommendation.py
index e863fc2..d4c06c0 100644
--- a/python/pyspark/mllib/recommendation.py
+++ b/python/pyspark/mllib/recommendation.py
@@ -60,6 +60,18 @@ class MatrixFactorizationModel(object):
class ALS(object):
+ """Alternating Least Squares matrix factorization.
+
+ SPARK-3990: In Spark 1.1.x, we use Kryo serialization by default in
+ PySpark. ALS does not work under this default setting. You can switch
+ back to the default Java serialization by setting:
+
+ spark.serializer=org.apache.spark.serializer.JavaSerializer
+
+ Please go to http://spark.apache.org/docs/latest/configuration.html
+ for instructions on how to configure Spark.
+ """
+
@classmethod
def train(cls, ratings, rank, iterations=5, lambda_=0.01, blocks=-1):
sc = ratings.context
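For reference, the workaround described in the docstring above can be applied through Spark's standard configuration mechanism. A minimal sketch (the property key and value come from the note in this commit; the file placement follows the standard Spark layout, and the application name in the spark-submit line is illustrative):

```
# conf/spark-defaults.conf (Spark 1.1.x)
# Switch PySpark back to Java serialization so ALS works (SPARK-3990)
spark.serializer    org.apache.spark.serializer.JavaSerializer
```

The same property can also be passed per-job on the command line, e.g. `spark-submit --conf "spark.serializer=org.apache.spark.serializer.JavaSerializer" my_als_app.py`, which avoids changing the cluster-wide default.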