Posted to commits@spark.apache.org by ma...@apache.org on 2014/09/09 00:46:13 UTC

git commit: [SPARK-3417] Use new-style classes in PySpark

Repository: spark
Updated Branches:
  refs/heads/master 26bc7655d -> 939a322c8


[SPARK-3417] Use new-style classes in PySpark

A tiny PR making SQLContext a new-style class. This allows various type logic to work more effectively.

```python
In [1]: import pyspark

In [2]: pyspark.sql.SQLContext.mro()
Out[2]: [pyspark.sql.SQLContext, object]
```
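For context, the distinction this patch addresses can be sketched as follows (an illustrative example, not code from the repository). Under Python 2, a class defined without an explicit `object` base is "old-style": it has no `mro()` method, and features such as `super()`, descriptors, and `__slots__` do not work on it. Adding `object` as a base makes it new-style; in Python 3 every class is new-style automatically.

```python
# Minimal sketch of the new-style class behavior the patch enables
# (illustrative only; `NewStyle` is a stand-in, not a PySpark class).

class NewStyle(object):  # explicit `object` base -> new-style class
    pass

# New-style classes expose a well-defined method resolution order,
# which type-dispatch logic can inspect:
print(NewStyle.mro())  # [<class '__main__.NewStyle'>, <class 'object'>]
```

Under Python 2, the equivalent `class OldStyle: pass` would raise `AttributeError` on `OldStyle.mro()`, which is why adding `(object)` to these four classes matters.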

Author: Matthew Rocklin <mr...@gmail.com>

Closes #2288 from mrocklin/sqlcontext-new-style-class and squashes the following commits:

4aadab6 [Matthew Rocklin] update other old-style classes
a2dc02f [Matthew Rocklin] pyspark.sql.SQLContext is new-style class


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/939a322c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/939a322c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/939a322c

Branch: refs/heads/master
Commit: 939a322c85956eda150b10afb2ed1d8d959a7bdf
Parents: 26bc765
Author: Matthew Rocklin <mr...@gmail.com>
Authored: Mon Sep 8 15:45:28 2014 -0700
Committer: Michael Armbrust <mi...@databricks.com>
Committed: Mon Sep 8 15:45:36 2014 -0700

----------------------------------------------------------------------
 python/pyspark/mllib/random.py | 2 +-
 python/pyspark/mllib/util.py   | 2 +-
 python/pyspark/sql.py          | 2 +-
 python/pyspark/storagelevel.py | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/939a322c/python/pyspark/mllib/random.py
----------------------------------------------------------------------
diff --git a/python/pyspark/mllib/random.py b/python/pyspark/mllib/random.py
index 3e59c73..d53c95f 100644
--- a/python/pyspark/mllib/random.py
+++ b/python/pyspark/mllib/random.py
@@ -28,7 +28,7 @@ from pyspark.serializers import NoOpSerializer
 __all__ = ['RandomRDDs', ]
 
 
-class RandomRDDs:
+class RandomRDDs(object):
     """
     Generator methods for creating RDDs comprised of i.i.d samples from
     some distribution.

http://git-wip-us.apache.org/repos/asf/spark/blob/939a322c/python/pyspark/mllib/util.py
----------------------------------------------------------------------
diff --git a/python/pyspark/mllib/util.py b/python/pyspark/mllib/util.py
index 4962d05..1c7b8c8 100644
--- a/python/pyspark/mllib/util.py
+++ b/python/pyspark/mllib/util.py
@@ -25,7 +25,7 @@ from pyspark.rdd import RDD
 from pyspark.serializers import NoOpSerializer
 
 
-class MLUtils:
+class MLUtils(object):
 
     """
     Helper methods to load, save and pre-process data used in MLlib.

http://git-wip-us.apache.org/repos/asf/spark/blob/939a322c/python/pyspark/sql.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql.py b/python/pyspark/sql.py
index 004d493..53eea6d 100644
--- a/python/pyspark/sql.py
+++ b/python/pyspark/sql.py
@@ -899,7 +899,7 @@ def _create_cls(dataType):
     return Row
 
 
-class SQLContext:
+class SQLContext(object):
 
     """Main entry point for Spark SQL functionality.
 

http://git-wip-us.apache.org/repos/asf/spark/blob/939a322c/python/pyspark/storagelevel.py
----------------------------------------------------------------------
diff --git a/python/pyspark/storagelevel.py b/python/pyspark/storagelevel.py
index 2aa0fb9..676aa0f 100644
--- a/python/pyspark/storagelevel.py
+++ b/python/pyspark/storagelevel.py
@@ -18,7 +18,7 @@
 __all__ = ["StorageLevel"]
 
 
-class StorageLevel:
+class StorageLevel(object):
 
     """
     Flags for controlling the storage of an RDD. Each StorageLevel records whether to use memory,

