Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/12 17:13:59 UTC

[GitHub] [spark] ueshin commented on a change in pull request #34197: [SPARK-36938][PYTHON] Inline type hints for group.py in python/pyspark/sql

ueshin commented on a change in pull request #34197:
URL: https://github.com/apache/spark/pull/34197#discussion_r727337026



##########
File path: python/pyspark/sql/group.py
##########
@@ -53,12 +64,12 @@ class GroupedData(PandasGroupedOpsMixin):
     .. versionadded:: 1.3
     """
 
-    def __init__(self, jgd, df):
+    def __init__(self, jgd: JavaObject, df: DataFrame) -> None:
         self._jgd = jgd
         self._df = df
-        self.sql_ctx = df.sql_ctx
+        self.sql_ctx: SQLContext = df.sql_ctx
 
-    def agg(self, *exprs):
+    def agg(self, *exprs: Column) -> DataFrame:

Review comment:
       The overload definitions are missing?
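For context, the pattern the reviewer is likely referring to: `agg` accepts either `Column` expressions or a single `Dict[str, str]`, so the inline hints need `@overload` stubs above the real implementation. A minimal self-contained sketch (the stand-in `Column`/`DataFrame` classes and the trivial body are placeholders, not Spark's actual implementation):

```python
from typing import Dict, Union, overload

class Column: ...     # stand-in for pyspark.sql.Column
class DataFrame: ...  # stand-in for pyspark.sql.DataFrame

class GroupedData:
    # Overload stubs describe the two accepted call shapes to the
    # type checker; their bodies are always `...`.
    @overload
    def agg(self, *exprs: Column) -> DataFrame: ...
    @overload
    def agg(self, __exprs: Dict[str, str]) -> DataFrame: ...
    # The single runtime implementation handles both shapes.
    def agg(self, *exprs: Union[Column, Dict[str, str]]) -> DataFrame:
        return DataFrame()  # placeholder body for illustration
```

The dunder-prefixed `__exprs` marks the dict parameter as positional-only in the overload, matching how `df.groupBy(...).agg({"age": "max"})` is actually called.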

##########
File path: python/pyspark/sql/group.py
##########
@@ -14,19 +14,27 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
+from __future__ import annotations

Review comment:
       We can't use `annotations` future flag yet. We still support Python 3.6 as of now.
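The `annotations` future flag was introduced in Python 3.7, so importing it is a `SyntaxError` on 3.6. The usual 3.6-compatible alternative is to quote forward references as string literals. A small sketch (class names here are illustrative, not Spark's code):

```python
# String-literal annotations give the same forward-reference behavior
# as `from __future__ import annotations`, but work on Python 3.6.
class DataFrame:
    def group_by(self) -> "GroupedData":  # forward reference as a string
        return GroupedData(self)

class GroupedData:
    def __init__(self, df: "DataFrame") -> None:
        self._df = df
```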

##########
File path: python/pyspark/sql/group.py
##########
@@ -53,12 +64,12 @@ class GroupedData(PandasGroupedOpsMixin):
     .. versionadded:: 1.3
     """
 
-    def __init__(self, jgd, df):
+    def __init__(self, jgd: JavaObject, df: DataFrame) -> None:

Review comment:
       nit: I don't think we need `-> None` for the initializer.
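For illustration, both spellings type-check the same way: once at least one parameter of `__init__` is annotated, mypy treats the method as typed and infers the `None` return, so the explicit `-> None` is redundant here (a toy example, not Spark's code):

```python
class Demo:
    # No `-> None` needed: the annotated `value` parameter is enough
    # for type checkers to treat __init__ as fully typed.
    def __init__(self, value: int):
        self.value = value
```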




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


