Posted to reviews@spark.apache.org by "ueshin (via GitHub)" <gi...@apache.org> on 2023/03/02 23:31:54 UTC

[GitHub] [spark] ueshin opened a new pull request, #40260: [SPARK-42630][CONNECT][PYTHON] Delay parsing DDL string until SparkConnectClient is available

ueshin opened a new pull request, #40260:
URL: https://github.com/apache/spark/pull/40260

   ### What changes were proposed in this pull request?
   
   Delays parsing DDL string for Python UDFs until `SparkConnectClient` is available.
   
   Also changes `createDataFrame` to use the proto `DDLParse`.
   
   ### Why are the changes needed?
   
   Currently `parse_data_type` depends on `PySparkSession`, which creates a local PySpark session, but that session won't be available on the client side.
   
   Once `SparkConnectClient` is available, we can use the new proto `DDLParse` to parse data types given as DDL strings.
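
   A rough sketch of that flow (illustrative only: `resolve_return_type` and its `client` parameter are hypothetical helpers, while the `ddl_parse` analyze call mirrors the snippet quoted later in this thread):

   ```
   from typing import Union

   from pyspark.sql.types import DataType

   def resolve_return_type(return_type: Union[DataType, str], client) -> DataType:
       # A DataType passes through unchanged; a DDL string is parsed lazily,
       # once a SparkConnectClient is available to serve the DDLParse request.
       if isinstance(return_type, str):
           return client._analyze(method="ddl_parse", ddl_string=return_type).parsed
       return return_type
   ```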
   
   ### Does this PR introduce _any_ user-facing change?
   
   The UDF's `returnType` attribute can be a string in Spark Connect when the return type was provided as a string.
   
   ### How was this patch tested?
   
   Existing tests.


[GitHub] [spark] xinrong-meng commented on a diff in pull request #40260: [SPARK-42630][CONNECT][PYTHON] Delay parsing DDL string until SparkConnectClient is available

Posted by "xinrong-meng (via GitHub)" <gi...@apache.org>.
xinrong-meng commented on code in PR #40260:
URL: https://github.com/apache/spark/pull/40260#discussion_r1123974740


##########
python/pyspark/sql/connect/udf.py:
##########
@@ -99,9 +97,7 @@ def __init__(
             )
 
         self.func = func
-        self._returnType = (
-            parse_data_type(returnType) if isinstance(returnType, str) else returnType
-        )
+        self._returnType = returnType

Review Comment:
   In vanilla PySpark, a `returnType` property is defined at https://github.com/apache/spark/blob/master/python/pyspark/sql/udf.py#L233, which always returns a `DataType`.
   
   In the existing Connect code, `wrapper.returnType = self._returnType` is used instead.
   
   I am afraid this line of change may break that property parity.
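
   One hedged sketch of how that parity could be kept while still deferring the parse (hypothetical names; the `ddl_parser` callable stands in for whatever client-backed parsing ends up being used):

   ```
   from typing import Callable, Optional, Union

   from pyspark.sql.types import DataType

   class LazyReturnTypeSketch:
       def __init__(self, returnType: Union[DataType, str], ddl_parser: Callable[[str], DataType]):
           self._returnType = returnType  # stored as given, not parsed yet
           self._ddl_parser = ddl_parser  # e.g. a client-backed DDLParse call
           self._parsed: Optional[DataType] = None

       @property
       def returnType(self) -> DataType:
           # Parse lazily so the property still always returns a DataType,
           # matching the vanilla PySpark property linked above.
           if self._parsed is None:
               rt = self._returnType
               self._parsed = self._ddl_parser(rt) if isinstance(rt, str) else rt
           return self._parsed
   ```

   With something along these lines, `wrapper.returnType` could keep yielding a `DataType` even when the UDF was declared with a DDL string.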



[GitHub] [spark] xinrong-meng commented on a diff in pull request #40260: [SPARK-42630][CONNECT][PYTHON] Delay parsing DDL string until SparkConnectClient is available

Posted by "xinrong-meng (via GitHub)" <gi...@apache.org>.
xinrong-meng commented on code in PR #40260:
URL: https://github.com/apache/spark/pull/40260#discussion_r1123976155


##########
python/pyspark/sql/connect/_typing.py:
##########
@@ -57,7 +57,7 @@ class UserDefinedFunctionLike(Protocol):
     deterministic: bool
 
     @property
-    def returnType(self) -> DataType:
+    def returnType(self) -> DataTypeOrString:

Review Comment:
   Oh yes, I commented separately below.



[GitHub] [spark] HyukjinKwon closed pull request #40260: [SPARK-42630][CONNECT][PYTHON] Introduce UnparsedDataType and delay parsing DDL string until SparkConnectClient is available

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon closed pull request #40260: [SPARK-42630][CONNECT][PYTHON] Introduce UnparsedDataType and delay parsing DDL string until SparkConnectClient is available
URL: https://github.com/apache/spark/pull/40260


[GitHub] [spark] zhengruifeng commented on a diff in pull request #40260: [SPARK-42630][CONNECT][PYTHON] Delay parsing DDL string until SparkConnectClient is available

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #40260:
URL: https://github.com/apache/spark/pull/40260#discussion_r1123992438


##########
python/pyspark/sql/connect/_typing.py:
##########
@@ -57,7 +57,7 @@ class UserDefinedFunctionLike(Protocol):
     deterministic: bool
 
     @property
-    def returnType(self) -> DataType:
+    def returnType(self) -> DataTypeOrString:

Review Comment:
   Yeah, but I am still confused about it: the old implementation
   `PySparkSession.builder.getOrCreate().createDataFrame(data=[], schema=data_type).schema` works.
   
   I also tried
   ```
       session = PySparkSession.builder.getOrCreate()
       parsed = session.client._analyze(  # type: ignore[attr-defined]
           method="ddl_parse", ddl_string=data_type
       ).parsed
   ```
   and at least the tests passed.
   
   But if I try
   ```
       parsed = PySparkSession.builder.getOrCreate().client._analyze(  # type: ignore[attr-defined]
           method="ddl_parse", ddl_string=data_type
       ).parsed
   ```
   the tests always fail with `ValueError: Cannot invoke RPC on closed channel!`
   
   Maybe we will have to add a pure Python DDL parser, I don't know.



[GitHub] [spark] zhengruifeng commented on a diff in pull request #40260: [SPARK-42630][CONNECT][PYTHON] Delay parsing DDL string until SparkConnectClient is available

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #40260:
URL: https://github.com/apache/spark/pull/40260#discussion_r1123974022


##########
python/pyspark/sql/connect/_typing.py:
##########
@@ -57,7 +57,7 @@ class UserDefinedFunctionLike(Protocol):
     deterministic: bool
 
     @property
-    def returnType(self) -> DataType:
+    def returnType(self) -> DataTypeOrString:

Review Comment:
   Does this introduce a behavior change?
   
   https://github.com/apache/spark/blob/6ff760d483124b121d79c3a2d5fdc3ee3f27dd00/python/pyspark/sql/_typing.pyi#L70-L77
   
   It seems that the PySpark UDF's `returnType` is always a `DataType`.
   
   cc @xinrong-meng 



[GitHub] [spark] ueshin commented on a diff in pull request #40260: [SPARK-42630][CONNECT][PYTHON] Delay parsing DDL string until SparkConnectClient is available

Posted by "ueshin (via GitHub)" <gi...@apache.org>.
ueshin commented on code in PR #40260:
URL: https://github.com/apache/spark/pull/40260#discussion_r1123979423


##########
python/pyspark/sql/connect/_typing.py:
##########
@@ -57,7 +57,7 @@ class UserDefinedFunctionLike(Protocol):
     deterministic: bool
 
     @property
-    def returnType(self) -> DataType:
+    def returnType(self) -> DataTypeOrString:

Review Comment:
   Yes, that's true.
   However, the client is not available here, and we don't have a proper way to parse the string at this point.
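
   For reference, the final shape of the PR (see the updated title in this thread) introduces an `UnparsedDataType` that simply carries the DDL string until a client can parse it. A loose sketch of that idea, not the actual implementation:

   ```
   from pyspark.sql.types import DataType

   class UnparsedDataTypeSketch(DataType):
       """Stand-in illustrating the idea: hold the DDL string and defer
       parsing until a SparkConnectClient can serve a DDLParse request."""

       def __init__(self, data_type_string: str):
           self.data_type_string = data_type_string

       def __repr__(self) -> str:
           return f"UnparsedDataTypeSketch({self.data_type_string!r})"
   ```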



[GitHub] [spark] HyukjinKwon commented on pull request #40260: [SPARK-42630][CONNECT][PYTHON] Introduce UnparsedDataType and delay parsing DDL string until SparkConnectClient is available

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40260:
URL: https://github.com/apache/spark/pull/40260#issuecomment-1461828508

   Merged to master and branch-3.4.

