Posted to reviews@spark.apache.org by "zhengruifeng (via GitHub)" <gi...@apache.org> on 2023/03/29 03:38:00 UTC

[GitHub] [spark] zhengruifeng opened a new pull request, #40582: [SPARK-42954][PYTHON][CONNECT] Add `YearMonthIntervalType` to PySpark and Spark Connect Python Client

zhengruifeng opened a new pull request, #40582:
URL: https://github.com/apache/spark/pull/40582

   ### What changes were proposed in this pull request?
   Add `YearMonthIntervalType` to PySpark and Spark Connect Python Client
   
   ### Why are the changes needed?
   Feature parity: `YearMonthIntervalType` is already supported on the Scala/JVM side but not yet exposed in Python.
   
   **Note** 
   The added `YearMonthIntervalType` is not supported in `collect`/`createDataFrame`, since there is no Python built-in type it maps to (the way `datetime.timedelta` maps to `DayTimeIntervalType`); this needs further discussion (see the sketch below).
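
   A minimal sketch of the gap: a `DayTimeIntervalType` value maps cleanly onto `datetime.timedelta`, while a year-month interval is just a month count with no built-in Python counterpart. The helper names below are illustrative, not PySpark internals.
   ```
   import datetime

   # Day-time intervals are stored as microseconds and map 1:1 onto timedelta.
   def day_time_interval_to_python(micros: int) -> datetime.timedelta:
       return datetime.timedelta(microseconds=micros)

   # Year-month intervals are stored as a number of months; timedelta cannot
   # represent "months", so there is no obvious built-in target type.
   def year_month_interval_to_python(months: int):
       years, remaining_months = divmod(months, 12)
       return (years, remaining_months)  # e.g. 128 months -> (10, 8), a plain tuple

   print(day_time_interval_to_python(1_000_000))  # 0:00:01
   print(year_month_interval_to_python(128))      # (10, 8)
   ```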
   
   
   ### Does this PR introduce _any_ user-facing change?
   Yes, a new data type is exposed in Python.
   
   Before this PR:
   ```
   In [1]: spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval")
   Out[1]: ---------------------------------------------------------------------------
   ValueError                                Traceback (most recent call last)
   File ~/Dev/spark/python/pyspark/sql/dataframe.py:570, in DataFrame.schema(self)
       568 try:
       569     self._schema = cast(
   --> 570         StructType, _parse_datatype_json_string(self._jdf.schema().json())
       571     )
       572 except Exception as e:
   
   ...
   
   ValueError: Unable to parse datatype from schema. Could not parse datatype: interval year to month
   ```
   
   
   After this PR:
   ```
   In [3]: spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval")
   Out[3]: DataFrame[interval: interval year to month]
   ```
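
   The new type can also be used directly when building schemas. A minimal sketch, assuming the constructor and the `YEAR`/`MONTH` field constants mirror the existing `DayTimeIntervalType` API:
   ```
   from pyspark.sql.types import StructField, StructType, YearMonthIntervalType

   # Schema with an explicit YEAR TO MONTH interval field; the field constants
   # are assumed to follow the DayTimeIntervalType.DAY/SECOND pattern.
   schema = StructType([
       StructField(
           "interval",
           YearMonthIntervalType(YearMonthIntervalType.YEAR, YearMonthIntervalType.MONTH),
       )
   ])

   print(schema.simpleString())  # expected: struct<interval:interval year to month>
   ```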
   
   
   ### How was this patch tested?
   Added a unit test.
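
   Roughly along these lines (a sketch of what such a test could check, assuming a `spark` session fixture; not the actual test added here):
   ```
   from pyspark.sql.types import YearMonthIntervalType

   def test_year_month_interval_schema(spark):
       # The interval literal should now resolve to the new Python type instead
       # of raising "Unable to parse datatype from schema".
       df = spark.sql("SELECT INTERVAL '10-8' YEAR TO MONTH AS interval")
       assert isinstance(df.schema["interval"].dataType, YearMonthIntervalType)
   ```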




[GitHub] [spark] zhengruifeng closed pull request #40582: [SPARK-42954][PYTHON][CONNECT] Add `YearMonthIntervalType` to PySpark and Spark Connect Python Client

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng closed pull request #40582: [SPARK-42954][PYTHON][CONNECT] Add `YearMonthIntervalType` to PySpark and Spark Connect Python Client
URL: https://github.com/apache/spark/pull/40582




[GitHub] [spark] zhengruifeng commented on pull request #40582: [SPARK-42954][PYTHON][CONNECT] Add `YearMonthIntervalType` to PySpark and Spark Connect Python Client

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #40582:
URL: https://github.com/apache/spark/pull/40582#issuecomment-1488273071

   Thank you for the reviews; merged into master.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org