Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/04 18:06:42 UTC

[GitHub] [spark] pralabhkumar commented on a change in pull request #33980: [SPARK-32285][PYTHON] Add PySpark support for nested timestamps with arrow

pralabhkumar commented on a change in pull request #33980:
URL: https://github.com/apache/spark/pull/33980#discussion_r721598237



##########
File path: python/pyspark/sql/pandas/types.py
##########
@@ -296,7 +337,34 @@ def _check_series_convert_timestamps_localize(s, from_timezone, to_timezone):
         return s
 
 
-def _check_series_convert_timestamps_local_tz(s, timezone):
+def __handle_array_of_timestamps(series, to_tz, from_tz=None):
+    """Convert arrays of timestamps in a pandas Series between timezones.
+
+    :param series: pandas Series of timestamp arrays
+    :param to_tz: target timezone
+    :param from_tz: source timezone (optional)
+    :return: Series with timestamps converted to the target timezone
+    """
+    from pandas.api.types import is_datetime64tz_dtype, is_datetime64_dtype
+    import pandas as pd
+    from pandas import Series
+    data_after_conversion = []
+    for data in series:

Review comment:
       @BryanCutler, I have made some changes: removed all the unnecessary specialized conversions, and the series is now also created here.
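
A minimal sketch of what a helper like `__handle_array_of_timestamps` might do, outside the diff context. This is an illustration only, not the PR's actual implementation: the function name, the localize-then-convert logic, and the return of a rebuilt Series are assumptions based on the visible signature and loop.

```python
import pandas as pd

def handle_array_of_timestamps(series, to_tz, from_tz=None):
    """Convert each array of timestamps in `series` to `to_tz`.

    If `from_tz` is given, naive timestamps are first localized to it
    before being converted to the target timezone. (Hypothetical sketch,
    not the code from the PR.)
    """
    data_after_conversion = []
    for data in series:
        ts = pd.Series(pd.to_datetime(data))
        if from_tz is not None and ts.dt.tz is None:
            # Naive timestamps: attach the source timezone first.
            ts = ts.dt.tz_localize(from_tz)
        # Convert tz-aware timestamps to the target timezone.
        data_after_conversion.append(ts.dt.tz_convert(to_tz).tolist())
    return pd.Series(data_after_conversion)
```

Building an intermediate list and wrapping it in a new `pd.Series` at the end mirrors the loop structure visible in the diff.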




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org