Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/12/09 09:14:00 UTC

[jira] [Commented] (SPARK-41455) Resolve dtypes inconsistencies of date/timestamp functions

    [ https://issues.apache.org/jira/browse/SPARK-41455?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17645204#comment-17645204 ] 

Hyukjin Kwon commented on SPARK-41455:
--------------------------------------

Spark Connect currently doesn't support configuration handling, so it would be difficult to make this work per the configuration for now.

> Resolve dtypes inconsistencies of date/timestamp functions
> ----------------------------------------------------------
>
>                 Key: SPARK-41455
>                 URL: https://issues.apache.org/jira/browse/SPARK-41455
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 3.4.0
>            Reporter: Xinrong Meng
>            Priority: Major
>
> When implementing date/timestamp functions, we notice inconsistent dtypes with PySpark, as shown below.
> {code:python}
> >>> sdf.select(SF.current_timestamp()).toPandas().dtypes
> current_timestamp()    datetime64[ns]
> dtype: object
> >>> cdf.select(CF.current_timestamp()).toPandas().dtypes
> current_timestamp()    datetime64[ns, America/Los_Angeles]
> dtype: object
> {code}
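> A minimal reproduction sketch (hedged: the session setup below is illustrative and assumes a Spark 3.4 build with a Connect server at sc://localhost; classic and Connect sessions are typically created in separate processes):
> {code:python}
> from pyspark.sql import SparkSession
> import pyspark.sql.functions as SF
> import pyspark.sql.connect.functions as CF
>
> # Classic PySpark session backed by a local JVM.
> spark = SparkSession.builder.master("local[1]").getOrCreate()
> sdf = spark.range(1)
>
> # Spark Connect session (assumes a Connect server is running locally).
> connect = SparkSession.builder.remote("sc://localhost").getOrCreate()
> cdf = connect.range(1)
>
> # The same expression comes back tz-naive through classic PySpark but
> # tz-aware through Connect.
> print(sdf.select(SF.current_timestamp()).toPandas().dtypes)
> print(cdf.select(CF.current_timestamp()).toPandas().dtypes)
> {code}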
> Affected functions include:
> {code:python}
> to_timestamp, from_utc_timestamp, to_utc_timestamp, timestamp_seconds, current_timestamp, date_trunc
> {code}
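> Each of these can be checked the same way; a hedged sketch using `timestamp_seconds` (with `sdf` and `cdf` as the classic and Connect DataFrames above):
> {code:python}
> # Before the fix, the dtype comparison fails because the Connect result
> # carries a timezone while the classic PySpark result does not.
> pdf_classic = sdf.select(SF.timestamp_seconds(SF.lit(0))).toPandas()
> pdf_connect = cdf.select(CF.timestamp_seconds(CF.lit(0))).toPandas()
> assert list(pdf_classic.dtypes) == list(pdf_connect.dtypes)
> {code}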
> We may have to implement `is_timestamp_ntz_preferred` for Connect.
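> A hedged sketch of what a Connect-side check could look like once configuration reads are supported; `spark.sql.timestampType` is the real SQL conf, but this helper body is illustrative, not the actual fix:
> {code:python}
> def is_timestamp_ntz_preferred(session) -> bool:
>     # With spark.sql.timestampType set to TIMESTAMP_NTZ, toPandas should
>     # return tz-naive datetime64[ns] columns rather than tz-aware ones.
>     return session.conf.get("spark.sql.timestampType") == "TIMESTAMP_NTZ"
> {code}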
> After the fix, tests of those date/timestamp functions that currently use `compare_by_show` should be switched to a `toPandas`-based comparison, as sketched below.
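> A hedged sketch of that comparison (`compare_by_show` is the existing helper named above; `check_to_pandas` here is a hypothetical replacement):
> {code:python}
> from pandas.testing import assert_frame_equal
>
> def check_to_pandas(cdf, sdf):
>     # Unlike comparing show() strings, comparing the materialized pandas
>     # DataFrames also verifies dtypes, catching the tz-aware/tz-naive mismatch.
>     assert_frame_equal(cdf.toPandas(), sdf.toPandas())
> {code}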


