Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2019/02/15 08:56:00 UTC

[jira] [Assigned] (SPARK-26887) Create datetime.date directly instead of creating datetime64[ns] as intermediate data.

     [ https://issues.apache.org/jira/browse/SPARK-26887?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-26887:
------------------------------------

    Assignee:     (was: Apache Spark)

> Create datetime.date directly instead of creating datetime64[ns] as intermediate data.
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-26887
>                 URL: https://issues.apache.org/jira/browse/SPARK-26887
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Takuya Ueshin
>            Priority: Major
>
> Currently, {{DataFrame.toPandas()}} with Arrow enabled, as well as {{ArrowStreamPandasSerializer}} for pandas UDFs with pyarrow<0.12, creates a {{datetime64[ns]}} series as intermediate data and then converts it to a {{datetime.date}} series, but the intermediate {{datetime64[ns]}} can overflow even when the date itself is valid.
> {noformat}
> >>> import datetime
> >>>
> >>> t = [datetime.date(2262, 4, 12), datetime.date(2263, 4, 12)]
> >>>
> >>> df = spark.createDataFrame(t, 'date')
> >>> df.show()
> +----------+
> |     value|
> +----------+
> |2262-04-12|
> |2263-04-12|
> +----------+
> >>>
> >>> spark.conf.set("spark.sql.execution.arrow.enabled", "true")
> >>>
> >>> df.toPandas()
>         value
> 0  1677-09-21
> 1  1678-09-21
> {noformat}
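> The root cause is the limited range of {{datetime64[ns]}}: a signed 64-bit nanosecond counter only spans roughly 1677-09-21 through 2262-04-11, and casting a later date into it wraps around silently. A minimal sketch of the same wraparound outside Spark (assuming NumPy's usual silent int64 overflow on unit casts):
> {noformat}
> >>> import numpy as np
> >>>
> >>> # Day precision holds 2263-04-12 fine; the cast to nanosecond
> >>> # precision exceeds the int64 range and wraps by ~584 years.
> >>> print(np.datetime64('2263-04-12').astype('datetime64[ns]').astype('datetime64[D]'))
> 1678-09-21
> {noformat}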
> We should avoid creating such intermediate data and instead create the {{datetime.date}} series directly.
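> A hypothetical sketch of the direct route (an illustration only, not the actual patch): with pyarrow, a date column can be materialized as {{datetime.date}} objects without ever passing through {{datetime64[ns]}}, e.g. via {{Array.to_pylist()}}:
> {noformat}
> >>> import datetime
> >>> import pandas as pd
> >>> import pyarrow as pa
> >>>
> >>> arr = pa.array([datetime.date(2262, 4, 12), datetime.date(2263, 4, 12)], type=pa.date32())
> >>> # to_pylist() yields datetime.date objects directly, so the
> >>> # out-of-range dates survive the round trip unchanged
> >>> pd.Series(arr.to_pylist())
> 0    2262-04-12
> 1    2263-04-12
> dtype: object
> {noformat}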



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org