Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2022/06/14 00:36:00 UTC

[jira] [Resolved] (SPARK-39433) to_date function returns a null for the first week of the year

     [ https://issues.apache.org/jira/browse/SPARK-39433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-39433.
----------------------------------
    Resolution: Duplicate

Oh, I think it's a subset of the linked issue https://issues.apache.org/jira/browse/SPARK-38571.
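For context on why a week-only pattern behaves this way: a year plus a week number alone does not pin down a single calendar date, and strict resolvers only produce a date when a day-of-week is also supplied. A rough analogue in plain Python using `datetime.strptime` (not Spark's Java `DateTimeFormatter`, so the exact null-vs-value cases differ, but the resolution rule is similar):

```python
from datetime import datetime

# %W = week of year, where weeks start on Monday and any days before
# the year's first Monday fall in "week 0". Without a day-of-week,
# strptime ignores the week field entirely and defaults to January 1.
only_week = datetime.strptime("2013-01", "%Y-%W")
print(only_week.date())  # 2013-01-01 -- the week number was not used

# Adding %w (day of week, 1 = Monday) gives the parser enough fields
# to resolve a real date: the Monday of week 1 of 2013.
with_day = datetime.strptime("2013-01-1", "%Y-%W-%w")
print(with_day.date())  # 2013-01-07
```

Because 2013-01-01 is a Tuesday, week 1 under this convention does not begin until January 7, which is why "first week of the year" inputs are the ones that misbehave.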

> to_date function returns a null for the first week of the year
> --------------------------------------------------------------
>
>                 Key: SPARK-39433
>                 URL: https://issues.apache.org/jira/browse/SPARK-39433
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.1.2
>            Reporter: CHARLES HOGG
>            Priority: Minor
>
> When I use a week-of-year pattern in the to_date function, inputs for the first week of the year parse to null for many years.
> ```
> from pyspark.sql import SparkSession
> from pyspark.sql import functions as func
>
> # Build a local session (the original snippet used an internal helper)
> spark = SparkSession.builder.getOrCreate()
>
> df = spark.createDataFrame([["2013-01"], ["2013-02"], ["2017-01"], ["2018-01"]], ["input"])
> df.select(func.col("input"), func.to_date(func.col("input"), "yyyy-ww").alias("date")) \
>   .show()
> ```
> ```
> +-------+----------+
> |  input|      date|
> +-------+----------+
> |2013-01|      null|
> |2013-02|2013-01-06|
> |2017-01|2017-01-01|
> |2018-01|      null|
> +-------+----------+
> ```
> Why is this? Is it a bug in the to_date function?



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org