Posted to issues@spark.apache.org by "Maciej Szymkiewicz (Jira)" <ji...@apache.org> on 2021/12/30 15:09:00 UTC

[jira] [Resolved] (SPARK-37738) PySpark date_add only accepts an integer as its second parameter

     [ https://issues.apache.org/jira/browse/SPARK-37738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Maciej Szymkiewicz resolved SPARK-37738.
----------------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 35032
[https://github.com/apache/spark/pull/35032]

> PySpark date_add only accepts an integer as its second parameter
> ----------------------------------------------------------------
>
>                 Key: SPARK-37738
>                 URL: https://issues.apache.org/jira/browse/SPARK-37738
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, SQL
>    Affects Versions: 3.2.0
>            Reporter: Daniel Davies
>            Priority: Minor
>             Fix For: 3.3.0
>
>
> Hello,
> I have a quick question regarding the PySpark date_add function (and it's related functions I guess). Using date_add as an example, the PySpark API takes a [column, and an int as it's second parameter.|#L2203]]
> This feels a bit weird, since the underlying SQL expression can also take a column as the second parameter; in fact, to my limited understanding, the Scala [API itself|https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L3114] just calls lit on this second parameter anyway. Is there a reason date_add doesn't support a column type as the second parameter in PySpark?
> This isn't a major issue, as the alternative is of course to just use date_add in an expr statement; I just wondered what the usability is being traded off for. I'm happy to contribute a PR if this is something that would be worthwhile pursuing.
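For illustration, the coercion the reporter describes can be sketched in plain Python. Note this uses stand-in Column and lit definitions, not the real pyspark.sql classes; it only shows the pattern of wrapping a non-Column second argument with lit, as the Scala functions.date_add does, so the function accepts either an int or a column.

```python
class Column:
    """Illustrative stand-in for pyspark.sql.Column (not the real class)."""
    def __init__(self, expr):
        self.expr = expr

    def __repr__(self):
        return f"Column<{self.expr}>"


def lit(value):
    """Illustrative stand-in for pyspark.sql.functions.lit."""
    return Column(str(value))


def date_add(start, days):
    """Accept an int or a Column for `days`, mirroring the Scala API.

    The Scala functions.date_add simply wraps its second argument with
    lit(); doing the equivalent here lets callers pass either form.
    """
    if not isinstance(days, Column):
        days = lit(days)  # wrap plain values, as lit(days) does in Scala
    return Column(f"date_add({start.expr}, {days.expr})")


print(date_add(Column("d"), 7))          # int is wrapped via lit
print(date_add(Column("d"), Column("n")))  # Column passes through unchanged
```

With this kind of coercion in the Python wrapper, the expr workaround mentioned above becomes unnecessary for the column case.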



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org