Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/12/26 03:51:00 UTC

[jira] [Commented] (SPARK-37738) PySpark date_add only accepts an integer as its second parameter

    [ https://issues.apache.org/jira/browse/SPARK-37738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17465292#comment-17465292 ] 

Hyukjin Kwon commented on SPARK-37738:
--------------------------------------

We should make it support columns too if it currently only takes ints. Feel free to create a PR.
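As a rough illustration of the kind of column-accepting wrapper being suggested (not an actual Spark patch), a user-level sketch could mirror what the Scala side already does: wrap a plain int in lit() and call the existing date_add(Column, Column) overload. It relies on PySpark's internal _to_java_column helper, which is an implementation detail rather than a public API, so treat it as a sketch only:

    from pyspark import SparkContext
    from pyspark.sql import functions as F
    from pyspark.sql.column import Column, _to_java_column

    def date_add_any(start, days):
        # Mirror the Scala behaviour: a plain int becomes lit(days), then the
        # JVM functions.date_add(Column, Column) overload is invoked directly.
        days = F.lit(days) if isinstance(days, int) else days
        sc = SparkContext._active_spark_context
        return Column(sc._jvm.functions.date_add(_to_java_column(start),
                                                 _to_java_column(days)))

With something like this, date_add_any(F.col("start_date"), F.col("n_days")) would behave the same as the expr() workaround mentioned in the issue description below.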

> PySpark date_add only accepts an integer as its second parameter
> -----------------------------------------------------------------
>
>                 Key: SPARK-37738
>                 URL: https://issues.apache.org/jira/browse/SPARK-37738
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, Spark Core
>    Affects Versions: 3.2.0
>            Reporter: Daniel Davies
>            Priority: Minor
>
> Hello,
> I have a quick question regarding the PySpark date_add function (and its related functions, I guess). Using date_add as an example, the PySpark API takes [a column, and an int as its second parameter|#L2203].
> This feels a bit weird, since the underlying SQL expression can take a column as the second parameter as well; in fact, to my limited understanding, the Scala [API itself|https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L3114] just calls lit on this second parameter anyway. Is there a reason date_add doesn't support a column type as the second parameter in PySpark?
> This isn't a major issue, as the alternative is of course to just use date_add in an expr statement; I just wondered what the usability is being traded off for. I'm happy to contribute a PR if this is something that would be worthwhile pursuing.
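
For reference, a minimal sketch of the behaviour described above (the DataFrame and column names here are made up for illustration): a literal int works with pyspark.sql.functions.date_add, while the expr() form is the current way to supply the day count from a column:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([("2021-12-01", 3)], ["start_date", "n_days"])

    # Works today: the second argument is a Python int
    df.select(F.date_add(F.to_date("start_date"), 1).alias("plus_one")).show()

    # The Python wrapper does not accept a Column here, so the workaround is
    # the SQL expression form, which takes a column for the second argument
    df.select(F.expr("date_add(to_date(start_date), n_days)").alias("plus_n")).show()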



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org