Posted to issues@spark.apache.org by "Brandon Bradley (JIRA)" <ji...@apache.org> on 2019/04/04 16:05:00 UTC

[jira] [Commented] (SPARK-26393) Different behaviors of date_add when calling it inside expr

    [ https://issues.apache.org/jira/browse/SPARK-26393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16810015#comment-16810015 ] 

Brandon Bradley commented on SPARK-26393:
-----------------------------------------

I'm experiencing this as well, except the error I get is `TypeError: 'Column' object is not callable`.

Where is the discussion around disallowing other types?
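For reference, the expr() form still works around it. A minimal sketch, assuming a throwaway test DataFrame with an integer `days` column (names here are hypothetical, not from the original report):

{code:python}
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1,), (5,)], ["days"])  # hypothetical test data

# Passing a Column as the second argument fails here, because the Python
# date_add() wrapper expects a plain int for the number of days:
# df.withColumn("added", F.date_add(F.to_date(F.lit('1998-9-26')), F.col('days')))

# Routing the same call through expr() works, since the SQL parser
# resolves `days` as a column reference:
df.withColumn("added", F.expr("date_add(to_date('1998-9-26'), days)")).show()
{code}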

> Different behaviors of date_add when calling it inside expr
> -----------------------------------------------------------
>
>                 Key: SPARK-26393
>                 URL: https://issues.apache.org/jira/browse/SPARK-26393
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.3.2
>            Reporter: Ahmed Kamal
>            Priority: Minor
>
> When calling date_add from pyspark.sql.functions directly, without using expr, like this:
> {code:java}
> df.withColumn("added", F.date_add(F.to_date(F.lit('1998-9-26')), F.col('days'))).toPandas(){code}
> it raises `TypeError: Column is not iterable`,
> because date_add only accepts a number, not a column, for the second argument.
> But when I use it inside an expr, like this:
> {code:java}
> df.withColumn("added", F.expr("date_add(to_date('1998-9-26'), days)")).toPandas(){code}
> it works fine.
> Shouldn't both forms behave the same way?
> I think it is logical to accept a column here as well.
> A Python notebook to demonstrate:
> [https://gist.github.com/AhmedKamal20/fec10337e815baa44f115d307e3b07eb]



