Posted to issues@spark.apache.org by "Priyanka Garg (JIRA)" <ji...@apache.org> on 2018/08/07 06:05:00 UTC

[jira] [Updated] (SPARK-24703) Unable to multiply calendar interval

     [ https://issues.apache.org/jira/browse/SPARK-24703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Priyanka Garg updated SPARK-24703:
----------------------------------
    Summary: Unable to multiply calendar interval  (was: Unable to multiply calender interval with long/int)

> Unable to multiply calendar interval
> ------------------------------------
>
>                 Key: SPARK-24703
>                 URL: https://issues.apache.org/jira/browse/SPARK-24703
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Priyanka Garg
>            Priority: Major
>
> When I try to multiply a calendar interval by a long/int, I get the error below. The same syntax is supported in Postgres.
> spark.sql("select 3 * interval '1' day").show()
> org.apache.spark.sql.AnalysisException: cannot resolve '(3 * interval 1 days)' due to data type mismatch: differing types in '(3 * interval 1 days)' (int and calendarinterval).; line 1 pos 7;
> 'Project [unresolvedalias((3 * interval 1 days), None)]
> +- OneRowRelation
>  
>   at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
>   at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:93)
>   at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85)
>   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
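
A minimal workaround sketch on Spark 2.3.x, assuming the multiplier is a constant known when the query is written (the SparkSession setup and names below are illustrative, not from the report): since the analyzer rejects int * calendarinterval, the constant multiplier can be folded into the interval literal itself.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("interval-multiply-repro")
      .master("local[*]")
      .getOrCreate()

    // Reproduction: fails at analysis time with the AnalysisException quoted above.
    // spark.sql("select 3 * interval '1' day").show()

    // Workaround: fold the constant multiplier into the interval literal,
    // which the parser accepts directly.
    spark.sql("select interval '3' day").show()

    // The resulting interval can still be combined with timestamps as usual.
    spark.sql("select current_timestamp + interval '3' day").show()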


