Posted to issues@spark.apache.org by "Takeshi Yamamuro (JIRA)" <ji...@apache.org> on 2018/07/02 06:58:00 UTC
[jira] [Commented] (SPARK-24703) Unable to multiply calendar interval with long/int
[ https://issues.apache.org/jira/browse/SPARK-24703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16529456#comment-16529456 ]
Takeshi Yamamuro commented on SPARK-24703:
------------------------------------------
yea, I've noticed that the SQL standard supports the syntax: http://download.mimer.com/pub/developer/docs/html_100/Mimer_SQL_Engine_DocSet/Syntax_Rules4.html#wp1113535
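For illustration, the grammar at that link lets a numeric factor appear on either side of an interval value expression, so under the standard both of the following forms would be legal (a sketch of the standard syntax only; Spark 2.3.1 rejects the second form, as the report below shows):

```sql
-- SQL-standard interval arithmetic (per the grammar linked above):
SELECT INTERVAL '1' DAY * 3;
SELECT 3 * INTERVAL '1' DAY;  -- the commuted form the reporter tried in Spark
```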
> Unable to multiply calendar interval with long/int
> --------------------------------------------------
>
> Key: SPARK-24703
> URL: https://issues.apache.org/jira/browse/SPARK-24703
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: Priyanka Garg
> Priority: Major
>
> When I try to multiply a calendar interval by a long/int, I get the error below. The same syntax is supported in Postgres.
> spark.sql("select 3 * interval '1' day").show()
> org.apache.spark.sql.AnalysisException: cannot resolve '(3 * interval 1 days)' due to data type mismatch: differing types in '(3 * interval 1 days)' (int and calendarinterval).; line 1 pos 7;
> 'Project [unresolvedalias((3 * interval 1 days), None)]
> +- OneRowRelation
>
> at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
> at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:93)
> at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85)
> at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
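When the multiplier is a constant known at query-writing time, a workaround is to fold the factor into the interval literal itself, which the 2.3.1 parser does resolve (a hedged sketch; this does not help when the factor is a column or a runtime value):

```sql
-- Workaround for a literal factor: write the product directly in the
-- interval literal instead of multiplying at runtime.
SELECT INTERVAL 3 DAYS;   -- in place of 3 * INTERVAL '1' DAY
```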
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)