Posted to issues@spark.apache.org by "Daniel Mateus Pires (JIRA)" <ji...@apache.org> on 2018/07/03 14:50:00 UTC
[jira] [Issue Comment Deleted] (SPARK-24702) Unable to cast to calendar interval in spark sql.
[ https://issues.apache.org/jira/browse/SPARK-24702?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Mateus Pires updated SPARK-24702:
----------------------------------------
Comment: was deleted
(was: https://github.com/apache/spark/pull/21706)
> Unable to cast to calendar interval in spark sql.
> -------------------------------------------------
>
> Key: SPARK-24702
> URL: https://issues.apache.org/jira/browse/SPARK-24702
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: Priyanka Garg
> Priority: Major
>
> When I try to cast a string type to calendar interval type, I get the following error:
> spark.sql("select cast(cast(interval '1' day as string) as calendarinterval)").show()
> ------------------------------------------------^^^
>
> at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitPrimitiveDataType$1.apply(AstBuilder.scala:1673)
> at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitPrimitiveDataType$1.apply(AstBuilder.scala:1651)
> at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:108)
> at org.apache.spark.sql.catalyst.parser.AstBuilder.visitPrimitiveDataType(AstBuilder.scala:1651)
> at org.apache.spark.sql.catalyst.parser.AstBuilder.visitPrimitiveDataType(AstBuilder.scala:49)
> at org.apache.spark.sql.catalyst.parser.SqlBaseParser$PrimitiveDataTypeContext.accept(SqlBaseParser.java:13779)
> at org.apache.spark.sql.catalyst.parser.AstBuilder.typedVisit(AstBuilder.scala:55)
> at org.apache.spark.sql.catalyst.parser.AstBuilder.org$apache$spark$sql$catalyst$parser$AstBuilder$$visitSparkDataType(AstBuilde
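The stack trace ends in AstBuilder.visitPrimitiveDataType, which suggests the SQL parser itself rejects `calendarinterval` as a cast target in Spark 2.3, so the failure happens before any cast logic runs. A common workaround is to use interval literals directly (e.g. `SELECT interval '1' day`) instead of casting a string. To illustrate what such a string-to-interval cast would have to do, here is a minimal sketch in plain Python; `parse_interval` is a hypothetical helper, not a Spark API:

```python
import re
from datetime import timedelta

# Map singular SQL unit names to timedelta keyword arguments.
_UNITS = {"day": "days", "hour": "hours", "minute": "minutes", "second": "seconds"}

def parse_interval(text):
    """Parse a simple "interval 'N' unit" string into a timedelta.

    Hypothetical helper for illustration only: it mimics the kind of
    parsing a string-to-CalendarInterval cast would need, restricted
    to single-unit day/hour/minute/second intervals.
    """
    m = re.fullmatch(r"interval\s+'(\d+)'\s+(day|hour|minute|second)s?",
                     text.strip(), flags=re.IGNORECASE)
    if m is None:
        raise ValueError(f"cannot parse interval string: {text!r}")
    value, unit = int(m.group(1)), m.group(2).lower()
    return timedelta(**{_UNITS[unit]: value})

print(parse_interval("interval '1' day"))  # 1 day, 0:00:00
```

Spark's real CalendarInterval also tracks months separately (since a month has no fixed length in days), which is one reason a general-purpose cast is harder than this sketch suggests.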
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org