Posted to issues@spark.apache.org by "Kent Yao (Jira)" <ji...@apache.org> on 2020/08/31 09:58:00 UTC
[jira] [Commented] (SPARK-32752) Alias breaks for interval typed literals
[ https://issues.apache.org/jira/browse/SPARK-32752?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17187613#comment-17187613 ]
Kent Yao commented on SPARK-32752:
----------------------------------
These cases will be captured by the grammar rule `multiUnitsInterval`; e.g., for `interval '1 day' day`, the value is `1 day` and the unit is `day`.
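As a quick sketch of that grammar interaction (my own illustration, not from the ticket; assumes a Spark 3.0 SQL session):

{code:sql}
-- The value/unit pair list that multiUnitsInterval is meant to capture:
select interval 2 days 3 hours;

-- The case from the ticket: the same rule also matches here, so '1 day' is
-- taken as the value and the trailing identifier day as the unit, and the
-- statement fails while parsing the interval instead of aliasing the column.
select interval '1 day' day;
{code}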
> Alias breaks for interval typed literals
> ----------------------------------------
>
> Key: SPARK-32752
> URL: https://issues.apache.org/jira/browse/SPARK-32752
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.0.0, 3.1.0
> Reporter: Kent Yao
> Priority: Major
>
> Cases we found:
> {code:java}
> +-- !query
> +select interval '1 day' as day
> +-- !query schema
> +struct<>
> +-- !query output
> +org.apache.spark.sql.catalyst.parser.ParseException
> +
> +no viable alternative at input 'as'(line 1, pos 24)
> +
> +== SQL ==
> +select interval '1 day' as day
> +------------------------^^^
> +
> +
> +-- !query
> +select interval '1 day' day
> +-- !query schema
> +struct<>
> +-- !query output
> +org.apache.spark.sql.catalyst.parser.ParseException
> +
> +Error parsing ' 1 day day' to interval, unrecognized number 'day'(line 1, pos 16)
> +
> +== SQL ==
> +select interval '1 day' day
> +----------------^^^
> +
> +
> +-- !query
> +select interval '1-2' year as y
> +-- !query schema
> +struct<>
> +-- !query output
> +org.apache.spark.sql.catalyst.parser.ParseException
> +
> +Error parsing ' 1-2 year' to interval, invalid value '1-2'(line 1, pos 16)
> +
> +== SQL ==
> +select interval '1-2' year as y
> +----------------^^^
> {code}
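For contrast, the same literals written without a trailing identifier should still parse (my assumption based on the documented interval literal syntax; these are not part of the reported cases):

{code:sql}
-- Assumed-valid forms for comparison:
select interval '1 day';              -- plain string form, nothing follows the literal
select interval '1-2' year to month;  -- year-month form with explicit unit-to-unit bounds
{code}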
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org