Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2018/10/29 04:29:01 UTC

[jira] [Resolved] (SPARK-25797) Views created via 2.1 cannot be read via 2.2+

     [ https://issues.apache.org/jira/browse/SPARK-25797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-25797.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.3
                   2.2.3

Issue resolved by pull request 22851
[https://github.com/apache/spark/pull/22851]

> Views created via 2.1 cannot be read via 2.2+
> ---------------------------------------------
>
>                 Key: SPARK-25797
>                 URL: https://issues.apache.org/jira/browse/SPARK-25797
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0, 2.2.1, 2.2.2, 2.3.0, 2.3.1, 2.3.2
>            Reporter: Chenxiao Mao
>            Assignee: Chenxiao Mao
>            Priority: Major
>             Fix For: 2.2.3, 2.3.3
>
>
> We ran into this issue when we updated our Spark version from 2.1 to 2.3. Below is a simple example to reproduce the issue.
> Create views via Spark 2.1
> {code:sql}
> create view v1 as
> select (cast(1 as decimal(18,0)) + cast(1 as decimal(18,0))) c1;
> {code}
> Query views via Spark 2.3
> {code:sql}
> select * from v1;
> Error in query: Cannot up cast `c1` from decimal(20,0) to c1#3906: decimal(19,0) as it may truncate
> {code}
> After investigation, we found that this is because when a view is created via Spark 2.1, the expanded text is saved instead of the original text. Unfortunately, the expanded text below is buggy.
> {code:sql}
> spark-sql> desc extended v1;
> c1 decimal(19,0) NULL
> Detailed Table Information
> Database default
> Table v1
> Type VIEW
> View Text SELECT `gen_attr_0` AS `c1` FROM (SELECT (CAST(CAST(1 AS DECIMAL(18,0)) AS DECIMAL(19,0)) + CAST(CAST(1 AS DECIMAL(18,0)) AS DECIMAL(19,0))) AS `gen_attr_0`) AS gen_subquery_0
> {code}
> We can see that c1 is decimal(19,0); however, the expanded text contains decimal(19,0) + decimal(19,0), which results in decimal(20,0). Since Spark 2.2, a decimal(20,0) in the query is not allowed to be up-cast to the view definition column type decimal(19,0). ([https://github.com/apache/spark/pull/16561])
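> A quick way to see the widening, as a rough sketch (the temporary view name {{t}} is only for illustration; assumes a spark-sql session):
> {code:sql}
> -- Adding two decimal(19,0) values widens the result to decimal(20,0)
> -- to leave room for a carry digit; this is the type the analyzer then
> -- refuses to up-cast back to the view's declared decimal(19,0) column.
> create temporary view t as
> select (cast(1 as decimal(19,0)) + cast(1 as decimal(19,0))) c1;
> desc t;
> -- shows c1 with type decimal(20,0)
> {code}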
> I further tested other decimal calculations. Only addition and subtraction have this issue.
> Create views via Spark 2.1:
> {code:sql}
> create view v1 as
> select (cast(1 as decimal(18,0)) + cast(1 as decimal(18,0))) c1;
> create view v2 as
> select (cast(1 as decimal(18,0)) - cast(1 as decimal(18,0))) c1;
> create view v3 as
> select (cast(1 as decimal(18,0)) * cast(1 as decimal(18,0))) c1;
> create view v4 as
> select (cast(1 as decimal(18,0)) / cast(1 as decimal(18,0))) c1;
> create view v5 as
> select (cast(1 as decimal(18,0)) % cast(1 as decimal(18,0))) c1;
> create view v6 as
> select cast(1 as decimal(18,0)) c1
> union
> select cast(1 as decimal(19,0)) c1;
> {code}
> Query views via Spark 2.3
> {code:sql}
> select * from v1;
> Error in query: Cannot up cast `c1` from decimal(20,0) to c1#3906: decimal(19,0) as it may truncate
> select * from v2;
> Error in query: Cannot up cast `c1` from decimal(20,0) to c1#3909: decimal(19,0) as it may truncate
> select * from v3;
> 1
> select * from v4;
> 1
> select * from v5;
> 0
> select * from v6;
> 1
> {code}
> Views created via Spark 2.2+ don't have this issue because Spark 2.2+ does not generate expanded text for views (https://issues.apache.org/jira/browse/SPARK-18209).
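> A possible workaround sketch (an assumption on my part, not verified in this report): re-issuing the view definition on Spark 2.2+ stores the original SQL text instead of the buggy expanded text from 2.1.
> {code:sql}
> -- Recreate the affected view on Spark 2.2+/2.3 so the catalog keeps the
> -- original text; the view should then resolve without the up-cast error.
> create or replace view v1 as
> select (cast(1 as decimal(18,0)) + cast(1 as decimal(18,0))) c1;
> select * from v1;
> {code}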



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
