Posted to issues@spark.apache.org by "kevinshin (Jira)" <ji...@apache.org> on 2023/02/17 06:28:00 UTC

[jira] [Created] (SPARK-42473) An explicit cast will be needed when INSERT OVERWRITE SELECT UNION ALL

kevinshin created SPARK-42473:
---------------------------------

             Summary: An explicit cast will be needed when INSERT OVERWRITE SELECT UNION ALL
                 Key: SPARK-42473
                 URL: https://issues.apache.org/jira/browse/SPARK-42473
             Project: Spark
          Issue Type: Bug
          Components: Optimizer
    Affects Versions: 3.3.1
         Environment: spark 3.3.1
            Reporter: kevinshin


When a UNION ALL combines one SELECT that uses a literal as a column value with another SELECT that has a computed expression in the same column position, the whole statement fails to compile. An explicit cast is needed.
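For context, a sketch of DDL for the target table (the schema is an assumption inferred from the query below, not taken from the report):

```sql
-- Hypothetical target table matching the INSERT below; column types are assumed.
CREATE TABLE test.spark33_decimal_orc (
  amt1 DECIMAL(20,8),
  amt2 DECIMAL(20,8)
) STORED AS ORC;
```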

for example:

explain
INSERT OVERWRITE TABLE test.spark33_decimal_orc
select null as amt1, cast('256.99' as decimal(20,8)) as amt2
union all
select cast('200.99' as decimal(20,8))/100 as amt1, cast('256.99' as decimal(20,8)) as amt2;

will fail with the error:

org.apache.spark.sql.catalyst.expressions.Literal cannot be cast to org.apache.spark.sql.catalyst.expressions.AnsiCast

The SQL needs to be changed to:

explain
INSERT OVERWRITE TABLE test.spark33_decimal_orc
select null as amt1, cast('256.99' as decimal(20,8)) as amt2
union all
select cast(cast('200.99' as decimal(20,8))/100 as decimal(20,8)) as amt1, cast('256.99' as decimal(20,8)) as amt2;

 

But this is not needed in Spark 3.2.1; is this a bug in Spark 3.3.1?



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org