Posted to issues@spark.apache.org by "Vitalii Li (Jira)" <ji...@apache.org> on 2022/04/28 19:52:00 UTC

[jira] [Updated] (SPARK-39060) Typo in error messages of decimal overflow

     [ https://issues.apache.org/jira/browse/SPARK-39060?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vitalii Li updated SPARK-39060:
-------------------------------
    Description: 
   org.apache.spark.SparkArithmeticException 

   Decimal(expanded,10000000000000000000000000000000000000.1,39,1}) cannot be represented as Decimal(38, 1). If necessary set spark.sql.ansi.enabled to false to bypass this error.
 

As shown in {{decimalArithmeticOperations.sql.out}}

Notice the extra closing brace ({{}}}) before ‘cannot’.

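The stray brace most likely leaks from a string interpolation that builds the Decimal debug string. Below is a minimal, self-contained Scala sketch of how such a typo ends up in the rendered message; the object and value names here are illustrative, not taken from the Spark source.

```scala
// Self-contained sketch (hypothetical names) showing how a stray '}' in an
// interpolated string leaks into the rendered error message verbatim.
object DecimalDebugStringSketch {
  def main(args: Array[String]): Unit = {
    val decimalVal = BigDecimal("10000000000000000000000000000000000000.1")
    val precision  = 39
    val scale      = 1

    // The '}' before the closing ')' is the typo the report points at:
    val buggy = s"Decimal(expanded,$decimalVal,$precision,$scale})"
    val fixed = s"Decimal(expanded,$decimalVal,$precision,$scale)"

    println(buggy) // Decimal(expanded,10000000000000000000000000000000000000.1,39,1})
    println(fixed) // Decimal(expanded,10000000000000000000000000000000000000.1,39,1)
  }
}
```
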
  was:
```
-- !query
select (5e36BD + 0.1) + 5e36BD
-- !query schema
struct<>
-- !query output
org.apache.spark.SparkArithmeticException
[CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded,10000000000000000000000000000000000000.1,39,1}) cannot be represented as Decimal(38, 1). If necessary set "spark.sql.ansi.enabled" to false to bypass this error.
== SQL(line 1, position 7) ==
select (5e36BD + 0.1) + 5e36BD
^^^^^^^^^^^^^^^^^^^^^^^
```
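For completeness, a hedged reproduction sketch for spark-shell on Spark 3.2.x; {{spark}} is the session the shell provides, and ANSI mode must be enabled, since with it off the overflow yields NULL instead of raising the exception.

```scala
// spark-shell reproduction sketch (assumes a Spark 3.2.x shell session).
spark.conf.set("spark.sql.ansi.enabled", "true")

// Triggers org.apache.spark.SparkArithmeticException; the message should
// contain the malformed "...,39,1})" text quoted above.
spark.sql("select (5e36BD + 0.1) + 5e36BD").show()
```
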

> Typo in error messages of decimal overflow
> ------------------------------------------
>
>                 Key: SPARK-39060
>                 URL: https://issues.apache.org/jira/browse/SPARK-39060
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.1
>            Reporter: Vitalii Li
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
