Posted to issues@spark.apache.org by "ShuMing Li (JIRA)" <ji...@apache.org> on 2019/08/14 05:27:00 UTC

[jira] [Updated] (SPARK-28724) Throw error message when cast out range decimal to long

     [ https://issues.apache.org/jira/browse/SPARK-28724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ShuMing Li updated SPARK-28724:
-------------------------------
    Description: 
This may be a bug in `Scala` when converting `BigDecimal` to `Long`; either way, Spark should return correct results for the queries below:
{code:java}
spark-sql> select cast(20190801002382000052000000017638 as int);
-1493203738

spark-sql> select cast(20190801002382000052000000017638 as bigint);
4671677505944388838

{code}
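The underlying behavior is reproducible in plain Scala: `BigDecimal.toLong` follows the JVM's `longValue()` contract, which returns only the low-order 64 bits when the value does not fit. A minimal sketch, using the literal from the report:
{code:scala}
// The JVM's narrowing conversions keep only the low-order bits when a
// BigDecimal is out of range, instead of failing.
val big = BigDecimal("20190801002382000052000000017638")

big.isValidLong // false: the value is outside Long range
big.toLong      // 4671677505944388838: low-order 64 bits, silently wrong
big.toInt       // -1493203738: low-order 32 bits, matching the cast above
{code}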
After this patch, the query throws an AnalysisException instead:
{code:java}
spark-sql> select cast(20190801002382000052000000017638 as bigint);
Error in query: Decimal 20190801002382000052000000017638 does not fit in range [-9223372036854775808, 9223372036854775807] for type Long;
{code}
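A minimal sketch of the range check behind the new message, assuming a standalone helper (`toLongChecked` is hypothetical; Spark's real logic lives in `org.apache.spark.sql.types.Decimal.toLong`):
{code:scala}
// Hypothetical helper mirroring the check described by the new error message;
// not the actual Spark implementation.
def toLongChecked(d: BigDecimal): Long = {
  val truncated = d.toBigInt // a cast drops the fractional part first
  if (truncated.isValidLong) truncated.toLong
  else throw new ArithmeticException(
    s"Decimal $d does not fit in range " +
      s"[${Long.MinValue}, ${Long.MaxValue}] for type Long")
}

toLongChecked(BigDecimal("42.9"))                             // 42
toLongChecked(BigDecimal("20190801002382000052000000017638")) // throws
{code}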
 

For `toFloat`/`toDouble`, the behavior is already reasonable:
{code:java}
spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as double);
Error in query: DecimalType can only support precision up to 38

== SQL ==
select cast(201908010023820000520000000176380000000000000000.0 as double)

spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as float);
Error in query: DecimalType can only support precision up to 38

== SQL ==
select cast(201908010023820000520000000176380000000000000000.0 as float)
{code}
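Note that these are two distinct error paths: the 48-digit literal here exceeds `DecimalType`'s maximum precision of 38 and is rejected during analysis, whereas the 32-digit literal above parses fine as a decimal and only goes wrong during conversion. Floating-point conversion is also inherently more forgiving, losing precision instead of wrapping:
{code:scala}
val big = BigDecimal("20190801002382000052000000017638")

big.toDouble // ~2.0190801002382E31: approximate, but the right magnitude
big.toLong   // 4671677505944388838: wrapped to an unrelated value
{code}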
 

 

  was:
This may be a bug in `Scala` when converting `BigDecimal` to `Long`; either way, Spark should return correct results for the queries below:
{code:java}
spark-sql> select cast(20190801002382000052000000017638 as int);
-1493203738

spark-sql> select cast(20190801002382000052000000017638 as bigint);
4671677505944388838

{code}
After this patch, the result is:
{code:java}
spark-sql> select cast(20190801002382000052000000017638 as bigint);

java.lang.ArithmeticException: Decimal 20190801002382000052000000017638 does not fit in range [-9223372036854775808, 9223372036854775807] for type Long
at org.apache.spark.sql.types.Decimal.toLong(Decimal.scala:219)
{code}
 

For `toFloat`/`toDouble`, the behavior is already reasonable:
{code:java}
spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as double);
Error in query: DecimalType can only support precision up to 38

== SQL ==
select cast(201908010023820000520000000176380000000000000000.0 as double)

spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as float);
Error in query: DecimalType can only support precision up to 38

== SQL ==
select cast(201908010023820000520000000176380000000000000000.0 as float)
{code}
 

 


> Throw error message when cast out range decimal to long
> -------------------------------------------------------
>
>                 Key: SPARK-28724
>                 URL: https://issues.apache.org/jira/browse/SPARK-28724
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: ShuMing Li
>            Priority: Minor
>
> This may be a bug in `Scala` when converting `BigDecimal` to `Long`; either way, Spark should return correct results for the queries below:
> {code:java}
> spark-sql> select cast(20190801002382000052000000017638 as int);
> -1493203738
> spark-sql> select cast(20190801002382000052000000017638 as bigint);
> 4671677505944388838
> {code}
> After this patch, the query throws an AnalysisException instead:
> {code:java}
> spark-sql> select cast(20190801002382000052000000017638 as bigint);
> Error in query: Decimal 20190801002382000052000000017638 does not fit in range [-9223372036854775808, 9223372036854775807] for type Long;
> {code}
>  
> For `toFloat`/`toDouble`, the behavior is already reasonable:
> {code:java}
> spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as double);
> Error in query: DecimalType can only support precision up to 38
>
> == SQL ==
> select cast(201908010023820000520000000176380000000000000000.0 as double)
>
> spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as float);
> Error in query: DecimalType can only support precision up to 38
>
> == SQL ==
> select cast(201908010023820000520000000176380000000000000000.0 as float)
> {code}
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
