Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/07/01 20:31:00 UTC
[jira] [Assigned] (SPARK-35955) Fix decimal overflow issues for Average
[ https://issues.apache.org/jira/browse/SPARK-35955?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-35955:
------------------------------------
Assignee: (was: Apache Spark)
> Fix decimal overflow issues for Average
> ---------------------------------------
>
> Key: SPARK-35955
> URL: https://issues.apache.org/jira/browse/SPARK-35955
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Karen Feng
> Priority: Major
>
> Fix decimal overflow issues for decimal average in ANSI mode. Linked to SPARK-32018 and SPARK-28067, which address decimal sum.
> Repro:
>
> {code:java}
> import org.apache.spark.sql.functions._
> spark.conf.set("spark.sql.ansi.enabled", true)
> val df = Seq(
>   (BigDecimal("10000000000000000000"), 1),
>   (BigDecimal("10000000000000000000"), 1),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2),
>   (BigDecimal("10000000000000000000"), 2)).toDF("decNum", "intNum")
> val df2 = df.withColumnRenamed("decNum", "decNum2").join(df, "intNum").agg(mean("decNum"))
> df2.show(40, false)
> {code}
>
> This should throw an exception, since the underlying sum overflows, but it instead returns:
>
> {code:java}
> +-----------+
> |avg(decNum)|
> +-----------+
> |null |
> +-----------+{code}
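>
> For comparison, here is a minimal sketch (assuming the same spark-shell session as above) of the analogous sum aggregation, which is the repro from the linked SPARK-28067/SPARK-32018. With spark.sql.ansi.enabled=true, the sum is expected to fail with an ArithmeticException once the decimal(38,18) sum buffer overflows, rather than silently returning null; Average should behave consistently with it:
>
> {code:java}
> // Same joined data, aggregated with sum instead of mean.
> // Under ANSI mode this is expected to throw an ArithmeticException on
> // decimal overflow rather than return null (see SPARK-32018 / SPARK-28067).
> val dfSum = df.withColumnRenamed("decNum", "decNum2").join(df, "intNum").agg(sum("decNum"))
> dfSum.show(40, false)
> {code}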
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)