Posted to issues@spark.apache.org by "Yuming Wang (Jira)" <ji...@apache.org> on 2022/05/13 04:44:00 UTC
[jira] [Updated] (SPARK-39173) The error message is different if broadcast join is disabled
[ https://issues.apache.org/jira/browse/SPARK-39173?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang updated SPARK-39173:
--------------------------------
Description:
How to reproduce this issue:
{code:scala}
Seq(-1, 1000000000L).foreach { broadcastThreshold =>
  withSQLConf(
    SQLConf.AUTO_BROADCASTJOIN_THRESHOLD.key -> broadcastThreshold.toString,
    SQLConf.ANSI_ENABLED.key -> "true") {
    val df = sql(
      """
        |SELECT
        | item.i_brand_id brand_id,
        | avg(ss_ext_sales_price) avg_agg
        |FROM store_sales, item
        |WHERE store_sales.ss_item_sk = item.i_item_sk
        |GROUP BY item.i_brand_id
      """.stripMargin)
    df.collect()
  }
}
{code}
Both runs fail, but with different errors:
{noformat}
Error message: org.apache.spark.SparkArithmeticException: [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded,999999999999999999999999999999999.28175,38,5}) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to false to bypass this error.
Error message: org.apache.spark.SparkArithmeticException: [ARITHMETIC_OVERFLOW] Overflow in sum of decimals. If necessary set spark.sql.ansi.enabled to false (except for ANSI interval type) to bypass this error.
{noformat}
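Note that {{withSQLConf}} and {{sql}} above come from Spark's SQL test harness (SQLTestUtils / SharedSparkSession), and {{store_sales}}/{{item}} are TPC-DS tables. A minimal standalone sketch of the same repro, assuming a local SparkSession and that those tables are already registered in the session catalog:
{code:scala}
import org.apache.spark.sql.SparkSession

// Sketch only: assumes TPC-DS-style `store_sales` and `item` tables
// are already registered in the session catalog.
val spark = SparkSession.builder().master("local[*]").getOrCreate()

Seq("-1", "1000000000").foreach { broadcastThreshold =>
  // -1 disables broadcast joins; a large threshold keeps the broadcast plan.
  spark.conf.set("spark.sql.autoBroadcastJoinThreshold", broadcastThreshold)
  spark.conf.set("spark.sql.ansi.enabled", "true")
  try {
    spark.sql(
      """
        |SELECT
        | item.i_brand_id brand_id,
        | avg(ss_ext_sales_price) avg_agg
        |FROM store_sales, item
        |WHERE store_sales.ss_item_sk = item.i_item_sk
        |GROUP BY item.i_brand_id
      """.stripMargin).collect()
  } catch {
    case e: Exception =>
      println(s"threshold=$broadcastThreshold => ${e.getMessage}")
  }
}
{code}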
was:
How to reproduce this issue:
{code:scala}
Seq(-1, 1000000000L).foreach { broadcastThreshold =>
  withSQLConf(
    SQLConf.AUTO_BROADCASTJOIN_THRESHOLD.key -> broadcastThreshold.toString,
    SQLConf.ANSI_ENABLED.key -> "true") {
    val df = sql(
      """
        |SELECT
        | item.i_brand_id brand_id,
        | avg(ss_ext_sales_price) avg_agg
        |FROM store_sales, item
        |WHERE store_sales.ss_item_sk = item.i_item_sk
        |GROUP BY item.i_brand_id
      """.stripMargin)
    df.collect()
  }
}
{code}
{noformat}
Error message: Job aborted due to stage failure: Task 0 in stage 10.0 failed 1 times, most recent failure: Lost task 0.0 in stage 10.0 (TID 9) (localhost executor driver): org.apache.spark.SparkArithmeticException: [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded,999999999999999999999999999999999.28175,38,5}) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to false to bypass this error.
Error message: Job aborted due to stage failure: Task 0 in stage 14.0 failed 1 times, most recent failure: Lost task 0.0 in stage 14.0 (TID 14) (localhost executor driver): org.apache.spark.SparkArithmeticException: [ARITHMETIC_OVERFLOW] Overflow in sum of decimals. If necessary set spark.sql.ansi.enabled to false (except for ANSI interval type) to bypass this error.
{noformat}
> The error message is different if broadcast join is disabled
> -------------------------------------------------------------
>
> Key: SPARK-39173
> URL: https://issues.apache.org/jira/browse/SPARK-39173
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Yuming Wang
> Priority: Major
>
> How to reproduce this issue:
> {code:scala}
> Seq(-1, 1000000000L).foreach { broadcastThreshold =>
>   withSQLConf(
>     SQLConf.AUTO_BROADCASTJOIN_THRESHOLD.key -> broadcastThreshold.toString,
>     SQLConf.ANSI_ENABLED.key -> "true") {
>     val df = sql(
>       """
>         |SELECT
>         | item.i_brand_id brand_id,
>         | avg(ss_ext_sales_price) avg_agg
>         |FROM store_sales, item
>         |WHERE store_sales.ss_item_sk = item.i_item_sk
>         |GROUP BY item.i_brand_id
>       """.stripMargin)
>     df.collect()
>   }
> }
> {code}
> {noformat}
> Error message: org.apache.spark.SparkArithmeticException: [CANNOT_CHANGE_DECIMAL_PRECISION] Decimal(expanded,999999999999999999999999999999999.28175,38,5}) cannot be represented as Decimal(38, 6). If necessary set "spark.sql.ansi.enabled" to false to bypass this error.
> Error message: org.apache.spark.SparkArithmeticException: [ARITHMETIC_OVERFLOW] Overflow in sum of decimals. If necessary set spark.sql.ansi.enabled to false (except for ANSI interval type) to bypass this error.
> {noformat}
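Beyond the message text, the two failures also carry different error classes. A small sketch, assuming Spark 3.4's SparkThrowable API, that surfaces the error class rather than the full message:
{code:scala}
import org.apache.spark.SparkThrowable

// Sketch only: extract the error class from a failing action, assuming the
// thrown exception implements SparkThrowable (as SparkArithmeticException
// does); any other exception type simply propagates.
def errorClassOf(action: => Unit): String =
  try { action; "NO_ERROR" }
  catch { case e: SparkThrowable => e.getErrorClass }

// Per the report:
//   one plan fails with CANNOT_CHANGE_DECIMAL_PRECISION,
//   the other with ARITHMETIC_OVERFLOW.
{code}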