Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2022/04/29 04:38:00 UTC
[jira] [Resolved] (SPARK-39040) Respect NaNvl in EquivalentExpressions for expression elimination
[ https://issues.apache.org/jira/browse/SPARK-39040?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-39040.
---------------------------------
Fix Version/s: 3.3.0
Resolution: Fixed
Issue resolved by pull request 36376
[https://github.com/apache/spark/pull/36376]
> Respect NaNvl in EquivalentExpressions for expression elimination
> -----------------------------------------------------------------
>
> Key: SPARK-39040
> URL: https://issues.apache.org/jira/browse/SPARK-39040
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: XiDuo You
> Priority: Major
> Fix For: 3.3.0
>
>
> For example the query will fail:
> {code:java}
> set spark.sql.ansi.enabled=true;
> set spark.sql.optimizer.excludedRules=org.apache.spark.sql.catalyst.optimizer.ConstantFolding;
> SELECT nanvl(1, 1/0 + 1/0);
> {code}
> {code:java}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 4.0 failed 1 times, most recent failure: Lost task 0.0 in stage 4.0 (TID 4) (10.221.98.68 executor driver): org.apache.spark.SparkArithmeticException: divide by zero. To return NULL instead, use 'try_divide'. If necessary set spark.sql.ansi.enabled to false (except for ANSI interval type) to bypass this error.
> == SQL(line 1, position 17) ==
> select nanvl(1 , 1/0 + 1/0)
> ^^^ at org.apache.spark.sql.errors.QueryExecutionErrors$.divideByZeroError(QueryExecutionErrors.scala:151)
> {code}
> We should respect the evaluation order of conditional expressions, which always evaluate the predicate branch first and only evaluate the other branches when needed, so the query above should not fail.
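> A minimal sketch of the idea (not the actual Spark EquivalentExpressions code; the expression classes below are hypothetical stand-ins): when collecting candidate subexpressions, only recurse into children that are always evaluated, so the conditionally evaluated second argument of NaNvl (here {{1/0 + 1/0}}) is never hoisted and evaluated eagerly.
> {code:java}
> // Simplified, self-contained Scala sketch, assuming a toy expression tree.
> sealed trait Expr { def children: Seq[Expr] }
> case class Literal(v: Double) extends Expr { def children: Seq[Expr] = Nil }
> case class Divide(l: Expr, r: Expr) extends Expr { def children: Seq[Expr] = Seq(l, r) }
> case class Add(l: Expr, r: Expr) extends Expr { def children: Seq[Expr] = Seq(l, r) }
> case class NaNvl(value: Expr, fallback: Expr) extends Expr { def children: Seq[Expr] = Seq(value, fallback) }
>
> object SubexprCollector {
>   // Only unconditionally evaluated children are candidates for common
>   // subexpression elimination. For NaNvl that is just the first child;
>   // the fallback runs only when the first child is NaN.
>   def alwaysEvaluatedChildren(e: Expr): Seq[Expr] = e match {
>     case NaNvl(value, _) => Seq(value)
>     case other           => other.children
>   }
>
>   def collect(e: Expr, acc: scala.collection.mutable.Map[Expr, Int]): Unit = {
>     acc.update(e, acc.getOrElse(e, 0) + 1)
>     alwaysEvaluatedChildren(e).foreach(collect(_, acc))
>   }
> }
>
> object Demo extends App {
>   val risky = Add(Divide(Literal(1), Literal(0)), Divide(Literal(1), Literal(0)))
>   val expr  = NaNvl(Literal(1), risky)
>   val counts = scala.collection.mutable.Map.empty[Expr, Int]
>   SubexprCollector.collect(expr, counts)
>   // The divide-by-zero subtree was never visited, so it would not be
>   // extracted and evaluated eagerly.
>   assert(!counts.contains(risky))
> }
> {code}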
--
This message was sent by Atlassian Jira
(v8.20.7#820007)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org