Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/08/08 12:53:00 UTC
[jira] [Commented] (SPARK-39896) The structural integrity of the plan is broken after UnwrapCastInBinaryComparison
[ https://issues.apache.org/jira/browse/SPARK-39896?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17576754#comment-17576754 ]
Apache Spark commented on SPARK-39896:
--------------------------------------
User 'cfmcgrady' has created a pull request for this issue:
https://github.com/apache/spark/pull/37439
> The structural integrity of the plan is broken after UnwrapCastInBinaryComparison
> ---------------------------------------------------------------------------------
>
> Key: SPARK-39896
> URL: https://issues.apache.org/jira/browse/SPARK-39896
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Yuming Wang
> Priority: Major
>
> {code:scala}
> sql("create table t1(a decimal(3, 0)) using parquet")
> sql("insert into t1 values(100), (10), (1)")
> sql("select * from t1 where a in(100000, 10, 0, 1.00)").show
> {code}
> {noformat}
> java.lang.RuntimeException: After applying rule org.apache.spark.sql.catalyst.optimizer.UnwrapCastInBinaryComparison in batch Operator Optimization before Inferring Filters, the structural integrity of the plan is broken.
> at org.apache.spark.sql.errors.QueryExecutionErrors$.structuralIntegrityIsBrokenAfterApplyingRuleError(QueryExecutionErrors.scala:1325)
> at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:229)
> {noformat}
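For context on why this query triggers the rule at all: UnwrapCastInBinaryComparison tries to compare literals against the range of the narrower column type instead of casting the column. Here `a` is decimal(3, 0), so `100000` can never match and the rule attempts to prune it, and (per the report above) the rewritten plan then fails the structural-integrity check. The following is a minimal, self-contained sketch of that representability check; `fitsDecimal` is a hypothetical helper for illustration, not Spark's actual implementation:

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

// Sketch only: returns true if `value` can be stored exactly in
// decimal(precision, scale) without losing digits on either side
// of the decimal point.
def fitsDecimal(value: JBigDecimal, precision: Int, scale: Int): Boolean =
  try {
    // RoundingMode.UNNECESSARY throws if rescaling would drop fractional digits.
    value.setScale(scale, RoundingMode.UNNECESSARY).precision <= precision
  } catch {
    case _: ArithmeticException => false
  }

// For decimal(3, 0): 100000 is out of range, while 10, 0 and 1.00 fit,
// which is why the rule rewrites the IN list from the repro above.
println(fitsDecimal(new JBigDecimal("100000"), 3, 0)) // false
println(fitsDecimal(new JBigDecimal("10"), 3, 0))     // true
println(fitsDecimal(new JBigDecimal("1.00"), 3, 0))   // true
```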
--
This message was sent by Atlassian Jira
(v8.20.10#820010)