Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/04/30 06:18:00 UTC
[jira] [Assigned] (SPARK-35273) CombineFilters support non-deterministic expressions
[ https://issues.apache.org/jira/browse/SPARK-35273?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-35273:
------------------------------------
Assignee: Apache Spark
> CombineFilters support non-deterministic expressions
> ----------------------------------------------------
>
> Key: SPARK-35273
> URL: https://issues.apache.org/jira/browse/SPARK-35273
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.2.0
> Reporter: Yuming Wang
> Assignee: Apache Spark
> Priority: Major
>
> For example, the deterministic NOT IN filter and the filter containing the non-deterministic rand() are not combined:
> {code:scala}
> spark.sql("create table t1(id int) using parquet")
> spark.sql("select * from (select * from t1 where id not in (1, 3, 6)) t where id = 7 and rand() <= 0.01").explain("cost")
> {code}
> Current:
> {noformat}
> == Optimized Logical Plan ==
> Filter (isnotnull(id#0) AND ((id#0 = 7) AND (rand(-639771619343876662) <= 0.01))), Statistics(sizeInBytes=1.0 B)
> +- Filter NOT id#0 IN (1,3,6), Statistics(sizeInBytes=1.0 B)
> +- Relation default.t1[id#0] parquet, Statistics(sizeInBytes=0.0 B)
> {noformat}
> Expected:
> {noformat}
> == Optimized Logical Plan ==
> Filter (isnotnull(id#0) AND (NOT id#0 IN (1,3,6) AND ((id#0 = 7) AND (rand(-1485510186481201685) <= 0.01)))), Statistics(sizeInBytes=1.0 B)
> +- Relation default.t1[id#0] parquet, Statistics(sizeInBytes=0.0 B)
> {noformat}
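> A minimal sketch, not Spark's actual implementation, of how two adjacent Filter nodes could be merged while keeping the deterministic lower predicate first, so that the non-deterministic rand() is still evaluated only on rows that already pass the deterministic predicate. The rule name is illustrative:
> {code:scala}
> // Sketch only; assumes spark-catalyst on the classpath.
> import org.apache.spark.sql.catalyst.expressions.And
> import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan}
> import org.apache.spark.sql.catalyst.rules.Rule
>
> object CombineFiltersSketch extends Rule[LogicalPlan] {
>   def apply(plan: LogicalPlan): LogicalPlan = plan transformUp {
>     // The upper condition may contain non-deterministic expressions such as rand();
>     // only the lower condition is required to be deterministic.
>     case Filter(upperCond, Filter(lowerCond, grandChild)) if lowerCond.deterministic =>
>       Filter(And(lowerCond, upperCond), grandChild)
>   }
> }
> {code}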
> Another example, where the deterministic filter comes from a view:
> {code:scala}
> spark.sql("create table t1(id int) using parquet")
> spark.sql("create view v1 as select * from t1 where id not in (1, 3, 6)")
> spark.sql("select * from v1 where id = 7 and rand() <= 0.01").explain("cost")
> {code}
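> One way to check the behaviour, assuming the table and view above exist, is to inspect the optimized logical plan directly and count its Filter nodes; once the filters are combined, a single Filter above the relation is expected:
> {code:scala}
> // Sketch: count Filter nodes in the optimized logical plan of the view query.
> import org.apache.spark.sql.catalyst.plans.logical.Filter
>
> val optimized = spark.sql("select * from v1 where id = 7 and rand() <= 0.01")
>   .queryExecution.optimizedPlan
> val filterCount = optimized.collect { case f: Filter => f }.size
> println(s"Filter nodes in optimized plan: $filterCount")
> {code}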