Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2021/03/08 09:03:00 UTC

[jira] [Resolved] (SPARK-34598) RewritePredicateSubquery Rule must not update Filters without subqueries

     [ https://issues.apache.org/jira/browse/SPARK-34598?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-34598.
---------------------------------
    Fix Version/s: 3.2.0
       Resolution: Fixed

Issue resolved by pull request 31712
[https://github.com/apache/spark/pull/31712]

> RewritePredicateSubquery Rule must not update Filters without subqueries
> ------------------------------------------------------------------------
>
>                 Key: SPARK-34598
>                 URL: https://issues.apache.org/jira/browse/SPARK-34598
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.1
>            Reporter: Swinky Mann
>            Assignee: Swinky Mann
>            Priority: Minor
>             Fix For: 3.2.0
>
>
> 1. Currently, the RewritePredicateSubquery rule also rewrites Filter nodes for queries that contain no subquery at all. This should not happen.
> 2. In addition, `Filter(conditions.reduce(And), child)` in the rule can produce a skewed (left-deep) expression tree even when the original expression is balanced, as the plan diff and the sketch below illustrate.
>  
> {noformat}
> === Applying Rule org.apache.spark.sql.catalyst.optimizer.RewritePredicateSubquery ===
>  Project [a#0]                                                        Project [a#0]
> !+- Filter (((a#0 > 1) OR (b#1 > 2)) AND ((c#2 > 1) AND (d#3 > 2)))   +- Filter ((((a#0 > 1) OR (b#1 > 2)) AND (c#2 > 1)) AND (d#3 > 2))
>     +- LocalRelation <empty>, [a#0, b#1, c#2, d#3]                       +- LocalRelation <empty>, [a#0, b#1, c#2, d#3]
> {noformat}
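
A minimal standalone sketch of point 2, using toy Expr classes rather than the actual Catalyst Expression classes (reduceBalanced is an illustrative helper, not the change from PR 31712): a left fold such as conditions.reduce(And) nests each new condition under the previous AND, so the tree depth grows linearly with the number of conjuncts, while pairwise reduction keeps the depth logarithmic.

{code:scala}
// Toy expression tree for illustration only; not the Catalyst Expression classes.
object SkewSketch {
  sealed trait Expr
  case class Leaf(name: String) extends Expr
  case class And(left: Expr, right: Expr) extends Expr

  // Tree depth: a single leaf counts as 1.
  def depth(e: Expr): Int = e match {
    case And(l, r) => 1 + math.max(depth(l), depth(r))
    case _         => 1
  }

  // Pairwise (balanced) reduction: AND neighbouring conditions until one remains.
  // Hypothetical helper shown for comparison; not the fix made in the actual PR.
  def reduceBalanced(conds: Seq[Expr]): Expr =
    if (conds.size == 1) conds.head
    else reduceBalanced(conds.grouped(2).map {
      case Seq(a, b) => And(a, b)
      case Seq(a)    => a
    }.toSeq)

  def main(args: Array[String]): Unit = {
    val conds: Seq[Expr] = (1 to 8).map(i => Leaf(s"c$i"))

    val skewed   = conds.reduce(And(_, _)) // left fold: (((c1 AND c2) AND c3) AND ...)
    val balanced = reduceBalanced(conds)   // pairwise:  ((c1 AND c2) AND (c3 AND c4)) AND ...

    println(s"left-fold depth: ${depth(skewed)}")   // prints 8
    println(s"balanced depth:  ${depth(balanced)}") // prints 4
  }
}
{code}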



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org