Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/09/12 20:04:01 UTC

[jira] [Resolved] (SPARK-21979) Improve QueryPlanConstraints framework

     [ https://issues.apache.org/jira/browse/SPARK-21979?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li resolved SPARK-21979.
-----------------------------
       Resolution: Fixed
         Assignee: Gengliang Wang
    Fix Version/s: 2.3.0

> Improve QueryPlanConstraints framework
> --------------------------------------
>
>                 Key: SPARK-21979
>                 URL: https://issues.apache.org/jira/browse/SPARK-21979
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Critical
>             Fix For: 2.3.0
>
>
> Improve the QueryPlanConstraints framework, making it robust and simple.
> In apache/spark#15319, constraints for expressions like a = f(b, c) are resolved.
> However, for expressions like
> a = f(b, c) && c = g(a, b)
> The current QueryPlanConstraints framework will produce non-converging constraints.
> Essentially, the problem is caused by keeping both the name and the child of an alias in the same constraint set: we infer constraints, push them down as predicates in filters, those predicates are later propagated as constraints again, and so on.
> Using only the alias names resolves the problem. The size of the constraint set shrinks without losing any information: the inferred constraints on the child of an alias can always be recovered when pushing down filters.
> Also, the EqualNullSafe constraint between an alias's name and its child, added when propagating aliases,
> allConstraints += EqualNullSafe(e, a.toAttribute)
> is meaningless: it only produces redundant constraints.
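To illustrate why the constraint set fails to converge, here is a minimal, self-contained Scala sketch. This is hypothetical toy code, not Spark's actual Expression or QueryPlanConstraints classes: the aliases a = f(b, c) and c = g(a, b) are modeled as plain string rewrites, and each inference round strictly enlarges the constraint, so a fixed point is never reached.

```scala
// Toy model (NOT Spark internals) of constraint inference that tracks both
// an alias name and its child expression. Because a and c are defined in
// terms of each other, every substitution round grows the constraint.
object ConstraintSketch {
  // alias name -> child expression, as in: SELECT f(b, c) AS a, g(a, b) AS c
  val aliases: Map[String, String] = Map("a" -> "f(b, c)", "c" -> "g(a, b)")

  // One inference round: rewrite every alias name occurring in `expr`
  // to that alias's child expression.
  def substitute(expr: String): String =
    aliases.foldLeft(expr) { case (e, (name, child)) =>
      e.replaceAll(s"\\b$name\\b", child)
    }

  def main(args: Array[String]): Unit = {
    var constraint = "a = f(b, c)"
    for (round <- 1 to 3) {
      constraint = substitute(constraint)
      // The constraint strictly grows each round; it never stabilizes.
      println(s"round $round (length ${constraint.length}): $constraint")
    }
  }
}
```

Keeping only the alias names in the constraint set, as this change does, removes the substitution step entirely, so the fixed-point iteration terminates with a finite set.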



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org