Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/09/12 09:15:01 UTC

[jira] [Assigned] (SPARK-21979) Improve QueryPlanConstraints framework

     [ https://issues.apache.org/jira/browse/SPARK-21979?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-21979:
------------------------------------

    Assignee:     (was: Apache Spark)

> Improve QueryPlanConstraints framework
> --------------------------------------
>
>                 Key: SPARK-21979
>                 URL: https://issues.apache.org/jira/browse/SPARK-21979
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Gengliang Wang
>            Priority: Critical
>
> Improve QueryPlanConstraints framework, make it robust and simple.
> In apache/spark#15319, constraints for expressions like a = f(b, c) are resolved.
> However, for expressions like
> a = f(b, c) && c = g(a, b)
> The current QueryPlanConstraints framework will produce non-converging constraints.
> Essentially, the problem is caused by having both the name and the child of an alias in the same constraint set: we infer constraints, push them down as predicates in filters, and later those predicates are propagated as constraints again, and so on.
> Using only the alias names resolves this. The constraint set shrinks without losing any information, because the inferred constraints on an alias's child can always be recovered when pushing down filters.
> Also, the EqualNullSafe constraint between an alias's name and its child, added when propagating aliases,
> allConstraints += EqualNullSafe(e, a.toAttribute)
> is meaningless: it just produces redundant constraints.
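The non-convergence described above can be sketched with a toy model (hypothetical names, not Spark's actual Catalyst classes): constraints and alias definitions are modeled as plain strings, and one "inference round" expands every alias name occurring in a constraint. Because a's definition mentions c and c's definition mentions a, each round reintroduces names that the next round expands again, so the constraint set never reaches a fixed point.

```scala
object ConstraintBlowup {
  // Toy alias map mirroring the ticket's example: a = f(b, c) and c = g(a, b).
  val aliases: Map[String, String] = Map("a" -> "f(b, c)", "c" -> "g(a, b)")

  // One inference round: substitute each alias's definition for its name.
  // The mutual reference between a and c means every round grows the
  // expression instead of converging.
  def expand(expr: String): String =
    aliases.foldLeft(expr) { case (e, (name, defn)) =>
      e.replaceAll("\\b" + name + "\\b", defn)
    }

  def main(args: Array[String]): Unit = {
    var constraint = "a = f(b, c)"
    for (round <- 1 to 4) {
      constraint = expand(constraint)
      println(s"round $round: ${constraint.length} chars")
    }
  }
}
```

Running this shows the constraint strictly growing every round, which is the divergence the ticket attributes to keeping both the alias name and its child expression in the same constraint set.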



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org