Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2021/05/31 04:58:00 UTC

[jira] [Resolved] (SPARK-35545) Split SubqueryExpression's children field into outer attributes and join conditions

     [ https://issues.apache.org/jira/browse/SPARK-35545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-35545.
---------------------------------
    Fix Version/s: 3.2.0
       Resolution: Fixed

Issue resolved by pull request 32687
[https://github.com/apache/spark/pull/32687]

> Split SubqueryExpression's children field into outer attributes and join conditions
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-35545
>                 URL: https://issues.apache.org/jira/browse/SPARK-35545
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Allison Wang
>            Assignee: Allison Wang
>            Priority: Major
>             Fix For: 3.2.0
>
>
> Currently, the children field of a subquery expression is used to store both the outer references collected from the subquery plan and the join conditions produced after correlated predicates are pulled up. For example:
> SELECT (SELECT max(c1) FROM t1 WHERE t1.c1 = t2.c1) FROM t2
> After the analysis phase:
> scalar-subquery [t2.c1]
> After PullUpCorrelatedPredicates:
> scalar-subquery [t1.c1 = t2.c1]
> The references field of a subquery expression is also confusing:
> override lazy val references: AttributeSet =
>  if (plan.resolved) super.references -- plan.outputSet else super.references 
> We should split this children field into outer attribute references and join conditions.
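The proposed split can be illustrated with a minimal, hypothetical Scala sketch. The names ScalarSubquery, outerAttrs, and joinCond follow the issue's wording; the actual Spark Catalyst classes are more involved, so this is only a model of the idea, not the real API:

```scala
// Hypothetical stand-ins for Catalyst types (assumptions, not Spark's API).
case class Attribute(name: String)
sealed trait Expression
case class OuterRef(attr: Attribute) extends Expression
case class EqualTo(left: Attribute, right: Attribute) extends Expression

// Before the change, one combined `children` sequence held both roles.
// After the split, each role gets its own explicitly named field.
case class ScalarSubquery(
    outerAttrs: Seq[Expression],  // collected outer references, e.g. t2.c1
    joinCond: Seq[Expression]) {  // pulled-up predicates, e.g. t1.c1 = t2.c1
  // The combined view is still derivable when needed.
  def children: Seq[Expression] = outerAttrs ++ joinCond
}

object Demo {
  // State after the analysis phase: only the outer reference is known.
  val afterAnalysis = ScalarSubquery(Seq(OuterRef(Attribute("t2.c1"))), Nil)

  // State after PullUpCorrelatedPredicates: the correlated predicate
  // has been rewritten into an explicit join condition.
  val afterPullUp =
    ScalarSubquery(Nil, Seq(EqualTo(Attribute("t1.c1"), Attribute("t2.c1"))))

  def main(args: Array[String]): Unit = {
    println(afterAnalysis) // outer reference recorded in outerAttrs
    println(afterPullUp)   // join condition recorded in joinCond
  }
}
```

With separate fields, a consumer of the expression no longer has to guess which analysis stage populated `children`: each element's role is carried by the field it lives in.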



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org