Posted to issues@spark.apache.org by "Nattavut Sutyanyong (JIRA)" <ji...@apache.org> on 2017/01/05 14:37:58 UTC

[jira] [Commented] (SPARK-18874) First phase: Deferring the correlated predicate pull up to Optimizer phase

    [ https://issues.apache.org/jira/browse/SPARK-18874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15801504#comment-15801504 ] 

Nattavut Sutyanyong commented on SPARK-18874:
---------------------------------------------

[~rxin], [~hvanhovell], [~smilegator] Just an FYI that an initial version of the design doc has been posted for public review. Your comments are much appreciated. Thanks!

> First phase: Deferring the correlated predicate pull up to Optimizer phase
> --------------------------------------------------------------------------
>
>                 Key: SPARK-18874
>                 URL: https://issues.apache.org/jira/browse/SPARK-18874
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Nattavut Sutyanyong
>
> This JIRA implements the first phase of SPARK-18455 by deferring the correlated predicate pull-up from the Analyzer to the Optimizer. The goal is to preserve the current subquery functionality of Spark 2.0 (if a query works, it continues to work after this JIRA; if it does not, it still won't). The performance of subquery processing is expected to be on par with Spark 2.0.
> After this JIRA, the LogicalPlan produced by the Analyzer will be different in that it preserves the original positions of correlated predicates in a subquery. This new representation is preparation for the second phase, which extends correlated subquery support to cases Spark 2.0 does not handle, such as deep correlation and outer references in the SELECT clause.
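To make the two unsupported cases named above concrete, here is a hedged sketch of what they look like in SQL. The table and column names (t1, t2, t3, etc.) are invented for illustration and do not come from the JIRA:

```sql
-- Hypothetical schema: t1(a, b), t2(c, d), t3(e, f).

-- Simple correlation (works in Spark 2.0): the Analyzer pulls the
-- predicate t2.c = t1.a up out of the subquery during analysis.
SELECT * FROM t1
WHERE EXISTS (SELECT 1 FROM t2 WHERE t2.c = t1.a);

-- Deep correlation (not supported in Spark 2.0): the outer reference
-- t1.a appears two nesting levels down, in the innermost subquery.
SELECT * FROM t1
WHERE EXISTS (SELECT 1 FROM t2
              WHERE EXISTS (SELECT 1 FROM t3 WHERE t3.e = t1.a));

-- Outer reference in the SELECT clause (not supported in Spark 2.0):
-- the scalar subquery's select list refers to the outer column t1.b.
SELECT (SELECT MAX(t2.d) + t1.b FROM t2 WHERE t2.c = t1.a) FROM t1;
```

Keeping correlated predicates in their original positions through analysis, rather than pulling them up eagerly, is what leaves the Optimizer enough information to later handle patterns like these.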



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org