Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2015/05/08 07:58:59 UTC

[jira] [Resolved] (SPARK-7232) Add a Substitution batch for spark sql analyzer

     [ https://issues.apache.org/jira/browse/SPARK-7232?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai resolved SPARK-7232.
-----------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0
         Assignee: Fei Wang

It has been resolved by https://github.com/apache/spark/commit/f496bf3c539a873ffdf3aa803847ef7b50135bd7.
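
For context, the commit introduces a `Substitution` batch that runs ahead of the `Resolution` batch in the Catalyst Analyzer. A rough Scala sketch of the batch ordering (the rule names CTESubstitution and WindowsSubstitution and the Batch/fixedPoint scaffolding are recalled from the change, abridged rather than quoted verbatim):

    lazy val batches: Seq[Batch] = Seq(
      Batch("Substitution", fixedPoint,
        CTESubstitution,       // rewrites CTE references before resolution (case 1 below)
        WindowsSubstitution),  // rewrites named-window references (case 2 below)
      Batch("Resolution", fixedPoint,
        ResolveRelations,
        ResolveReferences
        // ... remaining resolution rules ...
      ))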

> Add a Substitution batch for spark sql analyzer
> -----------------------------------------------
>
>                 Key: SPARK-7232
>                 URL: https://issues.apache.org/jira/browse/SPARK-7232
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.3.1
>            Reporter: Fei Wang
>            Assignee: Fei Wang
>             Fix For: 1.4.0
>
>
> Added a new batch named `Substitution` before the Resolution batch. The motivation is that there are certain cases where we want to perform substitution on the parsed logical plan before resolving it.
> Consider these two cases:
> 1. CTE: for a CTE we first build a raw logical plan:
> 'With Map(q1 -> 'Subquery q1
>                  'Project ['key]
>                   'UnresolvedRelation [src], None)
>  'Project [*]
>   'Filter ('key = 5)
>    'UnresolvedRelation [q1], None
> In the `With` logical plan, a map stores (q1 -> subquery); we first want to take off the With node and substitute the q1 in UnresolvedRelation with the corresponding subquery (see the sketch after this list).
> 2. Window functions: a user may define named windows, and we likewise need to substitute each window name referenced in the child plan with the concrete window definition. This should also be done in the Substitution batch.
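
A self-contained Scala sketch of the case-1 rewrite. The case classes below are toy stand-ins for the Catalyst plan nodes, defined here only so the example runs on its own; they are not Spark's real API:

    // Toy stand-ins for Catalyst plan nodes (illustration only).
    sealed trait LogicalPlan
    case class UnresolvedRelation(name: String) extends LogicalPlan
    case class Subquery(alias: String, child: LogicalPlan) extends LogicalPlan
    case class Project(projectList: Seq[String], child: LogicalPlan) extends LogicalPlan
    case class Filter(condition: String, child: LogicalPlan) extends LogicalPlan
    case class With(child: LogicalPlan, cteRelations: Map[String, LogicalPlan]) extends LogicalPlan

    object CTESubstitutionSketch {
      // Takes off the With node, then replaces each UnresolvedRelation that
      // names a CTE with the corresponding subquery -- all before resolution.
      def apply(plan: LogicalPlan): LogicalPlan = plan match {
        case With(child, relations) => substitute(child, relations)
        case other                  => other
      }

      private def substitute(plan: LogicalPlan, ctes: Map[String, LogicalPlan]): LogicalPlan =
        plan match {
          case u @ UnresolvedRelation(name) =>
            ctes.get(name).map(Subquery(name, _)).getOrElse(u)
          case Project(list, child) => Project(list, substitute(child, ctes))
          case Filter(cond, child)  => Filter(cond, substitute(child, ctes))
          case Subquery(a, child)   => Subquery(a, substitute(child, ctes))
          case w: With              => w  // nested WITH left alone in this sketch
        }
    }

Running it on the plan from the example above:

    val q1   = Project(Seq("key"), UnresolvedRelation("src"))
    val plan = With(
      Filter("key = 5", Project(Seq("*"), UnresolvedRelation("q1"))),
      Map("q1" -> q1))

    CTESubstitutionSketch(plan)
    // Result, schematically:
    //   Filter(key = 5, Project(*, Subquery(q1, Project(key, UnresolvedRelation(src)))))

The window-function case (2) follows the same shape: a pre-resolution rule walks the plan and replaces each reference to a named window with the concrete window specification collected from the query's WINDOW clause.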



