Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/10/28 02:00:00 UTC

[jira] [Resolved] (SPARK-22181) ReplaceExceptWithFilter if one or both of the datasets are fully derived out of Filters from the same parent

     [ https://issues.apache.org/jira/browse/SPARK-22181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li resolved SPARK-22181.
-----------------------------
       Resolution: Fixed
         Assignee: Sathiya Kumar
    Fix Version/s: 2.3.0

> ReplaceExceptWithFilter if one or both of the datasets are fully derived out of Filters from the same parent
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-22181
>                 URL: https://issues.apache.org/jira/browse/SPARK-22181
>             Project: Spark
>          Issue Type: New Feature
>          Components: Optimizer, SQL
>    Affects Versions: 2.1.1, 2.2.0
>            Reporter: Sathiya Kumar
>            Assignee: Sathiya Kumar
>            Priority: Minor
>             Fix For: 2.3.0
>
>
> When applying the Except operator between two datasets, if one or both of the datasets are derived purely through filter operations, then instead of rewriting the Except operator as an expensive join, we can rewrite it as a filter by flipping the filter condition of the right node.
> Example:
> {code:sql}
>    SELECT a1, a2 FROM Tab1 WHERE a2 = 12 EXCEPT SELECT a1, a2 FROM Tab1 WHERE a1 = 5
>    ==>  SELECT DISTINCT a1, a2 FROM Tab1 WHERE a2 = 12 AND (a1 is null OR a1 <> 5)
> {code}
> For more details, please refer to [this post|https://github.com/sathiyapk/Blog-Posts/blob/master/SparkOptimizer.md]
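>
> For reference, below is a minimal sketch of the rewrite idea expressed as a Catalyst rule. It is a simplification, not the actual Spark implementation: it assumes both sides of the Except are single Filters whose children produce the same result, and it ignores the attribute rewriting needed when the right-side condition refers to the right child's output.
> {code:scala}
> import org.apache.spark.sql.catalyst.expressions.{And, IsNull, Not, Or}
> import org.apache.spark.sql.catalyst.plans.logical.{Distinct, Except, Filter, LogicalPlan}
> import org.apache.spark.sql.catalyst.rules.Rule
>
> // Sketch only: the real ReplaceExceptWithFilter rule covers more cases and
> // rewrites the right condition's attribute references to the left child's
> // output before flipping it.
> object ReplaceExceptWithFilterSketch extends Rule[LogicalPlan] {
>   def apply(plan: LogicalPlan): LogicalPlan = plan transform {
>     case Except(Filter(leftCond, leftChild), Filter(rightCond, rightChild))
>         if leftChild.sameResult(rightChild) =>
>       // Keep rows that satisfy the left filter but not the right one. The
>       // IsNull disjunct preserves EXCEPT's null semantics: a row for which
>       // the right condition evaluates to null is not returned by the right
>       // side, so it must be kept here rather than dropped.
>       Distinct(Filter(And(leftCond, Or(IsNull(rightCond), Not(rightCond))), leftChild))
>   }
> }
> {code}
> The Distinct on top reflects EXCEPT's set semantics, which deduplicate the result, and mirrors the DISTINCT in the rewritten SQL above.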



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org