Posted to issues@spark.apache.org by "Sathiya Kumar (JIRA)" <ji...@apache.org> on 2017/10/13 16:17:00 UTC

[jira] [Updated] (SPARK-22181) ReplaceExceptWithFilter if one or both of the datasets are fully derived out of Filters from the same parent

     [ https://issues.apache.org/jira/browse/SPARK-22181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sathiya Kumar updated SPARK-22181:
----------------------------------
    Summary: ReplaceExceptWithFilter if one or both of the datasets are fully derived out of Filters from the same parent  (was: ReplaceExceptWithNotFilter if one or both of the datasets are fully derived out of Filters from the same parent)

> ReplaceExceptWithFilter if one or both of the datasets are fully derived out of Filters from the same parent
> ------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-22181
>                 URL: https://issues.apache.org/jira/browse/SPARK-22181
>             Project: Spark
>          Issue Type: New Feature
>          Components: Optimizer, SQL
>    Affects Versions: 2.1.1, 2.2.0
>            Reporter: Sathiya Kumar
>            Priority: Minor
>
> When the Except operator is applied between two datasets, and one or both of them are derived purely through Filter operations on the same parent, then instead of rewriting the Except operator as an expensive join, we can rewrite it as a Filter by negating the filter condition of the right node.
> Example:
> {code:sql}
>    SELECT a1, a2 FROM Tab1 WHERE a2 = 12 EXCEPT SELECT a1, a2 FROM Tab1 WHERE a1 = 5
>    ==>  SELECT DISTINCT a1, a2 FROM Tab1 WHERE a2 = 12 AND a1 <> 5
> {code}
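> A minimal sketch of how such a rewrite could look as a Catalyst optimizer rule (an illustration against the 2.2-era {{Rule[LogicalPlan]}} API, not a final implementation; the re-mapping of attribute references from the right side to the left child's output, which a real rule would need, is omitted here):
> {code:scala}
> import org.apache.spark.sql.catalyst.expressions.{And, Coalesce, Expression, Literal, Not}
> import org.apache.spark.sql.catalyst.plans.logical.{Distinct, Except, Filter, LogicalPlan}
> import org.apache.spark.sql.catalyst.rules.Rule
>
> object ReplaceExceptWithFilter extends Rule[LogicalPlan] {
>   def apply(plan: LogicalPlan): LogicalPlan = plan transform {
>     // Both sides are Filters over the same parent: AND the left condition
>     // with the negation of the right one; Distinct preserves EXCEPT's
>     // set semantics.
>     case Except(Filter(leftCond, leftChild), Filter(rightCond, rightChild))
>         if leftChild.sameResult(rightChild) =>
>       Distinct(Filter(And(leftCond, negate(rightCond)), leftChild))
>
>     // Only the right side is a Filter over the left plan itself.
>     case Except(left, Filter(rightCond, rightChild)) if left.sameResult(rightChild) =>
>       Distinct(Filter(negate(rightCond), left))
>   }
>
>   // Plain Not(cond) is wrong under SQL's three-valued logic: a row for
>   // which cond evaluates to null is absent from the right side and must
>   // be kept, but Not(null) is null, so the Filter would drop it.
>   // Coalescing to false before negating keeps such rows.
>   private def negate(cond: Expression): Expression =
>     Not(Coalesce(Seq(cond, Literal(false))))
> }
> {code}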
> For more details, please see [this post|https://github.com/sathiyapk/Blog-Posts/blob/master/SparkOptimizer.md]



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org