Posted to issues@spark.apache.org by "Nattavut Sutyanyong (JIRA)" <ji...@apache.org> on 2016/12/21 19:17:58 UTC

[jira] [Commented] (SPARK-16951) Alternative implementation of NOT IN to Anti-join

    [ https://issues.apache.org/jira/browse/SPARK-16951?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15767904#comment-15767904 ] 

Nattavut Sutyanyong commented on SPARK-16951:
---------------------------------------------

I was wrong on case 3, where the subquery produces an empty result. The semantics of the predicate

<expr> NOT IN ( <query-block> )

are those of

<expr> <> ALL ( <query-block> )

so when the result of the <query-block> is empty, the predicate is True and therefore all the rows from the parent side are returned.
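
As a minimal sketch of that case (t1 and t2 are hypothetical tables, not taken from this ticket):

-- The subquery returns no rows, so <> ALL holds vacuously and every
-- row of t1 is returned, even rows where t1.a is NULL.
SELECT t1.a
FROM   t1
WHERE  t1.a NOT IN (SELECT t2.b FROM t2 WHERE 1 = 0);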

> Alternative implementation of NOT IN to Anti-join
> -------------------------------------------------
>
>                 Key: SPARK-16951
>                 URL: https://issues.apache.org/jira/browse/SPARK-16951
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Nattavut Sutyanyong
>
> A transformation currently used to process a {{NOT IN}} subquery is to rewrite it to a form of Anti-join with the null-aware property in the Logical Plan, which is then translated to an {{OR}} predicate joining the parent side and the subquery side of the {{NOT IN}}. As a result, the presence of the {{OR}} predicate restricts execution to a nested-loop join plan, which has major performance implications when both sides' results are large.
> This JIRA sketches an idea of changing the {{OR}} predicate to a form similar to the technique used in the implementation of the Existence join, which addresses the problem of {{EXISTS (..) OR ..}} type of queries.
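
For illustration only (the exact shape of the rewritten condition below is an assumption about the current null-aware translation, not taken from this ticket), a query such as

SELECT t1.a FROM t1 WHERE t1.a NOT IN (SELECT t2.b FROM t2)

is rewritten to a left Anti-join whose condition has roughly the shape

t1.a = t2.b OR isnull(t1.a = t2.b)

and it is this {{OR}} that confines execution to the nested-loop join plan described above.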



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org