Posted to issues@spark.apache.org by "Nattavut Sutyanyong (JIRA)" <ji...@apache.org> on 2016/08/09 03:55:20 UTC

[jira] [Comment Edited] (SPARK-16951) Alternative implementation of NOT IN to Anti-join

    [ https://issues.apache.org/jira/browse/SPARK-16951?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15412882#comment-15412882 ] 

Nattavut Sutyanyong edited comment on SPARK-16951 at 8/9/16 3:54 AM:
---------------------------------------------------------------------

The following output was produced on a build of the Spark master branch from August 5, 2016.

{noformat}
scala> Seq(1,2).toDF("c1").createOrReplaceTempView("t1")

scala> Seq(1).toDF("c2").createOrReplaceTempView("t2")

scala> sql("select t2.c2+1 as c3 from t1 left join t2 on t1.c1=t2.c2").createOrReplaceTempView("t3")

scala> sql("select * from t1").show
+---+
| c1|
+---+
|  1|
|  2|
+---+


scala> sql("select * from t2").show
+---+
| c2|
+---+
|  1|
+---+


scala> sql("select * from t3").show
+----+
|  c3|
+----+
|   2|
|null|
+----+
{noformat}

Case 1:
{noformat}
scala> sql("select * from t3 where c3 not in (select c2 from t2)").show
+----+
|  c3|
+----+
|   2|
|null|
+----+
{noformat}
The correct result is:
{noformat}
+----+
|  c3|
+----+
|   2|
+----+
{noformat}
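(For reference, this expected result follows SQL's three-valued logic: {{null NOT IN (non-empty set)}} is unknown, and a {{WHERE}} clause keeps only rows where the predicate is definitely true. Below is a minimal plain-Scala sketch of that semantics; it is illustrative only, not Spark code, with {{None}} modeling SQL null.)

```scala
// Sketch of SQL three-valued NOT IN semantics (not Spark's implementation).
// None on the input models SQL null; a None result models "unknown".
def notIn(v: Option[Int], subquery: Seq[Option[Int]]): Option[Boolean] =
  if (subquery.isEmpty) Some(true)            // NOT IN over an empty set is true
  else if (v.isEmpty) None                    // null NOT IN (non-empty set) is unknown
  else if (subquery.contains(v)) Some(false)  // a definite match: false
  else if (subquery.exists(_.isEmpty)) None   // no match, but a null on the subquery side: unknown
  else Some(true)                             // no match, no nulls: true

// WHERE keeps only rows whose predicate is Some(true).
val t2 = Seq(Some(1))          // c2 values
val t3 = Seq(Some(2), None)    // c3 values: 2 and null
val case1 = t3.filter(c3 => notIn(c3, t2).contains(true))
// case1 == Seq(Some(2)): the null row is filtered out, matching the expected output above
```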

Case 2:
{noformat}
scala> sql("select * from t1 where c1 not in (select c3 from t3)").show
+---+
| c1|
+---+
+---+
{noformat}

The answer is correct: the subquery result contains a null ({{c3}} is null for the unmatched row of the left join), so {{c1 NOT IN (...)}} can never evaluate to true and no rows qualify.

Case 3:
{noformat}
scala> sql("select * from t1 where c1 not in (select c2 from t2 where 1=2)").show
+---+
| c1|
+---+
|  1|
|  2|
+---+
{noformat}

This answer is also correct: {{NOT IN}} over an empty subquery result evaluates to true for every row, so all rows of {{t1}} qualify.



> Alternative implementation of NOT IN to Anti-join
> -------------------------------------------------
>
>                 Key: SPARK-16951
>                 URL: https://issues.apache.org/jira/browse/SPARK-16951
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Nattavut Sutyanyong
>
> A transformation currently used to process a {{NOT IN}} subquery rewrites it to a form of Anti-join with a null-aware property in the Logical Plan, which is then translated to an {{OR}} predicate joining the parent side and the subquery side of the {{NOT IN}}. The presence of this {{OR}} predicate restricts execution to a nested-loop join plan, which has major performance implications when both sides' results are large.
> This JIRA sketches an idea: change the {{OR}} predicate into a form similar to the technique used in the implementation of the Existence join, which addresses the problem of {{EXISTS (..) OR ..}} type queries.
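The rewrite described above can be sketched as follows (a hedged illustration in plain Scala, not Spark's actual code; {{None}} models SQL null):

```scala
// Illustrative nested-loop evaluation of the null-aware anti-join condition
// "c1 = c2 OR isnull(c1 = c2)" that the current NOT IN rewrite produces.
// None models SQL null. This is a sketch, not Spark's implementation.
def nullAwareAntiJoin(left: Seq[Option[Int]], right: Seq[Option[Int]]): Seq[Option[Int]] =
  left.filterNot { c1 =>
    right.exists { c2 =>
      // c1 = c2 OR isnull(c1 = c2): a null on either side makes the equality
      // unknown, which the OR condition treats as a match, dropping the row
      c1.isEmpty || c2.isEmpty || c1 == c2
    }
  }
// Because the isnull(...) disjunct must be checked against every build-side
// row, the condition cannot be evaluated with a hash join; the planner is
// forced into a nested-loop plan that is O(|left| * |right|).
```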


