Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2023/08/09 07:12:00 UTC

[jira] [Assigned] (SPARK-44551) Wrong semantics for null IN (empty list) - IN expression execution

     [ https://issues.apache.org/jira/browse/SPARK-44551?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-44551:
-----------------------------------

    Assignee: Jack Chen

> Wrong semantics for null IN (empty list) - IN expression execution
> ------------------------------------------------------------------
>
>                 Key: SPARK-44551
>                 URL: https://issues.apache.org/jira/browse/SPARK-44551
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Jack Chen
>            Assignee: Jack Chen
>            Priority: Major
>
> {{null IN (empty list)}} incorrectly evaluates to null, when it should evaluate to false. (The reason it should be false is that {{a IN (b1, b2)}} is defined as {{a = b1 OR a = b2}}, and an empty IN list is treated as an empty OR, which is false. This is specified by ANSI SQL.)
> Many places in Spark execution (In, InSet, InSubquery) and optimization (OptimizeIn, NullPropagation) implement this incorrect behavior. Note also that Spark's behavior for {{null IN (empty list)}} is inconsistent: literal IN lists generally return null (incorrect), while IN/NOT IN subqueries in this case mostly return false/true, respectively (correct).
> This is a longstanding correctness issue which has existed since null support for IN expressions was first added to Spark.
> Doc with more details: [https://docs.google.com/document/d/1k8AY8oyT-GI04SnP7eXttPDnDj-Ek-c3luF2zL6DPNU/edit] 
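
The correct semantics described in the ticket can be sketched as a small three-valued evaluation of IN. This is a hypothetical illustration of the ANSI SQL rule, not Spark's actual implementation; SQL NULL is modeled here as Python None:

```python
from typing import List, Optional

NULL = None  # SQL NULL modeled as Python None


def sql_in(a: Optional[int], in_list: List[Optional[int]]) -> Optional[bool]:
    """Evaluate `a IN (in_list)` under ANSI SQL three-valued logic.

    `a IN (b1, b2, ...)` is defined as `a = b1 OR a = b2 OR ...`.
    An empty IN list is an empty OR, which is FALSE -- even when `a` is NULL.
    """
    result: Optional[bool] = False  # an empty OR starts at FALSE
    for b in in_list:
        if a is NULL or b is NULL:
            eq = NULL  # NULL compared with anything is NULL (unknown)
        else:
            eq = (a == b)
        # three-valued OR: TRUE dominates; otherwise NULL dominates FALSE
        if eq is True:
            return True
        if eq is NULL:
            result = NULL
    return result


# Correct results per ANSI SQL:
#   NULL IN ()       -> FALSE  (empty OR; Spark wrongly returned NULL here)
#   NULL IN (1, 2)   -> NULL   (unknown)
#   1 IN (1, NULL)   -> TRUE
#   2 IN (1, NULL)   -> NULL
```

The key point is the initial value: because the empty disjunction is false, {{null IN (empty list)}} never involves a NULL comparison at all and must return false, which is where the buggy code paths diverged.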



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org