Posted to issues@spark.apache.org by "Animesh Baranawal (JIRA)" <ji...@apache.org> on 2015/06/26 07:14:04 UTC

[jira] [Comment Edited] (SPARK-8636) CaseKeyWhen has incorrect NULL handling

    [ https://issues.apache.org/jira/browse/SPARK-8636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14602411#comment-14602411 ] 

Animesh Baranawal edited comment on SPARK-8636 at 6/26/15 5:13 AM:
-------------------------------------------------------------------

So the condition should be:
if (l == null || r == null) false
else l == r ?


was (Author: animeshbaranawal):
So the condition should be:
if (l == null || r == null) false
else l == r
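
For illustration, here is a minimal, self-contained Scala sketch of the comparison proposed in the comment above; the object and method names are hypothetical, not taken from an actual Spark patch:

{code}
// Hypothetical names, for illustration only; not an actual Spark patch.
object NullSemanticsSketch {
  // SQL comparison semantics: NULL never matches anything, not even NULL.
  def sqlEquals(l: Any, r: Any): Boolean =
    if (l == null || r == null) false
    else l == r
}
{code}

Under these semantics, sqlEquals(null, null) returns false, which is exactly what WHEN-branch matching needs.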

> CaseKeyWhen has incorrect NULL handling
> ---------------------------------------
>
>                 Key: SPARK-8636
>                 URL: https://issues.apache.org/jira/browse/SPARK-8636
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Santiago M. Mola
>              Labels: starter
>
> The CaseKeyWhen implementation in Spark uses the following equality check:
> {code}
>   private def equalNullSafe(l: Any, r: Any) = {
>     if (l == null && r == null) {
>       true
>     } else if (l == null || r == null) {
>       false
>     } else {
>       l == r
>     }
>   }
> {code}
> This is not correct: in SQL, NULL is never equal to NULL (nor is it unequal to NULL). Accordingly, a NULL value in a CASE WHEN expression should never match any branch.
> For example, you can execute this in MySQL:
> {code}
> SELECT CASE NULL WHEN NULL THEN "NULL MATCHES" ELSE "NULL DOES NOT MATCH" END FROM DUAL;
> {code}
> And the result will be "NULL DOES NOT MATCH".
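
To make the expected behavior concrete, the following minimal Scala sketch (hypothetical, not Spark's actual CaseKeyWhen code) evaluates a CASE key WHEN expression with SQL NULL semantics:

{code}
// Hypothetical evaluator, for illustration only; not Spark's CaseKeyWhen.
object CaseKeyWhenSketch {
  // Returns the result of the first WHEN branch whose key matches,
  // or the default otherwise. A NULL key or NULL WHEN value never matches.
  def evalCase(key: Any, branches: Seq[(Any, String)], default: String): String =
    branches
      .find { case (whenKey, _) => key != null && whenKey != null && key == whenKey }
      .map(_._2)
      .getOrElse(default)

  def main(args: Array[String]): Unit = {
    // Mirrors the MySQL example quoted above:
    // SELECT CASE NULL WHEN NULL THEN "NULL MATCHES" ELSE "NULL DOES NOT MATCH" END
    println(evalCase(null, Seq((null, "NULL MATCHES")), "NULL DOES NOT MATCH"))
    // prints "NULL DOES NOT MATCH"
  }
}
{code}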


