Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/06/10 23:53:00 UTC

[jira] [Comment Edited] (SPARK-39419) When the comparator of ArraySort returns null, it should fail.

    [ https://issues.apache.org/jira/browse/SPARK-39419?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17553002#comment-17553002 ] 

Dongjoon Hyun edited comment on SPARK-39419 at 6/10/22 11:52 PM:
-----------------------------------------------------------------

This is resolved via

[https://github.com/apache/spark/pull/36812] (master)

[https://github.com/apache/spark/pull/36834] (branch-3.3)

[https://github.com/apache/spark/pull/36835] (branch-3.2)


was (Author: dongjoon):
This is resolved via https://github.com/apache/spark/pull/36835

> When the comparator of ArraySort returns null, it should fail.
> --------------------------------------------------------------
>
>                 Key: SPARK-39419
>                 URL: https://issues.apache.org/jira/browse/SPARK-39419
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Takuya Ueshin
>            Assignee: Takuya Ueshin
>            Priority: Major
>             Fix For: 3.2.2, 3.3.1
>
>
> When the comparator of {{ArraySort}} returns {{null}}, it is currently handled as {{0}} (equal).
> According to the doc, 
> {quote}
> It returns -1, 0, or 1 as the first element is less than, equal to, or greater than the second element. If the comparator function returns other values (including null), the function will fail and raise an error.
> {quote}
> It's fine to return integers other than -1, 0, or 1, following the Java convention (the doc still needs to be updated, though), but it should throw an exception for a {{null}} result.
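
As an illustration of the behavior described above (not part of the original ticket), here is a minimal Spark SQL sketch in which the lambda comparator returns {{null}} whenever one of the compared elements is {{null}}; the array literal and comparator are purely illustrative assumptions.

{code:sql}
-- A comparator written as plain subtraction: it legitimately returns
-- integers other than -1/0/1 (allowed per the Java convention), but it
-- also returns NULL whenever x or y is NULL, since NULL - anything is NULL.
SELECT array_sort(array(3, 1, NULL, 2), (x, y) -> x - y);

-- Before the fix: the NULL comparator result was silently treated as 0
-- (equal), so the NULL element was left wherever it happened to be.
-- After the fix: the query is expected to fail with an error, as the
-- documentation quoted above says it should.
{code}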



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org