Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2023/03/16 11:37:00 UTC

[jira] [Updated] (SPARK-41233) High-order function: array_prepend

     [ https://issues.apache.org/jira/browse/SPARK-41233?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-41233:
---------------------------------
    Fix Version/s:     (was: 3.5.0)

> High-order function: array_prepend
> ----------------------------------
>
>                 Key: SPARK-41233
>                 URL: https://issues.apache.org/jira/browse/SPARK-41233
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark, SQL
>    Affects Versions: 3.4.0
>            Reporter: Ruifeng Zheng
>            Priority: Major
>
> Refer to https://docs.snowflake.com/en/developer-guide/snowpark/reference/python/api/snowflake.snowpark.functions.array_prepend.html
> 1. About data type validation:
> In Snowflake's array_append, array_prepend and array_insert functions, the element's data type does not need to match the data type of the existing elements in the array.
> In Spark, however, we want to apply the same data type validation as array_remove (demonstrated in the sketch at the end of this description).
> 2. About NULL handling:
> Currently, SparkSQL, SnowSQL and PostgreSQL deal with NULL values in different ways.
> The existing functions array_contains, array_position and array_remove in SparkSQL handle NULL as follows: if the input array and/or the element is NULL, they return NULL. However, array_prepend should break from this behavior.
> We should implement NULL handling in array_prepend as follows (see the sketch after this list):
> 2.1, if the array is NULL, return NULL;
> 2.2, if the array is not NULL and the element is NULL, prepend the NULL value to the array.
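> A minimal PySpark sketch of the proposed semantics (assumptions: the function is exposed in SQL as array_prepend(array, element), as this sub-task proposes, and the results noted in the comments reflect rules 1, 2.1 and 2.2 above rather than any currently released Spark behavior):
> {code:python}
> from pyspark.sql import SparkSession
> from pyspark.sql.utils import AnalysisException
>
> spark = SparkSession.builder.getOrCreate()
>
> # Point 2, NULL handling:
> #   row 1: regular prepend      -> [0, 1, 2, 3]
> #   row 2: rule 2.1, NULL array -> NULL
> #   row 3: rule 2.2, NULL elem  -> [NULL, 1, 2, 3]
> spark.sql("""
>     SELECT array_prepend(arr, elem)
>     FROM VALUES
>         (array(1, 2, 3), 0),
>         (CAST(NULL AS ARRAY<INT>), 0),
>         (array(1, 2, 3), CAST(NULL AS INT))
>     AS t(arr, elem)
> """).show(truncate=False)
>
> # Point 1, data type validation: mirroring array_remove, an element whose
> # type cannot be coerced to the array's element type should be rejected
> # at analysis time.
> try:
>     spark.sql("SELECT array_prepend(array(1, 2, 3), true)")
> except AnalysisException as e:
>     print("Rejected as expected:", e)
> {code}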


