Posted to commits@spark.apache.org by we...@apache.org on 2023/02/15 13:00:58 UTC

[spark] branch branch-3.4 updated: [SPARK-42405][SQL] Improve array insert documentation

This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new d351569f2a9 [SPARK-42405][SQL] Improve array insert documentation
d351569f2a9 is described below

commit d351569f2a938ee0d4a120f6ef0e44ae9602bdba
Author: Daniel Davies <dd...@palantir.com>
AuthorDate: Wed Feb 15 21:00:24 2023 +0800

    [SPARK-42405][SQL] Improve array insert documentation
    
    ### What changes were proposed in this pull request?
    
    Part of the cleanup from existing PR https://github.com/apache/spark/pull/38867: the documentation on the Scala class ArrayInsert should match the Python array_insert function. See the comment here: https://github.com/apache/spark/pull/38867#discussion_r1097054656.
    
    ### Why are the changes needed?
    
    See https://github.com/apache/spark/pull/38867#discussion_r1097054656.
    
    ### Does this PR introduce _any_ user-facing change?
    Yes, better documentation of the array_insert function.
    
    ### How was this patch tested?
    Not applicable; covered by standard unit testing.
    
    Closes #39975 from Daniel-Davies/ddavies/SPARK-42405.
    
    Authored-by: Daniel Davies <dd...@palantir.com>
    Signed-off-by: Wenchen Fan <we...@databricks.com>
    (cherry picked from commit a14c6bb2710cb7d43538e9754ca536f0269eb3c4)
    Signed-off-by: Wenchen Fan <we...@databricks.com>
---
 python/pyspark/sql/functions.py                                    | 6 +++---
 .../spark/sql/catalyst/expressions/collectionOperations.scala      | 7 ++++++-
 2 files changed, 9 insertions(+), 4 deletions(-)

diff --git a/python/pyspark/sql/functions.py b/python/pyspark/sql/functions.py
index ac842101b28..b103af72e36 100644
--- a/python/pyspark/sql/functions.py
+++ b/python/pyspark/sql/functions.py
@@ -7680,9 +7680,9 @@ def array_distinct(col: "ColumnOrName") -> Column:
 def array_insert(arr: "ColumnOrName", pos: Union["ColumnOrName", int], value: Any) -> Column:
     """
     Collection function: adds an item into a given array at a specified array index.
-    Array indices start at 1 (or from the end if the index is negative).
-    Index specified beyond the size of the current array (plus additional element)
-    is extended with 'null' elements.
+    Array indices start at 1, or start from the end if index is negative.
+    Index above array size appends the array, or prepends the array if index is negative,
+    with 'null' elements.
 
     .. versionadded:: 3.4.0
 
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
index 28c4a9eba68..289859d420b 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
@@ -4603,7 +4603,12 @@ case class ArrayExcept(left: Expression, right: Expression) extends ArrayBinaryL
 
 // scalastyle:off line.size.limit
 @ExpressionDescription(
-  usage = "_FUNC_(x, pos, val) - Places val into index pos of array x (array indices start at 1, or start from the end if start is negative).\",",
+  usage = """
+    _FUNC_(x, pos, val) - Places val into index pos of array x.
+      Array indices start at 1, or start from the end if index is negative.
+      Index above array size appends the array, or prepends the array if index is negative,
+      with 'null' elements.
+  """,
   examples = """
     Examples:
       > SELECT _FUNC_(array(1, 2, 3, 4), 5, 5);

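For reference, the behavior documented in the diff above can be sketched in plain Python. This is an illustration only, not Spark's implementation: `array_insert_sketch` is a hypothetical name, and negative positions are deliberately omitted because their exact semantics varied across Spark releases.

```python
def array_insert_sketch(arr, pos, value):
    """Illustrative sketch of the documented array_insert behavior.

    Indices are 1-based; an index above the current array size extends
    the array with None ('null') elements before placing the value.
    """
    if pos == 0:
        raise ValueError("array indices are 1-based; 0 is not a valid position")
    if pos < 0:
        # Negative indices count from the end, but their exact semantics
        # changed across Spark releases, so this sketch omits them.
        raise NotImplementedError("negative positions omitted in this sketch")
    idx = pos - 1  # convert the 1-based position to a 0-based list index
    if idx >= len(arr):
        # Index above array size: pad the gap with nulls, then append the value.
        return arr + [None] * (idx - len(arr)) + [value]
    return arr[:idx] + [value] + arr[idx:]
```

With the SQL example from the diff, `array_insert_sketch([1, 2, 3, 4], 5, 5)` yields `[1, 2, 3, 4, 5]`.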

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org