Posted to reviews@spark.apache.org by crafty-coder <gi...@git.apache.org> on 2018/08/07 21:12:36 UTC

[GitHub] spark pull request #22031: [TODO][SPARK-23932][SQL] Higher order function zi...

Github user crafty-coder commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22031#discussion_r208387111
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/higherOrderFunctions.scala ---
    @@ -442,3 +442,93 @@ case class ArrayAggregate(
     
       override def prettyName: String = "aggregate"
     }
    +
    +/**
    + * Transform elements in an array using the transform function. This is similar to
    + * a `map` in functional programming.
    + */
    +// scalastyle:off line.size.limit
    +@ExpressionDescription(
    +  usage = "_FUNC_(expr, func) - Merges the two given arrays, element-wise, into a single array using function. If one array is shorter, nulls are appended at the end to match the length of the longer array, before applying function.",
    +  examples = """
    +    Examples:
    +      > SELECT _FUNC_(array(1, 2, 3), x -> x + 1);
    --- End diff --
    
    The examples are not accurate: the usage text describes merging two arrays element-wise with a binary function, but this example passes a single array and a one-argument lambda.
    
    You could use something like:
    
    ```
     > SELECT _FUNC_(array(1, 2, 3), array('a', 'b', 'c'), (x, y) -> (y, x));
      array(('a', 1), ('b', 2), ('c', 3))
     > SELECT _FUNC_(array(1, 2), array(3, 4), (x, y) -> x + y);
      array(4, 6)
     > SELECT _FUNC_(array('a', 'b', 'c'), array('d', 'e', 'f'), (x, y) -> concat(x, y));
      array('ad', 'be', 'cf')
    ```
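    
    For context, here is a minimal Scala sketch of the padding semantics the usage string describes. The `zipWithPadded` helper is hypothetical, not the Catalyst implementation, and `Option` stands in for SQL NULL on the shorter side:
    
    ```
    // Sketch only: pad the shorter array, then apply the function pairwise.
    // Missing slots on the shorter side surface as None (standing in for NULL).
    def zipWithPadded[A, B, C](left: Seq[A], right: Seq[B])
                              (f: (Option[A], Option[B]) => C): Seq[C] = {
      val len = math.max(left.length, right.length)
      (0 until len).map(i => f(left.lift(i), right.lift(i)))
    }
    
    // Lengths 2 and 3, so the left side is padded once:
    zipWithPadded(Seq(1, 2), Seq("a", "b", "c")) {
      case (Some(x), Some(y)) => s"$y$x"
      case (None,    Some(y)) => y
      case (Some(x), None)    => x.toString
      case (None,    None)    => ""
    }
    // => Vector("a1", "b2", "c")
    ```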

