Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/05/26 01:42:00 UTC

[jira] [Resolved] (SPARK-39293) The accumulator of ArrayAggregate should copy the intermediate result if string, struct, array, or map

     [ https://issues.apache.org/jira/browse/SPARK-39293?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-39293.
----------------------------------
    Resolution: Fixed

> The accumulator of ArrayAggregate should copy the intermediate result if string, struct, array, or map
> ------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39293
>                 URL: https://issues.apache.org/jira/browse/SPARK-39293
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.3, 3.1.2, 3.2.1, 3.3.0
>            Reporter: Takuya Ueshin
>            Priority: Major
>             Fix For: 3.1.3, 3.0.4, 3.2.2, 3.3.1
>
>
> The accumulator of ArrayAggregate should copy the intermediate result if its type is string, struct, array, or map.
> {code:scala}
> import org.apache.spark.sql.functions._
> import spark.implicits._  // needed for toDF outside spark-shell (auto-imported in the shell)
> val reverse = udf((s: String) => s.reverse)
> val df = Seq(Array("abc", "def")).toDF("array")
> val testArray = df.withColumn(
>   "agg",
>   aggregate(
>     col("array"),
>     array().cast("array<string>"),
>     (acc, s) => concat(acc, array(reverse(s)))))
> testArray.show(truncate = false)
> {code}
> should be:
> {code}
> +----------+----------+
> |array     |agg       |
> +----------+----------+
> |[abc, def]|[cba, fed]|
> +----------+----------+
> {code}
> but:
> {code}
> +----------+----------+
> |array     |agg       |
> +----------+----------+
> |[abc, def]|[fed, fed]|
> +----------+----------+
> {code}



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
