Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/07/27 13:22:00 UTC

[jira] [Assigned] (SPARK-32459) UDF regression in WrappedArray support caused by SPARK-31826

     [ https://issues.apache.org/jira/browse/SPARK-32459?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-32459:
------------------------------------

    Assignee:     (was: Apache Spark)

> UDF regression in WrappedArray support caused by SPARK-31826
> ------------------------------------------------------------
>
>                 Key: SPARK-32459
>                 URL: https://issues.apache.org/jira/browse/SPARK-32459
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: wuyi
>            Priority: Major
>
>  
> {code:java}
> import scala.collection.mutable.{ArrayBuffer, WrappedArray}
>
> test("WrappedArray") {
>   val myUdf = udf((a: WrappedArray[Int]) =>
>     WrappedArray.make[Int](Array(a.head + 99)))
>   checkAnswer(Seq(Array(1))
>     .toDF("col")
>     .select(myUdf(Column("col"))),
>     Row(ArrayBuffer(100)))
> }{code}
> Executing the above test on the master branch hits the following error:
> {code:java}
> [info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, 192.168.101.3, executor driver): java.lang.RuntimeException: Error while decoding: java.lang.ClassCastException: scala.collection.mutable.ArrayBuffer cannot be cast to scala.collection.mutable.WrappedArray {code}
> However, the test can be executed successfully in branch-3.0.
>  
> This is actually a regression caused by SPARK-31826. The regression was introduced when the catalyst-to-scala converter was changed from CatalystTypeConverters to ExpressionEncoder.deserializer.
>  
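For illustration, here is a minimal, self-contained sketch (outside Spark, assuming Scala 2.12, which Spark 3.x builds against) of the cast that fails: the deserializer path materializes the array as an ArrayBuffer, which cannot be cast to the WrappedArray[Int] the UDF declares, whereas explicitly wrapping the backing array does produce a WrappedArray.

```scala
import scala.collection.mutable.{ArrayBuffer, WrappedArray}

// Stand-in for what the ExpressionEncoder-based deserializer hands back
// at runtime: a mutable.ArrayBuffer, not a WrappedArray.
val decoded: Any = ArrayBuffer(100)

// The UDF parameter is typed WrappedArray[Int], so the generated cast
// fails with the ClassCastException seen in the stack trace above.
val castFails: Boolean =
  try {
    decoded.asInstanceOf[WrappedArray[Int]]
    false
  } catch {
    case _: ClassCastException => true
  }

// Wrapping the backing array explicitly (as the CatalystTypeConverters
// path effectively did) yields a WrappedArray as the UDF expects.
val wrapped: WrappedArray[Int] = WrappedArray.make[Int](Array(100))
```

This is a sketch of the type mismatch only; the actual fix belongs in Spark's deserializer, not in user UDFs.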



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org