Posted to issues@spark.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2023/11/27 14:11:00 UTC

[jira] [Updated] (SPARK-46124) Replace explicit `ArrayOps#toSeq` with `s.c.immutable.ArraySeq.unsafeWrapArray`

     [ https://issues.apache.org/jira/browse/SPARK-46124?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated SPARK-46124:
-----------------------------------
    Labels: pull-request-available  (was: )

> Replace explicit `ArrayOps#toSeq` with `s.c.immutable.ArraySeq.unsafeWrapArray`
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-46124
>                 URL: https://issues.apache.org/jira/browse/SPARK-46124
>             Project: Spark
>          Issue Type: Sub-task
>          Components: DStreams, Kubernetes, ML, MLlib, Spark Core, SQL, Structured Streaming, YARN
>    Affects Versions: 4.0.0
>            Reporter: Yang Jie
>            Priority: Major
>              Labels: pull-request-available
>
> There is a behavioral difference between Scala 2.13 and 2.12 for explicit `ArrayOps.toSeq` calls, similar to the implicit conversion from `Array` to `Seq`.
> In Scala 2.12, it returns a `mutable.WrappedArray`, which does not involve a collection copy.
> ```scala
> Welcome to Scala 2.12.18 (OpenJDK 64-Bit Server VM, Java 17.0.9).
> Type in expressions for evaluation. Or try :help.
> scala> Array(1,2,3).toSeq
> res0: Seq[Int] = WrappedArray(1, 2, 3)
> ```
> However, in Scala 2.13, it returns an `immutable.ArraySeq`, which does involve a collection copy.
> Since this explicit conversion has always relied on the non-copying behavior in the Scala 2.12 era, it is safe to assume that no collection copy is needed in Scala 2.13 either.
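>
> A minimal sketch of the difference on Scala 2.13, and of the proposed replacement (the demo object name is illustrative; `ArraySeq.unsafeWrapArray` is the standard-library call):
>
> ```scala
> import scala.collection.immutable.ArraySeq
>
> object ToSeqDemo {
>   def main(args: Array[String]): Unit = {
>     val arr = Array(1, 2, 3)
>     // Scala 2.13: ArrayOps#toSeq copies the array into a new immutable.ArraySeq
>     val copied: Seq[Int] = arr.toSeq
>     // unsafeWrapArray wraps the existing array without copying,
>     // matching the non-copying WrappedArray behavior of Scala 2.12.
>     // It is "unsafe" only if the underlying array is mutated afterwards,
>     // which we do below purely to demonstrate the lack of a copy.
>     val wrapped: Seq[Int] = ArraySeq.unsafeWrapArray(arr)
>     arr(0) = 99
>     println(copied(0))  // the copy is unaffected by the mutation
>     println(wrapped(0)) // the wrapper observes the mutation
>   }
> }
> ```
>
> Because the callers in question never mutate the source array after the conversion, swapping `toSeq` for `unsafeWrapArray` preserves the 2.12 semantics while avoiding the copy.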



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org