Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2019/12/09 20:43:00 UTC

[jira] [Resolved] (SPARK-30158) Resolve Array + reference type compile problems in 2.13, with sc.parallelize

     [ https://issues.apache.org/jira/browse/SPARK-30158?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-30158.
----------------------------------
    Fix Version/s: 3.0.0
       Resolution: Fixed

Issue resolved by pull request 26787
[https://github.com/apache/spark/pull/26787]

> Resolve Array + reference type compile problems in 2.13, with sc.parallelize
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-30158
>                 URL: https://issues.apache.org/jira/browse/SPARK-30158
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core, SQL
>    Affects Versions: 3.0.0
>            Reporter: Sean R. Owen
>            Assignee: Sean R. Owen
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> Scala 2.13 changes how Arrays resolve as Seqs when the array's element type is a reference type. This primarily affects calls to {{sc.parallelize(Array(...))}} where the elements aren't primitives:
> {code}
> [ERROR] [Error] /Users/seanowen/Documents/spark_2.13/mllib/src/main/scala/org/apache/spark/mllib/pmml/PMMLExportable.scala:61: overloaded method value apply with alternatives:
>   (x: Unit,xs: Unit*)Array[Unit] <and>
>   (x: Double,xs: Double*)Array[Double] <and>
>   ...
> {code}
> This is easy to resolve by using Seq instead.
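>
> A minimal sketch of that workaround (the object name and values below are hypothetical, for illustration only; the actual changes are in pull request 26787):
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
>
> object ParallelizeSeqExample {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext(
>       new SparkConf().setAppName("parallelize-example").setMaster("local[*]"))
>
>     // Under 2.12, an Array of a reference type was implicitly adapted to the
>     // Seq parameter; under 2.13 the same call can fail to compile with an
>     // ambiguous-overload error like the one above:
>     //   val rdd = sc.parallelize(Array("a", "b", "c"))
>
>     // Passing a Seq avoids relying on the Array-to-Seq adaptation entirely:
>     val rdd = sc.parallelize(Seq("a", "b", "c"))
>     println(rdd.count())
>     sc.stop()
>   }
> }
> {code}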
> Closely related: WrappedArray is a type alias in 2.13, which makes it unusable from Java. One set of tests needs to adapt.
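>
> For context on the alias: in 2.13 the standard library declares (approximately) {{type WrappedArray[X] = ArraySeq[X]}} in the {{scala.collection.mutable}} package object, and a type alias has no class-file representation, so Java sources can't name it. A hedged sketch of naming the concrete class instead (illustrative only, not the actual test change):
> {code}
> import scala.collection.mutable
>
> object WrappedArrayAliasExample {
>   def main(args: Array[String]): Unit = {
>     // In 2.13, mutable.WrappedArray is only a deprecated alias for
>     // mutable.ArraySeq, so code shared with Java should name ArraySeq:
>     val xs: mutable.ArraySeq[String] = mutable.ArraySeq("a", "b")
>     println(xs)
>   }
> }
> {code}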



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org