Posted to issues@spark.apache.org by "Peter Kaiser (Jira)" <ji...@apache.org> on 2021/03/15 16:04:00 UTC

[jira] [Created] (SPARK-34746) Spark dependencies require Scala 2.12.12

Peter Kaiser created SPARK-34746:
------------------------------------

             Summary: Spark dependencies require Scala 2.12.12
                 Key: SPARK-34746
                 URL: https://issues.apache.org/jira/browse/SPARK-34746
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.1.1
            Reporter: Peter Kaiser


In our application we create a Spark session programmatically. The application is built with Gradle.
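
For reference, a minimal sketch of how such a session is created; the master URL and app name below are placeholders, not taken from the actual application:
{noformat}
import org.apache.spark.sql.SparkSession

object SessionExample {
  def main(args: Array[String]): Unit = {
    // The executors run the Scala version bundled with the Spark 3.1.1
    // distribution (2.12.10); the driver uses whatever scala-library
    // version Gradle resolved for the application.
    val spark = SparkSession.builder()
      .master("spark://example-master:7077") // placeholder master URL
      .appName("example-app")                // placeholder app name
      .getOrCreate()

    spark.range(10).count() // any action that ships closures to the executors
    spark.stop()
  }
}
{noformat}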

After upgrading Spark to 3.1.1 the application no longer works, due to incompatible classes on the driver and the executors (namely scala.collection.mutable.WrappedArray.ofRef).



It turned out this was caused by different Scala versions on the driver and the executors. While Spark still ships with Scala 2.12.10, some of its dependencies in the Gradle build require Scala 2.12.12:
{noformat}
Cannot find a version of 'org.scala-lang:scala-library' that satisfies the version constraints:
Dependency path '...' --> '...' --> 'org.scala-lang:scala-library:{strictly 2.12.10}'
Dependency path '...' --> 'org.apache.spark:spark-core_2.12:3.1.1' --> 'org.json4s:json4s-jackson_2.12:3.7.0-M5' --> 'org.scala-lang:scala-library:2.12.12' {noformat}
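
To confirm a mismatch like this at runtime, the Scala library version seen by the driver can be compared with the one seen by the executors. The snippet below is only a diagnostic sketch and assumes an existing session named spark, like the one created above:
{noformat}
import scala.util.Properties

// Scala version on the driver (resolved by the Gradle build, e.g. 2.12.12)
println(s"driver scala-library: ${Properties.versionNumberString}")

// Scala version(s) on the executors (bundled with the Spark distribution, e.g. 2.12.10)
val executorVersions = spark.sparkContext
  .parallelize(1 to 100)
  .map(_ => Properties.versionNumberString)
  .distinct()
  .collect()

println(s"executor scala-library: ${executorVersions.mkString(", ")}")
{noformat}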
 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
