Posted to issues@spark.apache.org by "Damian Momot (JIRA)" <ji...@apache.org> on 2018/11/13 10:16:00 UTC

[jira] [Created] (SPARK-26037) ./dev/make-distribution.sh fails for Scala 2.12

Damian Momot created SPARK-26037:
------------------------------------

             Summary: ./dev/make-distribution.sh fails for Scala 2.12
                 Key: SPARK-26037
                 URL: https://issues.apache.org/jira/browse/SPARK-26037
             Project: Spark
          Issue Type: Bug
          Components: Build
    Affects Versions: 2.4.0
            Reporter: Damian Momot


Trying to create a Spark distribution for Scala 2.12 and Spark 2.4.0.

First, add the Cloudera repository:
{code:java}
<repository>
  <id>cloudera</id>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
{code}
Then try to make the distribution:
{code:java}
./dev/change-scala-version.sh 2.12
./dev/make-distribution.sh --name scala_2.12-cdh5.15.1 --tgz -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.15.1 -DskipTests -Phive -Phive-thriftserver -Pyarn -Dscala-2.12
{code}
I'm getting multiple failures in Spark Project Sketch. They all look the same; here's the first for reference:
{code:java}
~/spark-2.4.0/common/sketch/src/test/scala/org/apache/spark/util/sketch/BitArraySuite.scala:32: exception during macro expansion: 
java.lang.ClassNotFoundException: scala.runtime.LazyRef
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.scalactic.MacroOwnerRepair$Utils.repairOwners(MacroOwnerRepair.scala:66)
at org.scalactic.MacroOwnerRepair.repairOwners(MacroOwnerRepair.scala:46)
at org.scalactic.BooleanMacro.genMacro(BooleanMacro.scala:837)
at org.scalatest.AssertionsMacro$.assert(AssertionsMacro.scala:34)
assert(new BitArray(64).bitSize() == 64)
{code}
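As a hedged aside (the demo object below is mine, not from the build): {{scala.runtime.LazyRef}} is a runtime class introduced in Scala 2.12, where the compiler desugars a method-local {{lazy val}} into a {{LazyRef}} holder. The {{ClassNotFoundException}} above is therefore consistent with 2.12-compiled macro code running against a 2.11 {{scala-library}} that is still on the classpath somewhere in the build.
{code:java}
// Minimal illustration (hypothetical, not part of the Spark build):
// on Scala 2.12 the local `lazy val` below compiles against
// scala.runtime.LazyRef; running it with a 2.11 scala-library on the
// classpath would fail with the ClassNotFoundException reported above.
object LazyRefDemo {
  def compute(): Int = {
    lazy val cached = 21 * 2 // desugared via scala.runtime.LazyRef on 2.12
    cached
  }

  def main(args: Array[String]): Unit =
    println(compute())
}
{code}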
The same commands work fine for Scala 2.11.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
