Posted to issues@spark.apache.org by "Douglas Colkitt (Jira)" <ji...@apache.org> on 2019/11/25 17:45:00 UTC

[jira] [Created] (SPARK-30029) Build fails at Spark Core with -Phadoop-3.2

Douglas Colkitt created SPARK-30029:
---------------------------------------

             Summary: Build fails at Spark Core with -Phadoop-3.2
                 Key: SPARK-30029
                 URL: https://issues.apache.org/jira/browse/SPARK-30029
             Project: Spark
          Issue Type: Bug
          Components: Build
    Affects Versions: 2.4.4
         Environment: Built with OpenJDK 11, Scala 2.12, and Maven 3.6.2.

Using Debian 10 with Kernel 4.16.0-6-amd64

Running in a VirtualBox VM hosted on a MacBook Pro.
            Reporter: Douglas Colkitt


When building from source using the following:
{code:java}
./build/mvn -DskipTests -Phadoop-3.2 clean package
{code}
the Spark Core stage fails with the following errors:
{code:java}
[INFO] --- scala-maven-plugin:4.3.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/vagrant/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__55.0-1.3.1_20191012T045515.jar
[INFO] Compiling 272 Scala sources and 27 Java sources to /home/vagrant/spark/core/target/scala-2.12/test-classes ...
[ERROR] [Error] /home/vagrant/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
[ERROR] [Error] /home/vagrant/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
[ERROR] two errors found
{code}
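For context, {{SerializationUtils}} lives in Apache Commons Lang ({{org.apache.commons.lang}} in commons-lang 2.x), so the errors suggest that artifact is dropped from the test classpath when the hadoop-3.2 profile rearranges transitive dependencies. A possible workaround, sketched here purely as an assumption (the version and scope are guesses, not a confirmed fix), would be to declare commons-lang explicitly in {{core/pom.xml}}:
{code:xml}
<!-- Hypothetical workaround, not a confirmed fix: declare commons-lang
     explicitly so the test source that imports
     org.apache.commons.lang.SerializationUtils still compiles when the
     hadoop-3.2 profile no longer pulls it in transitively.
     Version 2.6 is an assumption. -->
<dependency>
  <groupId>commons-lang</groupId>
  <artifactId>commons-lang</artifactId>
  <version>2.6</version>
  <scope>test</scope>
</dependency>
{code}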
Strangely, resuming the failed build with the following command does result in the Spark Core stage completing successfully:
{code:java}
./build/mvn -DskipTests -Phadoop-3.2 clean package -rf :spark-core_2.12
{code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org