Posted to issues@spark.apache.org by "Douglas Colkitt (Jira)" <ji...@apache.org> on 2019/11/16 09:33:00 UTC

[jira] [Updated] (SPARK-29925) Maven Build fails with Hadoop Version 3.2.0

     [ https://issues.apache.org/jira/browse/SPARK-29925?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Douglas Colkitt updated SPARK-29925:
------------------------------------
    Description: 
The build fails at the Spark Core stage when running Maven with an explicitly specified Hadoop version. The build command run is:
{code:java}
./build/mvn -DskipTests -Dhadoop.version=3.2.0 package
{code}
The build error output is:
{code:java}
[INFO] 
[INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
[ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
[ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
[ERROR] two errors found{code}
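Both errors point at PropertiesCloneBenchmark.scala's import and use of SerializationUtils. A plausible explanation (an assumption, not confirmed in this report) is that the class comes from commons-lang 2.x, which reaches the test classpath only transitively through Hadoop 2.x dependencies; Hadoop 3.x depends on commons-lang3 instead, so the commons-lang 2.x artifact drops off the classpath. A minimal sketch of the pattern using the commons-lang3 equivalent (the class names are inferred from the error messages; the lang3 switch is illustrative, not a confirmed fix):
{code:scala}
import java.util.Properties

// The failing test appears to import org.apache.commons.lang.SerializationUtils
// (commons-lang 2.x); the lang3 class below exposes the same clone API and is
// a direct dependency, independent of the Hadoop version.
import org.apache.commons.lang3.SerializationUtils

object PropertiesCloneSketch {
  // Deep-copies a Properties object via Java serialization, the same
  // operation the benchmark exercises.
  def cloneProperties(props: Properties): Properties =
    SerializationUtils.clone(props)

  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.setProperty("spark.app.name", "demo")
    println(cloneProperties(props).getProperty("spark.app.name"))
  }
}
{code}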
The problem does _not_ occur when building without specifying a Hadoop version, i.e. when running:
{code:java}
./build/mvn -DskipTests package
{code}
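One way to check the suspected classpath difference (a diagnostic suggestion, not part of the original report) is to compare the core module's dependency tree with and without the Hadoop version override and look for the commons-lang 2.x artifact:
{code:java}
./build/mvn -pl core dependency:tree -Dhadoop.version=3.2.0 | grep commons-lang
./build/mvn -pl core dependency:tree | grep commons-lang
{code}
If commons-lang:commons-lang appears only in the second output, the missing transitive dependency would explain both compile errors.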
 

  was:
The build fails at the Spark Core stage when running Maven with the hadoop-cloud profile and a specified Hadoop version. The build command run is:
{code:java}
./build/mvn -DskipTests -Phadoop-cloud -Dhadoop.version=3.2.0 package
{code}
The build error output is:
{code:java}
[INFO] 
[INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
[ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
[ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
[ERROR] two errors found{code}
The problem does _not_ occur when building without the hadoop-cloud profile and Hadoop version specification, i.e. when running:
{code:java}
./build/mvn -DskipTests package
{code}
 

        Summary: Maven Build fails with Hadoop Version 3.2.0  (was: Maven Build fails with flag: -Phadoop-cloud)

> Maven Build fails with Hadoop Version 3.2.0
> -------------------------------------------
>
>                 Key: SPARK-29925
>                 URL: https://issues.apache.org/jira/browse/SPARK-29925
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.1.0
>         Environment: The build was tested in two environments: Debian 10 with OpenJDK 11 and Scala 2.12, and Debian 9.1 with OpenJDK 8 and Scala 2.12.
> The same error occurred in both. Each environment ran Linux kernel 4.19 in a VirtualBox VM on a MacBook.
>            Reporter: Douglas Colkitt
>            Priority: Minor
>
> The build fails at the Spark Core stage when running Maven with an explicitly specified Hadoop version. The build command run is:
> {code:java}
> ./build/mvn -DskipTests -Dhadoop.version=3.2.0 package
> {code}
> The build error output is:
> {code:java}
> [INFO] 
> [INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
> [INFO] Using incremental compilation using Mixed compile order
> [INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
> [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
> [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
> [ERROR] two errors found{code}
> The problem does _not_ occur when building without specifying a Hadoop version, i.e. when running:
> {code:java}
> ./build/mvn -DskipTests package
> {code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
