Posted to issues@spark.apache.org by "Yuming Wang (Jira)" <ji...@apache.org> on 2019/11/16 11:51:00 UTC

[jira] [Comment Edited] (SPARK-29925) Maven Build fails with Hadoop Version 3.2.0

    [ https://issues.apache.org/jira/browse/SPARK-29925?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16975681#comment-16975681 ] 

Yuming Wang edited comment on SPARK-29925 at 11/16/19 11:50 AM:
----------------------------------------------------------------

You should build with the {{hadoop-3.2}} profile: https://github.com/apache/spark/commit/90c64ea4194ed7d5e1b315b3287f64dc661c8963#diff-e700812356511df02cda7d3ccd38ca02
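For example, a build that activates the profile (a minimal sketch assuming an otherwise default build; add any other flags you normally use):
{code:java}
./build/mvn -Phadoop-3.2 -DskipTests package
{code}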


was (Author: q79969786):
You should build with the {{hadoop-3.2}} profile: https://github.com/apache/spark/blob/f77c10de38d0563b2e42d1200a1fbbdb3018c2e9/pom.xml#L2919-L2942

> Maven Build fails with Hadoop Version 3.2.0
> -------------------------------------------
>
>                 Key: SPARK-29925
>                 URL: https://issues.apache.org/jira/browse/SPARK-29925
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.1.0
>         Environment: The build was tested in two environments. The first was Debian 10 running OpenJDK 11 with Scala 2.12. The second was Debian 9.1 with OpenJDK 8 and Scala 2.12.
> The same error occurred in both environments. 
> Both environments used Linux kernel 4.19 and were VirtualBox VMs running on a MacBook.
>            Reporter: Douglas Colkitt
>            Priority: Minor
>
> The build fails at the Spark Core stage when using Maven with Hadoop version 3.2 specified. The build command run is:
> {code:java}
> ./build/mvn -DskipTests -Dhadoop.version=3.2.0 package
> {code}
> The build error output is:
> {code:java}
> [INFO] 
> [INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
> [INFO] Using incremental compilation using Mixed compile order
> [INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
> [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
> [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
> [ERROR] two errors found
> {code}
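> Both errors point at commons-lang3 symbols that are missing from the test classpath under this flag combination. A minimal sketch of the call the benchmark exercises (paraphrased from the error lines, assuming the standard commons-lang3 {{SerializationUtils.clone}} API, not the exact Spark source):
> {code:java}
> import java.util.Properties
> // Fails to resolve when commons-lang3 is not on the classpath:
> import org.apache.commons.lang3.SerializationUtils
>
> object PropertiesCloneSketch {
>   def main(args: Array[String]): Unit = {
>     val props = new Properties()
>     props.setProperty("spark.master", "local")
>     // Deep copy via Java serialization; this is the call the benchmark times.
>     val copy: Properties = SerializationUtils.clone(props)
>     println(copy.getProperty("spark.master"))
>   }
> }
> {code}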
> The problem does _not_ occur when building without specifying a Hadoop version, i.e. when running:
> {code:java}
> ./build/mvn -DskipTests package
> {code}
>  


