Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/06/27 04:53:00 UTC

[jira] [Commented] (SPARK-28161) Can't build Spark 2.4.3 based on Ubuntu and Oracle Java 8 SDK v212

    [ https://issues.apache.org/jira/browse/SPARK-28161?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16873813#comment-16873813 ] 

Hyukjin Kwon commented on SPARK-28161:
--------------------------------------

Can you show the full error messages? Also, let's ask questions on the mailing list before filing them as issues; you would get a better answer there. I don't think this is an issue within Spark for now; it sounds more like an environment issue.
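
For reference, a minimal sketch of how the full trace could be captured and shared (it simply combines the -e/-X flags that the Maven output below suggests; the log file name is just an example):

{code}
# Re-run the failing build with full error details and debug logging,
# saving a copy of the output to attach to the mailing-list thread.
./build/mvn -e -X -DskipTests clean package 2>&1 | tee build-debug.log
{code}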

> Can't build Spark 2.4.3 based on Ubuntu and Oracle Java 8 SDK v212
> ------------------------------------------------------------------
>
>                 Key: SPARK-28161
>                 URL: https://issues.apache.org/jira/browse/SPARK-28161
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.4.3
>         Environment: {code}
> # Dockerfile
> # Pull base image.
> FROM ubuntu:16.04
> RUN apt update --fix-missing
> RUN apt-get install -y software-properties-common
> RUN mkdir /usr/java
> ADD jdk-8u212-linux-x64.tar /usr/java
> ENV JAVA_HOME=/usr/java/jdk1.8.0_212
> RUN update-alternatives --install /usr/bin/java java ${JAVA_HOME%*/}/bin/java 20000
> RUN update-alternatives --install /usr/bin/javac javac ${JAVA_HOME%*/}/bin/javac 20000
> ENV PATH="${PATH}:/usr/java/jdk1.8.0_212/bin"
> ENV MAVEN_VERSION 3.6.1
> RUN apt-get install -y curl wget
> RUN curl -fsSL http://archive.apache.org/dist/maven/maven-3/${MAVEN_VERSION}/binaries/apache-maven-${MAVEN_VERSION}-bin.tar.gz | tar xzf - -C /usr/share \
>  && mv /usr/share/apache-maven-${MAVEN_VERSION} /usr/share/maven \
>  && ln -s /usr/share/maven/bin/mvn /usr/bin/mvn
> ENV MAVEN_HOME /usr/share/maven
> ENV MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
> ENV SPARK_SRC="/usr/src/spark"
> ENV BRANCH="v2.4.3"
> RUN apt-get update && apt-get install -y --no-install-recommends \
>  git python3 python3-setuptools r-base-dev r-cran-evaluate
> RUN mkdir -p $SPARK_SRC
> RUN git clone --branch $BRANCH https://github.com/apache/spark $SPARK_SRC
> WORKDIR $SPARK_SRC
> RUN ./build/mvn -DskipTests clean package
> {code}
>            Reporter: Martin Nigsch
>            Priority: Minor
>
> Building Spark from source with the attached Dockerfile (run locally with Docker on OSX) fails.
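> For reference, a minimal sketch of how the image is built and the build kicked off (the image tag is made up for illustration):
> {code}
> # Build the image; the final RUN step in the Dockerfile runs the Maven build.
> docker build -t spark-2.4.3-build .
> {code}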
> Attempts to change/add the following things, beyond what's recommended on the build page, bring no improvement (see the sketch after this list):
> 1. adding RUN ./dev/change-scala-version.sh 2.11 to the Dockerfile --> doesn't help
> 2. editing the pom.xml to exclude zinc, as in one of the answers at https://stackoverflow.com/questions/28004552/problems-while-compiling-spark-with-maven/41223558 --> doesn't help
> 3. adding the option -DrecompileMode=all --> doesn't help
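> For illustration, a reconstructed sketch of how attempts 1 and 3 were added to the Dockerfile (the exact placement of the lines is an assumption; the modified Dockerfile itself is not attached):
> {code}
> # Attempt 1: switch the build to Scala 2.11 before compiling
> RUN ./dev/change-scala-version.sh 2.11
> # Attempt 3: pass -DrecompileMode=all to the Maven build
> RUN ./build/mvn -DskipTests -DrecompileMode=all clean package
> {code}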
> I downloaded the JDK directly from Oracle (jdk-8u212-linux-x64.tar) and manually placed it into /usr/java, since the Oracle JDK seems to be the recommended one.
> The build fails at the Spark Project SQL module with:
> {code}
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary for Spark Project Parent POM 2.4.3:
> [INFO]
> [INFO] Spark Project Parent POM ........................... SUCCESS [ 59.875 s]
> [INFO] Spark Project Tags ................................. SUCCESS [ 20.386 s]
> [INFO] Spark Project Sketch ............................... SUCCESS [ 3.026 s]
> [INFO] Spark Project Local DB ............................. SUCCESS [ 5.654 s]
> [INFO] Spark Project Networking ........................... SUCCESS [ 7.401 s]
> [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 3.400 s]
> [INFO] Spark Project Unsafe ............................... SUCCESS [ 6.306 s]
> [INFO] Spark Project Launcher ............................. SUCCESS [ 17.471 s]
> [INFO] Spark Project Core ................................. SUCCESS [02:36 min]
> [INFO] Spark Project ML Local Library ..................... SUCCESS [ 50.313 s]
> [INFO] Spark Project GraphX ............................... SUCCESS [ 21.097 s]
> [INFO] Spark Project Streaming ............................ SUCCESS [ 52.537 s]
> [INFO] Spark Project Catalyst ............................. SUCCESS [02:44 min]
> [INFO] Spark Project SQL .................................. FAILURE [10:44 min]
> [INFO] Spark Project ML Library ........................... SKIPPED
> [INFO] Spark Project Tools ................................ SKIPPED
> [INFO] Spark Project Hive ................................. SKIPPED
> [INFO] Spark Project REPL ................................. SKIPPED
> [INFO] Spark Project Assembly ............................. SKIPPED
> [INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
> [INFO] Kafka 0.10+ Source for Structured Streaming ........ SKIPPED
> [INFO] Spark Project Examples ............................. SKIPPED
> [INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
> [INFO] Spark Avro ......................................... SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 20:15 min
> [INFO] Finished at: 2019-06-25T09:45:49Z
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-sql_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR] mvn <goals> -rf :spark-sql_2.11
> The command '/bin/sh -c ./build/mvn -DskipTests clean package' returned a non-zero code: 1
> {code}
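> For completeness, a sketch of the resume command that the Maven output suggests, with <goals> filled in from the original invocation:
> {code}
> # Resume the build from the failed module, keeping the original goals and flags.
> ./build/mvn -DskipTests clean package -rf :spark-sql_2.11
> {code}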
> Any help? I've been stuck on this for two days, hence this issue.


