Posted to user@spark.apache.org by Raghuveer Chanda <ra...@gmail.com> on 2015/10/30 07:34:34 UTC

Spark 1.5.1 Build Failure

Hi,

I am trying to build Spark 1.5.1 against Hadoop 2.5, but I get the following
error.


build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.0-cdh5.3.2 -DskipTests clean package


[INFO] Spark Project Parent POM ........................... SUCCESS [  9.812 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 27.701 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 16.721 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  8.617 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 27.124 s]
[INFO] Spark Project Core ................................. FAILURE [09:08 min]

Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile
(scala-test-compile-first) on project spark-core_2.10: Execution
scala-test-compile-first of goal
net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile failed.
CompileFailed -> [Help 1]



-- 
Regards,
Raghuveer Chanda

Re: Spark 1.5.1 Build Failure

Posted by Raghuveer Chanda <ra...@gmail.com>.
There seems to be an error at the zinc server. How can I shut the zinc
server down completely? Running build/zinc-0.3.5.3/bin/zinc -shutdown does
shut it down, but it restarts again with the next build/mvn command.
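
For reference, a minimal sketch (paths taken from the command above; the process pattern is an assumption) of what I mean by shutting zinc down and checking whether it is really gone:

```shell
# Sketch, assuming the zinc 0.3.5.3 bundle that the Spark build scripts
# place under build/ in the source tree.
ZINC=build/zinc-0.3.5.3/bin/zinc
if [ -x "$ZINC" ]; then
    "$ZINC" -shutdown                  # ask the running zinc server to exit
fi
# Check whether any zinc process is still alive. Note that build/mvn starts
# a fresh zinc on its next run, so "stopped" only holds until the next build.
zinc_status=$(pgrep -f "zinc" >/dev/null 2>&1 && echo running || echo stopped)
echo "zinc is $zinc_status"
```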



Error in debug mode:

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) on project spark-core_2.10: Execution scala-test-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile failed. CompileFailed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) on project spark-core_2.10: Execution scala-test-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile failed.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-test-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile failed.
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
    ... 20 more
Caused by: Compile failed via zinc server
    at sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136)
    at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86)
    at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
    at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
    at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
    at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
    at scala_maven.ScalaTestCompileMojo.execute(ScalaTestCompileMojo.java:48)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
    ... 21 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.10

Regards,
Raghuveer

On Fri, Oct 30, 2015 at 1:18 PM, Raghuveer Chanda <raghuveer.chanda@gmail.com> wrote:

> [quoted messages trimmed; they appear in full later in this thread]



-- 
Regards,
Raghuveer Chanda
Computer Science and Engineering
IIT Kharagpur
+91-9475470374

Re: Spark 1.5.1 Build Failure

Posted by Raghuveer Chanda <ra...@gmail.com>.
Thanks for the reply.

I am using the mvn and scala bundled with the source tree (build/mvn) only,
and I get the same error even without the hadoop options after running
clean package.


Java Version:

rchanda@ubuntu:~/Downloads/spark-1.5.1$ java -version
java version "1.7.0_85"
OpenJDK Runtime Environment (IcedTea 2.6.1) (7u85-2.6.1-5ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode)

Complete Error:

rchanda@ubuntu:~/Downloads/spark-1.5.1$ build/mvn -DskiptTests clean package
Using `mvn` from path: /home/rchanda/Downloads/spark-1.5.1/build/apache-maven-3.3.3/bin/mvn
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Spark Project Parent POM
[INFO] Spark Project Launcher
[INFO] Spark Project Networking
[INFO] Spark Project Shuffle Streaming Service
[INFO] Spark Project Unsafe
[INFO] Spark Project Core
[INFO] Spark Project Bagel
[INFO] Spark Project GraphX
[INFO] Spark Project Streaming
[INFO] Spark Project Catalyst
[INFO] Spark Project SQL
[INFO] Spark Project ML Library
[INFO] Spark Project Tools
[INFO] Spark Project Hive
[INFO] Spark Project REPL
[INFO] Spark Project Assembly
[INFO] Spark Project External Twitter
[INFO] Spark Project External Flume Sink
[INFO] Spark Project External Flume
[INFO] Spark Project External Flume Assembly
[INFO] Spark Project External MQTT
[INFO] Spark Project External MQTT Assembly
[INFO] Spark Project External ZeroMQ
[INFO] Spark Project External Kafka
[INFO] Spark Project Examples
[INFO] Spark Project External Kafka Assembly
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Parent POM 1.5.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-parent_2.10 ---
[INFO] Add Source directory: /home/rchanda/Downloads/spark-1.5.1/src/main/scala
[INFO] Add Test Source directory: /home/rchanda/Downloads/spark-1.5.1/src/test/scala
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-parent_2.10 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-parent_2.10 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/rchanda/Downloads/spark-1.5.1/target/tmp
[INFO] Executed tasks
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-parent_2.10 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-parent_2.10 ---
Discovery starting.
Discovery completed in 178 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 403 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO]
[INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @ spark-parent_2.10 ---
[INFO] Building jar: /home/rchanda/Downloads/spark-1.5.1/target/spark-parent_2.10-1.5.1-tests.jar
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-parent_2.10 ---
[INFO]
[INFO] --- maven-shade-plugin:2.4.1:shade (default) @ spark-parent_2.10 ---
[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO]
[INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ spark-parent_2.10 ---
[INFO]
[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ spark-parent_2.10 ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Launcher 1.5.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-launcher_2.10 ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-launcher_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-launcher_2.10 ---
[INFO] Add Source directory: /home/rchanda/Downloads/spark-1.5.1/launcher/src/main/scala
[INFO] Add Test Source directory: /home/rchanda/Downloads/spark-1.5.1/launcher/src/test/scala
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-launcher_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-launcher_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/rchanda/Downloads/spark-1.5.1/launcher/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-launcher_2.10 ---
[INFO] Using zinc server for incremental compilation
[info] Compiling 8 Java sources to /home/rchanda/Downloads/spark-1.5.1/launcher/target/scala-2.10/classes...
[info] Error occurred during initialization of VM
[info] java.lang.Error: Properties init: Could not determine current working directory.
[info] at java.lang.System.initProperties(Native Method)
[info] at java.lang.System.initializeSystemClass(System.java:1119)
[info]
[error] Compile failed at Oct 29, 2015 2:11:26 PM [0.110s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  9.628 s]
[INFO] Spark Project Launcher ............................. FAILURE [  7.573 s]
[INFO] Spark Project Networking ........................... SKIPPED
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Unsafe ............................... SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19.258 s
[INFO] Finished at: 2015-10-29T14:11:26-07:00
[INFO] Final Memory: 39M/94M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-launcher_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-launcher_2.10
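
The telling line in the log above is "Properties init: Could not determine current working directory." That JVM startup error typically means the process was launched from a directory that no longer exists, for example one removed by an earlier clean. A minimal sketch of that failure mode, assuming Linux and a POSIX shell (`probe` is just a scratch filename):

```shell
# Sketch: a process whose working directory has been deleted can no longer
# resolve its cwd -- the same condition behind the JVM's
# "Properties init: Could not determine current working directory."
d=$(mktemp -d)
cd "$d"
rmdir "$d"                  # the directory vanishes underneath us
# Creating a file in the deleted directory fails with ENOENT.
cwd_status=$(touch probe 2>/dev/null && echo alive || echo gone)
echo "working directory: $cwd_status"
cd /                        # recover before running anything else
```

If zinc was started from a since-deleted directory, shutting it down and re-running the build from an existing directory should clear this particular error.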



On Fri, Oct 30, 2015 at 12:55 PM, Jia Zhan <zh...@gmail.com> wrote:

> [quoted messages trimmed; they appear in full elsewhere in this thread]


-- 
Regards,
Raghuveer Chanda
Computer Science and Engineering
IIT Kharagpur
+91-9475470374

Re: Spark 1.5.1 Build Failure

Posted by Jia Zhan <zh...@gmail.com>.
Hi,

Have you tried building it without hadoop?

$ build/mvn -DskipTests clean package

Can you check whether build/mvn started successfully, or whether it is using
your own mvn? Let us know your JDK version as well.
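
A quick sketch (assuming a POSIX shell) of the checks I mean:

```shell
# Sketch: see what `mvn` resolves to on the PATH, and which JDK is active.
# The build/mvn wrapper prints "Using `mvn` from path: ..." at startup,
# which tells you whether it picked a system mvn or its own download.
mvn_on_path=$(command -v mvn || echo "none (build/mvn downloads its own)")
echo "mvn resolved to: $mvn_on_path"
jdk_line=$(java -version 2>&1 | head -n 1)
echo "jdk: $jdk_line"
```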

On Thu, Oct 29, 2015 at 11:34 PM, Raghuveer Chanda <raghuveer.chanda@gmail.com> wrote:

> [original message quoted in full; trimmed here, see the top of this thread]


-- 
Jia Zhan