Posted to user@spark.apache.org by AssafMendelson <as...@rsa.com> on 2016/09/29 08:08:38 UTC

building runnable distribution from source

Hi,
I am trying to compile the latest branch of Spark in order to try out some code I want to contribute.

I was looking at the instructions to build from http://spark.apache.org/docs/latest/building-spark.html
So at first I did:
./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
This worked without a problem and compiled.

I then did
./dev/make-distribution.sh --name custom-spark --tgz -e -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
This failed.
(I added the -e flag because the first run, without it, suggested adding it to get more information.)
If I look at the compilation output itself, it shows no error messages for Spark Project Core:

[INFO] Building Spark Project Core 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project YARN Shuffle Service 2.1.0-SNAPSHOT
[INFO] -----------------------------------------------------------------------

However, when I reach the summary I find that core has failed to compile.
Below are the messages from the end of the build, but I can't find any direct error.
I tried googling this but found no solution. Could anyone point me to a fix?


[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-core_2.11 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 74 source files to /home/mendea3/git/spark/core/target/scala-2.11/classes
[INFO]
[INFO] --- exec-maven-plugin:1.4.0:exec (sparkr-pkg) @ spark-core_2.11 ---
Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  4.165 s]
[INFO] Spark Project Tags ................................. SUCCESS [  5.163 s]
[INFO] Spark Project Sketch ............................... SUCCESS [  7.393 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 18.929 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.528 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 14.453 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 15.198 s]
[INFO] Spark Project Core ................................. FAILURE [ 57.641 s]
[INFO] Spark Project ML Local Library ..................... SUCCESS [ 10.561 s]
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SUCCESS [  4.188 s]
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 16.128 s]
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SUCCESS [  9.855 s]
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Spark Project Java 8 Tests ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:52 min (Wall Clock)
[INFO] Finished at: 2016-09-29T10:48:57+03:00
[INFO] Final Memory: 49M/771M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (sparkr-pkg) on project spark-core_2.11: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (sparkr-pkg) on project spark-core_2.11: Command execution failed.
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:185)
        at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:181)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.maven.plugin.MojoExecutionException: Command execution failed.
        at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:276)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
        ... 11 more
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
        at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
        at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
        at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:660)
        at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:265)
        ... 13 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.11





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/building-runnable-distribution-from-source-tp27808.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: building runnable distribution from source

Posted by Michael Segel <ms...@hotmail.com>.
You may want to replace the 2.4 with a later release.
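Concretely, that suggestion might look like the following (a sketch only: the hadoop-2.7 profile and 2.7.3 version are assumptions for illustration; check which -Phadoop-* profiles the pom.xml in your checkout actually defines before running):

```shell
# Same make-distribution invocation, but with a newer Hadoop profile.
# hadoop-2.7 / 2.7.3 are illustrative; match them to your pom.xml.
./dev/make-distribution.sh --name custom-spark --tgz -e \
  -Psparkr -Phadoop-2.7 -Dhadoop.version=2.7.3 \
  -Phive -Phive-thriftserver -Pyarn
```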



RE: building runnable distribution from source

Posted by "Mendelson, Assaf" <As...@rsa.com>.
Thanks, that solved it.
If any developers are reading: it would be useful for this message to be logged as an ERROR rather than INFO, especially since it causes Spark Project Core to fail rather than just the R package build.
Thanks,
	Assaf.

-----Original Message-----
From: Ding Fei [mailto:dingfei@stars.org.cn] 
Sent: Thursday, September 29, 2016 1:20 PM
To: Mendelson, Assaf
Cc: user@spark.apache.org
Subject: Re: building runnable distribution from source

Check that your R is properly installed:

> Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.








Re: building runnable distribution from source

Posted by Ding Fei <di...@stars.org.cn>.
Check that your R is properly installed:

> Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.
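To check, something like this works on most Linux hosts (the apt-get line is an assumption for Debian/Ubuntu; use your distro's package manager otherwise):

```shell
# Is R on the PATH, and where does R think its home directory is?
if command -v R >/dev/null 2>&1; then
  echo "R binary: $(command -v R)"
  R RHOME   # prints the R home directory, suitable for R_HOME
else
  echo "R not installed; e.g. on Debian/Ubuntu: sudo apt-get install r-base"
fi
# If the build still can't find it, export R_HOME explicitly:
#   export R_HOME="$(R RHOME)"
```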








---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org