Posted to user@spark.apache.org by Eli Super <el...@gmail.com> on 2016/07/19 11:22:38 UTC

I'm trying to understand how to compile Spark

Hi

I have a windows laptop

I just downloaded the Spark 1.4.1 source code.

I'm trying to compile *org.apache.spark.mllib.fpm* with *mvn*.

My goal is to replace the *original* org\apache\spark\mllib\fpm\* in
*spark-assembly-1.4.1-hadoop2.6.0.jar*

As I understand from this link

https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-Eclipse


I need to execute the following command: build/mvn package -DskipTests -pl
assembly
What I actually executed was: mvn org.apache.spark.mllib.fpm -DskipTests -pl assembly

Then I got this error:
 [INFO] Scanning for projects...
[ERROR] [ERROR] Could not find the selected project in the reactor:
assembly @

Thanks for any help

Re: I'm trying to understand how to compile Spark

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

hadoop-2.7 would be more recent. You don't need to set hadoop.version when
the defaults are fine (it's 2.7.2 for the hadoop-2.7 profile).
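
Combined with the full build invocation Jakob gives below, a hadoop-2.7
build could look like this (a sketch; whether the 1.4.1 POM actually
ships a hadoop-2.7 profile is worth double-checking):

    # hadoop.version defaults to 2.7.2 under this profile, so it can be omitted
    build/mvn -Pyarn -Phadoop-2.7 -DskipTests package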

Jacek


Re: I'm trying to understand how to compile Spark

Posted by Jakob Odersky <ja...@odersky.com>.
Hi Eli,

to build Spark, just run

    build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests package

in your source directory, where package is the actual word "package".
This will recompile the whole project, so it may take a while the
first time you run it.
Replacing a single file in an existing jar is not recommended unless
it is for a quick test, so I would also suggest giving your local
Spark build a custom version, so as to avoid any ambiguity if you
depend on it from somewhere else.

Check out this page
http://spark.apache.org/docs/1.4.1/building-spark.html for more
detailed information on the build process.
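
If the goal is to iterate on the fpm code specifically, one possible
follow-up after a first full build is to rebuild only the MLlib module
(a sketch; it assumes the module directory is named "mllib" in the
1.4.1 source tree, which is worth verifying):

    # one-time full build, installing all modules into the local repo
    build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests install

    # subsequent, much faster rebuilds of just the mllib module
    build/mvn -pl mllib -DskipTests package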

--jakob



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: I'm trying to understand how to compile Spark

Posted by Ted Yu <yu...@gmail.com>.
org.apache.spark.mllib.fpm is not a Maven goal.

-pl selects individual projects (modules) to build.

Your first build should not include -pl.
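
For example, a sequence along these lines follows that advice (a
sketch; the assembly module name is taken from your own command):

    # first build: no -pl, run from the top of the source tree
    build/mvn -DskipTests install

    # later builds: -pl may now select a single project such as assembly
    build/mvn package -DskipTests -pl assembly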
