Posted to dev@spark.apache.org by Marcelo Vanzin <va...@cloudera.com> on 2016/04/05 05:00:42 UTC

Build changes after SPARK-13579

Hey all,

We merged SPARK-13579 today, and if you're like me and have your
hands automatically type "sbt assembly" anytime you're building Spark,
that won't work anymore.

You should now use "sbt package"; you'll still need "sbt assembly" if
you require one of the remaining assemblies (streaming connectors,
yarn shuffle service).
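
For example, the day-to-day build now looks something like this (a
sketch, run from the Spark source root; the exact set of remaining
assembly modules may differ):

  # build all of Spark without producing the old uber-jar
  build/sbt package

  # still needed only for the remaining assemblies,
  # e.g. the YARN shuffle service
  build/sbt assembly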


-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Build changes after SPARK-13579

Posted by Michael Gummelt <mg...@mesosphere.io>.
This line: "build/sbt clean assembly"

should also be changed, right?
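
(Presumably to something like "build/sbt clean package"; that's just my
reading of the announcement above, not a verified command.)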

On Tue, Jul 19, 2016 at 1:18 AM, Sean Owen <so...@cloudera.com> wrote:

> If the change is just to replace "sbt assembly/assembly" with "sbt
> package", done. LMK if there are more edits.
>
> On Mon, Jul 18, 2016 at 10:00 PM, Michael Gummelt
> <mg...@mesosphere.io> wrote:
> > I just flailed on this a bit before finding this email.  Can someone
> please
> > update
> >
> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup
> >
> > On Mon, Apr 4, 2016 at 10:01 PM, Reynold Xin <rx...@databricks.com>
> wrote:
> >>
> >> pyspark and R
> >>
> >> On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin <va...@cloudera.com>
> >> wrote:
> >>>
> >>> No, tests (except pyspark) should work without having to package
> anything
> >>> first.
> >>>
> >>> On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <ko...@tresata.com>
> wrote:
> >>> > do i need to run sbt package before doing tests?
> >>> >
> >>> > On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <vanzin@cloudera.com
> >
> >>> > wrote:
> >>> >>
> >>> >> Hey all,
> >>> >>
> >>> >> We merged SPARK-13579 today, and if you're like me and have your
> >>> >> hands automatically type "sbt assembly" anytime you're building
> Spark,
> >>> >> that won't work anymore.
> >>> >>
> >>> >> You should now use "sbt package"; you'll still need "sbt assembly"
> if
> >>> >> you require one of the remaining assemblies (streaming connectors,
> >>> >> yarn shuffle service).
> >>> >>
> >>> >>
> >>> >> --
> >>> >> Marcelo
> >>> >>
> >>> >>
> ---------------------------------------------------------------------
> >>> >> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> >>> >> For additional commands, e-mail: dev-help@spark.apache.org
> >>> >>
> >>> >
> >>>
> >>>
> >>>
> >>> --
> >>> Marcelo
> >>>
> >>> ---------------------------------------------------------------------
> >>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> >>> For additional commands, e-mail: dev-help@spark.apache.org
> >>>
> >>
> >
> >
> >
> > --
> > Michael Gummelt
> > Software Engineer
> > Mesosphere
>



-- 
Michael Gummelt
Software Engineer
Mesosphere

Re: Build changes after SPARK-13579

Posted by Sean Owen <so...@cloudera.com>.
If the change is just to replace "sbt assembly/assembly" with "sbt
package", done. LMK if there are more edits.

On Mon, Jul 18, 2016 at 10:00 PM, Michael Gummelt
<mg...@mesosphere.io> wrote:
> I just flailed on this a bit before finding this email.  Can someone please
> update
> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup
>
> On Mon, Apr 4, 2016 at 10:01 PM, Reynold Xin <rx...@databricks.com> wrote:
>>
>> pyspark and R
>>
>> On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin <va...@cloudera.com>
>> wrote:
>>>
>>> No, tests (except pyspark) should work without having to package anything
>>> first.
>>>
>>> On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <ko...@tresata.com> wrote:
>>> > do i need to run sbt package before doing tests?
>>> >
>>> > On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <va...@cloudera.com>
>>> > wrote:
>>> >>
>>> >> Hey all,
>>> >>
>>> >> We merged SPARK-13579 today, and if you're like me and have your
>>> >> hands automatically type "sbt assembly" anytime you're building Spark,
>>> >> that won't work anymore.
>>> >>
>>> >> You should now use "sbt package"; you'll still need "sbt assembly" if
>>> >> you require one of the remaining assemblies (streaming connectors,
>>> >> yarn shuffle service).
>>> >>
>>> >>
>>> >> --
>>> >> Marcelo
>>> >>
>>> >> ---------------------------------------------------------------------
>>> >> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> >> For additional commands, e-mail: dev-help@spark.apache.org
>>> >>
>>> >
>>>
>>>
>>>
>>> --
>>> Marcelo
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>
>
>
>
> --
> Michael Gummelt
> Software Engineer
> Mesosphere

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org


Re: Build changes after SPARK-13579

Posted by Michael Gummelt <mg...@mesosphere.io>.
I just flailed on this a bit before finding this email.  Can someone please
update
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup

On Mon, Apr 4, 2016 at 10:01 PM, Reynold Xin <rx...@databricks.com> wrote:

> pyspark and R
>
> On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin <va...@cloudera.com>
> wrote:
>
>> No, tests (except pyspark) should work without having to package anything
>> first.
>>
>> On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <ko...@tresata.com> wrote:
>> > do i need to run sbt package before doing tests?
>> >
>> > On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <va...@cloudera.com>
>> wrote:
>> >>
>> >> Hey all,
>> >>
>> >> We merged SPARK-13579 today, and if you're like me and have your
>> >> hands automatically type "sbt assembly" anytime you're building Spark,
>> >> that won't work anymore.
>> >>
>> >> You should now use "sbt package"; you'll still need "sbt assembly" if
>> >> you require one of the remaining assemblies (streaming connectors,
>> >> yarn shuffle service).
>> >>
>> >>
>> >> --
>> >> Marcelo
>> >>
>> >> ---------------------------------------------------------------------
>> >> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> >> For additional commands, e-mail: dev-help@spark.apache.org
>> >>
>> >
>>
>>
>>
>> --
>> Marcelo
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
>


-- 
Michael Gummelt
Software Engineer
Mesosphere

Re: Build changes after SPARK-13579

Posted by Reynold Xin <rx...@databricks.com>.
pyspark and R
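
So for those you need to package first, e.g. (a sketch assuming the
standard test scripts shipped in the source tree):

  build/sbt package   # build the jars the Python/R tests run against
  python/run-tests    # pyspark tests
  R/run-tests.sh      # SparkR tests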

On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin <va...@cloudera.com> wrote:

> No, tests (except pyspark) should work without having to package anything
> first.
>
> On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <ko...@tresata.com> wrote:
> > do i need to run sbt package before doing tests?
> >
> > On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <va...@cloudera.com>
> wrote:
> >>
> >> Hey all,
> >>
> >> We merged SPARK-13579 today, and if you're like me and have your
> >> hands automatically type "sbt assembly" anytime you're building Spark,
> >> that won't work anymore.
> >>
> >> You should now use "sbt package"; you'll still need "sbt assembly" if
> >> you require one of the remaining assemblies (streaming connectors,
> >> yarn shuffle service).
> >>
> >>
> >> --
> >> Marcelo
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> >> For additional commands, e-mail: dev-help@spark.apache.org
> >>
> >
>
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: Build changes after SPARK-13579

Posted by Marcelo Vanzin <va...@cloudera.com>.
No, tests (except pyspark) should work without having to package anything first.
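
For example, running a single Scala suite straight from a clean checkout
should work (the suite name here is only illustrative):

  build/sbt "core/testOnly *RDDSuite"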

On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <ko...@tresata.com> wrote:
> do i need to run sbt package before doing tests?
>
> On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <va...@cloudera.com> wrote:
>>
>> Hey all,
>>
>> We merged SPARK-13579 today, and if you're like me and have your
>> hands automatically type "sbt assembly" anytime you're building Spark,
>> that won't work anymore.
>>
>> You should now use "sbt package"; you'll still need "sbt assembly" if
>> you require one of the remaining assemblies (streaming connectors,
>> yarn shuffle service).
>>
>>
>> --
>> Marcelo
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Build changes after SPARK-13579

Posted by Koert Kuipers <ko...@tresata.com>.
do i need to run sbt package before doing tests?

On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <va...@cloudera.com> wrote:

> Hey all,
>
> We merged SPARK-13579 today, and if you're like me and have your
> hands automatically type "sbt assembly" anytime you're building Spark,
> that won't work anymore.
>
> You should now use "sbt package"; you'll still need "sbt assembly" if
> you require one of the remaining assemblies (streaming connectors,
> yarn shuffle service).
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>