Posted to dev@spark.apache.org by Tom Graves <tg...@yahoo.com.INVALID> on 2014/06/04 19:47:14 UTC

Re: [VOTE] Release Apache Spark 1.0.0 (RC11)

Testing... Resending as it appears my message didn't go through last week.

Tom


On Wednesday, May 28, 2014 4:12 PM, Tom Graves <tg...@yahoo.com> wrote:
 


+1. Tested Spark on YARN (cluster mode, client mode, PySpark, spark-shell) on Hadoop 0.23 and 2.4.

Tom


On Wednesday, May 28, 2014 3:07 PM, Sean McNamara <Se...@Webtrends.com> wrote:
 


Pulled down, compiled, and tested examples on OS X and Ubuntu.
Deployed the app we are building on Spark and poured data through it.

+1

Sean



On May 26, 2014, at 8:39 AM, Tathagata Das <ta...@gmail.com> wrote:

> Please vote on releasing the following candidate as Apache Spark version 1.0.0!
> 
> This has a few important bug fixes on top of rc10:
> SPARK-1900 and SPARK-1918: https://github.com/apache/spark/pull/853
> SPARK-1870: https://github.com/apache/spark/pull/848
> SPARK-1897: https://github.com/apache/spark/pull/849
> 
> The tag to be voted on is v1.0.0-rc11 (commit c69d97cd):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=c69d97cdb42f809cb71113a1db4194c21372242a
> 
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~tdas/spark-1.0.0-rc11/
> 
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/tdas.asc
> 
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1019/
> 
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/
> 
> Please vote on releasing this package as Apache Spark 1.0.0!
> 
> The vote is open until Thursday, May 29, at 16:00 UTC and passes if
> a majority of at least 3 +1 PMC votes are cast.
> 
> [ ] +1 Release this package as Apache Spark 1.0.0
> [ ] -1 Do not release this package because ...
> 
> To learn more about Apache Spark, please see
> http://spark.apache.org/
> 
> == API Changes ==
> We welcome users to compile Spark applications against 1.0. There are
> a few API changes in this release. Here are links to the associated
> upgrade guides; user-facing changes have been kept as small as
> possible.
> 
> Changes to ML vector specification:
> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/mllib-guide.html#from-09-to-10
> 
> Changes to the Java API:
> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/java-programming-guide.html#upgrading-from-pre-10-versions-of-spark
> 
> Changes to the streaming API:
> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/streaming-programming-guide.html#migration-guide-from-091-or-below-to-1x
> 
> Changes to the GraphX API:
> http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/graphx-programming-guide.html#upgrade-guide-from-spark-091
> 
> Other changes:
> coGroup and related functions now return Iterable[T] instead of Seq[T]
> ==> Call toSeq on the result to restore the old behavior
> 
> SparkContext.jarOfClass returns Option[String] instead of Seq[String]
> ==> Call toSeq on the result to restore the old behavior
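Both migration notes above come down to the same .toSeq call. A minimal sketch using plain Scala collections (no live SparkContext; the jar path is a made-up illustration, not a real artifact):

```scala
// cogroup-style value groups are now Iterable[T] rather than Seq[T];
// .toSeq restores the Seq-based 0.9 shape.
val valueGroup: Iterable[Int] = List(1, 2, 3)
val legacySeq: Seq[Int] = valueGroup.toSeq

// SparkContext.jarOfClass now returns Option[String] rather than Seq[String];
// Option.toSeq yields an empty or one-element Seq, matching the old shape.
val jar: Option[String] = Some("/path/to/app.jar")
val legacyJars: Seq[String] = jar.toSeq
```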

Re: [VOTE] Release Apache Spark 1.0.0 (RC11)

Posted by Patrick Wendell <pw...@gmail.com>.
Hey There,

The best way is to use the v1.0.0 tag:
https://github.com/apache/spark/releases/tag/v1.0.0

- Patrick
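The tag-based workflow can be scripted. The sketch below simulates the internal mirror with a throwaway local repository so the commands run anywhere; against a real mirror you would instead `git fetch --tags` from your remote and check out `v1.0.0` directly (the repo contents here are stand-ins, not the Spark history):

```shell
set -e
# Stand-in for the internal mirror: a throwaway repo with one tagged commit.
demo=$(mktemp -d)
cd "$demo"
git init -q -b main .
git -c user.email=dev@example.com -c user.name=demo \
    commit -q --allow-empty -m "1.0.0 release"
git tag v1.0.0

# Deploy from the release tag, not from master.
git checkout -q v1.0.0
git describe --tags   # prints: v1.0.0
```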

On Wed, Jun 4, 2014 at 12:19 PM, Debasish Das <de...@gmail.com> wrote:
> Hi Patrick,
>
> We maintain an internal Spark mirror in sync with the Spark GitHub master.
>
> What's the way to get the 1.0.0 stable release from GitHub to deploy on our
> production cluster? Is there a tag for 1.0.0 that I should use to deploy?
>
> Thanks.
> Deb

Re: [VOTE] Release Apache Spark 1.0.0 (RC11)

Posted by Debasish Das <de...@gmail.com>.
Hi Patrick,

We maintain an internal Spark mirror in sync with the Spark GitHub master.

What's the way to get the 1.0.0 stable release from GitHub to deploy on our
production cluster? Is there a tag for 1.0.0 that I should use to deploy?

Thanks.
Deb



On Wed, Jun 4, 2014 at 10:49 AM, Patrick Wendell <pw...@gmail.com> wrote:

> Received!

Re: [VOTE] Release Apache Spark 1.0.0 (RC11)

Posted by Patrick Wendell <pw...@gmail.com>.
Received!

On Wed, Jun 4, 2014 at 10:47 AM, Tom Graves
<tg...@yahoo.com.invalid> wrote:
> Testing... Resending as it appears my message didn't go through last week.
>
> Tom