Posted to dev@spark.apache.org by Rahul Singhal <Ra...@guavus.com> on 2014/06/05 05:29:16 UTC

Re: Announcing Spark 1.0.0

Could someone please clarify this for me, or is it not an issue we
should be concerned about?

Thanks,
Rahul Singhal





On 30/05/14 5:28 PM, "Rahul Singhal" <Ra...@guavus.com> wrote:

>Is it intentional/ok that the tag v1.0.0 is behind tag v1.0.0-rc11?
>
>
>Thanks,
>Rahul Singhal
>
>
>
>
>
>On 30/05/14 3:43 PM, "Patrick Wendell" <pw...@gmail.com> wrote:
>
>>I'm thrilled to announce the availability of Spark 1.0.0! Spark 1.0.0
>>is a milestone release as the first in the 1.0 line of releases,
>>providing API stability for Spark's core interfaces.
>>
>>Spark 1.0.0 is Spark's largest release ever, with contributions from
>>117 developers. I'd like to thank everyone involved in this release -
>>it was truly a community effort with fixes, features, and
>>optimizations contributed from dozens of organizations.
>>
>>This release expands Spark's standard libraries, introducing a new SQL
>>package (SparkSQL) which lets users integrate SQL queries into
>>existing Spark workflows. MLlib, Spark's machine learning library, is
>>expanded with sparse vector support and several new algorithms. The
>>GraphX and Streaming libraries also introduce new features and
>>optimizations. Spark's core engine adds support for secured YARN
>>clusters, a unified tool for submitting Spark applications, and
>>several performance and stability improvements. Finally, Spark adds
>>support for Java 8 lambda syntax and improves coverage of the Java and
>>Python APIs.
>>
>>Those features only scratch the surface - check out the release notes
>>here:
>>http://spark.apache.org/releases/spark-release-1-0-0.html
>>
>>Note that since release artifacts were posted recently, certain
>>mirrors may not have working downloads for a few hours.
>>
>>- Patrick
>


Re: Announcing Spark 1.0.0

Posted by Patrick Wendell <pw...@gmail.com>.
Hey Rahul,

The v1.0.0 tag is correct. When we release Spark we create multiple
release candidates, and one of them is promoted to the full release.
So rc11 is identical to the official v1.0.0 release.

- Patrick
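
Patrick's point can be verified directly with git: two tags that name the
same commit resolve to the same object hash. A minimal sketch (assuming git
is installed; in an actual Spark checkout the single command
`git rev-parse v1.0.0 v1.0.0-rc11` would suffice, but this demonstrates the
idea in a throwaway repository):

```shell
set -e

# Build a throwaway repository with one commit.
dir=$(mktemp -d)
cd "$dir"
git init -q
git -c user.email=a@b -c user.name=test commit -q --allow-empty -m "release"

# Tag the same commit twice, mirroring an RC promoted to the final release.
git tag v1.0.0-rc11
git tag v1.0.0

# Both tags dereference to the same commit hash.
if test "$(git rev-parse 'v1.0.0^{}')" = "$(git rev-parse 'v1.0.0-rc11^{}')"; then
    echo "tags match"   # prints "tags match"
fi
```

The `^{}` suffix dereferences a tag to the commit it points at, so the
comparison also holds if one tag is annotated and the other lightweight.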

On Wed, Jun 4, 2014 at 8:29 PM, Rahul Singhal <Ra...@guavus.com> wrote:
> Could someone please clarify this for me, or is it not an issue we
> should be concerned about?
>
> Thanks,
> Rahul Singhal
>
>
>
>
>
> On 30/05/14 5:28 PM, "Rahul Singhal" <Ra...@guavus.com> wrote:
>
>>Is it intentional/ok that the tag v1.0.0 is behind tag v1.0.0-rc11?
>>
>>
>>Thanks,
>>Rahul Singhal
>>
>>
>>
>>
>>