Posted to dev@spark.apache.org by Andrew Or <an...@databricks.com> on 2014/12/02 22:36:47 UTC

Announcing Spark 1.1.1!

I am happy to announce the availability of Spark 1.1.1! This is a
maintenance release with many bug fixes, most of which are concentrated in
the core. These include various fixes for sort-based shuffle, memory
leaks, and spilling issues. Contributions to this release came from 55
developers.

Visit the release notes [1] to read about the new features, or
download [2] the release today.

[1] http://spark.apache.org/releases/spark-release-1-1-1.html
[2] http://spark.apache.org/downloads.html

Please e-mail me directly about any typos in the release notes or the name
listing.

Thanks to everyone who contributed, and congratulations!
-Andrew

Re: Announcing Spark 1.1.1!

Posted by Aaron Davidson <il...@gmail.com>.
Because this was a maintenance release, we should not have introduced any
backward or forward binary incompatibilities. Therefore, applications
that were written and compiled against 1.1.0 should still work against a
1.1.1 cluster, and vice versa.
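For example, an application can keep compiling against the 1.1.0 artifacts
while running on a cluster that has already moved to 1.1.1. A minimal sbt
sketch of that setup (the project name, Scala version, and "provided" scope
are illustrative assumptions, not taken from this thread):

  // build.sbt -- a minimal sketch, assuming an sbt-built application.
  // Compiling against the 1.1.0 artifacts and marking them "provided" keeps
  // Spark classes out of the application jar, so the classes actually used
  // at runtime come from whatever the cluster ships (here, 1.1.1).
  name := "my-spark-app"

  scalaVersion := "2.10.4"

  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

The same application jar can then be submitted unchanged before and after the
cluster upgrade, which is the compatibility described above.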

On Wed, Dec 3, 2014 at 1:30 PM, Andrew Or <an...@databricks.com> wrote:

> By the Spark server do you mean the standalone Master? It is best if they
> are upgraded together because there have been changes to the Master in
> 1.1.1. Although it might "just work", it's highly recommended to restart
> your cluster manager too.

Re: Announcing Spark 1.1.1!

Posted by Andrew Or <an...@databricks.com>.
By the Spark server do you mean the standalone Master? It is best if they
are upgraded together because there have been changes to the Master in
1.1.1. Although it might "just work", it's highly recommended to restart
your cluster manager too.
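As a quick sanity check after the upgrade, a driver built against the 1.1.0
client library can run a trivial job against the restarted standalone Master.
A minimal Scala sketch (the object name, app name, and master URL are
placeholders, not taken from this thread):

  import org.apache.spark.{SparkConf, SparkContext}

  object UpgradeSanityCheck {
    def main(args: Array[String]): Unit = {
      // Driver compiled against Spark 1.1.0; the master URL is a placeholder
      // for your own standalone Master, assumed restarted on 1.1.1.
      val conf = new SparkConf()
        .setAppName("upgrade-sanity-check")
        .setMaster("spark://master-host:7077")
      val sc = new SparkContext(conf)

      // A trivial job, just to confirm the driver and the upgraded cluster
      // can schedule and run work together.
      val sum = sc.parallelize(1 to 1000).reduce(_ + _)
      println(s"1 + ... + 1000 = $sum")

      sc.stop()
    }
  }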

2014-12-03 13:19 GMT-08:00 Romi Kuntsman <ro...@totango.com>:

> About version compatibility and the upgrade path: can the Java application
> dependencies and the Spark server be upgraded separately (i.e., will a 1.1.0
> library work with a 1.1.1 server, and vice versa), or do they need to be
> upgraded together?
>
> Thanks!
>
> *Romi Kuntsman*, *Big Data Engineer*
>  http://www.totango.com

Re: Announcing Spark 1.1.1!

Posted by Romi Kuntsman <ro...@totango.com>.
About version compatibility and the upgrade path: can the Java application
dependencies and the Spark server be upgraded separately (i.e., will a 1.1.0
library work with a 1.1.1 server, and vice versa), or do they need to be
upgraded together?

Thanks!

*Romi Kuntsman*, *Big Data Engineer*
 http://www.totango.com

Re: Announcing Spark 1.1.1!

Posted by rzykov <rz...@gmail.com>.
Andrew and developers, thank you for an excellent release!

It fixed almost all of our issues. We are now migrating to Spark from a zoo of
Python, Java, Hive, and Pig jobs. Our Scala/Spark jobs often failed on 1.1;
Spark 1.1.1 works like a Swiss watch.


