Posted to dev@spark.apache.org by kant kodali <ka...@gmail.com> on 2017/05/02 05:43:44 UTC

Spark 2.2.0 or Spark 2.3.0?

Hi All,

If I understand Spark's standard release process correctly, it looks like
the official release will come sometime at the end of this month, and it
will be 2.2.0 (not 2.3.0), right? I am eagerly awaiting Spark 2.2.0
because of the "update mode" option in Structured Streaming. Please
correct me if I am wrong.

Thanks!

Re: Spark 2.2.0 or Spark 2.3.0?

Posted by Michael Armbrust <mi...@databricks.com>.
An RC for 2.2.0
<http://home.apache.org/~pwendell/spark-releases/spark-2.2.0-rc1-bin/> was
released last week.  Please test.

Note that update mode has been supported since 2.0.
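For context, a minimal sketch of what "update" output mode looks like in Structured Streaming. This assumes a local Spark session and a socket source on localhost:9999 purely for illustration; the host, port, and app name are placeholders, not anything from this thread:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.OutputMode

object UpdateModeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("update-mode-sketch")   // placeholder name
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical streaming source: lines of text from a socket.
    val lines = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", 9999)
      .load()

    // Running word count over the stream.
    val counts = lines.as[String]
      .flatMap(_.split("\\s+"))
      .groupBy("value")
      .count()

    // In "update" mode, only the rows whose counts changed since the
    // last trigger are written to the sink, rather than the full result
    // table ("complete") or only new rows ("append").
    val query = counts.writeStream
      .outputMode(OutputMode.Update())
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```

The same mode can also be requested with the string form `.outputMode("update")`.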

On Mon, May 1, 2017 at 10:43 PM, kant kodali <ka...@gmail.com> wrote:

> Hi All,
>
> If I understand Spark's standard release process correctly, it looks
> like the official release will come sometime at the end of this month, and
> it will be 2.2.0 (not 2.3.0), right? I am eagerly awaiting Spark
> 2.2.0 because of the "update mode" option in Structured Streaming. Please
> correct me if I am wrong.
>
> Thanks!
>

Re: Spark 2.2.0 or Spark 2.3.0?

Posted by Felix Cheung <fe...@hotmail.com>.
Yes 2.2.0

________________________________
From: kant kodali <ka...@gmail.com>
Sent: Monday, May 1, 2017 10:43:44 PM
To: dev
Subject: Spark 2.2.0 or Spark 2.3.0?

Hi All,

If I understand Spark's standard release process correctly, it looks like the official release will come sometime at the end of this month, and it will be 2.2.0 (not 2.3.0), right? I am eagerly awaiting Spark 2.2.0 because of the "update mode" option in Structured Streaming. Please correct me if I am wrong.

Thanks!