Posted to dev@spark.apache.org by Sameer Agarwal <sa...@apache.org> on 2018/02/22 22:23:53 UTC

[VOTE] Spark 2.3.0 (RC5)

Please vote on releasing the following candidate as Apache Spark version
2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00 am UTC
and passes if a majority of at least 3 PMC +1 votes are cast.


[ ] +1 Release this package as Apache Spark 2.3.0

[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.3.0-rc5:
https://github.com/apache/spark/tree/v2.3.0-rc5
(992447fb30ee9ebb3cf794f2d06f4d63a2d792db)

List of JIRA tickets resolved in this release can be found here:
https://issues.apache.org/jira/projects/SPARK/versions/12339551

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/

Release artifacts are signed with the following key:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1266/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/index.html


FAQ

=======================================
What are the unresolved issues targeted for 2.3.0?
=======================================

Please see https://s.apache.org/oXKi. At the time of writing, there are no
known release blockers.

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking an
existing Spark workload, running it on this release candidate, and
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the
current RC, and see if anything important breaks. In Java/Scala, you can
add the staging repository to your project's resolvers and test with the RC
(make sure to clean up the artifact cache before/after so you don't end up
building with an out-of-date RC going forward).
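
For the Java/Scala path, here is a minimal sketch of what pointing an sbt
build at the staging repository might look like (the resolver name and the
choice of spark-sql as the module under test are illustrative, not
prescriptive; adapt to your own build):

    // build.sbt -- resolve artifacts from the RC5 staging repository
    resolvers += "Apache Spark 2.3.0 RC5 staging" at
      "https://repository.apache.org/content/repositories/orgapachespark-1266/"

    // depend on the RC artifacts instead of a released version
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"

Afterwards, clearing the org.apache.spark entries from your local Ivy/Maven
cache helps ensure you don't keep building against the RC once it is
superseded.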

===========================================
What should happen to JIRA tickets still targeting 2.3.0?
===========================================

Committers should look at those and triage them. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should be
worked on immediately. Please retarget everything else to 2.3.1 or 2.4.0 as
appropriate.

===================
Why is my bug not fixed?
===================

In order to make timely releases, we will typically not hold the release
unless the bug in question is a regression from 2.2.0. That said, if there
is a regression from 2.2.0 that has not been correctly targeted, please
ping me or a committer to help target the issue (you can see the open
issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Nicholas Chammas <ni...@gmail.com>.
Launched a test cluster on EC2 with Flintrock
<https://github.com/nchammas/flintrock> and ran some simple tests. Building
Spark took much longer than usual, but that may just be a fluke. Otherwise,
all looks good to me.
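
To give a flavour of the kind of quick spark-shell sanity check such a run
might include (a generic sketch, not the exact tests used here):

    // paste into spark-shell on the test cluster: a tiny end-to-end job
    val df = spark.range(0, 1000000).selectExpr("id", "id % 10 as bucket")
    val counts = df.groupBy("bucket").count().collect()
    assert(counts.map(_.getLong(1)).sum == 1000000L)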

+1

On Fri, Feb 23, 2018 at 10:55 AM Denny Lee <de...@gmail.com> wrote:

> +1 (non-binding)
>
> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
> joshgoldsboroughster@gmail.com> wrote:
>
>> New to testing out Spark RCs for the community but I was able to run some
>> of the basic unit tests without error so for what it's worth, I'm a +1.
>>
>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <sa...@apache.org>
>> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00 am UTC
>>> and passes if a majority of at least 3 PMC +1 votes are cast.
>>>
>>>
>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>
>>> The tag to be voted on is v2.3.0-rc5:
>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>
>>> List of JIRA tickets resolved in this release can be found here:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>
>>> Release artifacts are signed with the following key:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1266/
>>>
>>> The documentation corresponding to this release can be found at:
>>>
>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/index.html
>>>
>>>
>>> FAQ
>>>
>>> =======================================
>>> What are the unresolved issues targeted for 2.3.0?
>>> =======================================
>>>
>>> Please see https://s.apache.org/oXKi. At the time of writing, there are
>>> currently no known release blockers.
>>>
>>> =========================
>>> How can I help test this release?
>>> =========================
>>>
>>> If you are a Spark user, you can help us test this release by taking an
>>> existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks, in the Java/Scala you
>>> can add the staging repository to your projects resolvers and test with the
>>> RC (make sure to clean up the artifact cache before/after so you don't end
>>> up building with a out of date RC going forward).
>>>
>>> ===========================================
>>> What should happen to JIRA tickets still targeting 2.3.0?
>>> ===========================================
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should be
>>> worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0 as
>>> appropriate.
>>>
>>> ===================
>>> Why is my bug not fixed?
>>> ===================
>>>
>>> In order to make timely releases, we will typically not hold the release
>>> unless the bug in question is a regression from 2.2.0. That being said, if
>>> there is something which is a regression from 2.2.0 and has not been
>>> correctly targeted please ping me or a committer to help target the issue
>>> (you can see the open issues listed as impacting Spark 2.3.0 at
>>> https://s.apache.org/WmoI).
>>>
>>
>>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Sameer Agarwal <sa...@apache.org>.
This vote passes! I'll follow up with a formal release announcement soon.

+1:
Wenchen Fan (binding)
Takuya Ueshin
Xingbo Jiang
Gengliang Wang
Weichen Xu
Sean Owen (binding)
Josh Goldsborough
Denny Lee
Nicholas Chammas
Marcelo Vanzin (binding)
Holden Karau (binding)
Cheng Lian (binding)
Bryan Cutler
Hyukjin Kwon
Ricardo Almeida
Xiao Li (binding)
Ryan Blue
Dongjoon Hyun
Michael Armbrust (binding)
Nan Zhu
Felix Cheung (binding)
Nick Pentreath (binding)

+0: None

-1: None

On 27 February 2018 at 00:21, Nick Pentreath <ni...@gmail.com>
wrote:

> +1 (binding)
>
> Built and ran Scala tests with "-Phadoop-2.6 -Pyarn -Phive", all passed.
>
> Python tests passed (also including pyspark-streaming w/kafka-0.8 and
> flume packages built)
>
>
> On Tue, 27 Feb 2018 at 10:09 Felix Cheung <fe...@hotmail.com>
> wrote:
>
>> +1
>>
>> Tested R:
>>
>> install from package, CRAN tests, manual tests, help check, vignettes
>> check
>>
>> Filed this https://issues.apache.org/jira/browse/SPARK-23461
>> This is not a regression so not a blocker of the release.
>>
>> Tested this on win-builder and r-hub. On r-hub on multiple platforms
>> everything passed. For win-builder tests failed on x86 but passed x64 -
>> perhaps due to an intermittent download issue causing a gzip error,
>> re-testing now but won’t hold the release on this.
>>
>> ------------------------------
>> *From:* Nan Zhu <zh...@gmail.com>
>> *Sent:* Monday, February 26, 2018 4:03:22 PM
>> *To:* Michael Armbrust
>> *Cc:* dev
>> *Subject:* Re: [VOTE] Spark 2.3.0 (RC5)
>>
>> +1  (non-binding), tested with internal workloads and benchmarks
>>
>> On Mon, Feb 26, 2018 at 12:09 PM, Michael Armbrust <
>> michael@databricks.com> wrote:
>>
>>> +1 all our pipelines have been running the RC for several days now.
>>>
>>> On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <dongjoon.hyun@gmail.com
>>> > wrote:
>>>
>>>> +1 (non-binding).
>>>>
>>>> Bests,
>>>> Dongjoon.
>>>>
>>>>
>>>>
>>>> On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>
>>>> wrote:
>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com> wrote:
>>>>>
>>>>>> +1 (binding) in Spark SQL, Core and PySpark.
>>>>>>
>>>>>> Xiao
>>>>>>
>>>>>> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <
>>>>>> ricardo.almeida@actnowib.com>:
>>>>>>
>>>>>>> +1 (non-binding)
>>>>>>>
>>>>>>> same as previous RC
>>>>>>>
>>>>>>> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> +1
>>>>>>>>
>>>>>>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>>>>>>>
>>>>>>>>> +1
>>>>>>>>> Tests passed and additionally ran Arrow related tests and did some
>>>>>>>>> perf checks with python 2.7.14
>>>>>>>>>
>>>>>>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <
>>>>>>>>> holden@pigscanfly.ca> wrote:
>>>>>>>>>
>>>>>>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>>>>>>> someone with Arrow experience sign off on this release.
>>>>>>>>>>
>>>>>>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <
>>>>>>>>>> lian.cs.zju@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> +1 (binding)
>>>>>>>>>>>
>>>>>>>>>>> Passed all the tests, looks good.
>>>>>>>>>>>
>>>>>>>>>>> Cheng
>>>>>>>>>>>
>>>>>>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>>>>>>
>>>>>>>>>>> +1 (binding)
>>>>>>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>>>>>>
>>>>>>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com>
>>>>>>>>>>> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>>>
>>>>>>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>>>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> New to testing out Spark RCs for the community but I was able
>>>>>>>>>>>>> to run some of the basic unit tests without error so for what it's worth,
>>>>>>>>>>>>> I'm a +1.
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <
>>>>>>>>>>>>> sameerag@apache.org> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Please vote on releasing the following candidate as Apache
>>>>>>>>>>>>>> Spark version 2.3.0. The vote is open until Tuesday February 27, 2018 at
>>>>>>>>>>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5 (
>>>>>>>>>>>>>> 992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> List of JIRA tickets resolved in this release can be found
>>>>>>>>>>>>>> here: https://issues.apache.org/jira/projects/SPARK/versions/
>>>>>>>>>>>>>> 12339551
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>>>>>> found at:
>>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>>>>>> https://repository.apache.org/content/repositories/
>>>>>>>>>>>>>> orgapachespark-1266/
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> The documentation corresponding to this release can be found
>>>>>>>>>>>>>> at:
>>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-
>>>>>>>>>>>>>> docs/_site/index.html
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> FAQ
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> =======================================
>>>>>>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>>>>>>> =======================================
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of
>>>>>>>>>>>>>> writing, there are currently no known release blockers.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> =========================
>>>>>>>>>>>>>> How can I help test this release?
>>>>>>>>>>>>>> =========================
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>>>>>>>> then reporting any regressions.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ===========================================
>>>>>>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>>>>>>> ===========================================
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Committers should look at those and triage. Extremely
>>>>>>>>>>>>>> important bug fixes, documentation, and API tweaks that impact
>>>>>>>>>>>>>> compatibility should be worked on immediately. Everything else please
>>>>>>>>>>>>>> retarget to 2.3.1 or 2.4.0 as appropriate.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ===================
>>>>>>>>>>>>>> Why is my bug not fixed?
>>>>>>>>>>>>>> ===================
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> In order to make timely releases, we will typically not hold
>>>>>>>>>>>>>> the release unless the bug in question is a regression from 2.2.0. That
>>>>>>>>>>>>>> being said, if there is something which is a regression from 2.2.0 and has
>>>>>>>>>>>>>> not been correctly targeted please ping me or a committer to help target
>>>>>>>>>>>>>> the issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Ryan Blue
>>>>> Software Engineer
>>>>> Netflix
>>>>>
>>>>
>>>>
>>>
>>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Nick Pentreath <ni...@gmail.com>.
+1 (binding)

Built and ran Scala tests with "-Phadoop-2.6 -Pyarn -Phive", all passed.

Python tests passed (also including pyspark-streaming, with the kafka-0.8
and flume packages built).

On Tue, 27 Feb 2018 at 10:09 Felix Cheung <fe...@hotmail.com> wrote:

> +1
>
> Tested R:
>
> install from package, CRAN tests, manual tests, help check, vignettes check
>
> Filed this https://issues.apache.org/jira/browse/SPARK-23461
> This is not a regression so not a blocker of the release.
>
> Tested this on win-builder and r-hub. On r-hub on multiple platforms
> everything passed. For win-builder tests failed on x86 but passed x64 -
> perhaps due to an intermittent download issue causing a gzip error,
> re-testing now but won’t hold the release on this.
>
> ------------------------------
> *From:* Nan Zhu <zh...@gmail.com>
> *Sent:* Monday, February 26, 2018 4:03:22 PM
> *To:* Michael Armbrust
> *Cc:* dev
> *Subject:* Re: [VOTE] Spark 2.3.0 (RC5)
>
> +1  (non-binding), tested with internal workloads and benchmarks
>
> On Mon, Feb 26, 2018 at 12:09 PM, Michael Armbrust <michael@databricks.com
> > wrote:
>
>> +1 all our pipelines have been running the RC for several days now.
>>
>> On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <do...@gmail.com>
>> wrote:
>>
>>> +1 (non-binding).
>>>
>>> Bests,
>>> Dongjoon.
>>>
>>>
>>>
>>> On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>
>>> wrote:
>>>
>>>> +1 (non-binding)
>>>>
>>>> On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com> wrote:
>>>>
>>>>> +1 (binding) in Spark SQL, Core and PySpark.
>>>>>
>>>>> Xiao
>>>>>
>>>>> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <
>>>>> ricardo.almeida@actnowib.com>:
>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> same as previous RC
>>>>>>
>>>>>> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> +1
>>>>>>>
>>>>>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>>>>>>
>>>>>>>> +1
>>>>>>>> Tests passed and additionally ran Arrow related tests and did some
>>>>>>>> perf checks with python 2.7.14
>>>>>>>>
>>>>>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <holden@pigscanfly.ca
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>>>>>> someone with Arrow experience sign off on this release.
>>>>>>>>>
>>>>>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <lian.cs.zju@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> +1 (binding)
>>>>>>>>>>
>>>>>>>>>> Passed all the tests, looks good.
>>>>>>>>>>
>>>>>>>>>> Cheng
>>>>>>>>>>
>>>>>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>>>>>
>>>>>>>>>> +1 (binding)
>>>>>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>>>>>
>>>>>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> New to testing out Spark RCs for the community but I was able
>>>>>>>>>>>> to run some of the basic unit tests without error so for what it's worth,
>>>>>>>>>>>> I'm a +1.
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <
>>>>>>>>>>>> sameerag@apache.org> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Please vote on releasing the following candidate as Apache
>>>>>>>>>>>>> Spark version 2.3.0. The vote is open until Tuesday February 27, 2018 at
>>>>>>>>>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>>>>>
>>>>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>>>>
>>>>>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>>>>>
>>>>>>>>>>>>> List of JIRA tickets resolved in this release can be found
>>>>>>>>>>>>> here:
>>>>>>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>>>>>>>>>
>>>>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>>>>> found at:
>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>>>>>
>>>>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>>>>>
>>>>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>>>>>
>>>>>>>>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1266/
>>>>>>>>>>>>>
>>>>>>>>>>>>> The documentation corresponding to this release can be found
>>>>>>>>>>>>> at:
>>>>>>>>>>>>>
>>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/index.html
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> FAQ
>>>>>>>>>>>>>
>>>>>>>>>>>>> =======================================
>>>>>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>>>>>> =======================================
>>>>>>>>>>>>>
>>>>>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>>>>>>> there are currently no known release blockers.
>>>>>>>>>>>>>
>>>>>>>>>>>>> =========================
>>>>>>>>>>>>> How can I help test this release?
>>>>>>>>>>>>> =========================
>>>>>>>>>>>>>
>>>>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>>>>>>> then reporting any regressions.
>>>>>>>>>>>>>
>>>>>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>>>>>>
>>>>>>>>>>>>> ===========================================
>>>>>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>>>>>> ===========================================
>>>>>>>>>>>>>
>>>>>>>>>>>>> Committers should look at those and triage. Extremely
>>>>>>>>>>>>> important bug fixes, documentation, and API tweaks that impact
>>>>>>>>>>>>> compatibility should be worked on immediately. Everything else please
>>>>>>>>>>>>> retarget to 2.3.1 or 2.4.0 as appropriate.
>>>>>>>>>>>>>
>>>>>>>>>>>>> ===================
>>>>>>>>>>>>> Why is my bug not fixed?
>>>>>>>>>>>>> ===================
>>>>>>>>>>>>>
>>>>>>>>>>>>> In order to make timely releases, we will typically not hold
>>>>>>>>>>>>> the release unless the bug in question is a regression from 2.2.0. That
>>>>>>>>>>>>> being said, if there is something which is a regression from 2.2.0 and has
>>>>>>>>>>>>> not been correctly targeted please ping me or a committer to help target
>>>>>>>>>>>>> the issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Ryan Blue
>>>> Software Engineer
>>>> Netflix
>>>>
>>>
>>>
>>
>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Felix Cheung <fe...@hotmail.com>.
+1

Tested R:

install from package, CRAN tests, manual tests, help check, vignettes check

Filed this https://issues.apache.org/jira/browse/SPARK-23461
This is not a regression so not a blocker of the release.

Tested this on win-builder and r-hub. On r-hub, everything passed on multiple platforms. On win-builder, tests failed on x86 but passed on x64 - perhaps due to an intermittent download issue causing a gzip error. I'm re-testing now but won’t hold the release on this.

________________________________
From: Nan Zhu <zh...@gmail.com>
Sent: Monday, February 26, 2018 4:03:22 PM
To: Michael Armbrust
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC5)

+1  (non-binding), tested with internal workloads and benchmarks

On Mon, Feb 26, 2018 at 12:09 PM, Michael Armbrust <mi...@databricks.com>> wrote:
+1 all our pipelines have been running the RC for several days now.

On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <do...@gmail.com>> wrote:
+1 (non-binding).

Bests,
Dongjoon.



On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>> wrote:
+1 (non-binding)

On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com>> wrote:
+1 (binding) in Spark SQL, Core and PySpark.

Xiao

2018-02-24 14:49 GMT-08:00 Ricardo Almeida <ri...@actnowib.com>>:
+1 (non-binding)

same as previous RC

On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com>> wrote:
+1

2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>>:
+1
Tests passed and additionally ran Arrow related tests and did some perf checks with python 2.7.14

On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <ho...@pigscanfly.ca>> wrote:
Note: given the state of Jenkins I'd love to see Bryan Cutler or someone with Arrow experience sign off on this release.

On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <li...@gmail.com>> wrote:

+1 (binding)

Passed all the tests, looks good.

Cheng

On 2/23/18 15:00, Holden Karau wrote:
+1 (binding)
PySpark artifacts install in a fresh Py3 virtual env

On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com>> wrote:
+1 (non-binding)

On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <jo...@gmail.com>> wrote:
New to testing out Spark RCs for the community but I was able to run some of the basic unit tests without error so for what it's worth, I'm a +1.

On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <sa...@apache.org>> wrote:
Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.


[ ] +1 Release this package as Apache Spark 2.3.0

[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.3.0-rc5: https://github.com/apache/spark/tree/v2.3.0-rc5 (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)

List of JIRA tickets resolved in this release can be found here: https://issues.apache.org/jira/projects/SPARK/versions/12339551

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/

Release artifacts are signed with the following key:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1266/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/index.html


FAQ

=======================================
What are the unresolved issues targeted for 2.3.0?
=======================================

Please see https://s.apache.org/oXKi. At the time of writing, there are currently no known release blockers.

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking an existing Spark workload and running on this release candidate, then reporting any regressions.

If you're working in PySpark you can set up a virtual env and install the current RC and see if anything important breaks, in the Java/Scala you can add the staging repository to your projects resolvers and test with the RC (make sure to clean up the artifact cache before/after so you don't end up building with a out of date RC going forward).

===========================================
What should happen to JIRA tickets still targeting 2.3.0?
===========================================

Committers should look at those and triage. Extremely important bug fixes, documentation, and API tweaks that impact compatibility should be worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0 as appropriate.

===================
Why is my bug not fixed?
===================

In order to make timely releases, we will typically not hold the release unless the bug in question is a regression from 2.2.0. That being said, if there is something which is a regression from 2.2.0 and has not been correctly targeted please ping me or a committer to help target the issue (you can see the open issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).





--
Twitter: https://twitter.com/holdenkarau







--
Ryan Blue
Software Engineer
Netflix




Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Nan Zhu <zh...@gmail.com>.
+1  (non-binding), tested with internal workloads and benchmarks

On Mon, Feb 26, 2018 at 12:09 PM, Michael Armbrust <mi...@databricks.com>
wrote:

> +1 all our pipelines have been running the RC for several days now.
>
> On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <do...@gmail.com>
> wrote:
>
>> +1 (non-binding).
>>
>> Bests,
>> Dongjoon.
>>
>>
>>
>> On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>
>> wrote:
>>
>>> +1 (non-binding)
>>>
>>> On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com> wrote:
>>>
>>>> +1 (binding) in Spark SQL, Core and PySpark.
>>>>
>>>> Xiao
>>>>
>>>> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <
>>>> ricardo.almeida@actnowib.com>:
>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> same as previous RC
>>>>>
>>>>> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> +1
>>>>>>
>>>>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>>>>>
>>>>>>> +1
>>>>>>> Tests passed and additionally ran Arrow related tests and did some
>>>>>>> perf checks with python 2.7.14
>>>>>>>
>>>>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <ho...@pigscanfly.ca>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>>>>> someone with Arrow experience sign off on this release.
>>>>>>>>
>>>>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <li...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> +1 (binding)
>>>>>>>>>
>>>>>>>>> Passed all the tests, looks good.
>>>>>>>>>
>>>>>>>>> Cheng
>>>>>>>>>
>>>>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>>>>
>>>>>>>>> +1 (binding)
>>>>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>>>>
>>>>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>
>>>>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> New to testing out Spark RCs for the community but I was able to
>>>>>>>>>>> run some of the basic unit tests without error so for what it's worth, I'm
>>>>>>>>>>> a +1.
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <
>>>>>>>>>>> sameerag@apache.org> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Please vote on releasing the following candidate as Apache
>>>>>>>>>>>> Spark version 2.3.0. The vote is open until Tuesday February 27, 2018 at
>>>>>>>>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>>>>
>>>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>>>
>>>>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>>>>
>>>>>>>>>>>> List of JIRA tickets resolved in this release can be found
>>>>>>>>>>>> here: https://issues.apache.org/jira
>>>>>>>>>>>> /projects/SPARK/versions/12339551
>>>>>>>>>>>>
>>>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>>>> found at:
>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>>>>
>>>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>>>>
>>>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>>>> https://repository.apache.org/content/repositories/orgapache
>>>>>>>>>>>> spark-1266/
>>>>>>>>>>>>
>>>>>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs
>>>>>>>>>>>> /_site/index.html
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> FAQ
>>>>>>>>>>>>
>>>>>>>>>>>> =======================================
>>>>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>>>>> =======================================
>>>>>>>>>>>>
>>>>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>>>>>> there are currently no known release blockers.
>>>>>>>>>>>>
>>>>>>>>>>>> =========================
>>>>>>>>>>>> How can I help test this release?
>>>>>>>>>>>> =========================
>>>>>>>>>>>>
>>>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>>>>>> then reporting any regressions.
>>>>>>>>>>>>
>>>>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>>>>>
>>>>>>>>>>>> ===========================================
>>>>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>>>>> ===========================================
>>>>>>>>>>>>
>>>>>>>>>>>> Committers should look at those and triage. Extremely important
>>>>>>>>>>>> bug fixes, documentation, and API tweaks that impact compatibility should
>>>>>>>>>>>> be worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0
>>>>>>>>>>>> as appropriate.
>>>>>>>>>>>>
>>>>>>>>>>>> ===================
>>>>>>>>>>>> Why is my bug not fixed?
>>>>>>>>>>>> ===================
>>>>>>>>>>>>
>>>>>>>>>>>> In order to make timely releases, we will typically not hold
>>>>>>>>>>>> the release unless the bug in question is a regression from 2.2.0. That
>>>>>>>>>>>> being said, if there is something which is a regression from 2.2.0 and has
>>>>>>>>>>>> not been correctly targeted please ping me or a committer to help target
>>>>>>>>>>>> the issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Ryan Blue
>>> Software Engineer
>>> Netflix
>>>
>>
>>
>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Michael Armbrust <mi...@databricks.com>.
+1 all our pipelines have been running the RC for several days now.

On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <do...@gmail.com>
wrote:

> +1 (non-binding).
>
> Bests,
> Dongjoon.
>
>
>
> On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>
> wrote:
>
>> +1 (non-binding)
>>
>> On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com> wrote:
>>
>>> +1 (binding) in Spark SQL, Core and PySpark.
>>>
>>> Xiao
>>>
>>> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <ricardo.almeida@actnowib.com
>>> >:
>>>
>>>> +1 (non-binding)
>>>>
>>>> same as previous RC
>>>>
>>>> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com> wrote:
>>>>
>>>>> +1
>>>>>
>>>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>>>>
>>>>>> +1
>>>>>> Tests passed and additionally ran Arrow related tests and did some
>>>>>> perf checks with python 2.7.14
>>>>>>
>>>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <ho...@pigscanfly.ca>
>>>>>> wrote:
>>>>>>
>>>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>>>> someone with Arrow experience sign off on this release.
>>>>>>>
>>>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <li...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> +1 (binding)
>>>>>>>>
>>>>>>>> Passed all the tests, looks good.
>>>>>>>>
>>>>>>>> Cheng
>>>>>>>>
>>>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>>>
>>>>>>>> +1 (binding)
>>>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>>>
>>>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> +1 (non-binding)
>>>>>>>>>
>>>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> New to testing out Spark RCs for the community but I was able to
>>>>>>>>>> run some of the basic unit tests without error so for what it's worth, I'm
>>>>>>>>>> a +1.
>>>>>>>>>>
>>>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <
>>>>>>>>>> sameerag@apache.org> wrote:
>>>>>>>>>>
>>>>>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>>>>>> version 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00
>>>>>>>>>>> am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>>>
>>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>>
>>>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>>>
>>>>>>>>>>> List of JIRA tickets resolved in this release can be found here:
>>>>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>>>>>>>
>>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>>> found at:
>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>>>
>>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>>>
>>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>>> https://repository.apache.org/content/repositories/orgapache
>>>>>>>>>>> spark-1266/
>>>>>>>>>>>
>>>>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs
>>>>>>>>>>> /_site/index.html
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> FAQ
>>>>>>>>>>>
>>>>>>>>>>> =======================================
>>>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>>>> =======================================
>>>>>>>>>>>
>>>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>>>>> there are currently no known release blockers.
>>>>>>>>>>>
>>>>>>>>>>> =========================
>>>>>>>>>>> How can I help test this release?
>>>>>>>>>>> =========================
>>>>>>>>>>>
>>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>>>>> then reporting any regressions.
>>>>>>>>>>>
>>>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>>>>
>>>>>>>>>>> ===========================================
>>>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>>>> ===========================================
>>>>>>>>>>>
>>>>>>>>>>> Committers should look at those and triage. Extremely important
>>>>>>>>>>> bug fixes, documentation, and API tweaks that impact compatibility should
>>>>>>>>>>> be worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0
>>>>>>>>>>> as appropriate.
>>>>>>>>>>>
>>>>>>>>>>> ===================
>>>>>>>>>>> Why is my bug not fixed?
>>>>>>>>>>> ===================
>>>>>>>>>>>
>>>>>>>>>>> In order to make timely releases, we will typically not hold the
>>>>>>>>>>> release unless the bug in question is a regression from 2.2.0. That being
>>>>>>>>>>> said, if there is something which is a regression from 2.2.0 and has not
>>>>>>>>>>> been correctly targeted please ping me or a committer to help target the
>>>>>>>>>>> issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Ryan Blue
>> Software Engineer
>> Netflix
>>
>
>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Dongjoon Hyun <do...@gmail.com>.
+1 (non-binding).

Bests,
Dongjoon.


On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>
wrote:

> +1 (non-binding)
>
> On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com> wrote:
>
>> +1 (binding) in Spark SQL, Core and PySpark.
>>
>> Xiao
>>
>> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <ri...@actnowib.com>
>> :
>>
>>> +1 (non-binding)
>>>
>>> same as previous RC
>>>
>>> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com> wrote:
>>>
>>>> +1
>>>>
>>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>>>
>>>>> +1
>>>>> Tests passed and additionally ran Arrow related tests and did some
>>>>> perf checks with python 2.7.14
>>>>>
>>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <ho...@pigscanfly.ca>
>>>>> wrote:
>>>>>
>>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>>> someone with Arrow experience sign off on this release.
>>>>>>
>>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <li...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> +1 (binding)
>>>>>>>
>>>>>>> Passed all the tests, looks good.
>>>>>>>
>>>>>>> Cheng
>>>>>>>
>>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>>
>>>>>>> +1 (binding)
>>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>>
>>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com> wrote:
>>>>>>>
>>>>>>>> +1 (non-binding)
>>>>>>>>
>>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> New to testing out Spark RCs for the community but I was able to
>>>>>>>>> run some of the basic unit tests without error so for what it's worth, I'm
>>>>>>>>> a +1.
>>>>>>>>>
>>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <
>>>>>>>>> sameerag@apache.org> wrote:
>>>>>>>>>
>>>>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>>>>> version 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00
>>>>>>>>>> am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>>
>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>
>>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>>
>>>>>>>>>> List of JIRA tickets resolved in this release can be found here:
>>>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>>>>>>
>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>> found at:
>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>>
>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>>
>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>> https://repository.apache.org/content/repositories/orgapache
>>>>>>>>>> spark-1266/
>>>>>>>>>>
>>>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs
>>>>>>>>>> /_site/index.html
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> FAQ
>>>>>>>>>>
>>>>>>>>>> =======================================
>>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>>> =======================================
>>>>>>>>>>
>>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>>>> there are currently no known release blockers.
>>>>>>>>>>
>>>>>>>>>> =========================
>>>>>>>>>> How can I help test this release?
>>>>>>>>>> =========================
>>>>>>>>>>
>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>>>> then reporting any regressions.
>>>>>>>>>>
>>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>>>
>>>>>>>>>> ===========================================
>>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>>> ===========================================
>>>>>>>>>>
>>>>>>>>>> Committers should look at those and triage. Extremely important
>>>>>>>>>> bug fixes, documentation, and API tweaks that impact compatibility should
>>>>>>>>>> be worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0
>>>>>>>>>> as appropriate.
>>>>>>>>>>
>>>>>>>>>> ===================
>>>>>>>>>> Why is my bug not fixed?
>>>>>>>>>> ===================
>>>>>>>>>>
>>>>>>>>>> In order to make timely releases, we will typically not hold the
>>>>>>>>>> release unless the bug in question is a regression from 2.2.0. That being
>>>>>>>>>> said, if there is something which is a regression from 2.2.0 and has not
>>>>>>>>>> been correctly targeted please ping me or a committer to help target the
>>>>>>>>>> issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>
> --
> Ryan Blue
> Software Engineer
> Netflix
>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Ryan Blue <rb...@netflix.com.INVALID>.
+1 (non-binding)

On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <ga...@gmail.com> wrote:

> +1 (binding) in Spark SQL, Core and PySpark.
>
> Xiao
>
> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <ri...@actnowib.com>:
>
>> +1 (non-binding)
>>
>> same as previous RC
>>
>> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com> wrote:
>>
>>> +1
>>>
>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>>
>>>> +1
>>>> Tests passed and additionally ran Arrow related tests and did some perf
>>>> checks with python 2.7.14
>>>>
>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <ho...@pigscanfly.ca>
>>>> wrote:
>>>>
>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>> someone with Arrow experience sign off on this release.
>>>>>
>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <li...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> +1 (binding)
>>>>>>
>>>>>> Passed all the tests, looks good.
>>>>>>
>>>>>> Cheng
>>>>>>
>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>
>>>>>> +1 (binding)
>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>
>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com> wrote:
>>>>>>
>>>>>>> +1 (non-binding)
>>>>>>>
>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>>
>>>>>>>> New to testing out Spark RCs for the community but I was able to
>>>>>>>> run some of the basic unit tests without error so for what it's worth, I'm
>>>>>>>> a +1.
>>>>>>>>
>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <
>>>>>>>> sameerag@apache.org> wrote:
>>>>>>>>
>>>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>>>> version 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00
>>>>>>>>> am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>
>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>> https://spark.apache.org/
>>>>>>>>>
>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>
>>>>>>>>> List of JIRA tickets resolved in this release can be found here:
>>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>>>>>
>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>> found at:
>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>
>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>
>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>> https://repository.apache.org/content/repositories/orgapache
>>>>>>>>> spark-1266/
>>>>>>>>>
>>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs
>>>>>>>>> /_site/index.html
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> FAQ
>>>>>>>>>
>>>>>>>>> =======================================
>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>> =======================================
>>>>>>>>>
>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>>> there are currently no known release blockers.
>>>>>>>>>
>>>>>>>>> =========================
>>>>>>>>> How can I help test this release?
>>>>>>>>> =========================
>>>>>>>>>
>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>>> then reporting any regressions.
>>>>>>>>>
>>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>>
>>>>>>>>> ===========================================
>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>> ===========================================
>>>>>>>>>
>>>>>>>>> Committers should look at those and triage. Extremely important
>>>>>>>>> bug fixes, documentation, and API tweaks that impact compatibility should
>>>>>>>>> be worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0
>>>>>>>>> as appropriate.
>>>>>>>>>
>>>>>>>>> ===================
>>>>>>>>> Why is my bug not fixed?
>>>>>>>>> ===================
>>>>>>>>>
>>>>>>>>> In order to make timely releases, we will typically not hold the
>>>>>>>>> release unless the bug in question is a regression from 2.2.0. That being
>>>>>>>>> said, if there is something which is a regression from 2.2.0 and has not
>>>>>>>>> been correctly targeted please ping me or a committer to help target the
>>>>>>>>> issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>
>>>>
>>>>
>>>
>>
>


-- 
Ryan Blue
Software Engineer
Netflix

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Xiao Li <ga...@gmail.com>.
+1 (binding) in Spark SQL, Core and PySpark.

Xiao

2018-02-24 14:49 GMT-08:00 Ricardo Almeida <ri...@actnowib.com>:

> +1 (non-binding)
>
> same as previous RC
>
> On 24 February 2018 at 11:10, Hyukjin Kwon <gu...@gmail.com> wrote:
>
>> +1
>>
>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cu...@gmail.com>:
>>
>>> +1
>>> Tests passed and additionally ran Arrow related tests and did some perf
>>> checks with python 2.7.14
>>>
>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <ho...@pigscanfly.ca>
>>> wrote:
>>>
>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>> someone with Arrow experience sign off on this release.
>>>>
>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <li...@gmail.com>
>>>> wrote:
>>>>
>>>>> +1 (binding)
>>>>>
>>>>> Passed all the tests, looks good.
>>>>>
>>>>> Cheng
>>>>>
>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>
>>>>> +1 (binding)
>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>
>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <de...@gmail.com> wrote:
>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <
>>>>>> joshgoldsboroughster@gmail.com> wrote:
>>>>>>
>>>>>>> New to testing out Spark RCs for the community but I was able to run
>>>>>>> some of the basic unit tests without error so for what it's worth, I'm a +1.
>>>>>>>
>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <sameerag@apache.org
>>>>>>> > wrote:
>>>>>>>
>>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>>> version 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00
>>>>>>>> am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>
>>>>>>>>
>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>
>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>
>>>>>>>>
>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>> https://spark.apache.org/
>>>>>>>>
>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>
>>>>>>>> List of JIRA tickets resolved in this release can be found here:
>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>>>>
>>>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>>>> at:
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>
>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>
>>>>>>>> The staging repository for this release can be found at:
>>>>>>>> https://repository.apache.org/content/repositories/orgapache
>>>>>>>> spark-1266/
>>>>>>>>
>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs
>>>>>>>> /_site/index.html
>>>>>>>>
>>>>>>>>
>>>>>>>> FAQ
>>>>>>>>
>>>>>>>> =======================================
>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>> =======================================
>>>>>>>>
>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>> there are currently no known release blockers.
>>>>>>>>
>>>>>>>> =========================
>>>>>>>> How can I help test this release?
>>>>>>>> =========================
>>>>>>>>
>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>> taking an existing Spark workload and running on this release candidate,
>>>>>>>> then reporting any regressions.
>>>>>>>>
>>>>>>>> If you're working in PySpark you can set up a virtual env and
>>>>>>>> install the current RC and see if anything important breaks, in the
>>>>>>>> Java/Scala you can add the staging repository to your projects resolvers
>>>>>>>> and test with the RC (make sure to clean up the artifact cache before/after
>>>>>>>> so you don't end up building with a out of date RC going forward).
>>>>>>>>
>>>>>>>> ===========================================
>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>> ===========================================
>>>>>>>>
>>>>>>>> Committers should look at those and triage. Extremely important bug
>>>>>>>> fixes, documentation, and API tweaks that impact compatibility should be
>>>>>>>> worked on immediately. Everything else please retarget to 2.3.1 or 2.4.0 as
>>>>>>>> appropriate.
>>>>>>>>
>>>>>>>> ===================
>>>>>>>> Why is my bug not fixed?
>>>>>>>> ===================
>>>>>>>>
>>>>>>>> In order to make timely releases, we will typically not hold the
>>>>>>>> release unless the bug in question is a regression from 2.2.0. That being
>>>>>>>> said, if there is something which is a regression from 2.2.0 and has not
>>>>>>>> been correctly targeted please ping me or a committer to help target the
>>>>>>>> issue (you can see the open issues listed as impacting Spark 2.3.0 at
>>>>>>>> https://s.apache.org/WmoI).
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Twitter: https://twitter.com/holdenkarau
>>>>
>>>
>>>
>>
>

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Ricardo Almeida <ri...@actnowib.com>.
+1 (non-binding)

same as previous RC


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Hyukjin Kwon <gu...@gmail.com>.
+1


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Bryan Cutler <cu...@gmail.com>.
+1
Tests passed; additionally ran the Arrow-related tests and did some perf checks with Python 2.7.14.
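
For anyone who wants to run a similar check against this RC, a rough sketch follows (assuming a source checkout of the v2.3.0-rc5 tag, Python 2.7 with pyarrow installed, and the python/run-tests options shown here being available on this branch; adjust module and flag names to what your checkout actually provides):

    # build Spark first so the PySpark tests have jars to run against
    ./build/mvn -DskipTests -Phadoop-2.7 clean package

    # run the SQL-side PySpark tests (these cover the Arrow / pandas_udf paths)
    # against a specific interpreter
    ./python/run-tests --python-executables=python2.7 --modules=pyspark-sql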


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Holden Karau <ho...@pigscanfly.ca>.
Note: given the state of Jenkins I'd love to see Bryan Cutler or someone
with Arrow experience sign off on this release.



-- 
Twitter: https://twitter.com/holdenkarau

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Cheng Lian <li...@gmail.com>.
+1 (binding)

Passed all the tests, looks good.

Cheng
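
For reference, one way to drive the full suite against the exact commit being voted on looks roughly like this (a sketch only; the Maven profiles are illustrative, and dev/run-tests is the closest equivalent to the checks Jenkins runs):

    # check out the exact commit being voted on
    git clone https://github.com/apache/spark.git && cd spark
    git checkout v2.3.0-rc5    # 992447fb30ee9ebb3cf794f2d06f4d63a2d792db

    # style checks, build, and unit tests in one go
    ./dev/run-tests

    # or build and test with explicit profiles
    ./build/mvn -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver clean package
    ./build/mvn -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver test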




Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Holden Karau <ho...@gmail.com>.
+1 (binding)
PySpark artifacts install in a fresh Py3 virtual env
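
A minimal sketch of that kind of check, assuming the pyspark-2.3.0.tar.gz artifact is published under the RC bin directory (adjust the file name to whatever is actually listed there) and a local JDK is available:

    # fresh Python 3 virtual env
    python3 -m venv rc5-pyspark-test
    source rc5-pyspark-test/bin/activate

    # install the RC's PySpark artifact into the clean env
    pip install https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/pyspark-2.3.0.tar.gz

    # quick smoke test: start a local session and run a trivial job
    python -c "from pyspark.sql import SparkSession; spark = SparkSession.builder.master('local[2]').getOrCreate(); print(spark.range(100).count()); spark.stop()"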


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Denny Lee <de...@gmail.com>.
+1 (non-binding)


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Josh Goldsborough <jo...@gmail.com>.
New to testing out Spark RCs for the community, but I was able to run some of the basic unit tests without error, so for what it's worth, I'm a +1.
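
For other first-time RC testers, a sketch of one way to build from the tag and run a single module's unit tests (picking the core module here is just an example):

    git clone https://github.com/apache/spark.git && cd spark
    git checkout v2.3.0-rc5

    # install the modules locally once (skipping tests), then test one module
    ./build/mvn -DskipTests clean install
    ./build/mvn test -pl core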


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Marcelo Vanzin <va...@cloudera.com>.
+1

Checked the archives; ran a subset of our internal tests on the
hadoop2.7 archive, looks good.
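
For reference, checking the archives can be done along these lines (a sketch; the file name assumes the usual spark-2.3.0-bin-hadoop2.7.tgz naming and the .asc/.sha512 files published next to each artifact in the dist directory):

    BASE=https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin
    F=spark-2.3.0-bin-hadoop2.7.tgz
    curl -sO "$BASE/$F" && curl -sO "$BASE/$F.asc" && curl -sO "$BASE/$F.sha512"

    # import the release signing keys and verify the signature
    curl -s https://dist.apache.org/repos/dist/dev/spark/KEYS | gpg --import
    gpg --verify "$F.asc" "$F"

    # compare the published SHA-512 against a locally computed digest
    cat "$F.sha512"
    sha512sum "$F"    # or: shasum -a 512 "$F" on macOS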




-- 
Marcelo



Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Sean Owen <sr...@apache.org>.
Same result as last RC for me. +1


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Weichen Xu <we...@databricks.com>.
+1


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Gengliang <lt...@gmail.com>.
+1


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Xingbo Jiang <ji...@gmail.com>.
+1


Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Takuya UESHIN <ue...@happy-camper.st>.
+1



-- 
Takuya UESHIN
Tokyo, Japan

http://twitter.com/ueshin

Re: [VOTE] Spark 2.3.0 (RC5)

Posted by Wenchen Fan <cl...@gmail.com>.
+1
