Posted to dev@spark.apache.org by Xingbo Jiang <ji...@gmail.com> on 2019/10/31 06:00:13 UTC

[VOTE] SPARK 3.0.0-preview (RC2)

Please vote on releasing the following candidate as Apache Spark version
3.0.0-preview.

The vote is open until November 3 PST and passes if a majority of +1 PMC
votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.0.0-preview
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.0.0-preview-rc2 (commit
007c873ae34f58651481ccba30e8e2ba38a692c4):
https://github.com/apache/spark/tree/v3.0.0-preview-rc2

The release files, including signatures, digests, etc., can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview-rc2-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1336/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview-rc2-docs/

The list of bug fixes going into 3.0.0 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12339177

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.
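
As a concrete example of such a workload, here is a minimal Scala smoke
test (a sketch, not an official test suite; it assumes the RC artifacts
are on the classpath and simply exercises the DataFrame API end to end):

    import org.apache.spark.sql.SparkSession

    object RcSmokeTest {
      def main(args: Array[String]): Unit = {
        // Run a tiny end-to-end DataFrame job against the release candidate.
        val spark = SparkSession.builder()
          .appName("rc-smoke-test")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        val counts = Seq("spark", "rc", "spark").toDF("word")
          .groupBy("word").count().collect()

        // Expect two distinct words; fail loudly if the result looks wrong.
        assert(counts.length == 2, s"unexpected result: ${counts.mkString(", ")}")
        spark.stop()
      }
    }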

If you're working in PySpark, you can set up a virtual env, install
the current RC, and see if anything important breaks. In Java/Scala,
you can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
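
As a sketch of the Java/Scala route, the staging repository above can be
added to an sbt build roughly like this (the artifact version string and
Scala version are assumptions based on the preview naming; adjust them to
match the staged artifacts):

    // build.sbt -- minimal setup for testing against the RC staging repo.
    scalaVersion := "2.12.10"  // assumed; use the Scala version the RC targets

    resolvers += "Spark 3.0.0-preview RC2 staging" at
      "https://repository.apache.org/content/repositories/orgapachespark-1336/"

    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.0-preview"

Afterwards, remove the org.apache.spark entries from ~/.ivy2/cache (or your
local Maven repository) so later builds don't silently resolve the stale RC.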

===========================================
What should happen to JIRA tickets still targeting 3.0.0?
===========================================

The current list of open tickets targeted at 3.0.0 can be found at
https://issues.apache.org/jira/projects/SPARK by searching for
"Target Version/s" = 3.0.0.

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately.

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That said, if there is a regression that has not been
correctly targeted, please ping me or a committer to help target the
issue.

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Xingbo Jiang <ji...@gmail.com>.
This vote passes! I'll follow up with a formal release announcement soon.

+1:
Sean Owen (binding)
Wenchen Fan (binding)
Hyukjin Kwon (binding)
Dongjoon Hyun (binding)
Takeshi Yamamuro

+0: None

-1: None

Thanks, everyone!

Xingbo

On Mon, Nov 4, 2019 at 9:35 AM Dongjoon Hyun <do...@gmail.com>
wrote:

> Hi, Xingbo.
>
> Could you send a vote result email to finalize this vote, please?
>
> Bests,
> Dongjoon.

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Dongjoon Hyun <do...@gmail.com>.
Hi, Xingbo.

Could you send a vote result email to finalize this vote, please?

Bests,
Dongjoon.

On Fri, Nov 1, 2019 at 2:55 PM Takeshi Yamamuro <li...@gmail.com>
wrote:

> +1, too.

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Takeshi Yamamuro <li...@gmail.com>.
+1, too.

On Sat, Nov 2, 2019 at 3:36 AM Hyukjin Kwon <gu...@gmail.com> wrote:

> +1

-- 
---
Takeshi Yamamuro

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Hyukjin Kwon <gu...@gmail.com>.
+1

On Fri, 1 Nov 2019, 15:36 Wenchen Fan, <cl...@gmail.com> wrote:

> The PR builder uses the Hadoop 2.7 profile, which makes me think that 2.7
> is more stable and that we should make releases using 2.7 by default.
>
> +1

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Dongjoon Hyun <do...@gmail.com>.
+1 for Apache Spark 3.0.0-preview (RC2).

Bests,
Dongjoon.

On Thu, Oct 31, 2019 at 11:36 PM Wenchen Fan <cl...@gmail.com> wrote:

> The PR builder uses the Hadoop 2.7 profile, which makes me think that 2.7
> is more stable and that we should make releases using 2.7 by default.
>
> +1

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Wenchen Fan <cl...@gmail.com>.
The PR builder uses the Hadoop 2.7 profile, which makes me think that 2.7
is more stable and that we should make releases using 2.7 by default.

+1

On Fri, Nov 1, 2019 at 7:16 AM Xiao Li <li...@databricks.com> wrote:

> Spark 3.0 will still use the Hadoop 2.7 profile by default, I think. The
> Hadoop 2.7 profile is much more stable than the Hadoop 3.2 profile.

Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Xiao Li <li...@databricks.com>.
Spark 3.0 will still use the Hadoop 2.7 profile by default, I think. The
Hadoop 2.7 profile is much more stable than the Hadoop 3.2 profile.

On Thu, Oct 31, 2019 at 3:54 PM Sean Owen <sr...@gmail.com> wrote:

> This isn't a big thing, but I see that the pyspark build includes
> Hadoop 2.7 rather than 3.2. Maybe later we can change the build to use
> 3.2 by default.
>
> Otherwise, the tests all seem to pass with JDK 8 / 11 with all
> profiles enabled, so I'm +1 on it.


Re: [VOTE] SPARK 3.0.0-preview (RC2)

Posted by Sean Owen <sr...@gmail.com>.
This isn't a big thing, but I see that the pyspark build includes
Hadoop 2.7 rather than 3.2. Maybe later we can change the build to use
3.2 by default.

Otherwise, the tests all seem to pass with JDK 8 / 11 with all
profiles enabled, so I'm +1 on it.

