Posted to dev@spark.apache.org by DB Tsai <d_...@apple.com.INVALID> on 2019/02/20 20:06:45 UTC

[VOTE] Release Apache Spark 2.4.1 (RC2)

Please vote on releasing the following candidate as Apache Spark version 2.4.1.

The vote is open until Feb 24 PST and passes if a majority of the PMC
votes cast are +1, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 2.4.1
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v2.4.1-rc2 (commit 229ad524cfd3f74dd7aa5fc9ba841ae223caa960):
https://github.com/apache/spark/tree/v2.4.1-rc2

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS
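As a concrete sketch of the verification flow: each artifact under the -bin/ directory ships with a detached .asc signature (checked with gpg after importing KEYS) and a .sha512 checksum. The commands below simulate only the checksum step with a local stand-in file so they can run anywhere; the artifact name is an assumption following the usual spark-<version>-bin-<hadoop>.tgz pattern.

```shell
# Simulated checksum check. For a real RC, download the artifact plus its
# .sha512 and .asc files first, and verify the signature with
# `gpg --import KEYS` followed by `gpg --verify <artifact>.asc <artifact>`.
echo "stand-in for a release artifact" > spark-2.4.1-bin-hadoop2.7.tgz
sha512sum spark-2.4.1-bin-hadoop2.7.tgz > spark-2.4.1-bin-hadoop2.7.tgz.sha512
# `sha512sum -c` exits non-zero (and prints FAILED) on a mismatch:
sha512sum -c spark-2.4.1-bin-hadoop2.7.tgz.sha512
```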

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1299/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-docs/

The list of bug fixes going into 2.4.1 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/2.4.1

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark, you can set up a virtual env and install
the current RC to see if anything important breaks. In Java/Scala, you
can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
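For the PySpark route, a minimal sketch of an isolated test environment (the pyspark tarball name under v2.4.1-rc2-bin/ is an assumption, and the network-bound install step is left commented):

```shell
# Create an isolated virtualenv so the RC cannot disturb a system install.
python3 -m venv /tmp/spark-241-rc2-env
. /tmp/spark-241-rc2-env/bin/activate
# Install the RC's pyspark and smoke-test it (tarball name assumed):
#   pip install https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/pyspark-2.4.1.tar.gz
#   python -c "import pyspark; print(pyspark.__version__)"
# Confirm the shell now resolves to the venv's interpreter:
python -c "import sys; print(sys.prefix)"
```

Afterwards, deactivate (or just delete the directory) so later builds don't pick up the RC by accident.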

===========================================
What should happen to JIRA tickets still targeting 2.4.1?
===========================================

The current list of open tickets targeted at 2.4.1 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.4.1

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else should be retargeted to an
appropriate release.
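The search described above corresponds to a JQL query along these lines (a sketch; tighten the status filter as you see fit):

```
project = SPARK AND resolution = Unresolved AND "Target Version/s" = 2.4.1
```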

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is a regression that has not been
correctly targeted, please ping me or a committer to help target the
issue.


DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Sean Owen <sr...@gmail.com>.
That looks like a change to restore some behavior that was removed in
2.2. It's not directly relevant to a release vote on 2.4.1. See the
existing discussion at
https://github.com/apache/spark/pull/22144#issuecomment-432258536 It
may indeed be a good thing to change but just continue the discussion
as you work on your PR.

On Thu, Feb 21, 2019 at 9:09 AM Parth Gandhi <pa...@gmail.com> wrote:
>
> Hello,
>         In https://issues.apache.org/jira/browse/SPARK-24935, I am getting requests from people who were hoping for the fix to be merged in Spark 2.4.1. The PR in question is here: https://github.com/apache/spark/pull/23778. I do not mind if we do not merge it for 2.4.1, and I do not want this to be considered a blocker for 2.4.1, but I would appreciate it if somebody could have a look and decide. I have not written unit tests for the PR yet, as I am relatively new to Catalyst and was hoping for someone to confirm whether the fix is progressing in the right direction before proceeding with the unit tests. Thank you.
>
> Regards,
> Parth Kamlesh Gandhi
>
>
> On Wed, Feb 20, 2019 at 11:34 PM Felix Cheung <fe...@hotmail.com> wrote:
>>
>> Could you hold for a bit - I have one more fix to get in
>>
>>
>> ________________________________
>> From: d_tsai@apple.com on behalf of DB Tsai <d_...@apple.com.invalid>
>> Sent: Wednesday, February 20, 2019 12:25 PM
>> To: Spark dev list
>> Cc: Cesar Delgado
>> Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
>>
>> Okay. Let's fail rc2, and I'll prepare rc3 with SPARK-26859.
>>
>> DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc
>>
>> > On Feb 20, 2019, at 12:11 PM, Marcelo Vanzin <va...@cloudera.com.INVALID> wrote:
>> >
>> > Just wanted to point out that
>> > https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
>> > and is marked as a correctness bug. (The fix is in the 2.4 branch,
>> > just not in rc2.)
>> >


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Parth Gandhi <pa...@gmail.com>.
Hello,
        In https://issues.apache.org/jira/browse/SPARK-24935, I am getting
requests from people who were hoping for the fix to be merged in
Spark 2.4.1. The PR in question is here:
https://github.com/apache/spark/pull/23778. I do not mind if we do not
merge it for 2.4.1, and I do not want this to be considered a blocker for
2.4.1, but I would appreciate it if somebody could have a look and decide.
I have not written unit tests for the PR yet, as I am relatively new
to Catalyst and was hoping for someone to confirm whether the fix is
progressing in the right direction before proceeding with the unit
tests. Thank you.

Regards,
Parth Kamlesh Gandhi



Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by shane knapp <sk...@berkeley.edu>.
https://issues.apache.org/jira/browse/SPARK-26742



On Thu, Mar 7, 2019 at 10:52 AM shane knapp <sk...@berkeley.edu> wrote:

> i'm ready to update the ubuntu workers/minikube/k8s to support the 4.1.2
> client:
> https://issues.apache.org/jira/browse/SPARK-26742
>
> i am more than comfortable with this build system update, both on the ops
> and spark project side.  we were incredibly far behind the release cycle
> for k8s and minikube, which was beginning to impact the dep graph.
> updating to at least k8s v1.13 and the 4.1.2 client lib gives us a lot of
> breathing room w/little worry about backwards compatibility.
>
> if this is something we're comfortable with doing for the 2.4.1 release
> (+master), then i'll need to take down the pull request builder for ~30
> mins (which will be its own email to dev@).
>
> shane
>
>
> --
> Shane Knapp
> UC Berkeley EECS Research / RISELab Staff Technical Lead
> https://rise.cs.berkeley.edu
>


-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by shane knapp <sk...@berkeley.edu>.
i'm ready to update the ubuntu workers/minikube/k8s to support the 4.1.2
client:
https://issues.apache.org/jira/browse/SPARK-26742

i am more than comfortable with this build system update, both on the ops
and spark project side.  we were incredibly far behind the release cycle
for k8s and minikube, which was beginning to impact the dep graph.
updating to at least k8s v1.13 and the 4.1.2 client lib gives us a lot of
breathing room w/little worry about backwards compatibility.

if this is something we're comfortable with doing for the 2.4.1 release
(+master), then i'll need to take down the pull request builder for ~30
mins (which will be its own email to dev@).

shane

On Wed, Mar 6, 2019 at 12:40 PM Stavros Kontopoulos <
stavros.kontopoulos@lightbend.com> wrote:

> Yes, it's a tough decision, and as we discussed today (
> https://docs.google.com/document/d/1pnF38NF6N5eM8DlK088XUW85Vms4V2uTsGZvSp8MNIA )
> "Kubernetes support window is 9 months, Spark is two years". So we may
> end up with old client versions on branches that are still supported,
> like 2.4.x, in the future.
> That gives us no choice but to upgrade, if we want to be on the safe side.
> We have tested 3.0.0 with 1.11 internally and it works, but I don't know
> what it means to run with old clients.
>
>
> On Wed, Mar 6, 2019 at 7:54 PM Sean Owen <sr...@gmail.com> wrote:
>
>> If the old client is basically unusable with the versions of K8S
>> people mostly use now, and the new client still works with older
>> versions, I could see including this in 2.4.1.
>>
>> Looking at
>> https://github.com/fabric8io/kubernetes-client#compatibility-matrix
>> it seems like the 4.1.1 client is needed for 1.10 and above. However
>> it no longer supports 1.7 and below.
>> We have 3.0.x, and versions through 4.0.x of the client support the
>> same K8S versions, so no real middle ground here.
>>
>> 1.7.0 came out June 2017, it seems. 1.10 was March 2018. Minor release
>> branches are maintained for 9 months per
>> https://kubernetes.io/docs/setup/version-skew-policy/
>>
>> Spark 2.4.0 came in Nov 2018. I suppose we could say it should have
>> used the newer client from the start as at that point (?) 1.7 and
>> earlier were already at least 7 months past EOL.
>> If we update the client in 2.4.1, versions of K8S as recently
>> 'supported' as a year ago won't work anymore. I'm guessing there are
>> still 1.7 users out there? That wasn't that long ago but if the
>> project and users generally move fast, maybe not.
>>
>> Normally I'd say, that's what the next minor release of Spark is for;
>> update if you want later infra. But there is no Spark 2.5.
>> I presume downstream distros could modify the dependency easily (?) if
>> needed and maybe already do. It wouldn't necessarily help end users.
>>
>> Does the 3.0.x client not work at all with 1.10+, or is it just unsupported?
>> If it 'basically works but no guarantees' I'd favor not updating. If
>> it doesn't work at all, hm. That's tough. I think I'd favor updating
>> the client but think it's a tough call both ways.
>>
>>
>>
>> On Wed, Mar 6, 2019 at 11:14 AM Stavros Kontopoulos
>> <st...@lightbend.com> wrote:
>> >
>> > Yes, Shane Knapp has done the work for that already, and tests also
>> pass; I am working on a PR now, and I could submit it for the 2.4 branch.
>> > I understand that this is a major dependency update, but the problem I
>> see is that the client version is so old that I don't think it makes
>> > much sense for current users who are on k8s 1.10, 1.11, etc. (
>> https://github.com/fabric8io/kubernetes-client#compatibility-matrix;
>> 3.0.0 does not even exist in there).
>> > I don't know what it means to use that old version with current k8s
>> clusters in terms of bugs etc.
>>
>
>
>

-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Felix Cheung <fe...@hotmail.com>.
There is SPARK-26604 we are looking into

________________________________
From: Saisai Shao <sa...@gmail.com>
Sent: Wednesday, March 6, 2019 6:05 PM
To: shane knapp
Cc: Stavros Kontopoulos; Sean Owen; DB Tsai; Spark dev list; d_tsai@apple.com
Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Do we have any other blocker/critical issues for Spark 2.4.1, or are we waiting for something to be fixed? I roughly searched JIRA; there seem to be no blocker/critical issues marked for 2.4.1.

Thanks
Saisai

shane knapp <sk...@berkeley.edu> wrote on Thursday, March 7, 2019 at 4:57 AM:
i'll be popping in to the sig-big-data meeting on the 20th to talk about stuff like this.


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Sean Owen <sr...@gmail.com>.
I don't think we'd fail the current RC for this change, no.

On Tue, Mar 12, 2019 at 3:51 AM Jakub Wozniak <ja...@cern.ch> wrote:
>
> Hello,
>
> Any more thoughts on this one?
> Will that be let in 2.4.1 or rather not?
>
> Thanks in advance,
> Jakub
>
>
> On 8 Mar 2019, at 11:26, Jakub Wozniak <ja...@cern.ch> wrote:
>
> Hi,
>
> To me it is backwards compatible with older HBase versions.
> The code actually only falls back to the newer API on exception.
>
> It would be great if this gets in.
> Otherwise a setup with HBase 2 + Spark 2.4 gets a bit complicated, as we are forced to use an older version of the HBase client (1.4.9) when running on YARN.
> In theory it is compatible, but we see some performance degradation while doing reads from HBase with the older client (we are investigating it now).
> We have had issues in the past when HBase server & client versions were not aligned, so this is not our favourite.
>
> Thanks,
> Jakub
>
>
> On 8 Mar 2019, at 11:15, Jakub Wozniak <ja...@cern.ch> wrote:
>
> I guess it is that one:
> https://github.com/apache/spark/commit/dfed439e33b7bf224dd412b0960402068d961c7b#diff-9ebb59b7b008c694a8f583b94bd24e1d
>
> Cheers,
> Jakub
>



Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Jakub Wozniak <ja...@cern.ch>.
Hello,

Any more thoughts on this one?
Will that be let in 2.4.1 or rather not?

Thanks in advance,
Jakub




On 7 Mar 2019, at 17:25, Sean Owen <sr...@gmail.com>> wrote:

Do you know what change fixed it?
If it's not a regression from 2.4.0 it wouldn't necessarily go into a
maintenance release. If there were no downside, maybe; does it cause
any incompatibility with older HBase versions?
It may be that this support is targeted for Spark 3 on purpose, which
is probably due in the middle of the year.

On Thu, Mar 7, 2019 at 8:57 AM Jakub Wozniak <ja...@cern.ch>> wrote:

Hello,

I have a question regarding the 2.4.1 release.

It looks like Spark 2.4 (and the 2.4.1 RC) is not exactly compatible with HBase 2.x+ in YARN mode.
The problem is in the org.apache.spark.deploy.security.HbaseDelegationTokenProvider class, which expects a specific version of the TokenUtil class from HBase that was changed between HBase 1.x & 2.x.
On top of that, the HadoopDelegationTokenManager does not use the ServiceLoader mechanism, so I cannot attach my own provider (the providers are hardcoded).

It seems that both problems are resolved on the Spark master branch.

Is there any reason not to include this fix in the 2.4.1 release?
If so when do you plan to release it (the fix for Hbase)?

Or maybe there is something I’ve overlooked, please correct me if I’m wrong.

Best regards,
Jakub


On 7 Mar 2019, at 03:04, Saisai Shao <sa...@gmail.com>> wrote:

Do we have other block/critical issues for Spark 2.4.1 or waiting something to be fixed? I roughly searched the JIRA, seems there's no block/critical issues marked for 2.4.1.

Thanks
Saisai

shane knapp <sk...@berkeley.edu>> 于2019年3月7日周四 上午4:57写道:

i'll be popping in to the sig-big-data meeting on the 20th to talk about stuff like this.

On Wed, Mar 6, 2019 at 12:40 PM Stavros Kontopoulos <st...@lightbend.com>> wrote:

Yes its a touch decision and as we discussed today (https://docs.google.com/document/d/1pnF38NF6N5eM8DlK088XUW85Vms4V2uTsGZvSp8MNIA)
"Kubernetes support window is 9 months, Spark is two years". So we may end up with old client versions on branches still supported like 2.4.x in the future.
That gives us no choice but to upgrade, if we want to be on the safe side. We have tested 3.0.0 with 1.11 internally and it works but I dont know what it means to run with old
clients.


On Wed, Mar 6, 2019 at 7:54 PM Sean Owen <sr...@gmail.com>> wrote:

If the old client is basically unusable with the versions of K8S
people mostly use now, and the new client still works with older
versions, I could see including this in 2.4.1.

Looking at https://github.com/fabric8io/kubernetes-client#compatibility-matrix
it seems like the 4.1.1 client is needed for 1.10 and above. However
it no longer supports 1.7 and below.
We have 3.0.x, and versions through 4.0.x of the client support the
same K8S versions, so no real middle ground here.

1.7.0 came out June 2017, it seems. 1.10 was March 2018. Minor release
branches are maintained for 9 months per
https://kubernetes.io/docs/setup/version-skew-policy/

Spark 2.4.0 came in Nov 2018. I suppose we could say it should have
used the newer client from the start as at that point (?) 1.7 and
earlier were already at least 7 months past EOL.
If we update the client in 2.4.1, versions of K8S as recently
'supported' as a year ago won't work anymore. I'm guessing there are
still 1.7 users out there? That wasn't that long ago but if the
project and users generally move fast, maybe not.

Normally I'd say, that's what the next minor release of Spark is for;
update if you want later infra. But there is no Spark 2.5.
I presume downstream distros could modify the dependency easily (?) if
needed and maybe already do. It wouldn't necessarily help end users.

Does the 3.0.x client not work at all with 1.10+, or is it just unsupported?
If it 'basically works but no guarantees' I'd favor not updating. If
it doesn't work at all, hm. That's tough. I think I'd favor updating
the client but think it's a tough call both ways.



On Wed, Mar 6, 2019 at 11:14 AM Stavros Kontopoulos
<st...@lightbend.com>> wrote:

Yes, Shane Knapp has done the work for that already, and tests also pass. I am working on a PR now; I could submit it for the 2.4 branch.
I understand that this is a major dependency update, but the problem I see is that the client version is so old that I don't think it makes
much sense for current users who are on k8s 1.10, 1.11, etc. (https://github.com/fabric8io/kubernetes-client#compatibility-matrix; 3.0.0 does not even exist in there).
I don't know what it means to use that old version with current k8s clusters in terms of bugs etc.





--
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu
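The constraints Sean lays out above can be sketched as a tiny lookup. This is an illustration only: the version bounds are assumptions read off this thread (4.1.1 needed for K8S 1.10+, dropping 1.7 and below; 3.0.x through 4.0.x covering the same older servers), and the fabric8 compatibility matrix remains the authoritative source.

```python
# Sketch of the client/server compatibility discussed in the thread. The
# bounds below are assumptions taken from the messages above, not from the
# fabric8 matrix itself.

def usable_client_lines(k8s_minor: int) -> list:
    """Which fabric8 kubernetes-client lines cover K8S 1.<k8s_minor>?"""
    lines = []
    if k8s_minor <= 9:   # 3.0.x..4.0.x target the older servers (assumed upper bound)
        lines.append("3.0.x-4.0.x")
    if k8s_minor >= 8:   # 4.1.1 is needed for 1.10+, and drops 1.7 and below
        lines.append("4.1.1+")
    return lines

# No single line covers both 1.7 and 1.10 -- the "no real middle ground"
# Sean describes: upgrading the client in 2.4.1 drops 1.7-and-older servers.
```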



---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org





Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Jakub Wozniak <ja...@cern.ch>.
Hi,

To me it is backwards compatible with older HBase versions.
The code actually only falls back to the newer API on exception.

It would be great if this gets in.
Otherwise a setup with HBase 2 + Spark 2.4 gets a bit complicated, as we are forced to use an older version of the HBase client (1.4.9) when running on YARN.
It is compatible in theory, but we see some performance degradation while doing reads from HBase with the older client (we are investigating it now).
We have had issues in the past when HBase server & client versions were not aligned, so this is not our preferred setup.

Thanks,
Jakub
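The fallback Jakub describes (try the newer HBase API first, drop back to the old one when the call fails) is, in spirit, the following pattern. The class and method names below are hypothetical stand-ins for illustration, not the real Spark or HBase ones; in Spark the probing is done with Java reflection against HBase's TokenUtil.

```python
# Illustration of "fall back to the older API on exception". The token-util
# classes are made up: the 2.x-style one exposes obtain_token(connection),
# the 1.x-style one only obtain_token_legacy(configuration).

class TokenUtilV2:
    def obtain_token(self, connection):
        return ("token", "via-connection", connection)

class TokenUtilV1:
    def obtain_token_legacy(self, configuration):
        return ("token", "via-configuration", configuration)

def obtain_delegation_token(token_util, connection, configuration):
    """Probe for the newer entry point first; fall back when it is absent."""
    try:
        return token_util.obtain_token(connection)
    except AttributeError:  # method absent: an older client library
        return token_util.obtain_token_legacy(configuration)
```

Either client version then works through the same call site, which is why the change can be backwards compatible.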






Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Jakub Wozniak <ja...@cern.ch>.
I guess it is that one:
https://github.com/apache/spark/commit/dfed439e33b7bf224dd412b0960402068d961c7b#diff-9ebb59b7b008c694a8f583b94bd24e1d

Cheers,
Jakub





Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Sean Owen <sr...@gmail.com>.
Do you know what change fixed it?
If it's not a regression from 2.4.0 it wouldn't necessarily go into a
maintenance release. If there were no downside, maybe; does it cause
any incompatibility with older HBase versions?
It may be that this support is targeted for Spark 3 on purpose, which
is probably due in the middle of the year.



Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Jakub Wozniak <ja...@cern.ch>.
Hello,

I have a question regarding the 2.4.1 release.

It looks like Spark 2.4 (and the 2.4.1 RC) is not exactly compatible with HBase 2.x+ in YARN mode.
The problem is in the org.apache.spark.deploy.security.HBaseDelegationTokenProvider class, which expects a specific version of the TokenUtil class from HBase that changed between HBase 1.x & 2.x.
On top of that, the HadoopDelegationTokenManager does not use the ServiceLoader mechanism, so I cannot attach my own provider (the providers are hardcoded).

It seems that both problems are resolved on the Spark master branch.

Is there any reason not to include this fix in the 2.4.1 release?
If so when do you plan to release it (the fix for Hbase)?

Or maybe there is something I’ve overlooked, please correct me if I’m wrong.

Best regards,
Jakub
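Jakub's second point, that the providers are hardcoded rather than discovered, can be seen with a toy registry. On the JVM the pluggable mechanism would be java.util.ServiceLoader plus a META-INF/services entry; the Python sketch below only illustrates the difference, and the provider names are made up.

```python
# Hardcoded providers (roughly the Spark 2.4 situation described above):
# users cannot add their own without patching Spark.
HARDCODED_PROVIDERS = ("hadoopfs", "hive", "hbase")

class ProviderRegistry:
    """A discoverable registry: third parties can register providers at
    runtime, the way a ServiceLoader-based design would allow."""

    def __init__(self):
        self._providers = list(HARDCODED_PROVIDERS)

    def register(self, name):
        self._providers.append(name)

    def providers(self):
        return tuple(self._providers)
```

With such a mechanism, a custom delegation token provider could be attached without modifying Spark itself.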




Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by DB Tsai <db...@dbtsai.com.INVALID>.
Saisai,

There is no blocker now. I ran into some difficulties in publishing the
jars to Nexus. The publish task finished, but Nexus gave me the
following error.


failureMessage: Failed to validate the pgp signature of
'/org/apache/spark/spark-streaming-flume-assembly_2.11/2.4.1/spark-streaming-flume-assembly_2.11-2.4.1-tests.jar',
check the logs.

I am sure my key is in the key server, and the weird thing is that it fails
on different jars each time I run the publish script.

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 42E5B25A8F7A82C1



Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Saisai Shao <sa...@gmail.com>.
Do we have any other blocker/critical issues for Spark 2.4.1, or are we
waiting for something to be fixed? I roughly searched JIRA, and it seems
there are no blocker/critical issues marked for 2.4.1.

Thanks
Saisai


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by shane knapp <sk...@berkeley.edu>.
i'll be popping in to the sig-big-data meeting on the 20th to talk about
stuff like this.

On Wed, Mar 6, 2019 at 12:40 PM Stavros Kontopoulos <
stavros.kontopoulos@lightbend.com> wrote:

> Yes its a touch decision and as we discussed today (
> https://docs.google.com/document/d/1pnF38NF6N5eM8DlK088XUW85Vms4V2uTsGZvSp8MNIA
> )
> "Kubernetes support window is 9 months, Spark is two years". So we may
> end up with old client versions on branches still supported like 2.4.x in
> the future.
> That gives us no choice but to upgrade, if we want to be on the safe side.
> We have tested 3.0.0 with 1.11 internally and it works but I dont know what
> it means to run with old
> clients.
>
>
> On Wed, Mar 6, 2019 at 7:54 PM Sean Owen <sr...@gmail.com> wrote:
>
>> If the old client is basically unusable with the versions of K8S
>> people mostly use now, and the new client still works with older
>> versions, I could see including this in 2.4.1.
>>
>> Looking at
>> https://github.com/fabric8io/kubernetes-client#compatibility-matrix
>> it seems like the 4.1.1 client is needed for 1.10 and above. However
>> it no longer supports 1.7 and below.
>> We have 3.0.x, and versions through 4.0.x of the client support the
>> same K8S versions, so no real middle ground here.
>>
>> 1.7.0 came out June 2017, it seems. 1.10 was March 2018. Minor release
>> branches are maintained for 9 months per
>> https://kubernetes.io/docs/setup/version-skew-policy/
>>
>> Spark 2.4.0 came in Nov 2018. I suppose we could say it should have
>> used the newer client from the start as at that point (?) 1.7 and
>> earlier were already at least 7 months past EOL.
>> If we update the client in 2.4.1, versions of K8S as recently
>> 'supported' as a year ago won't work anymore. I'm guessing there are
>> still 1.7 users out there? That wasn't that long ago but if the
>> project and users generally move fast, maybe not.
>>
>> Normally I'd say, that's what the next minor release of Spark is for;
>> update if you want later infra. But there is no Spark 2.5.
>> I presume downstream distros could modify the dependency easily (?) if
>> needed and maybe already do. It wouldn't necessarily help end users.
>>
>> Does the 3.0.x client not work at all with 1.10+ or just unsupported.
>> If it 'basically works but no guarantees' I'd favor not updating. If
>> it doesn't work at all, hm. That's tough. I think I'd favor updating
>> the client but think it's a tough call both ways.
>>
>>
>>
>> On Wed, Mar 6, 2019 at 11:14 AM Stavros Kontopoulos
>> <st...@lightbend.com> wrote:
>> >
>> > Yes, Shane Knapp has done the work for that already, and the tests
>> pass. I am working on a PR now; I could submit it for the 2.4 branch.
>> > I understand that this is a major dependency update, but the problem I
>> see is that the client version is so old that I don't think it makes
>> > much sense for current users who are on k8s 1.10, 1.11, etc. (
>> https://github.com/fabric8io/kubernetes-client#compatibility-matrix;
>> 3.0.0 does not even exist in there).
>> > I don't know what it means to use that old version with current k8s
>> clusters in terms of bugs etc.
>>
>
>
>

-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Stavros Kontopoulos <st...@lightbend.com>.
Yes, it's a tough decision, and as we discussed today (
https://docs.google.com/document/d/1pnF38NF6N5eM8DlK088XUW85Vms4V2uTsGZvSp8MNIA
)
"Kubernetes support window is 9 months, Spark is two years". So we may end
up with old client versions on branches that are still supported, like 2.4.x,
in the future.
That gives us no choice but to upgrade, if we want to be on the safe side.
We have tested 3.0.0 with 1.11 internally and it works, but I don't know what
it means to run with old clients.


On Wed, Mar 6, 2019 at 7:54 PM Sean Owen <sr...@gmail.com> wrote:

> If the old client is basically unusable with the versions of K8S
> people mostly use now, and the new client still works with older
> versions, I could see including this in 2.4.1.
>
> Looking at
> https://github.com/fabric8io/kubernetes-client#compatibility-matrix
> it seems like the 4.1.1 client is needed for 1.10 and above. However
> it no longer supports 1.7 and below.
> We have 3.0.x, and versions through 4.0.x of the client support the
> same K8S versions, so no real middle ground here.
>
> 1.7.0 came out June 2017, it seems. 1.10 was March 2018. Minor release
> branches are maintained for 9 months per
> https://kubernetes.io/docs/setup/version-skew-policy/
>
> Spark 2.4.0 came in Nov 2018. I suppose we could say it should have
> used the newer client from the start as at that point (?) 1.7 and
> earlier were already at least 7 months past EOL.
> If we update the client in 2.4.1, versions of K8S as recently
> 'supported' as a year ago won't work anymore. I'm guessing there are
> still 1.7 users out there? That wasn't that long ago but if the
> project and users generally move fast, maybe not.
>
> Normally I'd say, that's what the next minor release of Spark is for;
> update if you want later infra. But there is no Spark 2.5.
> I presume downstream distros could modify the dependency easily (?) if
> needed and maybe already do. It wouldn't necessarily help end users.
>
> Does the 3.0.x client not work at all with 1.10+, or is it just unsupported?
> If it 'basically works but no guarantees' I'd favor not updating. If
> it doesn't work at all, hm. That's tough. I think I'd favor updating
> the client but think it's a tough call both ways.
>
>
>
> On Wed, Mar 6, 2019 at 11:14 AM Stavros Kontopoulos
> <st...@lightbend.com> wrote:
> >
> > Yes, Shane Knapp has done the work for that already, and the tests
> pass. I am working on a PR now; I could submit it for the 2.4 branch.
> > I understand that this is a major dependency update, but the problem I
> see is that the client version is so old that I don't think it makes
> > much sense for current users who are on k8s 1.10, 1.11, etc. (
> https://github.com/fabric8io/kubernetes-client#compatibility-matrix;
> 3.0.0 does not even exist in there).
> > I don't know what it means to use that old version with current k8s
> clusters in terms of bugs etc.
>

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Sean Owen <sr...@gmail.com>.
If the old client is basically unusable with the versions of K8S
people mostly use now, and the new client still works with older
versions, I could see including this in 2.4.1.

Looking at https://github.com/fabric8io/kubernetes-client#compatibility-matrix
it seems like the 4.1.1 client is needed for 1.10 and above. However
it no longer supports 1.7 and below.
We have 3.0.x, and versions through 4.0.x of the client support the
same K8S versions, so no real middle ground here.

1.7.0 came out June 2017, it seems. 1.10 was March 2018. Minor release
branches are maintained for 9 months per
https://kubernetes.io/docs/setup/version-skew-policy/

Spark 2.4.0 came in Nov 2018. I suppose we could say it should have
used the newer client from the start as at that point (?) 1.7 and
earlier were already at least 7 months past EOL.
If we update the client in 2.4.1, versions of K8S as recently
'supported' as a year ago won't work anymore. I'm guessing there are
still 1.7 users out there? That wasn't that long ago but if the
project and users generally move fast, maybe not.

Normally I'd say, that's what the next minor release of Spark is for;
update if you want later infra. But there is no Spark 2.5.
I presume downstream distros could modify the dependency easily (?) if
needed and maybe already do. It wouldn't necessarily help end users.
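The kind of downstream dependency override mentioned here could look like the following sbt fragment. This is only a sketch: the 4.1.2 version is the one Shane reports having tested elsewhere in this thread, and whether it resolves cleanly against a Spark 2.4.x build is an assumption.

```scala
// build.sbt fragment (sketch only): a downstream build forcing a newer
// fabric8 kubernetes-client in place of Spark 2.4's default 3.0.x client.
dependencyOverrides += "io.fabric8" % "kubernetes-client" % "4.1.2"
```

Maven users would do the equivalent by pinning the version in a `<dependencyManagement>` section.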

Does the 3.0.x client not work at all with 1.10+, or is it just unsupported?
If it 'basically works but no guarantees', I'd favor not updating. If
it doesn't work at all, hm. That's tough. I think I'd favor updating
the client, but it's a tough call both ways.



On Wed, Mar 6, 2019 at 11:14 AM Stavros Kontopoulos
<st...@lightbend.com> wrote:
>
> > Yes, Shane Knapp has done the work for that already, and the tests pass. I am working on a PR now; I could submit it for the 2.4 branch.
> > I understand that this is a major dependency update, but the problem I see is that the client version is so old that I don't think it makes
> > much sense for current users who are on k8s 1.10, 1.11, etc. (https://github.com/fabric8io/kubernetes-client#compatibility-matrix; 3.0.0 does not even exist in there).
> > I don't know what it means to use that old version with current k8s clusters in terms of bugs etc.

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Stavros Kontopoulos <st...@lightbend.com>.
Yes, Shane Knapp has done the work for that already, and the tests pass. I
am working on a PR now; I could submit it for the 2.4 branch.
I understand that this is a major dependency update, but the problem I see
is that the client version is so old that I don't think it makes
much sense for current users who are on k8s 1.10, 1.11, etc. (
https://github.com/fabric8io/kubernetes-client#compatibility-matrix; 3.0.0
does not even exist in there).
I don't know what it means to use that old version with current k8s clusters
in terms of bugs etc.

On Wed, Mar 6, 2019 at 6:32 PM shane knapp <sk...@berkeley.edu> wrote:

> On Wed, Mar 6, 2019 at 7:17 AM Sean Owen <sr...@gmail.com> wrote:
>
>> The problem is that that's a major dependency upgrade in a maintenance
>> release. It didn't seem to work when we applied it to master. I don't
>> think it would block a release.
>>
> i tested the k8s client 4.1.2 against master a couple of weeks back and
> it worked fine. i will doubly confirm when i get into the office today.
>
> --
> Shane Knapp
> UC Berkeley EECS Research / RISELab Staff Technical Lead
> https://rise.cs.berkeley.edu
>

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by shane knapp <sk...@berkeley.edu>.
On Wed, Mar 6, 2019 at 7:17 AM Sean Owen <sr...@gmail.com> wrote:

> The problem is that that's a major dependency upgrade in a maintenance
> release. It didn't seem to work when we applied it to master. I don't
> think it would block a release.
>
i tested the k8s client 4.1.2 against master a couple of weeks back and it
worked fine. i will doubly confirm when i get into the office today.

-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Sean Owen <sr...@gmail.com>.
The problem is that that's a major dependency upgrade in a maintenance
release. It didn't seem to work when we applied it to master. I don't
think it would block a release.

On Wed, Mar 6, 2019 at 6:32 AM Stavros Kontopoulos
<st...@lightbend.com> wrote:
>
> We need to resolve this https://issues.apache.org/jira/browse/SPARK-26742 as well for 2.4.1, to make k8s support meaningful as many people are now on 1.11+
>
> Stavros
>
> On Tue, Mar 5, 2019 at 3:12 PM Saisai Shao <sa...@gmail.com> wrote:
>>
>> Hi DB,
>>
>> I saw that we already have 6 RCs, but the latest vote I can find is for RC2; were they all canceled?
>>
>> Thanks
>> Saisai

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Stavros Kontopoulos <st...@lightbend.com>.
We need to resolve https://issues.apache.org/jira/browse/SPARK-26742
as well for 2.4.1, to make k8s support meaningful, as many people are now on
1.11+.

Stavros

On Tue, Mar 5, 2019 at 3:12 PM Saisai Shao <sa...@gmail.com> wrote:

> Hi DB,
>
> I saw that we already have 6 RCs, but the latest vote I can find is for
> RC2; were they all canceled?
>
> Thanks
> Saisai
>
> DB Tsai <db...@dbtsai.com.invalid> 于2019年2月22日周五 上午4:51写道:
>
>> I am cutting a new rc4 with the fix from Felix. Thanks.
>>
>> Sincerely,
>>
>> DB Tsai
>> ----------------------------------------------------------
>> Web: https://www.dbtsai.com
>> PGP Key ID: 0359BC9965359766
>>
>> On Thu, Feb 21, 2019 at 8:57 AM Felix Cheung <fe...@hotmail.com>
>> wrote:
>> >
>> > I merged the fix to 2.4.
>> >
>> >
>> > ________________________________
>> > From: Felix Cheung <fe...@hotmail.com>
>> > Sent: Wednesday, February 20, 2019 9:34 PM
>> > To: DB Tsai; Spark dev list
>> > Cc: Cesar Delgado
>> > Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
>> >
>> > Could you hold for a bit - I have one more fix to get in
>> >
>> >
>> > ________________________________
>> > From: d_tsai@apple.com on behalf of DB Tsai <d_...@apple.com.invalid>
>> > Sent: Wednesday, February 20, 2019 12:25 PM
>> > To: Spark dev list
>> > Cc: Cesar Delgado
>> > Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
>> >
>> > Okay. Let's fail rc2, and I'll prepare rc3 with SPARK-26859.
>> >
>> > DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple,
>> Inc
>> >
>> > > On Feb 20, 2019, at 12:11 PM, Marcelo Vanzin
>> <va...@cloudera.com.INVALID> wrote:
>> > >
>> > > Just wanted to point out that
>> > > https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
>> > > and is marked as a correctness bug. (The fix is in the 2.4 branch,
>> > > just not in rc2.)
>> > >
>> > > On Wed, Feb 20, 2019 at 12:07 PM DB Tsai <d_...@apple.com.invalid>
>> wrote:
>> > >>
>> > >> Please vote on releasing the following candidate as Apache Spark
>> version 2.4.1.
>> > >>
>> > >> The vote is open until Feb 24 PST and passes if a majority +1 PMC
>> votes are cast, with
>> > >> a minimum of 3 +1 votes.
>> > >>
>> > >> [ ] +1 Release this package as Apache Spark 2.4.1
>> > >> [ ] -1 Do not release this package because ...
>> > >>
>> > >> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>> > >>
>> > >> The tag to be voted on is v2.4.1-rc2 (commit
>> 229ad524cfd3f74dd7aa5fc9ba841ae223caa960):
>> > >> https://github.com/apache/spark/tree/v2.4.1-rc2
>> > >>
>> > >> The release files, including signatures, digests, etc. can be found
>> at:
>> > >> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/
>> > >>
>> > >> Signatures used for Spark RCs can be found in this file:
>> > >> https://dist.apache.org/repos/dist/dev/spark/KEYS
>> > >>
>> > >> The staging repository for this release can be found at:
>> > >>
>> https://repository.apache.org/content/repositories/orgapachespark-1299/
>> > >>
>> > >> The documentation corresponding to this release can be found at:
>> > >> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-docs/
>> > >>
>> > >> The list of bug fixes going into 2.4.1 can be found at the following
>> URL:
>> > >> https://issues.apache.org/jira/projects/SPARK/versions/2.4.1
>> > >>
>> > >> FAQ
>> > >>
>> > >> =========================
>> > >> How can I help test this release?
>> > >> =========================
>> > >>
>> > >> If you are a Spark user, you can help us test this release by taking
>> > >> an existing Spark workload and running on this release candidate,
>> then
>> > >> reporting any regressions.
>> > >>
>> > >> If you're working in PySpark you can set up a virtual env and install
>> > >> the current RC and see if anything important breaks; in Java/Scala
>> > >> you can add the staging repository to your project's resolvers and test
>> > >> with the RC (make sure to clean up the artifact cache before/after so
>> > >> you don't end up building with an out-of-date RC going forward).
>> > >>
>> > >> ===========================================
>> > >> What should happen to JIRA tickets still targeting 2.4.1?
>> > >> ===========================================
>> > >>
>> > >> The current list of open tickets targeted at 2.4.1 can be found at:
>> > >> https://issues.apache.org/jira/projects/SPARK and search for
>> "Target Version/s" = 2.4.1
>> > >>
>> > >> Committers should look at those and triage. Extremely important bug
>> > >> fixes, documentation, and API tweaks that impact compatibility should
>> > >> be worked on immediately. Everything else please retarget to an
>> > >> appropriate release.
>> > >>
>> > >> ==================
>> > >> But my bug isn't fixed?
>> > >> ==================
>> > >>
>> > >> In order to make timely releases, we will typically not hold the
>> > >> release unless the bug in question is a regression from the previous
>> > >> release. That being said, if there is something which is a regression
>> > >> that has not been correctly targeted please ping me or a committer to
>> > >> help target the issue.
>> > >>
>> > >>
>> > >> DB Tsai | Siri Open Source Technologies [not a contribution] | 
>> Apple, Inc
>> > >>
>> > >>
>> > >> ---------------------------------------------------------------------
>> > >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> > >>
>> > >
>> > >
>> > > --
>> > > Marcelo
>> > >
>> > > ---------------------------------------------------------------------
>> > > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> > >
>> >
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Saisai Shao <sa...@gmail.com>.
Hi DB,

I saw that we already have 6 RCs, but the latest vote I can find is for RC2;
were they all canceled?

Thanks
Saisai

DB Tsai <db...@dbtsai.com.invalid> 于2019年2月22日周五 上午4:51写道:

> I am cutting a new rc4 with the fix from Felix. Thanks.
>
> Sincerely,
>
> DB Tsai
> ----------------------------------------------------------
> Web: https://www.dbtsai.com
> PGP Key ID: 0359BC9965359766
>
> On Thu, Feb 21, 2019 at 8:57 AM Felix Cheung <fe...@hotmail.com>
> wrote:
> >
> > I merged the fix to 2.4.
> >
> >
> > ________________________________
> > From: Felix Cheung <fe...@hotmail.com>
> > Sent: Wednesday, February 20, 2019 9:34 PM
> > To: DB Tsai; Spark dev list
> > Cc: Cesar Delgado
> > Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
> >
> > Could you hold for a bit - I have one more fix to get in
> >
> >
> > ________________________________
> > From: d_tsai@apple.com on behalf of DB Tsai <d_...@apple.com.invalid>
> > Sent: Wednesday, February 20, 2019 12:25 PM
> > To: Spark dev list
> > Cc: Cesar Delgado
> > Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
> >
> > Okay. Let's fail rc2, and I'll prepare rc3 with SPARK-26859.
> >
> > DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple,
> Inc
> >
> > > On Feb 20, 2019, at 12:11 PM, Marcelo Vanzin
> <va...@cloudera.com.INVALID> wrote:
> > >
> > > Just wanted to point out that
> > > https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
> > > and is marked as a correctness bug. (The fix is in the 2.4 branch,
> > > just not in rc2.)
> > >
> > > On Wed, Feb 20, 2019 at 12:07 PM DB Tsai <d_...@apple.com.invalid>
> wrote:
> > >>
> > >> Please vote on releasing the following candidate as Apache Spark
> version 2.4.1.
> > >>
> > >> The vote is open until Feb 24 PST and passes if a majority +1 PMC
> votes are cast, with
> > >> a minimum of 3 +1 votes.
> > >>
> > >> [ ] +1 Release this package as Apache Spark 2.4.1
> > >> [ ] -1 Do not release this package because ...
> > >>
> > >> To learn more about Apache Spark, please see http://spark.apache.org/
> > >>
> > >> The tag to be voted on is v2.4.1-rc2 (commit
> 229ad524cfd3f74dd7aa5fc9ba841ae223caa960):
> > >> https://github.com/apache/spark/tree/v2.4.1-rc2
> > >>
> > >> The release files, including signatures, digests, etc. can be found
> at:
> > >> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/
> > >>
> > >> Signatures used for Spark RCs can be found in this file:
> > >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> > >>
> > >> The staging repository for this release can be found at:
> > >>
> https://repository.apache.org/content/repositories/orgapachespark-1299/
> > >>
> > >> The documentation corresponding to this release can be found at:
> > >> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-docs/
> > >>
> > >> The list of bug fixes going into 2.4.1 can be found at the following
> URL:
> > >> https://issues.apache.org/jira/projects/SPARK/versions/2.4.1
> > >>
> > >> FAQ
> > >>
> > >> =========================
> > >> How can I help test this release?
> > >> =========================
> > >>
> > >> If you are a Spark user, you can help us test this release by taking
> > >> an existing Spark workload and running on this release candidate, then
> > >> reporting any regressions.
> > >>
> > >> If you're working in PySpark you can set up a virtual env and install
> > >> the current RC and see if anything important breaks; in Java/Scala
> > >> you can add the staging repository to your project's resolvers and test
> > >> with the RC (make sure to clean up the artifact cache before/after so
> > >> you don't end up building with an out-of-date RC going forward).
> > >>
> > >> ===========================================
> > >> What should happen to JIRA tickets still targeting 2.4.1?
> > >> ===========================================
> > >>
> > >> The current list of open tickets targeted at 2.4.1 can be found at:
> > >> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 2.4.1
> > >>
> > >> Committers should look at those and triage. Extremely important bug
> > >> fixes, documentation, and API tweaks that impact compatibility should
> > >> be worked on immediately. Everything else please retarget to an
> > >> appropriate release.
> > >>
> > >> ==================
> > >> But my bug isn't fixed?
> > >> ==================
> > >>
> > >> In order to make timely releases, we will typically not hold the
> > >> release unless the bug in question is a regression from the previous
> > >> release. That being said, if there is something which is a regression
> > >> that has not been correctly targeted please ping me or a committer to
> > >> help target the issue.
> > >>
> > >>
> > >> DB Tsai | Siri Open Source Technologies [not a contribution] | 
> Apple, Inc
> > >>
> > >>
> > >> ---------------------------------------------------------------------
> > >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> > >>
> > >
> > >
> > > --
> > > Marcelo
> > >
> > > ---------------------------------------------------------------------
> > > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> > >
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by DB Tsai <db...@dbtsai.com.INVALID>.
I am cutting a new rc4 with the fix from Felix. Thanks.

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 0359BC9965359766

On Thu, Feb 21, 2019 at 8:57 AM Felix Cheung <fe...@hotmail.com> wrote:
>
> I merged the fix to 2.4.
>
>
> ________________________________
> From: Felix Cheung <fe...@hotmail.com>
> Sent: Wednesday, February 20, 2019 9:34 PM
> To: DB Tsai; Spark dev list
> Cc: Cesar Delgado
> Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
>
> Could you hold for a bit - I have one more fix to get in
>
>
> ________________________________
> From: d_tsai@apple.com on behalf of DB Tsai <d_...@apple.com.invalid>
> Sent: Wednesday, February 20, 2019 12:25 PM
> To: Spark dev list
> Cc: Cesar Delgado
> Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)
>
> Okay. Let's fail rc2, and I'll prepare rc3 with SPARK-26859.
>
> DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc
>
> > On Feb 20, 2019, at 12:11 PM, Marcelo Vanzin <va...@cloudera.com.INVALID> wrote:
> >
> > Just wanted to point out that
> > https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
> > and is marked as a correctness bug. (The fix is in the 2.4 branch,
> > just not in rc2.)
> >
> > On Wed, Feb 20, 2019 at 12:07 PM DB Tsai <d_...@apple.com.invalid> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark version 2.4.1.
> >>
> >> The vote is open until Feb 24 PST and passes if a majority +1 PMC votes are cast, with
> >> a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 2.4.1
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v2.4.1-rc2 (commit 229ad524cfd3f74dd7aa5fc9ba841ae223caa960):
> >> https://github.com/apache/spark/tree/v2.4.1-rc2
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1299/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-docs/
> >>
> >> The list of bug fixes going into 2.4.1 can be found at the following URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/2.4.1
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >>
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark you can set up a virtual env and install
> >> the current RC and see if anything important breaks; in Java/Scala
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 2.4.1?
> >> ===========================================
> >>
> >> The current list of open tickets targeted at 2.4.1 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.4.1
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >>
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >>
> >> DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc
> >>
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
> >
> >
> > --
> > Marcelo
> >
> > ---------------------------------------------------------------------
> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Felix Cheung <fe...@hotmail.com>.
I merged the fix to 2.4.


________________________________
From: Felix Cheung <fe...@hotmail.com>
Sent: Wednesday, February 20, 2019 9:34 PM
To: DB Tsai; Spark dev list
Cc: Cesar Delgado
Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Could you hold for a bit - I have one more fix to get in


________________________________
From: d_tsai@apple.com on behalf of DB Tsai <d_...@apple.com.invalid>
Sent: Wednesday, February 20, 2019 12:25 PM
To: Spark dev list
Cc: Cesar Delgado
Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Okay. Let's fail rc2, and I'll prepare rc3 with SPARK-26859.

DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc

> On Feb 20, 2019, at 12:11 PM, Marcelo Vanzin <va...@cloudera.com.INVALID> wrote:
>
> Just wanted to point out that
> https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
> and is marked as a correctness bug. (The fix is in the 2.4 branch,
> just not in rc2.)
>
> On Wed, Feb 20, 2019 at 12:07 PM DB Tsai <d_...@apple.com.invalid> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 2.4.1.
>>
>> The vote is open until Feb 24 PST and passes if a majority +1 PMC votes are cast, with
>> a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 2.4.1
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.4.1-rc2 (commit 229ad524cfd3f74dd7aa5fc9ba841ae223caa960):
>> https://github.com/apache/spark/tree/v2.4.1-rc2
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1299/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-docs/
>>
>> The list of bug fixes going into 2.4.1 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/2.4.1
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 2.4.1?
>> ===========================================
>>
>> The current list of open tickets targeted at 2.4.1 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.4.1
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>>
>> DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Felix Cheung <fe...@hotmail.com>.
Could you hold for a bit - I have one more fix to get in


________________________________
From: d_tsai@apple.com on behalf of DB Tsai <d_...@apple.com.invalid>
Sent: Wednesday, February 20, 2019 12:25 PM
To: Spark dev list
Cc: Cesar Delgado
Subject: Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Okay. Let's fail rc2, and I'll prepare rc3 with SPARK-26859.

DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc

> On Feb 20, 2019, at 12:11 PM, Marcelo Vanzin <va...@cloudera.com.INVALID> wrote:
>
> Just wanted to point out that
> https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
> and is marked as a correctness bug. (The fix is in the 2.4 branch,
> just not in rc2.)
>
> On Wed, Feb 20, 2019 at 12:07 PM DB Tsai <d_...@apple.com.invalid> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 2.4.1.
>>
>> The vote is open until Feb 24 PST and passes if a majority +1 PMC votes are cast, with
>> a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 2.4.1
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.4.1-rc2 (commit 229ad524cfd3f74dd7aa5fc9ba841ae223caa960):
>> https://github.com/apache/spark/tree/v2.4.1-rc2
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1299/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-docs/
>>
>> The list of bug fixes going into 2.4.1 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/2.4.1
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>>
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload, running it on this release candidate, and
>> then reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install
>> the current RC, and see if anything important breaks. In Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
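[Editor's note: a rough sketch of the two testing routes described above, not part of the original e-mail. Assumptions: the pyspark tarball name under the -bin/ directory, an sbt-based build, and default Ivy/Maven cache locations; the RC URLs only resolve while the vote is open.]

```shell
# PySpark route: isolated virtual env, then install the RC tarball.
# The tarball name is an assumption; check the -bin/ directory listing.
python3 -m venv rc-test
. rc-test/bin/activate
pip install https://dist.apache.org/repos/dist/dev/spark/v2.4.1-rc2-bin/pyspark-2.4.1.tar.gz

# Java/Scala route: point the build at the staging repository,
# e.g. for sbt, add to build.sbt:
#   resolvers += "spark-rc2" at
#     "https://repository.apache.org/content/repositories/orgapachespark-1299/"
# Then purge cached Spark artifacts before and after testing so you do
# not keep building against a stale RC:
rm -rf ~/.ivy2/cache/org.apache.spark ~/.m2/repository/org/apache/spark
```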
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 2.4.1?
>> ===========================================
>>
>> The current list of open tickets targeted at 2.4.1 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.4.1
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else should be retargeted to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>>
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That said, if there is a regression that has not been
>> correctly targeted, please ping me or a committer to help target the
>> issue.
>>
>>
>> DB Tsai | Siri Open Source Technologies [not a contribution] |  Apple, Inc
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org



Re: [VOTE] Release Apache Spark 2.4.1 (RC2)

Posted by Marcelo Vanzin <va...@cloudera.com.INVALID>.
Just wanted to point out that
https://issues.apache.org/jira/browse/SPARK-26859 is not in this RC,
and is marked as a correctness bug. (The fix is in the 2.4 branch,
just not in rc2.)

On Wed, Feb 20, 2019 at 12:07 PM DB Tsai <d_...@apple.com.invalid> wrote:
>
> [vote e-mail snipped; identical to the copy quoted in the message above]
>


-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org