Posted to dev@spark.apache.org by Xiao Li <ga...@gmail.com> on 2019/05/01 14:39:04 UTC

[VOTE] Release Apache Spark 2.4.3

Please vote on releasing the following candidate as Apache Spark version
2.4.3.

The vote is open until May 5th PST and passes if a majority of +1 PMC votes
are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 2.4.3
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v2.4.3-rc1 (commit
c3e32bf06c35ba2580d46150923abfa795b4446a):
https://github.com/apache/spark/tree/v2.4.3-rc1

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.3-rc1-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1324/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.3-rc1-docs/

The list of bug fixes going into 2.4.3 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12345410

This release uses the release script of the 2.4.3-rc1 branch, at the
following commit:
https://github.com/apache/spark/commit/e417168ed012190db66a21e626b2b8d2332d6c01

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install
the current RC, and see if anything important breaks. If you're working
in Java/Scala, you can add the staging repository to your project's
resolvers and test with the RC (make sure to clean up the artifact cache
before and after so you don't end up building with an out-of-date RC
going forward).
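
For the Java/Scala path, a minimal build.sbt sketch for resolving the RC from
the staging repository might look like the following (spark-sql is used here
purely as an illustrative module; adjust to whatever your project depends on):

// build.sbt -- sketch for testing the 2.4.3 RC against the staging repository
scalaVersion := "2.11.12"

// Staging repository listed above for this RC
resolvers += "Apache Spark 2.4.3 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1324/"

// Staged artifacts are published under the final version number, 2.4.3
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"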

===========================================
What should happen to JIRA tickets still targeting 2.4.3?
===========================================

The current list of open tickets targeted at 2.4.3 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for "Target
Version/s" = 2.4.3

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else please retarget to an
appropriate release.

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is something which is a regression
that has not been correctly targeted please ping me or a committer to
help target the issue.

Re: [VOTE] Release Apache Spark 2.4.3

Posted by Xiao Li <ga...@gmail.com>.
This vote passes! I'll follow up with a formal release announcement soon.

+1:
Michael Heuer (non-binding)
Gengliang Wang (non-binding)
Sean Owen (binding)
Felix Cheung (binding)
Wenchen Fan (binding)
Herman van Hovell (binding)
Xiao Li (binding)

Cheers,

Xiao


Re: [VOTE] Release Apache Spark 2.4.3

Posted by antonkulaga <an...@gmail.com>.
>Hadoop 3 has not been supported in 2.4.x. 2.12 has been since 2.4.0,

I see. I thought it was, since I had seen many posts about configuring Spark
for Hadoop 3, as well as Hadoop 3-based Spark Docker containers.





Re: [VOTE] Release Apache Spark 2.4.3

Posted by Sean Owen <sr...@gmail.com>.
Hadoop 3 has not been supported in 2.4.x. Scala 2.12 has been supported
since 2.4.0, and 2.12 artifacts have always been released where available.
What are you referring to?

On Fri, May 3, 2019 at 9:28 AM antonkulaga <an...@gmail.com> wrote:
>
> Can you provide a release version for Hadoop 3 and Scala 2.12 this time?
>



Re: [VOTE] Release Apache Spark 2.4.3

Posted by antonkulaga <an...@gmail.com>.
Can you provide a release version for Hadoop 3 and Scala 2.12 this time?





Re: [VOTE] Release Apache Spark 2.4.3

Posted by Gengliang Wang <lt...@gmail.com>.
+1 (non-binding)

> On May 1, 2019, at 10:16 AM, Michael Heuer <he...@gmail.com> wrote:
> 
> +1 (non-binding)


Re: [VOTE] Release Apache Spark 2.4.3

Posted by Michael Heuer <he...@gmail.com>.
+1 (non-binding)

The binary release files are correctly built with Scala 2.11.12.

Thank you,

   michael



Re: [VOTE] Release Apache Spark 2.4.3

Posted by Wenchen Fan <cl...@gmail.com>.
+1.

The Scala version problem has been resolved, which was the main motivation
for 2.4.3.


Re: [VOTE] Release Apache Spark 2.4.3

Posted by Felix Cheung <fe...@hotmail.com>.
I ran basic tests on R, r-hub etc. LGTM.

+1 (limited - I didn’t get to run other usual tests)



Re: [VOTE] Release Apache Spark 2.4.3

Posted by Sean Owen <sr...@apache.org>.
+1 from me. There is little change from 2.4.2 anyway, except for the
important change to the build script that should build pyspark with
Scala 2.11 jars. I verified that the package contains the _2.11 Spark
jars, but have a look!
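
If you want to double-check it yourself, one quick way (just a sketch) is to
start bin/spark-shell from the unpacked binary distribution and print the
Scala and Spark versions it was built with:

// Paste into bin/spark-shell of the unpacked binary distribution.
// versionNumberString reports the Scala version the shell was built with,
// which should be 2.11.x for this RC; SPARK_VERSION should print 2.4.3.
println(scala.util.Properties.versionNumberString)
println(org.apache.spark.SPARK_VERSION)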

I'm still getting this weird error from the Kafka module when testing,
but it's a long-standing known issue:

[error] /home/ubuntu/spark-2.4.3/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumerSuite.scala:85:
Symbol 'term org.eclipse' is missing from the classpath.
[error] This symbol is required by 'method
org.apache.spark.metrics.MetricsSystem.getServletHandlers'.
[error] Make sure that term eclipse is in your classpath and check for
conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'MetricsSystem.class' was compiled
against an incompatible version of org.
[error]     testUtils.sendMessages(topic, data.toArray)

Killing zinc and rebuilding didn't help.
But this isn't happening in Jenkins for example, so it should be env-specific.

On Wed, May 1, 2019 at 9:39 AM Xiao Li <ga...@gmail.com> wrote:
>
> Please vote on releasing the following candidate as Apache Spark version 2.4.3.
>
> The vote is open until May 5th PST and passes if a majority +1 PMC votes are cast, with
> a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 2.4.3
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v2.4.3-rc1 (commit c3e32bf06c35ba2580d46150923abfa795b4446a):
> https://github.com/apache/spark/tree/v2.4.3-rc1
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.4.3-rc1-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1324/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.4.3-rc1-docs/
>
> The list of bug fixes going into 2.4.2 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12345410
>
> The release is using the release script of the branch 2.4.3-rc1 with the following commit https://github.com/apache/spark/commit/e417168ed012190db66a21e626b2b8d2332d6c01
>
> FAQ
>
> =========================
> How can I help test this release?
> =========================
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks, in the Java/Scala
> you can add the staging repository to your projects resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with a out of date RC going forward).
>
> ===========================================
> What should happen to JIRA tickets still targeting 2.4.3?
> ===========================================
>
> The current list of open tickets targeted at 2.4.3 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.4.3
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==================
> But my bug isn't fixed?
> ==================
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
