Posted to dev@spark.apache.org by Chao Sun <su...@apache.org> on 2022/11/15 00:11:35 UTC

[VOTE] Release Spark 3.2.3 (RC1)

Please vote on releasing the following candidate as Apache Spark version 3.2.3.

The vote is open until 11:59pm Pacific time Nov 17th and passes if a
majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.2.3
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.2.3-rc1 (commit
b53c341e0fefbb33d115ab630369a18765b7763d):
https://github.com/apache/spark/tree/v3.2.3-rc1

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS
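As a minimal sketch of the "verify the checksums" part of release testing (Apache publishes SHA-512 digest files alongside the release artifacts; the file and artifact names below are hypothetical stand-ins, not taken from this thread), one way to check a downloaded artifact:

```python
# Sketch: recompute the SHA-512 of a downloaded release artifact and
# compare it against the digest published next to it (the .sha512 file).
import hashlib


def sha512_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-512 hex digest of a file, reading in chunks."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def checksum_matches(artifact_path: str, published_digest: str) -> bool:
    # Published .sha512 files may be formatted as "<digest>  <filename>";
    # keep only the digest field before comparing.
    expected = published_digest.split()[0].lower()
    return sha512_of(artifact_path) == expected
```

Signature verification (the `.asc` files against the KEYS file) is done separately with GPG and is not covered by this sketch.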

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1431/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/

The list of bug fixes going into 3.2.3 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12352105

This release uses the release script from the tag v3.2.3-rc1.


FAQ

=========================
How can I help test this release?
=========================
If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install
the current RC, and see if anything important breaks. In Java/Scala,
you can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
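For the Java/Scala path, adding the staging repository to a build might look like the following (a minimal build.sbt sketch, not from the original message; it assumes a Scala 2.12 project and uses the staging repository URL from this thread):

```
// build.sbt -- resolve the 3.2.3 RC artifacts from the Apache staging repository
resolvers += "Apache Spark 3.2.3 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1431/"

// %% appends the Scala binary version (e.g. _2.12) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.3"
```

After testing, remove the resolver and clear the cached 3.2.3 artifacts (e.g. under ~/.ivy2 or ~/.m2) so later builds don't pick up the RC.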

===========================================
What should happen to JIRA tickets still targeting 3.2.3?
===========================================
The current list of open tickets targeted at 3.2.3 can be found by
searching https://issues.apache.org/jira/projects/SPARK for "Target
Version/s" = 3.2.3.

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else, please retarget to an
appropriate release.

==================
But my bug isn't fixed?
==================
In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That said, if there is a regression that has not been
correctly targeted, please ping me or a committer to help target the
issue.

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Chao Sun <su...@apache.org>.
+1 (non-binding) myself. Thanks everyone for voting!

On Wed, Nov 16, 2022 at 9:22 PM 416161436@qq.com <ru...@foxmail.com>
wrote:

> +1
>
> ------------------------------
> Ruifeng Zheng
> ruifengz@foxmail.com
>
>
>
>
> ------------------ Original ------------------
> *From:* "Wenchen Fan" <cl...@gmail.com>;
> *Date:* Thu, Nov 17, 2022 10:26 AM
> *To:* "Yang,Jie(INF)"<ya...@baidu.com>;
> *Cc:* "Chris Nauroth"<cn...@apache.org>;"Yuming Wang"<wg...@gmail.com>;"Dongjoon
> Hyun"<do...@gmail.com>;"huaxin gao"<hu...@gmail.com>;"L.
> C. Hsieh"<vi...@gmail.com>;"Chao Sun"<su...@apache.org>;"dev"<
> dev@spark.apache.org>;
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
> +1
>
> On Thu, Nov 17, 2022 at 10:20 AM Yang,Jie(INF) <ya...@baidu.com>
> wrote:
>
>> +1, non-binding
>>
>>
>>
>> The test combinations of Java 11 + Scala 2.12 and Java 11 + Scala 2.13 have
>> both passed.
>>
>>
>>
>> Yang Jie
>>
>>
>>
>> *From**: *Chris Nauroth <cn...@apache.org>
>> *Date**: *Thursday, November 17, 2022 04:27
>> *To**: *Yuming Wang <wg...@gmail.com>
>> *Cc**: *"Yang,Jie(INF)" <ya...@baidu.com>, Dongjoon Hyun <
>> dongjoon.hyun@gmail.com>, huaxin gao <hu...@gmail.com>, "L. C.
>> Hsieh" <vi...@gmail.com>, Chao Sun <su...@apache.org>, dev <
>> dev@spark.apache.org>
>> *Subject**: *Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> +1 (non-binding)
>>
>> * Verified all checksums.
>> * Verified all signatures.
>> * Built from source, with multiple profiles, to full success, for Java 11
>> and Scala 2.12:
>>     * build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3
>> -Phive-thriftserver -Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests
>> clean package
>> * Tests passed.
>> * Ran several examples successfully:
>>     * bin/spark-submit --class org.apache.spark.examples.SparkPi
>> examples/jars/spark-examples_2.12-3.2.3.jar
>>     * bin/spark-submit --class
>> org.apache.spark.examples.sql.hive.SparkHiveExample
>> examples/jars/spark-examples_2.12-3.2.3.jar
>>     * bin/spark-submit
>> examples/src/main/python/streaming/network_wordcount.py localhost 9999
>>
>>
>>
>> Chao, thank you for preparing the release.
>>
>>
>>
>> Chris Nauroth
>>
>>
>>
>>
>>
>> On Wed, Nov 16, 2022 at 5:22 AM Yuming Wang <wg...@gmail.com> wrote:
>>
>> +1
>>
>>
>>
>> On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) <ya...@baidu.com>
>> wrote:
>>
I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress
>> and has not hung.
>>
>>
>>
>> Yang Jie
>>
>>
>>
>> *From**: *Dongjoon Hyun <do...@gmail.com>
>> *Date**: *Wednesday, November 16, 2022 01:17
>> *To**: *"Yang,Jie(INF)" <ya...@baidu.com>
>> *Cc**: *huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <
>> viirya@gmail.com>, Chao Sun <su...@apache.org>, dev <
>> dev@spark.apache.org>
>> *Subject**: *Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> Did you hit that in Scala 2.12, too?
>>
>>
>>
>> Dongjoon.
>>
>>
>>
>> On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com>
>> wrote:
>>
>> Hi, all
>>
>>
>>
>> I tested v3.2.3 with the following command:
>>
>>
>>
>> ```
>>
>> dev/change-scala-version.sh 2.13
>>
>> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn
>> -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
>> -Pscala-2.13 -fn
>>
>> ```
>>
>>
>>
>> The testing environment is:
>>
>>
>>
>> OS: CentOS 6u3 Final
>>
>> Java: zulu 11.0.17
>>
>> Python: 3.9.7
>>
>> Scala: 2.13
>>
>>
>>
>> I have run the above test command twice, and both runs hang with the
>> following stack:
>>
>>
>>
>> ```
>>
>> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
>> elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition
>> [0x00007f2de3929000]
>>
>>    java.lang.Thread.State: WAITING (parking)
>>
>>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>>
>>        - parking to wait for  <0x0000000790d00050> (a
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>
>>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
>> /LockSupport.java:194)
>>
>>        at
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17
>> /AbstractQueuedSynchronizer.java:2081)
>>
>>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17
>> /LinkedBlockingQueue.java:433)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>>
>>        - locked <0x0000000790d00208> (a java.lang.Object)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>>
>>        at
>> org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>>
>>        - locked <0x0000000790d00218> (a
>> org.apache.spark.sql.execution.QueryExecution)
>>
>>        at
>> org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>>
>>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>>
>>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>>
>>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>>
>>        at
>> org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>>
>>        at
>> org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown
>> Source)
>>
>>        at
>> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>>
>>        at
>> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>>
>>        at
>> org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>>
>>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>>
>>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>>
>>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>>
>>        at
>> org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>>
>>        at
>> org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown
>> Source)
>>
>>        at
>> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>>
>>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>>
>>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>>
>>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>>
>>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>>
>>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>>
>>        at
>> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown
>> Source)
>>
>>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>>
>>        at
>> org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>>
>>        at
>> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>>
>>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown
>> Source)
>>
>>        at
>> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>>
>>        at
>> org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown
>> Source)
>>
>>        at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>>
>>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>>
>>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>>
>>        at org.scalatest.Suite.run(Suite.scala:1112)
>>
>>        at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown
>> Source)
>>
>>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>>
>>        at
>> org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>>
>>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>>
>>        at
>> org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>>
>>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>>
>>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>>
>>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>>
>>        at
>> org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>>
>>        at
>> scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>>
>>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>>
>>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>>
>>        at
>> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>>
>>        at org.scalatest.Suite.run(Suite.scala:1109)
>>
>>        at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>>
>>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>>
>>        at
>> org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown
>> Source)
>>
>>        at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>        at
>> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>>
>>        at
>> org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown
>> Source)
>>
>>        at
>> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>>
>>        at
>> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>>
>>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>>
>>        at org.scalatest.tools.Runner.main(Runner.scala)
>>
>> ```
>>
>> I think the test case being executed is `SPARK-28323: PythonUDF should be
>> able to use in join condition`. Does anyone have the same problem?
>>
>>
>>
>> Yang Jie
>>
>>
>>
>>
>>
>> *From**: *huaxin gao <hu...@gmail.com>
>> *Date**: *Tuesday, November 15, 2022 13:59
>> *To**: *"L. C. Hsieh" <vi...@gmail.com>
>> *Cc**: *Dongjoon Hyun <do...@gmail.com>, Chao Sun <
>> sunchao@apache.org>, dev <de...@spark.apache.org>
>> *Subject**: *Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> +1
>>
>>
>>
>> Thanks Chao!
>>
>>
>>
>> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
>>
>> +1
>>
>> Thanks Chao.
>>
>> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com>
>> wrote:
>> >
>> > +1
>> >
>> > Thank you, Chao.
>> >
>> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
>> >> [...]
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>

 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown Source)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at  org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite.run(Suite.scala:1109)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.Suite.run$(Suite.scala:1094)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown Source)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at scala.collection.immutable.List.foreach(List.scala:333)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown Source)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner$.main(Runner.scala:775)
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at org.scalatest.tools.Runner.main(Runner.scala)
 
```
 
I think the test case being executed is `SPARK-28323: PythonUDF should be able to use in join condition`. Does anyone have the same problem?
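For anyone else triaging a hang like this, a quick way to find the stuck suite in a jstack dump is to list the threads parked in the WAITING state. Below is a minimal sketch; it assumes the standard HotSpot jstack text format (thread name in quotes on the header line, followed by a `java.lang.Thread.State:` line), as in the dump above. The function name is my own, not part of any tooling:

```python
import re


def waiting_threads(dump_text):
    """Return names of threads reported as WAITING in a jstack text dump."""
    names = []
    current = None
    for line in dump_text.splitlines():
        # Thread header lines start with the thread name in double quotes.
        header = re.match(r'"([^"]+)"', line)
        if header:
            current = header.group(1)
        elif "java.lang.Thread.State: WAITING" in line and current is not None:
            names.append(current)
            current = None
    return names
```

Run over a dump like the one above, this should single out `ScalaTest-main-running-JoinSuite` as the parked thread.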
 
Yang Jie
  
From: huaxin gao <huaxin.gao11@gmail.com>
Date: Tuesday, November 15, 2022, 13:59
To: "L. C. Hsieh" <viirya@gmail.com>
Cc: Dongjoon Hyun <dongjoon.hyun@gmail.com>, Chao Sun <sunchao@apache.org>, dev <dev@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
 
  
+1

Thanks Chao!
   
On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <viirya@gmail.com> wrote:

+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <dongjoon.hyun@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <sunchao@apache.org> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install
>> the current RC, and see if anything important breaks. In Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===========================================
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That said, if there is a regression that has not been
>> correctly targeted, please ping me or a committer to help target the issue.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Wenchen Fan <cl...@gmail.com>.
+1

On Thu, Nov 17, 2022 at 10:20 AM Yang,Jie(INF) <ya...@baidu.com> wrote:

> +1, non-binding
>
>
>
> Both test combinations, Java 11 + Scala 2.12 and Java 11 + Scala 2.13, have
> passed.
>
>
>
> Yang Jie
>
>
>
> From: Chris Nauroth <cn...@apache.org>
> Date: Thursday, November 17, 2022, 04:27
> To: Yuming Wang <wg...@gmail.com>
> Cc: "Yang,Jie(INF)" <ya...@baidu.com>, Dongjoon Hyun <dongjoon.hyun@gmail.com>, huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <vi...@gmail.com>, Chao Sun <su...@apache.org>, dev <dev@spark.apache.org>
> Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> +1 (non-binding)
>
> * Verified all checksums.
> * Verified all signatures.
> * Built from source, with multiple profiles, to full success, for Java 11
> and Scala 2.12:
>     * build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3 -Phive-thriftserver
> -Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests clean package
> * Tests passed.
> * Ran several examples successfully:
>     * bin/spark-submit --class org.apache.spark.examples.SparkPi
> examples/jars/spark-examples_2.12-3.2.3.jar
>     * bin/spark-submit --class
> org.apache.spark.examples.sql.hive.SparkHiveExample
> examples/jars/spark-examples_2.12-3.2.3.jar
>     * bin/spark-submit
> examples/src/main/python/streaming/network_wordcount.py localhost 9999
>
>
>
> Chao, thank you for preparing the release.
>
>
>
> Chris Nauroth
>
>
>
>
>
> On Wed, Nov 16, 2022 at 5:22 AM Yuming Wang <wg...@gmail.com> wrote:
>
> +1
>
>
>
> On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) <ya...@baidu.com> wrote:
>
> I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress
> and has not hung.
>
>
>
> Yang Jie
>
>
>
> From: Dongjoon Hyun <do...@gmail.com>
> Date: Wednesday, November 16, 2022, 01:17
> To: "Yang,Jie(INF)" <ya...@baidu.com>
> Cc: huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <viirya@gmail.com>, Chao Sun <su...@apache.org>, dev <dev@spark.apache.org>
> Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> Did you hit that in Scala 2.12, too?
>
>
>
> Dongjoon.
>
>
>
> On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com> wrote:
>
> Hi, all
>
>
>
> I tested v3.2.3 with the following command:
>
>
>
> ```
>
> dev/change-scala-version.sh 2.13
>
> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn
> -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
> -Pscala-2.13 -fn
>
> ```
>
>
>
> The testing environment is:
>
>
>
> OS: CentOS 6u3 Final
>
> Java: zulu 11.0.17
>
> Python: 3.9.7
>
> Scala: 2.13
>
>
>
> The above test command has been executed twice, and both times it hung in
> the following stack:
>
>
>
> ```
>
> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
> elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition
> [0x00007f2de3929000]
>
>    java.lang.Thread.State: WAITING (parking)
>
>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>
>        - parking to wait for  <0x0000000790d00050> (a
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>
>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
> /LockSupport.java:194)
>
>        at
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17
> /AbstractQueuedSynchronizer.java:2081)
>
>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17
> /LinkedBlockingQueue.java:433)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown
> Source)
>
>        at
> org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>
>        - locked <0x0000000790d00208> (a java.lang.Object)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>
>        at
> org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown
> Source)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>
>        at
> org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown
> Source)
>
>        at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>
>        at
> org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>
>        - locked <0x0000000790d00218> (a
> org.apache.spark.sql.execution.QueryExecution)
>
>        at
> org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>
>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>
>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>
>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>
>        at
> org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>
>        at
> org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown
> Source)
>
>        at
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>
>        at
> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>
>        at
> org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>
>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>
>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>
>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>
>        at
> org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>
>        at
> org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown
> Source)
>
>        at
> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>
>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>
>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>
>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>
>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>
>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>
>        at
> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown
> Source)
>
>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>
>        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>
>        at
> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>
>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown
> Source)
>
>        at
> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>
>        at
> org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown
> Source)
>
>        at scala.collection.immutable.List.foreach(List.scala:333)
>
>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>
>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>
>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>
>        at
> org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>
>        at org.scalatest.Suite.run(Suite.scala:1112)
>
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>
>        at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown
> Source)
>
>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>
>        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>
>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>
>        at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>
>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>
>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>
>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>
>        at
> org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>
>        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>
>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>
>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>
>        at
> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>
>        at org.scalatest.Suite.run(Suite.scala:1109)
>
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>
>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>
>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>
>        at
> org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown
> Source)
>
>        at scala.collection.immutable.List.foreach(List.scala:333)
>
>        at
> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>
>        at
> org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown
> Source)
>
>        at
> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>
>        at
> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>
>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>
>        at org.scalatest.tools.Runner.main(Runner.scala)
>
> ```
>
> I think the test case being executed is `SPARK-28323: PythonUDF should be
> able to use in join condition`. Does anyone have the same problem?
>
>
>
> Yang Jie
>
>
>
>
>
> From: huaxin gao <hu...@gmail.com>
> Date: Tuesday, November 15, 2022, 13:59
> To: "L. C. Hsieh" <vi...@gmail.com>
> Cc: Dongjoon Hyun <do...@gmail.com>, Chao Sun <sunchao@apache.org>, dev <de...@spark.apache.org>
> Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> +1
>
>
>
> Thanks Chao!
>
>
>
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
>
> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com>
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running it on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark, you can set up a virtual env, install
> >> the current RC, and see if anything important breaks. In Java/Scala,
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===========================================
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK
> and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by "Yang,Jie(INF)" <ya...@baidu.com>.
+1, non-binding

Both test combinations, Java 11 + Scala 2.12 and Java 11 + Scala 2.13, have passed.

Yang Jie


From: Chris Nauroth <cn...@apache.org>
Date: Thursday, November 17, 2022, 04:27
To: Yuming Wang <wg...@gmail.com>
Cc: "Yang,Jie(INF)" <ya...@baidu.com>, Dongjoon Hyun <do...@gmail.com>, huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <vi...@gmail.com>, Chao Sun <su...@apache.org>, dev <de...@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

+1 (non-binding)

* Verified all checksums.
* Verified all signatures.
* Built from source, with multiple profiles, to full success, for Java 11 and Scala 2.12:
    * build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3 -Phive-thriftserver -Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests clean package
* Tests passed.
* Ran several examples successfully:
    * bin/spark-submit --class org.apache.spark.examples.SparkPi examples/jars/spark-examples_2.12-3.2.3.jar
    * bin/spark-submit --class org.apache.spark.examples.sql.hive.SparkHiveExample examples/jars/spark-examples_2.12-3.2.3.jar
    * bin/spark-submit examples/src/main/python/streaming/network_wordcount.py localhost 9999
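The checksum step in the verification list above can also be scripted. Here is a minimal sketch for comparing a downloaded artifact against its published SHA-512 digest; the helper names are my own, and the file path you pass in would be whichever release tarball you downloaded from the dist area:

```python
import hashlib


def sha512_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA-512 digest."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_published(path, published_hex):
    """True if the local file's digest equals the published .sha512 value."""
    # Published digests may carry trailing whitespace or uppercase hex.
    return sha512_of(path) == published_hex.strip().lower()
```

Note this only checks integrity; authenticity still requires verifying the `.asc` signature against the KEYS file with gpg.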

Chao, thank you for preparing the release.

Chris Nauroth


On Wed, Nov 16, 2022 at 5:22 AM Yuming Wang <wg...@gmail.com> wrote:
+1

On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) <ya...@baidu.com> wrote:
I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress and has not hung.

Yang Jie

From: Dongjoon Hyun <do...@gmail.com>
Date: Wednesday, November 16, 2022, 01:17
To: "Yang,Jie(INF)" <ya...@baidu.com>
Cc: huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <vi...@gmail.com>, Chao Sun <su...@apache.org>, dev <de...@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

Did you hit that in Scala 2.12, too?

Dongjoon.

On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com> wrote:
Hi, all

I tested v3.2.3 with the following command:

```

dev/change-scala-version.sh 2.13
build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pscala-2.13 -fn
```

The testing environment is:

OS: CentOS 6u3 Final
Java: zulu 11.0.17
Python: 3.9.7
Scala: 2.13

The above test command has been executed twice, and both times it hung in the following stack:

```
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition  [0x00007f2de3929000]
   java.lang.Thread.State: WAITING (parking)
       at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
       - parking to wait for  <0x0000000790d00050> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
       at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17/LockSupport.java:194)
       at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17/AbstractQueuedSynchronizer.java:2081)
       at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17/LinkedBlockingQueue.java:433)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown Source)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
       - locked <0x0000000790d00208> (a java.lang.Object)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
       at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
       at org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown Source)
       at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
       at org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown Source)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
       - locked <0x0000000790d00218> (a org.apache.spark.sql.execution.QueryExecution)
       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
       at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
       - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
       at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
       at org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
       at org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown Source)
       at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
       at org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
       at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
       at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
       at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
       at org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
       at org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown Source)
       at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
       at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
       at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
       at org.scalatest.Transformer.apply(Transformer.scala:22)
       at org.scalatest.Transformer.apply(Transformer.scala:20)
       at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
       at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
       at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown Source)
       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
       at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
       at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
       at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown Source)
       at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
       at org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown Source)
       at scala.collection.immutable.List.foreach(List.scala:333)
       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
       at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
       at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
       at org.scalatest.Suite.run(Suite.scala:1112)
       at org.scalatest.Suite.run$(Suite.scala:1094)
        at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown Source)
       at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
       at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
       at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
       at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
       at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
       at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
       at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
       at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
       at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
       at org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
       at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
       at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
       at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
       at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
       at org.scalatest.Suite.run(Suite.scala:1109)
       at org.scalatest.Suite.run$(Suite.scala:1094)
       at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
       at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
       at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
       at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
       at org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown Source)
       at scala.collection.immutable.List.foreach(List.scala:333)
       at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
       at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
       at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
       at org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown Source)
       at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
       at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
       at org.scalatest.tools.Runner$.main(Runner.scala:775)
       at org.scalatest.tools.Runner.main(Runner.scala)
```
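For anyone trying to reproduce this, a dump like the one above can be captured from the hung test JVM with the JDK's `jps` and `jstack` tools (a sketch; the `scalatest` grep pattern is an assumption about how the runner shows up locally):

```shell
# Sketch: capture a thread dump from a hung ScalaTest JVM (assumes a JDK on PATH).
# `jps -l` lists local JVM pids with their main classes; adjust the grep pattern
# to match however the suite runner appears on your machine.
PID=$(jps -l 2>/dev/null | grep -i scalatest | awk '{print $1}' | head -n 1 || true)
if [ -n "$PID" ]; then
  jstack "$PID" > joinsuite-hang.txt    # produces a dump like the one above
  STATUS="dump written for pid $PID"
else
  STATUS="no ScalaTest JVM found"
fi
echo "$STATUS"
```

Taking two or three dumps a few seconds apart helps confirm the thread is genuinely stuck rather than just slow.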
I think the test case being executed is `SPARK-28323: PythonUDF should be able to use in join condition`. Does anyone else have the same problem?

Yang Jie


From: huaxin gao <hu...@gmail.com>
Date: Tuesday, November 15, 2022, 13:59
To: "L. C. Hsieh" <vi...@gmail.com>
Cc: Dongjoon Hyun <do...@gmail.com>, Chao Sun <su...@apache.org>, dev <de...@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

+1

Thanks Chao!

On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
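The PySpark part of that advice can be sketched as follows (a minimal sketch; the tarball path is hypothetical, and the real file would come from the v3.2.3-rc1-bin directory linked above):

```shell
# Sketch: throwaway virtual env for smoke-testing an RC's pyspark tarball.
# --without-pip keeps this demo self-contained; a real test would keep pip
# enabled and install the RC tarball into the env.
VENV_DIR="$(mktemp -d)/spark-rc-venv"
python3 -m venv --without-pip "$VENV_DIR"
# "$VENV_DIR/bin/pip" install /path/to/pyspark-3.2.3.tar.gz   # path is hypothetical
# "$VENV_DIR/bin/python" -c "import pyspark; print(pyspark.__version__)"
[ -x "$VENV_DIR/bin/python" ] && echo "venv ready"
```

Deleting the directory afterwards also clears any RC artifacts cached inside the env.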
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===========================================
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Chris Nauroth <cn...@apache.org>.
+1 (non-binding)

* Verified all checksums.
* Verified all signatures.
* Built from source, with multiple profiles, to full success, for Java 11
and Scala 2.12:
    * build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3 -Phive-thriftserver
-Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests clean package
* Tests passed.
* Ran several examples successfully:
    * bin/spark-submit --class org.apache.spark.examples.SparkPi
examples/jars/spark-examples_2.12-3.2.3.jar
    * bin/spark-submit --class
org.apache.spark.examples.sql.hive.SparkHiveExample
examples/jars/spark-examples_2.12-3.2.3.jar
    * bin/spark-submit
examples/src/main/python/streaming/network_wordcount.py localhost 9999
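The checksum step in a verification like the one above can be sketched as follows (a local stand-in; in a real check the artifact and its .sha512 file are both downloaded from the v3.2.3-rc1-bin/ directory, and the demo file name here is an assumption):

```shell
# Sketch: verify an artifact against its .sha512 file. Signatures are checked
# separately, e.g. `gpg --import KEYS` then `gpg --verify <artifact>.asc <artifact>`.
DIR=$(mktemp -d)
cd "$DIR"
printf 'demo payload' > spark-3.2.3-bin-demo.tgz                       # stand-in artifact
sha512sum spark-3.2.3-bin-demo.tgz > spark-3.2.3-bin-demo.tgz.sha512   # normally downloaded, not generated
sha512sum -c spark-3.2.3-bin-demo.tgz.sha512 && echo "checksum OK"
```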

Chao, thank you for preparing the release.

Chris Nauroth


On Wed, Nov 16, 2022 at 5:22 AM Yuming Wang <wg...@gmail.com> wrote:

> +1
>
> On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) <ya...@baidu.com> wrote:
>
>> I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress
>> and has not hung.
>>
>>
>>
>> Yang Jie
>>
>>
>>
>> *From:* Dongjoon Hyun <do...@gmail.com>
>> *Date:* Wednesday, November 16, 2022, 01:17
>> *To:* "Yang,Jie(INF)" <ya...@baidu.com>
>> *Cc:* huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <
>> viirya@gmail.com>, Chao Sun <su...@apache.org>, dev <
>> dev@spark.apache.org>
>> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> Did you hit that in Scala 2.12, too?
>>
>>
>>
>> Dongjoon.
>>
>>
>>
>> On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com>
>> wrote:
>>
>> Hi, all
>>
>>
>>
>> I tested v3.2.3 with the following command:
>>
>>
>>
>> ```
>>
>> dev/change-scala-version.sh 2.13
>>
>> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn
>> -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
>> -Pscala-2.13 -fn
>>
>> ```
>>
>>
>>
>> The testing environment is:
>>
>>
>>
>> OS: CentOS 6u3 Final
>>
>> Java: zulu 11.0.17
>>
>> Python: 3.9.7
>>
>> Scala: 2.13
>>
>>
>>
>> The above test command has been executed twice, and it hung both times with the
>> following stack:
>>
>>
>>
>> ```
>>
>> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
>> elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition
>> [0x00007f2de3929000]
>>
>>    java.lang.Thread.State: WAITING (parking)
>>
>>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>>
>>        - parking to wait for  <0x0000000790d00050> (a
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>
>>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
>> /LockSupport.java:194)
>>
>>        at
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17
>> /AbstractQueuedSynchronizer.java:2081)
>>
>>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17
>> /LinkedBlockingQueue.java:433)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>>
>>        - locked <0x0000000790d00208> (a java.lang.Object)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>>
>>        at
>> org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>>
>>        - locked <0x0000000790d00218> (a
>> org.apache.spark.sql.execution.QueryExecution)
>>
>>        at
>> org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>>
>>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>>
>>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>>
>>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>>
>>        at
>> org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>>
>>        at
>> org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown
>> Source)
>>
>>        at
>> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>>
>>        at
>> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>>
>>        at
>> org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>>
>>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>>
>>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>>
>>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>>
>>        at
>> org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>>
>>        at
>> org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown
>> Source)
>>
>>        at
>> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>>
>>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>>
>>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>>
>>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>>
>>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>>
>>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>>
>>        at
>> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown
>> Source)
>>
>>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>>
>>        at org.apache.spark.SparkFunSuite.org
>> $scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>>
>>        at
>> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>>
>>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown
>> Source)
>>
>>        at
>> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>>
>>        at
>> org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown
>> Source)
>>
>>        at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>>
>>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>>
>>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>>
>>        at org.scalatest.Suite.run(Suite.scala:1112)
>>
>>        at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>        at org.scalatest.funsuite.AnyFunSuite.org
>> $scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown
>> Source)
>>
>>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>>
>>        at org.apache.spark.SparkFunSuite.org
>> $scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>>
>>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>>
>>        at
>> org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>>
>>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>>
>>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>>
>>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>>
>>        at
>> org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>>
>>        at
>> scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>>
>>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>>
>>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>>
>>        at
>> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>>
>>        at org.scalatest.Suite.run(Suite.scala:1109)
>>
>>        at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>>
>>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>>
>>        at
>> org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown
>> Source)
>>
>>        at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>        at
>> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>>
>>        at
>> org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown
>> Source)
>>
>>        at
>> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>>
>>        at
>> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>>
>>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>>
>>        at org.scalatest.tools.Runner.main(Runner.scala)
>>
>> ```
>>
>> I think the test case being executed is `SPARK-28323: PythonUDF should be
>> able to use in join condition`. Does anyone else have the same problem?
>>
>>
>>
>> Yang Jie
>>
>>
>>
>>
>>
>> *From:* huaxin gao <hu...@gmail.com>
>> *Date:* Tuesday, November 15, 2022, 13:59
>> *To:* "L. C. Hsieh" <vi...@gmail.com>
>> *Cc:* Dongjoon Hyun <do...@gmail.com>, Chao Sun <
>> sunchao@apache.org>, dev <de...@spark.apache.org>
>> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> +1
>>
>>
>>
>> Thanks Chao!
>>
>>
>>
>> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
>>
>> +1
>>
>> Thanks Chao.
>>
>> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com>
>> wrote:
>> >
>> > +1
>> >
>> > Thank you, Chao.
>> >
>> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
>> >>
>> >> Please vote on releasing the following candidate as Apache Spark
>> version 3.2.3.
>> >>
>> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> >>
>> >> [ ] +1 Release this package as Apache Spark 3.2.3
>> >> [ ] -1 Do not release this package because ...
>> >>
>> >> To learn more about Apache Spark, please see http://spark.apache.org/
>> >>
>> >> The tag to be voted on is v3.2.3-rc1 (commit
>> >> b53c341e0fefbb33d115ab630369a18765b7763d):
>> >> https://github.com/apache/spark/tree/v3.2.3-rc1
>> >>
>> >> The release files, including signatures, digests, etc. can be found at:
>> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>> >>
>> >> Signatures used for Spark RCs can be found in this file:
>> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >>
>> >> The staging repository for this release can be found at:
>> >>
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>> >>
>> >> The documentation corresponding to this release can be found at:
>> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>> >>
>> >> The list of bug fixes going into 3.2.3 can be found at the following
>> URL:
>> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>> >>
>> >> This release is using the release script of the tag v3.2.3-rc1.
>> >>
>> >>
>> >> FAQ
>> >>
>> >> =========================
>> >> How can I help test this release?
>> >> =========================
>> >> If you are a Spark user, you can help us test this release by taking
>> >> an existing Spark workload and running on this release candidate, then
>> >> reporting any regressions.
>> >>
>> >> If you're working in PySpark you can set up a virtual env and install
>> >> the current RC and see if anything important breaks; in Java/Scala
>> >> you can add the staging repository to your project's resolvers and test
>> >> with the RC (make sure to clean up the artifact cache before/after so
>> >> you don't end up building with an out-of-date RC going forward).
>> >>
>> >> ===========================================
>> >> What should happen to JIRA tickets still targeting 3.2.3?
>> >> ===========================================
>> >> The current list of open tickets targeted at 3.2.3 can be found at:
>> >> https://issues.apache.org/jira/projects/SPARK
>> and search for "Target
>> >> Version/s" = 3.2.3
>> >>
>> >> Committers should look at those and triage. Extremely important bug
>> >> fixes, documentation, and API tweaks that impact compatibility should
>> >> be worked on immediately. Everything else please retarget to an
>> >> appropriate release.
>> >>
>> >> ==================
>> >> But my bug isn't fixed?
>> >> ==================
>> >> In order to make timely releases, we will typically not hold the
>> >> release unless the bug in question is a regression from the previous
>> >> release. That being said, if there is something which is a regression
>> >> that has not been correctly targeted please ping me or a committer to
>> >> help target the issue.
>> >>
>> >> ---------------------------------------------------------------------
>> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>> >>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Yuming Wang <wg...@gmail.com>.
+1

On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) <ya...@baidu.com> wrote:

> I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress
> and has not hung.
>
>
>
> Yang Jie
>
>
>
> *From:* Dongjoon Hyun <do...@gmail.com>
> *Date:* Wednesday, November 16, 2022, 01:17
> *To:* "Yang,Jie(INF)" <ya...@baidu.com>
> *Cc:* huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <
> viirya@gmail.com>, Chao Sun <su...@apache.org>, dev <
> dev@spark.apache.org>
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> Did you hit that in Scala 2.12, too?
>
>
>
> Dongjoon.
>
>
>
> On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com> wrote:
>
> Hi, all
>
>
>
> I tested v3.2.3 with the following command:
>
>
>
> ```
>
> dev/change-scala-version.sh 2.13
>
> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn
> -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
> -Pscala-2.13 -fn
>
> ```
>
>
>
> The testing environment is:
>
>
>
> OS: CentOS 6u3 Final
>
> Java: zulu 11.0.17
>
> Python: 3.9.7
>
> Scala: 2.13
>
>
>
> The above test command has been executed twice, and it hung both times with the
> following stack:
>
>
>
> ```
>
> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
> elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition
> [0x00007f2de3929000]
>
>    java.lang.Thread.State: WAITING (parking)
>
>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>
>        - parking to wait for  <0x0000000790d00050> (a
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>
>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
> /LockSupport.java:194)
>
>        at
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17
> /AbstractQueuedSynchronizer.java:2081)
>
>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17
> /LinkedBlockingQueue.java:433)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown
> Source)
>
>        at
> org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>
>        - locked <0x0000000790d00208> (a java.lang.Object)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>
>        at
> org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown
> Source)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>
>        at
> org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown
> Source)
>
>        at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>
>        at
> org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>
>        - locked <0x0000000790d00218> (a
> org.apache.spark.sql.execution.QueryExecution)
>
>        at
> org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>
>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>
>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>
>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>
>        at
> org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>
>        at
> org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown
> Source)
>
>        at
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>
>        at
> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>
>        at
> org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>
>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>
>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>
>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>
>        at
> org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>
>        at
> org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown
> Source)
>
>        at
> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>
>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>
>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>
>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>
>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>
>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>
>        at
> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown
> Source)
>
>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>
>        at org.apache.spark.SparkFunSuite.org
> $scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>
>        at
> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>
>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown
> Source)
>
>        at
> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>
>        at
> org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown
> Source)
>
>        at scala.collection.immutable.List.foreach(List.scala:333)
>
>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>
>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>
>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>
>        at
> org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>
>        at org.scalatest.Suite.run(Suite.scala:1112)
>
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>
>        at org.scalatest.funsuite.AnyFunSuite.org
> $scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown
> Source)
>
>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>
>        at org.apache.spark.SparkFunSuite.org
> $scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>
>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>
>        at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>
>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>
>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>
>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>
>        at
> org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>
>        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>
>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>
>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>
>        at
> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>
>        at org.scalatest.Suite.run(Suite.scala:1109)
>
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>
>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>
>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>
>        at
> org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown
> Source)
>
>        at scala.collection.immutable.List.foreach(List.scala:333)
>
>        at
> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>
>        at
> org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown
> Source)
>
>        at
> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>
>        at
> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>
>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>
>        at org.scalatest.tools.Runner.main(Runner.scala)
>
> ```
>
> I think the test case being executed is `SPARK-28323: PythonUDF should be
> able to use in join condition`; does anyone have the same problem?
>
>
>
> Yang Jie
>
>
>
>
>
> *From:* huaxin gao <hu...@gmail.com>
> *Date:* Tuesday, November 15, 2022, 13:59
> *To:* "L. C. Hsieh" <vi...@gmail.com>
> *Cc:* Dongjoon Hyun <do...@gmail.com>, Chao Sun <
> sunchao@apache.org>, dev <de...@spark.apache.org>
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> +1
>
>
>
> Thanks Chao!
>
>
>
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
>
> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com>
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running it on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark you can set up a virtual env and install
> >> the current RC and see if anything important breaks; in Java/Scala
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===========================================
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK
> and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by "Yang,Jie(INF)" <ya...@baidu.com>.
I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress and it has not hung so far.

Yang Jie

From: Dongjoon Hyun <do...@gmail.com>
Date: Wednesday, November 16, 2022, 01:17
To: "Yang,Jie(INF)" <ya...@baidu.com>
Cc: huaxin gao <hu...@gmail.com>, "L. C. Hsieh" <vi...@gmail.com>, Chao Sun <su...@apache.org>, dev <de...@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

Did you hit that in Scala 2.12, too?

Dongjoon.

On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com> wrote:
Hi, all

I tested v3.2.3 with the following command:

```

dev/change-scala-version.sh 2.13
build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive  -Pscala-2.13 -fn
```

The testing environment is:

OS: CentOS 6u3 Final
Java: zulu 11.0.17
Python: 3.9.7
Scala: 2.13

The above test command has been executed twice, and both times it hung with the following stack:

```
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition  [0x00007f2de3929000]
   java.lang.Thread.State: WAITING (parking)
       at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
       - parking to wait for  <0x0000000790d00050> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
       at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17/LockSupport.java:194)
       at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17/AbstractQueuedSynchronizer.java:2081)
       at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17/LinkedBlockingQueue.java:433)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown Source)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
       - locked <0x0000000790d00208> (a java.lang.Object)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
       at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
       at org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown Source)
       at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
       at org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown Source)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
       - locked <0x0000000790d00218> (a org.apache.spark.sql.execution.QueryExecution)
       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
       at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
       - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
       at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
       at org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
       at org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown Source)
       at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
       at org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
       at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
       at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
       at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
       at org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
       at org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown Source)
       at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
       at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
       at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
       at org.scalatest.Transformer.apply(Transformer.scala:22)
       at org.scalatest.Transformer.apply(Transformer.scala:20)
       at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
       at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
       at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown Source)
       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
       at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
       at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
       at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
       at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown Source)
       at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
       at org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown Source)
       at scala.collection.immutable.List.foreach(List.scala:333)
       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
       at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
       at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
       at org.scalatest.Suite.run(Suite.scala:1112)
       at org.scalatest.Suite.run$(Suite.scala:1094)
       at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown Source)
       at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
       at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
       at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
       at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
       at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
       at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
       at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
       at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
       at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
       at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
       at org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
       at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
       at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
       at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
       at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
       at org.scalatest.Suite.run(Suite.scala:1109)
       at org.scalatest.Suite.run$(Suite.scala:1094)
       at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
       at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
       at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
       at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
       at org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown Source)
       at scala.collection.immutable.List.foreach(List.scala:333)
       at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
       at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
       at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
       at org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown Source)
       at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
       at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
       at org.scalatest.tools.Runner$.main(Runner.scala:775)
       at org.scalatest.tools.Runner.main(Runner.scala)
```
I think the test case being executed is `SPARK-28323: PythonUDF should be able to use in join condition`; does anyone have the same problem?

Yang Jie
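
As a side note for anyone digging into a hang like this: the parked ScalaTest main thread can be pulled out of a saved `jstack` dump with a couple of lines of shell. A minimal sketch (the `dump.txt` file and its contents below are a shortened stand-in for the real thread dump above, not the actual file):

```shell
# Write a shortened stand-in for the real thread dump, then grep out the
# ScalaTest main thread plus the frames showing where it is parked.
cat > dump.txt <<'EOF'
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 waiting on condition
   java.lang.Thread.State: WAITING (parking)
       at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
EOF

# -A 2 prints the matched thread header and the two lines after it.
grep -A 2 '^"ScalaTest-main-running-' dump.txt
```

Against a real dump, the same grep quickly shows whether the main thread is parked in `LinkedBlockingQueue.take` under `AdaptiveSparkPlanExec`, as in the trace above.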


From: huaxin gao <hu...@gmail.com>
Date: Tuesday, November 15, 2022, 13:59
To: "L. C. Hsieh" <vi...@gmail.com>
Cc: Dongjoon Hyun <do...@gmail.com>, Chao Sun <su...@apache.org>, dev <de...@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

+1

Thanks Chao!

On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===========================================
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
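
One concrete way to follow the staging-repository step from the quoted FAQ, for sbt users, is a temporary resolver entry. A minimal sketch using the staging URL from this vote (the dependency line is illustrative, not a prescribed setup):

```scala
// build.sbt fragment: resolve the Spark 3.2.3 RC1 artifacts from the
// Apache staging repository while testing. Remove this resolver once
// the vote concludes so regular builds use released artifacts only.
resolvers += "Apache Spark 3.2.3 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1431/"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.3" % Provided
```

Clearing `~/.ivy2/cache/org.apache.spark` (or the Coursier cache) before and after matches the artifact-cache advice in the FAQ.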

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Dongjoon Hyun <do...@gmail.com>.
Did you hit that in Scala 2.12, too?

Dongjoon.

On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <ya...@baidu.com> wrote:

> Hi, all
>
>
>
> I tested v3.2.3 with the following command:
>
>
>
> ```
>
> dev/change-scala-version.sh 2.13
>
> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn
> -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
> -Pscala-2.13 -fn
>
> ```
>
>
>
> The testing environment is:
>
>
>
> OS: CentOS 6u3 Final
>
> Java: zulu 11.0.17
>
> Python: 3.9.7
>
> Scala: 2.13
>
>
>
> The above test command has been executed twice, and both times it hung with
> the following stack:
>
>
>
> ```
>
> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
> elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition
> [0x00007f2de3929000]
>
>    java.lang.Thread.State: WAITING (parking)
>
>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>
>        - parking to wait for  <0x0000000790d00050> (a
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>
>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
> /LockSupport.java:194)
>
>        at
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17
> /AbstractQueuedSynchronizer.java:2081)
>
>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17
> /LinkedBlockingQueue.java:433)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown
> Source)
>
>        at
> org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>
>        - locked <0x0000000790d00208> (a java.lang.Object)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>
>        at
> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>
>        at
> org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown
> Source)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>
>        at
> org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown
> Source)
>
>        at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>
>        at
> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>
>        at
> org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>
>        - locked <0x0000000790d00218> (a
> org.apache.spark.sql.execution.QueryExecution)
>
>        at
> org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>
>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>
>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>
>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>
>        at
> org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>
>        at
> org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown
> Source)
>
>        at
> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>
>        at
> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>
>        at
> org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>
>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>
>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>
>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>
>        at
> org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>
>        at
> org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown
> Source)
>
>        at
> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>
>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>
>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>
>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>
>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>
>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>
>        at
> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown
> Source)
>
>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>
>        at org.apache.spark.SparkFunSuite.org
> $scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>
>        at
> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>
>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown
> Source)
>
>        at
> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>
>        at
> org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown
> Source)
>
>        at scala.collection.immutable.List.foreach(List.scala:333)
>
>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>
>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>
>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>
>        at
> org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>
>        at org.scalatest.Suite.run(Suite.scala:1112)
>
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>
>        at org.scalatest.funsuite.AnyFunSuite.org
> $scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown
> Source)
>
>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>
>        at
> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>
>        at org.apache.spark.SparkFunSuite.org
> $scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>
>        at
> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>
>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>
>        at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>
>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>
>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>
>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>
>        at
> org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>
>        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>
>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>
>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>
>        at
> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>
>        at org.scalatest.Suite.run(Suite.scala:1109)
>
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>
>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>
>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>
>        at
> org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown
> Source)
>
>        at scala.collection.immutable.List.foreach(List.scala:333)
>
>        at
> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>
>        at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>
>        at
> org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown
> Source)
>
>        at
> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>
>        at
> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>
>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>
>        at org.scalatest.tools.Runner.main(Runner.scala)
>
> ```
>
> I think the test case being executed is `SPARK-28323: PythonUDF should be
> able to use in join condition`. Does anyone have the same problem?
>
>
>
> Yang Jie
>
>
>
>
>
> *From:* huaxin gao <hu...@gmail.com>
> *Date:* Tuesday, November 15, 2022, 13:59
> *To:* "L. C. Hsieh" <vi...@gmail.com>
> *Cc:* Dongjoon Hyun <do...@gmail.com>, Chao Sun <
> sunchao@apache.org>, dev <de...@spark.apache.org>
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> +1
>
>
>
> Thanks Chao!
>
>
>
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
>
> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com>
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark, you can set up a virtual env, install the
> >> current RC, and see if anything important breaks; in Java/Scala, you can
> >> add the staging repository to your project's resolvers and test with the
> >> RC (make sure to clean up the artifact cache before/after so you don't
> >> end up building with an out-of-date RC going forward).
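The staging-resolver step described in the paragraph above can be sketched as a small sbt fragment. The repository id `orgapachespark-1431` comes from the staging URL earlier in this email; the file name `rc-staging.sbt` is an illustrative choice, not something from the original message:

```shell
# Hedged sketch: write an sbt fragment that adds the RC staging repository,
# so the build resolves Spark 3.2.3 artifacts from the candidate repo.
# The repository URL is the staging URL from this vote email; the file
# name rc-staging.sbt is illustrative.
cat > rc-staging.sbt <<'EOF'
resolvers += "Spark 3.2.3 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1431/"
EOF
cat rc-staging.sbt
```

Dropping such a fragment into an sbt project (and removing it after the vote) avoids hand-editing the main build definition; as the email notes, clearing the local artifact cache before and after keeps stale RC artifacts out of later builds.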
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===========================================
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK
> and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Mridul Muralidharan <mr...@gmail.com>.
+1

Signatures, digests, etc. check out fine.
Checked out the tag and built/tested with -Pyarn -Pmesos -Pkubernetes.

Regards,
Mridul
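The digest check Mridul mentions can be sketched as follows; here `sample.tgz` is a stand-in for the real `spark-3.2.3-*.tgz` download and its `.sha512` file from the dist area (the signature check would additionally use `gpg --verify` against the KEYS file):

```shell
# Hedged sketch of the sha512 digest check done when verifying an RC.
# sample.tgz is a stand-in for the downloaded release tarball; in a real
# check, the .sha512 file comes from dist.apache.org rather than being
# generated locally.
printf 'stand-in for a release artifact' > sample.tgz
sha512sum sample.tgz > sample.tgz.sha512   # stand-in for the published digest
sha512sum -c sample.tgz.sha512             # prints "sample.tgz: OK" if intact
```

For the actual RC, the same `-c` invocation is run against the published `.sha512` file, so any corruption or tampering of the tarball makes the check fail.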



Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by kazuyuki tanimura <kt...@apple.com.INVALID>.
+1 (non-binding)

Thank you Chao

Kazu


 | Kazuyuki Tanimura | ktanimura@apple.com | +1-408-207-7176

Apple Confidential and Proprietary Information

This email and any attachments is privileged and contains confidential information intended only for the recipient(s) named above. Any other distribution, forwarding, copying or disclosure of this message is strictly prohibited. If you have received this email in error, please notify me immediately by telephone or return email, and delete this message from your system.

> On Nov 15, 2022, at 10:04 AM, Sean Owen <sr...@gmail.com> wrote:
> 
> +1 from me, at least from my testing. Java 8 + Scala 2.12 and Java 8 + Scala 2.13 worked for me, and I didn't see a test hang. I am testing with Python 3.10 FWIW.
> 
> On Tue, Nov 15, 2022 at 6:37 AM Yang,Jie(INF) <yangjie01@baidu.com <ma...@baidu.com>> wrote:
> Hi, all
> 
>  
> 
> I test v3.2.3 with following command:
> 
>  
> 
> ```
> 
> dev/change-scala-version.sh 2.13
> 
> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive  -Pscala-2.13 -fn
> 
> ```
> 
>  
> 
> The testing environment is:
> 
>  
> 
> OS: CentOS 6u3 Final
> 
> Java: zulu 11.0.17
> 
> Python: 3.9.7
> 
> Scala: 2.13
> 
>  
> 
> The above test command has been executed twice, and all times hang in the following stack:
> 
>  
> 
> ```
> 
> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition  [0x00007f2de3929000]
> 
>    java.lang.Thread.State: WAITING (parking)
> 
>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
> 
>        - parking to wait for  <0x0000000790d00050> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
> 
>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17/LockSupport.java:194)
> 
>        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17/AbstractQueuedSynchronizer.java:2081)
>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17/LinkedBlockingQueue.java:433)
>        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown Source)
>        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>        - locked <0x0000000790d00208> (a java.lang.Object)
>        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>        at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>        at org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown Source)
>        at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>        at org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown Source)
>        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>        - locked <0x0000000790d00218> (a org.apache.spark.sql.execution.QueryExecution)
>        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>        at org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>        at org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown Source)
>        at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>        at org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>        at org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>        at org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown Source)
>        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>        at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>        at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>        at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>        at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown Source)
>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>        at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>        at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>        at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>        at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>        at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown Source)
>        at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>        at org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown Source)
>        at scala.collection.immutable.List.foreach(List.scala:333)
>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>        at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>        at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>        at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>        at org.scalatest.Suite.run(Suite.scala:1112)
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>        at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>        at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown Source)
>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>        at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>        at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>        at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>        at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>        at org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>        at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>        at org.scalatest.Suite.run(Suite.scala:1109)
>        at org.scalatest.Suite.run$(Suite.scala:1094)
>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>        at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>        at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>        at org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown Source)
>        at scala.collection.immutable.List.foreach(List.scala:333)
>        at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>        at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>        at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>        at org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown Source)
>        at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>        at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>        at org.scalatest.tools.Runner.main(Runner.scala)
> 
> ```
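A hang like the one in the quoted dump can be triaged with the standard JDK tools. As a sketch: the `jps`/`jstack` step is shown as a comment because it needs the live, hung JVM; the grep step is demonstrated against a saved two-line sample of such a dump (the file name `dump.txt` is just an illustration).

```shell
# Capture a thread dump of the hung ScalaTest JVM (requires the live process):
#   jstack "$(jps -lm | grep -m1 ScalaTest | cut -d' ' -f1)" > dump.txt
# Then look for which suite the main test thread is still running.
# Demonstrated here on a saved two-line sample of such a dump:
cat > dump.txt <<'EOF'
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 waiting on condition
   java.lang.Thread.State: WAITING (parking)
EOF
grep -o 'ScalaTest-main-running-[A-Za-z]*' dump.txt
```

The thread name encodes the suite, so the grep immediately points at `JoinSuite` without reading the whole dump.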
> 
> I think the test case being executed is `SPARK-28323: PythonUDF should be able to use in join condition`. Does anyone else see the same problem?
> 
>  
> 
> Yang Jie
> 
>  
> 
>  
> 
> From: huaxin gao <huaxin.gao11@gmail.com>
> Date: Tuesday, November 15, 2022, 13:59
> To: "L. C. Hsieh" <viirya@gmail.com>
> Cc: Dongjoon Hyun <dongjoon.hyun@gmail.com>, Chao Sun <sunchao@apache.org>, dev <dev@spark.apache.org>
> Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
> 
>  
> 
> +1 
> 
>  
> 
> Thanks Chao!
> 
>  
> 
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <viirya@gmail.com> wrote:
> 
> +1
> 
> Thanks Chao.
> 
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <dongjoon.hyun@gmail.com> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <sunchao@apache.org> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
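Concretely, verifying the artifacts above can be sketched as follows. The gpg commands are shown as comments because they need the downloaded release files, and `spark-3.2.3-bin-hadoop3.2.tgz` is only an assumed example of an artifact name; the `.sha512` check works mechanically like the local demonstration on a stand-in file.

```shell
# Real verification against the RC artifacts (network required), as comments:
#   curl -O https://dist.apache.org/repos/dist/dev/spark/KEYS
#   gpg --import KEYS
#   gpg --verify spark-3.2.3-bin-hadoop3.2.tgz.asc spark-3.2.3-bin-hadoop3.2.tgz   # name assumed
# The published .sha512 files are checked the same way as this local demo:
printf 'dummy artifact' > artifact.tgz
sha512sum artifact.tgz > artifact.tgz.sha512
sha512sum -c artifact.tgz.sha512   # prints "artifact.tgz: OK" on success
```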
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release uses the release script from the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running it on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark, you can set up a virtual env, install
> >> the current RC, and see if anything important breaks. In Java/Scala,
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
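As a sketch of that setup: the pyspark tarball name under the `-bin/` directory is an assumption (check the actual file listing there), and the sbt resolver line is shown as a comment since it belongs in a build file, not a shell session.

```shell
# Throwaway virtual env for testing the RC:
python3 -m venv /tmp/spark-rc-venv
. /tmp/spark-rc-venv/bin/activate
# Install the RC's pyspark from the staging dist area (network required; file name assumed):
#   pip install https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/pyspark-3.2.3.tar.gz
# For Java/Scala, point sbt at the staging repository instead (in build.sbt):
#   resolvers += "spark-rc" at "https://repository.apache.org/content/repositories/orgapachespark-1431/"
python -c 'import sys; print(sys.prefix)'   # confirms the venv python is active
```

Deleting `/tmp/spark-rc-venv` afterwards keeps the RC artifacts out of your regular environment.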
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===========================================
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
> 
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Sean Owen <sr...@gmail.com>.
+1 from me, at least from my testing. Java 8 + Scala 2.12 and Java 8 +
Scala 2.13 worked for me, and I didn't see a test hang. I am testing with
Python 3.10 FWIW.

On Tue, Nov 15, 2022 at 6:37 AM Yang,Jie(INF) <ya...@baidu.com> wrote:

> [quoted text snipped]

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by "Yang,Jie(INF)" <ya...@baidu.com>.
Hi, all

I tested v3.2.3 with the following command:

```
dev/change-scala-version.sh 2.13
build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pscala-2.13 -fn
```

The testing environment is:

OS: CentOS 6u3 Final
Java: zulu 11.0.17
Python: 3.9.7
Scala: 2.13

I ran the above test command twice, and both runs hung with the following stack:

```
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition  [0x00007f2de3929000]
   java.lang.Thread.State: WAITING (parking)
       at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
       - parking to wait for  <0x0000000790d00050> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
       at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17/LockSupport.java:194)
       at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17/AbstractQueuedSynchronizer.java:2081)
       at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17/LinkedBlockingQueue.java:433)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown Source)
       at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
       - locked <0x0000000790d00208> (a java.lang.Object)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
       at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
       at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
       at org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown Source)
       at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
       at org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown Source)
       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
       - locked <0x0000000790d00218> (a org.apache.spark.sql.execution.QueryExecution)
       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
       at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
       - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
       at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
       at org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
       at org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown Source)
       at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
       at org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
       at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
       at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
       at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
       at org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
       at org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown Source)
       at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
       at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
       at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
       at org.scalatest.Transformer.apply(Transformer.scala:22)
       at org.scalatest.Transformer.apply(Transformer.scala:20)
       at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
       at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
       at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown Source)
       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
       at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
       at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
       at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
       at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown Source)
       at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
       at org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown Source)
       at scala.collection.immutable.List.foreach(List.scala:333)
       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
       at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
       at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
       at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
       at org.scalatest.Suite.run(Suite.scala:1112)
       at org.scalatest.Suite.run$(Suite.scala:1094)
       at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
       at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
       at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown Source)
       at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
       at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
       at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
       at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
       at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
       at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
       at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
       at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
       at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
       at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
       at org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
       at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
       at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
       at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
       at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
       at org.scalatest.Suite.run(Suite.scala:1109)
       at org.scalatest.Suite.run$(Suite.scala:1094)
       at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
       at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
       at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
       at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
       at org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown Source)
       at scala.collection.immutable.List.foreach(List.scala:333)
       at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
       at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
       at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
       at org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown Source)
       at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
       at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
       at org.scalatest.tools.Runner$.main(Runner.scala:775)
       at org.scalatest.tools.Runner.main(Runner.scala)
```
I think the test case being executed is `SPARK-28323: PythonUDF should be able to use in join condition`. Does anyone else hit the same problem?

Yang Jie


From: huaxin gao <hu...@gmail.com>
Date: Tuesday, November 15, 2022, 13:59
To: "L. C. Hsieh" <vi...@gmail.com>
Cc: Dongjoon Hyun <do...@gmail.com>, Chao Sun <su...@apache.org>, dev <de...@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

+1

Thanks Chao!

On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:
+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
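
[Editor's note: the Java/Scala resolver setup described above can be sketched as a build.sbt fragment. This is a minimal sketch, not part of the thread; the resolver name and the chosen modules are illustrative, while the staging URL is the one given in this vote.]

```scala
// build.sbt fragment for testing against the 3.2.3 RC1 staging repository.
// The resolver name is arbitrary; sbt only cares about the URL.
resolvers += "Apache Spark 3.2.3 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1431/"

// Depend on the RC artifacts instead of a released 3.2.x version.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.3",
  "org.apache.spark" %% "spark-sql"  % "3.2.3"
)
```

As the email cautions, clear the local artifact cache (e.g. `~/.ivy2/cache/org.apache.spark` and `~/.m2/repository/org/apache/spark`) before and after testing, so later builds do not silently pick up the RC jars.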
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===========================================
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by huaxin gao <hu...@gmail.com>.
+1

Thanks Chao!

On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vi...@gmail.com> wrote:

> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com>
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =========================
> >> How can I help test this release?
> >> =========================
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark, you can set up a virtual env and install
> >> the current RC and see if anything important breaks; in Java/Scala,
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
> >>
> >> ===========================================
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===========================================
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==================
> >> But my bug isn't fixed?
> >> ==================
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
> >>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by "L. C. Hsieh" <vi...@gmail.com>.
+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <do...@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===========================================
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Spark 3.2.3 (RC1)

Posted by Dongjoon Hyun <do...@gmail.com>.
+1

Thank you, Chao.

On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <su...@apache.org> wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 3.2.3.
>
> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.2.3
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.2.3-rc1 (commit
> b53c341e0fefbb33d115ab630369a18765b7763d):
> https://github.com/apache/spark/tree/v3.2.3-rc1
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1431/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>
> The list of bug fixes going into 3.2.3 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>
> This release is using the release script of the tag v3.2.3-rc1.
>
>
> FAQ
>
> =========================
> How can I help test this release?
> =========================
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark, you can set up a virtual env and install
> the current RC and see if anything important breaks; in Java/Scala,
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===========================================
> What should happen to JIRA tickets still targeting 3.2.3?
> ===========================================
> The current list of open tickets targeted at 3.2.3 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.2.3
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==================
> But my bug isn't fixed?
> ==================
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>