Posted to commits@spark.apache.org by ji...@apache.org on 2020/05/22 06:35:38 UTC
[spark] branch master updated: [SPARK-31784][CORE][TEST] Fix test
BarrierTaskContextSuite."share messages with allGather() call"
This is an automated email from the ASF dual-hosted git repository.
jiangxb1987 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 83d0967 [SPARK-31784][CORE][TEST] Fix test BarrierTaskContextSuite."share messages with allGather() call"
83d0967 is described below
commit 83d0967dcc6b205a3fd2003e051f49733f63cb30
Author: yi.wu <yi...@databricks.com>
AuthorDate: Thu May 21 23:34:11 2020 -0700
[SPARK-31784][CORE][TEST] Fix test BarrierTaskContextSuite."share messages with allGather() call"
### What changes were proposed in this pull request?
Change from `messages.toList.iterator` to `Iterator.single(messages.toList)`.
### Why are the changes needed?
In this test, the expected result of `rdd2.collect().head` should actually be `List("0", "1", "2", "3")` but is `"0"` now.
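The distinction comes from `mapPartitions` semantics: it takes a function `Iterator[T] => Iterator[U]`, and every element of the returned iterator becomes a separate element of the resulting RDD. A minimal sketch in plain Scala (no Spark, with `messages` hard-coded to the value `allGather` would return in this test) shows why the original expression made `collect().head` yield `"0"` instead of the full list:

```scala
// Sketch of the two iterator shapes, outside Spark. `messages` stands in for
// the Array[String] that context.allGather(message) returns in each task.
object AllGatherFixSketch {
  def main(args: Array[String]): Unit = {
    val messages: Array[String] = Array("0", "1", "2", "3")

    // Before the fix: the iterator yields four separate String elements,
    // so the first collected element is just "0".
    val before: Iterator[String] = messages.toList.iterator
    assert(before.next() == "0")

    // After the fix: the iterator yields exactly one element, the whole
    // List[String], so every collected element is the full shared list.
    val after: Iterator[List[String]] = Iterator.single(messages.toList)
    assert(after.next() == List("0", "1", "2", "3"))

    println("ok")
  }
}
```

With four barrier tasks each returning `Iterator.single(messages.toList)`, `rdd2.collect()` therefore produces four elements, each equal to `List("0", "1", "2", "3")`, which is exactly what the updated assertions check.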
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Updated test.
Thanks to WeichenXu123 for reporting this problem.
Closes #28596 from Ngone51/fix_allgather_test.
Authored-by: yi.wu <yi...@databricks.com>
Signed-off-by: Xingbo Jiang <xi...@databricks.com>
---
.../org/apache/spark/scheduler/BarrierTaskContextSuite.scala | 10 +++++-----
1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/core/src/test/scala/org/apache/spark/scheduler/BarrierTaskContextSuite.scala b/core/src/test/scala/org/apache/spark/scheduler/BarrierTaskContextSuite.scala
index b5614b2..6191e41 100644
--- a/core/src/test/scala/org/apache/spark/scheduler/BarrierTaskContextSuite.scala
+++ b/core/src/test/scala/org/apache/spark/scheduler/BarrierTaskContextSuite.scala
@@ -69,12 +69,12 @@ class BarrierTaskContextSuite extends SparkFunSuite with LocalSparkContext with
// Pass partitionId message in
val message: String = context.partitionId().toString
val messages: Array[String] = context.allGather(message)
- messages.toList.iterator
+ Iterator.single(messages.toList)
}
- // Take a sorted list of all the partitionId messages
- val messages = rdd2.collect().head
- // All the task partitionIds are shared
- for((x, i) <- messages.view.zipWithIndex) assert(x.toString == i.toString)
+ val messages = rdd2.collect()
+ // All the task partitionIds are shared across all tasks
+ assert(messages.length === 4)
+ assert(messages.forall(_ == List("0", "1", "2", "3")))
}
test("throw exception if we attempt to synchronize with different blocking calls") {