Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/22 04:41:27 UTC

[GitHub] [spark] Ngone51 commented on a change in pull request #29754: [SPARK-32875][CORE][TEST] TaskSchedulerImplSuite: For the pattern of submitTasks + resourceOffers + assert, extract the general method.

Ngone51 commented on a change in pull request #29754:
URL: https://github.com/apache/spark/pull/29754#discussion_r492469232



##########
File path: core/src/test/scala/org/apache/spark/scheduler/TaskSchedulerImplSuite.scala
##########
@@ -140,6 +140,16 @@ class TaskSchedulerImplSuite extends SparkFunSuite with LocalSparkContext with B
     taskScheduler
   }
 
+  private def submitTasksAndCheck(
+      offers: IndexedSeq[WorkerOffer],
+      taskSets: Seq[TaskSet] = Seq(FakeTask.createTaskSet(1)),
+      taskSchedulerOpt: Option[TaskSchedulerImpl] = None) (f: Seq[TaskDescription] => Any): Any = {

Review comment:
      Could we use the global `taskScheduler` as the default value?

##########
File path: core/src/test/scala/org/apache/spark/scheduler/TaskSchedulerImplSuite.scala
##########
@@ -140,6 +140,16 @@ class TaskSchedulerImplSuite extends SparkFunSuite with LocalSparkContext with B
     taskScheduler
   }
 
+  private def submitTasksAndCheck(
+      offers: IndexedSeq[WorkerOffer],

Review comment:
      I found that many tests use the same WorkerOffer, e.g., `WorkerOffer("executor0", "host0", 1)`. Shall we make it the default value as well?
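To illustrate the suggestion, here is a minimal, self-contained sketch of what the helper's signature could look like with the common single-core offer and a one-task task set as defaults. The `WorkerOffer`, `TaskSet`, and `TaskDescription` case classes below are hypothetical stand-ins for the real Spark test types, and the body only fakes scheduling; it is a shape sketch, not the actual `TaskSchedulerImplSuite` code.

```scala
// Hypothetical stubs standing in for the real Spark scheduler test types.
case class WorkerOffer(execId: String, host: String, cores: Int)
case class TaskSet(numTasks: Int)
case class TaskDescription(index: Int)

object DefaultArgSketch {
  // Shape of the suggested helper: the offer used by many tests and a
  // one-task task set become default arguments, so most call sites can
  // omit them entirely.
  def submitTasksAndCheck(
      offers: IndexedSeq[WorkerOffer] =
        IndexedSeq(WorkerOffer("executor0", "host0", 1)),
      taskSets: Seq[TaskSet] = Seq(TaskSet(1)))
      (f: Seq[TaskDescription] => Any): Any = {
    // Fake scheduling: one task description per offered core.
    val descs = offers.flatMap(o => (0 until o.cores).map(i => TaskDescription(i)))
    f(descs)
  }

  def main(args: Array[String]): Unit = {
    // A call site relying entirely on the defaults:
    submitTasksAndCheck() { descs =>
      assert(descs.size == 1)
      println(descs.size)
    }
  }
}
```

With both defaults in place, the common-case test body shrinks to just the assertion closure, while less common tests still override the offers or task sets explicitly.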

##########
File path: core/src/test/scala/org/apache/spark/scheduler/TaskSchedulerImplSuite.scala
##########
@@ -1911,11 +1912,12 @@ class TaskSchedulerImplSuite extends SparkFunSuite with LocalSparkContext with B
     clock.advance(2000)
 
     // Now give it some resources and both tasks should be rerun
-    val taskDescriptions = taskScheduler.resourceOffers(IndexedSeq(
-      WorkerOffer("executor2", "host2", 1), WorkerOffer("executor3", "host3", 1))).flatten
-    assert(taskDescriptions.size === 2)
-    assert(taskDescriptions.map(_.index).sorted == Seq(0, 1))
-    assert(manager.copiesRunning.take(2) === Array(1, 1))
+    submitTasksAndCheck(IndexedSeq(WorkerOffer("executor2", "host2", 1),
+      WorkerOffer("executor3", "host3", 1)), Seq.empty) { taskDescriptions =>

Review comment:
      Personally, I feel it's weird that we're submitting empty task sets here... `submitTasksAndCheck` may not be appropriate for this case? Or could you redesign `submitTasksAndCheck`?
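One possible redesign along these lines (again a sketch built on hypothetical stub types, not the real suite code) is to split the offer-and-assert part out of `submitTasksAndCheck`, so tests that rerun already-submitted tasks call the narrower helper instead of passing `Seq.empty`:

```scala
// Hypothetical stubs standing in for the real Spark scheduler test types.
case class WorkerOffer(execId: String, host: String, cores: Int)
case class TaskSet(numTasks: Int)
case class TaskDescription(index: Int)

object RedesignSketch {
  // Offers resources and runs the assertions; no task submission involved,
  // so a "rerun after executor loss" test reads naturally.
  def resourceOffersAndCheck(offers: IndexedSeq[WorkerOffer])
      (f: Seq[TaskDescription] => Any): Any = {
    // Fake scheduling: one task description per offered core.
    val descs = offers.flatMap(o => (0 until o.cores).map(i => TaskDescription(i)))
    f(descs)
  }

  // The submitting variant delegates to the narrower helper, so no call
  // site ever needs to pass an empty task-set sequence.
  def submitTasksAndCheck(
      offers: IndexedSeq[WorkerOffer],
      taskSets: Seq[TaskSet] = Seq(TaskSet(1)))
      (f: Seq[TaskDescription] => Any): Any = {
    // In the real suite, taskSets would be submitted to the scheduler here.
    resourceOffersAndCheck(offers)(f)
  }
}
```

Under this split, the snippet in the diff would call `resourceOffersAndCheck` with the two one-core offers and keep its three assertions in the closure, while the submitting tests keep using `submitTasksAndCheck`.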




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org