Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/12/24 16:10:43 UTC

Ngone51 commented on a change in pull request #27004: [SPARK-30348][CORE] Fix flaky test failure on "MasterSuite.SPARK-27510: Master should avoid ..."
URL: https://github.com/apache/spark/pull/27004#discussion_r361194044
 
 

 ##########
 File path: core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala
 ##########
 @@ -682,7 +682,11 @@ class MasterSuite extends SparkFunSuite
         // an app would be registered with Master once Driver set up
         assert(worker.apps.nonEmpty)
         appId = worker.apps.head._1
-        assert(master.idToApp.contains(appId))
+
 +        // We found a case where the test ran so fast that all the steps finished within
 +        // a single interval. In that case, the app may either still be registered with the
 +        // Master or already be marked as completed, so accept both. See SPARK-30348 for details.
+        assert(master.idToApp.contains(appId) || master.completedApps.exists(_.id == appId))
 
 Review comment:
   I think hitting `assert(master.completedApps.exists(_.id == appId))` isn't what we want to see for this test.
   
   Would a short delay before `assert(master.idToApp.contains(appId))` or increasing `MAX_EXECUTOR_RETRIES` work? I'd also prefer not to expose internal data for such a fix.
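   
   For illustration only, here is a rough sketch of how the "short delay" could look as a polling check with ScalaTest's `Eventually` (this is not code from this PR; the timeout and interval values are placeholder assumptions, `master` and `appId` are the values already set up in this test, and whether this actually avoids the race described in SPARK-30348 is exactly the question above):
   
   ```scala
   import org.scalatest.concurrent.Eventually._
   import org.scalatest.time.SpanSugar._
   
   // Retry the check for a bounded time instead of asserting only once.
   // The timeout/interval values are arbitrary placeholders.
   eventually(timeout(10.seconds), interval(100.millis)) {
     assert(master.idToApp.contains(appId))
   }
   ```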
