Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/08/07 21:54:27 UTC

[GitHub] [spark] tgravescs commented on a change in pull request #25047: [SPARK-27371][CORE] Support GPU-aware resources scheduling in Standalone

tgravescs commented on a change in pull request #25047: [SPARK-27371][CORE] Support GPU-aware resources scheduling in Standalone
URL: https://github.com/apache/spark/pull/25047#discussion_r311779752
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/deploy/master/Master.scala
 ##########
 @@ -683,8 +702,7 @@ private[deploy] class Master(
       if (app.coresLeft >= coresPerExecutor) {
         // Filter out workers that don't have enough resources to launch an executor
         val usableWorkers = workers.toArray.filter(_.state == WorkerState.ALIVE)
-          .filter(worker => worker.memoryFree >= app.desc.memoryPerExecutorMB &&
-            worker.coresFree >= coresPerExecutor)
+          .filter(canLaunchExecutor(_, app.desc))
           .sortBy(_.coresFree).reverse
 
 Review comment:
   We have an issue here: if no Worker has the resources required by the executor/task requirements, the Master doesn't warn or error, and it doesn't retry. Basically, I started a Worker without GPUs, then requested GPUs for my executor/task, and the application ended up hanging. One could argue this is OK, since someone could start another Worker that has the resources, but I think we at least need to warn about it.
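
   To make the suggestion concrete, here is a minimal, self-contained sketch (not the actual patch; WorkerInfo, AppDesc, canLaunchExecutor and the warning text below are simplified stand-ins, not the real classes in Master.scala) of how the empty-filter case could surface a warning instead of hanging silently:

       // Illustrative sketch only: simplified stand-ins for Spark's Master scheduling types.
       object ScheduleSketch {
         case class WorkerInfo(coresFree: Int, memoryFreeMB: Int, gpusFree: Int)
         case class AppDesc(coresPerExecutor: Int, memoryPerExecutorMB: Int, gpusPerExecutor: Int)

         // Mirrors the canLaunchExecutor(_, app.desc) filter from the diff above,
         // extended to a GPU-style resource requirement.
         def canLaunchExecutor(w: WorkerInfo, desc: AppDesc): Boolean =
           w.coresFree >= desc.coresPerExecutor &&
             w.memoryFreeMB >= desc.memoryPerExecutorMB &&
             w.gpusFree >= desc.gpusPerExecutor

         def pickUsableWorkers(workers: Seq[WorkerInfo], desc: AppDesc): Seq[WorkerInfo] = {
           val usable = workers.filter(canLaunchExecutor(_, desc)).sortBy(_.coresFree).reverse
           if (usable.isEmpty && workers.nonEmpty) {
             // Without some message here the app just sits with no executors and no retry,
             // which is the hang described in the review comment.
             println(s"WARN: no alive worker satisfies executor requirements $desc; " +
               "executors will not launch until a sufficiently-resourced worker registers.")
           }
           usable
         }

         def main(args: Array[String]): Unit = {
           val workers = Seq(WorkerInfo(coresFree = 8, memoryFreeMB = 16384, gpusFree = 0))
           val desc = AppDesc(coresPerExecutor = 2, memoryPerExecutorMB = 4096, gpusPerExecutor = 1)
           println(pickUsableWorkers(workers, desc)) // prints the warning, then List()
         }
       }

   Running main with a GPU-less worker and gpusPerExecutor = 1 prints the warning and returns an empty list, which mirrors the silent hang described above.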

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org