Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/07/04 03:23:10 UTC

[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25047: [WIP][SPARK-27371][CORE] Support GPU-aware resources scheduling in Standalone

dongjoon-hyun commented on a change in pull request #25047: [WIP][SPARK-27371][CORE] Support GPU-aware resources scheduling in Standalone
URL: https://github.com/apache/spark/pull/25047#discussion_r300216448
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##########
 @@ -380,6 +380,17 @@ class SparkContext(config: SparkConf) extends Logging {
 
     val resourcesFileOpt = conf.get(DRIVER_RESOURCES_FILE)
     _resources = getOrDiscoverAllResources(_conf, SPARK_DRIVER_PREFIX, resourcesFileOpt)
+    // A driver submitted in client mode under Standalone may claim resources that
+    // conflict with workers on this host. We should sync the driver's resource info
+    // into SPARK_RESOURCES to avoid collisions.
+    if (deployMode == "client" && (master.startsWith("spark://")
+      || master.startsWith("local-cluster"))) {
+      val requests = parseAllResourceRequests(_conf, SPARK_DRIVER_PREFIX).map { req =>
+        req.id.resourceName -> req.amount
+      }.toMap
+      // TODO(wuyi) log driver's acquired resources separately ?
 
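 The comment in the hunk above describes the core idea of the change: a client-mode driver on a Standalone host claims resources that co-located workers must not hand out again, so the driver's claims get synced into SPARK_RESOURCES. Below is a minimal sketch of that idea in plain Scala; it is an illustration only, not the PR's actual code, and every name in it (ResourceRequest, syncDriverResources, the name=amount file format) is hypothetical:

     import java.nio.file.{Files, Paths, StandardOpenOption}

     // Hypothetical stand-in for Spark's resource request (a name plus an amount).
     case class ResourceRequest(resourceName: String, amount: Int)

     object ResourceSyncSketch {
       // Mirror the diff's `.map { req => ... }.toMap` shape: collapse the
       // driver's requests into, e.g., Map("gpu" -> 2).
       def toAmounts(requests: Seq[ResourceRequest]): Map[String, Int] =
         requests.map(req => req.resourceName -> req.amount).toMap

       // Record the claimed amounts under a SPARK_RESOURCES-like directory so a
       // worker on the same host can exclude them before making resource offers.
       def syncDriverResources(dir: String, amounts: Map[String, Int]): Unit = {
         val body = amounts.map { case (name, n) => s"$name=$n" }.mkString("\n")
         Files.createDirectories(Paths.get(dir))
         Files.write(Paths.get(dir, "driver_resources"), body.getBytes("UTF-8"),
           StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)
       }
     }

 The name=amount line format is the simplest possible choice for the sketch; any shared store that workers on the host can read atomically would serve the same purpose.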
 Review comment:
  Hi, @Ngone51. Please don't use a user-id TODO in the patch. As you know, the Apache Spark repository already has a few ancient user-id TODOs like this that are still not fixed. :)
  Since we don't know the future, let's use a JIRA-IDed TODO like `TODO(SPARK-XXX)` instead.
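
  For illustration, the requested convention looks like the following. SPARK-27371 is simply this PR's JIRA, used as a stand-in; a real follow-up TODO would cite whichever ticket actually tracks the work:

      // Avoid: a user-id TODO goes stale once its author moves on.
      // TODO(wuyi) log driver's acquired resources separately ?

      // Prefer: a JIRA-IDed TODO stays traceable to an open ticket.
      // TODO(SPARK-27371): log the driver's acquired resources separately.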

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org