Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/07 20:38:28 UTC

[GitHub] HeartSaVioR commented on a change in pull request #23743: [SPARK-26843][MESOS] Use ConfigEntry for hardcoded configs for "mesos" resource manager
URL: https://github.com/apache/spark/pull/23743#discussion_r254859107
 
 

 ##########
 File path: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackendUtil.scala
 ##########
 @@ -34,11 +34,11 @@ import org.apache.spark.internal.Logging
  */
 private[mesos] object MesosSchedulerBackendUtil extends Logging {
   /**
-   * Parse a comma-delimited list of volume specs, each of which
+   * Parse a list of volume specs, each of which
    * takes the form [host-dir:]container-dir[:rw|:ro].
    */
-  def parseVolumesSpec(volumes: String): List[Volume] = {
-    volumes.split(",").map(_.split(":")).flatMap { spec =>
+  def parseVolumesSpec(volumes: Seq[String]): List[Volume] = {
 
 Review comment:
  This is only used for parsing a config value, and we are changing the config type to `Seq[String]` instead of a comma-separated `String`. The same applies to the methods below.
   
  A similar change was suggested while working on ConfigEntry for other modules (I don't remember which one it was...).
   
  I don't think changing them would hurt, since the object is package-private to `mesos`.
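  
  A minimal sketch of the signature change being discussed: the parser takes a `Seq[String]` (one entry per volume spec) rather than splitting a single comma-separated `String` itself. `VolumeSpec` here is a hypothetical stand-in for the Mesos protobuf `Volume` builder so the sketch stays self-contained; the real method in `MesosSchedulerBackendUtil` builds protobuf objects and logs a warning on unparseable specs.
  
  ```scala
  // Hypothetical stand-in for the Mesos protobuf Volume, for illustration only.
  case class VolumeSpec(hostDir: Option[String], containerDir: String, mode: String)
  
  // Each entry takes the form [host-dir:]container-dir[:rw|:ro].
  def parseVolumesSpec(volumes: Seq[String]): List[VolumeSpec] = {
    volumes.map(_.split(":")).flatMap { spec =>
      spec match {
        case Array(containerDir) =>
          Some(VolumeSpec(None, containerDir, "rw"))
        // Match the rw/ro mode first so "container:ro" is not mistaken
        // for a host-dir:container-dir pair.
        case Array(containerDir, mode @ ("rw" | "ro")) =>
          Some(VolumeSpec(None, containerDir, mode))
        case Array(hostDir, containerDir) =>
          Some(VolumeSpec(Some(hostDir), containerDir, "rw"))
        case Array(hostDir, containerDir, mode @ ("rw" | "ro")) =>
          Some(VolumeSpec(Some(hostDir), containerDir, mode))
        case _ =>
          None // unparseable spec; the real code logs a warning here
      }
    }.toList
  }
  ```
  
  On the ConfigEntry side, `ConfigBuilder(...).stringConf.toSequence` yields a `Seq[String]`, which is why the comma-splitting moves out of this method.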

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org