Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/30 07:34:33 UTC

[GitHub] [spark] HyukjinKwon commented on a diff in pull request #37721: [SPARK-40272][CORE]Support service port custom with range

HyukjinKwon commented on code in PR #37721:
URL: https://github.com/apache/spark/pull/37721#discussion_r958110189


##########
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##########
@@ -2429,4 +2429,21 @@ package object config {
       .version("3.4.0")
       .timeConf(TimeUnit.MILLISECONDS)
       .createWithDefaultString("5s")
+
+  private[spark] val CUSTOM_SERVICE_PORT =
+    ConfigBuilder("spark.service.port.custom")
+      .doc("Customize the Spark service port range for " +
+        "services such as SparkUI.port, SparkDriver.port, NettyService.port, etc. " +
+        "The starting port number is spark.service.port.custom.origin " +
+        "and the size of the range is limited by spark.port.maxRetries. " +
+        "E.g. with spark.service.port.custom.origin=49152 and spark.port.maxRetries=10, " +
+        "the actual port is greater than or equal to 49152 and less than 49163.")
+      .booleanConf
+      .createWithDefault(false)
+
+  private[spark] val CUSTOM_SERVICE_PORT_ORIGIN =
+    ConfigBuilder("spark.service.port.custom.origin")
+      .intConf
+      .checkValue(v => v >= 1024, "The threshold should be greater than or equal to 1024.")
+      .createWithDefault(49152)

Review Comment:
   Should probably have one option that's defined with `createOptional`.
   
   BTW, why do we need this though? Can't we simply use 0 to automatically find an available port?
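
   The "use 0" suggestion relies on standard socket behavior: binding to port 0 asks the OS to assign any free ephemeral port, with no explicit range configuration needed. A minimal, self-contained Scala sketch of that behavior (independent of Spark's internal `ConfigBuilder`, shown purely for illustration):

   ```scala
   import java.net.ServerSocket

   object EphemeralPortDemo extends App {
     // Binding to port 0 lets the OS pick any free ephemeral port,
     // avoiding the need for a custom port range plus retries.
     val socket = new ServerSocket(0)
     try {
       val port = socket.getLocalPort
       // The kernel-assigned port is a real, positive port number.
       assert(port > 0 && port <= 65535)
       println(s"OS assigned port: $port")
     } finally {
       socket.close()
     }
   }
   ```

   This is the same mechanism Spark already uses when a port config is set to 0, which is the basis of the reviewer's question.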



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

