Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/08 11:55:14 UTC
[GitHub] [spark] advancedxy commented on a change in pull request #32819: [WIP][SPARK-35677][Core][SQL] Support dynamic range of executor numbers for dynamic allocation
advancedxy commented on a change in pull request #32819:
URL: https://github.com/apache/spark/pull/32819#discussion_r704346156
##########
File path: core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala
##########
@@ -862,6 +854,15 @@ private[spark] class ExecutorAllocationManager(
}
}
+ override def onExecutorAllocatorRangeUpdate(
+ event: SparkListenerExecutorAllocatorRangeUpdate): Unit = {
+ val newLower = event.lower.getOrElse(_minNumExecutors)
+ val newUpper = event.upper.getOrElse(_maxNumExecutors)
+ validateSettings(newLower, newUpper)
Review comment:
I believe this should be like
```
try {
  validateSettings(newLower, newUpper)
  _minNumExecutors = newLower
  _maxNumExecutors = newUpper
} catch {
  ....
}
```
Otherwise an exception is thrown inside this listener.
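To illustrate the pattern suggested above, here is a minimal standalone sketch (the class and method names are illustrative, not Spark's actual internals): validate the new bounds first, commit them only on success, and catch the validation failure so the exception never escapes the event handler.

```scala
object AllocationRangeExample {
  // Simplified stand-in for ExecutorAllocationManager's min/max executor state.
  final class AllocationRange(private var minExecutors: Int,
                              private var maxExecutors: Int) {

    // Stand-in for validateSettings: reject nonsensical bounds.
    private def validateSettings(lower: Int, upper: Int): Unit = {
      require(lower >= 0, s"lower bound $lower must be non-negative")
      require(lower <= upper,
        s"lower bound $lower must not exceed upper bound $upper")
    }

    /** Returns true if the update was applied, false if it was rejected. */
    def onRangeUpdate(lower: Option[Int], upper: Option[Int]): Boolean = {
      val newLower = lower.getOrElse(minExecutors)
      val newUpper = upper.getOrElse(maxExecutors)
      try {
        validateSettings(newLower, newUpper)
        // Commit only after validation succeeds.
        minExecutors = newLower
        maxExecutors = newUpper
        true
      } catch {
        case _: IllegalArgumentException =>
          // In a real listener this would be logged; the key point is that
          // the exception must not propagate out of the event handler.
          false
      }
    }

    def current: (Int, Int) = (minExecutors, maxExecutors)
  }

  def main(args: Array[String]): Unit = {
    val range = new AllocationRange(1, 10)
    println(range.onRangeUpdate(Some(2), Some(8)))   // valid: applied
    println(range.onRangeUpdate(Some(20), Some(5)))  // invalid: rejected, state kept
    println(range.current)
  }
}
```

A rejected update leaves the previously valid range in place, which is the behavior the review comment is asking for.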
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/SetCommand.scala
##########
@@ -80,6 +82,22 @@ case class SetCommand(kv: Option[(String, Option[String])])
}
(keyValueOutput, runFunc)
+ case Some((DYN_ALLOCATION_MIN_EXECUTORS.key, Some(value))) =>
Review comment:
This is an administrative operation. I am not sure it's a good idea to control it through a plain SET command in a long-running service such as the Thrift server, since any user can easily run this command and misuse the feature.
How about exposing this via the REST API instead?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org