Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/02/01 16:12:52 UTC

[GitHub] [spark] attilapiros commented on a change in pull request #35345: [SPARK-37145][K8S] Add KubernetesCustom[Driver/Executor]FeatureConfigStep developer api

attilapiros commented on a change in pull request #35345:
URL: https://github.com/apache/spark/pull/35345#discussion_r796574328



##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
##########
@@ -351,7 +352,8 @@ private[spark] object Config extends Logging {
     ConfigBuilder("spark.kubernetes.executor.pod.featureSteps")
       .doc("Class name of an extra executor pod feature step implementing " +
         "KubernetesFeatureConfigStep. This is a developer API. Comma separated. " +
-        "Runs after all of Spark internal feature steps.")
+        "Runs after all of Spark internal feature steps. Since 3.3.0, you can extend your " +
+        "executor feature step by implementing `KubernetesExecutorCustomFeatureConfigStep`.")

Review comment:
       likewise
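
For context, a minimal sketch of what a user-supplied executor step could look like against this API. The `init(...)` signature is assumed from the scaladoc this PR adds to `KubernetesFeatureConfigStep.scala` (quoted further below); the class name, conf key and annotation are illustrative only, not part of the PR:

```scala
import io.fabric8.kubernetes.api.model.PodBuilder

import org.apache.spark.deploy.k8s.{KubernetesExecutorConf, SparkPod}
import org.apache.spark.deploy.k8s.features.KubernetesExecutorCustomFeatureConfigStep

// Hypothetical user-provided step; init(...) is assumed to hand over the executor conf.
class MyExecutorAnnotationStep extends KubernetesExecutorCustomFeatureConfigStep {
  private var execConf: KubernetesExecutorConf = _

  override def init(conf: KubernetesExecutorConf): Unit = {
    // Called before the step configures the pod, so the executor conf is available here.
    execConf = conf
  }

  override def configurePod(pod: SparkPod): SparkPod = {
    // Copy an arbitrary Spark conf value onto the executor pod as an annotation.
    val value = execConf.sparkConf.get("example.annotation.value", "unset")
    val newPod = new PodBuilder(pod.pod)
      .editOrNewMetadata()
        .addToAnnotations("example-annotation", value)
        .endMetadata()
      .build()
    SparkPod(newPod, pod.container)
  }
}
```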
   

##########
File path: docs/running-on-kubernetes.md
##########
@@ -1430,7 +1431,8 @@ See the [configuration page](configuration.html) for information on Spark config
   <td>
     Class names of an extra executor pod feature step implementing
     `KubernetesFeatureConfigStep`. This is a developer API. Comma separated.
-    Runs after all of Spark internal feature steps.
+    Runs after all of Spark internal feature steps. Since 3.3.0, you can extend your
+    executor feature step by implementing `KubernetesExecutorCustomFeatureConfigStep`.

Review comment:
       likewise
   

##########
File path: resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
##########
@@ -66,6 +66,69 @@ abstract class PodBuilderSuite extends SparkFunSuite {
     assert(pod.container.getVolumeMounts.asScala.exists(_.getName == "so_long_two"))
   }
 
+  test("SPARK-37145: configure a custom test step with base config") {
+    val client = mockKubernetesClient()
+    val sparkConf = baseConf.clone()
+      .set(userFeatureStepsConf.key,
+          "org.apache.spark.deploy.k8s.TestStepWithConf")
+      .set(templateFileConf.key, "template-file.yaml")
+      .set("test-features-key", "test-features-value")
+    val pod = buildPod(sparkConf, client)
+    verifyPod(pod)
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getAnnotations.containsKey("test-user-feature-annotation"))
+  }
+
+  test("SPARK-37145: configure a custom test step with driver or executor config") {
+    val client = mockKubernetesClient()
+    val (featureSteps, annotationKey) = this.getClass.getSimpleName match {

Review comment:
       We are already selecting `userFeatureStepsConf` based on the concrete implementation (although I would prefer helper functions in a base suite, used by the concrete implementations to create specific tests with the right parameterization, but at least let's follow the existing pattern).
   
   So let's have 
   ```scala 
   protected def userFeatureStepWithExpectedAnnotation: (Class[_], String)
   ```
   
   The `TestStepWithDrvConf` and `TestStepWithExecConf` should be moved into the specific suites (where the annotation content can be accessed by a `val` and reused in these two custom steps and in the `userFeatureStepWithExpectedAnnotation` function).
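
A rough sketch of how that hook and the per-suite overrides could fit together. These are fragments, not a compilable file: the test body, config helpers and `TestStepWithDrvConf` come from this PR, while the parameterized wiring is illustrative only:

```scala
// In the abstract PodBuilderSuite: each concrete suite supplies its own step
// class and the annotation key that step is expected to add.
protected def userFeatureStepWithExpectedAnnotation: (Class[_], String)

test("SPARK-37145: configure a custom test step with driver or executor config") {
  val client = mockKubernetesClient()
  val (stepClass, annotationKey) = userFeatureStepWithExpectedAnnotation
  val sparkConf = baseConf.clone()
    .set(templateFileConf.key, "template-file.yaml")
    .set(userFeatureStepsConf.key, stepClass.getName)
    .set("test-features-key", "test-features-value")
  val pod = buildPod(sparkConf, client)
  verifyPod(pod)
  assert(pod.pod.getMetadata.getAnnotations.get(annotationKey) === "test-features-value")
}

// In KubernetesDriverBuilderSuite, with TestStepWithDrvConf moved next to it and
// the annotation key kept in a reusable val:
val drvAnnotationKey = "test-drv-user-feature-annotation"

override protected def userFeatureStepWithExpectedAnnotation: (Class[_], String) =
  (classOf[TestStepWithDrvConf], drvAnnotationKey)
```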

##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
##########
@@ -341,7 +341,8 @@ private[spark] object Config extends Logging {
     ConfigBuilder("spark.kubernetes.driver.pod.featureSteps")
       .doc("Class names of an extra driver pod feature step implementing " +
         "KubernetesFeatureConfigStep. This is a developer API. Comma separated. " +
-        "Runs after all of Spark internal feature steps.")
+        "Runs after all of Spark internal feature steps. Since 3.3.0, you can extend your " +
+        "driver feature step by implementing `KubernetesDriverCustomFeatureConfigStep`.")

Review comment:
       likewise

##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -19,7 +19,43 @@ package org.apache.spark.deploy.k8s.features
 import io.fabric8.kubernetes.api.model.HasMetadata
 
 import org.apache.spark.annotation.{DeveloperApi, Unstable}
-import org.apache.spark.deploy.k8s.SparkPod
+import org.apache.spark.deploy.k8s.{KubernetesDriverConf, KubernetesExecutorConf, SparkPod}
+
+/**
+ * :: DeveloperApi ::
+ *
+ * A base class to help user extend custom feature step in driver side.
+ * Note: If your custom feature step would be used only in driver or both in driver and executor,
+ * please use this.
+ */
+@Unstable
+@DeveloperApi
+trait KubernetesDriverCustomFeatureConfigStep extends KubernetesFeatureConfigStep {
+  /**
+   * Initialize the configuration for driver user feature step, this only applicable when user
+   * specified `spark.kubernetes.driver.pod.featureSteps`, the init would be called after feature

Review comment:
       Nit: would => will
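
For context, a hedged sketch of a driver-side implementation of this trait. The `init(...)` signature is assumed from the scaladoc above; the class, ConfigMap and key names are made up. Besides `configurePod`, such a step can also use the `getAdditionalKubernetesResources()` hook that `KubernetesFeatureConfigStep` already exposes:

```scala
import io.fabric8.kubernetes.api.model.{ConfigMapBuilder, HasMetadata}

import org.apache.spark.deploy.k8s.{KubernetesDriverConf, SparkPod}
import org.apache.spark.deploy.k8s.features.KubernetesDriverCustomFeatureConfigStep

// Hypothetical driver-side step that creates an extra resource next to the driver pod.
class MyDriverConfigMapStep extends KubernetesDriverCustomFeatureConfigStep {
  private var driverConf: KubernetesDriverConf = _

  override def init(conf: KubernetesDriverConf): Unit = {
    driverConf = conf
  }

  // Leave the driver pod itself unchanged in this sketch.
  override def configurePod(pod: SparkPod): SparkPod = pod

  // Ship one value from the driver conf as a ConfigMap created alongside the pod.
  override def getAdditionalKubernetesResources(): Seq[HasMetadata] = {
    val cm = new ConfigMapBuilder()
      .withNewMetadata()
        .withName("example-driver-cm")
        .endMetadata()
      .addToData("example-key", driverConf.sparkConf.get("example.value", "unset"))
      .build()
    Seq(cm)
  }
}
```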

##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -19,7 +19,43 @@ package org.apache.spark.deploy.k8s.features
 import io.fabric8.kubernetes.api.model.HasMetadata
 
 import org.apache.spark.annotation.{DeveloperApi, Unstable}
-import org.apache.spark.deploy.k8s.SparkPod
+import org.apache.spark.deploy.k8s.{KubernetesDriverConf, KubernetesExecutorConf, SparkPod}
+
+/**
+ * :: DeveloperApi ::
+ *
+ * A base class to help user extend custom feature step in driver side.

Review comment:
       Nit: not a base class but a base interface or trait

##########
File path: resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
##########
@@ -66,6 +66,69 @@ abstract class PodBuilderSuite extends SparkFunSuite {
     assert(pod.container.getVolumeMounts.asScala.exists(_.getName == "so_long_two"))
   }
 
+  test("SPARK-37145: configure a custom test step with base config") {
+    val client = mockKubernetesClient()
+    val sparkConf = baseConf.clone()
+      .set(userFeatureStepsConf.key,
+          "org.apache.spark.deploy.k8s.TestStepWithConf")
+      .set(templateFileConf.key, "template-file.yaml")
+      .set("test-features-key", "test-features-value")
+    val pod = buildPod(sparkConf, client)
+    verifyPod(pod)
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getAnnotations.containsKey("test-user-feature-annotation"))
+  }
+
+  test("SPARK-37145: configure a custom test step with driver or executor config") {
+    val client = mockKubernetesClient()
+    val (featureSteps, annotationKey) = this.getClass.getSimpleName match {
+      case "KubernetesDriverBuilderSuite" =>
+        ("org.apache.spark.deploy.k8s.TestStepWithDrvConf",
+          "test-drv-user-feature-annotation")
+      case "KubernetesExecutorBuilderSuite" =>
+        ("org.apache.spark.deploy.k8s.TestStepWithExecConf",
+          "test-exec-user-feature-annotation")
+    }
+    val sparkConf = baseConf.clone()
+      .set(templateFileConf.key, "template-file.yaml")
+      .set(userFeatureStepsConf.key, featureSteps)
+      .set("test-features-key", "test-features-value")
+    val pod = buildPod(sparkConf, client)
+    verifyPod(pod)
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getAnnotations.containsKey(annotationKey))
+    assert(metadata.getAnnotations.get(annotationKey) === "test-features-value")
+  }
+
+  test("SPARK-37145: configure a custom test step with wrong type config") {
+    val client = mockKubernetesClient()
+    val featureSteps = this.getClass.getSimpleName match {

Review comment:
       likewise
   

##########
File path: docs/running-on-kubernetes.md
##########
@@ -1420,7 +1420,8 @@ See the [configuration page](configuration.html) for information on Spark config
   <td>
     Class names of an extra driver pod feature step implementing
     `KubernetesFeatureConfigStep`. This is a developer API. Comma separated.
-    Runs after all of Spark internal feature steps.
+    Runs after all of Spark internal feature steps. Since 3.3.0, you can extend your
+    executor feature step by implementing `KubernetesDriverCustomFeatureConfigStep`.

Review comment:
       Something like:
   ```suggestion
    Runs after all of Spark internal feature steps. Since 3.3.0, your driver feature step
    can implement `KubernetesDriverCustomFeatureConfigStep` where the driver config
    is also available.
   ```
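
As a usage note, wiring a custom driver step in through this config could look like the snippet below (e.g. in spark-shell or submission code; the class name reuses the hypothetical step sketched earlier, and any extra settings it reads travel on the same SparkConf):

```scala
import org.apache.spark.SparkConf

// Multiple step classes are comma separated, as documented above.
val conf = new SparkConf()
  .set("spark.kubernetes.driver.pod.featureSteps",
    "com.example.MyDriverConfigMapStep")
  // A value the step can pick up in init(...) via the driver conf.
  .set("example.value", "from-submission")
```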




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


