Posted to commits@spark.apache.org by do...@apache.org on 2023/06/23 20:49:57 UTC

[spark] branch branch-3.3 updated: [SPARK-44158][K8S] Remove unused `spark.kubernetes.executor.lostCheck.maxAttempts`

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.3 by this push:
     new d52aa4f1e7f [SPARK-44158][K8S] Remove unused `spark.kubernetes.executor.lostCheck.maxAttempts`
d52aa4f1e7f is described below

commit d52aa4f1e7f04c8593d1172ef9c40279d944a2f0
Author: Dongjoon Hyun <do...@apache.org>
AuthorDate: Fri Jun 23 13:49:24 2023 -0700

    [SPARK-44158][K8S] Remove unused `spark.kubernetes.executor.lostCheck.maxAttempts`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to remove `spark.kubernetes.executor.lostCheck.maxAttempts` because it has not been used since SPARK-24248 (Apache Spark 2.4.0).
    
    ### Why are the changes needed?
    
    To remove this configuration from the documentation and code.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, because it was already a no-op.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    Closes #41713 from dongjoon-hyun/SPARK-44158.
    
    Authored-by: Dongjoon Hyun <do...@apache.org>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
    (cherry picked from commit 6590e7db5212bb0dc90f22133a96e3d5e385af65)
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 docs/running-on-kubernetes.md                                  | 10 ----------
 .../src/main/scala/org/apache/spark/deploy/k8s/Config.scala    | 10 ----------
 2 files changed, 20 deletions(-)

diff --git a/docs/running-on-kubernetes.md b/docs/running-on-kubernetes.md
index 5a76e6155dc..163875f25c1 100644
--- a/docs/running-on-kubernetes.md
+++ b/docs/running-on-kubernetes.md
@@ -906,16 +906,6 @@ See the [configuration page](configuration.html) for information on Spark config
   </td>
   <td>2.3.0</td>
 </tr>
-<tr>
-  <td><code>spark.kubernetes.executor.lostCheck.maxAttempts</code></td>
-  <td><code>10</code></td>
-  <td>
-    Number of times that the driver will try to ascertain the loss reason for a specific executor.
-    The loss reason is used to ascertain whether the executor failure is due to a framework or an application error
-    which in turn decides whether the executor is removed and replaced, or placed into a failed state for debugging.
-  </td>
-  <td>2.3.0</td>
-</tr>
 <tr>
   <td><code>spark.kubernetes.submission.waitAppCompletion</code></td>
   <td><code>true</code></td>
diff --git a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
index 42a40d4e2c9..ed7f4a3c73b 100644
--- a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
+++ b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
@@ -438,16 +438,6 @@ private[spark] object Config extends Logging {
       .checkValue(value => value > 0, "Allocation executor timeout must be a positive time value.")
       .createWithDefaultString("600s")
 
-  val KUBERNETES_EXECUTOR_LOST_REASON_CHECK_MAX_ATTEMPTS =
-    ConfigBuilder("spark.kubernetes.executor.lostCheck.maxAttempts")
-      .doc("Maximum number of attempts allowed for checking the reason of an executor loss " +
-        "before it is assumed that the executor failed.")
-      .version("2.3.0")
-      .intConf
-      .checkValue(value => value > 0, "Maximum attempts of checks of executor lost reason " +
-        "must be a positive integer")
-      .createWithDefault(10)
-
   val WAIT_FOR_APP_COMPLETION =
     ConfigBuilder("spark.kubernetes.submission.waitAppCompletion")
       .doc("In cluster mode, whether to wait for the application to finish before exiting the " +


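For context on the removed `Config.scala` block above: Spark declares configuration entries through an internal `ConfigBuilder` DSL (in `org.apache.spark.internal.config`). The sketch below is a minimal, self-contained approximation of that fluent pattern; the `SimpleConfigBuilder` and `ConfigEntry` names are hypothetical stand-ins, not Spark's actual classes, and Spark's real builder supports far more (string/time/byte conf types, alternatives, fallbacks, etc.).

```scala
// Hypothetical minimal re-creation of the ConfigBuilder pattern seen in the
// removed code; Spark's real ConfigBuilder is considerably richer.
final case class ConfigEntry[T](
    key: String,
    doc: String,
    version: String,
    default: T,
    validator: T => Unit)

final class SimpleConfigBuilder(key: String) {
  private var docText: String = ""
  private var versionText: String = ""

  // Fluent setters return `this` so calls can be chained, as in Config.scala.
  def doc(text: String): SimpleConfigBuilder = { docText = text; this }
  def version(v: String): SimpleConfigBuilder = { versionText = v; this }

  // Fixes the value type, mirroring `.intConf` in the removed entry.
  def intConf: TypedBuilder[Int] =
    new TypedBuilder[Int](key, docText, versionText, _ => ())

  final class TypedBuilder[T](
      key: String, doc: String, version: String, validator: T => Unit) {
    // Compose the previous validator with a new predicate + error message.
    def checkValue(f: T => Boolean, msg: String): TypedBuilder[T] =
      new TypedBuilder[T](key, doc, version,
        v => { validator(v); require(f(v), msg) })

    def createWithDefault(default: T): ConfigEntry[T] = {
      validator(default) // fail fast if the default itself is invalid
      ConfigEntry(key, doc, version, default, validator)
    }
  }
}

object Demo {
  // Rebuilds the (now removed) entry using the hypothetical builder above.
  def buildEntry(): ConfigEntry[Int] =
    new SimpleConfigBuilder("spark.kubernetes.executor.lostCheck.maxAttempts")
      .doc("Maximum number of attempts allowed for checking the reason " +
        "of an executor loss before it is assumed that the executor failed.")
      .version("2.3.0")
      .intConf
      .checkValue(_ > 0, "Maximum attempts must be a positive integer")
      .createWithDefault(10)

  def main(args: Array[String]): Unit = {
    val entry = buildEntry()
    println(s"${entry.key} default=${entry.default}")
  }
}
```

Removing such an entry is safe precisely because nothing reads it: once SPARK-24248 dropped the only call site, the `ConfigEntry` value became dead code and the documented key a no-op.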