Posted to commits@spark.apache.org by do...@apache.org on 2022/11/28 20:39:52 UTC

[spark] branch branch-3.3 updated: [SPARK-41185][K8S][DOCS] Remove ARM limitation for YuniKorn from docs

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.3 by this push:
     new 00185e3d8f9 [SPARK-41185][K8S][DOCS] Remove ARM limitation for YuniKorn from docs
00185e3d8f9 is described below

commit 00185e3d8f9a5bea7238f1387543cf01a8fe1fd4
Author: Wilfred Spiegelenburg <wi...@apache.org>
AuthorDate: Mon Nov 28 12:39:27 2022 -0800

    [SPARK-41185][K8S][DOCS] Remove ARM limitation for YuniKorn from docs
    
    ### What changes were proposed in this pull request?
    Remove the limitations section from the K8s documentation for YuniKorn.
    
    ### Why are the changes needed?
    The limitations section is outdated: running Spark on ARM64 with Apache YuniKorn is fully supported from YuniKorn release 1.1.0 onwards, and 1.1.0 is the release referenced in the documentation.
    
    ### Does this PR introduce any user-facing change?
    No.
    
    ### How was this patch tested?
    Existing tests.
    
    Closes #38780 from wilfred-s/SPARK-41185.
    
    Authored-by: Wilfred Spiegelenburg <wi...@apache.org>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
    (cherry picked from commit bfc9e4ef111e21eee99407309ca6be278617d319)
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 docs/running-on-kubernetes.md | 4 ----
 1 file changed, 4 deletions(-)

diff --git a/docs/running-on-kubernetes.md b/docs/running-on-kubernetes.md
index f7f7ec539b8..5a76e6155dc 100644
--- a/docs/running-on-kubernetes.md
+++ b/docs/running-on-kubernetes.md
@@ -1842,10 +1842,6 @@ Submit Spark jobs with the following extra options:
 Note that `{{APP_ID}}` is the built-in variable that will be substituted with Spark job ID automatically.
 With the above configuration, the job will be scheduled by YuniKorn scheduler instead of the default Kubernetes scheduler.
 
-##### Limitations
-
-- Apache YuniKorn currently only supports x86 Linux, running Spark on ARM64 (or other platform) with Apache YuniKorn is not supported at present.
-
 ### Stage Level Scheduling Overview
 
 Stage level scheduling is supported on Kubernetes when dynamic allocation is enabled. This also requires <code>spark.dynamicAllocation.shuffleTracking.enabled</code> to be enabled since Kubernetes doesn't support an external shuffle service at this time. The order in which containers for different profiles is requested from Kubernetes is not guaranteed. Note that since dynamic allocation on Kubernetes requires the shuffle tracking feature, this means that executors from previous stages t [...]
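
For context, the YuniKorn section of running-on-kubernetes.md touched by this hunk tells users to submit Spark jobs with extra scheduler options so the driver and executor pods are placed by YuniKorn instead of the default Kubernetes scheduler. A minimal sketch of such a submission follows; the queue name root.default and the annotation keys are illustrative assumptions and should be checked against the YuniKorn integration documented for the Spark release in use:

    spark-submit \
      --master k8s://https://<k8s-apiserver-host>:<port> \
      --deploy-mode cluster \
      --conf spark.kubernetes.scheduler.name=yunikorn \
      --conf spark.kubernetes.driver.label.queue=root.default \
      --conf spark.kubernetes.executor.label.queue=root.default \
      --conf spark.kubernetes.driver.annotation.yunikorn.apache.org/app-id={{APP_ID}} \
      --conf spark.kubernetes.executor.annotation.yunikorn.apache.org/app-id={{APP_ID}} \
      ...   # application jar/class and remaining arguments as usual

As the context line above notes, {{APP_ID}} is substituted with the Spark job ID automatically, which lets YuniKorn group the driver and executor pods into a single application.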

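The stage level scheduling paragraph quoted above also notes that dynamic allocation with shuffle tracking must be enabled on Kubernetes, since no external shuffle service is available there. A minimal sketch of that prerequisite configuration, assuming the standard Spark property names, is:

    --conf spark.dynamicAllocation.enabled=true \
    --conf spark.dynamicAllocation.shuffleTracking.enabled=true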

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org