Posted to commits@spark.apache.org by do...@apache.org on 2022/01/25 02:17:26 UTC

[spark] branch branch-3.2 updated: [SPARK-37998][K8S][TESTS] Use `rbac.authorization.k8s.io/v1` instead of `v1beta1`

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.2 by this push:
     new e8a7032  [SPARK-37998][K8S][TESTS] Use `rbac.authorization.k8s.io/v1` instead of `v1beta1`
e8a7032 is described below

commit e8a70326e45d894808e9b1aa3ffa30aa50926c73
Author: Yikun Jiang <yi...@gmail.com>
AuthorDate: Mon Jan 24 18:15:27 2022 -0800

    [SPARK-37998][K8S][TESTS] Use `rbac.authorization.k8s.io/v1` instead of `v1beta1`
    
    ### What changes were proposed in this pull request?
    Before this patch:
    ```bash
    $ k apply -f  resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml
    namespace/spark created
    serviceaccount/spark-sa created
    unable to recognize "resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml": no matches for kind "ClusterRole" in version "rbac.authorization.k8s.io/v1beta1"
    unable to recognize "resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml": no matches for kind "ClusterRoleBinding" in version "rbac.authorization.k8s.io/v1beta1"
    ```
    
    This patch bumps the RBAC API version to `v1` to fix the "no matches" error when applying the manifest on a Kubernetes cluster set up with the latest minikube.
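
    To confirm which RBAC API versions a cluster actually serves, a quick check (a sketch, assuming `kubectl` is pointed at the minikube cluster) is:
    ```bash
    # List the served RBAC API versions. On Kubernetes v1.22+ only
    # rbac.authorization.k8s.io/v1 is printed, which is why the
    # v1beta1 manifest above fails to apply.
    $ kubectl api-versions | grep rbac.authorization.k8s.io
    ```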
    
    ### Why are the changes needed?
    
    The current `spark-rbac.yaml` fails to create the RBAC resources when Kubernetes is set up using the latest minikube release.
    
    As noted in the Kubernetes deprecation guide [1]:
    - The rbac.authorization.k8s.io/v1beta1 API version of ClusterRole, ClusterRoleBinding, Role, and RoleBinding is no longer served as of v1.22.
    - Migrate manifests and API clients to use the rbac.authorization.k8s.io/v1 API version, available since v1.8.
    
    We should use rbac `v1` here to avoid apply failures on Kubernetes v1.22+.
    
    [1] https://kubernetes.io/docs/reference/using-api/deprecation-guide/#rbac-resources-v122
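
    To spot any remaining deprecated-API usage before upgrading, one option (a sketch; it assumes access to the standard apiserver metrics endpoint) is to query the `apiserver_requested_deprecated_apis` metric:
    ```bash
    # Each reported series names a deprecated group/version that clients
    # have recently requested, e.g. rbac.authorization.k8s.io/v1beta1.
    $ kubectl get --raw /metrics | grep apiserver_requested_deprecated_apis
    ```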
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    ```bash
    $ k apply -f spark-rbac.yaml
    namespace/spark unchanged
    serviceaccount/spark-sa unchanged
    clusterrole.rbac.authorization.k8s.io/spark-role created
    clusterrolebinding.rbac.authorization.k8s.io/spark-role-binding created
    ```
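
    As an additional check (a hedged example; the service account and namespace names come from the manifest in this patch), the recreated binding can be exercised with `kubectl auth can-i`:
    ```bash
    # Verify that the spark-sa service account in the spark namespace
    # is granted access through the new ClusterRoleBinding.
    $ kubectl auth can-i list pods --as=system:serviceaccount:spark:spark-sa
    ```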
    
    Closes #35300 from Yikun/SPARK-37998.
    
    Authored-by: Yikun Jiang <yi...@gmail.com>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
    (cherry picked from commit 32d14b5622e8930fe45a1a8d6c71aa1cc0415cfd)
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml b/resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml
index a4c242f..f6b8b10 100644
--- a/resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml
+++ b/resource-managers/kubernetes/integration-tests/dev/spark-rbac.yaml
@@ -26,7 +26,7 @@ metadata:
   name: spark-sa
   namespace: spark
 ---
-apiVersion: rbac.authorization.k8s.io/v1beta1
+apiVersion: rbac.authorization.k8s.io/v1
 kind: ClusterRole
 metadata:
   name: spark-role
@@ -38,7 +38,7 @@ rules:
   verbs:
   - "*"
 ---
-apiVersion: rbac.authorization.k8s.io/v1beta1
+apiVersion: rbac.authorization.k8s.io/v1
 kind: ClusterRoleBinding
 metadata:
   name: spark-role-binding
@@ -49,4 +49,4 @@ subjects:
 roleRef:
   kind: ClusterRole
   name: spark-role
-  apiGroup: rbac.authorization.k8s.io
\ No newline at end of file
+  apiGroup: rbac.authorization.k8s.io

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org