Posted to commits@spark.apache.org by ya...@apache.org on 2023/12/02 19:28:54 UTC
(spark) branch master updated: [SPARK-46210][K8S][DOCS] Update `YuniKorn` docs with v1.4
This is an automated email from the ASF dual-hosted git repository.
yangjie01 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 37debbbbb1c1 [SPARK-46210][K8S][DOCS] Update `YuniKorn` docs with v1.4
37debbbbb1c1 is described below
commit 37debbbbb1c1962ca742e5adf9fe902a4aa792a0
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Sun Dec 3 03:28:42 2023 +0800
[SPARK-46210][K8S][DOCS] Update `YuniKorn` docs with v1.4
### What changes were proposed in this pull request?
This PR aims to update `YuniKorn` docs with v1.4 for Apache Spark 4.0.0.
### Why are the changes needed?
Apache YuniKorn v1.4.0 was released on 2023-11-20 with 270 resolved JIRAs.
- https://yunikorn.apache.org/release-announce/1.4.0
- A `PreEnqueue` plugin for the `SchedulingGate` feature, based on [KEP-3521: Pod Scheduling Readiness](https://github.com/kubernetes/enhancements/blob/master/keps/sig-scheduling/3521-pod-scheduling-readiness/README.md)
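For context, KEP-3521 adds a `schedulingGates` list to the pod spec: the scheduler keeps the pod in a `SchedulingGated` state until every gate is removed. A minimal sketch of such a pod (the pod name, gate name, and image here are illustrative placeholders, not values YuniKorn itself uses):

```yaml
# Illustrative manifest only: the pod stays SchedulingGated until
# a controller removes all entries from spec.schedulingGates.
apiVersion: v1
kind: Pod
metadata:
  name: gated-example            # placeholder name
spec:
  schedulingGates:
  - name: example.com/wait-for-quota   # placeholder gate name
  containers:
  - name: main
    image: registry.k8s.io/pause:3.9
```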
I installed YuniKorn v1.4.0 on K8s 1.28 and tested manually.
**K8s v1.28**
```
$ kubectl version
Client Version: v1.28.4
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.28.2
```
**YuniKorn v1.4**
```
$ helm list -n yunikorn
NAME NAMESPACE REVISION UPDATED STATUS CHART APP VERSION
yunikorn yunikorn 1 2023-12-01 19:02:54.63097 -0800 PST deployed yunikorn-1.4.0
```
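Assuming the install followed the docs' helm command (with `embedAdmissionController=false`), a quick hedged check that the scheduler is actually up before running the suite; pod names come from the default chart and require cluster access:

```bash
# Expect a yunikorn-scheduler-* pod in Running state; no separate
# admission-controller pod since embedAdmissionController=false.
kubectl get pods -n yunikorn
```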
```
$ build/sbt -Pkubernetes -Pkubernetes-integration-tests -Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/testOnly *.YuniKornSuite" -Dtest.exclude.tags=minikube,local,decom,r -Dtest.default.exclude.tags=
...
[info] YuniKornSuite:
[info] - SPARK-42190: Run SparkPi with local[*] (11 seconds, 592 milliseconds)
[info] - Run SparkPi with no resources (15 seconds, 437 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (20 seconds, 243 milliseconds)
[info] - Run SparkPi with a very long application name. (15 seconds, 231 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (20 seconds, 240 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (15 seconds, 233 milliseconds)
[info] - Run SparkPi with an argument. (21 seconds, 378 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (15 seconds, 231 milliseconds)
[info] - All pods have the same service account by default (20 seconds, 295 milliseconds)
[info] - Run extraJVMOptions check on driver (11 seconds, 300 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - G1GC (12 seconds, 183 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - Other GC (11 seconds, 837 milliseconds)
[info] - SPARK-42769: All executor pods have SPARK_DRIVER_POD_IP env variable (15 seconds, 499 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (19 seconds, 881 milliseconds)
[info] - Run SparkPi with env and mount secrets. (22 seconds, 842 milliseconds)
[info] - Run PySpark on simple pi.py example (16 seconds, 319 milliseconds)
[info] - Run PySpark to test a pyfiles example (21 seconds, 599 milliseconds)
[info] - Run PySpark with memory customization (15 seconds, 355 milliseconds)
[info] - Run in client mode. (5 seconds, 120 milliseconds)
[info] - Start pod creation from template (15 seconds, 484 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (15 seconds, 427 milliseconds)
[info] Run completed in 7 minutes, 48 seconds.
[info] Total number of tests run: 21
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 21, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 485 s (08:05), completed Dec 1, 2023, 7:36:56 PM
```
```
$ k describe pod -l spark-role=driver -n spark-ce267a8f046f4785a4f42763c5b3851c
...
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Scheduling 7s yunikorn spark-ce267a8f046f4785a4f42763c5b3851c/spark-test-app-c94236f72b6b438c80b00226cbc0e95c-driver is queued and waiting for allocation
Normal Scheduled 7s yunikorn Successfully assigned spark-ce267a8f046f4785a4f42763c5b3851c/spark-test-app-c94236f72b6b438c80b00226cbc0e95c-driver to node docker-desktop
Normal PodBindSuccessful 7s yunikorn Pod spark-ce267a8f046f4785a4f42763c5b3851c/spark-test-app-c94236f72b6b438c80b00226cbc0e95c-driver is successfully bound to node docker-desktop
Normal Pulled 7s kubelet Container image "docker.io/kubespark/spark:dev" already present on machine
Normal Created 7s kubelet Created container spark-kubernetes-driver
Normal Started 7s kubelet Started container spark-kubernetes-driver
```
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manual review.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #44117 from dongjoon-hyun/SPARK-46210.
Authored-by: Dongjoon Hyun <dh...@apple.com>
Signed-off-by: yangjie01 <ya...@baidu.com>
---
docs/running-on-kubernetes.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/running-on-kubernetes.md b/docs/running-on-kubernetes.md
index cc70c025792f..4b4dc9d304fb 100644
--- a/docs/running-on-kubernetes.md
+++ b/docs/running-on-kubernetes.md
@@ -1927,10 +1927,10 @@ Install Apache YuniKorn:
```bash
helm repo add yunikorn https://apache.github.io/yunikorn-release
helm repo update
-helm install yunikorn yunikorn/yunikorn --namespace yunikorn --version 1.3.0 --create-namespace --set embedAdmissionController=false
+helm install yunikorn yunikorn/yunikorn --namespace yunikorn --version 1.4.0 --create-namespace --set embedAdmissionController=false
```
-The above steps will install YuniKorn v1.3.0 on an existing Kubernetes cluster.
+The above steps will install YuniKorn v1.4.0 on an existing Kubernetes cluster.
##### Get started
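Once YuniKorn is installed, jobs opt in per submission via `spark.kubernetes.scheduler.name`, which routes driver and executor pods to YuniKorn instead of the default scheduler. A hedged sketch of such a submission (the API server URL and image are placeholders; the queue label follows the docs' `root.default` example):

```bash
# Placeholder master URL and image; scheduler.name is the key setting.
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.kubernetes.scheduler.name=yunikorn \
  --conf spark.kubernetes.driver.label.queue=root.default \
  --conf spark.kubernetes.executor.label.queue=root.default \
  local:///opt/spark/examples/jars/spark-examples.jar
```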