Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/23 17:57:40 UTC

[GitHub] [spark] yangwwei commented on a diff in pull request #37622: [SPARK-40187][DOCS] Add `Apache YuniKorn` scheduler docs

yangwwei commented on code in PR #37622:
URL: https://github.com/apache/spark/pull/37622#discussion_r952954104


##########
docs/running-on-kubernetes.md:
##########
@@ -1811,6 +1811,50 @@ spec:
   queue: default
 ```
 
+#### Using Apache YuniKorn as a Customized Scheduler for Spark on Kubernetes
+
+[Apache YuniKorn](https://yunikorn.apache.org/) is a resource scheduler for Kubernetes that provides advanced batch scheduling
+capabilities, such as job queuing, resource fairness, min/max queue capacity and flexible job ordering policies.
+For available Apache YuniKorn features, please refer to [this doc](https://yunikorn.apache.org/docs/next/get_started/core_features).

Review Comment:
   hi @dongjoon-hyun This is how the doc site works: `next` is the current under-development version, so we shouldn't link to it; that's a good point. But I think we can link to the latest stable version instead, which is served at https://yunikorn.apache.org/docs/. Only past versions are accessible via https://yunikorn.apache.org/docs/{VERSION_NUM}, which is why you did not see 1.0.0 there: 1.0.0 is the current stable version.
   
   If we hard-code a version, e.g. 1.0.0, we will need to come back and update the doc quite often, which I don't think is good. So my question is: should we link to the latest stable version here, or to a hard-coded version that will need updates over time? Please let me know, thanks!
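For context, the queue assignment shown in the quoted doc addition would typically be exercised through a submission command along these lines. This is a sketch only: `spark.kubernetes.scheduler.name` is the conf Spark uses to set the pod's `schedulerName`, while the `queue` label key and `root.default` queue name are assumptions about how a particular YuniKorn deployment maps pods to queues.

```bash
# Sketch: submit a Spark app scheduled by YuniKorn instead of the default
# kube-scheduler. The apiserver address, image, and queue label key are
# placeholders/assumptions; adjust to your cluster and YuniKorn config.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<port> \
  --deploy-mode cluster \
  --conf spark.kubernetes.scheduler.name=yunikorn \
  --conf spark.kubernetes.driver.label.queue=root.default \
  --conf spark.kubernetes.executor.label.queue=root.default \
  --class org.apache.spark.examples.SparkPi \
  local:///opt/spark/examples/jars/spark-examples.jar
```

With the scheduler name set, the driver and executor pods carry `schedulerName: yunikorn`, so YuniKorn (rather than the default scheduler) places them according to its queue capacities and ordering policies.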



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

