Posted to reviews@yunikorn.apache.org by GitBox <gi...@apache.org> on 2022/10/07 22:27:41 UTC

[GitHub] [yunikorn-site] wilfred-s opened a new pull request, #192: [YUNIKORN-1345] Updating the Spark tutorial i18n

wilfred-s opened a new pull request, #192:
URL: https://github.com/apache/yunikorn-site/pull/192

   The fix includes the correct assets link for the i18n pages from the 1.1.0 release.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@yunikorn.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [yunikorn-site] wilfred-s commented on a diff in pull request #192: [YUNIKORN-1345] Updating the Spark tutorial i18n

Posted by GitBox <gi...@apache.org>.
wilfred-s commented on code in PR #192:
URL: https://github.com/apache/yunikorn-site/pull/192#discussion_r990706804


##########
i18n/zh-cn/docusaurus-plugin-content-docs/current/user_guide/workloads/run_spark.md:
##########
@@ -104,28 +108,39 @@ EOF
 kubectl proxy
 ```
 
-运行一个简单的 SparkPi 作业(这假设Spark二进制文件已安装到 `/usr/local` 目录)。
+[dockerhub](https://hub.docker.com/r/apache/spark/tags)中有不同spark版本的官方镜像
+运行一个简单的 SparkPi 作业,假设 Spark 二进制文件本地安装在 `/usr/local` 目录中。
 ```shell script
-export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7/
+export SPARK_HOME=/usr/local/spark/
 ${SPARK_HOME}/bin/spark-submit --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.namespace=spark-test \
    --conf spark.kubernetes.executor.request.cores=1 \
-   --conf spark.kubernetes.container.image=apache/yunikorn:spark-2.4.4 \
+   --conf spark.kubernetes.container.image=docker.io/spark:v3.3.0 \
    --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-test:spark \
-   local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
+   local:///opt/spark/examples/jars/spark-examples_2.12-3.3.0.jar
+```
+:::note
+在 [spark](https://spark.apache.org/docs/latest/running-on-kubernetes.html#configuration) 中有更多设置驱动程序和执行程序的选项。
+可以分配 applicationId 和队列路径。
 ```
+--conf spark.kubernetes.executor.label.applicationId=application-spark-0001
+--conf spark.kubernetes.driver.label.applicationId=application-spark-0001
+--conf spark.kubernetes.executor.label.queue=default.root.sandbox
+--conf spark.kubernetes.driver.label.queue=default.root.sandbox
+```
+:::
 
 您可以看见Spark的driver和executors在Kubernetes上创建:
 
-![spark-pods](./../../assets/spark-pods.png)
+![spark-pods](./../../assets/RunningSparkOnK8s.png)
 
-您还可以从 YuniKorn UI 查看作业信息。如果您不知道如何访问 YuniKorn UI,请阅读文档
+spark-pi 结果在驱动程序舱中。

Review Comment:
   updated
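Pulling the review feedback together, the submit command in the updated tutorial would read roughly as below. This is a minimal sketch, not the merged doc text: it applies the reviewer's image-path correction (`docker.io/apache/spark:v3.3.0` → `docker.io/apache/spark:v3.3.0` under the `apache/` namespace) and drops the duplicated `--master`/`--deploy-mode`/`--name` flags visible in the hunk; the command is only echoed so it can be inspected without a cluster.

```shell
# Sketch of the corrected spark-submit invocation from the diff above.
# Assumes Spark is unpacked at /usr/local/spark/ as in the tutorial;
# the image tag follows the reviewer's suggestion (apache/spark namespace).
export SPARK_HOME=/usr/local/spark/
SPARK_IMAGE=docker.io/apache/spark:v3.3.0

SUBMIT_CMD="${SPARK_HOME}/bin/spark-submit \
  --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=1 \
  --conf spark.kubernetes.namespace=spark-test \
  --conf spark.kubernetes.executor.request.cores=1 \
  --conf spark.kubernetes.container.image=${SPARK_IMAGE} \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-test:spark \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.3.0.jar"

# Echo instead of executing, so the sketch is safe to run without a cluster:
echo "$SUBMIT_CMD"
```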





[GitHub] [yunikorn-site] 0yukali0 commented on a diff in pull request #192: [YUNIKORN-1345] Updating the Spark tutorial i18n

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on code in PR #192:
URL: https://github.com/apache/yunikorn-site/pull/192#discussion_r990593100


##########
i18n/zh-cn/docusaurus-plugin-content-docs/current/user_guide/workloads/run_spark.md:
##########
@@ -104,28 +108,39 @@ EOF
 kubectl proxy
 ```
 
-运行一个简单的 SparkPi 作业(这假设Spark二进制文件已安装到 `/usr/local` 目录)。
+[dockerhub](https://hub.docker.com/r/apache/spark/tags)中有不同spark版本的官方镜像
+运行一个简单的 SparkPi 作业,假设 Spark 二进制文件本地安装在 `/usr/local` 目录中。
 ```shell script
-export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7/
+export SPARK_HOME=/usr/local/spark/
 ${SPARK_HOME}/bin/spark-submit --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.namespace=spark-test \
    --conf spark.kubernetes.executor.request.cores=1 \
-   --conf spark.kubernetes.container.image=apache/yunikorn:spark-2.4.4 \
+   --conf spark.kubernetes.container.image=docker.io/spark:v3.3.0 \
    --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-test:spark \
-   local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
+   local:///opt/spark/examples/jars/spark-examples_2.12-3.3.0.jar
+```
+:::note
+在 [spark](https://spark.apache.org/docs/latest/running-on-kubernetes.html#configuration) 中有更多设置驱动程序和执行程序的选项。
+可以分配 applicationId 和队列路径。
 ```
+--conf spark.kubernetes.executor.label.applicationId=application-spark-0001
+--conf spark.kubernetes.driver.label.applicationId=application-spark-0001
+--conf spark.kubernetes.executor.label.queue=default.root.sandbox
+--conf spark.kubernetes.driver.label.queue=default.root.sandbox
+```
+:::
 
 您可以看见Spark的driver和executors在Kubernetes上创建:
 
-![spark-pods](./../../assets/spark-pods.png)
+![spark-pods](./../../assets/RunningSparkOnK8s.png)
 
-您还可以从 YuniKorn UI 查看作业信息。如果您不知道如何访问 YuniKorn UI,请阅读文档
+spark-pi 结果在驱动程序舱中。
 [链接](../../get_started/get_started.md#访问-web-ui).

Review Comment:
   ../../get_started/get_started.md#访问-web-ui
   This png should be removed.





[GitHub] [yunikorn-site] 0yukali0 commented on a diff in pull request #192: [YUNIKORN-1345] Updating the Spark tutorial i18n

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on code in PR #192:
URL: https://github.com/apache/yunikorn-site/pull/192#discussion_r990592946


##########
i18n/zh-cn/docusaurus-plugin-content-docs/current/user_guide/workloads/run_spark.md:
##########
@@ -104,28 +108,39 @@ EOF
 kubectl proxy
 ```
 
-运行一个简单的 SparkPi 作业(这假设Spark二进制文件已安装到 `/usr/local` 目录)。
+[dockerhub](https://hub.docker.com/r/apache/spark/tags)中有不同spark版本的官方镜像
+运行一个简单的 SparkPi 作业,假设 Spark 二进制文件本地安装在 `/usr/local` 目录中。
 ```shell script
-export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7/
+export SPARK_HOME=/usr/local/spark/
 ${SPARK_HOME}/bin/spark-submit --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.namespace=spark-test \
    --conf spark.kubernetes.executor.request.cores=1 \
-   --conf spark.kubernetes.container.image=apache/yunikorn:spark-2.4.4 \
+   --conf spark.kubernetes.container.image=docker.io/spark:v3.3.0 \

Review Comment:
   docker.io/spark:v3.3.0 should be docker.io/apache/spark:v3.3.0



##########
i18n/zh-cn/docusaurus-plugin-content-docs/current/user_guide/workloads/run_spark.md:
##########
@@ -104,28 +108,39 @@ EOF
 kubectl proxy
 ```
 
-运行一个简单的 SparkPi 作业(这假设Spark二进制文件已安装到 `/usr/local` 目录)。
+[dockerhub](https://hub.docker.com/r/apache/spark/tags)中有不同spark版本的官方镜像
+运行一个简单的 SparkPi 作业,假设 Spark 二进制文件本地安装在 `/usr/local` 目录中。
 ```shell script
-export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7/
+export SPARK_HOME=/usr/local/spark/
 ${SPARK_HOME}/bin/spark-submit --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.namespace=spark-test \
    --conf spark.kubernetes.executor.request.cores=1 \
-   --conf spark.kubernetes.container.image=apache/yunikorn:spark-2.4.4 \
+   --conf spark.kubernetes.container.image=docker.io/spark:v3.3.0 \
    --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-test:spark \
-   local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
+   local:///opt/spark/examples/jars/spark-examples_2.12-3.3.0.jar
+```
+:::note
+在 [spark](https://spark.apache.org/docs/latest/running-on-kubernetes.html#configuration) 中有更多设置驱动程序和执行程序的选项。
+可以分配 applicationId 和队列路径。
 ```
+--conf spark.kubernetes.executor.label.applicationId=application-spark-0001
+--conf spark.kubernetes.driver.label.applicationId=application-spark-0001
+--conf spark.kubernetes.executor.label.queue=default.root.sandbox
+--conf spark.kubernetes.driver.label.queue=default.root.sandbox
+```
+:::
 
 您可以看见Spark的driver和executors在Kubernetes上创建:
 
-![spark-pods](./../../assets/spark-pods.png)
+![spark-pods](./../../assets/RunningSparkOnK8s.png)
 
-您还可以从 YuniKorn UI 查看作业信息。如果您不知道如何访问 YuniKorn UI,请阅读文档
+spark-pi 结果在驱动程序舱中。
 [链接](../../get_started/get_started.md#访问-web-ui).

Review Comment:
   This png should be removed.



##########
i18n/zh-cn/docusaurus-plugin-content-docs/current/user_guide/workloads/run_spark.md:
##########
@@ -104,28 +108,39 @@ EOF
 kubectl proxy
 ```
 
-运行一个简单的 SparkPi 作业(这假设Spark二进制文件已安装到 `/usr/local` 目录)。
+[dockerhub](https://hub.docker.com/r/apache/spark/tags)中有不同spark版本的官方镜像
+运行一个简单的 SparkPi 作业,假设 Spark 二进制文件本地安装在 `/usr/local` 目录中。
 ```shell script
-export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7/
+export SPARK_HOME=/usr/local/spark/
 ${SPARK_HOME}/bin/spark-submit --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.namespace=spark-test \
    --conf spark.kubernetes.executor.request.cores=1 \
-   --conf spark.kubernetes.container.image=apache/yunikorn:spark-2.4.4 \
+   --conf spark.kubernetes.container.image=docker.io/spark:v3.3.0 \
    --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-test:spark \
-   local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
+   local:///opt/spark/examples/jars/spark-examples_2.12-3.3.0.jar
+```
+:::note
+在 [spark](https://spark.apache.org/docs/latest/running-on-kubernetes.html#configuration) 中有更多设置驱动程序和执行程序的选项。
+可以分配 applicationId 和队列路径。
 ```
+--conf spark.kubernetes.executor.label.applicationId=application-spark-0001
+--conf spark.kubernetes.driver.label.applicationId=application-spark-0001
+--conf spark.kubernetes.executor.label.queue=default.root.sandbox
+--conf spark.kubernetes.driver.label.queue=default.root.sandbox
+```
+:::
 
 您可以看见Spark的driver和executors在Kubernetes上创建:
 
-![spark-pods](./../../assets/spark-pods.png)
+![spark-pods](./../../assets/RunningSparkOnK8s.png)
 
-您还可以从 YuniKorn UI 查看作业信息。如果您不知道如何访问 YuniKorn UI,请阅读文档
+spark-pi 结果在驱动程序舱中。

Review Comment:
   spark-pi结果在 driver pod中。
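The note in the hunk shows how a YuniKorn applicationId and queue path can be attached via pod labels. A small sketch of those extra `--conf` flags, again only echoed so it runs without a cluster (the driver pod name in the trailing comment is an assumption for illustration):

```shell
# YuniKorn label confs from the tutorial's note: tag both driver and
# executors with the same applicationId and queue path.
LABEL_CONFS="--conf spark.kubernetes.executor.label.applicationId=application-spark-0001 \
  --conf spark.kubernetes.driver.label.applicationId=application-spark-0001 \
  --conf spark.kubernetes.executor.label.queue=default.root.sandbox \
  --conf spark.kubernetes.driver.label.queue=default.root.sandbox"
echo "$LABEL_CONFS"

# As the reviewer notes, the SparkPi result ends up in the driver pod's log.
# On a real cluster it would be read with something like (pod name assumed):
#   kubectl -n spark-test logs spark-pi-driver
```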





[GitHub] [yunikorn-site] 0yukali0 merged pull request #192: [YUNIKORN-1345] Updating the Spark tutorial i18n

Posted by GitBox <gi...@apache.org>.
0yukali0 merged PR #192:
URL: https://github.com/apache/yunikorn-site/pull/192




[GitHub] [yunikorn-site] wilfred-s commented on a diff in pull request #192: [YUNIKORN-1345] Updating the Spark tutorial i18n

Posted by GitBox <gi...@apache.org>.
wilfred-s commented on code in PR #192:
URL: https://github.com/apache/yunikorn-site/pull/192#discussion_r990706830


##########
i18n/zh-cn/docusaurus-plugin-content-docs/current/user_guide/workloads/run_spark.md:
##########
@@ -104,28 +108,39 @@ EOF
 kubectl proxy
 ```
 
-运行一个简单的 SparkPi 作业(这假设Spark二进制文件已安装到 `/usr/local` 目录)。
+[dockerhub](https://hub.docker.com/r/apache/spark/tags)中有不同spark版本的官方镜像
+运行一个简单的 SparkPi 作业,假设 Spark 二进制文件本地安装在 `/usr/local` 目录中。
 ```shell script
-export SPARK_HOME=/usr/local/spark-2.4.4-bin-hadoop2.7/
+export SPARK_HOME=/usr/local/spark/
 ${SPARK_HOME}/bin/spark-submit --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --master k8s://http://localhost:8001 --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.namespace=spark-test \
    --conf spark.kubernetes.executor.request.cores=1 \
-   --conf spark.kubernetes.container.image=apache/yunikorn:spark-2.4.4 \
+   --conf spark.kubernetes.container.image=docker.io/spark:v3.3.0 \

Review Comment:
   updated




