Posted to reviews@yunikorn.apache.org by GitBox <gi...@apache.org> on 2022/03/08 08:35:02 UTC

[GitHub] [incubator-yunikorn-site] 0yukali0 opened a new pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

0yukali0 opened a new pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134


   Why do we need this issue?
   To describe how to use the perf-tool in the YuniKorn release.
   
   Issue link:
   https://issues.apache.org/jira/browse/YUNIKORN-951


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@yunikorn.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-yunikorn-site] 0yukali0 commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r825486906



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|

Review comment:
       Hi @yangwwei, queue fairness isn't supported in this perf-tool right now.
   Maybe we can create an issue to track it?
   I think we can implement it based on node fairness in perf-tools:
   1. node capacity -> queue resource
   2. change (allocated resource of pods * 10 / node allocatable resource) to (resource of pods belonging to a queue / queue resource)
   
   For now, I am going to add some screenshots and descriptions to show how to set it up step by step.
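The adaptation sketched in this comment reduces to computing the same used/capacity ratio over a different denominator: the node's allocatable resource becomes the queue's resource. A minimal sketch of that shared fairness signal (hypothetical names, not the actual perf-tools code):

```go
package main

import "fmt"

// usageRatio reports used/capacity, the fairness signal discussed above.
// For node fairness, "used" is the resource allocated to pods on a node;
// for the proposed queue fairness, it would be the resource consumed by
// pods under a queue, divided by the queue's resource.
// (Illustrative only; not the actual perf-tools implementation.)
func usageRatio(used, capacity int64) float64 {
	if capacity == 0 {
		return 0 // avoid division by zero for an empty/unbounded capacity
	}
	return float64(used) / float64(capacity)
}

func main() {
	// Node fairness: 6 pods * 2000m CPU allocated on a 16000m node.
	fmt.Printf("node ratio:  %.2f\n", usageRatio(6*2000, 16000))
	// Queue fairness: 6000m consumed under a queue with an 8000m resource.
	fmt.Printf("queue ratio: %.2f\n", usageRatio(6000, 8000))
}
```

Comparing these ratios across nodes (or queues) over time is what the fairness diagrams visualize: the closer the ratios track each other, the fairer the scheduler.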







[GitHub] [incubator-yunikorn-site] yangwwei commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
yangwwei commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r825493263



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|

Review comment:
       Oh, I see, maybe that wasn't included in this package. I remember we have some results here in the past: https://yunikorn.apache.org/docs/0.8.0/performance/evaluate_perf_function_with_kubemark#resource-fairness-between-queues. It's fine, we don't need to address this now. Thanks for checking this.







[GitHub] [incubator-yunikorn-site] 0yukali0 commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r824318619



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 
+```
+git clone https://github.com/apache/incubator-yunikorn-release.git
+```
+Go to performance tool directory and build it
+```
+cd incubator-yunikorn-release/perf-tools/
+go mod tidy
+go build
+```
+It will look like this.
+![Build-perf-tools](./../assets/perf-tutorial-build.png)
+
+### 3. Set test configuration
+Before start tests, check configuration whether meet your except.
+Default output path is `\tmp`, you can modify `common.outputrootpath` to change it.
+In each scenarios, it contains followings and we can set
+
+|	field			|			description					|
+| ----------------------------- | --------------------------------------------------------------------- |
+|	schedulerNames		|	List of scheduler will run these cases 				|
+|	showNumOfLastTasks	|	Show the last tasks in scheduling				|

Review comment:
       Ok.







[GitHub] [incubator-yunikorn-site] 0yukali0 commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r824318257



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 
+```
+git clone https://github.com/apache/incubator-yunikorn-release.git
+```
+Go to performance tool directory and build it
+```
+cd incubator-yunikorn-release/perf-tools/
+go mod tidy
+go build
+```
+It will look like this.
+![Build-perf-tools](./../assets/perf-tutorial-build.png)
+
+### 3. Set test configuration
+Before start tests, check configuration whether meet your except.
+Default output path is `\tmp`, you can modify `common.outputrootpath` to change it.
+In each scenarios, it contains followings and we can set
+
+|	field			|			description					|
+| ----------------------------- | --------------------------------------------------------------------- |
+|	schedulerNames		|	List of scheduler will run these cases 				|

Review comment:
       Yes.
   In the throughput and fairness cases, this tool can run whichever schedulers we specify;
   it changes the scheduler name in the pod spec field.
   I will explain the fields for each case separately in the next commit.
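As a concrete illustration of "change scheduler name in pod spec": pods submitted by the tool would carry the standard Kubernetes `spec.schedulerName` field, set per entry in `schedulerNames`. An illustrative snippet (pod name and image are made up):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: perf-test-pod
spec:
  schedulerName: yunikorn   # one entry from schedulerNames; default-scheduler otherwise
  containers:
    - name: sleep
      image: "alpine:latest"
      command: ["sleep", "30"]
```

This is how the same workload can be replayed against multiple schedulers for comparison: only `schedulerName` changes between runs.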







[GitHub] [incubator-yunikorn-site] yangwwei commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
yangwwei commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r825342509



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|

Review comment:
       thanks, any updates?







[GitHub] [incubator-yunikorn-site] yangwwei commented on pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
yangwwei commented on pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#issuecomment-1066166715


   hi @0yukali0 have you done the updates? I can take a look again if all my previous comments were addressed. Thanks!





[GitHub] [incubator-yunikorn-site] 0yukali0 commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r824318351



##########
File path: package.json
##########
@@ -12,7 +12,7 @@
   "dependencies": {
     "@docusaurus/core": "^2.0.0-beta.15",
     "@docusaurus/preset-classic": "^2.0.0-beta.15",
-    "@docusaurus/theme-search-algolia": "^2.0.0-beta.15",
+    "@docusaurus/theme-search-algolia": "^2.0.0-beta.17",

Review comment:
       Ok







[GitHub] [incubator-yunikorn-site] yangwwei commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
yangwwei commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r822296060



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 

Review comment:
       The performance tool is available in [yunikorn release repo](https://github.com/apache/incubator-yunikorn-release.git), clone the repo to your local workspace.

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|

Review comment:
       I would rather not call this e2e test, we usually say e2e test while doing some functional testing on integrated envs. I think we can remove this row, just focus on the rest. BTW, I think the tool supports Queue fairness if I remember correctly. could you please double-check?

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 
+```
+git clone https://github.com/apache/incubator-yunikorn-release.git
+```
+Go to performance tool directory and build it
+```
+cd incubator-yunikorn-release/perf-tools/
+go mod tidy
+go build
+```
+It will look like this.
+![Build-perf-tools](./../assets/perf-tutorial-build.png)
+
+### 3. Set test configuration
+Before start tests, check configuration whether meet your except.
+Default output path is `\tmp`, you can modify `common.outputrootpath` to change it.
+In each scenarios, it contains followings and we can set
+
+|	field			|			description					|
+| ----------------------------- | --------------------------------------------------------------------- |
+|	schedulerNames		|	List of scheduler will run these cases 				|

Review comment:
       You can actually set 2 scheduler names? I wasn't aware of that.
   I think you need to add some more description to explain this a bit more.

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|

Review comment:
       Measure schedulers' throughput by calculating how many pods are allocated per second based on the pod start time

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 
+```
+git clone https://github.com/apache/incubator-yunikorn-release.git
+```
+Go to performance tool directory and build it

Review comment:
       NITS, this can be simply put as : 
   Build the tool:
   

##########
File path: package.json
##########
@@ -12,7 +12,7 @@
   "dependencies": {
     "@docusaurus/core": "^2.0.0-beta.15",
     "@docusaurus/preset-classic": "^2.0.0-beta.15",
-    "@docusaurus/theme-search-algolia": "^2.0.0-beta.15",
+    "@docusaurus/theme-search-algolia": "^2.0.0-beta.17",

Review comment:
       this change seems unrelated to this PR, can we skip this?

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 
+```
+git clone https://github.com/apache/incubator-yunikorn-release.git
+```
+Go to performance tool directory and build it
+```
+cd incubator-yunikorn-release/perf-tools/
+go mod tidy
+go build
+```
+It will look like this.
+![Build-perf-tools](./../assets/perf-tutorial-build.png)
+
+### 3. Set test configuration
+Before start tests, check configuration whether meet your except.
+Default output path is `\tmp`, you can modify `common.outputrootpath` to change it.

Review comment:
       \tmp or /tmp?

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|
+|	node fairness	|	Monitor node resource usage(allocated/capicity) with lots of pods requests			| 	exist	|	exist			|
+|	thourghput	|	Allocate `pod.spec.starttime` to calculate throughput(pods/sec) with lots of pods requests	|	exist	|	none			|
+
+### 2. Build tool
+Performance tool is in [yunikorn release](https://github.com/apache/incubator-yunikorn-release.git), so clone it to your host. 
+```
+git clone https://github.com/apache/incubator-yunikorn-release.git
+```
+Go to performance tool directory and build it
+```
+cd incubator-yunikorn-release/perf-tools/
+go mod tidy
+go build
+```
+It will look like this.
+![Build-perf-tools](./../assets/perf-tutorial-build.png)
+
+### 3. Set test configuration
+Before start tests, check configuration whether meet your except.
+Default output path is `\tmp`, you can modify `common.outputrootpath` to change it.
+In each scenarios, it contains followings and we can set
+
+|	field			|			description					|
+| ----------------------------- | --------------------------------------------------------------------- |
+|	schedulerNames		|	List of scheduler will run these cases 				|
+|	showNumOfLastTasks	|	Show the last tasks in scheduling				|

Review comment:
       I feel this description isn't accurate. can you elaborate more?

##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|

Review comment:
       NITS: The first letter should be capitalized in each column header, just for formatting consistency.







[GitHub] [incubator-yunikorn-site] 0yukali0 commented on a change in pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on a change in pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#discussion_r824315881



##########
File path: docs/performance/performance_tutorial.md
##########
@@ -355,6 +355,52 @@ scrape_configs:
 
 Once the environment is setup, you are good to run workloads and collect results. YuniKorn community has some useful tools to run workloads and collect metrics, more details will be published here.
 
+### 1. Scenarios 
+In performance tools, there are three types of tests and feedbacks.
+
+|	test type	|						description						|	diagram	|  		log		|
+| ---------------------	| -----------------------------------------------------------------------------------------------------	| ------------- | ----------------------------- |
+|	e2e test	|	Simulate and record the time in each steps							|	none	|	exist(QPS, timecost)	|

Review comment:
       Ok, I will do it.







[GitHub] [incubator-yunikorn-site] 0yukali0 commented on pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
0yukali0 commented on pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134#issuecomment-1067144281


   Hi @yangwwei, I pushed the update.
   I also updated some screenshots and configuration parts in this commit.





[GitHub] [incubator-yunikorn-site] yangwwei merged pull request #134: [YUNIKORN-951] Add perf-tool description into benchmarking tutorial page

Posted by GitBox <gi...@apache.org>.
yangwwei merged pull request #134:
URL: https://github.com/apache/incubator-yunikorn-site/pull/134


   

