Posted to commits@dolphinscheduler.apache.org by zh...@apache.org on 2022/03/18 08:19:57 UTC

[dolphinscheduler-website] branch master updated: [Feature-8025][Document] Add example and notice about task type HTTP (#741)

This is an automated email from the ASF dual-hosted git repository.

zhongjiajie pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler-website.git


The following commit(s) were added to refs/heads/master by this push:
     new 46bc340  [Feature-8025][Document] Add example and notice about task type HTTP (#741)
46bc340 is described below

commit 46bc3400da0cbadf95e7b337e591b61958ea3597
Author: QuakeWang <45...@users.noreply.github.com>
AuthorDate: Fri Mar 18 16:19:46 2022 +0800

    [Feature-8025][Document] Add example and notice about task type HTTP (#741)
---
 docs/en-us/dev/user_doc/guide/task/datax.md      |   2 +-
 docs/en-us/dev/user_doc/guide/task/flink.md      |   2 +-
 docs/en-us/dev/user_doc/guide/task/http.md       |  65 ++++++++++++++++-------
 docs/en-us/dev/user_doc/guide/task/map-reduce.md |   2 +-
 docs/en-us/dev/user_doc/guide/task/shell.md      |   2 +-
 docs/en-us/dev/user_doc/guide/task/spark.md      |   2 +-
 docs/en-us/dev/user_doc/guide/task/sql.md        |   4 +-
 docs/zh-cn/dev/user_doc/guide/task/http.md       |  54 ++++++++++++++-----
 img/tasks/demo/http_task01.png                   | Bin 0 -> 208489 bytes
 img/tasks/icons/http.png                         | Bin 0 -> 707 bytes
 10 files changed, 92 insertions(+), 41 deletions(-)

diff --git a/docs/en-us/dev/user_doc/guide/task/datax.md b/docs/en-us/dev/user_doc/guide/task/datax.md
index 8585aee..8f43510 100644
--- a/docs/en-us/dev/user_doc/guide/task/datax.md
+++ b/docs/en-us/dev/user_doc/guide/task/datax.md
@@ -6,7 +6,7 @@ DataX task type for executing DataX programs. For DataX nodes, the worker will e
 
 ## Create Task
 
-- Click Project Management -> Project Name -> Workflow Definition, and click the "Create Workflow" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag from the toolbar <img src="/img/tasks/icons/datax.png" width="15"/> task node to canvas.
 
 ## Task Parameter
diff --git a/docs/en-us/dev/user_doc/guide/task/flink.md b/docs/en-us/dev/user_doc/guide/task/flink.md
index 4bb0479..ab4f76b 100644
--- a/docs/en-us/dev/user_doc/guide/task/flink.md
+++ b/docs/en-us/dev/user_doc/guide/task/flink.md
@@ -6,7 +6,7 @@ Flink task type for executing Flink programs. For Flink nodes, the worker submit
 
 ## Create Task
 
-- Click Project Management -> Project Name -> Workflow Definition, and click the "Create Workflow" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the "Create Workflow" button to enter the DAG editing page.
 - Drag from the toolbar <img src="/img/tasks/icons/flink.png" width="15"/>task node to canvas.
 
 ## Task Parameter
diff --git a/docs/en-us/dev/user_doc/guide/task/http.md b/docs/en-us/dev/user_doc/guide/task/http.md
index 0303714..3034913 100644
--- a/docs/en-us/dev/user_doc/guide/task/http.md
+++ b/docs/en-us/dev/user_doc/guide/task/http.md
@@ -1,22 +1,47 @@
 # HTTP Node
 
-- Drag from the toolbar <img src="/img/http.png" width="35"/> task node to the canvas, as shown in the following figure:
-
-<p align="center">
-   <img src="/img/http-en.png" width="80%" />
- </p>
-
-- Node name: The node name in a workflow definition is unique.
-- Run flag: Identifies whether this node schedules normally, if it does not need to execute, select the `prohibition execution`.
-- Descriptive information: Describe the function of the node.
-- Task priority: When the number of worker threads is insufficient, execute in the order of priority from high to low, and tasks with the same priority will execute in a first-in first-out order.
-- Worker grouping: Assign tasks to the machines of the worker group to execute. If `Default` is selected, randomly select a worker machine for execution.
-- Times of failed retry attempts: The number of times the task failed to resubmit. You can select from drop-down or fill-in a number.
-- Failed retry interval: The time interval for resubmitting the task after a failed task. You can select from drop-down or fill-in a number.
-- Timeout alarm: Check the timeout alarm and timeout failure. When the task runs exceed the "timeout", an alarm email will send and the task execution will fail.
-- Request address: HTTP request URL.
-- Request type: Support GET, POST, HEAD, PUT and DELETE.
-- Request parameters: Support Parameter, Body and Headers.
-- Verification conditions: Support default response code, custom response code, content include and content not included.
-- Verification content: When the verification condition selects the custom response code, the content include or the content not included, the verification content is required.
-- Custom parameter: It is a user-defined local parameter of HTTP, and will replace the content with `${variable}` in the script.
+## Overview
+
+This node is used to execute HTTP tasks, such as the common POST and GET request types, and also supports HTTP request verification and other functions.
+
+## Create Task
+
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the "Create Workflow" button to enter the DAG editing page.
+- Drag the <img src="/img/tasks/icons/http.png" width="15"/> task node from the toolbar to the canvas.
+
+## Task Parameter
+
+- **Node name**: The node name in a workflow definition is unique.
+- **Run flag**: Identifies whether this node can be scheduled normally; if it does not need to be executed, turn on the prohibition switch.
+- **Descriptive information**: Describes the function of the node.
+- **Task priority**: When the number of worker threads is insufficient, tasks execute in order of priority from high to low; tasks with the same priority execute first-in first-out.
+- **Worker grouping**: Tasks are assigned to machines in the worker group for execution. If Default is selected, a worker machine is chosen at random.
+- **Environment Name**: Configure the environment in which to run the script.
+- **Number of failed retry attempts**: The number of times a failed task is resubmitted.
+- **Failed retry interval**: The interval, in minutes, for resubmitting a failed task.
+- **Delayed execution time**: The time, in minutes, by which task execution is delayed.
+- **Timeout alarm**: Check timeout alarm and timeout failure. When the task exceeds the "timeout period", an alarm email is sent and the task execution fails.
+- **Request address**: HTTP request URL.
+- **Request type**: Supports GET, POST, HEAD, PUT, and DELETE.
+- **Request parameters**: Supports Parameter, Body, and Headers.
+- **Verification conditions**: Supports default response code, custom response code, content included, and content not included.
+- **Verification content**: When the verification condition is custom response code, content included, or content not included, the verification content must be filled in.
+- **Custom parameter**: A user-defined local parameter of HTTP that replaces content in the form `${variable}` in the script.
+- **Predecessor task**: Selecting a predecessor task for the current task will set the selected predecessor task as upstream of the current task.
+
+## Example
+
+HTTP defines different methods of interacting with the server; the most basic are GET, POST, PUT, and DELETE. Here we use the HTTP task node to demonstrate sending a POST request to the system's login page to submit data.
+
+The main configuration parameters are as follows:
+
+- URL: The address of the target resource. Here it is the system's login page.
+- HTTP Parameters:
+     - userName: Username
+     - userPassword: User login password
+
+![http_task](/img/tasks/demo/http_task01.png)
+
+## Notice
+
+None
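The POST example above boils down to one plain HTTP call. The sketch below builds that request with Python's standard library, without sending it; the host, path, and credential values are placeholders, and only the parameter names (`userName`, `userPassword`) come from the example:

```python
# Sketch of the HTTP call the example task performs. The URL and the
# credential values are hypothetical placeholders; the parameter names
# follow the doc's example.
from urllib.parse import urlencode
from urllib.request import Request

params = {"userName": "admin", "userPassword": "dolphinscheduler123"}

# Build (but do not send) the POST request the task would issue.
req = Request(
    "http://localhost:12345/login",  # placeholder for the login page URL
    data=urlencode(params).encode("utf-8"),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)

print(req.get_method())          # POST
print(req.data.decode("utf-8"))  # userName=admin&userPassword=dolphinscheduler123
```

In the DolphinScheduler UI, the same information is entered as the URL field plus two rows of HTTP Parameters rather than as code.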
diff --git a/docs/en-us/dev/user_doc/guide/task/map-reduce.md b/docs/en-us/dev/user_doc/guide/task/map-reduce.md
index f7a7a43..5ea69ba 100644
--- a/docs/en-us/dev/user_doc/guide/task/map-reduce.md
+++ b/docs/en-us/dev/user_doc/guide/task/map-reduce.md
@@ -6,7 +6,7 @@ MapReduce(MR) task type used for executing MapReduce programs. For MapReduce nod
 
 ## Create Task
 
-- Click Project Management-Project Name-Workflow Definition, and click the "Create Workflow" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag from the toolbar <img src="/img/tasks/icons/mr.png" width="15"/> to the canvas.
 
 ## Task Parameter
diff --git a/docs/en-us/dev/user_doc/guide/task/shell.md b/docs/en-us/dev/user_doc/guide/task/shell.md
index ef110f1..a026d2f 100644
--- a/docs/en-us/dev/user_doc/guide/task/shell.md
+++ b/docs/en-us/dev/user_doc/guide/task/shell.md
@@ -6,7 +6,7 @@ Shell task used to create a shell task type and execute a series of shell script
 
 ## Create Task
 
-- Click Project Management-Project->Name-Workflow Definition, and click the "Create Workflow" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag  from the toolbar <img src="/img/tasks/icons/shell.png" width="15"/> to the canvas.
 
 ## Task Parameter
diff --git a/docs/en-us/dev/user_doc/guide/task/spark.md b/docs/en-us/dev/user_doc/guide/task/spark.md
index 1ceb21d..7e02544 100644
--- a/docs/en-us/dev/user_doc/guide/task/spark.md
+++ b/docs/en-us/dev/user_doc/guide/task/spark.md
@@ -6,7 +6,7 @@ Spark task type used to execute Spark program. For Spark nodes, the worker submi
 
 ## Create Task
 
-- Click Project Management -> Project Name -> Workflow Definition, and click the "Create Workflow" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag from the toolbar <img src="/img/tasks/icons/spark.png" width="15"/> to the canvas.
 
 ## Task Parameter
diff --git a/docs/en-us/dev/user_doc/guide/task/sql.md b/docs/en-us/dev/user_doc/guide/task/sql.md
index 87971cd..9c62245 100644
--- a/docs/en-us/dev/user_doc/guide/task/sql.md
+++ b/docs/en-us/dev/user_doc/guide/task/sql.md
@@ -10,7 +10,7 @@ Refer to [DataSource](../datasource/introduction.md)
 
 ## Create Task
 
-- Click Project Management-Project Name-Workflow Definition, and click the "Create Workflow" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the "Create Workflow" button to enter the DAG editing page.
 - Drag from the toolbar <img src="/img/tasks/icons/sql.png" width="25"/> to the canvas.
 
 ## Task Parameter
@@ -40,4 +40,4 @@ Log in to the bigdata cluster and use 'hive' command or 'beeline' or 'JDBC' and
 
 ## Notice
 
-Pay attention to the selection of SQL type. If it is an insert operation, need to change to "Non-Query" type.
\ No newline at end of file
+Pay attention to the selection of SQL type. For an insert operation, change it to the "Non-Query" type.
diff --git a/docs/zh-cn/dev/user_doc/guide/task/http.md b/docs/zh-cn/dev/user_doc/guide/task/http.md
index 2e6f9a8..18cf371 100644
--- a/docs/zh-cn/dev/user_doc/guide/task/http.md
+++ b/docs/zh-cn/dev/user_doc/guide/task/http.md
@@ -1,22 +1,48 @@
-# http Node
+# HTTP Node
 
-- Drag the <img src="/img/http.png" width="35"/> task node from the toolbar to the canvas, as shown in the figure below:
+## Overview
 
-<p align="center">
-   <img src="/img/http_edit.png" width="80%" />
- </p>
+This node is used to execute HTTP tasks, such as the common POST and GET request types, and also supports HTTP request verification and other functions.
 
-- Node name: The node name in a workflow definition is unique.
+## Create Task
+
+- Click Project Management -> Project Name -> Workflow Definition, then click the "Create Workflow" button to enter the DAG editing page:
+
+- Drag the <img src="/img/tasks/icons/http.png" width="15"/> task node from the toolbar to the canvas.
+
+## Task Parameter
+
+- Node name: Set the name of the task. The node name in a workflow definition is unique.
 - Run flag: Identifies whether this node can be scheduled normally; if it does not need to be executed, turn on the prohibition switch.
-- Descriptive information: Describes the function of the node.
-- Task priority: When the number of worker threads is insufficient, tasks execute in order of priority from high to low; tasks with the same priority execute first-in first-out.
-- Worker grouping: Tasks are assigned to machines in the worker group for execution. If Default is selected, a worker machine is chosen at random.
+- Description: Describes the function of the node.
+- Task priority: When the number of worker threads is insufficient, tasks execute in order of priority from high to low; tasks with the same priority execute first-in first-out.
+- Worker grouping: Tasks are assigned to machines in the worker group for execution. If Default is selected, a worker machine is chosen at random.
+- Environment Name: Configure the environment in which the task runs.
 - Number of failed retries: The number of times a failed task is resubmitted; supports drop-down and manual entry.
 - Failed retry interval: The interval for resubmitting a failed task; supports drop-down and manual entry.
-- Timeout alarm: Check timeout alarm and timeout failure; when the task exceeds the "timeout period", an alarm email is sent and the task execution fails.
-- Request address: HTTP request URL.
-- Request type: Supports GET, POSt, HEAD, PUT, DELETE.
-- Request parameters: Supports Parameter, Body, Headers.
+- Delayed execution time: The time, in minutes, by which task execution is delayed.
+- Timeout alarm: Check timeout alarm and timeout failure; when the task exceeds the "timeout period", an alarm email is sent and the task execution fails.
+- Request address: HTTP request URL.
+- Request type: Supports GET, POST, HEAD, PUT, DELETE.
+- Request parameters: Supports Parameter, Body, Headers.
 - Verification conditions: Supports default response code, custom response code, content included, content not included.
 - Verification content: When the verification condition is custom response code, content included, or content not included, the verification content must be filled in.
-- Custom parameter: A user-defined local parameter of HTTP that replaces content in the form ${variable} in the script.
+- Custom parameter: A user-defined local parameter of HTTP that replaces content in the form ${variable} in the script.
+- Predecessor task: Selecting a predecessor task for the current task sets the selected task as upstream of the current task.
+
+## Task Example
+
+HTTP defines different methods of interacting with the server; the four most basic are GET, POST, PUT, and DELETE. Here we use the HTTP task node to demonstrate sending a POST request to the system's login page to submit data.
+
+The main configuration parameters are as follows:
+
+- URL: The address of the target resource. Here it is the system's login page.
+- HTTP Parameters
+     - userName: User name
+     - userPassword: User login password
+
+![http_task](/img/tasks/demo/http_task01.png)
+
+## Notice
+
+None.
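Both language versions list the same four verification conditions (default response code, custom response code, content included, content not included). A minimal sketch of that check logic, with an illustrative function name and signature rather than DolphinScheduler's actual implementation:

```python
# Illustrative mirror of the four verification conditions described in
# the docs above; not DolphinScheduler's actual code.
def verify(condition, status_code, body, expected=None):
    """Return True if the HTTP response passes the chosen condition."""
    if condition == "default response code":
        return status_code == 200
    if condition == "custom response code":
        # "Verification content" holds the expected code.
        return str(status_code) == str(expected)
    if condition == "content included":
        return expected in body
    if condition == "content not included":
        return expected not in body
    raise ValueError(f"unknown condition: {condition}")

print(verify("default response code", 200, ""))               # True
print(verify("custom response code", 302, "", expected=302))  # True
```

This also shows why "Verification content" is required for the last three conditions: they have nothing to compare against without it.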
diff --git a/img/tasks/demo/http_task01.png b/img/tasks/demo/http_task01.png
new file mode 100644
index 0000000..43102f7
Binary files /dev/null and b/img/tasks/demo/http_task01.png differ
diff --git a/img/tasks/icons/http.png b/img/tasks/icons/http.png
new file mode 100644
index 0000000..1d80cd0
Binary files /dev/null and b/img/tasks/icons/http.png differ
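Both docs note that custom parameters replace `${variable}` occurrences in the request. A minimal sketch of that substitution, with illustrative parameter names and values:

```python
# Minimal sketch of ${variable} substitution for HTTP custom
# parameters; names and values are illustrative.
import re

def substitute(text, params):
    """Replace each ${name} in text with its value from params;
    unknown placeholders are left unchanged."""
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: str(params.get(m.group(1), m.group(0))),
        text,
    )

body = '{"user": "${userName}", "date": "${bizDate}"}'
print(substitute(body, {"userName": "admin", "bizDate": "2022-03-18"}))
# {"user": "admin", "date": "2022-03-18"}
```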