Posted to commits@dolphinscheduler.apache.org by zh...@apache.org on 2022/05/30 03:44:16 UTC

[dolphinscheduler] branch dev updated: [doc] Refactor context parameter docment (#10227)

This is an automated email from the ASF dual-hosted git repository.

zhongjiajie pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler.git


The following commit(s) were added to refs/heads/dev by this push:
     new cd36401a6a [doc] Refactor context parameter docment (#10227)
cd36401a6a is described below

commit cd36401a6a57fbbec9bc6da088fa6dbd0aa672ab
Author: QuakeWang <45...@users.noreply.github.com>
AuthorDate: Mon May 30 11:44:10 2022 +0800

    [doc] Refactor context parameter docment (#10227)
---
 docs/docs/en/guide/parameter/context.md            |  66 ++++++++++++---------
 docs/docs/zh/guide/parameter/context.md            |  65 +++++++++++---------
 docs/img/globalParam/image-20210723101242216.png   | Bin 44625 -> 0 bytes
 docs/img/globalParam/image-20210723102522383.png   | Bin 42919 -> 0 bytes
 docs/img/globalParam/image-20210723104957031.png   | Bin 36588 -> 0 bytes
 docs/img/globalParam/image-20210723105026924.png   | Bin 37906 -> 0 bytes
 docs/img/globalParam/image-20210723105131381.png   | Bin 34311 -> 0 bytes
 docs/img/globalParam/image-20210723105255850.png   | Bin 38332 -> 0 bytes
 docs/img/global_parameter.png                      | Bin 94677 -> 0 bytes
 docs/img/new_ui/dev/parameter/context_log01.png    | Bin 0 -> 311889 bytes
 docs/img/new_ui/dev/parameter/context_log02.png    | Bin 0 -> 311298 bytes
 docs/img/new_ui/dev/parameter/context_log03.png    | Bin 0 -> 297036 bytes
 .../new_ui/dev/parameter/context_parameter01.png   | Bin 0 -> 165499 bytes
 .../new_ui/dev/parameter/context_parameter02.png   | Bin 0 -> 160262 bytes
 .../new_ui/dev/parameter/context_parameter03.png   | Bin 0 -> 157262 bytes
 .../new_ui/dev/parameter/context_parameter04.png   | Bin 0 -> 150018 bytes
 16 files changed, 76 insertions(+), 55 deletions(-)

diff --git a/docs/docs/en/guide/parameter/context.md b/docs/docs/en/guide/parameter/context.md
index e211e4bdd2..170477c078 100644
--- a/docs/docs/en/guide/parameter/context.md
+++ b/docs/docs/en/guide/parameter/context.md
@@ -6,10 +6,6 @@ DolphinScheduler provides the ability to refer to each other between parameters,
 
 The premise of local tasks referring global parameters is that you have already defined [Global Parameter](global.md). The usage is similar to the usage in [local parameters](local.md), but the value of the parameter needs to be configured as the key of the global parameter.
 
-![parameter-call-global-in-local](/img/global_parameter.png)
-
-As the figure above shows, `${biz_date}` and `${curdate}` are examples of local parameters that refer to global parameters. Observe the last line of the above figure, `local_param_bizdate` uses `${global_bizdate}` to refer to the global parameter. In the shell script, you can use `${local_param_bizdate}` to refer to the value of the global variable `global_bizdate`, or set the value of `local_param_bizdate` directly through JDBC. Similarly, `local_param` refers to the global parameters d [...]
-
 ## Pass Parameter From Upstream Task to Downstream
 
 DolphinScheduler allows parameter transfer between tasks. Currently, transfer direction only supports one-way transfer from upstream to downstream. The task types that support this feature are:
@@ -20,47 +16,63 @@ DolphinScheduler allows parameter transfer between tasks. Currently, transfer di
 
 When defining an upstream node, if you need to transmit the result of that node to a downstream node that depends on it, set an `OUT` direction parameter in the [Custom Parameters] of the [Current Node Settings]. At present, mainly the SQL and SHELL nodes can pass parameters downstream.
 
-### SQL
+> Note: If there is no dependency between two nodes, local parameters cannot be passed from the upstream one.
+
+### Example
+
+This example shows how to use parameter passing: a SHELL task creates local parameters and passes them to downstream tasks, and a SQL task then completes a query using the parameters obtained from the upstream task.
+
+#### Create a SHELL task and set parameters
+
+To pass a parameter, the shell script must print a statement in the format `'${setValue(key=value)}'`, where `key` is the `prop` of the corresponding parameter and `value` is the value to pass.
+
+Create a Node_A task, add `output` and `value` parameters to the custom parameters, and write the following script:
+
+![context-parameter01](/img/new_ui/dev/parameter/context_parameter01.png)
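The script in the screenshot is not reproduced in this plain-text archive; a minimal sketch of what such a Node_A script could look like (the parameter names follow the example above) is:

```shell
#!/bin/bash
# The single quotes keep the marker literal so it reaches the task log intact;
# DolphinScheduler scans the log for this pattern and records output=1 for
# downstream tasks.
echo '${setValue(output=1)}'
```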
+
+Parameter Description:
 
-`prop` is user-specified; the direction selects `OUT`, and will define as an export parameter only when the direction is `OUT`. Choose data structures for data type according to the scenario, and the leave the value part blank.
+- `value`: the direction is `IN` and the value is 66
+- `output`: the direction is `OUT`; it is assigned through the script statement `'${setValue(output=1)}'` and passed to downstream tasks
 
-If the result of the SQL node has only one row, one or multiple fields, the name of the `prop` needs to be the same as the field name. The data type can choose structure except `LIST`. The parameter assigns the value according to the same column name in the SQL query result.
+When the SHELL node runs and the log detects the `${setValue(output=1)}` format, 1 is assigned to `output`, and downstream nodes can use the variable `output` directly. Similarly, you can find the corresponding node instance on the [Workflow Instance] page to view the value of this variable.
 
-If the result of the SQL node has multiple rows, one or more fields, the name of the `prop` needs to be the same as the field name. Choose the data type structure as `LIST`, and the SQL query result will be converted to `LIST<VARCHAR>`, and forward to convert to JSON as the parameter value.
+Create the Node_B task, which is used to print the parameters passed from the upstream task Node_A.
 
-Let's make an example of the SQL node process in the above picture:
+![context-parameter02](/img/new_ui/dev/parameter/context_parameter02.png)
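The screenshot's script is likewise not visible here; a minimal standalone sketch of Node_B is below. The `output=1` assignment is only there so the sketch runs on its own; in DolphinScheduler the scheduler substitutes `${output}` with the value passed from Node_A before the script executes.

```shell
#!/bin/bash
# Stand-in for the scheduler's parameter substitution (see lead-in above).
output=1
# Print the parameter received from the upstream task Node_A.
echo "Node_A passed output=${output}"
```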
 
-The following defines the [createParam1] node in the above figure:
+#### Create SQL tasks and use parameters
 
-![png05](/img/globalParam/image-20210723104957031.png)
+When the SHELL task completes, we can use the `output` value passed from upstream as the query condition in the SQL task. The queried `id` is renamed to `ID` and output as a parameter.
 
-The following defines the [createParam2] node:
+![context-parameter03](/img/new_ui/dev/parameter/context_parameter03.png)
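The SQL statement in the screenshot is not legible in this plain-text archive; a query of the shape described, printed from a shell sketch (the table name `tb_test` is hypothetical, and `${output}` is substituted by DolphinScheduler before the SQL task runs), might be:

```shell
#!/bin/bash
# Print a hypothetical SQL statement for the SQL task; the quoted heredoc
# keeps ${output} literal, exactly as it would be typed into the task.
cat <<'SQL'
SELECT id AS ID FROM tb_test WHERE id = ${output}
SQL
```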
 
-![png06](/img/globalParam/image-20210723105026924.png)
+> Note: If the result of the SQL node is a single row with one or more fields, the name of the `prop` must match the field name. Any data type except `LIST` can be chosen. The parameter takes its value from the column of the same name in the SQL query result.
+>
+> If the result of the SQL node is multiple rows with one or more fields, the name of the `prop` must match the field name. Choose `LIST` as the data type; the corresponding column of the SQL query result is converted to `LIST<VARCHAR>` and then to JSON, which becomes the parameter value.
 
-Find the value of the variable in the [Workflow Instance] page corresponding to the node instance.
+#### Save the workflow and set the global parameters
 
-The following shows the Node instance [createparam1]:
+Click the save workflow icon and set the global parameters `output` and `value`.
 
-![png07](/img/globalParam/image-20210723105131381.png)
+![context-parameter04](/img/new_ui/dev/parameter/context_parameter04.png)
 
-Here, the value of "id" is 12.
+#### View results
 
-Let's see the case of the node instance [createparam2].
+After creating the workflow, bring it online, run it, and view the results.
 
-![png08](/img/globalParam/image-20210723105255850.png)
+The result of Node_A is as follows:
 
-There is only the "id" value. Although the user-defined SQL query both "id" and "database_name" field, only set the `OUT` parameter `id` due to only one parameter "id" is defined for output. The length of the result list is 10 due to display reasons.
+![context-log01](/img/new_ui/dev/parameter/context_log01.png)
 
-### SHELL
+The result of Node_B is as follows:
 
-`prop` is user-specified and the direction is `OUT`. The output is defined as an export parameter only when the direction is `OUT`. Choose data structures for data type according to the scenario, and leave the value part blank.
-The user needs to pass the parameter when creating the shell script, the output statement format is `${setValue(key=value)}`, the key is the `prop` of the corresponding parameter, and value is the value of the parameter.
+![context-log02](/img/new_ui/dev/parameter/context_log02.png)
 
-For example, in the figure below:
+The result of Node_mysql is as follows:
 
-![png09](/img/globalParam/image-20210723101242216.png)
+![context-log03](/img/new_ui/dev/parameter/context_log03.png)
 
-When the log detects the `${setValue(key=value1)}` format in the shell node definition, it will assign value1 to the key, and downstream nodes can use the variable key directly. Similarly, you can find the corresponding node instance on the [Workflow Instance] page to see the value of the variable.
+Even though `output` is assigned the value 1 in Node_A's script, the log still shows a value of 100. However, according to the [parameter priority](priority.md) rule `Local Parameter > Parameter Context > Global Parameter`, the value of `output` printed in Node_B is 1. This proves that the `output` parameter is passed through the workflow as expected, and that Node_mysql uses this value to complete the query.
 
-![png10](/img/globalParam/image-20210723102522383.png)
+However, the value 66 appears only in Node_A's output. The reason is that the direction of `value` is `IN`, and only a parameter whose direction is `OUT` is defined as a variable output.
diff --git a/docs/docs/zh/guide/parameter/context.md b/docs/docs/zh/guide/parameter/context.md
index d014b9454e..2e55ac7414 100644
--- a/docs/docs/zh/guide/parameter/context.md
+++ b/docs/docs/zh/guide/parameter/context.md
@@ -4,11 +4,7 @@ DolphinScheduler 提供参数间相互引用的能力,包括:本地参数引
 
 ## 本地任务引用全局参数
 
-本地任务引用全局参数的前提是,你已经定义了[全局参数](global.md),使用方式和[本地参数](local.md)中的使用方式类似,但是参数的值需要配置成全局参数中的key
-
-![parameter-call-global-in-local](/img/global_parameter.png)
-
-如上图中的`${biz_date}`以及`${curdate}`,就是本地参数引用全局参数的例子。观察上图的最后一行,local_param_bizdate通过\${global_bizdate}来引用全局参数,在shell脚本中可以通过\${local_param_bizdate}来引全局变量 global_bizdate的值,或通过JDBC直接将local_param_bizdate的值set进去。同理,local_param通过${local_param}引用上一节中定义的全局参数。​biz_date、biz_curdate、system.datetime都是用户自定义的参数,通过${全局参数}进行赋值。
+本地任务引用全局参数的前提是,你已经定义了[全局参数](global.md),使用方式和[本地参数](local.md)中的使用方式类似,但是参数的值需要配置成全局参数中的 key。
 
 ## 上游任务传递给下游任务
 
@@ -20,50 +16,63 @@ DolphinScheduler 允许在任务间进行参数传递,目前传递方向仅支
 
 当定义上游节点时,如果有需要将该节点的结果传递给有依赖关系的下游节点,需要在【当前节点设置】的【自定义参数】设置一个方向是 OUT 的变量。目前我们主要针对 SQL 和 SHELL 节点做了可以向下传递参数的功能。
 
-### SQL
+> 注:若节点之间没有依赖关系,则局部参数无法通过上游传递。
+
+### 任务样例
+
+本样例展示了如何使用参数传递的功能,通过 SHELL 任务来创建本地参数并赋值传递给下游,SQL 任务通过获得上游任务的参数完成查询操作。
+
+#### 创建 SHELL 任务,设置参数
+
+> 用户需要传递参数,在定义 SHELL 脚本时,需要输出格式为 ${setValue(key=value)} 的语句,key 为对应参数的 prop,value 为该参数的值。
 
-prop 为用户指定;方向选择为 OUT,只有当方向为 OUT 时才会被定义为变量输出;数据类型可以根据需要选择不同数据结构;value 部分不需要填写。
+创建 Node_A 任务,在自定义参数中添加 output 和 value 参数,并编写如下脚本:
 
-如果 SQL 节点的结果只有一行,一个或多个字段,prop 的名字需要和字段名称一致。数据类型可选择为除 LIST 以外的其他类型。变量会选择 SQL 查询结果中的列名中与该变量名称相同的列对应的值。
+![context-parameter01](/img/new_ui/dev/parameter/context_parameter01.png)
 
-如果 SQL 节点的结果为多行,一个或多个字段,prop 的名字需要和字段名称一致。数据类型选择为LIST。获取到 SQL 查询结果后会将对应列转化为 LIST<VARCHAR>,并将该结果转化为 JSON 后作为对应变量的值。
+参数说明:
 
-我们再以上图中包含 SQL 节点的流程举例说明:
+- value:方向选择为 IN,并赋值为 66
+- output:方向选择为 OUT,通过脚本 `'${setValue(output=1)}'` 赋值,并传递给下游参数
 
-上图中节点【createParam1】的定义如下:
+SHELL 节点定义时当日志检测到 ${setValue(output=1)} 的格式时,会将 1 赋值给 output,下游节点便可以直接使用变量 output 的值。同样,您可以在【工作流实例】页面,找到对应的节点实例,便可以查看该变量的值。
 
-<img src="/img/globalParam/image-20210723104957031.png" alt="image-20210723104957031" style="zoom:50%;" />
+创建 Node_B 任务,主要用于测试输出上游任务 Node_A 传递的参数。
 
-节点【createParam2】的定义如下:
+![context-parameter02](/img/new_ui/dev/parameter/context_parameter02.png)
 
-<img src="/img/globalParam/image-20210723105026924.png" alt="image-20210723105026924" style="zoom:50%;" />
+#### 创建 SQL 任务,使用参数
 
-您可以在【工作流实例】页面,找到对应的节点实例,便可以查看该变量的值。
+完成上述的 SHELL 任务之后,我们可以使用上游所传递的 output 作为 SQL 的查询对象。其中将所查询的 id 重命名为 ID,作为参数输出。
 
-节点实例【createParam1】如下:
+![context-parameter03](/img/new_ui/dev/parameter/context_parameter03.png)
 
-<img src="/img/globalParam/image-20210723105131381.png" alt="image-20210723105131381" style="zoom:50%;" />
+> 注:如果 SQL 节点的结果只有一行,一个或多个字段,参数的名字需要和字段名称一致。数据类型可选择为除 LIST 以外的其他类型。变量会选择 SQL 查询结果中的列名中与该变量名称相同的列对应的值。
+>
+> 如果 SQL 节点的结果为多行,一个或多个字段,参数的名字需要和字段名称一致。数据类型选择为 LIST。获取到 SQL 查询结果后会将对应列转化为 LIST,并将该结果转化为 JSON 后作为对应变量的值。
 
-这里当然 "id" 的值会等于 12.
+#### 保存工作流,设置全局参数
 
-我们再来看节点实例【createParam2】的情况。
+点击保存工作流图标,并设置全局参数 output 和 value。
 
-<img src="/img/globalParam/image-20210723105255850.png" alt="image-20210723105255850" style="zoom:50%;" />
+![context-parameter04](/img/new_ui/dev/parameter/context_parameter04.png)
 
-这里只有 "id" 的值。尽管用户定义的 sql 查到的是 "id" 和 "database_name" 两个字段,但是由于只定义了一个为 out 的变量 "id",所以只会设置一个变量。由于显示的原因,这里已经替您查好了该 list 的长度为 10。
+#### 查看运行结果
 
-### SHELL
+创建完成工作流之后,上线运行该工作流,查看其运行结果。
 
-prop 为用户指定;方向选择为 OUT,只有当方向为 OUT 时才会被定义为变量输出;数据类型可以根据需要选择不同数据结构;value 部分不需要填写。
+Node_A 运行结果如下:
 
+![context-log01](/img/new_ui/dev/parameter/context_log01.png)
 
-用户需要传递参数,在定义 shell 脚本时,需要输出格式为 ${setValue(key=value)} 的语句,key 为对应参数的 prop,value 为该参数的值。
+Node_B 运行结果如下:
 
+![context-log02](/img/new_ui/dev/parameter/context_log02.png)
 
-例如下图中:
+Node_mysql 运行结果如下:
 
-<img src="/img/globalParam/image-20210723101242216.png" alt="image-20210723101242216" style="zoom:50%;" />
+![context-log03](/img/new_ui/dev/parameter/context_log03.png)
 
-shell 节点定义时当日志检测到 ${setValue(key=value1)} 的格式时,会将 value1 赋值给 key,下游节点便可以直接使用变量 key 的值。同样,您可以在【工作流实例】页面,找到对应的节点实例,便可以查看该变量的值。
+虽然在 Node_A 的脚本中为 output 赋值为 1,但日志中显示的值仍然为 100。但根据[参数优先级](priority.md)的原则:`本地参数 > 上游任务传递的参数 > 全局参数`,在 Node_B 中输出的值为 1。则证明 output 参数参照预期的值在该工作流中传递,并在 Node_mysql 中使用该值完成查询操作。
 
-<img src="/img/globalParam/image-20210723102522383.png" alt="image-20210723102522383" style="zoom:50%;" />
+但是 value 的值却只有在 Node_A 中输出为 66,其原因为 value 的方向选择为 IN,只有当方向为 OUT 时才会被定义为变量输出。
diff --git a/docs/img/globalParam/image-20210723101242216.png b/docs/img/globalParam/image-20210723101242216.png
deleted file mode 100644
index 649b333c95..0000000000
Binary files a/docs/img/globalParam/image-20210723101242216.png and /dev/null differ
diff --git a/docs/img/globalParam/image-20210723102522383.png b/docs/img/globalParam/image-20210723102522383.png
deleted file mode 100644
index ca04547b07..0000000000
Binary files a/docs/img/globalParam/image-20210723102522383.png and /dev/null differ
diff --git a/docs/img/globalParam/image-20210723104957031.png b/docs/img/globalParam/image-20210723104957031.png
deleted file mode 100644
index 13db035aeb..0000000000
Binary files a/docs/img/globalParam/image-20210723104957031.png and /dev/null differ
diff --git a/docs/img/globalParam/image-20210723105026924.png b/docs/img/globalParam/image-20210723105026924.png
deleted file mode 100644
index 3191d7847d..0000000000
Binary files a/docs/img/globalParam/image-20210723105026924.png and /dev/null differ
diff --git a/docs/img/globalParam/image-20210723105131381.png b/docs/img/globalParam/image-20210723105131381.png
deleted file mode 100644
index 3a8262962a..0000000000
Binary files a/docs/img/globalParam/image-20210723105131381.png and /dev/null differ
diff --git a/docs/img/globalParam/image-20210723105255850.png b/docs/img/globalParam/image-20210723105255850.png
deleted file mode 100644
index 6e3c34dae7..0000000000
Binary files a/docs/img/globalParam/image-20210723105255850.png and /dev/null differ
diff --git a/docs/img/global_parameter.png b/docs/img/global_parameter.png
deleted file mode 100644
index 4a88487752..0000000000
Binary files a/docs/img/global_parameter.png and /dev/null differ
diff --git a/docs/img/new_ui/dev/parameter/context_log01.png b/docs/img/new_ui/dev/parameter/context_log01.png
new file mode 100644
index 0000000000..d34094c9da
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_log01.png differ
diff --git a/docs/img/new_ui/dev/parameter/context_log02.png b/docs/img/new_ui/dev/parameter/context_log02.png
new file mode 100644
index 0000000000..0519f95d5f
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_log02.png differ
diff --git a/docs/img/new_ui/dev/parameter/context_log03.png b/docs/img/new_ui/dev/parameter/context_log03.png
new file mode 100644
index 0000000000..1732712b08
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_log03.png differ
diff --git a/docs/img/new_ui/dev/parameter/context_parameter01.png b/docs/img/new_ui/dev/parameter/context_parameter01.png
new file mode 100644
index 0000000000..3cf9e1b62d
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_parameter01.png differ
diff --git a/docs/img/new_ui/dev/parameter/context_parameter02.png b/docs/img/new_ui/dev/parameter/context_parameter02.png
new file mode 100644
index 0000000000..e8f7167a7a
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_parameter02.png differ
diff --git a/docs/img/new_ui/dev/parameter/context_parameter03.png b/docs/img/new_ui/dev/parameter/context_parameter03.png
new file mode 100644
index 0000000000..d75f359c31
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_parameter03.png differ
diff --git a/docs/img/new_ui/dev/parameter/context_parameter04.png b/docs/img/new_ui/dev/parameter/context_parameter04.png
new file mode 100644
index 0000000000..d472dbf9b3
Binary files /dev/null and b/docs/img/new_ui/dev/parameter/context_parameter04.png differ