Posted to commits@doris.apache.org by zh...@apache.org on 2019/09/27 12:42:58 UTC
[incubator-doris] branch master updated: Add more routine load example (#1902)
This is an automated email from the ASF dual-hosted git repository.
zhaoc pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-doris.git
The following commit(s) were added to refs/heads/master by this push:
new ec3aa03 Add more routine load example (#1902)
ec3aa03 is described below
commit ec3aa03c453c78f069100a0323b4bf22cca89f7a
Author: HangyuanLiu <46...@qq.com>
AuthorDate: Fri Sep 27 20:42:52 2019 +0800
Add more routine load example (#1902)
---
.../Data Manipulation/ROUTINE LOAD.md | 23 ++++++++++++++++++--
.../Data Manipulation/ROUTINE LOAD_EN.md | 25 ++++++++++++++++++++--
2 files changed, 44 insertions(+), 4 deletions(-)
diff --git a/docs/documentation/cn/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD.md b/docs/documentation/cn/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD.md
index 9b847e4..3c38b9f 100644
--- a/docs/documentation/cn/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD.md
+++ b/docs/documentation/cn/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD.md
@@ -215,8 +215,27 @@
NULL value: \N
## example
+ 1. Create a Kafka routine load task named test1 for example_tbl of example_db. Specify group.id and client.id; all partitions are consumed automatically by default, and consumption starts from the end of each partition (OFFSET_END).
- 1. Create a Kafka routine load task named test1 for example_tbl of example_db. The load task is in strict mode.
+ CREATE ROUTINE LOAD example_db.test1 ON example_tbl
+ COLUMNS(k1, k2, k3, v1, v2, v3 = k1 * 100)
+ PROPERTIES
+ (
+ "desired_concurrent_number"="3",
+ "max_batch_interval" = "20",
+ "max_batch_rows" = "300000",
+ "max_batch_size" = "209715200",
+ "strict_mode" = "false"
+ )
+ FROM KAFKA
+ (
+ "kafka_broker_list" = "broker1:9092,broker2:9092,broker3:9092",
+ "kafka_topic" = "my_topic",
+ "property.group.id" = "xxx",
+ "property.client.id" = "xxx"
+ );
+
+ 2. Create a Kafka routine load task named test1 for example_tbl of example_db. The load task is in strict mode.
CREATE ROUTINE LOAD example_db.test1 ON example_tbl
COLUMNS(k1, k2, k3, v1, v2, v3 = k1 * 100),
@@ -237,7 +256,7 @@
"kafka_offsets" = "101,0,0,200"
);
- 2. Load data from the Kafka cluster via SSL authentication. Also set the client.id parameter. The load task is in non-strict mode and the time zone is Africa/Abidjan.
+ 3. Load data from the Kafka cluster via SSL authentication. Also set the client.id parameter. The load task is in non-strict mode and the time zone is Africa/Abidjan.
CREATE ROUTINE LOAD example_db.test1 ON example_tbl
COLUMNS(k1, k2, k3, v1, v2, v3 = k1 * 100),
diff --git a/docs/documentation/en/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD_EN.md b/docs/documentation/en/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD_EN.md
index 81c5167..b2291f2 100644
--- a/docs/documentation/en/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD_EN.md
+++ b/docs/documentation/en/sql-reference/sql-statements/Data Manipulation/ROUTINE LOAD_EN.md
@@ -268,7 +268,28 @@ FROM data_source
## example
-1. Create a Kafka routine load task named test1 for the example_tbl of example_db. The load task is in strict mode.
+1. Create a Kafka routine load task named test1 for the example_tbl of example_db. Specify group.id and client.id; all partitions are consumed automatically by default, and consumption starts from the end of each partition (OFFSET_END).
+ ```
+ CREATE ROUTINE LOAD example_db.test1 ON example_tbl
+ COLUMNS(k1, k2, k3, v1, v2, v3 = k1 * 100)
+ PROPERTIES
+ (
+ "desired_concurrent_number"="3",
+ "max_batch_interval" = "20",
+ "max_batch_rows" = "300000",
+ "max_batch_size" = "209715200",
+ "strict_mode" = "false"
+ )
+ FROM KAFKA
+ (
+ "kafka_broker_list" = "broker1:9092,broker2:9092,broker3:9092",
+ "kafka_topic" = "my_topic",
+ "property.group.id" = "xxx",
+ "property.client.id" = "xxx"
+ );
+ ```
+
+2. Create a Kafka routine load task named test1 for the example_tbl of example_db. The load task is in strict mode.
```
CREATE ROUTINE LOAD example_db.test1 ON example_tbl
@@ -291,7 +312,7 @@ FROM data_source
);
```
-2. load data from Kafka clusters via SSL authentication. Also set the client.id parameter. The load task is in non-strict mode and the time zone is Africa/Abidjan
+3. Load data from the Kafka cluster via SSL authentication. Also set the client.id parameter. The load task is in non-strict mode and the time zone is Africa/Abidjan.
```
CREATE ROUTINE LOAD example_db.test1 ON example_tbl
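
Not part of this commit, but as a usage sketch: once a job like the `example_db.test1` job in the examples above has been created, its state can be inspected and controlled with Doris's routine-load management statements (the job name below assumes the examples' naming):

```
-- Show the job's state, consumption progress, and error-row statistics
SHOW ROUTINE LOAD FOR example_db.test1;

-- Pause the job (it can be resumed later) or stop it permanently
PAUSE ROUTINE LOAD FOR example_db.test1;
RESUME ROUTINE LOAD FOR example_db.test1;
STOP ROUTINE LOAD FOR example_db.test1;
```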