Posted to commits@iotdb.apache.org by ha...@apache.org on 2022/08/31 07:35:40 UTC

[iotdb] branch master updated: [IOTDB-4190] update documents about nifi-iotdb-bundle (#7140)

This is an automated email from the ASF dual-hosted git repository.

haonan pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/iotdb.git


The following commit(s) were added to refs/heads/master by this push:
     new ff3b4a64e1 [IOTDB-4190] update documents about nifi-iotdb-bundle (#7140)
ff3b4a64e1 is described below

commit ff3b4a64e11d58db90a015be4428988cdb4fbe8a
Author: Xuan Ronaldo <xu...@qq.com>
AuthorDate: Wed Aug 31 15:35:33 2022 +0800

    [IOTDB-4190] update documents about nifi-iotdb-bundle (#7140)
---
 .../DBeaver.md                                     |   0
 .../Flink-IoTDB.md}                                |   0
 .../Flink-TsFile.md}                               |   0
 .../Grafana-Connector.md}                          |   0
 .../Grafana-Plugin.md}                             |   0
 .../Hive-TsFile.md}                                |   0
 .../MapReduce-TsFile.md}                           |   0
 docs/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md | 115 +++++++++++++++++++++
 .../Spark-IoTDB.md}                                |   0
 .../Spark-TsFile.md}                               |   0
 .../Writing-Data-on-HDFS.md}                       |   0
 .../Zeppelin-IoTDB.md                              |   0
 .../DBeaver.md                                     |   0
 .../Flink TsFile.md                                |   0
 .../Flink-IoTDB.md}                                |   0
 .../Grafana Plugin.md                              |   0
 .../Grafana-Connector.md}                          |   0
 .../Hive-TsFile.md}                                |   0
 .../MapReduce-TsFile.md}                           |   0
 .../UserGuide/Ecosystem-Integration/NiFi-IoTDB.md  | 115 +++++++++++++++++++++
 .../Spark TsFile.md                                |   0
 .../Spark-IoTDB.md}                                |   0
 .../Writing-Data-on-HDFS.md}                       |   0
 .../Zeppelin-IoTDB.md                              |   0
 site/src/main/.vuepress/config.js                  |  50 ++++-----
 25 files changed, 257 insertions(+), 23 deletions(-)

diff --git a/docs/UserGuide/Ecosystem Integration/DBeaver.md b/docs/UserGuide/Ecosystem-Integration/DBeaver.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/DBeaver.md
rename to docs/UserGuide/Ecosystem-Integration/DBeaver.md
diff --git a/docs/UserGuide/Ecosystem Integration/Flink IoTDB.md b/docs/UserGuide/Ecosystem-Integration/Flink-IoTDB.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Flink IoTDB.md
rename to docs/UserGuide/Ecosystem-Integration/Flink-IoTDB.md
diff --git a/docs/UserGuide/Ecosystem Integration/Flink TsFile.md b/docs/UserGuide/Ecosystem-Integration/Flink-TsFile.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Flink TsFile.md
rename to docs/UserGuide/Ecosystem-Integration/Flink-TsFile.md
diff --git a/docs/UserGuide/Ecosystem Integration/Grafana Connector.md b/docs/UserGuide/Ecosystem-Integration/Grafana-Connector.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Grafana Connector.md
rename to docs/UserGuide/Ecosystem-Integration/Grafana-Connector.md
diff --git a/docs/UserGuide/Ecosystem Integration/Grafana Plugin.md b/docs/UserGuide/Ecosystem-Integration/Grafana-Plugin.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Grafana Plugin.md
rename to docs/UserGuide/Ecosystem-Integration/Grafana-Plugin.md
diff --git a/docs/UserGuide/Ecosystem Integration/Hive TsFile.md b/docs/UserGuide/Ecosystem-Integration/Hive-TsFile.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Hive TsFile.md
rename to docs/UserGuide/Ecosystem-Integration/Hive-TsFile.md
diff --git a/docs/UserGuide/Ecosystem Integration/MapReduce TsFile.md b/docs/UserGuide/Ecosystem-Integration/MapReduce-TsFile.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/MapReduce TsFile.md
rename to docs/UserGuide/Ecosystem-Integration/MapReduce-TsFile.md
diff --git a/docs/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md b/docs/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md
new file mode 100644
index 0000000000..5e0c92bdc6
--- /dev/null
+++ b/docs/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md
@@ -0,0 +1,115 @@
+<!--
+
+    Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+-->
+# nifi-iotdb-bundle
+
+## Apache NiFi Introduction
+
+Apache NiFi is an easy to use, powerful, and reliable system to process and distribute data.
+
+Apache NiFi supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.
+
+Apache NiFi includes the following capabilities:
+
+* Browser-based user interface
+    * Seamless experience for design, control, feedback, and monitoring
+* Data provenance tracking
+    * Complete lineage of information from beginning to end
+* Extensive configuration
+    * Loss-tolerant and guaranteed delivery
+    * Low latency and high throughput
+    * Dynamic prioritization
+    * Runtime modification of flow configuration
+    * Back pressure control
+* Extensible design
+    * Component architecture for custom Processors and Services
+    * Rapid development and iterative testing
+* Secure communication
+    * HTTPS with configurable authentication strategies
+    * Multi-tenant authorization and policy management
+    * Standard protocols for encrypted communication including TLS and SSH
+
+## PutIoTDB
+
+This is a processor that reads the content of the incoming FlowFile as individual records using the configured 'Record Reader' and writes them to Apache IoTDB via the native interface.
+
+### Properties of PutIoTDB
+
+| property      | description                                                  | default value | necessary |
+| ------------- | ------------------------------------------------------------ | ------------- | --------- |
+| Host          | The host of IoTDB.                                           | null          | true      |
+| Port          | The port of IoTDB.                                           | 6667          | true      |
+| Username      | The username to access IoTDB.                                | null          | true      |
+| Password      | The password to access IoTDB.                                | null          | true      |
+| Record Reader | Specifies the type of Record Reader controller service to use <br />for parsing the incoming data and determining the schema. | null          | true      |
+| Schema        | The schema that IoTDB requires is not well supported by NiFi,<br />so you can define the schema here.<br />Besides, you can set the encoding type and compression type this way.<br />If you don't set this property, the inferred schema will be used.<br />It can be updated by expression language. | null          | false     |
+| Aligned       | Whether to use the aligned interface. It can be updated by expression language. | false         | false     |
+| MaxRowNumber  | The maximum number of rows in each tablet. It can be updated by expression language. | 1024          | false     |
+
+### Inferred Schema of Flowfile
+
+There are a few rules the flowfile must follow (a minimal example record appears after the list):
+
+1. The flowfile must be readable by the configured `Record Reader`.
+2. The schema of the flowfile must contain a field `Time`, and it must be the first field.
+3. The data type of `Time` must be `STRING` or `LONG`.
+4. Fields other than `Time` must start with `root.`.
+5. The supported data types are `INT`, `LONG`, `FLOAT`, `DOUBLE`, `BOOLEAN`, and `TEXT`.
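+
+As a minimal sketch (the timeseries names and values below are hypothetical, assuming the `Record Reader` is a JSON reader), a record that satisfies these rules might look like:
+
+```json
+{
+	"Time": 1662015330000,
+	"root.sg.d1.s1": 100,
+	"root.sg.d1.s2": 30.5
+}
+```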
+
+### Convert Schema by property
+
+As mentioned above, defining the schema via this property is more flexible and more powerful than relying on the inferred schema.
+
+The structure of property `Schema`:
+
+```json
+{
+	"timeType": "LONG",
+	"fields": [{
+		"tsName": "root.sg.d1.s1",
+		"dataType": "INT32",
+		"encoding": "RLE",
+		"compressionType": "GZIP"
+	}, {
+		"tsName": "root.sg.d1.s2",
+		"dataType": "INT64",
+		"encoding": "RLE",
+		"compressionType": "GZIP"
+	}]
+}
+```
+
+**Note**
+
+1. The first column must be `Time`, and the remaining columns must be arranged in the same order as in `fields` of the JSON (see the example record after this list).
+2. The schema JSON must contain `timeType` and `fields`.
+3. The only two options for `timeType` are `LONG` and `STRING`.
+4. The columns `tsName` and `dataType` must be set.
+5. The `tsName` must start with `root.`.
+6. The supported values for `dataType` are `INT32`, `INT64`, `FLOAT`, `DOUBLE`, `BOOLEAN`, and `TEXT`.
+7. The supported values for `encoding` are `PLAIN`, `DICTIONARY`, `RLE`, `DIFF`, `TS_2DIFF`, `BITMAP`, `GORILLA_V1`, `REGULAR`, and `GORILLA`.
+8. The supported values for `compressionType` are `UNCOMPRESSED`, `SNAPPY`, `GZIP`, `LZO`, `SDT`, `PAA`, `PLA`, and `LZ4`.
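+
+For the `Schema` shown above (with `timeType` set to `LONG`), a matching record could look like the following sketch; the values are hypothetical:
+
+```json
+{
+	"Time": 1662015330000,
+	"root.sg.d1.s1": 1,
+	"root.sg.d1.s2": 2
+}
+```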
+
+## Relationships
+
+| relationship | description                                          |
+| ------------ | ---------------------------------------------------- |
+| success      | Data is written successfully, or the flowfile is empty. |
+| failure      | The schema or the flowfile is invalid.               |
diff --git a/docs/UserGuide/Ecosystem Integration/Spark IoTDB.md b/docs/UserGuide/Ecosystem-Integration/Spark-IoTDB.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Spark IoTDB.md
rename to docs/UserGuide/Ecosystem-Integration/Spark-IoTDB.md
diff --git a/docs/UserGuide/Ecosystem Integration/Spark TsFile.md b/docs/UserGuide/Ecosystem-Integration/Spark-TsFile.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Spark TsFile.md
rename to docs/UserGuide/Ecosystem-Integration/Spark-TsFile.md
diff --git a/docs/UserGuide/Ecosystem Integration/Writing Data on HDFS.md b/docs/UserGuide/Ecosystem-Integration/Writing-Data-on-HDFS.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Writing Data on HDFS.md
rename to docs/UserGuide/Ecosystem-Integration/Writing-Data-on-HDFS.md
diff --git a/docs/UserGuide/Ecosystem Integration/Zeppelin-IoTDB.md b/docs/UserGuide/Ecosystem-Integration/Zeppelin-IoTDB.md
similarity index 100%
rename from docs/UserGuide/Ecosystem Integration/Zeppelin-IoTDB.md
rename to docs/UserGuide/Ecosystem-Integration/Zeppelin-IoTDB.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/DBeaver.md b/docs/zh/UserGuide/Ecosystem-Integration/DBeaver.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/DBeaver.md
rename to docs/zh/UserGuide/Ecosystem-Integration/DBeaver.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Flink TsFile.md b/docs/zh/UserGuide/Ecosystem-Integration/Flink TsFile.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Flink TsFile.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Flink TsFile.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Flink IoTDB.md b/docs/zh/UserGuide/Ecosystem-Integration/Flink-IoTDB.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Flink IoTDB.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Flink-IoTDB.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Grafana Plugin.md b/docs/zh/UserGuide/Ecosystem-Integration/Grafana Plugin.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Grafana Plugin.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Grafana Plugin.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Grafana Connector.md b/docs/zh/UserGuide/Ecosystem-Integration/Grafana-Connector.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Grafana Connector.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Grafana-Connector.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Hive TsFile.md b/docs/zh/UserGuide/Ecosystem-Integration/Hive-TsFile.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Hive TsFile.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Hive-TsFile.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/MapReduce TsFile.md b/docs/zh/UserGuide/Ecosystem-Integration/MapReduce-TsFile.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/MapReduce TsFile.md
rename to docs/zh/UserGuide/Ecosystem-Integration/MapReduce-TsFile.md
diff --git a/docs/zh/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md b/docs/zh/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md
new file mode 100644
index 0000000000..372642e65a
--- /dev/null
+++ b/docs/zh/UserGuide/Ecosystem-Integration/NiFi-IoTDB.md
@@ -0,0 +1,115 @@
+<!--
+
+    Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+-->
+# nifi-iotdb-bundle
+
+## Apache NiFi Introduction
+
+Apache NiFi is an easy-to-use, powerful, and reliable system for processing and distributing data.
+
+Apache NiFi supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.
+
+Apache NiFi includes the following capabilities:
+
+* Browser-based user interface
+    * Seamless experience for design, control, feedback, and monitoring
+* Data provenance tracking
+    * Complete lineage of information from beginning to end
+* Extensive configuration
+    * Loss tolerance and guaranteed delivery
+    * Low latency and high throughput
+    * Dynamic prioritization
+    * Runtime modification of flow configuration
+    * Back pressure control
+* Extensible design
+    * Component architecture for custom processors and services
+    * Rapid development and iterative testing
+* Secure communication
+    * HTTPS with configurable authentication strategies
+    * Multi-tenant authorization and policy management
+    * Standard protocols for encrypted communication, including TLS and SSH
+
+## PutIoTDB
+
+This is a processor for writing data. It reads the content of the incoming FlowFile as individual records using the configured Record Reader and writes them to Apache IoTDB via the native interface.
+
+### Properties of PutIoTDB
+
+| property      | description                                                  | default value | required |
+| ------------- | ------------------------------------------------------------ | ------ | -------- |
+| Host          | The hostname of IoTDB.                                       | null   | true     |
+| Port          | The port of IoTDB.                                           | 6667   | true     |
+| Username      | The username of IoTDB.                                       | null   | true     |
+| Password      | The password of IoTDB.                                       | null   | true     |
+| Record Reader | Specifies a Record Reader controller service for parsing the incoming data and inferring its format. | null   | true     |
+| Schema        | The schema that IoTDB requires is not well supported by NiFi, so you can define the schema here.<br />You can also set the encoding and compression types this way. If this property is not set, the schema inferred by the Record Reader is used.<br />This property can be updated by expression language. | null   | false    |
+| Aligned       | Whether to use the aligned interface.<br />This property can be updated by expression language. | false  | false    |
+| MaxRowNumber  | The maximum number of rows in each tablet.<br />This property can be updated by expression language. | 1024   | false    |
+
+### Inferred Schema of the Flowfile
+
+To use the inferred schema, note the following:
+
+1. The incoming flowfile must be readable by the `Record Reader`.
+2. The schema of the flowfile must contain a `Time` column, and it must be the first column.
+3. The data type of `Time` must be `STRING` or `LONG`.
+4. Columns other than `Time` must start with `root.`.
+5. The supported data types are `INT`, `LONG`, `FLOAT`, `DOUBLE`, `BOOLEAN`, and `TEXT`.
+
+### Customizing the Schema via the Property
+
+As mentioned above, defining the schema via the property is more flexible and more powerful than using the inferred schema.
+
+The structure of the `Schema` property is as follows:
+
+```json
+{
+   "timeType": "LONG",
+   "fields": [{
+      "tsName": "root.sg.d1.s1",
+      "dataType": "INT32",
+      "encoding": "RLE",
+      "compressionType": "GZIP"
+   }, {
+      "tsName": "root.sg.d1.s2",
+      "dataType": "INT64",
+      "encoding": "RLE",
+      "compressionType": "GZIP"
+   }]
+}
+```
+
+**Note**
+
+1. The first column of the flowfile data must be `Time`, and the remaining columns must be arranged in the same order as in `fields`.
+2. The schema JSON must contain both `timeType` and `fields`.
+3. The only two options for `timeType` are `LONG` and `STRING`.
+4. Both `tsName` and `dataType` must be set.
+5. The `tsName` must start with `root.`.
+6. The supported values for `dataType` are `INT32`, `INT64`, `FLOAT`, `DOUBLE`, `BOOLEAN`, and `TEXT`.
+7. The supported values for `encoding` are `PLAIN`, `DICTIONARY`, `RLE`, `DIFF`, `TS_2DIFF`, `BITMAP`, `GORILLA_V1`, `REGULAR`, and `GORILLA`.
+8. The supported values for `compressionType` are `UNCOMPRESSED`, `SNAPPY`, `GZIP`, `LZO`, `SDT`, `PAA`, `PLA`, and `LZ4`.
+
+## Relationships
+
+| relationship | description                              |
+| ------------ | ---------------------------------------- |
+| success      | Data is written successfully.            |
+| failure      | The schema or the data is invalid.       |
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Spark TsFile.md b/docs/zh/UserGuide/Ecosystem-Integration/Spark TsFile.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Spark TsFile.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Spark TsFile.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Spark IoTDB.md b/docs/zh/UserGuide/Ecosystem-Integration/Spark-IoTDB.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Spark IoTDB.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Spark-IoTDB.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Writing Data on HDFS.md b/docs/zh/UserGuide/Ecosystem-Integration/Writing-Data-on-HDFS.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Writing Data on HDFS.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Writing-Data-on-HDFS.md
diff --git a/docs/zh/UserGuide/Ecosystem Integration/Zeppelin-IoTDB.md b/docs/zh/UserGuide/Ecosystem-Integration/Zeppelin-IoTDB.md
similarity index 100%
rename from docs/zh/UserGuide/Ecosystem Integration/Zeppelin-IoTDB.md
rename to docs/zh/UserGuide/Ecosystem-Integration/Zeppelin-IoTDB.md
diff --git a/site/src/main/.vuepress/config.js b/site/src/main/.vuepress/config.js
index b38ee1064f..91a3838001 100644
--- a/site/src/main/.vuepress/config.js
+++ b/site/src/main/.vuepress/config.js
@@ -789,7 +789,8 @@ var config = {
 							['Ecosystem Integration/Hive TsFile','Hive TsFile'],
 							['Ecosystem Integration/Flink IoTDB','Flink IoTDB'],
 							['Ecosystem Integration/Flink TsFile','Flink TsFile'],
-							['Ecosystem Integration/Writing Data on HDFS','Writing Data on HDFS']
+							['Ecosystem Integration/Writing Data on HDFS','Writing Data on HDFS'],
+							['Ecosystem Integration/NiFi-IoTDB','NiFi IoTDB'],
 						]
 					},
 					{
@@ -957,17 +958,18 @@ var config = {
 					{
 						title: 'Ecosystem Integration',
 						children: [
-							['Ecosystem Integration/Grafana Plugin','Grafana Plugin'],
-							['Ecosystem Integration/Grafana Connector','Grafana Connector (Not Recommended)'],
-							['Ecosystem Integration/Zeppelin-IoTDB','Zeppelin-IoTDB'],
-							['Ecosystem Integration/DBeaver','DBeaver-IoTDB'],
-							['Ecosystem Integration/MapReduce TsFile','MapReduce TsFile'],
-							['Ecosystem Integration/Spark TsFile','Spark TsFile'],
-							['Ecosystem Integration/Spark IoTDB','Spark IoTDB'],
-							['Ecosystem Integration/Hive TsFile','Hive TsFile'],
-							['Ecosystem Integration/Flink IoTDB','Flink IoTDB'],
-							['Ecosystem Integration/Flink TsFile','Flink TsFile'],
-							['Ecosystem Integration/Writing Data on HDFS','Writing Data on HDFS']
+							['Ecosystem-Integration/Grafana-Plugin','Grafana-Plugin'],
+							['Ecosystem-Integration/Grafana-Connector','Grafana-Connector (Not Recommended)'],
+							['Ecosystem-Integration/Zeppelin-IoTDB','Zeppelin-IoTDB'],
+							['Ecosystem-Integration/DBeaver','DBeaver-IoTDB'],
+							['Ecosystem-Integration/MapReduce TsFile','MapReduce-TsFile'],
+							['Ecosystem-Integration/Spark-TsFile','Spark-TsFile'],
+							['Ecosystem-Integration/Spark-IoTDB','Spark-IoTDB'],
+							['Ecosystem-Integration/Hive-TsFile','Hive-TsFile'],
+							['Ecosystem-Integration/Flink-IoTDB','Flink-IoTDB'],
+							['Ecosystem-Integration/Flink-TsFile','Flink-TsFile'],
+							['Ecosystem-Integration/Writing-Data-on-HDFS','Writing-Data-on-HDFS'],
+							['Ecosystem-Integration/NiFi-IoTDB','NiFi-IoTDB'],
 						]
 					},
 					{
@@ -1715,6 +1717,7 @@ var config = {
 							['Ecosystem Integration/Flink TsFile','Flink-TsFile'],
 							['Ecosystem Integration/Flink IoTDB','Flink-IoTDB'],
 							['Ecosystem Integration/Writing Data on HDFS','HDFS集成'],
+							['Ecosystem Integration/NiFi-IoTDB','NiFi IoTDB'],
 						]
 					},
 					{
@@ -1882,17 +1885,18 @@ var config = {
 					{
 						title: '系统集成',
 						children: [
-							['Ecosystem Integration/Grafana Plugin','Grafana Plugin'],
-							['Ecosystem Integration/Grafana Connector','Grafana Connector(不推荐)'],
-							['Ecosystem Integration/Zeppelin-IoTDB','Zeppelin-IoTDB'],
-							['Ecosystem Integration/DBeaver','DBeaver-IoTDB'],
-							['Ecosystem Integration/Spark TsFile','Spark TsFile'],
-							['Ecosystem Integration/MapReduce TsFile','Hadoop-TsFile'],
-							['Ecosystem Integration/Spark IoTDB','Spark-IoTDB'],
-							['Ecosystem Integration/Hive TsFile','Hive-TsFile'],
-							['Ecosystem Integration/Flink TsFile','Flink-TsFile'],
-							['Ecosystem Integration/Flink IoTDB','Flink-IoTDB'],
-							['Ecosystem Integration/Writing Data on HDFS','HDFS集成'],
+							['Ecosystem-Integration/Grafana-Plugin','Grafana-Plugin'],
+							['Ecosystem-Integration/Grafana-Connector','Grafana-Connector(不推荐)'],
+							['Ecosystem-Integration/Zeppelin-IoTDB','Zeppelin-IoTDB'],
+							['Ecosystem-Integration/DBeaver','DBeaver-IoTDB'],
+							['Ecosystem-Integration/Spark-TsFile','Spark-TsFile'],
+							['Ecosystem-Integration/MapReduce-TsFile','Hadoop-TsFile'],
+							['Ecosystem-Integration/Spark-IoTDB','Spark-IoTDB'],
+							['Ecosystem-Integration/Hive-TsFile','Hive-TsFile'],
+							['Ecosystem-Integration/Flink-TsFile','Flink-TsFile'],
+							['Ecosystem-Integration/Flink-IoTDB','Flink-IoTDB'],
+							['Ecosystem-Integration/Writing-Data-on-HDFS','HDFS集成'],
+							['Ecosystem-Integration/NiFi-IoTDB','NiFi-IoTDB'],
 						]
 					},
 					{