Posted to commits@linkis.apache.org by ca...@apache.org on 2022/09/07 15:42:43 UTC

[incubator-linkis-website] branch dev updated: update some docs and engine download (#505)

This is an automated email from the ASF dual-hosted git repository.

casion pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-linkis-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new e81f1eda2e update some docs  and engine download (#505)
e81f1eda2e is described below

commit e81f1eda2e354010861017fded3067a1151b7981
Author: Casion <ca...@gmail.com>
AuthorDate: Wed Sep 7 23:42:38 2022 +0800

    update some docs  and engine download (#505)
---
 ...2022-04-15-how-to-download-engineconn-plugin.md | 85 +++++++---------------
 ...2022-04-15-how-to-download-engineconn-plugin.md | 78 ++++++--------------
 .../docusaurus-plugin-content-docs/current.json    |  2 +-
 .../current/architecture/commons/rpc.md            | 18 ++++-
 .../version-0.11.0.json                            |  2 +-
 .../version-1.0.2.json                             |  2 +-
 .../version-1.0.3.json                             |  2 +-
 .../version-1.1.0.json                             |  2 +-
 .../version-1.1.1.json                             |  2 +-
 .../version-1.1.1/architecture/commons/rpc.md      | 18 ++++-
 .../version-1.1.2.json                             |  2 +-
 .../version-1.1.2/architecture/commons/rpc.md      | 18 ++++-
 .../version-1.1.3.json                             |  2 +-
 .../version-1.1.3/architecture/commons/rpc.md      | 18 ++++-
 .../version-1.2.0.json                             |  2 +-
 .../version-1.2.0/architecture/commons/rpc.md      | 18 ++++-
 16 files changed, 127 insertions(+), 144 deletions(-)

diff --git a/blog/2022-04-15-how-to-download-engineconn-plugin.md b/blog/2022-04-15-how-to-download-engineconn-plugin.md
index 5de14d8913..b5a90c04f4 100644
--- a/blog/2022-04-15-how-to-download-engineconn-plugin.md
+++ b/blog/2022-04-15-how-to-download-engineconn-plugin.md
@@ -6,71 +6,37 @@ tags: [engine,guide]
 > _This article guides you through downloading the non-default engine plugin installation packages corresponding to each version._
 
 Considering the size of the release package and typical plugin usage, the binary installation package released by Linkis only contains the commonly used engines hive/spark/python/shell.
-Very useful engine, there are flink/io_file/pipeline/sqoop in the project code (there may be differences between different versions),
+For less commonly used engines, the project code contains corresponding modules `flink/io_file/pipeline/sqoop` (these may differ between versions),
 To make them easier to use, these engines are compiled from the release branch code of each Linkis version (https://github.com/apache/incubator-linkis) for you to choose from.
 
-## 1.1.2 版本
+## Download link
+| **linkis version** | **engines included** |**engine material package download link** |
+|:---- |:---- |:---- |
+|1.2.0|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>sqoop<br/>presto<br/>elasticsearch<br/>|[1.2.0-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.2.0-engineconn-plugin.tar)|
+|1.1.3|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>sqoop|[1.1.3-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.3-engineconn-plugin.tar)|
+|1.1.2|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>sqoop|[1.1.2-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.2-engineconn-plugin.tar)|
+|1.1.1|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>|[1.1.1-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.1-engineconn-plugin.tar)|
+|1.1.0|jdbc<br/>pipeline<br/>flink<br/>|[1.1.0-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.0-engineconn-plugin.tar)|
+|1.0.3|jdbc<br/>pipeline<br/>flink<br/>|[1.0.3-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.0.3-engineconn-plugin.tar)|
 
-| **Engine** | **Corresponding component version** | Is there default in the official installation package | **Description** |
-|:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|Yes|Spark EngineConn. Support SQL, Scala, Pyspark and R code|
-|Hive|2.3.3|Yes|Hive EngineConn. Support HiveQL code.|
-|Shell||Yes|Shell EngineConn. Support Bash shell code.|
-|Python||Yes|Python EngineConn. Support Python code.|
-|JDBC| |No|JDBC EngineConn. MySQL and hiveql have been supported, and other engines with jdbc driver package can be quickly extended, such as Oracle.|
-|Flink |1.12.2|No| Flink EngineConn. It supports FlinkSQL code and launching a new Yarn application in the form of Flink jar.|
-|openLooKeng |1.5.0|No| openLooKeng EngineConn. Support SQL query data virtualization engine openlookeng.|
-|sqoop |1.4.6|No|Sqoop EngineConn. Support data migration tool sqoop engine.|
-
-
-[Non-default engine download link](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.2-engineconn-plugin.tar)
-
-
-## 1.1.1版本
-
-| **Engine** | **Corresponding component version** | Is there default in the official installation package | **Description** |
-|:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|Yes|Spark EngineConn. Support SQL, Scala, Pyspark and R code|
-|Hive|2.3.3|Yes|Hive EngineConn. Support HiveQL code.|
-|Shell||Yes|Shell EngineConn. Support Bash shell code.|
-|Python||Yes|Python EngineConn. Support Python code.|
-|JDBC| |No|JDBC EngineConn. MySQL and hiveql have been supported, and other engines with jdbc driver package can be quickly extended, such as Oracle.|
-|Flink |1.12.2|No| Flink EngineConn. It supports FlinkSQL code and launching a new Yarn application in the form of Flink jar.|
-|openLooKeng |1.5.0|No| openLooKeng EngineConn. Support SQL query data virtualization engine openlookeng.|
-
-[Non-default engine download link](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.1-engineconn-plugin.tar)
-
-## Version 1.1.0
+## Engine type
 
-| **Engine** | **Corresponding component version** | Is there default in the official installation package | **Description** |
+| **Engine name** | **Supported component versions<br/>(default dependency version)** | **Required Linkis 1.X version** | **Included in the release package by default** | **Description** |
 |:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|Yes|Spark EngineConn. Supports SQL, Scala, Pyspark and R code.|
-|Hive|2.3.3|is 1|Hive EngineConn. Supports HiveQL code.|
-|Shell||Yes |Shell EngineConn. Supports Bash shell code.|
-|Python||Yes |Python EngineConn. Supports python code.|
-|JDBC| |No|JDBC EngineConn. Already supports MySQL and HiveQL, and can be quickly extended to support other engines with JDBC Driver packages, such as Oracle.|
-|Flink |1.12.2|No | Flink EngineConn. Supports FlinkSQL code, and also supports launching a new Yarn application in the form of Flink Jar. |
+|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 2.4.3)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, PySpark and R code|
+|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 2.3.3)|\>=1.0.3|Yes|Hive EngineConn, supports HiveQL code|
+|Python|Python >= 2.6, <br/>(default Python2*)|\>=1.0.3|Yes|Python EngineConn, supports Python code|
+|Shell|Bash >= 2.0|\>=1.0.3|Yes|Shell EngineConn, supports Bash shell code|
+|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(default Hive-jdbc 2.3.4)|\>=1.0.3|No|JDBC EngineConn, already supports MySQL and HiveQL, and can be quickly extended to support other engines with a JDBC driver package, such as Oracle|
+|Flink |Flink >= 1.12.2, <br/>(default Apache Flink 1.12.2)|\>=1.0.2|No|Flink EngineConn, supports FlinkSQL code, and also supports launching a new Yarn application in the form of a Flink jar|
+|Pipeline|-|\>=1.0.2|No|Pipeline EngineConn, supports file import and export|
+|openLooKeng|openLooKeng >= 1.5.0, <br/>(default openLooKeng 1.5.0)|\>=1.1.1|No|openLooKeng EngineConn, supports querying the data virtualization engine openLooKeng with SQL|
+|Sqoop| Sqoop >= 1.4.6, <br/>(default Apache Sqoop 1.4.6)|\>=1.1.2|No|Sqoop EngineConn, supports the data migration tool Sqoop|
+|Presto|Presto >= 0.180|\>=1.2.0|No|Presto EngineConn, supports Presto SQL code|
+|ElasticSearch|ElasticSearch >= 6.0|\>=1.2.0|No|ElasticSearch EngineConn, supports SQL and DSL code|
 
-[Non-default engine download link](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.0.1-engineconn-plugin.tar)
 
-
-
-## Version 1.0.3
-
-| **Engine** | **Corresponding component version** | Is there default in the official installation package | **Description** |
-|:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|Yes|Spark EngineConn. Supports SQL, Scala, Pyspark and R code.|
-|Hive|2.3.3|is 1|Hive EngineConn. Supports HiveQL code.|
-|Shell||Yes |Shell EngineConn. Supports Bash shell code.|
-|Python||Yes |Python EngineConn. Supports python code.|
-|JDBC| |No|JDBC EngineConn. Already supports MySQL and HiveQL, and can be quickly extended to support other engines with JDBC Driver packages, such as Oracle.|
-|Flink |1.12.2|No | Flink EngineConn. Supports FlinkSQL code, and also supports launching a new Yarn application in the form of Flink Jar. |
-
-[Non-default engine download link](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.0.3-engineconn-plugin.tar)
-
-
-
-## Install Engine Guide
+## Install engine guide
 
 After downloading the engine material package, unzip it:
 ```shell
@@ -81,5 +47,4 @@ cd 1.0.3-engineconn-plugin
 
 Copy the engine material package you want to use into the Linkis engine plugin directory, then refresh the engine material.
 
-
-Detailed process reference[Installing EngineConnPlugin engine](https://linkis.apache.org/zh-CN/docs/latest/deployment/engine-conn-plugin-installation).
\ No newline at end of file
+For the detailed process, refer to [Installing the EngineConnPlugin Engine](https://linkis.apache.org/zh-CN/docs/latest/deployment/engine-conn-plugin-installation).
\ No newline at end of file
diff --git a/i18n/zh-CN/docusaurus-plugin-content-blog/2022-04-15-how-to-download-engineconn-plugin.md b/i18n/zh-CN/docusaurus-plugin-content-blog/2022-04-15-how-to-download-engineconn-plugin.md
index a575d66265..bb4e36b460 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-blog/2022-04-15-how-to-download-engineconn-plugin.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-blog/2022-04-15-how-to-download-engineconn-plugin.md
@@ -6,66 +6,34 @@ tags: [engine,guide]
 > _本文主要指引大家如何下载每个版本对应的非默认引擎安装插件包。_
 
 考虑到发布包的大小和大家使用插件的情况,linkis发布的二进制安装包中只包含了部分常用引擎/hive/spark/python/shell,
-非常用引擎,项目代码中有flink/io_file/pipeline/sqoop(不同版本之间可能有区别),
+非常用引擎,项目代码中有对应的模块`flink/io_file/pipeline/sqoop`(不同版本之间可能有区别),
 为了方便大家使用,基于linkis每个版本的release分支代码: https://github.com/apache/incubator-linkis, 编译出这部分引擎,供大家选择使用。
 
-## 1.1.2 版本
-
-| **引擎** | **对应的组件版本** | 官方安装包中是否默认有| **说明** |
-|:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|是|Spark EngineConn。 支持SQL, Scala, Pyspark 和R 代码。|
-|Hive|2.3.3|是|Hive EngineConn。 支持HiveQL 代码。|
-|Shell||是|Shell EngineConn。 支持Bash shell 代码。|
-|Python||是|Python EngineConn。 支持python 代码。|
-|JDBC| |否|JDBC EngineConn。 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle。|
-|Flink |1.12.2|否| Flink EngineConn。支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序。|
-|openLooKeng |1.5.0|否| openLooKeng EngineConn。 支持用Sql查询数据虚拟化引擎openLooKeng。|
-|sqoop |1.4.6|否|Sqoop EngineConn。 支持 数据迁移工具 Sqoop 引擎。|
-
-
-[非默认引擎下载链接](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.2-engineconn-plugin.tar)
+## 下载链接
+| **linkis版本** |  **包含的引擎** |**引擎物料包下载链接** |
+|:---- |:---- |:---- |
+|1.2.0|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>sqoop<br/>presto<br/>elasticsearch<br/>|[1.2.0-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.2.0-engineconn-plugin.tar)|
+|1.1.3|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>sqoop|[1.1.3-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.3-engineconn-plugin.tar)|
+|1.1.2|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>sqoop|[1.1.2-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.2-engineconn-plugin.tar)|
+|1.1.1|jdbc<br/>pipeline<br/>flink<br/>openlookeng<br/>|[1.1.1-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.1-engineconn-plugin.tar)|
+|1.1.0|jdbc<br/>pipeline<br/>flink<br/>|[1.1.0-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.0-engineconn-plugin.tar)|
+|1.0.3|jdbc<br/>pipeline<br/>flink<br/>|[1.0.3-engineconn-plugin.tar](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.0.3-engineconn-plugin.tar)|
 
+## 引擎类型 
 
-## 1.1.1版本
-
-| **引擎** | **对应的组件版本** | 官方安装包中是否默认有| **说明** |
+| **引擎名** | **支持底层组件版本<br/>(默认依赖版本)** | **Linkis 1.X 版本要求** | **是否默认包含在发布包中** | **说明** |
 |:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|是|Spark EngineConn。 支持SQL, Scala, Pyspark 和R 代码。|
-|Hive|2.3.3|是|Hive EngineConn。 支持HiveQL 代码。|
-|Shell||是|Shell EngineConn。 支持Bash shell 代码。|
-|Python||是|Python EngineConn。 支持python 代码。|
-|JDBC| |否|JDBC EngineConn。 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle。|
-|Flink |1.12.2|否| Flink EngineConn。支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序。|
-|openLooKeng |1.5.0|否| openLooKeng EngineConn。 支持用Sql查询数据虚拟化引擎openLooKeng。|
-
-[非默认引擎下载链接](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.1.1-engineconn-plugin.tar)
-
-## 1.1.0 版本
-
-| **引擎** | **对应的组件版本** | 官方安装包中是否默认有| **说明** |
-|:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|是|Spark EngineConn。 支持SQL, Scala, Pyspark 和R 代码。|
-|Hive|2.3.3|是|Hive EngineConn。 支持HiveQL 代码。|
-|Shell||是|Shell EngineConn。 支持Bash shell 代码。|
-|Python||是|Python EngineConn。 支持python 代码。|
-|JDBC| |否|JDBC EngineConn。 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle。|
-|Flink |1.12.2|否|	Flink EngineConn。支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序。|
-
-[非默认引擎下载链接](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.0.1-engineconn-plugin.tar)
-
-
-## 1.0.3版本
-
-| **引擎** | **对应的组件版本** | 官方安装包中是否默认有| **说明** |
-|:---- |:---- |:---- |:---- |:---- |
-|Spark|2.4.3|是|Spark EngineConn。 支持SQL, Scala, Pyspark 和R 代码。|
-|Hive|2.3.3|是|Hive EngineConn。 支持HiveQL 代码。|
-|Shell||是|Shell EngineConn。 支持Bash shell 代码。|
-|Python||是|Python EngineConn。 支持python 代码。|
-|JDBC| |否|JDBC EngineConn。 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle。|
-|Flink |1.12.2|否|	Flink EngineConn。支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序。|
-
-[非默认引擎下载链接](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Linkis/engineconn-plugin/1.0.3-engineconn-plugin.tar)
+|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(默认Apache Spark 2.4.3)|\>=1.0.3|是|Spark EngineConn, 支持SQL, Scala, Pyspark 和R 代码|
+|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(默认Apache Hive 2.3.3)|\>=1.0.3|是|Hive EngineConn, 支持HiveQL 代码|
+|Python|Python >= 2.6, <br/>(默认Python2*)|\>=1.0.3|是|Python EngineConn, 支持python 代码|
+|Shell|Bash >= 2.0|\>=1.0.3|是|Shell EngineConn, 支持Bash shell 代码|
+|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(默认Hive-jdbc 2.3.4)|\>=1.0.3|否|JDBC EngineConn, 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle|
+|Flink |Flink >= 1.12.2, <br/>(默认Apache Flink 1.12.2)|\>=1.0.2|否|Flink EngineConn, 支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序|
+|Pipeline|-|\>=1.0.2|否|Pipeline EngineConn, 支持文件的导入和导出|
+|openLooKeng|openLooKeng >= 1.5.0, <br/>(默认openLookEng 1.5.0)|\>=1.1.1|否|openLooKeng EngineConn, 支持用Sql查询数据虚拟化引擎openLooKeng|
+|Sqoop| Sqoop >= 1.4.6, <br/>(默认Apache Sqoop 1.4.6)|\>=1.1.2|否|Sqoop EngineConn, 支持 数据迁移工具 Sqoop 引擎|
+|Presto|Presto >= 0.180|\>=1.2.0|否|Presto EngineConn, 支持Presto SQL 代码|
+|ElasticSearch|ElasticSearch >=6.0|\>=1.2.0|否|ElasticSearch EngineConn, 支持SQL 和DSL 代码|
 
 
 ## 安装引擎指引 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current.json b/i18n/zh-CN/docusaurus-plugin-content-docs/current.json
index ae5727e60b..21b1d771e8 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current.json
@@ -71,7 +71,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/commons/rpc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/commons/rpc.md
index 2667efce1f..39f27da59d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/commons/rpc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/commons/rpc.md
@@ -64,27 +64,37 @@ abstract class Sender {
 - Sender发送器会将调用者的请求传递给拦截器,进行一系列发送前的逻辑处理。
 
 拦截器拦截请求,开始对请求做额外的功能性处理:
-**广播拦截器**
 
+**广播拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.BroadcastRPCInterceptor
-
+```
 广播拦截器只对需要进行广播的请求生效。
 广播拦截器会提供特殊的广播接口,如果本次请求实现了该广播接口,且该请求不是正在广播中,广播拦截器则认为本次请求需要进行广播,这时会触发广播操作。
+
 **重试拦截器**
 
+```
 org.apache.linkis.rpc.interceptor.common.RetryableRPCInterceptor
+```
 
 重试拦截器会对接下来的所有步骤提供重试功能。
 如果接收端要求重试,或者发送请求时出现了ConnectException(连接异常),或者调用者指定某些异常需要重试,这时重试拦截器会自动进行重试。
-**缓存拦截器**
 
+
+**缓存拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CacheableRPCInterceptor
+```
 
 缓存拦截器是针对一些响应内容不大可能经常变动的同步请求而设定的。
 缓存拦截器也会提供特殊的缓存接口,如果本次请求实现了缓存接口,会首先在缓存拦截器中寻找缓存,不存在缓存才会继续请求,并在拿到响应后,先将响应缓存起来,再将响应返回。
-**公共默认拦截器**
 
+**公共默认拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CommonRPCInterceptor
+```
+
 
 公共默认拦截器用于调用接下来的处理步骤(示例参考:org.apache.linkis.rpc.BaseRPCSender#ask) 
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0.json
index af605ae311..365fe557d9 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0.json
@@ -62,7 +62,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2.json
index 765728340b..eef3e0ee6c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.2.json
@@ -62,7 +62,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3.json
index b0bf7b50b3..140a28ee87 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.0.3.json
@@ -66,7 +66,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0.json
index 90b5c3fd94..52b2e4dc49 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0.json
@@ -66,7 +66,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1.json
index 5f90d36d8f..484cf74de9 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1.json
@@ -71,7 +71,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/commons/rpc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/commons/rpc.md
index 2667efce1f..39f27da59d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/commons/rpc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.1/architecture/commons/rpc.md
@@ -64,27 +64,37 @@ abstract class Sender {
 - Sender发送器会将调用者的请求传递给拦截器,进行一系列发送前的逻辑处理。
 
 拦截器拦截请求,开始对请求做额外的功能性处理:
-**广播拦截器**
 
+**广播拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.BroadcastRPCInterceptor
-
+```
 广播拦截器只对需要进行广播的请求生效。
 广播拦截器会提供特殊的广播接口,如果本次请求实现了该广播接口,且该请求不是正在广播中,广播拦截器则认为本次请求需要进行广播,这时会触发广播操作。
+
 **重试拦截器**
 
+```
 org.apache.linkis.rpc.interceptor.common.RetryableRPCInterceptor
+```
 
 重试拦截器会对接下来的所有步骤提供重试功能。
 如果接收端要求重试,或者发送请求时出现了ConnectException(连接异常),或者调用者指定某些异常需要重试,这时重试拦截器会自动进行重试。
-**缓存拦截器**
 
+
+**缓存拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CacheableRPCInterceptor
+```
 
 缓存拦截器是针对一些响应内容不大可能经常变动的同步请求而设定的。
 缓存拦截器也会提供特殊的缓存接口,如果本次请求实现了缓存接口,会首先在缓存拦截器中寻找缓存,不存在缓存才会继续请求,并在拿到响应后,先将响应缓存起来,再将响应返回。
-**公共默认拦截器**
 
+**公共默认拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CommonRPCInterceptor
+```
+
 
 公共默认拦截器用于调用接下来的处理步骤(示例参考:org.apache.linkis.rpc.BaseRPCSender#ask) 
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2.json
index 1f2e24b0fe..7fb56ddd43 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2.json
@@ -71,7 +71,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2/architecture/commons/rpc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2/architecture/commons/rpc.md
index 2667efce1f..39f27da59d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2/architecture/commons/rpc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.2/architecture/commons/rpc.md
@@ -64,27 +64,37 @@ abstract class Sender {
 - Sender发送器会将调用者的请求传递给拦截器,进行一系列发送前的逻辑处理。
 
 拦截器拦截请求,开始对请求做额外的功能性处理:
-**广播拦截器**
 
+**广播拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.BroadcastRPCInterceptor
-
+```
 广播拦截器只对需要进行广播的请求生效。
 广播拦截器会提供特殊的广播接口,如果本次请求实现了该广播接口,且该请求不是正在广播中,广播拦截器则认为本次请求需要进行广播,这时会触发广播操作。
+
 **重试拦截器**
 
+```
 org.apache.linkis.rpc.interceptor.common.RetryableRPCInterceptor
+```
 
 重试拦截器会对接下来的所有步骤提供重试功能。
 如果接收端要求重试,或者发送请求时出现了ConnectException(连接异常),或者调用者指定某些异常需要重试,这时重试拦截器会自动进行重试。
-**缓存拦截器**
 
+
+**缓存拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CacheableRPCInterceptor
+```
 
 缓存拦截器是针对一些响应内容不大可能经常变动的同步请求而设定的。
 缓存拦截器也会提供特殊的缓存接口,如果本次请求实现了缓存接口,会首先在缓存拦截器中寻找缓存,不存在缓存才会继续请求,并在拿到响应后,先将响应缓存起来,再将响应返回。
-**公共默认拦截器**
 
+**公共默认拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CommonRPCInterceptor
+```
+
 
 公共默认拦截器用于调用接下来的处理步骤(示例参考:org.apache.linkis.rpc.BaseRPCSender#ask) 
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3.json
index 2db46bef8b..a02a396e5b 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3.json
@@ -71,7 +71,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/architecture/commons/rpc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/architecture/commons/rpc.md
index 2667efce1f..39f27da59d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/architecture/commons/rpc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/architecture/commons/rpc.md
@@ -64,27 +64,37 @@ abstract class Sender {
 - Sender发送器会将调用者的请求传递给拦截器,进行一系列发送前的逻辑处理。
 
 拦截器拦截请求,开始对请求做额外的功能性处理:
-**广播拦截器**
 
+**广播拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.BroadcastRPCInterceptor
-
+```
 广播拦截器只对需要进行广播的请求生效。
 广播拦截器会提供特殊的广播接口,如果本次请求实现了该广播接口,且该请求不是正在广播中,广播拦截器则认为本次请求需要进行广播,这时会触发广播操作。
+
 **重试拦截器**
 
+```
 org.apache.linkis.rpc.interceptor.common.RetryableRPCInterceptor
+```
 
 重试拦截器会对接下来的所有步骤提供重试功能。
 如果接收端要求重试,或者发送请求时出现了ConnectException(连接异常),或者调用者指定某些异常需要重试,这时重试拦截器会自动进行重试。
-**缓存拦截器**
 
+
+**缓存拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CacheableRPCInterceptor
+```
 
 缓存拦截器是针对一些响应内容不大可能经常变动的同步请求而设定的。
 缓存拦截器也会提供特殊的缓存接口,如果本次请求实现了缓存接口,会首先在缓存拦截器中寻找缓存,不存在缓存才会继续请求,并在拿到响应后,先将响应缓存起来,再将响应返回。
-**公共默认拦截器**
 
+**公共默认拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CommonRPCInterceptor
+```
+
 
 公共默认拦截器用于调用接下来的处理步骤(示例参考:org.apache.linkis.rpc.BaseRPCSender#ask) 
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0.json
index b42bd392d3..c22862de62 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0.json
@@ -71,7 +71,7 @@
     "description": "The label for category Upgrade Guide in sidebar tutorialSidebar"
   },
 
-  "sidebar.tutorialSidebar.category.Development Doc": {
+  "sidebar.tutorialSidebar.category.Development": {
     "message": "开发文档",
     "description": "The label for category Development Doc in sidebar tutorialSidebar"
   },
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/architecture/commons/rpc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/architecture/commons/rpc.md
index 2667efce1f..39f27da59d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/architecture/commons/rpc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/architecture/commons/rpc.md
@@ -64,27 +64,37 @@ abstract class Sender {
 - Sender发送器会将调用者的请求传递给拦截器,进行一系列发送前的逻辑处理。
 
 拦截器拦截请求,开始对请求做额外的功能性处理:
-**广播拦截器**
 
+**广播拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.BroadcastRPCInterceptor
-
+```
 广播拦截器只对需要进行广播的请求生效。
 广播拦截器会提供特殊的广播接口,如果本次请求实现了该广播接口,且该请求不是正在广播中,广播拦截器则认为本次请求需要进行广播,这时会触发广播操作。
+
 **重试拦截器**
 
+```
 org.apache.linkis.rpc.interceptor.common.RetryableRPCInterceptor
+```
 
 重试拦截器会对接下来的所有步骤提供重试功能。
 如果接收端要求重试,或者发送请求时出现了ConnectException(连接异常),或者调用者指定某些异常需要重试,这时重试拦截器会自动进行重试。
-**缓存拦截器**
 
+
+**缓存拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CacheableRPCInterceptor
+```
 
 缓存拦截器是针对一些响应内容不大可能经常变动的同步请求而设定的。
 缓存拦截器也会提供特殊的缓存接口,如果本次请求实现了缓存接口,会首先在缓存拦截器中寻找缓存,不存在缓存才会继续请求,并在拿到响应后,先将响应缓存起来,再将响应返回。
-**公共默认拦截器**
 
+**公共默认拦截器**
+```
 org.apache.linkis.rpc.interceptor.common.CommonRPCInterceptor
+```
+
 
 公共默认拦截器用于调用接下来的处理步骤(示例参考:org.apache.linkis.rpc.BaseRPCSender#ask) 
 

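For reference, the install flow described in the updated blog post (unzip the material package, copy the engine into the Linkis plugin directory, refresh the engine material) can be sketched in shell. The `LINKIS_HOME` layout and the `lib/linkis-engineconn-plugins` path below are illustrative assumptions, not an exact Linkis deployment, and a mock package stands in for the downloaded tar so the steps can be followed end to end.

```shell
set -e

# Stand-in for the downloaded material package (a real run would fetch
# e.g. 1.0.3-engineconn-plugin.tar from the download-link table instead).
mkdir -p staging/1.0.3-engineconn-plugin/flink
tar -C staging -cf 1.0.3-engineconn-plugin.tar 1.0.3-engineconn-plugin
rm -rf staging

# 1. Unzip the engine material package.
tar -xf 1.0.3-engineconn-plugin.tar

# 2. Copy the engine material to be used into the Linkis engine plugin
#    directory (LINKIS_HOME and the plugin path are assumptions here).
LINKIS_HOME=./linkis-home
mkdir -p "$LINKIS_HOME/lib/linkis-engineconn-plugins"
cp -r 1.0.3-engineconn-plugin/flink "$LINKIS_HOME/lib/linkis-engineconn-plugins/"

# 3. Refresh the engine material (via the management console or by
#    restarting the engineplugin service) so Linkis registers the engine.
ls "$LINKIS_HOME/lib/linkis-engineconn-plugins"
```

The copy step is per engine: only the engines you actually need are taken from the unpacked package, which is why the material packages ship multiple engines in one tar.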

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@linkis.apache.org
For additional commands, e-mail: commits-help@linkis.apache.org