Posted to commits@linkis.apache.org by pe...@apache.org on 2022/06/30 08:51:23 UTC

[incubator-linkis] branch dev-1.1.3 updated: Update README.md and CONTRIBUTING.md (#2383)

This is an automated email from the ASF dual-hosted git repository.

peacewong pushed a commit to branch dev-1.1.3
in repository https://gitbox.apache.org/repos/asf/incubator-linkis.git


The following commit(s) were added to refs/heads/dev-1.1.3 by this push:
     new 6a2379474 Update README.md and CONTRIBUTING.md (#2383)
6a2379474 is described below

commit 6a2379474cda7ae4bba6a70224241e30accee280
Author: Casion <ca...@gmail.com>
AuthorDate: Thu Jun 30 16:51:17 2022 +0800

    Update README.md and CONTRIBUTING.md (#2383)
    
    * update README.md and CONTRIBUTING.md
---
 CONTRIBUTING.md    | 126 ++++++++++++++++++++++++++++++-----------------------
 CONTRIBUTING_CN.md | 119 ++++++++++++++++++++++++++++----------------------
 README.md          |  97 +++++++++++++++++++++++------------------
 README_CN.md       |  94 ++++++++++++++++++++++-----------------
 4 files changed, 248 insertions(+), 188 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index fe847b177..40d013097 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,10 +1,6 @@
-# Contributing
+# How To Contribute
 
-| **Version Management Information Form** | |
-| ----------- | --------------------------------- |
-| Current version | Version 1.2|
-| Current version release date | December 17, 2021 |
-| Revision information | 1. Due to the transfer of the git repository to apache and the migration of Linkis-Doc documents to the linkis official website, corresponding links were modified |
+> For more information, see the official website [How to contribute to the project](https://linkis.apache.org/community/how-to-contribute)
 
 Thank you very much for contributing to the Linkis project! Before participating in the contribution, please read the following guidelines carefully.
 
@@ -29,88 +25,108 @@ You can find linkis documentations at [linkis-Website](https://linkis.apache.org
 ### 1.5 Other
 Including participating in and helping to organize community exchanges, community operation activities, etc., and other activities that can help the Linkis project and the community.
 
----
-
 ## 2. How to Contribution
 
 ### 2.1 Branch structure
 
-There are many branches,including temporary branches,in Linkis repository,but only three of them matter:
+The Linkis repository may contain some temporary branches, but only the following three branches are truly meaningful:
+- master: The source code of the latest stable release, and occasionally several hotfix submissions;
+- release-*: stable release version;
+- dev-*: main development branch;
 
-- master: The source code of the last stable release, and occasionally hotfix submissions;
-- release-*: stable release version;*
-- *dev-*: main development branch;
-- feature-*: Development branches for some larger new features that need to be jointly developed by the community
+#### 2.1.1 Concept
 
-Please note: The dev branch of major features will be named with corresponding naming instructions in addition to the version number, such as: dev-0.10.0-flink, which refers to the flink feature development branch of 0.10.0.
+- Upstream repository: https://github.com/apache/incubator-linkis, the Apache repository of Linkis, referred to as the Upstream repository in this document
+- Fork repository: the repository forked from https://github.com/apache/incubator-linkis into your own personal account, referred to as the Fork repository
 
-### 2.2 Development Guidelines
+#### 2.1.2 Synchronize Upstream Repository
+> Synchronize the latest code of the Upstream repository branch to your own Fork repository
 
-Linkis front-end and back-end code share the same code base, but they are separated in development. Before starting the development, please fork the Linkis project to your Github Repositories. When developing, please develop based on the Linkis code base in your Github Repositories.
+- Step1 Go to your project page and select the branch to be updated
+- Step2 Click Fetch upstream below the Code download button and select Fetch and merge (if a branch of your Fork repository has been accidentally polluted, you can delete it and synchronize the corresponding branch of the Upstream repository back to your Fork repository; see the guide [2.1.3 Synchronize New Branch](#213-synchronize-new-branch))
+![update-code](https://user-images.githubusercontent.com/7869972/176622158-52da5a80-6d6a-4f06-a099-ff65887d002c.png)
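+
+The same synchronization can be done from the command line (a sketch; it assumes an `apache` remote pointing at the Upstream repository, added as described in 2.1.3, and that `origin` points at your Fork repository):
+
+```shell script
+# fetch the latest commits from the Upstream repository
+git fetch apache
+# switch to the branch to be updated, e.g. dev-1.1.3
+git checkout dev-1.1.3
+# merge the Upstream branch and push the result to your Fork repository
+git merge apache/dev-1.1.3
+git push origin dev-1.1.3
+```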
 
-We recommend cloning the dev-* branch for development, so that the possibility of merge conflicts when submitting a PR to the main Linkis project will be much smaller
+#### 2.1.3 Synchronize New Branch
+> Synchronize the new branch of the Upstream repository to your own Fork repository
 
-```bash
-git clone https://github.com/yourname/incubator-linkis.git --branch dev-*
-```
Scenario: There is a new branch in the Upstream repository, but the forked repository does not have it (you could delete your fork and re-fork, but any changes not yet merged into the original repository would be lost)
 
-#### 2.2.1 Backend
Perform the following operations in your local clone of the project
 
-The user configuration is in the project root directory /config/, the project startup script and the upgrade patch script are in the project root directory /bin/, the back-end code and core configuration are in the server/ directory, and the log is in the project root directory /log/. Note: The project root directory referred to here refers to the directory configured by the environment variable LINKIS_HOME, and environment variables need to be configured during IDE development. For exam [...]
+- Step1 Add the apache Upstream repository as a remote in your local clone
 
-##### 2.2.1.1 Directory structure
+```shell script
+git remote add apache git@github.com:apache/incubator-linkis.git
+```
+- Step2 Fetch the apache remote's information to your local clone
 
-1. Script
+```shell script
+git fetch apache
+```
+- Step3 Create a local branch based on the new branch that needs to be synced
 
+```shell script
+git checkout -b dev-1.1.4 apache/dev-1.1.4
 ```
-├── bin              # script directory
-  ├── install.sh     # One-click deployment script
-  ├── start-all.sh   # One-click start script
-  └── stop-all.sh    # One-click stop script
+- Step4 Push the local branch to your own repository. If your repository does not have the dev-1.1.4 branch, it will be created
+```shell script
+git push origin dev-1.1.4:dev-1.1.4
 ```
-
-2. Configuration
-
+- Step5 Remove the apache remote
+```shell script
+git remote remove apache
 ```
-├── config            # User configuration directory
-  ├── config.sh       # One-click deployment configuration file
-  ├── db.sh           # One-click deployment database configuration
+- Step6 Update the branch
+```shell script
+git pull
 ```
 
-3. Code directory structure
-   
-   For details, see [Linkis Code Directory Structure](https://linkis.apache.org/docs/latest/deployment/sourcecode_hierarchical_structure)
+#### 2.1.4 The Process of a PR
+
+- Step1 Confirm the base branch for the current development (usually the version in progress; for example, if the community is currently developing version 1.1.0, the branch is dev-1.1.0. If you are not sure, ask in the community group or @ the relevant developers in the issue)
 
-4. Log directory
+- Step2 Synchronize the latest code of the Upstream repository branch to your own Fork repository branch; see the guide [2.1.2 Synchronize Upstream Repository]
 
+- Step3 Based on the development branch, create a new fix/feature branch (do not modify the original branch directly; if the subsequent PR is merged with the squash method, the submitted commit records will be merged into one)
+```shell script
+git checkout -b dev-1.1.4-fix dev-1.1.4
+git push origin dev-1.1.4-fix:dev-1.1.4-fix
 ```
-├── logs # log root directory
+- Step4 Develop
+- Step5 Submit a PR (if development is still in progress, add a WIP tag to the PR title, such as `[WIP] Dev 1.1.1 Add junit test code for [linkis-common] `, and associate the corresponding issue, etc.)
+- Step6 Wait for the PR to be merged
+- Step7 Delete the fix/feature branch (this can be done on the GitHub page)
+```shell script
+git branch -d dev-1.1.4-fix
+git push
 ```
 
-##### 2.2.1.2 Environment Variables
 
-     Configure system environment variable or IDE environment variable LINKIS_HOME, it is recommended to use IDE environment variable first.
+Please note: the dev branch of a major feature is named with a corresponding description in addition to the version number; for example, dev-0.10.0-flink refers to the flink feature development branch of 0.10.0.
 
-##### 2.2.1.3 Database
+### 2.2 Development Guidelines
 
-1. Create the Linkis system database by yourself;
-2. Modify the corresponding information of the database in conf/db.sh and execute bin/install.sh or directly import db/linkis_*.sql on the database client.
+Linkis front-end and back-end code share the same code base, but are separated in development. Before starting development, please fork a copy of the Linkis project to your Github Repositories, and develop based on the Linkis code base in your Github Repositories.
 
-##### 2.2.1.4 Configuration file
+We recommend cloning the dev branch and creating a dev-fix branch from it for development, also creating a new dev-fix branch in your own repository; do not modify the original branch directly, because if the subsequent PR is merged with the squash method, the submitted commit records will be merged into one
 
-   Modify the `application.yml` file in the resources/ directory of each microservice to configure related properties.
+```shell script
+#pull the branch
+git clone https://github.com/{githubid}/incubator-linkis.git --branch dev
 
-##### 2.2.1.5 Packaging
+#Generate local dev-fix branch according to dev
+git checkout -b dev-fix dev
 
-1. To obtain a complete release package, you need to modify the relevant version information in /assembly/src/main/assembly/assembly.xml in the root directory, and then execute: `mvn clean package` in the root directory;
-2. To obtain the package of each module, simple execute `mvn clean package` in the module directory.
+#Push the local dev-fix branch to your own repository
+git push origin dev-fix:dev-fix
+```
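+
+While developing on dev-fix, you may want to pick up new commits that land on the Upstream repository's dev branch (a sketch; it assumes an `apache` remote configured as in 2.1.3):
+
+```shell script
+# fetch the latest Upstream commits
+git fetch apache
+# replay your dev-fix commits on top of the Upstream dev branch
+git rebase apache/dev
+```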
 
 ### 2.3 Issue submission guidelines
 - If you still don’t know how to initiate a PR to an open source project, please refer to [About issues](https://docs.github.com/en/github/managing-your-work-on-github/about-issues)
 - Issue name, which should briefly describe your problem or suggestion in one sentence; for the international promotion of the project, please write the issue in English or both Chinese and English.
 - For each Issue, please bring at least two labels, component and type, such as component=Computation Governance/EngineConn, type=Improvement. Reference: [issue #590](https://github.com/apache/incubator-linkis/issues/590)
 
-### 2.3 Pull Request(PR) Submission Guidelines
+### 2.4 Pull Request(PR) Submission Guidelines
 
 - If you still don’t know how to initiate a PR to an open source project, please refer to [About pull requests](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests)
   Whether it is a bug fix or a new feature development, please submit a PR to the dev-* branch.
@@ -119,11 +135,11 @@ The user configuration is in the project root directory /config/, the project st
 - If this PR is not ready to merge, please add [WIP] prefix to the head of the name (WIP = work-in-progress).
 - All submissions to dev-* branches need to go through at least one review before they can be merged
 
-### 2.4 Review Standard
+### 2.5 Review Standard
 
 Before contributing code, you can find out what kind of submissions are popular in Review. Simply put, if a submission can bring as many gains as possible and as few side effects or risks as possible, the higher the probability of it being merged, the faster the review will be. Submissions with high risk and low value are almost impossible to merge, and may be rejected Review.
 
-#### 2.4.1 Gain
+#### 2.5.1 Gain
 
 - Fix the main cause of the bug
 - Add or fix a function or problem that a large number of users urgently need
@@ -132,7 +148,7 @@ Before contributing code, you can find out what kind of submissions are popular
 - Reduce complexity and amount of code
 - Issues that have been discussed by the community and identified for improvement
 
-#### 2.4.2 Side effects and risks
+#### 2.5.2 Side effects and risks
 
 - Only fix the surface phenomenon of the bug
 - Introduce new features with high complexity
@@ -143,7 +159,7 @@ Before contributing code, you can find out what kind of submissions are popular
 - Change the dependency version at will
 - Submit a large number of codes or changes at once
 
-#### 2.4.3 Reviewer notes
+#### 2.5.3 Reviewer notes
 
 - Please use a constructive tone to write comments
 - If you need to make changes by the submitter, please clearly state all the content that needs to be modified to complete the Pull Request
@@ -151,7 +167,7 @@ Before contributing code, you can find out what kind of submissions are popular
 
 ---
 
-##3, Outstanding Contributor
+## 3. Outstanding Contributor
 
 ### 3.1 About Committers (Collaborators)
 
diff --git a/CONTRIBUTING_CN.md b/CONTRIBUTING_CN.md
index 2aef758ea..f842f4050 100644
--- a/CONTRIBUTING_CN.md
+++ b/CONTRIBUTING_CN.md
@@ -1,13 +1,10 @@
-# Contributing
+# 如何参与项目贡献
 
-| **版本管理信息表** |                                   |
-| ----------- | --------------------------------- |
-| 现行版本        | 1.2 版                   |
-| 现行版本发布日期    | 2021 年 12 月 17 日                    |
-| 修订信息        | 1. 因仓库移交至apache,以及Linkis-Doc文档迁移至linkis官网,修改对应链接 |
+>更多信息可以见官网[如何参与项目贡献](https://linkis.apache.org/community/how-to-contribute)
 
 非常感谢贡献Linkis项目!在参与贡献之前,请仔细阅读以下指引。
 
+
 ## 一、贡献范畴
 
 ### 1.1 Bug 反馈与修复
@@ -29,7 +26,6 @@ Linkis 文档位于[Linkis官网](https://linkis.apache.org/zh-CN/docs/latest/in
 ### 1.5 其他
 包括参与和帮助组织社区交流、社区运营活动等,其他能够帮助Linkis 项目和社区的活动。
 
----
 
 ## 二、贡献流程
 
@@ -39,77 +35,96 @@ Linkis 源码可能会产生一些临时分支,但真正有明确意义的只
 - master: 最近一次稳定 release 的源码,偶尔会多几次 hotfix 提交;
 - release-*: 稳定的release 版本; 
 - dev-*: 主要开发分支;
-- feature-*: 针对某些较大、需要社区联合开发的新特性的开发分支
 
-请注意:大特性的dev分支,在命名时除了版本号,还会加上相应的命名说明,如:dev-0.10.0-flink,指0.10.0的flink特性开发分支。
-
-### 2.2 开发指引
+#### 2.1.1 概念 
 
-Linkis 前后端代码共用同一个代码库,但在开发上是分离的。在着手开发之前,请先将 Linkis 项目 fork 一份到自己的 Github Repositories 中, 开发时请基于自己 Github Repositories 中的 Linkis 代码库进行开发。
+- Upstream仓库:https://github.com/apache/incubator-linkis linkis的apache仓库文中称为Upstream仓库   
+- Fork仓库: 从https://github.com/apache/incubator-linkis fork到自己个人仓库 称为Fork仓库  
 
-我们建议克隆 dev-* 分支来开发,这样在向 Linkis 主项目提交 PR 时合并冲突的可能性会小很多
+#### 2.1.2 同步Upstream仓库分支最新代码到自己的Fork仓库   
 
-```bash
-git clone https://github.com/yourname/incubator-linkis.git --branch dev-*
-```
+- step1 进入用户项目页面,选中要更新的分支  
+- step2 点击code下载按钮下方的Fetch upstream,选择Fetch and merge(如自己的Fork仓库该分支不小心污染了,可以删除该分支后,同步Upstream仓库新分支到自己的Fork仓库,参见指引[2.1.3 同步Upstream仓库新分支到自己的Fork仓库](#213-同步upstream仓库新分支到自己的fork仓库))
+![update-code](https://user-images.githubusercontent.com/7869972/176622158-52da5a80-6d6a-4f06-a099-ff65887d002c.png)
 
-#### 2.2.1 后端
+#### 2.1.3 同步Upstream仓库新分支到自己的Fork仓库  
 
-用户配置在项目根目录 /config/ 下,项目启动脚本和升级补丁脚本在项目根目录 /bin/ 下, 后端代码及核心配置在 server/ 目录下, 日志在项目根目录 /log/ 下。注意:此处所指项目根目录都指环境变量 LINKIS_HOME 所配置的目录,在使用 IDE 开发过程中也需要配置环境变量,如 Idea 关于环境变量加载的优先级:`Run/Debug Configurations` 中配置的 `Environment variables` —>  IDE缓存的系统环境变量。
+场景:Upstream仓库有新增分支,但是fork的库没有该分支(可以选择删除后,重新fork,但是会丢失未merge到原始仓库的变更)
 
-##### 2.2.1.1 目录结构
+在自己clone的本地项目中操作
 
-1. 脚本
+- step1 添加apacheUpstream仓库镜像到本地  
 
+```shell script
+git remote add apache git@github.com:apache/incubator-linkis.git
 ```
-├── bin                   # 脚本目录
-  ├── install.sh              # 一键部署脚本
-  ├── start-all.sh   # 一键启动脚本
-  └── stop-all.sh    # 一键停止脚本
-```
+- step2 拉取apache镜像信息到本地
 
-2. 配置
+```shell script
+git fetch apache
+```
+- step3 根据需要同步的新分支来创建本地分支
 
+```shell script
+git checkout -b dev-1.1.4 apache/dev-1.1.4
 ```
-├── config                          # 用户配置目录
-  ├── config.sh         # 一键部署的配置文件
-  ├── db.sh   # 一键部署的数据库配置
+- step4 把本地分支push到自己的仓库,如果自己的仓库没有dev-1.1.4分支,则会创建dev-1.1.4分支  
+```shell script
+git push origin dev-1.1.4:dev-1.1.4
 ```
-
-3. 代码目录结构
-   
-   详见 [Linkis代码目录结构](https://linkis.apache.org/zh-CN/docs/latest/deployment/sourcecode_hierarchical_structure)
-
-4. 日志目录
-
+- step5 删除upstream的分支
+```shell script
+git remote remove apache
 ```
-├── logs        # 日志根目录
+- step6 更新分支
+```shell script
+git pull
 ```
 
-##### 2.2.1.2 环境变量
+#### 2.1.4 一个pr的流程 
+
+- step1 确认当前开发的基础分支(一般是当前进行的中版本,如当前社区开发中的版本1.1.0,那么分支就是dev-1.1.0,不确定的话可以在社区群里问下或则在issue中@相关同学)
 
-     配置系统环境变量或 IDE 环境变量 LINKIS_HOME,推荐优先使用 IDE 环境变量。
+- step2 同步Upstream仓库分支最新代码到自己的Fork仓库 分支,参见指引 [2.1.2 同步Upstream仓库分支最新代码到自己的Fork仓库 ]
 
-##### 2.2.1.3 数据库
+- step3 基于开发分支,拉取新fix/feature分支(不要直接在原分支上修改,如果后续pr以squash方式merge后,提交的commit记录会被合并成一个)
+```shell script
+git checkout -b dev-1.1.4-fix  dev-1.1.4
+git push origin dev-1.1.4-fix:dev-1.1.4-fix
+```
+- step4  进行开发
+- step5  提交pr(如果是正在进行中,开发还未完全结束,请在pr标题上加上WIP标识 如 `[WIP] Dev 1.1.1 Add junit test code for [linkis-common] ` ;关联对应的issue等)
+- step6  等待被合并
+- step7  删除fix/future分支(可以在github页面上进行操作) 
+```shell script
+git branch -d dev-1.1.4-fix 
+git push 
+```
 
-1. 自行创建 Linkis 系统数据库;
-2. 修改 conf/db.sh 中的数据库相应信息并执行bin/install.sh 或 直接在数据库客户端导入 db/linkis_*.sql。
+请注意:大特性的dev分支,在命名时除了版本号,还会加上相应的命名说明,如:dev-0.10.0-flink,指0.10.0的flink特性开发分支。
 
-##### 2.2.1.4 配置文件
+### 2.2 开发指引
 
-   修改 每个微服务resources/ 目录下 `application.yml` 文件,配置相关属性。
+Linkis 前后端代码共用同一个代码库,但在开发上是分离的。在着手开发之前,请先将 Linkis 项目 fork 一份到自己的 Github Repositories 中, 开发时请基于自己 Github Repositories 中的 Linkis 代码库进行开发。
 
-##### 2.2.1.5 打包
+我们建议克隆dev分支并基于它新建dev-fix分支来开发,同时在自己仓库新建dev-fix分支,不要直接在原分支上修改,如果后续pr以squash方式merge,提交的commit记录会被合并成一个
 
-1. 打完整 release 包需要修改根目录下 /assembly/src/main/assembly/assembly.xml 中相关版本信息,然后在根目录下执行: `mvn clean package` 即可;
-2. 打 每个模块 的包可直接在 模块目录下执行 `mvn clean package`。
+```shell script
+#拉取分支
+git clone https://github.com/{githubid}/incubator-linkis.git --branch dev
+#根据dev生成本地dev-fix分支
+git checkout -b dev-fix dev
+#把本地dev-fix分支推到自己的仓库
+git push origin dev-fix:dev-fix
+```
 
 ### 2.3 Issue 提交指引
+
 - 如果您还不知道怎样向开源项目发起 PR,请参考[About issues](https://docs.github.com/en/github/managing-your-work-on-github/about-issues)
 - Issue 名称,应一句话简单描述您的问题或建议;为了项目的国际化推广,请用英文,或中英文双语书写 issue.
 - 每个Issue,请至少带上component 和type 两个label,如component=Computation Governance/EngineConn,type=Improvement.参考:[issue #590](https://github.com/apache/incubator-linkis/issues/590)
 
-### 2.3 Pull Request(PR) 提交指引
+### 2.4 Pull Request(PR) 提交指引
 
 - 如果您还不知道怎样向开源项目发起 PR,请参考[About pull requests](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests)
 - 无论是 Bug 修复,还是新功能开发,请将 PR 提交到 dev-* 分支。
@@ -118,11 +133,11 @@ git clone https://github.com/yourname/incubator-linkis.git --branch dev-*
 - 如果本次 PR 尚未准备好合并,请在名称头部加上 [WIP] 前缀(WIP = work-in-progress)。
 - 所有提交到 dev-* 分支的提交至少需要经过一次 Review 才可以被合并
 
-### 2.4 Review 标准
+### 2.5 Review 标准
 
 在贡献代码之前,可以了解一下什么样的提交在 Review 中是受欢迎的。简单来说,如果一项提交能带来尽可能多增益和尽可能少的副作用或风险,那它被合并的几率就越高,Review 的速度也会越快。风险大、价值低的提交是几乎不可能被合并的,并且有可能会被拒绝 Review。
 
-#### 2.4.1 增益
+#### 2.5.1 增益
 
 - 修复导致 Bug 的主要原因
 - 添加或修复一个大量用户亟需的功能或问题
@@ -131,7 +146,7 @@ git clone https://github.com/yourname/incubator-linkis.git --branch dev-*
 - 减少复杂度以及代码量
 - 经社区讨论过的、确定需要改进的问题
 
-#### 2.4.2 副作用和风险
+#### 2.5.2 副作用和风险
 
 - 仅仅修复 Bug 的表面现象
 - 引入复杂度高的新功能
@@ -142,7 +157,7 @@ git clone https://github.com/yourname/incubator-linkis.git --branch dev-*
 - 随意改变依赖版本
 - 一次性提交大量代码或改动
 
-#### 2.4.3 Reviewer 注意事项
+#### 2.5.3 Reviewer 注意事项
 
 - 请使用建设性语气撰写评论
 - 如果需要提交者进行修改,请明确说明完成此次 Pull Request 所需要修改的所有内容
diff --git a/README.md b/README.md
index a929eadf9..fa381856c 100644
--- a/README.md
+++ b/README.md
@@ -22,55 +22,63 @@ Since the first release of Linkis in 2019, it has accumulated more than **700**
 # Features
 
 - **Support for diverse underlying computation storage engines**.  
-    Currently supported computation/storage engines: Spark, Hive, Python, Presto, ElasticSearch, MLSQL, TiSpark, JDBC, Shell, etc;      
-    Computation/storage engines to be supported: Flink(Supported in version >=1.0.2), Impala, etc;      
-    Supported scripting languages: SparkSQL, HiveQL, Python, Shell, Pyspark, R, Scala and JDBC, etc.  
-  
+    Currently supported computation/storage engines: Spark, Hive, Flink, Python, Pipeline, Sqoop, openLooKeng, JDBC, Shell, etc.      
+    Computation/storage engines to be supported: Presto (planned 1.2.0), ElasticSearch (planned 1.2.0), etc.
+    Supported scripting languages: SparkSQL, HiveQL, Python, Shell, Pyspark, R, Scala and JDBC, etc. 
+
 - **Powerful task/request governance capabilities**. With services such as Orchestrator, Label Manager and customized Spring Cloud Gateway, Linkis is able to provide multi-level labels based, cross-cluster/cross-IDC fine-grained routing, load balance, multi-tenancy, traffic control, resource control, and orchestration strategies like dual-active, active-standby, etc.  
 
 - **Support full stack computation/storage engine**. As a computation middleware, it will receive, execute and manage tasks and requests for various computation storage engines, including batch tasks, interactive query tasks, real-time streaming tasks and storage tasks;
 
-- **Resource management capabilities**.  ResourceManager is not only capable of managing resources for Yarn and Linkis EngineManger as in Linkis 0.X, but also able to provide label-based multi-level resource allocation and recycling, allowing itself to have powerful resource management capabilities across mutiple Yarn clusters and mutiple computation resource types;
+- **Resource management capabilities**.  ResourceManager is not only capable of managing resources for Yarn and Linkis EngineManager, but also able to provide label-based multi-level resource allocation and recycling, allowing itself to have powerful resource management capabilities across multiple Yarn clusters and multiple computation resource types;
 
 - **Unified Context Service**. Generate Context ID for each task/request,  associate and manage user and system resource files (JAR, ZIP, Properties, etc.), result set, parameter variable, function, etc., across user, system, and computing engine. Set in one place, automatic reference everywhere;
 
 - **Unified materials**. System and user-level unified material management, which can be shared and transferred across users and systems.
 
-# Supported engine types
+# Supported Engine Types
 
-| **Engine** | **Supported Version** | **Linkis 0.X version requirement**| **Linkis 1.X version requirement** | **Description** |
+| **Engine Name** | **Supported Component Version<br/>(Default Dependent Version)** | **Linkis Version Requirements** | **Included in Release Package<br/> By Default** | **Description** |
 |:---- |:---- |:---- |:---- |:---- |
-|Flink |1.12.2|\>=dev-0.12.0, PR #703 not merged yet.|>=1.0.2|	Flink EngineConn. Supports FlinkSQL code, and also supports Flink Jar to Linkis Manager to start a new Yarn application.|
-|Impala|\>=3.2.0, CDH >=6.3.0"|\>=dev-0.12.0, PR #703 not merged yet.|ongoing|Impala EngineConn. Supports Impala SQL.|
-|Presto|\>= 0.180|\>=0.11.0|ongoing|Presto EngineConn. Supports Presto SQL.|
-|ElasticSearch|\>=6.0|\>=0.11.0|ongoing|ElasticSearch EngineConn. Supports SQL and DSL code.|
-|Shell|Bash >=2.0|\>=0.9.3|\>=1.0.0_rc1|Shell EngineConn. Supports shell code.|
-|MLSQL|\>=1.1.0|\>=0.9.1|ongoing|MLSQL EngineConn. Supports MLSQL code.|
-|JDBC|MySQL >=5.0, Hive >=1.2.1|\>=0.9.0|\>=1.0.0_rc1|JDBC EngineConn. Supports MySQL and HiveQL code.|
-|Spark|Apache 2.0.0~2.4.7, CDH >=5.4.0|\>=0.5.0|\>=1.0.0_rc1|Spark EngineConn. Supports SQL, Scala, Pyspark and R code.|
-|Hive|Apache >=1.0.0, CDH >=5.4.0|\>=0.5.0|\>=1.0.0_rc1|Hive EngineConn. Supports HiveQL code.|
-|Hadoop|Apache >=2.6.0, CDH >=5.4.0|\>=0.5.0|ongoing|Hadoop EngineConn. Supports Hadoop MR/YARN application.|
-|Python|\>=2.6|\>=0.5.0|\>=1.0.0_rc1|Python EngineConn. Supports python code.|
-|TiSpark|1.1|\>=0.5.0|ongoing|TiSpark EngineConn. Support querying TiDB data by SparkSQL.|
+|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 2.4.3)|\>=1.0.3|Yes|Spark EngineConn, supports SQL , Scala, Pyspark and R code|
+|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 2.3.3)|\>=1.0.3|Yes |Hive EngineConn, supports HiveQL code|
+|Python|Python >= 2.6, <br/>(default Python2*)|\>=1.0.3|Yes |Python EngineConn, supports python code|
+|Shell|Bash >= 2.0|\>=1.0.3|Yes|Shell EngineConn, supports Bash shell code|
+|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(default Hive-jdbc 2.3.4)|\>=1.0.3|No|JDBC EngineConn, already supports MySQL and HiveQL, and can be quickly extended to support other engines with a JDBC Driver package, such as Oracle|
+|Flink |Flink >= 1.12.2, <br/>(default Apache Flink 1.12.2)|\>=1.0.3|No |Flink EngineConn, supports FlinkSQL code, and also supports submitting a Flink Jar to start a new Yarn application |
+|Pipeline|-|\>=1.0.3|No|Pipeline EngineConn, supports file import and export|
+|openLooKeng|openLooKeng >= 1.5.0, <br/>(default openLooKeng 1.5.0)|\>=1.1.1|No|openLooKeng EngineConn, supports querying the openLooKeng data virtualization engine with SQL|
+|Sqoop| Sqoop >= 1.4.6, <br/>(default Apache Sqoop 1.4.6)|\>=1.1.2|No|Sqoop EngineConn, supports the Sqoop data migration tool|
+|Impala|Impala >= 3.2.0, CDH >=6.3.0|ongoing|-|Impala EngineConn, supports Impala SQL code|
+|Presto|Presto >= 0.180|ongoing|-|Presto EngineConn, supports Presto SQL code|
+|ElasticSearch|ElasticSearch >=6.0|ongoing|-|ElasticSearch EngineConn, supports SQL and DSL code|
+|MLSQL| MLSQL >=1.1.0|ongoing|-|MLSQL EngineConn, supports MLSQL code.|
+|Hadoop|Apache >=2.6.0, <br/>CDH >=5.4.0|ongoing|-|Hadoop EngineConn, supports Hadoop MR/YARN application|
+|TiSpark|1.1|ongoing|-|TiSpark EngineConn, supports querying TiDB with SparkSQL|
+
 
 # Ecosystem
 
-| Component | Description | Linkis 0.x(recommend 0.11.0) Compatible | Linkis 1.x(recommend 1.1.1) Compatible |
-| --------------- | -------------------------------------------------------------------- | --------- | --------- |
-| [**DataSphereStudio**](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/README.md) | DataSphere Studio (DSS for short) is WeDataSphere, a one-stop data application development management portal. | DSS 0.9.1[released] | **DSS 1.0.1[released][Linkis recommend 1.1.0]** |
-| [**Scriptis**](https://github.com/WeBankFinTech/Scriptis) | Support online script writing such as SQL, Pyspark, HiveQL, etc., submit to [Linkis](https://github.com/apache/incubator-linkis) to perform data analysis web tools. | Scriptis merged in DSS(DSS 0.9.1[released]) | **In DSS 1.0.1[released]** |
-| [**Schedulis**](https://github.com/WeBankFinTech/Schedulis) | Workflow task scheduling system based on Azkaban secondary development, with financial-grade features such as high performance, high availability and multi-tenant resource isolation. | Schedulis 0.6.1[released] |  **Schedulis0.6.2 [released]** |
-| [**Qualitis**](https://github.com/WeBankFinTech/Qualitis) | Data quality verification tool, providing data verification capabilities such as data integrity and correctness  | Qualitis 0.8.0[released] | **Qualitis 0.9.1 [released]** |
-| [**Streamis**](https://github.com/WeBankFinTech/Streamis) | Streaming application development management tool. It supports the release of Flink Jar and Flink SQL, and provides the development, debugging and production management capabilities of streaming applications, such as: start-stop, status monitoring, checkpoint, etc. | **No support** | **Streamis 0.1.0 [released][Linkis recommend 1.1.0]** |
-| [**Exchangis**](https://github.com/WeBankFinTech/Exchangis) | A data exchange platform that supports data transmission between structured and unstructured heterogeneous data sources, the upcoming Exchangis1. 0, will be connected with DSS workflow | **No support** | **Exchangis 1.0.0 [developing]**|
-| [**Visualis**](https://github.com/WeBankFinTech/Visualis) | A data visualization BI tool based on the second development of Davinci, an open source project of CreditEase, provides users with financial-level data visualization capabilities in terms of data security. | Visualis 0.5.0[released]| **Visualis 1.0.0[developing]**|
-| [**Prophecis**](https://github.com/WeBankFinTech/Prophecis) | A one-stop machine learning platform that integrates multiple open source machine learning frameworks. Prophecis' MLFlow can be connected to DSS workflow through AppConn. | Prophecis 0.2.2[released] | **Prophecis 0.3.0 [released]** |
+| Component | Description | Linkis 1.x(recommend 1.1.1) Compatible |
+| --------------- | -------------------------------------------------------------------- | --------- |
+| [**DataSphereStudio**](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/README.md) | DataSphere Studio (DSS for short) is WeDataSphere's one-stop data application development and management portal.  | **DSS 1.0.1[released][Linkis recommend 1.1.1]** |
+| [**Scriptis**](https://github.com/WeBankFinTech/Scriptis) | A web tool that supports online writing of SQL, Pyspark, HiveQL and other scripts, and submits them to [Linkis](https://github.com/apache/incubator-linkis) for data analysis.  | **In DSS 1.0.1[released]** |
+| [**Schedulis**](https://github.com/WeBankFinTech/Schedulis) | Workflow task scheduling system based on secondary development of Azkaban, with financial-grade features such as high performance, high availability and multi-tenant resource isolation. | **Schedulis 0.6.2 [released]** |
+| [**Qualitis**](https://github.com/WeBankFinTech/Qualitis) | Data quality verification tool, providing data verification capabilities such as data integrity and correctness  |**Qualitis 0.9.1 [released]** |
+| [**Streamis**](https://github.com/WeBankFinTech/Streamis) | Streaming application development management tool. It supports the release of Flink Jar and Flink SQL, and provides the development, debugging and production management capabilities of streaming applications, such as start-stop, status monitoring, checkpoint, etc.| **Streamis 0.1.0 [released][Linkis recommend 1.1.0]** |
+| [**Exchangis**](https://github.com/WeBankFinTech/Exchangis) | A data exchange platform that supports data transmission between structured and unstructured heterogeneous data sources; the upcoming Exchangis 1.0 will be integrated with DSS workflows | **Exchangis 1.0.0 [developing]**|
+| [**Visualis**](https://github.com/WeBankFinTech/Visualis) | A data visualization BI tool based on secondary development of Davinci, an open-source project of CreditEase, providing users with financial-grade data visualization capabilities in terms of data security. |  **Visualis 1.0.0[developing]**|
+| [**Prophecis**](https://github.com/WeBankFinTech/Prophecis) | A one-stop machine learning platform that integrates multiple open source machine learning frameworks. Prophecis' MLFlow can be connected to DSS workflow through AppConn. | **Prophecis 0.3.0 [released]** |
 
 # Download
 
-Please go to the [Linkis Releases Page](https://github.com/apache/incubator-linkis/releases) to download a compiled distribution or a source code package of Linkis.
+Please go to the [Linkis Releases Page](https://linkis.apache.org/download/main) to download a compiled distribution or a source code package of Linkis.
 
-# Compile and deploy
+# Compile and Deploy
+
+> For more detailed guidance, see:
+> [[Backend Compile]](https://linkis.apache.org/docs/latest/development/linkis_compile_and_package)
+> [[Management Console Build]](https://linkis.apache.org/docs/latest/development/web_build)
 
 ```shell
 
@@ -79,6 +87,7 @@ Please go to the [Linkis Releases Page](https://github.com/apache/incubator-link
 ./mvnw -N install
 ./mvnw  clean install -Dmaven.javadoc.skip=true -Dmaven.test.skip=true
 
+
 ### Windows
 mvnw.cmd -N install
 mvnw.cmd clean install -Dmaven.javadoc.skip=true -Dmaven.test.skip=true
@@ -88,16 +97,18 @@ cd incubator-linkis/web
 npm install
 npm run build
 ```
-Please follow [Compile Guide](https://linkis.apache.org/docs/latest/development/linkis_compile_and_package) to compile Linkis from source code.  
-Please refer to [Deployment Documents](https://linkis.apache.org/docs/latest/deployment/quick_deploy) to do the deployment.
-
+ 
+Please refer to [Quick Deployment](https://linkis.apache.org/docs/latest/deployment/quick_deploy) to do the deployment.
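
A rough sketch of the quick-deploy flow that guide describes (archive and script names below are from a typical Linkis 1.x binary distribution and may vary by version; treat them as illustrative, not authoritative):

```shell
# Hedged sketch of a typical Linkis 1.x quick deployment -- follow the
# Quick Deployment guide above for the authoritative, version-specific steps.
tar -xvf apache-linkis-*-incubating-bin.tar.gz
cd apache-linkis-*-incubating-bin

# Configure the deploy user, JAVA_HOME, database connection, etc.
vi deploy-config/linkis-env.sh
vi deploy-config/db.sh

# Run the installer, then start all microservices
sh bin/install.sh
sh sbin/linkis-start-all.sh
```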
 
 # Examples and Guidance
-You can find examples and guidance for how to use and manage Linkis in [User Manual](https://linkis.apache.org/docs/latest/user_guide/overview), [Engine Usage Documents](https://linkis.apache.org/docs/latest/engine_usage/overview) and [API Documents](https://linkis.apache.org/docs/latest/api/overview).
+- [User Manual](https://linkis.apache.org/docs/latest/user_guide/overview)
+- [Engine Usage Documents](https://linkis.apache.org/docs/latest/engine_usage/overview) 
+- [API Documents](https://linkis.apache.org/docs/latest/api/overview)
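
To illustrate what the API documentation covers, here is a hedged sketch of the request body used by the Linkis 1.x task-submission REST endpoint (`POST /api/rest_j/v1/entrance/submit`); the engine version `spark-2.4.3` and the `hadoop-IDE` user/creator pair are illustrative assumptions, not required values:

```python
import json

def build_submit_payload(code, run_type="sql",
                         engine_type="spark-2.4.3", user_creator="hadoop-IDE"):
    """Build a sketch of a Linkis 1.x entrance/submit request body.

    Labels route the task: engineType picks the engine (and version),
    userCreator is "<executing user>-<client application>".
    """
    return {
        "executionContent": {"code": code, "runType": run_type},
        "params": {"variable": {}, "configuration": {}},
        "labels": {
            "engineType": engine_type,
            "userCreator": user_creator,
        },
    }

payload = build_submit_payload("show tables")
print(json.dumps(payload, indent=2))
```

The resulting JSON would then be POSTed to the gateway; see the API Documents above for authentication and the full field reference.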
 
-# Documentation
+# Documentation & Video
 
-The documentation of linkis is in [Linkis-Website Git Repository](https://github.com/apache/incubator-linkis-website).
+- The documentation of Linkis is maintained in the [Linkis-Website Git Repository](https://github.com/apache/incubator-linkis-website).
+- Meetup videos are available on [Bilibili](https://space.bilibili.com/598542776?from=search&seid=14344213924133040656).
 
 # Architecture
 Linkis services could be divided into three categories: computation governance services, public enhancement services and microservice governance services.
@@ -119,14 +130,16 @@ For code and documentation contributions, please follow the [contribution guide]
 
 # Contact Us
 
-Any questions or suggestions please kindly submit an issue.  
-You can scan the QR code below to join our WeChat group to get more immediate response.
 
-![WeChat](https://user-images.githubusercontent.com/11496700/173569063-8615c259-59ef-477a-9cee-825d28b54e7b.png)
+- For any questions or suggestions, please submit an [issue](https://github.com/apache/incubator-linkis/issues).  
+- Reach us by mail at [dev@linkis.apache.org](mailto:dev@linkis.apache.org).
+- You can scan the QR code below to join our WeChat group to get more immediate response.
+
+![wechatgroup](https://user-images.githubusercontent.com/7869972/176336986-d6b9be8f-d1d3-45f1-aa45-8e6adf5dd244.png)
+
 
-Meetup videos on [Bilibili](https://space.bilibili.com/598542776?from=search&seid=14344213924133040656).
 
 # Who is Using Linkis
 
-We opened [an issue](https://github.com/apache/incubator-linkis/issues/23) for users to feedback and record who is using Linkis.  
+We opened an issue [[Who is Using Linkis]](https://github.com/apache/incubator-linkis/issues/23) for users to give feedback and record who is using Linkis.  
 Since the first release of Linkis in 2019, it has accumulated more than **700** trial companies and **1000+** sandbox trial users, involving diverse industries from finance, banking and telecommunications to manufacturing, internet companies and more.
diff --git a/README_CN.md b/README_CN.md
index a4b1d2a2e..f86411ef9 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -25,70 +25,84 @@ Linkis 自2019年开源发布以来,已累计积累了700多家试验企业和
     **支持的脚本语言**:SparkSQL, HiveQL, Python, Shell, Pyspark, R, Scala 和JDBC 等。    
 - **强大的计算治理能力**。基于Orchestrator、Label Manager和定制的Spring Cloud Gateway等服务,Linkis能够提供基于多级标签的跨集群/跨IDC 细粒度路由、负载均衡、多租户、流量控制、资源控制和编排策略(如双活、主备等)支持能力。  
 - **全栈计算存储引擎架构支持**。能够接收、执行和管理针对各种计算存储引擎的任务和请求,包括离线批量任务、交互式查询任务、实时流式任务和存储型任务;
-- **资源管理能力**。 ResourceManager 不仅具备 Linkis0.X 对 Yarn 和 Linkis EngineManager 的资源管理能力,还将提供基于标签的多级资源分配和回收能力,让 ResourceManager 具备跨集群、跨计算资源类型的强大资源管理能力。
+- **资源管理能力**。 ResourceManager 具备对 Yarn 和 Linkis EngineManager 的资源管理能力,还将提供基于标签的多级资源分配和回收能力,让 ResourceManager 具备跨集群、跨计算资源类型的强大资源管理能力。
 - **统一上下文服务**。为每个计算任务生成context id,跨用户、系统、计算引擎的关联管理用户和系统资源文件(JAR、ZIP、Properties等),结果集,参数变量,函数等,一处设置,处处自动引用;
 - **统一物料**。系统和用户级物料管理,可分享和流转,跨用户、系统共享物料。
 
 # 支持的引擎类型
 
-| **引擎** | **引擎版本** | **Linkis 0.X 版本要求**| **Linkis 1.X 版本要求** | **说明** |
+| **引擎名** | **支持底层组件版本<br/>(默认依赖版本)** | **Linkis 版本要求** | **是否默认包含在发布包中** | **说明** |
 |:---- |:---- |:---- |:---- |:---- |
-|Flink |1.12.2|\>=dev-0.12.0, PR #703 尚未合并|>=1.0.2|	Flink EngineConn。支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序。|
-|Impala|\>=3.2.0, CDH >=6.3.0"|\>=dev-0.12.0, PR #703 尚未合并|ongoing|Impala EngineConn. 支持Impala SQL 代码.|
-|Presto|\>= 0.180|\>=0.11.0|ongoing|Presto EngineConn. 支持Presto SQL 代码.|
-|ElasticSearch|\>=6.0|\>=0.11.0|ongoing|ElasticSearch EngineConn. 支持SQL 和DSL 代码.|
-|Shell|Bash >=2.0|\>=0.9.3|\>=1.0.0_rc1|Shell EngineConn. 支持Bash shell 代码.|
-|MLSQL|\>=1.1.0|\>=0.9.1|ongoing|MLSQL EngineConn. 支持MLSQL 代码.|
-|JDBC|MySQL >=5.0, Hive >=1.2.1|\>=0.9.0|\>=1.0.0_rc1|JDBC EngineConn. 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle.
-|Spark|Apache 2.0.0~2.4.7, CDH >=5.4.0|\>=0.5.0|\>=1.0.0_rc1|Spark EngineConn. 支持SQL, Scala, Pyspark 和R 代码.|
-|Hive|Apache >=1.0.0, CDH >=5.4.0|\>=0.5.0|\>=1.0.0_rc1|Hive EngineConn. 支持HiveQL 代码.|
-|Hadoop|Apache >=2.6.0, CDH >=5.4.0|\>=0.5.0|ongoing|Hadoop EngineConn. 支持Hadoop MR/YARN application.|
-|Python|\>=2.6|\>=0.5.0|\>=1.0.0_rc1|Python EngineConn. 支持python 代码.|
-|TiSpark|1.1|\>=0.5.0|ongoing|TiSpark EngineConn. 支持用SparkSQL 查询TiDB.|
+|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(默认Apache Spark 2.4.3)|\>=1.0.3|是|Spark EngineConn, 支持SQL, Scala, Pyspark 和R 代码|
+|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(默认Apache Hive 2.3.3)|\>=1.0.3|是|Hive EngineConn, 支持HiveQL 代码|
+|Python|Python >= 2.6, <br/>(默认Python2*)|\>=1.0.3|是|Python EngineConn, 支持python 代码|
+|Shell|Bash >= 2.0|\>=1.0.3|是|Shell EngineConn, 支持Bash shell 代码|
+|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(默认Hive-jdbc 2.3.4)|\>=1.0.3|否|JDBC EngineConn, 已支持MySQL 和HiveQL,可快速扩展支持其他有JDBC Driver 包的引擎, 如Oracle|
+|Flink |Flink >= 1.12.2, <br/>(默认Apache Flink 1.12.2)|\>=1.0.3|否|Flink EngineConn, 支持FlinkSQL 代码,也支持以Flink Jar 形式启动一个新的Yarn 应用程序|
+|Pipeline|-|\>=1.0.3|否|Pipeline EngineConn, 支持文件的导入和导出|
+|openLooKeng|openLooKeng >= 1.5.0, <br/>(默认openLookEng 1.5.0)|\>=1.1.1|否|openLooKeng EngineConn, 支持用Sql查询数据虚拟化引擎openLooKeng|
+|Sqoop| Sqoop >= 1.4.6, <br/>(默认Apache Sqoop 1.4.6)|\>=1.1.2|否|Sqoop EngineConn, 支持 数据迁移工具 Sqoop 引擎|
+|Impala|Impala >= 3.2.0, CDH >=6.3.0|ongoing|-|Impala EngineConn,支持Impala SQL 代码|
+|Presto|Presto >= 0.180|ongoing|-|Presto EngineConn, 支持Presto SQL 代码|
+|ElasticSearch|ElasticSearch >=6.0|ongoing|-|ElasticSearch EngineConn, 支持SQL 和DSL 代码|
+|MLSQL| MLSQL >=1.1.0|ongoing|-|MLSQL EngineConn, 支持MLSQL 代码.|
+|Hadoop|Apache >=2.6.0, <br/>CDH >=5.4.0|ongoing|-|Hadoop EngineConn, 支持Hadoop MR/YARN application|
+|TiSpark|1.1|ongoing|-|TiSpark EngineConn, 支持用SparkSQL 查询TiDB|
 
 # 生态组件
 
-| 应用工具     | 描述                                                          | Linkis 0.X(推荐0.11.0) 兼容版本   | Linkis 1.X(推荐1.1.1) 兼容版本    | 
-| --------------- | -------------------------------------------------------------------- | --------- | ---------- | 
-| [**DataSphere Studio**](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/README-ZH.md)  | DataSphere Studio(简称 DSS)数据应用开发管理集成框架    | DSS 0.9.1[已发布] | **DSS 1.0.1[已发布][Linkis 推荐1.1.0]** |
-| [**Scriptis**](https://github.com/WeBankFinTech/Scriptis)   | 支持在线写 SQL、Pyspark、HiveQL 等脚本,提交给[Linkis](https://github.com/apache/incubator-linkis)执行的数据分析 Web 工具。 | Scriptis合并在DSS中(DSS 0.9.1[已发布]) | 在DSS 1.0.1中[已发布] |
-| [**Schedulis**](https://github.com/WeBankFinTech/Schedulis) | 基于 Azkaban 二次开发的工作流任务调度系统,具备高性能,高可用和多租户资源隔离等金融级特性。 | Schedulis 0.6.1[已发布] | **Schedulis0.6.2 [已发布]** |
-| [**Qualitis**](https://github.com/WeBankFinTech/Qualitis)   | 数据质量校验工具,提供数据完整性、正确性等数据校验能力 | Qualitis 0.8.0[已发布] | **Qualitis 0.9.0 [已发布]** |
-| [**Streamis**](https://github.com/WeBankFinTech/Streamis)  | 流式应用开发管理工具。支持发布 Flink Jar 和 Flink SQL ,提供流式应用的开发调试和生产管理能力,如:启停、状态监控、checkpoint 等。 | 不支持 | **Streamis 0.1.0 [已发布][Linkis 推荐1.1.0]** |
-| [**Exchangis**](https://github.com/WeBankFinTech/Exchangis) | 支持对结构化及无结构化的异构数据源之间的数据传输的数据交换平台,即将发布的 Exchangis1.0,将与 DSS 工作流打通 | 不支持 | **Exchangis 1.0.0 [开发中]** |
-| [**Visualis**](https://github.com/WeBankFinTech/Visualis)   | 基于宜信开源项目 Davinci 二次开发的数据可视化 BI 工具,为用户在数据安全方面提供金融级数据可视化能力。 | Visualis 0.5.0[已发布] | **Visualis 1.0.0[开发中]** |
-| [**Prophecis**](https://github.com/WeBankFinTech/Prophecis)     | 一站式机器学习平台,集成多种开源机器学习框架。Prophecis 的 MLFlow 通过 AppConn 可以接入到 DSS 工作流中。      | Prophecis 0.2.2[已发布] | **Prophecis 0.3.0 [已发布]** |
+| 应用工具     | 描述                                                          | Linkis 1.X(推荐1.1.1) 兼容版本    | 
+| --------------- | -------------------------------------------------------------------- | ---------- | 
+| [**DataSphere Studio**](https://github.com/WeBankFinTech/DataSphereStudio/blob/master/README-ZH.md)  | DataSphere Studio(简称 DSS)数据应用开发管理集成框架    | **DSS 1.0.1[已发布][Linkis 推荐1.1.1]** |
+| [**Scriptis**](https://github.com/WeBankFinTech/Scriptis)   | 支持在线写 SQL、Pyspark、HiveQL 等脚本,提交给[Linkis](https://github.com/apache/incubator-linkis)执行的数据分析 Web 工具。 | 在DSS 1.0.1中[已发布] |
+| [**Schedulis**](https://github.com/WeBankFinTech/Schedulis) | 基于 Azkaban 二次开发的工作流任务调度系统,具备高性能,高可用和多租户资源隔离等金融级特性。  | **Schedulis 0.6.2 [已发布]** |
+| [**Qualitis**](https://github.com/WeBankFinTech/Qualitis)   | 数据质量校验工具,提供数据完整性、正确性等数据校验能力  | **Qualitis 0.9.0 [已发布]** |
+| [**Streamis**](https://github.com/WeBankFinTech/Streamis)  | 流式应用开发管理工具。支持发布 Flink Jar 和 Flink SQL ,提供流式应用的开发调试和生产管理能力,如:启停、状态监控、checkpoint 等。 | **Streamis 0.1.0 [已发布][Linkis 推荐1.1.0]** |
+| [**Exchangis**](https://github.com/WeBankFinTech/Exchangis) | 支持对结构化及无结构化的异构数据源之间的数据传输的数据交换平台,即将发布的 Exchangis 1.0,将与 DSS 工作流打通 | **Exchangis 1.0.0 [开发中]** |
+| [**Visualis**](https://github.com/WeBankFinTech/Visualis)   | 基于宜信开源项目 Davinci 二次开发的数据可视化 BI 工具,为用户在数据安全方面提供金融级数据可视化能力。 | **Visualis 1.0.0[开发中]** |
+| [**Prophecis**](https://github.com/WeBankFinTech/Prophecis)     | 一站式机器学习平台,集成多种开源机器学习框架。Prophecis 的 MLFlow 通过 AppConn 可以接入到 DSS 工作流中。     | **Prophecis 0.3.0 [已发布]** |
 
 # 下载
 
-请前往[Linkis releases 页面](https://github.com/apache/incubator-linkis/releases) 下载Linkis 的已编译版本或源码包。
+请前往[Linkis Releases 页面](https://linkis.apache.org/download/main) 下载Linkis 的已编译版本或源码包。
 
 # 编译和安装部署
-```shell
 
+> 更详细的步骤参见:
+>[后端编译打包](https://linkis.apache.org/zh-CN/docs/latest/development/linkis_compile_and_package)
+>[管理台编译](https://linkis.apache.org/zh-CN/docs/latest/development/web_build)
+
+```shell script
 ## 后端编译
-### Mac OS/Linux
+
+### Mac OS/Linux 系统
 ./mvnw -N install
 ./mvnw  clean install -Dmaven.javadoc.skip=true -Dmaven.test.skip=true
 
-### Windows
+### Windows系统
 mvnw.cmd -N install
 mvnw.cmd clean install -Dmaven.javadoc.skip=true -Dmaven.test.skip=true
 
-## 前端编译
+## 管理台编译
 cd incubator-linkis/web
 npm install
 npm run build
 ```
-请参照[编译指引](https://linkis.apache.org/zh-CN/docs/latest/development/linkis_compile_and_package) 来编译Linkis 源码。  
-请参考[安装部署文档](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) 来部署Linkis。
+
+请参考[快速安装部署](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) 来部署Linkis
 
 # 示例和使用指引
-请到 [用户手册](https://linkis.apache.org/zh-CN/docs/latest/user_guide/overview), [各引擎使用指引](https://linkis.apache.org/zh-CN/docs/latest/engine_usage/overview) 和[API 文档](https://linkis.apache.org/zh-CN/docs/latest/api/overview) 中,查看如何使用和管理Linkis 的示例和指引。
 
-# 文档
+- [用户手册](https://linkis.apache.org/zh-CN/docs/latest/user_guide/overview),
+- [各引擎使用指引](https://linkis.apache.org/zh-CN/docs/latest/engine_usage/overview) 
+- [API 文档](https://linkis.apache.org/zh-CN/docs/latest/api/overview)
+
+# 文档&视频
+
+- 完整的Linkis文档代码存放在[linkis-website仓库中](https://github.com/apache/incubator-linkis-website)  
+
+- Meetup 视频 [Bilibili](https://space.bilibili.com/598542776?from=search&seid=14344213924133040656).
 
-完整的Linkis文档代码存放在[linkis-website仓库中](https://github.com/apache/incubator-linkis-website)  
 
 # 架构概要
 Linkis 基于微服务架构开发,其服务可以分为3类:计算治理服务、公共增强服务和微服务治理服务。  
@@ -124,13 +138,15 @@ Linkis 基于微服务架构开发,其服务可以分为3类:计算治理服
 
 # 联系我们
 
-对Linkis 的任何问题和建议,敬请提交issue,以便跟踪处理和经验沉淀共享。  
-您也可以扫描下面的二维码,加入我们的微信群,以获得更快速的响应。
-![WeChat](https://user-images.githubusercontent.com/11496700/173569063-8615c259-59ef-477a-9cee-825d28b54e7b.png)
+- 对Linkis 的任何问题和建议,可以提交issue,以便跟踪处理和经验沉淀共享
+- 通过邮件方式 [dev@linkis.apache.org](mailto:dev@linkis.apache.org) 
+- 可以扫描下面的二维码,加入我们的微信群,以获得更快速的响应
+
+![wechatgroup](https://user-images.githubusercontent.com/7869972/176336986-d6b9be8f-d1d3-45f1-aa45-8e6adf5dd244.png)
+
 
-Meetup 视频 [Bilibili](https://space.bilibili.com/598542776?from=search&seid=14344213924133040656).
 
 # 谁在使用Linkis
 
-我们创建了[一个 issue](https://github.com/apache/incubator-linkis/issues/23) 以便用户反馈和记录谁在使用Linkis.  
+我们创建了一个 issue [[Who is Using Linkis]](https://github.com/apache/incubator-linkis/issues/23) 以便用户反馈和记录谁在使用Linkis.  
 Linkis 自2019年开源发布以来,累计已有700多家试验企业和1000+沙盒试验用户,涉及金融、电信、制造、互联网等多个行业。

