Posted to commits@linkis.apache.org by ca...@apache.org on 2022/08/11 06:49:19 UTC

[incubator-linkis-website] branch dev updated: update the path of linkis project after refactor (#479)

This is an automated email from the ASF dual-hosted git repository.

casion pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-linkis-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new 150a061b04 update the path of linkis project after refactor (#479)
150a061b04 is described below

commit 150a061b041a591e6939247aaabadf8bb2eea2cc
Author: Casion <ca...@gmail.com>
AuthorDate: Thu Aug 11 14:49:14 2022 +0800

    update the path of linkis project after refactor (#479)
---
 community/development-specification/license.md     |  4 ++--
 community/how-to-release.md                        | 16 ++++++-------
 community/how-to-verify.md                         |  2 +-
 .../engine/add-an-engine-conn.md                   |  2 +-
 docs/deployment/engine-conn-plugin-installation.md |  2 +-
 .../sourcecode-hierarchical-structure.md           |  2 +-
 docs/development/linkis-compile-and-package.md     | 10 ++++-----
 docs/development/linkis-debug-in-mac.md            |  2 +-
 docs/development/linkis-debug.md                   | 16 ++++++-------
 docs/development/new-engine-conn.md                |  8 +++----
 docs/engine-usage/elasticsearch.md                 |  4 ++--
 docs/engine-usage/flink.md                         |  4 ++--
 docs/engine-usage/openlookeng.md                   |  4 ++--
 docs/engine-usage/pipeline.md                      |  2 +-
 docs/engine-usage/presto.md                        |  4 ++--
 docs/engine-usage/sqoop.md                         |  4 ++--
 .../current/development-specification/license.md   |  4 ++--
 .../current/how-to-release.md                      | 16 ++++++-------
 .../current/how-to-verify.md                       |  2 +-
 .../engine/add-an-engine-conn.md                   |  2 +-
 .../deployment/engine-conn-plugin-installation.md  |  4 ++--
 .../sourcecode-hierarchical-structure.md           |  2 +-
 .../development/linkis-compile-and-package.md      | 18 +++++++--------
 .../current/development/linkis-debug-in-mac.md     |  2 +-
 .../current/development/linkis-debug.md            | 16 ++++++-------
 .../current/development/new-engine-conn.md         | 26 +++++++++++-----------
 .../current/development/web-build.md               |  4 ++--
 .../current/engine-usage/elasticsearch.md          |  4 ++--
 .../current/engine-usage/flink.md                  |  4 ++--
 .../current/engine-usage/jdbc.md                   |  4 ++--
 .../current/engine-usage/openlookeng.md            |  2 +-
 .../current/engine-usage/overview.md               | 14 +++++++++---
 .../current/engine-usage/pipeline.md               |  2 +-
 .../current/engine-usage/presto.md                 |  4 ++--
 .../current/engine-usage/sqoop.md                  |  2 +-
 .../current/introduction.md                        | 26 +++++++++++-----------
 .../version-1.1.3/engine-usage/elasticsearch.md    |  2 +-
 .../version-1.1.3/engine-usage/presto.md           |  2 +-
 info.txt                                           | 22 +++++++++++++++++-
 39 files changed, 149 insertions(+), 121 deletions(-)
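The renames in this diff follow two mechanical mappings: `linkis-engineconn-plugins/engineconn-plugins/<engine>` collapses to `linkis-engineconn-plugins/<engine>`, and `assembly-combined-package` (with its `assembly-combined` subdirectory) becomes `linkis-dist/package`. A minimal, hedged sketch of that rewrite, run here against a throwaway file rather than the real docs:

```shell
# Sketch only: applies this commit's two path mappings to sample lines.
# The sample text is illustrative; the real commit edits the files listed above.
tmp=$(mktemp)
printf '%s\n' 'cd linkis-engineconn-plugins/engineconn-plugins/spark' > "$tmp"
printf '%s\n' 'cp assembly-combined-package/target/bin.tar.gz dist/' >> "$tmp"
sed -i \
  -e 's|linkis-engineconn-plugins/engineconn-plugins/|linkis-engineconn-plugins/|g' \
  -e 's|assembly-combined-package/assembly-combined|linkis-dist/package|g' \
  -e 's|assembly-combined-package|linkis-dist|g' \
  "$tmp"
cat "$tmp"   # shows the rewritten sample lines
rm -f "$tmp"
```

The order of the `sed` expressions matters: the longer `assembly-combined-package/assembly-combined` mapping must run before the shorter `assembly-combined-package` one.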

diff --git a/community/development-specification/license.md b/community/development-specification/license.md
index f596cbd4d5..bd52bfae83 100644
--- a/community/development-specification/license.md
+++ b/community/development-specification/license.md
@@ -40,7 +40,7 @@ We need to know the NOTICE/LICENSE of the files introduced by our project or jar
 copyright notice that is included in or attached to the work.
 
 ### Example Scenario 1
-For example, the third-party file `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip` is introduced into the source code
+For example, the third-party file `linkis-engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip` is introduced into the source code
 
 Find the source branch of the version corresponding to py4j-0.10.7-src.zip, if there is no `LICENSE/NOTICE` file in the corresponding version branch, select the main branch
 - The project source code is located at: https://github.com/bartdag/py4j/tree/0.10.7/py4j-python
@@ -48,7 +48,7 @@ Find the source branch of the version corresponding to py4j-0.10.7-src.zip, if t
 - NOTICE file: none
 
 The license information of `py4j-0.10.7-src.zip` needs to be specified in the `linkis/LICENSE` file.
-The detailed license.txt file corresponding to `py4j-0.10.7-src.zip` is placed in the same level directory `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/LICENSE-py4j-0.10 .7-src.txt`
+The detailed license.txt file corresponding to `py4j-0.10.7-src.zip` is placed in the same-level directory `linkis-engineconn-plugins/python/src/main/py4j/LICENSE-py4j-0.10.7-src.txt`
 Since https://github.com/bartdag/py4j/tree/0.10.7/py4j-python does not have a NOTICE file, there is no need to append to the `linkis/NOTICE` file.
 
 ### Example Scenario 2
diff --git a/community/how-to-release.md b/community/how-to-release.md
index 634c9c4137..bc099335a6 100644
--- a/community/how-to-release.md
+++ b/community/how-to-release.md
@@ -269,7 +269,7 @@ Archives: 0
 0 Unknown Licenses
 ````
 <font color="red">
-If it is not 0, you need to confirm whether there is a license for the binary or compressed file in the source code. You can refer to `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/py4j-0.10.7- src.zip`
+If it is not 0, you need to confirm whether there is a license for the binary or compressed file in the source code. You can refer to `linkis-engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip`
 </font>
 
 ### 2.3 Publish jar package to Apache Nexus repository
@@ -284,7 +284,7 @@ $ mvn -DskipTests deploy -Prelease -Dmaven.javadoc.skip=true
 :::
 
 After the above command is executed successfully, the release package will be automatically uploaded to Apache's staging repository. All Artifacts deployed to the remote [maven repository](http://repository.apache.org/) will be in the staging state. Visit https://repository.apache.org/#stagingRepositories and log in using the Apache LDAP account. You will see the uploaded version, and the content in the `Repository` column is ${STAGING.REPOSITORY}. Click `Close` to tell Nexus that the bu [...]
-At the same time, the binary file assembly-combined-package/target/apache-linkis-1.0.3-incubating-bin.tar.gz is also generated
+At the same time, the binary file linkis-dist/target/apache-linkis-1.0.3-incubating-bin.tar.gz is also generated
 
 The commands in steps 2.4-3.3 are merged into the release.sh script, and can also be executed through the release.sh script (see the appendix at the end of this article)
 ### 2.4 Package source code
@@ -300,9 +300,9 @@ $ git archive --format=tar.gz --output="dist/apache-linkis/apache-linkis-1.0.3-i
 ```
 ### 2.5 Copy binary files
 
-After step 2.3 is executed, the binary file has been generated, located in assembly-combined-package/target/apache-linkis-1.0.3-incubating-bin.tar.gz
+After step 2.3 is executed, the binary file has been generated, located in linkis-dist/target/apache-linkis-1.0.3-incubating-bin.tar.gz
 ```shell
-$ cp assembly-combined-package/target/apache-linkis-1.0.3-incubating-bin.tar.gz dist/apache-linkis
+$ cp linkis-dist/target/apache-linkis-1.0.3-incubating-bin.tar.gz dist/apache-linkis
 ```
 
 ### 2.6 Package front-end management console
@@ -347,7 +347,7 @@ $ npm install
 
 #### 2.6.3 Package console project
 Execute the following instructions on the terminal command line to package the project and generate a compressed deployment installation package.
-Check web/package.json, web/.env files, and check whether the version number of the front-end management console is correct.
+Check the linkis-web/package.json and linkis-web/.env files, and verify that the version number of the front-end management console is correct.
 ```shell
 $ npm run build
 ```
@@ -355,9 +355,9 @@ After the above command is successfully executed, the front-end management conso
 
 #### 2.6.4 Copy console installation package
 
-After step 2.6.3 is executed, the front-end management console installation package has been generated, located at web/apache-linkis-1.0.3-incubating-web-bin.tar.gz
+After step 2.6.3 is executed, the front-end management console installation package has been generated, located at linkis-web/apache-linkis-1.0.3-incubating-web-bin.tar.gz
 ```shell
-$ cp web/apache-linkis-1.0.3-incubating-web-bin.tar.gz dist/apache-linkis
+$ cp linkis-web/apache-linkis-1.0.3-incubating-web-bin.tar.gz dist/apache-linkis
 ```
 
 ### 2.7 Sign the source package/binary package/sha512
@@ -761,7 +761,7 @@ git archive --format=tar.gz --output="dist/apache-linkis/apache-linkis-$release_
 echo  "git archive --format=tar.gz --output='dist/apache-linkis/apache-linkis-$release_version-incubating-src.tar.gz' --prefix=apache-linkis-$release_version-incubating-src/   $git_branch"
 
 #step2 Copy the binary package
-cp assembly-combined-package/target/apache-linkis-$release_version-incubating-bin.tar.gz dist/apache-linkis
+cp linkis-dist/target/apache-linkis-$release_version-incubating-bin.tar.gz dist/apache-linkis
 
 #step3 Package the web (if you need to publish the front end)
 
diff --git a/community/how-to-verify.md b/community/how-to-verify.md
index c2bd808673..41d9f79cba 100644
--- a/community/how-to-verify.md
+++ b/community/how-to-verify.md
@@ -146,7 +146,7 @@ Archives: 0
 0 Unknown Licenses
 ````
 <font color="red">
-If it is not 0, you need to confirm whether the source code has the license for the binary or compressed file. You can refer to the `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/py4j- 0.10.7-src.zip`
+If it is not 0, you need to confirm whether the source code has the license for the binary or compressed file. You can refer to the `linkis-engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip`
 </font>
 
 
diff --git a/docs/architecture/computation-governance-services/engine/add-an-engine-conn.md b/docs/architecture/computation-governance-services/engine/add-an-engine-conn.md
index 5bb2e3a23c..92d69f7bd0 100644
--- a/docs/architecture/computation-governance-services/engine/add-an-engine-conn.md
+++ b/docs/architecture/computation-governance-services/engine/add-an-engine-conn.md
@@ -41,7 +41,7 @@ ECM selection is mainly to complete the Label passed through the client to selec
 
    **Glossary:**
 
- EngineConnPlugin: It is the interface that Linkis must implement when connecting a new computing storage engine. This interface mainly includes several capabilities that this EngineConn must provide during the startup process, including the EngineConn resource generator, the EngineConn startup command generator, and the EngineConn connector. Please refer to the Spark engine implementation class for the specific implementation: [SparkEngineConnPlugin](https://github.com/apache/incubator-l [...]
+- EngineConnPlugin: It is the interface that Linkis must implement when connecting a new computing storage engine. This interface mainly includes several capabilities that this EngineConn must provide during the startup process, including the EngineConn resource generator, the EngineConn startup command generator, and the EngineConn connector. Please refer to the Spark engine implementation class for the specific implementation: [SparkEngineConnPlugin](https://github.com/apache/incubator-l [...]
- EngineConnPluginServer: It is a microservice that loads all the EngineConnPlugins and externally provides EngineConn resource generation and EngineConn startup command generation capabilities.
- EngineConnResourceFactory: Calculates, from the parameters passed in, the total resources needed for this EngineConn startup.
- EngineConnLaunchBuilder: Generates, from the incoming parameters, the EngineConn startup command that is provided to the ECM to start the engine.
diff --git a/docs/deployment/engine-conn-plugin-installation.md b/docs/deployment/engine-conn-plugin-installation.md
index 21b221d936..94e6e71357 100644
--- a/docs/deployment/engine-conn-plugin-installation.md
+++ b/docs/deployment/engine-conn-plugin-installation.md
@@ -22,7 +22,7 @@ hive: engine home directory, must be the name of the engine
     └── 1.2.1 # Engine version
         └── linkis-engineplugin-hive-1.0.0-RC1.jar #Engine module package (only need to place a separate engine package)
 ```
-If you are adding a new engine, you can refer to hive's assembly configuration method, source code directory: linkis-engineconn-plugins/engineconn-plugins/hive/src/main/assembly/distribution.xml
+If you are adding a new engine, you can refer to hive's assembly configuration method, source code directory: linkis-engineconn-plugins/hive/src/main/assembly/distribution.xml
 ## 2. Engine Installation
 ### 2.1 Plugin package installation
 1. First, confirm the engine's dist directory: wds.linkis.engineconn.home (get the value of this parameter from ${LINKIS_HOME}/conf/linkis.properties). This parameter is used by EngineConnPluginServer to read the configuration files and third-party jar packages that the engine depends on. If the parameter wds.linkis.engineconn.dist.load.enable=true is set, the engines in this directory will be automatically read and loaded into the Linkis BML (material library).
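The lookup described in step 1 can be sketched as a one-liner. The properties file below is a stand-in created on the fly, and the sample value is an assumption; the real path depends on ${LINKIS_HOME}:

```shell
# Minimal sketch of reading wds.linkis.engineconn.home, assuming
# linkis.properties holds plain key=value lines (sample value is hypothetical).
conf=$(mktemp)
printf '%s\n' 'wds.linkis.engineconn.home=/appcom/Install/LinkisInstall/lib/linkis-engineconn-plugins' > "$conf"
grep '^wds.linkis.engineconn.home=' "$conf" | cut -d= -f2-
rm -f "$conf"
```

Against a real deployment, point `conf` at `${LINKIS_HOME}/conf/linkis.properties` instead of the temporary file.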
diff --git a/docs/deployment/sourcecode-hierarchical-structure.md b/docs/deployment/sourcecode-hierarchical-structure.md
index 9eef4a758d..ae0941d77c 100644
--- a/docs/deployment/sourcecode-hierarchical-structure.md
+++ b/docs/deployment/sourcecode-hierarchical-structure.md
@@ -8,7 +8,7 @@ sidebar_position: 5
 > Linkis source code hierarchical directory structure description, if you want to learn more about Linkis modules, please check [Linkis related architecture design](architecture/overview.md)
 
 ```html
-|-- assembly-combined-package //Compile the module of the entire project
+|-- linkis-dist //Compile the module of the entire project
 |        |-- assembly-combined
 |        |-- bin
 |        |-- deploy-config
diff --git a/docs/development/linkis-compile-and-package.md b/docs/development/linkis-compile-and-package.md
index 50c1a4a279..8a4e0ca786 100644
--- a/docs/development/linkis-compile-and-package.md
+++ b/docs/development/linkis-compile-and-package.md
@@ -52,11 +52,11 @@ Execute the following commands in the root directory of the Linkis source code p
 ```
 
 ### step3 Obtain the installation package
-The compiled complete installation package is in the assembly-combined-package->target directory of the project:
+The compiled complete installation package is in the linkis-dist->target directory of the project:
 
 ```bash
     #Detailed path is as follows
-    incubator-linkis-x.x.x/assembly-combined-package/target/apache-linkis-x.x.x-incubating-bin.tar.gz
+    incubator-linkis-x.x.x/linkis-dist/target/apache-linkis-x.x.x-incubating-bin.tar.gz
 ```
 
 ## 3. Compile a single module
@@ -98,14 +98,14 @@ Here's an example of the Spark engine that builds Linkis:
 Enter the directory where the Spark engine is located to compile and package it; the command is as follows:
    
 ```bash
-    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark
+    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/spark
     mvn clean install
 ```
 ### step3 Obtain the installation package
 Get the installation package; there will be a compiled package in the ->target directory of the corresponding module:
    
 ```
-   incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
+   incubator-linkis-x.x.x/linkis-engineconn-plugins/spark/target/linkis-engineplugin-spark-x.x.x.jar
 ```
 
 How to install Spark engine separately? Please refer to [Linkis Engine Plugin Installation Document](../deployment/engine-conn-plugin-installation)
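Several hunks in this commit fix paths where a global replace dropped a `/` (e.g. `linkis-engineconn-pluginsspark`). A hedged audit for any leftover or glued paths might look like the following; the throwaway directory is a stand-in for the real docs checkout:

```shell
# Sketch: scan a docs tree for pre-refactor paths, plus the glued
# "linkis-engineconn-pluginsspark"-style strings a bad replace leaves behind.
docs=$(mktemp -d)
printf '%s\n' 'cd linkis-engineconn-pluginsspark' > "$docs/sample.md"
grep -rnE 'engineconn-plugins/engineconn-plugins|assembly-combined-package|linkis-engineconn-plugins[a-z]' "$docs" \
  || echo 'no stale paths'
rm -rf "$docs"
```

The `[a-z]` suffix in the last alternative catches engine names fused directly onto `linkis-engineconn-plugins`, while correct paths (which continue with `/`) are not matched.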
@@ -159,7 +159,7 @@ Modify the dependency hadoop-hdfs to hadoop-hdfs-client:
 Here's an example of changing the version of Spark. Go to the directory where the Spark engine is located and manually modify the Spark version information of the pom.xml file as follows:
 
 ```bash
-    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark
+    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/spark
     vim pom.xml
 ```
 
diff --git a/docs/development/linkis-debug-in-mac.md b/docs/development/linkis-debug-in-mac.md
index 8d15770b6d..09f04c07b7 100644
--- a/docs/development/linkis-debug-in-mac.md
+++ b/docs/development/linkis-debug-in-mac.md
@@ -247,7 +247,7 @@ linkis-cg-entrance
 linkis-entrance
 
[VM Options]
--DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [main Class]
 org.apache.linkis.entrance.LinkisEntranceApplication
diff --git a/docs/development/linkis-debug.md b/docs/development/linkis-debug.md
index badade25bf..026b61d640 100644
--- a/docs/development/linkis-debug.md
+++ b/docs/development/linkis-debug.md
mvn clean install
 
 ## step2 Necessary parameter configuration
 
-For the configuration file under incubator-linkis/assembly-combined-package/assembly-combined/conf/, you need to configure the database and hive meta and other necessary startup parameters.
+For the configuration file under incubator-linkis/linkis-dist/package/conf/, you need to configure the database and hive meta and other necessary startup parameters.
 
 
 
@@ -31,7 +31,7 @@ For the configuration file under incubator-linkis/assembly-combined-package/asse
 
 In order to facilitate printing logs to the console during debugging, you need to modify the default log4j2.xml file and change the default appender to console: remove the default RollingFile appender and add a console appender, as shown below:
 ![](/Images/development/debug_log.png)
-log4j2.xml path incubator-linkis/assembly-combined-package/assembly-combined/conf/log4j2.xml
+log4j2.xml path incubator-linkis/linkis-dist/package/conf/log4j2.xml
 
 ```plain
  <?xml version="1.0" encoding="UTF-8"?>
@@ -76,7 +76,7 @@ You can use the "-Xbootclasspath/a: configuration file path" command. Append the
 org.apache.linkis.eureka.SpringCloudEurekaApplication
 
[VM Options]
--DserviceName=linkis-mg-eureka -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-mg-eureka -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Program arguments]
 --spring.profiles.active=eureka --eureka.instance.preferIpAddress=true
@@ -108,7 +108,7 @@ After startup, you can view the list of eureka services through [http://localhos
 org.apache.linkis.gateway.springcloud.LinkisGatewayApplication
 
[VM Options]
--DserviceName=linkis-mg-gateway -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-mg-gateway -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-gateway-server-support
@@ -125,7 +125,7 @@ Please exclude the dependency on spring-boot-starter-logging
 org.apache.linkis.filesystem.LinkisPublicServiceApp
 
[VM Options]
--DserviceName=linkis-ps-publicservice -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-ps-publicservice -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 #linkis < 1.1.0  by linkis-jobhistory  
@@ -140,7 +140,7 @@ linkis-storage-script-dev-server
 org.apache.linkis.cs.server.LinkisCSApplication
 
[VM Options]
--DserviceName=linkis-ps-cs -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-ps-cs -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-cs-server
@@ -155,7 +155,7 @@ linkis-cs-server
 org.apache.linkis.manager.am.LinkisManagerApplication
 
[VM Options]
--DserviceName=linkis-cg-linkismanager -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-cg-linkismanager -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-application-manager
@@ -166,7 +166,7 @@ linkis-application-manager
 org.apache.linkis.entrance.LinkisEntranceApplication
 
[VM Options]
--DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-entrance
diff --git a/docs/development/new-engine-conn.md b/docs/development/new-engine-conn.md
index 0c986f0217..5613606b83 100644
--- a/docs/development/new-engine-conn.md
+++ b/docs/development/new-engine-conn.md
@@ -172,14 +172,14 @@ The final effect presented to the user:
 An example command for JDBC engine module compilation is as follows:
 
 ```shell
-cd /linkis-project/linkis-engineconn-plugins/engineconn-plugins/jdbc
+cd /linkis-project/linkis-engineconn-plugins/jdbc
 
 mvn clean install -DskipTests
 ````
 
 When compiling a complete project, the new engine will not be added to the final tar.gz archive by default. If necessary, please modify the following files:
 
-assembly-combined-package/assembly-combined/src/main/assembly/assembly.xml
+linkis-dist/package/src/main/assembly/assembly.xml
 
 ```xml
 <!--jdbc-->
@@ -187,7 +187,7 @@ assembly-combined-package/assembly-combined/src/main/assembly/assembly.xml
   ......
   <fileSet>
       <directory>
-          ../../linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/
+          ../../linkis-engineconn-plugins/jdbc/target/out/
       </directory>
       <outputDirectory>lib/linkis-engineconn-plugins/</outputDirectory>
       <includes>
@@ -203,7 +203,7 @@ Then run the compile command in the project root directory:
 mvn clean install -DskipTests
 ````
 
-After successful compilation, find out.zip in the directories of assembly-combined-package/target/apache-linkis-1.x.x-incubating-bin.tar.gz and linkis-engineconn-plugins/engineconn-plugins/jdbc/target/.
+After successful compilation, find out.zip in linkis-dist/target/apache-linkis-1.x.x-incubating-bin.tar.gz and in the linkis-engineconn-plugins/jdbc/target/ directory.
 
 Upload the out.zip file to the Linkis deployment node and extract it to the Linkis installation directory /lib/linkis-engineconn-plugins/:
 
diff --git a/docs/engine-usage/elasticsearch.md b/docs/engine-usage/elasticsearch.md
index 9fb01b4adb..3bfc469690 100644
--- a/docs/engine-usage/elasticsearch.md
+++ b/docs/engine-usage/elasticsearch.md
@@ -23,7 +23,7 @@ You can click [Linkis engine installation guide](https://linkis.apache.org/zh-C
 Compile ElasticSearch engine separately
 
 ```
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/elasticsearch/
+${linkis_code_dir}/linkis-engineconn-plugins/elasticsearch/
 mvn clean install
 ```
 
@@ -31,7 +31,7 @@ mvn clean install
 
 Upload the engine package compiled in Step 2.1
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/elasticsearch
+${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/elasticsearch
 ```
 to the engine directory on the server
 ```bash 
diff --git a/docs/engine-usage/flink.md b/docs/engine-usage/flink.md
index ec036c3857..b9ffd1b93d 100644
--- a/docs/engine-usage/flink.md
+++ b/docs/engine-usage/flink.md
@@ -39,12 +39,12 @@ The Linkis Flink engine will not be installed in Linkis 1.0.2+ by default, and y
 
 ```
 The way to compile flink separately
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/flink/
+${linkis_code_dir}/linkis-engineconn-plugins/flink/
 mvn clean install
 ```
 The installation uses the compiled engine package, located at
 ```bash
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/flink/target/flink-engineconn.zip
+${linkis_code_dir}/linkis-engineconn-plugins/flink/target/flink-engineconn.zip
 ```
 Then deploy to
 ```bash
diff --git a/docs/engine-usage/openlookeng.md b/docs/engine-usage/openlookeng.md
index 8c52a72477..eef0cf5ed6 100644
--- a/docs/engine-usage/openlookeng.md
+++ b/docs/engine-usage/openlookeng.md
@@ -24,7 +24,7 @@ You can follow this guide to deploy and install https://linkis.apache.org/zh-CN/
 Compile openlookeng separately
 
 ````
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/openlookeng/
+${linkis_code_dir}/linkis-engineconn-plugins/openlookeng/
 mvn clean install
 ````
 
@@ -32,7 +32,7 @@ mvn clean install
 
 The engine package compiled in step 2.1 is located in
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/openlookeng/target/out/openlookeng
+${linkis_code_dir}/linkis-engineconn-plugins/openlookeng/target/out/openlookeng
 ````
 Upload to the engine directory of the server
 ```bash
diff --git a/docs/engine-usage/pipeline.md b/docs/engine-usage/pipeline.md
index 02cf04b193..8b7481b063 100644
--- a/docs/engine-usage/pipeline.md
+++ b/docs/engine-usage/pipeline.md
@@ -28,7 +28,7 @@ mvn clean install
 The engine package compiled in step 1.1 is located at
 
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/pipeline/target/out/pipeline
+${linkis_code_dir}/linkis-engineconn-plugins/pipeline/target/out/pipeline
 ```
 Upload to the engine directory of the server
 
diff --git a/docs/engine-usage/presto.md b/docs/engine-usage/presto.md
index d7d63f5add..c4414fd24b 100644
--- a/docs/engine-usage/presto.md
+++ b/docs/engine-usage/presto.md
@@ -21,7 +21,7 @@ You can follow this guide to deploy and install https://linkis.apache.org/zh-CN/
 Compile the Presto engine separately
 
 ````
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/presto/
+${linkis_code_dir}/linkis-engineconn-plugins/presto/
 mvn clean install
 ````
 
@@ -29,7 +29,7 @@ mvn clean install
 
 The engine package compiled in step 2.1 is located in
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/presto
+${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/presto
 ````
 Upload to the engine directory of the server
 ```bash
diff --git a/docs/engine-usage/sqoop.md b/docs/engine-usage/sqoop.md
index f8b062d363..cda14b6004 100644
--- a/docs/engine-usage/sqoop.md
+++ b/docs/engine-usage/sqoop.md
@@ -43,12 +43,12 @@ Note: Before compiling the sqoop engine, the linkis project needs to be fully co
 
 ```
 Compile sqoop separately:
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/sqoop/
+${linkis_code_dir}/linkis-engineconn-plugins/sqoop/
 mvn clean install
 ```
 The installation uses the compiled engine package, located at
 ```bash
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/sqoop/target/sqoop-engineconn.zip
+${linkis_code_dir}/linkis-engineconn-plugins/sqoop/target/sqoop-engineconn.zip
 ```
 and then deploy to 
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/development-specification/license.md b/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/development-specification/license.md
index 454065d216..dc55b694f7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/development-specification/license.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/development-specification/license.md
@@ -40,7 +40,7 @@ ASF(Apache基金会)下的开源项目,对于License有着极其严苛的要
 copyright notice that is included in or attached to the work.
 
 ### Example Scenario 1
-For example, the third-party file `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip` is introduced into the source code
+For example, the third-party file `linkis-engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip` is introduced into the source code
 
 Find the source branch of the version corresponding to py4j-0.10.7-src.zip; if there is no `LICENSE/NOTICE` file in the corresponding version branch, select the main branch
 - The project source code is located at: https://github.com/bartdag/py4j/tree/0.10.7/py4j-python
@@ -48,7 +48,7 @@ copyright notice that is included in or attached to the work.
 - NOTICE file: none
 
 The license information of `py4j-0.10.7-src.zip` needs to be stated in the `linkis/LICENSE` file.
-The detailed license.txt file corresponding to `py4j-0.10.7-src.zip` is placed in the same-level directory `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/LICENSE-py4j-0.10.7-src.txt`
+The detailed license.txt file corresponding to `py4j-0.10.7-src.zip` is placed in the same-level directory `linkis-engineconn-plugins/python/src/main/py4j/LICENSE-py4j-0.10.7-src.txt`
 Since https://github.com/bartdag/py4j/tree/0.10.7/py4j-python has no NOTICE file, there is no need to append to the `linkis/NOTICE` file.
 
 ### Example Scenario 2
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-release.md b/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-release.md
index ccad789fd7..d18dc77dff 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-release.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-release.md
@@ -319,7 +319,7 @@ Archives: 0
 0 Unknown Licenses
 ```
 <font color="red">
-If it is not 0, you need to confirm whether the source code explains the license of that binary or compressed file. You can refer to the `linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip` referenced in the source code
+If it is not 0, you need to confirm whether the source code explains the license of that binary or compressed file. You can refer to the `linkis-engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip` referenced in the source code
 </font>
 
 
@@ -338,7 +338,7 @@ $ mvn -DskipTests deploy -Prelease -Dmaven.javadoc.skip=true  -DretryFailedDeplo
 
 
 After the above command succeeds, the release package will be automatically uploaded to Apache's staging repository. All Artifacts deployed to the remote [maven repository](http://repository.apache.org/) are in the staging state. Visit https://repository.apache.org/#stagingRepositories and log in with the Apache LDAP account; you will see the uploaded version, and the content of the `Repository` column is ${STAGING.REPOSITORY}. **Click `Close` to tell Nexus that the build is complete; only then is the version usable**. If there is a problem such as with the electronic signature, `Close` will fail, and the failure information can be viewed via `Activity`.
-At the same time, the binary file `assembly-combined-package/target/apache-linkis-1.1.2-incubating-bin.tar.gz` is also generated
+At the same time, the binary file `linkis-dist/target/apache-linkis-1.1.2-incubating-bin.tar.gz` is also generated
 
 
 The commands in steps 2.4-3.3 are merged into the release.sh script, and can also be executed through the release.sh script (see the appendix at the end of this article)
@@ -358,9 +358,9 @@ $ git archive --format=tar.gz --output="dist/apache-linkis/apache-linkis-1.1.2-i
 
 ### 2.5 Copy binary files
 
-After step 2.3 is executed, the binary file has been generated, located at assembly-combined-package/target/apache-linkis-1.1.2-incubating-bin.tar.gz
+After step 2.3 is executed, the binary file has been generated, located at linkis-dist/target/apache-linkis-1.1.2-incubating-bin.tar.gz
 ```shell
-$ cp  assembly-combined-package/target/apache-linkis-1.1.2-incubating-bin.tar.gz   dist/apache-linkis
+$ cp  linkis-dist/target/apache-linkis-1.1.2-incubating-bin.tar.gz   dist/apache-linkis
 ```
 
 ### 2.6 Package the front-end management console (if the front end needs to be released)
@@ -384,7 +384,7 @@ $ npm install
 
 #### 2.6.3 Package the console project
 Execute the following commands in the terminal to package the project and generate a compressed deployment installation package.
-Check the web/package.json and web/.env files to verify that the front-end console version number is correct.
+Check the linkis-web/package.json and linkis-web/.env files to verify that the front-end console version number is correct.
 ```
 $ npm run build
 ```
@@ -414,9 +414,9 @@ $ npm install
 
 #### 2.6.4 Copy the console installation package
 
-After step 2.6.3 is executed, the console installation package has been generated, located at `web/apache-linkis-1.1.2-incubating-web-bin.tar.gz`
+After step 2.6.3 is executed, the console installation package has been generated, located at `linkis-web/apache-linkis-1.1.2-incubating-web-bin.tar.gz`
 ```shell
-$ cp  web/apache-linkis-1.1.2-incubating-web-bin.tar.gz   dist/apache-linkis
+$ cp  linkis-web/apache-linkis-1.1.2-incubating-web-bin.tar.gz   dist/apache-linkis
 ```
 
 ### 2.7 Sign the source package/binary package/sha512
@@ -943,7 +943,7 @@ git archive --format=tar.gz --output="dist/apache-linkis/apache-linkis-$release_
 echo  "git archive --format=tar.gz --output='dist/apache-linkis/apache-linkis-$release_version-incubating-src.tar.gz' --prefix=apache-linkis-$release_version-incubating-src/   $git_branch"
 
 #step2 Copy the compiled binary package
-cp  assembly-combined-package/target/apache-linkis-$release_version-incubating-bin.tar.gz   dist/apache-linkis
+cp  linkis-dist/target/apache-linkis-$release_version-incubating-bin.tar.gz   dist/apache-linkis
 
 #step3 Package the web (if the front end needs to be released)
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-verify.md b/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-verify.md
index 43aa9abfd5..624e53e7ea 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-verify.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs-community/current/how-to-verify.md
@@ -148,7 +148,7 @@ Archives: 0
 0 Unknown Licenses
 ```
 <font color="red">
-如果不为0,需要确认源码中是否有对该二进制或则压缩文件的license进行说明,可以参考源码中引用的`linkis-engineconn-plugins/engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip`
+如果不为0,需要确认源码中是否有对该二进制或则压缩文件的license进行说明,可以参考源码中引用的`linkis-engineconn-plugins/python/src/main/py4j/py4j-0.10.7-src.zip`
 </font>
 
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation-governance-services/engine/add-an-engine-conn.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation-governance-services/engine/add-an-engine-conn.md
index dd3851fd2f..f8d01e117c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation-governance-services/engine/add-an-engine-conn.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/architecture/computation-governance-services/engine/add-an-engine-conn.md
@@ -39,7 +39,7 @@ EngineConn的新增,是Linkis计算治理的计算任务准备阶段的核心
 2. EngineConnPluginServer服务在接收到资源请求后,会先通过传递过来的标签找到对应的引擎标签,通过引擎标签选择对应引擎的EngineConnPlugin。然后通过EngineConnPlugin的资源生成器,对客户端传入的引擎启动参数进行计算,算出本次申请新EngineConn所需的资源,然后返回给LinkisManager。
    
    **名词解释:**
-- EgineConnPlugin:是Linkis对接一个新的计算存储引擎必须要实现的接口,该接口主要包含了这种EngineConn在启动过程中必须提供的几个接口能力,包括EngineConn资源生成器、EngineConn启动命令生成器、EngineConn引擎连接器。具体的实现可以参考Spark引擎的实现类:[SparkEngineConnPlugin](https://github.com/apache/incubator-linkis/blob/master/linkis-engineconn-plugins/engineconn-plugins/spark/src/main/scala/com/webank/wedatasphere/linkis/engineplugin/spark/SparkEngineConnPlugin.scala)。
+- EngineConnPlugin:是Linkis对接一个新的计算存储引擎必须要实现的接口,该接口主要包含了这种EngineConn在启动过程中必须提供的几个接口能力,包括EngineConn资源生成器、EngineConn启动命令生成器、EngineConn引擎连接器。具体的实现可以参考Spark引擎的实现类:[SparkEngineConnPlugin](https://github.com/apache/incubator-linkis/blob/master/linkis-engineconn-plugins/spark/src/main/scala/com/webank/wedatasphere/linkis/engineplugin/spark/SparkEngineConnPlugin.scala)。
 
 - EngineConnPluginServer:是加载了所有的EngineConnPlugin,对外提供EngineConn的所需资源生成能力和EngineConn的启动命令生成能力的微服务。
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/engine-conn-plugin-installation.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/engine-conn-plugin-installation.md
index 70d1f9db95..942a6c789b 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/engine-conn-plugin-installation.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/engine-conn-plugin-installation.md
@@ -52,7 +52,7 @@ sh ./bin/linkis-cli -engineType hive-2.3.3 -codeType hql -code "show tables"  -s
 
 Linkis项目中包含的引擎模块`linkis-engineconn-plugins/engineconn-plugins`都是按这个目录进行打包配置的,
 如果是自己实现的新增引擎,需要按照上述的目录结构进行打包,可以参考hive的assembly配置方式来配置打包流程和配置,
-源码目录:`linkis-engineconn-plugins/engineconn-plugins/hive/src/main/assembly/distribution.xml`
+源码目录:`linkis-engineconn-plugins/hive/src/main/assembly/distribution.xml`
 
 ## 2. 引擎的安装
 
@@ -78,7 +78,7 @@ mvn clean install
 
 编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/{插件模块名}/target/out/{插件模块名}
+${linkis_code_dir}/linkis-engineconn-plugins/{插件模块名}/target/out/{插件模块名}
 ```
 
 ### 2.2 部署和加载
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/sourcecode-hierarchical-structure.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/sourcecode-hierarchical-structure.md
index 5709893aa5..448d886b4f 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/sourcecode-hierarchical-structure.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/sourcecode-hierarchical-structure.md
@@ -7,7 +7,7 @@ sidebar_position: 5
 
 
 ```html
-│-- assembly-combined-package //编译打包最后阶段步骤 整合所有lib包和安装部署脚本配置等
+│-- linkis-dist //编译打包最后阶段步骤 整合所有lib包和安装部署脚本配置等
 │        │-- assembly-combined
 │        │-- bin  安装相关的脚本
 │        │-- deploy-config //安装的配置
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-compile-and-package.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-compile-and-package.md
index da0b786f7f..dc9408df7d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-compile-and-package.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-compile-and-package.md
@@ -52,11 +52,11 @@ __编译环境要求:__  必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
 ```  
 
 ### step3 获取安装包
-编译后的完整安装包,在工程的assembly-combined-package->target目录下:
+编译后的完整安装包,在工程的linkis-dist->target目录下:
 
 ```bash
     #详细路径如下
-    incubator-linkis-x.x.x/assembly-combined-package/target/apache-linkis-x.x.x-incubating-bin.tar.gz
+    incubator-linkis-x.x.x/linkis-dist/target/apache-linkis-x.x.x-incubating-bin.tar.gz
 ```
 
 ## 3 常见问题 
@@ -97,7 +97,7 @@ __编译环境要求:__  必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
 进入到 Spark 引擎所在的目录进行编译打包,命令如下:
    
 ```bash   
-    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark
+    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/spark
     mvn clean install
 ```
 #### step2 获取引擎的物料包       
@@ -105,23 +105,23 @@ __编译环境要求:__  必须 **JDK8** 以上,**Oracle/Sun** 和 **OpenJDK
 
 ```
    #spark文件下就是编译好的引擎物料
-   incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark/target/out/spark
+   incubator-linkis-x.x.x/linkis-engineconn-plugins/spark/target/out/spark
 ```
 如何单独安装 Spark 引擎?请参考 [Linkis 引擎插件安装文档](../deployment/engine-conn-plugin-installation)
 
 
 ### 3.2 如何将非默认引擎打包至安装部署包中 
  
-> 默认打包配置中`assembly-combined-package/assembly-combined/src/main/assembly/assembly.xml`,只会将`spark/hive/python/shell`打包到安装包中,如果需要添加其它引擎,可参考此步骤 
+> 默认打包配置中`linkis-dist/src/main/assembly/distribution.xml`,只会将`spark/hive/python/shell`打包到安装包中,如果需要添加其它引擎,可参考此步骤 
 
 以jdbc引擎为例 
 
-step1 修改`assembly-combined-package/assembly-combined/src/main/assembly/assembly.xml` 添加jdbc引擎
+step1 修改`linkis-dist/src/main/assembly/distribution.xml` 添加jdbc引擎
 ```shell script
  <!--jdbc-->
     <fileSet>
       <directory>
-        ../../linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/
+        ../../linkis-engineconn-plugins/jdbc/target/out/
       </directory>
       <outputDirectory>lib/linkis-engineconn-plugins/</outputDirectory>
       <includes>
@@ -129,7 +129,7 @@ step1 修改`assembly-combined-package/assembly-combined/src/main/assembly/assem
       </includes>
 </fileSet>
 ```
-step2 如果已经全量编译,可以直接重新编译`assembly-combined-package`模块,如果没有,这执行全量编译
+step2 如果已经全量编译,可以直接重新编译`linkis-dist`模块,如果没有,则执行全量编译
 
  
 ## 4. 如何修改Linkis的依赖的Hadoop、Hive、Spark版本
@@ -184,7 +184,7 @@ pom:Linkis/linkis-commons/linkis-hadoop-common/pom.xml
 这里以修改 Spark 的版本为例进行介绍。进入 Spark 引擎所在的目录,手动修改 pom.xml 文件的 Spark 版本信息,具体如下:
 
 ```bash
-    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/engineconn-plugins/spark
+    cd incubator-linkis-x.x.x/linkis-engineconn-plugins/spark
     vim pom.xml
 ```
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug-in-mac.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug-in-mac.md
index 3638467193..ec31c0a7d3 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug-in-mac.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug-in-mac.md
@@ -247,7 +247,7 @@ linkis-cg-entrance
 linkis-entrance
 
 [VM Opitons]
--DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [main Class]
 org.apache.linkis.entrance.LinkisEntranceApplication
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug.md
index d95d8ce3dd..e9f9f75044 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/linkis-debug.md
@@ -22,13 +22,13 @@ mvn clean Install
 
 ## step2 必要的参数配置
 
-对于incubator-linkis/assembly-combined-package/assembly-combined/conf/下的配置文件,需要对数据库以及hive meta等必要启动参数进行配置。 
+对于incubator-linkis/linkis-dist/package/conf/下的配置文件,需要对数据库以及hive meta等必要启动参数进行配置。 
 
 ## step3 调整log4j.xml配置
 
 为了方便调试的时候将日志打印到控制台,需要修改下默认的log4j2.xml文件,修改appender默认为console。需要移除默认的RollingFile的append,增加console的appender,如下所示:
 ![](/Images/development/debug_log.png)
-log4j2.xml 路径 incubator-linkis/assembly-combined-package/assembly-combined/conf/log4j2.xml
+log4j2.xml 路径 incubator-linkis/linkis-dist/package/conf/log4j2.xml
 
 ```plain
  <?xml version="1.0" encoding="UTF-8"?>
@@ -73,7 +73,7 @@ Linkis和DSS的服务都依赖Eureka,所以需要首先启动Eureka服务,Eu
 org.apache.linkis.eureka.SpringCloudEurekaApplication
 
 [VM Opitons]
--DserviceName=linkis-mg-eureka -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-mg-eureka -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Program arguments]
 --spring.profiles.active=eureka --eureka.instance.preferIpAddress=true
@@ -104,7 +104,7 @@ server:
 org.apache.linkis.gateway.springcloud.LinkisGatewayApplication
 
 [VM Opitons]
--DserviceName=linkis-mg-gateway -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-mg-gateway -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-gateway-server-support
@@ -122,7 +122,7 @@ linkis-gateway-server-support
 org.apache.linkis.filesystem.LinkisPublicServiceApp
 
 [VM Opitons]
--DserviceName=linkis-ps-publicservice -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-ps-publicservice -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 #linkis < 1.1.0  为linkis-jobhistory  
@@ -137,7 +137,7 @@ linkis-storage-script-dev-server
 org.apache.linkis.cs.server.LinkisCSApplication
 
 [VM Opitons]
--DserviceName=linkis-ps-cs -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-ps-cs -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-cs-server
@@ -151,7 +151,7 @@ linkis-cs-server
 org.apache.linkis.manager.am.LinkisManagerApplication
 
 [VM Opitons]
--DserviceName=linkis-cg-linkismanager -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-cg-linkismanager -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-application-manager
@@ -162,7 +162,7 @@ linkis-application-manager
 org.apache.linkis.entrance.LinkisEntranceApplication
 
 [VM Opitons]
--DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\assembly-combined-package\assembly-combined\conf
+-DserviceName=linkis-cg-entrance -Xbootclasspath/a:D:\yourDir\incubator-linkis\linkis-dist\package\conf
 
 [Use classpath of module]
 linkis-entrance
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/new-engine-conn.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/new-engine-conn.md
index 8f163fb0dd..07d50c8614 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/new-engine-conn.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/new-engine-conn.md
@@ -140,7 +140,7 @@ val CODE_TYPE_AND_RUN_TYPE_RELATION = CommonVars("wds.linkis.codeType.runType.re
 
 ### 2.6 Linkis管理员台界面引擎管理器中加入JDBC引擎文字提示或图标
 
-web/src/dss/module/resourceSimple/engine.vue
+linkis-web/src/dss/module/resourceSimple/engine.vue
 
 ```js
 methods: {
@@ -172,28 +172,28 @@ methods: {
 JDBC引擎模块编译的示例命令如下:
 
 ```shell
-cd /linkis-project/linkis-engineconn-plugins/engineconn-plugins/jdbc
+cd /linkis-project/linkis-engineconn-plugins/jdbc
 
 mvn clean install -DskipTests
 ```
 
 编译完整项目时,新增引擎默认不会加到最终的tar.gz压缩包中,如果需要,请修改如下文件:
 
-assembly-combined-package/assembly-combined/src/main/assembly/assembly.xml
+linkis-dist/src/main/assembly/distribution.xml
 
 ```xml
 <!--jdbc-->
 <fileSets>
   ......
-  <fileSet>
-      <directory>
-          ../../linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/
-      </directory>
-      <outputDirectory>lib/linkis-engineconn-plugins/</outputDirectory>
-      <includes>
-          <include>**/*</include>
-      </includes>
-  </fileSet>
+        <fileSet>
+            <directory>
+                ../linkis-engineconn-plugins/jdbc/target/out/
+            </directory>
+            <outputDirectory>linkis-package/lib/linkis-engineconn-plugins/</outputDirectory>
+            <includes>
+                <include>**/*</include>
+            </includes>
+        </fileSet>
 </fileSets>
 ```
 
@@ -203,7 +203,7 @@ assembly-combined-package/assembly-combined/src/main/assembly/assembly.xml
 mvn clean install -DskipTests
 ```
 
-编译成功后在assembly-combined-package/target/apache-linkis-1.x.x-incubating-bin.tar.gz和linkis-engineconn-plugins/engineconn-plugins/jdbc/target/目录下找到out.zip。
+编译成功后在linkis-dist/target/apache-linkis-1.x.x-incubating-bin.tar.gz和linkis-engineconn-plugins/jdbc/target/目录下找到out.zip。
 
 上传out.zip文件到Linkis的部署节点,解压缩到Linkis安装目录/lib/linkis-engineconn-plugins/下面:
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/web-build.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/web-build.md
index 7f971bcad5..a44c47bf24 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/web-build.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/development/web-build.md
@@ -24,7 +24,7 @@ sidebar_position: 4
 在终端命令行中执行以下指令:
 ```
 #进入项目WEB根目录
-$ cd incubator-linkis/web
+$ cd incubator-linkis/linkis-web
 #安装项目所需依赖
 $ npm install
 ```
@@ -34,7 +34,7 @@ $ npm install
 ## 2.2. 打包项目
 
 在终端命令行执行以下指令对项目进行打包,生成压缩后的部署安装包。
-检查web/package.json,web/.env文件,检查前端管理台版本号是否正确。
+检查`linkis-web/package.json`,`linkis-web/.env`文件,检查前端管理台版本号是否正确。
 ```
 $ npm run build
 ```
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/elasticsearch.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/elasticsearch.md
index f75c64d4dd..8ce2f25456 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/elasticsearch.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/elasticsearch.md
@@ -19,7 +19,7 @@ sidebar_position: 11
 单独编译 ElasticSearch 引擎 
 
 ```
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/elasticsearch/
+${linkis_code_dir}/linkis-engineconn-plugins/elasticsearch/
 mvn clean install
 ```
 
@@ -27,7 +27,7 @@ mvn clean install
 
 将 2.1 步编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/elasticsearch
+${linkis_code_dir}/linkis-engineconn-plugins/elasticsearch/target/out/elasticsearch
 ```
 上传到服务器的引擎目录下
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/flink.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/flink.md
index 96b3c4ba51..47ae09d804 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/flink.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/flink.md
@@ -41,12 +41,12 @@ Linkis Flink引擎默认在Linkis1.0.2+不会安装,需要您手动进行编
 
 ```
 单独编译flink的方式
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/flink/
+${linkis_code_dir}/linkis-engineconn-plugins/flink/
 mvn clean install
 ```
 安装方式是将编译出来的引擎包,位置在
 ```bash
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/flink/target/flink-engineconn.zip
+${linkis_code_dir}/linkis-engineconn-plugins/flink/target/flink-engineconn.zip
 ```
 然后部署到
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/jdbc.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/jdbc.md
index e77f2dd3a5..e97a795973 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/jdbc.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/jdbc.md
@@ -21,7 +21,7 @@ sidebar_position: 7
 单独编译jdbc引擎 
 
 ```
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/jdbc/
+${linkis_code_dir}/linkis-engineconn-plugins/jdbc/
 mvn clean install
 ```
 
@@ -29,7 +29,7 @@ mvn clean install
 
 将 2.1 步编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/jdbc
+${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/jdbc
 ```
 上传到服务器的引擎目录下
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/openlookeng.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/openlookeng.md
index 099e1d268f..7911cb5483 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/openlookeng.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/openlookeng.md
@@ -32,7 +32,7 @@ mvn clean install
 
 将 2.1 步编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/openlookeng/target/out/openlookeng
+${linkis_code_dir}/linkis-engineconn-plugins/openlookeng/target/out/openlookeng
 ```
 上传到服务器的引擎目录下
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/overview.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/overview.md
index 4ceddab0a2..49603117a9 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/overview.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/overview.md
@@ -3,8 +3,12 @@ title: 总览
 sidebar_position: 0
 ---
 ## 1. 概述
-&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Linkis作为一款功能强大的计算中间件,可以方便的对接不同的计算引擎,通过屏蔽不同计算引擎的使用细节,并向上提供了一套统一的使用接口,使得部署和应用Linkis的大数据平台的运维成本大大降低,目前,Linkis已经对接了几款主流的计算引擎,基本上涵盖了上生产上对数据的需求,为了提供更好的可拓展性,Linkis同时提供了接入新引擎的相关接口,可以利用该接口接入新的计算引擎。  
-&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;引擎是提供给用户数据处理和分析能力的组件,目前已经接入Linkis的引擎,有主流的大数据计算引擎Spark、Hive、Presto等,也有python、Shell这些脚本处理数据能力的引擎。DataSphereStudio作为对接了Linkis的一站式数据操作平台,用户可以方便的在DataSphereStudio中使用Linkis支持的引擎完成交互式数据分析任务和工作流任务。
+Linkis作为一款功能强大的计算中间件,可以方便的对接不同的计算引擎,通过屏蔽不同计算引擎的使用细节,并向上提供了一套统一的使用接口,
+使得部署和应用Linkis的大数据平台的运维成本大大降低,目前,Linkis已经对接了几款主流的计算引擎,基本上涵盖了生产上对数据的需求,
+为了提供更好的可拓展性,Linkis同时提供了接入新引擎的相关接口,可以利用该接口接入新的计算引擎。 
+ 
+引擎是提供给用户数据处理和分析能力的组件,目前已经接入Linkis的引擎,有主流的大数据计算引擎Spark、Hive、Presto等,也有python、Shell这些脚本处理数据能力的引擎。
+DataSphereStudio作为对接了Linkis的一站式数据操作平台,用户可以方便的在DataSphereStudio中使用Linkis支持的引擎完成交互式数据分析任务和工作流任务。
 
 | 引擎          | 是否支持Scriptis |   是否支持工作流   |
 |-------------| ----  | ---- |
@@ -24,4 +28,8 @@ sidebar_position: 0
 - [Shell 引擎使用](shell.md)  
 - [JDBC 引擎使用](jdbc.md)  
 - [Flink 引擎使用](flink.md)  
-- [OpenLooKeng 引擎使用](openlookeng.md) 
\ No newline at end of file
+- [OpenLooKeng 引擎使用](openlookeng.md) 
+- [PipeLine 引擎使用](pipeline.md) 
+- [Sqoop 引擎使用](sqoop.md) 
+- [Presto 引擎使用](presto.md) 
+- [Elasticsearch 引擎使用](elasticsearch.md) 
\ No newline at end of file
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/pipeline.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/pipeline.md
index 4492819cfb..df028db87f 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/pipeline.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/pipeline.md
@@ -28,7 +28,7 @@ mvn clean install
 
 编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/pipeline/target/out/pipeline
+${linkis_code_dir}/linkis-engineconn-plugins/pipeline/target/out/pipeline
 ```
 
 ### 1.2 物料的部署和加载
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/presto.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/presto.md
index a28e722e23..cf551abc4a 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/presto.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/presto.md
@@ -20,7 +20,7 @@ https://linkis.apache.org/zh-CN/blog/2022/04/15/how-to-download-engineconn-plugi
 单独编译 Presto 引擎 
 
 ```
-${linkis_code_dir}/linkis-enginepconn-lugins/engineconn-plugins/presto/
+${linkis_code_dir}/linkis-engineconn-plugins/presto/
 mvn clean install
 ```
 
@@ -28,7 +28,7 @@ mvn clean install
 
 将 2.1 步编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/presto
+${linkis_code_dir}/linkis-engineconn-plugins/presto/target/out/presto
 ```
 上传到服务器的引擎目录下
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/sqoop.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/sqoop.md
index 8e52c0f7d8..a9d0600b72 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/sqoop.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/sqoop.md
@@ -54,7 +54,7 @@ Linkis 1.1.2及以上支持的主流Sqoop版本1.4.6与1.4.7,更高版本可
 
 ```
 单独编译sqoop的方式
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/sqoop/
+${linkis_code_dir}/linkis-engineconn-plugins/sqoop/
 mvn clean install
 ```
 安装方式是将编译出来的引擎包,位置在
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/introduction.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/introduction.md
index 80f8ef795a..794e87f94c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/introduction.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/introduction.md
@@ -13,16 +13,16 @@ Linkis 自2019年开源发布以来,已累计积累了700多家试验企业和
 
 ## 核心特点
 - **丰富的底层计算存储引擎支持**。  
-    **目前支持的计算存储引擎**:Spark、Hive、Flink、Python、Pipeline、Sqoop、openLooKeng、JDBC和Shell等。  
-    **正在支持中的计算存储引擎**:Presto(计划1.2.0)、ElasticSearch(计划1.2.0)等。  
+    **目前支持的计算存储引擎**:Spark、Hive、Flink、Python、Pipeline、Sqoop、openLooKeng、Presto、ElasticSearch、JDBC和Shell等。  
+    **正在支持中的计算存储引擎**:Trino(计划1.3.1)、SeaTunnel(计划1.3.1)等。  
     **支持的脚本语言**:SparkSQL, HiveQL, Python, Shell, Pyspark, R, Scala 和JDBC 等。    
-- **强大的计算治理能力**。基于Orchestrator、Label Manager和定制的Spring Cloud Gateway等服务,Linkis能够提供基于多级标签的跨集群/跨IDC 细粒度路由、负载均衡、多租户、流量控制、资源控制和编排策略(如双活、主备等)支持能力。  
-- **全栈计算存储引擎架构支持**。能够接收、执行和管理针对各种计算存储引擎的任务和请求,包括离线批量任务、交互式查询任务、实时流式任务和存储型任务;
-- **资源管理能力**。 ResourceManager 不仅具备对 Yarn 和 Linkis EngineManager 的资源管理能力,还将提供基于标签的多级资源分配和回收能力,让 ResourceManager 具备跨集群、跨计算资源类型的强大资源管理能力。
-- **统一上下文服务**。为每个计算任务生成context id,跨用户、系统、计算引擎的关联管理用户和系统资源文件(JAR、ZIP、Properties等),结果集,参数变量,函数等,一处设置,处处自动引用;
-- **统一物料**。系统和用户级物料管理,可分享和流转,跨用户、系统共享物料。
-- **统一数据源管理**。提供了hive、es、mysql、kafka类型数据源的增删查改、版本控制、连接测试等功能。
-- **数据源对应的元数据查询**。提供了hive、es、mysql、kafka元数据的数据库、表、分区查询。
+- **强大的计算治理能力** 基于Orchestrator、Label Manager和定制的Spring Cloud Gateway等服务,Linkis能够提供基于多级标签的跨集群/跨IDC 细粒度路由、负载均衡、多租户、流量控制、资源控制和编排策略(如双活、主备等)支持能力。  
+- **全栈计算存储引擎架构支持** 能够接收、执行和管理针对各种计算存储引擎的任务和请求,包括离线批量任务、交互式查询任务、实时流式任务和存储型任务;
+- **资源管理能力**  ResourceManager 不仅具备对 Yarn 和 Linkis EngineManager 的资源管理能力,还将提供基于标签的多级资源分配和回收能力,让 ResourceManager 具备跨集群、跨计算资源类型的强大资源管理能力。
+- **统一上下文服务** 为每个计算任务生成context id,跨用户、系统、计算引擎的关联管理用户和系统资源文件(JAR、ZIP、Properties等),结果集,参数变量,函数等,一处设置,处处自动引用;
+- **统一物料** 系统和用户级物料管理,可分享和流转,跨用户、系统共享物料。
+- **统一数据源管理** 提供了hive、es、mysql、kafka类型数据源的增删查改、版本控制、连接测试等功能。
+- **数据源对应的元数据查询** 提供了hive、es、mysql、kafka元数据的数据库、表、分区查询。
 
 ## 支持的引擎类型
 | **引擎名** | **支持底层组件版本<br/>(默认依赖版本)** | **Linkis 1.X 版本要求** | **是否默认包含在发布包中** | **说明** |
@@ -36,9 +36,9 @@ Linkis 自2019年开源发布以来,已累计积累了700多家试验企业和
 |Pipeline|-|\>=1.0.2|否|Pipeline EngineConn, 支持文件的导入和导出|
 |openLooKeng|openLooKeng >= 1.5.0, <br/>(默认openLookEng 1.5.0)|\>=1.1.1|否|openLooKeng EngineConn, 支持用Sql查询数据虚拟化引擎openLooKeng|
 |Sqoop| Sqoop >= 1.4.6, <br/>(默认Apache Sqoop 1.4.6)|\>=1.1.2|否|Sqoop EngineConn, 支持 数据迁移工具 Sqoop 引擎|
+|Presto|Presto >= 0.180|\>=1.2.0|否|Presto EngineConn, 支持Presto SQL 代码|
+|ElasticSearch|ElasticSearch >=6.0|\>=1.2.0|否|ElasticSearch EngineConn, 支持SQL 和DSL 代码|
 |Impala|Impala >= 3.2.0, CDH >=6.3.0|ongoing|-|Impala EngineConn,支持Impala SQL 代码|
-|Presto|Presto >= 0.180|ongoing|-|Presto EngineConn, 支持Presto SQL 代码|
-|ElasticSearch|ElasticSearch >=6.0|ongoing|-|ElasticSearch EngineConn, 支持SQL 和DSL 代码|
 |MLSQL| MLSQL >=1.1.0|ongoing|-|MLSQL EngineConn, 支持MLSQL 代码.|
 |Hadoop|Apache >=2.6.0, <br/>CDH >=5.4.0|ongoing|-|Hadoop EngineConn, 支持Hadoop MR/YARN application|
 |TiSpark|1.1|ongoing|-|TiSpark EngineConn, 支持用SparkSQL 查询TiDB|
@@ -94,8 +94,8 @@ Linkis 基于微服务架构开发,其服务可以分为3类:计算治理服
 
 
 ## 联系我们
-对Linkis 的任何问题和建议,敬请提交issue,以便跟踪处理和经验沉淀共享。  
-您也可以扫描下面的二维码,加入我们的微信群,以获得更快速的响应。
+对Linkis 的任何问题和建议,敬请提交issue,以便跟踪处理和经验沉淀共享。
+您也可以扫描下面的二维码,加入我们的微信群,以获得更快速的响应。
 ![introduction05](/Images/wedatasphere_contact_01.png)
 
 Meetup 视频 [Bilibili](https://space.bilibili.com/598542776?from=search&seid=14344213924133040656)。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/elasticsearch.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/elasticsearch.md
index f75c64d4dd..6d663937bd 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/elasticsearch.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/elasticsearch.md
@@ -27,7 +27,7 @@ mvn clean install
 
 将 2.1 步编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/elasticsearch
+${linkis_code_dir}/linkis-engineconn-plugins/elasticsearch/target/out/elasticsearch
 ```
 上传到服务器的引擎目录下
 ```bash 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/presto.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/presto.md
index 1a10577e3b..aa9ad90102 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/presto.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.3/engine-usage/presto.md
@@ -28,7 +28,7 @@ mvn clean install
 
 将 2.1 步编译出来的引擎包,位于
 ```bash
-${linkis_code_dir}/linkis-engineconn-plugins/engineconn-plugins/jdbc/target/out/presto
+${linkis_code_dir}/linkis-engineconn-plugins/presto/target/out/presto
 ```
 上传到服务器的引擎目录下
 ```bash 
diff --git a/info.txt b/info.txt
index 8e28344a04..f714dde3c6 100644
--- a/info.txt
+++ b/info.txt
@@ -65,4 +65,24 @@ cp -r current.json version-1.1.2.json
 
 
 i18n/zh-CN/code.json
-theme.docs.versions.unreleasedVersionLabel
\ No newline at end of file
+theme.docs.versions.unreleasedVersionLabel
+
+[1.1.3->1.2.0]
+assembly-combined-package/assembly-combined
+linkis-dist/package
+
+assembly-combined-package\assembly-combined
+linkis-dist\package
+
+/linkis-enginepconn-lugins/engineconn-plugins/
+->
+/linkis-engineconn-plugins/
+
+linkis-engineconn-plugins/engineconn-plugins/
+->
+linkis-engineconn-plugins/
+
+web->linkis-web
+
+assembly-combined-package
+->
\ No newline at end of file

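The `[1.1.3->1.2.0]` mappings appended to info.txt above amount to a mechanical search-and-replace across the docs tree, which is essentially what this commit performed by hand. A minimal sketch of scripting it (the mapping table is copied from info.txt; `docs_root` and the function names are hypothetical, and the bare `web -> linkis-web` rename is anchored to `incubator-linkis/web` here because replacing the word "web" alone would be unsafe):

```python
from pathlib import Path

# Old-path -> new-path substrings, as recorded in info.txt (1.1.3 -> 1.2.0).
# Order matters: the misspelled engineconn variant and the longer
# assembly-combined paths must be rewritten before the shorter prefixes.
PATH_MAPPINGS = [
    ("/linkis-enginepconn-lugins/engineconn-plugins/", "/linkis-engineconn-plugins/"),
    ("linkis-engineconn-plugins/engineconn-plugins/", "linkis-engineconn-plugins/"),
    ("assembly-combined-package/assembly-combined", "linkis-dist/package"),
    ("assembly-combined-package\\assembly-combined", "linkis-dist\\package"),
    ("assembly-combined-package", "linkis-dist"),
    ("incubator-linkis/web", "incubator-linkis/linkis-web"),
]


def migrate_paths(text: str) -> str:
    """Apply each old -> new mapping, in order, to one document's text."""
    for old, new in PATH_MAPPINGS:
        text = text.replace(old, new)
    return text


def migrate_tree(docs_root: Path) -> None:
    """Rewrite every markdown file under docs_root in place."""
    for md in docs_root.rglob("*.md"):
        original = md.read_text(encoding="utf-8")
        updated = migrate_paths(original)
        if updated != original:
            md.write_text(updated, encoding="utf-8")
```

Because plain substring replacement is blind to context, a pass like this still needs a manual diff review afterwards, exactly as in this commit.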
