Posted to commits@dolphinscheduler.apache.org by ke...@apache.org on 2022/09/04 07:32:40 UTC

[dolphinscheduler] branch cve updated (85200811c1 -> 4078d16a61)

This is an automated email from the ASF dual-hosted git repository.

kezhenxu94 pushed a change to branch cve
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler.git


 discard 85200811c1 Bump up dependencies to fix cves
     add 31ba49ac02 Update doc according to recent update of Spring / Swagger (#11755)
     add f8d46a26c1 [Feature-11530] add state history for process instance (#11757)
     add d0d481d10f [Fix][UI] Add the task name entry when the workflow instance detail page goes to the task instance page. (#11761)
     new 4078d16a61 Bump up dependencies to fix cves

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (85200811c1)
            \
             N -- N -- N   refs/heads/cve (4078d16a61)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
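
If you want to inspect this rewrite locally, a minimal sketch (assuming
you have a clone of the repository with the remote named origin; the
discarded commit is only viewable if its objects still exist in your
local clone, since it has been removed from the remote branch):

    # fetch the rewritten branch
    git fetch origin cve

    # the N revisions: reachable from the new tip but not the old one
    git log --oneline 85200811c1..4078d16a61

    # the discarded O revisions: reachable from the old tip but not the new one
    git log --oneline 4078d16a61..85200811c1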


Summary of changes:
 .../en/contribute/development-environment-setup.md | 12 +--
 docs/docs/en/faq.md                                |  5 +-
 docs/docs/en/guide/open-api.md                     | 11 +--
 .../zh/contribute/development-environment-setup.md |  8 +-
 docs/docs/zh/faq.md                                | 22 ++++-
 docs/docs/zh/guide/open-api.md                     | 12 ++-
 .../api/service/impl/ExecutorServiceImpl.java      |  8 +-
 .../service/impl/ProcessInstanceServiceImpl.java   |  6 +-
 .../api/service/ProcessInstanceServiceTest.java    |  6 +-
 .../dolphinscheduler/dao/DaoConfiguration.java     |  2 +-
 .../dao/entity/ProcessInstance.java                | 46 +++++++++++
 .../dao/repository/ProcessInstanceDao.java         | 19 +++--
 .../repository/impl/ProcessInstanceDaoImpl.java    | 55 +++++++++++++
 .../dao/mapper/ProcessInstanceMapper.xml           |  4 +-
 .../src/main/resources/sql/dolphinscheduler_h2.sql |  1 +
 .../main/resources/sql/dolphinscheduler_mysql.sql  |  1 +
 .../resources/sql/dolphinscheduler_postgresql.sql  |  1 +
 .../3.1.1_schema/mysql/dolphinscheduler_ddl.sql    |  2 +
 .../postgresql/dolphinscheduler_ddl.sql            |  4 +-
 .../master/runner/MasterSchedulerBootstrap.java    | 49 ++++++-----
 .../master/runner/WorkflowExecuteRunnable.java     | 16 ++--
 .../master/runner/task/BaseTaskProcessor.java      |  4 +
 .../master/runner/task/BlockingTaskProcessor.java  |  2 +-
 .../master/runner/task/SubTaskProcessor.java       |  8 +-
 .../master/runner/WorkflowExecuteRunnableTest.java | 20 ++---
 .../service/process/ProcessService.java            |  6 --
 .../service/process/ProcessServiceImpl.java        | 94 +++++++---------------
 .../service/process/ProcessServiceTest.java        |  3 +
 .../projects/task/components/node/detail-modal.tsx |  5 +-
 .../src/views/projects/task/instance/use-table.ts  |  3 +-
 30 files changed, 280 insertions(+), 155 deletions(-)
 copy dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/ProcessInstanceDto.java => dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/repository/ProcessInstanceDao.java (66%)
 create mode 100644 dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/repository/impl/ProcessInstanceDaoImpl.java


[dolphinscheduler] 01/01: Bump up dependencies to fix cves

Posted by ke...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kezhenxu94 pushed a commit to branch cve
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler.git

commit 4078d16a616156ac85184763088b3c834a389e87
Author: kezhenxu94 <ke...@apache.org>
AuthorDate: Sun Sep 4 15:30:38 2022 +0800

    Bump up dependencies to fix cves
---
 docs/docs/en/faq.md                                |   2 +-
 docs/docs/zh/faq.md                                | 161 +++++++++++++--------
 .../api/controller/LoginController.java            |   2 +-
 .../dolphinscheduler/api/dto/ClusterDto.java       |   2 +-
 .../dolphinscheduler/api/dto/EnvironmentDto.java   |   2 +-
 .../api/dto/resources/ResourceComponent.java       |  22 +--
 .../api/dto/treeview/Instance.java                 |   9 +-
 .../api/interceptor/LoginHandlerInterceptor.java   |  10 +-
 .../api/service/impl/ClusterServiceImpl.java       |  14 +-
 .../api/service/impl/EnvironmentServiceImpl.java   |  94 ++++++------
 .../api/service/impl/TaskGroupServiceImpl.java     |  64 ++++----
 .../api/vo/AlertPluginInstanceVO.java              |   2 +-
 .../apache/dolphinscheduler/api/vo/ScheduleVo.java |  45 +++---
 .../api/service/ClusterServiceTest.java            |  64 ++++----
 .../api/service/EnvironmentServiceTest.java        |  67 +++++----
 .../api/service/K8SNamespaceServiceTest.java       |  28 ++--
 .../api/service/UsersServiceTest.java              | 123 ++++++++--------
 dolphinscheduler-bom/pom.xml                       |  33 ++++-
 .../common/model/WorkerServerModel.java            |   2 +-
 .../org/apache/dolphinscheduler/dao/PluginDao.java |   6 +-
 .../dao/datasource/SpringConnectionFactory.java    |  47 ++++--
 .../dolphinscheduler/dao/entity/AccessToken.java   |   5 +-
 .../dolphinscheduler/dao/entity/AlertGroup.java    |  25 ++--
 .../dao/entity/AlertPluginInstance.java            |   5 +-
 .../dao/entity/AlertSendStatus.java                |  19 +--
 .../dolphinscheduler/dao/entity/AuditLog.java      |   2 +-
 .../dolphinscheduler/dao/entity/Cluster.java       |   4 +-
 .../dolphinscheduler/dao/entity/Command.java       |  39 +++--
 .../dolphinscheduler/dao/entity/DataSource.java    |   5 +-
 .../dao/entity/DatasourceUser.java                 |   4 +-
 .../dao/entity/DqComparisonType.java               |   5 +-
 .../dao/entity/DqExecuteResult.java                |   9 +-
 .../apache/dolphinscheduler/dao/entity/DqRule.java |   5 +-
 .../dao/entity/DqRuleExecuteSql.java               |   7 +-
 .../dao/entity/DqRuleInputEntry.java               |   7 +-
 .../dao/entity/DqTaskStatisticsValue.java          |   7 +-
 .../dolphinscheduler/dao/entity/Environment.java   |   4 +-
 .../dao/entity/EnvironmentWorkerGroupRelation.java |   4 +-
 .../dolphinscheduler/dao/entity/ErrorCommand.java  |   5 +-
 .../apache/dolphinscheduler/dao/entity/K8s.java    |   5 +-
 .../dao/entity/K8sNamespaceUser.java               |  23 +--
 .../dolphinscheduler/dao/entity/PluginDefine.java  |   5 +-
 .../dao/entity/ProcessDefinition.java              |  75 +++++-----
 .../dao/entity/ProcessInstance.java                |   5 +-
 .../dao/entity/ProcessInstanceMap.java             |   4 +-
 .../dao/entity/ProcessTaskRelation.java            |  49 ++++---
 .../dolphinscheduler/dao/entity/Project.java       |   5 +-
 .../dolphinscheduler/dao/entity/ProjectUser.java   |  25 ++--
 .../apache/dolphinscheduler/dao/entity/Queue.java  |  10 +-
 .../dolphinscheduler/dao/entity/Resource.java      |  40 ++---
 .../dolphinscheduler/dao/entity/Schedule.java      |   5 +-
 .../dao/entity/TaskDefinition.java                 |  51 +++----
 .../dolphinscheduler/dao/entity/TaskGroup.java     |   5 +-
 .../dao/entity/TaskGroupQueue.java                 |  28 ++--
 .../dolphinscheduler/dao/entity/TaskInstance.java  |   7 +-
 .../apache/dolphinscheduler/dao/entity/Tenant.java |  12 +-
 .../dolphinscheduler/dao/entity/UdfFunc.java       |  19 +--
 .../apache/dolphinscheduler/dao/entity/User.java   |  16 +-
 .../dolphinscheduler/dao/entity/WorkerGroup.java   |   5 +-
 .../dolphinscheduler/dao/entity/WorkerServer.java  |   3 +-
 .../dao/mapper/CommandMapperTest.java              |  44 +++---
 .../dao/mapper/DataSourceMapperTest.java           |  58 ++++----
 .../dao/mapper/ProcessDefinitionLogMapperTest.java |  10 +-
 .../dao/mapper/ResourceMapperTest.java             |  61 ++++----
 .../dao/mapper/TaskDefinitionLogMapperTest.java    |   2 +-
 .../dao/mapper/TaskDefinitionMapperTest.java       |  18 ++-
 .../dao/mapper/UdfFuncMapperTest.java              |  65 +++++----
 dolphinscheduler-dist/release-docs/LICENSE         |  84 +++++------
 dolphinscheduler-dist/release-docs/NOTICE          |  12 +-
 .../licenses/LICENSE-hadoop-annotations.txt        |   2 +-
 .../release-docs/licenses/LICENSE-hadoop-auth.txt  |   2 +-
 .../licenses/LICENSE-hadoop-client.txt             |   2 +-
 .../licenses/LICENSE-hadoop-common.txt             |   2 +-
 .../release-docs/licenses/LICENSE-hadoop-hdfs.txt  |   2 +-
 .../LICENSE-hadoop-mapreduce-client-app.txt        |   2 +-
 .../LICENSE-hadoop-mapreduce-client-common.txt     |   2 +-
 .../LICENSE-hadoop-mapreduce-client-core.txt       |   2 +-
 .../LICENSE-hadoop-mapreduce-client-jobclient.txt  |   2 +-
 .../licenses/LICENSE-hadoop-yarn-api.txt           |   2 +-
 .../licenses/LICENSE-hadoop-yarn-client.txt        |   2 +-
 .../licenses/LICENSE-hadoop-yarn-common.txt        |   2 +-
 .../licenses/LICENSE-hadoop-yarn-server-common.txt |   2 +-
 .../server/log/MasterLogFilterTest.java            |  76 +---------
 .../server/log/TaskLogFilterTest.java              |  61 +-------
 .../server/log/WorkerLogFilterTest.java            |  65 +--------
 .../master/runner/WorkflowExecuteRunnable.java     |  48 +++---
 .../dolphinscheduler-registry-zookeeper/pom.xml    |  16 +-
 .../server/log/TaskLogDiscriminatorTest.java       |  60 +-------
 .../service/process/ProcessServiceTest.java        |  93 +++++++-----
 .../plugin/task/api/model/ResourceInfo.java        |   3 +-
 .../plugin/task/api/parameters/SqlParameters.java  |  17 ++-
 .../api/parameters/resource/UdfFuncParameters.java |   6 +-
 .../task/api/parameters/SqlParametersTest.java     |   4 +-
 .../task/dq/rule/entity/DqRuleExecuteSql.java      |   5 +-
 .../task/dq/rule/entity/DqRuleInputEntry.java      |   5 +-
 pom.xml                                            |   1 +
 tools/dependencies/known-dependencies.txt          |  95 ++++++------
 97 files changed, 1138 insertions(+), 1154 deletions(-)

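For context on how a dependency bump like this is typically checked
before and after editing dolphinscheduler-bom/pom.xml, a minimal sketch
using standard Maven commands (the org.apache.hive:hive-jdbc coordinate
is purely illustrative, not necessarily one of the dependencies touched
by this commit):

    # see which modules resolve a given artifact and at what version
    mvn dependency:tree -Dincludes=org.apache.hive:hive-jdbc

    # after changing the pinned version in dolphinscheduler-bom/pom.xml,
    # reinstall the BOM locally and re-check the resolved tree
    mvn -pl dolphinscheduler-bom clean install -DskipTests
    mvn dependency:tree -Dincludes=org.apache.hive:hive-jdbc

The commit also touches tools/dependencies/known-dependencies.txt, which
appears to track the expected resolved dependency list, so a version bump
generally needs a matching entry update there.
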
diff --git a/docs/docs/en/faq.md b/docs/docs/en/faq.md
index 23fd8526e6..339d516b38 100644
--- a/docs/docs/en/faq.md
+++ b/docs/docs/en/faq.md
@@ -280,7 +280,7 @@ A : Will hive pom
 <dependency>
     <groupId>org.apache.hive</groupId>
     <artifactId>hive-jdbc</artifactId>
-    <version>2.1.0</version>
+    <version>2.3.3</version>
 </dependency>
 ```
 
diff --git a/docs/docs/zh/faq.md b/docs/docs/zh/faq.md
index 0895fb4303..5daa9a8e3b 100644
--- a/docs/docs/zh/faq.md
+++ b/docs/docs/zh/faq.md
@@ -1,4 +1,5 @@
 <!-- markdown-link-check-disable -->
+
 ## Q:项目的名称是?
 
 A:DolphinScheduler
@@ -9,19 +10,18 @@ A:DolphinScheduler
 
 A:DolphinScheduler 由 5 个服务组成,MasterServer、WorkerServer、ApiServer、AlertServer、LoggerServer 和 UI。
 
-| 服务                      | 说明                                                         |
-| ------------------------- | ------------------------------------------------------------ |
-| MasterServer              | 主要负责 **DAG** 的切分和任务状态的监控                      |
+|            服务             |                              说明                               |
+|---------------------------|---------------------------------------------------------------|
+| MasterServer              | 主要负责 **DAG** 的切分和任务状态的监控                                      |
 | WorkerServer/LoggerServer | 主要负责任务的提交、执行和任务状态的更新。LoggerServer 用于 Rest Api 通过 **RPC** 查看日志 |
-| ApiServer                 | 提供 Rest Api 服务,供 UI 进行调用                            |
-| AlertServer               | 提供告警服务                                                 |
-| UI                        | 前端页面展示                                                 |
+| ApiServer                 | 提供 Rest Api 服务,供 UI 进行调用                                      |
+| AlertServer               | 提供告警服务                                                        |
+| UI                        | 前端页面展示                                                        |
 
 注意:**由于服务比较多,建议单机部署最好是 4 核 16G 以上**
 
 ---
 
-
 ## Q:系统支持哪些邮箱?
 
 A:支持绝大多数邮箱,qq、163、126、139、outlook、aliyun 等皆支持。支持 **TLS 和 SSL** 协议,可以在dolphinscheduler的ui中进行配置:
@@ -177,21 +177,16 @@ A : 1.0.3 版本只实现了 Master 启动流程容错,不走 Worker 容错
 
 A : 设置定时的时候需要注意,如果第一位(* * * * * ? *)设置成 \* ,则表示每秒执行。**我们将会在 1.1.0 版本中加入显示最近调度的时间列表** ,使用 http://cron.qqe2.com/  可以在线看近 5 次运行时间
 
-
-
 ## Q:定时有有效时间范围吗
 
 A:有的,**如果定时的起止时间是同一个时间,那么此定时将是无效的定时**。**如果起止时间的结束时间比当前的时间小,很有可能定时会被自动删除**
 
-
-
 ## Q:任务依赖有几种实现
 
 A:  1,**DAG** 之间的任务依赖关系,是从 **入度为零** 进行 DAG 切分的
 
 ​	 2,有 **任务依赖节点** ,可以实现跨流程的任务或者流程依赖,具体请参考 依赖(DEPENDENT)节点:https://analysys.github.io/easyscheduler_docs_cn/%E7%B3%BB%E7%BB%9F%E4%BD%BF%E7%94%A8%E6%89%8B%E5%86%8C.html#%E4%BB%BB%E5%8A%A1%E8%8A%82%E7%82%B9%E7%B1%BB%E5%9E%8B%E5%92%8C%E5%8F%82%E6%95%B0%E8%AE%BE%E7%BD%AE
 
-
 ## Q:流程定义有几种启动方式
 
 A: 1,在 **流程定义列表**,点击 **启动** 按钮
@@ -216,13 +211,10 @@ export PYTHON_HOME=/bin/python
 export PATH=$HADOOP_HOME/bin:$SPARK_HOME1/bin:$SPARK_HOME2/bin:$PYTHON_HOME:$JAVA_HOME/bin:$HIVE_HOME/bin:$PATH
 ```
 
-
 ## Q:Worker Task 通过 sudo -u 租户 sh xxx.command 会产生子进程,在 kill 的时候,是否会杀掉
 
 A: 我们会在 1.0.4 中增加 kill 任务同时,kill 掉任务产生的各种所有子进程
 
-
-
 ## Q:DolphinScheduler 中的队列怎么用,用户队列和租户队列是什么意思
 
 A : DolphinScheduler 中的队列可以在用户或者租户上指定队列,**用户指定的队列优先级是高于租户队列的优先级的。**,例如:对 MR 任务指定队列,是通过 mapreduce.job.queuename 来指定队列的。
@@ -230,44 +222,36 @@ A : DolphinScheduler 中的队列可以在用户或者租户上指定队列,
 注意:MR 在用以上方法指定队列的时候,传递参数请使用如下方式:
 
 ```
-	Configuration conf = new Configuration();
-        GenericOptionsParser optionParser = new GenericOptionsParser(conf, args);
-        String[] remainingArgs = optionParser.getRemainingArgs();
+Configuration conf = new Configuration();
+   GenericOptionsParser optionParser = new GenericOptionsParser(conf, args);
+   String[] remainingArgs = optionParser.getRemainingArgs();
 ```
 
-
 如果是 Spark 任务 --queue 方式指定队列
 
-
-
 ## Q:Master 或者 Worker 报如下告警
 
 <p align="center">
    <img src="https://analysys.github.io/easyscheduler_docs_cn/images/master_worker_lack_res.png" width="60%" />
  </p>
 
-
 A : 修改 conf 下的 master.properties **master.reserved.memory** 的值为更小的值,比如说 0.1 或者
 
 worker.properties **worker.reserved.memory** 的值为更小的值,比如说 0.1
 
-
-
 ## Q:hive 版本是 1.1.0+cdh5.15.0,SQL hive 任务连接报错
 
 <p align="center">
    <img src="https://analysys.github.io/easyscheduler_docs_cn/images/cdh_hive_error.png" width="60%" />
  </p>
 
-
-
 A: 将 hive pom
 
 ```
 <dependency>
     <groupId>org.apache.hive</groupId>
     <artifactId>hive-jdbc</artifactId>
-    <version>2.1.0</version>
+    <version>2.3.3</version>
 </dependency>
 ```
 
@@ -284,6 +268,7 @@ A: 将 hive pom
 ---
 
 ## Q:如何增加一台工作服务器
+
 A: 1,参考官网[部署文档](https://dolphinscheduler.apache.org/zh-cn/docs/laster/user_doc/installation/cluster.html) 1.3 小节,创建部署用户和 hosts 映射
 
 ​	2,参考官网[部署文档](https://dolphinscheduler.apache.org/zh-cn/docs/laster/user_doc/installation/cluster.html) 1.4 小节,配置 hosts 映射和 ssh 打通及修改目录权限.
@@ -292,8 +277,9 @@ A: 1,参考官网[部署文档](https://dolphinscheduler.apache.org/zh-cn/do
 ​	3,复制正在运行的服务器上的部署目录到新机器的同样的部署目录下
 
 ​	4,到 bin 下,启动 worker server
+
 ```
-        ./dolphinscheduler-daemon.sh start worker-server
+./dolphinscheduler-daemon.sh start worker-server
 ```
 
 ---
@@ -301,22 +287,25 @@ A: 1,参考官网[部署文档](https://dolphinscheduler.apache.org/zh-cn/do
 ## Q:DolphinScheduler 什么时候发布新版本,同时新旧版本区别,以及如何升级,版本号规范
 A:1,Apache 项目的发版流程是通过邮件列表完成的。 你可以订阅 DolphinScheduler 的邮件列表,订阅之后如果有发版,你就可以收到邮件。请参照这篇[指引](https://github.com/apache/dolphinscheduler#get-help)来订阅 DolphinScheduler 的邮件列表。
 
-   2,当项目发版的时候,会有发版说明告知具体的变更内容,同时也会有从旧版本升级到新版本的升级文档。
+2,当项目发版的时候,会有发版说明告知具体的变更内容,同时也会有从旧版本升级到新版本的升级文档。
 
-   3,版本号为 x.y.z, 当 x 增加时代表全新架构的版本。当 y 增加时代表与 y 版本之前的不兼容需要升级脚本或其他人工处理才能升级。当 z 增加代表是 bug 修复,升级完全兼容。无需额外处理。之前有个问题 1.0.2 的升级不兼容 1.0.1 需要升级脚本。
+3,版本号为 x.y.z, 当 x 增加时代表全新架构的版本。当 y 增加时代表与 y 版本之前的不兼容需要升级脚本或其他人工处理才能升级。当 z 增加代表是 bug 修复,升级完全兼容。无需额外处理。之前有个问题 1.0.2 的升级不兼容 1.0.1 需要升级脚本。
 
 ---
 
 ## Q:后续任务在前置任务失败情况下仍旧可以执行
+
 A:在启动工作流的时候,你可以设置失败策略:继续还是失败。
 ![设置任务失败策略](https://user-images.githubusercontent.com/15833811/80368215-ee378080-88be-11ea-9074-01a33d012b23.png)
 
 ---
 
 ## Q:工作流模板 DAG、工作流实例、工作任务及实例之间是什么关系 工作流模板 DAG、工作流实例、工作任务及实例之间是什么关系,一个 dag 支持最大并发 100,是指产生 100 个工作流实例并发运行吗?一个 dag 中的任务节点,也有并发数的配置,是指任务也可以并发多个线程运行吗?最大数 100 吗?
+
 A:
 
 1.2.1 version
+
 ```
    master.properties
    设置 master 节点并发执行的最大工作流数
@@ -334,6 +323,7 @@ A:
 ---
 
 ## Q:工作组管理页面没有展示按钮
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/39816903/81903776-d8cb9180-95f4-11ea-98cb-94ca1e6a1db5.png" width="60%" />
 </p>
@@ -342,11 +332,13 @@ A:1.3.0 版本,为了支持 k8s,worker ip 一直变动,因此我们不
 ---
 
 ## Q:为什么不把 mysql 的 jdbc 连接包添加到 docker 镜像里面
+
 A:Mysql jdbc 连接包的许可证和 apache v2 的许可证不兼容,因此它不能被加入到 docker 镜像里面。
 
 ---
 
 ## Q:当一个任务提交多个 yarn 程序的时候经常失败
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/16174111/81312485-476e9380-90b9-11ea-9aad-ed009db899b1.png" width="60%" />
 </p>
@@ -355,32 +347,35 @@ A:这个 Bug 在 dev 分支已修复,并加入到需求/待做列表。
 ---
 
 ## Q:Master 服务和 Worker 服务在运行几天之后停止了
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/18378986/81293969-c3101680-90a0-11ea-87e5-ac9f0dd53f5e.png" width="60%" />
 </p>
 A:会话超时时间太短了,只有 0.3 秒,修改 zookeeper.properties 的配置项:
 
 ```
-   zookeeper.session.timeout=60000
-   zookeeper.connection.timeout=30000
+zookeeper.session.timeout=60000
+zookeeper.connection.timeout=30000
 ```
 
 ---
 
 ## Q:使用 docker-compose 默认配置启动,显示 zookeeper 错误
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/42579056/80374318-13c98780-88c9-11ea-8d5f-53448b957f02.png" width="60%" />
  </p>
 A:这个问题在 dev-1.3.0 版本解决了。这个 [pr](https://github.com/apache/dolphinscheduler/pull/2595) 已经解决了这个 bug,主要的改动点:
 
 ```
-    在docker-compose.yml文件中增加zookeeper的环境变量ZOO_4LW_COMMANDS_WHITELIST。
-    把minLatency,avgLatency and maxLatency的类型从int改成float。
+在docker-compose.yml文件中增加zookeeper的环境变量ZOO_4LW_COMMANDS_WHITELIST。
+把minLatency,avgLatency and maxLatency的类型从int改成float。
 ```
 
 ---
 
 ## Q:界面上显示任务一直运行,结束不了,从日志上看任务实例为空
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/51871547/80302626-b1478d00-87dd-11ea-97d4-08aa2244a6d0.jpg" width="60%" />
  </p>
@@ -399,49 +394,61 @@ A:这个 [bug](https://github.com/apache/dolphinscheduler/issues/1477)  描述
 ---
 
 ## Q:zk 中注册的 master 信息 ip 地址是 127.0.0.1,而不是配置的域名所对应或者解析的 ip 地址,可能导致不能查看任务日志
+
 A:修复 bug:
+
 ```
-   1、confirm hostname
-   $hostname
-   hadoop1
-   2、hostname -i
-   127.0.0.1 10.3.57.15
-   3、edit /etc/hosts,delete hadoop1 from 127.0.0.1 record
-   $cat /etc/hosts
-   127.0.0.1 localhost
-   10.3.57.15 ds1 hadoop1
-   4、hostname -i
-   10.3.57.15
+1、confirm hostname
+$hostname
+hadoop1
+2、hostname -i
+127.0.0.1 10.3.57.15
+3、edit /etc/hosts,delete hadoop1 from 127.0.0.1 record
+$cat /etc/hosts
+127.0.0.1 localhost
+10.3.57.15 ds1 hadoop1
+4、hostname -i
+10.3.57.15
 ```
-   hostname 命令返回服务器主机名,hostname -i 返回的是服务器主机名在 /etc/hosts 中所有匹配的ip地址。所以我把 /etc/hosts 中 127.0.0.1 中的主机名删掉,只保留内网 ip 的解析就可以了,没必要把 127.0.0.1 整条注释掉, 只要 hostname 命令返回值在 /etc/hosts 中对应的内网 ip 正确就可以,ds 程序取了第一个值,我理解上 ds 程序不应该用 hostname -i 取值这样有点问题,因为好多公司服务器的主机名都是运维配置的,感觉还是直接取配置文件的域名解析的返回 ip 更准确,或者 znode 中存域名信息而不是 /etc/hosts。
+
+hostname 命令返回服务器主机名,hostname -i 返回的是服务器主机名在 /etc/hosts 中所有匹配的ip地址。所以我把 /etc/hosts 中 127.0.0.1 中的主机名删掉,只保留内网 ip 的解析就可以了,没必要把 127.0.0.1 整条注释掉, 只要 hostname 命令返回值在 /etc/hosts 中对应的内网 ip 正确就可以,ds 程序取了第一个值,我理解上 ds 程序不应该用 hostname -i 取值这样有点问题,因为好多公司服务器的主机名都是运维配置的,感觉还是直接取配置文件的域名解析的返回 ip 更准确,或者 znode 中存域名信息而不是 /etc/hosts。
 
 ---
 
 ## Q:调度系统设置了一个秒级的任务,导致系统挂掉
+
 A:调度系统不支持秒级任务。
 
 ---
 
 ## Q:编译前后端代码 (dolphinscheduler-ui) 报错不能下载"https://github.com/sass/node-sass/releases/download/v4.13.1/darwin-x64-72_binding.node"
 A:1,cd dolphinscheduler-ui 然后删除 node_modules 目录
+
 ```
 sudo rm -rf node_modules
 ```
-   ​	2,通过 npm.taobao.org 下载 node-sass
- ```
- sudo npm uninstall node-sass
- sudo npm i node-sass --sass_binary_site=https://npm.taobao.org/mirrors/node-sass/
- ```
-   3,如果步骤 2 报错,请重新构建 node-saas [参考链接](https://dolphinscheduler.apache.org/en-us/development/frontend-development.html)
+
+​	2,通过 npm.taobao.org 下载 node-sass
+
 ```
- sudo npm rebuild node-sass
+sudo npm uninstall node-sass
+sudo npm i node-sass --sass_binary_site=https://npm.taobao.org/mirrors/node-sass/
 ```
+
+3,如果步骤 2 报错,请重新构建 node-saas [参考链接](https://dolphinscheduler.apache.org/en-us/development/frontend-development.html)
+
+```
+sudo npm rebuild node-sass
+```
+
 当问题解决之后,如果你不想每次编译都下载这个 node,你可以设置系统环境变量:SASS_BINARY_PATH= /xxx/xxx/xxx/xxx.node。
 
 ---
 
 ## Q:当使用 mysql 作为 ds 数据库需要如何配置
+
 A:1,修改项目根目录 maven 配置文件,移除 scope 的 test 属性,这样 mysql 的包就可以在其它阶段被加载
+
 ```
 <dependency>
 	<groupId>mysql</groupId>
@@ -450,31 +457,36 @@ A:1,修改项目根目录 maven 配置文件,移除 scope 的 test 属性
 	<scope>test<scope>
 </dependency>
 ```
-   ​	2,修改 application-dao.properties 和 quzrtz.properties 来使用 mysql 驱动
-   默认驱动是 postgres 主要由于许可证原因。
+
+​	2,修改 application-dao.properties 和 quzrtz.properties 来使用 mysql 驱动
+默认驱动是 postgres 主要由于许可证原因。
 
 ---
 
 ## Q:shell 任务是如何运行的
+
 A:1,被执行的服务器在哪里配置,以及实际执行的服务器是哪台? 要指定在某个 worker 上去执行,可以在 worker 分组中配置,固定 IP,这样就可以把路径写死。如果配置的 worker 分组有多个 worker,实际执行的服务器由调度决定的,具有随机性。
 
-   ​	2,如果是服务器上某个路径的一个 shell 文件,怎么指向这个路径?服务器上某个路径下的 shell 文件,涉及到权限问题,不建议这么做。建议你可以使用资源中心的存储功能,然后在 shell 编辑器里面使用资源引用就可以,系统会帮助你把脚本下载到执行目录下。如果以 hdfs 作为资源中心,在执行的时候,调度器会把依赖的 jar 包,文件等资源拉到 worker 的执行目录上,我这边是 /tmp/escheduler/exec/process,该配置可以在 install.sh 中进行指定。
+​	2,如果是服务器上某个路径的一个 shell 文件,怎么指向这个路径?服务器上某个路径下的 shell 文件,涉及到权限问题,不建议这么做。建议你可以使用资源中心的存储功能,然后在 shell 编辑器里面使用资源引用就可以,系统会帮助你把脚本下载到执行目录下。如果以 hdfs 作为资源中心,在执行的时候,调度器会把依赖的 jar 包,文件等资源拉到 worker 的执行目录上,我这边是 /tmp/escheduler/exec/process,该配置可以在 install.sh 中进行指定。
 
-   3,以哪个用户来执行任务?执行任务的时候,调度器会采用 sudo -u 租户的方式去执行,租户是一个 linux 用户。
+3,以哪个用户来执行任务?执行任务的时候,调度器会采用 sudo -u 租户的方式去执行,租户是一个 linux 用户。
 
 ---
 
 ## Q:生产环境部署方式有推荐的最佳实践吗
+
 A:1,如果没有很多任务要运行,出于稳定性考虑我们建议使用 3 个节点,并且最好把 Master/Worder 服务部署在不同的节点。如果你只有一个节点,当然只能把所有的服务部署在同一个节点!通常来说,需要多少节点取决于你的业务,海豚调度系统本身不需要很多的资源。充分测试之后,你们将找到使用较少节点的合适的部署方式。
 
 ---
 
 ## Q:DEPENDENT 节点
+
 A:1,DEPENDENT 节点实际是没有执行体的,是专门用来配置数据周期依赖逻辑,然后再把执行节点挂载后面,来实现任务间的周期依赖。
 
 ---
 
 ## Q:如何改变 Master 服务的启动端口
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/8263441/62352160-0f3e9100-b53a-11e9-95ba-3ae3dde49c72.png" width="60%" />
  </p>
@@ -483,16 +495,23 @@ A:1,修改 application_master.properties 配置文件,例如:server.port
 ---
 
 ## Q:调度任务不能上线
+
 A:1,我们可以成功创建调度任务,并且表 t_scheduler_schedules 中也成功加入了一条记录,但当我点击上线后,前端页面无反应且会把 t_scheduler_schedules 这张表锁定,我测试过将 t_scheduler_schedules 中的 RELEASE_state 字段手动更新为 1 这样前端会显示为上线状态。DS 版本 1.2+ 表名是 t_ds_schedules,其它版本表名是 t_scheduler_schedules。
 
 ---
 
 ## Q:请问 swagger ui 的地址是什么
+<<<<<<< HEAD
 A:1, 3.1.0+ 版本地址是 http://apiServerIp:apiServerPort/dolphinscheduler/swagger-ui/index.html, 1.2+ 版本地址是:http://apiServerIp:apiServerPort/dolphinscheduler/swagger-ui/index.html?language=zh_CN&lang=cn,其它版本是 http://apiServerIp:apiServerPort/escheduler/swagger-ui/index.html?language=zh_CN&lang=cn。
+=======
+
+A:1,1.2+ 版本地址是:http://apiServerIp:apiServerPort/dolphinscheduler/doc.html?language=zh_CN&lang=cn,其它版本是 http://apiServerIp:apiServerPort/escheduler/doc.html?language=zh_CN&lang=cn。
+>>>>>>> 85200811c (Bump up dependencies to fix cves)
 
 ---
 
 ## Q:前端安装包缺少文件
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/41460919/61437083-d960b080-a96e-11e9-87f1-297ba3aca5e3.png" width="60%" />
  </p>
@@ -504,33 +523,45 @@ A: 1,用户修改了 api server 配置文件中的![apiServerContextPath](ht
 ---
 
 ## Q:上传比较大的文件卡住
+
 <p align="center">
    <img src="https://user-images.githubusercontent.com/21357069/58231400-805b0e80-7d69-11e9-8107-7f37b06a95df.png" width="60%" />
  </p>
 A:1,编辑 ngnix 配置文件 vi /etc/nginx/nginx.conf,更改上传大小 client_max_body_size 1024m。
 
+<<<<<<< HEAD
    ​	2,更新 google chrome 版本到最新版本。
+=======
+​	2,更新 google chrome 版本到最新版本。
+>>>>>>> 85200811c (Bump up dependencies to fix cves)
 
 ---
 
 ## Q:创建 spark 数据源,点击“测试连接”,系统回退回到登入页面
+
 A:1,edit /etc/nginx/conf.d/escheduler.conf
+
 ```
-     proxy_connect_timeout 300s;
-     proxy_read_timeout 300s;
-     proxy_send_timeout 300s;
+proxy_connect_timeout 300s;
+proxy_read_timeout 300s;
+proxy_send_timeout 300s;
 ```
 
 ---
 
 ## Q:工作流依赖
+
 A:1,目前是按照自然天来判断,上月末:判断时间是工作流 A start_time/scheduler_time between '2019-05-31 00:00:00' and '2019-05-31 23:59:59'。上月:是判断上个月从 1 号到月末每天都要有完成的A实例。上周: 上周 7 天都要有完成的 A 实例。前两天: 判断昨天和前天,两天都要有完成的 A 实例。
 
 ---
 
 ## Q:DS 后端接口文档
+<<<<<<< HEAD
 A:1,http://106.75.43.194:8888/dolphinscheduler/swagger-ui/index.html?language=zh_CN&lang=zh。
+=======
+>>>>>>> 85200811c (Bump up dependencies to fix cves)
 
+A:1,http://106.75.43.194:8888/dolphinscheduler/doc.html?language=zh_CN&lang=zh。
 
 ## dolphinscheduler 在运行过程中,ip 地址获取错误的问题
 
@@ -582,13 +613,22 @@ sed -i 's/Defaults    requirett/#Defaults    requirett/g' /etc/sudoers
 ---
 
 ## Q:Yarn多集群支持
+
 A:将Worker节点分别部署至多个Yarn集群,步骤如下(例如AWS EMR):
 
+<<<<<<< HEAD
    1. 将 Worker 节点部署至 EMR 集群的 Master 节点
 
    2. 将 `conf/common.properties` 中的 `yarn.application.status.address` 修改为当前集群的 Yarn 的信息
 
    3. 通过 `bin/dolphinscheduler-daemon.sh start worker-server` 启动 worker-server
+=======
+1. 将 Worker 节点部署至 EMR 集群的 Master 节点
+
+2. 将 `conf/common.properties` 中的 `yarn.application.status.address` 修改为当前集群的 Yarn 的信息
+
+3. 通过 `bin/dolphinscheduler-daemon.sh start worker-server` 启动 worker-server
+>>>>>>> 85200811c (Bump up dependencies to fix cves)
 
 ---
 
@@ -683,6 +723,7 @@ DELETE FROM t_ds_task_definition_log WHERE id IN
 ## Q:使用Postgresql数据库从2.0.1升级至2.0.5更新失败
 
 A:在数据库中执行以下SQL即可完成修复:
+
 ```SQL
 update t_ds_version set version='2.0.1';
 ```
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/LoginController.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/LoginController.java
index e4a9b9e625..097493a4f5 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/LoginController.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/LoginController.java
@@ -32,8 +32,8 @@ import org.apache.dolphinscheduler.dao.entity.User;
 
 import springfox.documentation.annotations.ApiIgnore;
 
-import org.apache.commons.httpclient.HttpStatus;
 import org.apache.commons.lang3.StringUtils;
+import org.apache.http.HttpStatus;
 
 import java.util.Map;
 
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/ClusterDto.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/ClusterDto.java
index 0dd0dcd23f..f818c70ca2 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/ClusterDto.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/ClusterDto.java
@@ -55,7 +55,7 @@ public class ClusterDto {
 
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/EnvironmentDto.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/EnvironmentDto.java
index a89d34fe4a..99157a814a 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/EnvironmentDto.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/EnvironmentDto.java
@@ -55,7 +55,7 @@ public class EnvironmentDto {
 
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/resources/ResourceComponent.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/resources/ResourceComponent.java
index c66172c2ec..e2ba8c0724 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/resources/ResourceComponent.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/resources/ResourceComponent.java
@@ -27,8 +27,9 @@ import com.fasterxml.jackson.annotation.JsonPropertyOrder;
 /**
  * resource component
  */
-@JsonPropertyOrder({"id","pid","name","fullName","description","isDirctory","children","type"})
+@JsonPropertyOrder({"id", "pid", "name", "fullName", "description", "isDirctory", "children", "type"})
 public abstract class ResourceComponent {
+
     public ResourceComponent() {
     }
 
@@ -39,11 +40,10 @@ public abstract class ResourceComponent {
         this.fullName = fullName;
         this.description = description;
         this.isDirctory = isDirctory;
-        int directoryFlag = isDirctory ? 1:0;
-        this.idValue = String.format("%s_%s",id,directoryFlag);
+        int directoryFlag = isDirctory ? 1 : 0;
+        this.idValue = String.format("%s_%s", id, directoryFlag);
     }
 
-
     /**
      * id
      */
@@ -89,19 +89,19 @@ public abstract class ResourceComponent {
      * add resource component
      * @param resourceComponent resource component
      */
-    public void add(ResourceComponent resourceComponent){
+    public void add(ResourceComponent resourceComponent) {
         children.add(resourceComponent);
     }
 
-    public String getName(){
+    public String getName() {
         return this.name;
     }
 
-    public String getDescription(){
+    public String getDescription() {
         return this.description;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -145,9 +145,9 @@ public abstract class ResourceComponent {
         return idValue;
     }
 
-    public void setIdValue(int id,boolean isDirctory) {
-        int directoryFlag = isDirctory ? 1:0;
-        this.idValue = String.format("%s_%s",id,directoryFlag);
+    public void setIdValue(int id, boolean isDirctory) {
+        int directoryFlag = isDirctory ? 1 : 0;
+        this.idValue = String.format("%s_%s", id, directoryFlag);
     }
 
     public ResourceType getType() {
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/treeview/Instance.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/treeview/Instance.java
index d714fd291e..fb1b52c6d8 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/treeview/Instance.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/dto/treeview/Instance.java
@@ -56,7 +56,6 @@ public class Instance {
      */
     private Date endTime;
 
-
     /**
      * node running on which host
      */
@@ -79,7 +78,8 @@ public class Instance {
         this.type = type;
     }
 
-    public Instance(int id, String name, long code, String type, String state, Date startTime, Date endTime, String host, String duration, long subflowCode) {
+    public Instance(int id, String name, long code, String type, String state, Date startTime, Date endTime,
+                    String host, String duration, long subflowCode) {
         this.id = id;
         this.name = name;
         this.code = code;
@@ -92,11 +92,12 @@ public class Instance {
         this.subflowCode = subflowCode;
     }
 
-    public Instance(int id, String name, long code, String type, String state, Date startTime, Date endTime, String host, String duration) {
+    public Instance(int id, String name, long code, String type, String state, Date startTime, Date endTime,
+                    String host, String duration) {
         this(id, name, code, type, state, startTime, endTime, host, duration, 0);
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/interceptor/LoginHandlerInterceptor.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/interceptor/LoginHandlerInterceptor.java
index aa14d07519..ce0db99473 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/interceptor/LoginHandlerInterceptor.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/interceptor/LoginHandlerInterceptor.java
@@ -25,8 +25,10 @@ import org.apache.dolphinscheduler.common.thread.ThreadLocalContext;
 import org.apache.dolphinscheduler.dao.entity.User;
 import org.apache.dolphinscheduler.dao.mapper.UserMapper;
 
-import org.apache.commons.httpclient.HttpStatus;
 import org.apache.commons.lang3.StringUtils;
+import org.apache.http.HttpStatus;
+
+import java.util.Date;
 
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletResponse;
@@ -37,12 +39,11 @@ import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.web.servlet.HandlerInterceptor;
 import org.springframework.web.servlet.ModelAndView;
 
-import java.util.Date;
-
 /**
  * login interceptor, must log in first
  */
 public class LoginHandlerInterceptor implements HandlerInterceptor {
+
     private static final Logger logger = LoggerFactory.getLogger(LoginHandlerInterceptor.class);
 
     @Autowired
@@ -93,7 +94,8 @@ public class LoginHandlerInterceptor implements HandlerInterceptor {
     }
 
     @Override
-    public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler, ModelAndView modelAndView) throws Exception {
+    public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler,
+                           ModelAndView modelAndView) throws Exception {
         ThreadLocalContext.getTimezoneThreadLocal().remove();
     }
 }
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ClusterServiceImpl.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ClusterServiceImpl.java
index 0fb9a53b1c..3e2cd92cca 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ClusterServiceImpl.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/ClusterServiceImpl.java
@@ -248,8 +248,8 @@ public class ClusterServiceImpl extends BaseServiceImpl implements ClusterServic
             return result;
         }
 
-        Integer relatedNamespaceNumber = k8sNamespaceMapper
-            .selectCount(new QueryWrapper<K8sNamespace>().lambda().eq(K8sNamespace::getClusterCode, code));
+        Long relatedNamespaceNumber = k8sNamespaceMapper
+                .selectCount(new QueryWrapper<K8sNamespace>().lambda().eq(K8sNamespace::getClusterCode, code));
 
         if (relatedNamespaceNumber > 0) {
             putMsg(result, Status.DELETE_CLUSTER_RELATED_NAMESPACE_EXISTS);
@@ -265,7 +265,6 @@ public class ClusterServiceImpl extends BaseServiceImpl implements ClusterServic
         return result;
     }
 
-
     /**
      * update cluster
      *
@@ -283,7 +282,7 @@ public class ClusterServiceImpl extends BaseServiceImpl implements ClusterServic
             return result;
         }
 
-        if(checkDescriptionLength(desc)){
+        if (checkDescriptionLength(desc)) {
             putMsg(result, Status.DESCRIPTION_TOO_LONG_ERROR);
             return result;
         }
@@ -306,7 +305,7 @@ public class ClusterServiceImpl extends BaseServiceImpl implements ClusterServic
         }
 
         if (!Constants.K8S_LOCAL_TEST_CLUSTER_CODE.equals(clusterExist.getCode())
-            && !config.equals(ClusterConfUtils.getK8sConfig(clusterExist.getConfig()))) {
+                && !config.equals(ClusterConfUtils.getK8sConfig(clusterExist.getConfig()))) {
             try {
                 k8sManager.getAndUpdateK8sClient(code, true);
             } catch (RemotingException e) {
@@ -315,12 +314,12 @@ public class ClusterServiceImpl extends BaseServiceImpl implements ClusterServic
             }
         }
 
-        //update cluster
+        // update cluster
         clusterExist.setConfig(config);
         clusterExist.setName(name);
         clusterExist.setDescription(desc);
         clusterMapper.updateById(clusterExist);
-        //need not update relation
+        // need not update relation
 
         putMsg(result, Status.SUCCESS);
         return result;
@@ -366,4 +365,3 @@ public class ClusterServiceImpl extends BaseServiceImpl implements ClusterServic
     }
 
 }
-
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/EnvironmentServiceImpl.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/EnvironmentServiceImpl.java
index c3f2e970bf..8e64113718 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/EnvironmentServiceImpl.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/EnvironmentServiceImpl.java
@@ -17,6 +17,8 @@
 
 package org.apache.dolphinscheduler.api.service.impl;
 
+import static org.apache.dolphinscheduler.api.constants.ApiFuncIdentificationConstant.*;
+
 import org.apache.dolphinscheduler.api.dto.EnvironmentDto;
 import org.apache.dolphinscheduler.api.enums.Status;
 import org.apache.dolphinscheduler.api.service.EnvironmentService;
@@ -64,8 +66,6 @@ import com.baomidou.mybatisplus.core.metadata.IPage;
 import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
 import com.fasterxml.jackson.core.type.TypeReference;
 
-import static org.apache.dolphinscheduler.api.constants.ApiFuncIdentificationConstant.*;
-
 /**
  * task definition service impl
  */
@@ -94,17 +94,18 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
      */
     @Override
     @Transactional
-    public Map<String, Object> createEnvironment(User loginUser, String name, String config, String desc, String workerGroups) {
+    public Map<String, Object> createEnvironment(User loginUser, String name, String config, String desc,
+                                                 String workerGroups) {
         Map<String, Object> result = new HashMap<>();
         if (!canOperatorPermissions(loginUser, null, AuthorizationType.ENVIRONMENT, ENVIRONMENT_CREATE)) {
             putMsg(result, Status.USER_NO_OPERATION_PERM);
             return result;
         }
-        if(checkDescriptionLength(desc)){
+        if (checkDescriptionLength(desc)) {
             putMsg(result, Status.DESCRIPTION_TOO_LONG_ERROR);
             return result;
         }
-        Map<String, Object> checkResult = checkParams(name,config,workerGroups);
+        Map<String, Object> checkResult = checkParams(name, config, workerGroups);
         if (checkResult.get(Constants.STATUS) != Status.SUCCESS) {
             return checkResult;
         }
@@ -136,7 +137,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
 
         if (environmentMapper.insert(env) > 0) {
             if (!StringUtils.isEmpty(workerGroups)) {
-                List<String> workerGroupList = JSONUtils.parseObject(workerGroups, new TypeReference<List<String>>(){});
+                List<String> workerGroupList = JSONUtils.parseObject(workerGroups, new TypeReference<List<String>>() {
+                });
                 if (CollectionUtils.isNotEmpty(workerGroupList)) {
                     workerGroupList.stream().forEach(workerGroup -> {
                         if (!StringUtils.isEmpty(workerGroup)) {
@@ -153,7 +155,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
             }
             result.put(Constants.DATA_LIST, env.getCode());
             putMsg(result, Status.SUCCESS);
-            permissionPostHandle(AuthorizationType.ENVIRONMENT, loginUser.getId(), Collections.singletonList(env.getId()), logger);
+            permissionPostHandle(AuthorizationType.ENVIRONMENT, loginUser.getId(),
+                    Collections.singletonList(env.getId()), logger);
         } else {
             putMsg(result, Status.CREATE_ENVIRONMENT_ERROR);
         }
@@ -178,7 +181,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
         if (loginUser.getUserType().equals(UserType.ADMIN_USER)) {
             environmentIPage = environmentMapper.queryEnvironmentListPaging(page, searchVal);
         } else {
-            Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.ENVIRONMENT, loginUser.getId(), logger);
+            Set<Integer> ids = resourcePermissionCheckService
+                    .userOwnedResourceIdsAcquisition(AuthorizationType.ENVIRONMENT, loginUser.getId(), logger);
             if (ids.isEmpty()) {
                 result.setData(pageInfo);
                 putMsg(result, Status.SUCCESS);
@@ -191,12 +195,13 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
 
         if (CollectionUtils.isNotEmpty(environmentIPage.getRecords())) {
             Map<Long, List<String>> relationMap = relationMapper.selectList(null).stream()
-                    .collect(Collectors.groupingBy(EnvironmentWorkerGroupRelation::getEnvironmentCode,Collectors.mapping(EnvironmentWorkerGroupRelation::getWorkerGroup,Collectors.toList())));
+                    .collect(Collectors.groupingBy(EnvironmentWorkerGroupRelation::getEnvironmentCode,
+                            Collectors.mapping(EnvironmentWorkerGroupRelation::getWorkerGroup, Collectors.toList())));
 
             List<EnvironmentDto> dtoList = environmentIPage.getRecords().stream().map(environment -> {
                 EnvironmentDto dto = new EnvironmentDto();
-                BeanUtils.copyProperties(environment,dto);
-                List<String> workerGroups = relationMap.getOrDefault(environment.getCode(),new ArrayList<String>());
+                BeanUtils.copyProperties(environment, dto);
+                List<String> workerGroups = relationMap.getOrDefault(environment.getCode(), new ArrayList<String>());
                 dto.setWorkerGroups(workerGroups);
                 return dto;
             }).collect(Collectors.toList());
@@ -219,31 +224,33 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
      */
     @Override
     public Map<String, Object> queryAllEnvironmentList(User loginUser) {
-        Map<String,Object> result = new HashMap<>();
-        Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.ENVIRONMENT, loginUser.getId(), logger);
+        Map<String, Object> result = new HashMap<>();
+        Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.ENVIRONMENT,
+                loginUser.getId(), logger);
         if (ids.isEmpty()) {
             result.put(Constants.DATA_LIST, Collections.emptyList());
-            putMsg(result,Status.SUCCESS);
+            putMsg(result, Status.SUCCESS);
             return result;
         }
         List<Environment> environmentList = environmentMapper.selectBatchIds(ids);
         if (CollectionUtils.isNotEmpty(environmentList)) {
             Map<Long, List<String>> relationMap = relationMapper.selectList(null).stream()
-                    .collect(Collectors.groupingBy(EnvironmentWorkerGroupRelation::getEnvironmentCode,Collectors.mapping(EnvironmentWorkerGroupRelation::getWorkerGroup,Collectors.toList())));
+                    .collect(Collectors.groupingBy(EnvironmentWorkerGroupRelation::getEnvironmentCode,
+                            Collectors.mapping(EnvironmentWorkerGroupRelation::getWorkerGroup, Collectors.toList())));
 
             List<EnvironmentDto> dtoList = environmentList.stream().map(environment -> {
                 EnvironmentDto dto = new EnvironmentDto();
-                BeanUtils.copyProperties(environment,dto);
-                List<String> workerGroups = relationMap.getOrDefault(environment.getCode(),new ArrayList<String>());
+                BeanUtils.copyProperties(environment, dto);
+                List<String> workerGroups = relationMap.getOrDefault(environment.getCode(), new ArrayList<String>());
                 dto.setWorkerGroups(workerGroups);
                 return dto;
             }).collect(Collectors.toList());
-            result.put(Constants.DATA_LIST,dtoList);
+            result.put(Constants.DATA_LIST, dtoList);
         } else {
             result.put(Constants.DATA_LIST, new ArrayList<>());
         }
 
-        putMsg(result,Status.SUCCESS);
+        putMsg(result, Status.SUCCESS);
         return result;
     }
 
@@ -266,7 +273,7 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
                     .collect(Collectors.toList());
 
             EnvironmentDto dto = new EnvironmentDto();
-            BeanUtils.copyProperties(env,dto);
+            BeanUtils.copyProperties(env, dto);
             dto.setWorkerGroups(workerGroups);
             result.put(Constants.DATA_LIST, dto);
             putMsg(result, Status.SUCCESS);
@@ -292,7 +299,7 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
                     .collect(Collectors.toList());
 
             EnvironmentDto dto = new EnvironmentDto();
-            BeanUtils.copyProperties(env,dto);
+            BeanUtils.copyProperties(env, dto);
             dto.setWorkerGroups(workerGroups);
             result.put(Constants.DATA_LIST, dto);
             putMsg(result, Status.SUCCESS);
@@ -310,13 +317,13 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
     @Override
     public Map<String, Object> deleteEnvironmentByCode(User loginUser, Long code) {
         Map<String, Object> result = new HashMap<>();
-        if (!canOperatorPermissions(loginUser,null, AuthorizationType.ENVIRONMENT,ENVIRONMENT_DELETE)) {
+        if (!canOperatorPermissions(loginUser, null, AuthorizationType.ENVIRONMENT, ENVIRONMENT_DELETE)) {
             putMsg(result, Status.USER_NO_OPERATION_PERM);
             return result;
         }
 
-        Integer relatedTaskNumber = taskDefinitionMapper
-                .selectCount(new QueryWrapper<TaskDefinition>().lambda().eq(TaskDefinition::getEnvironmentCode,code));
+        Long relatedTaskNumber = taskDefinitionMapper
+                .selectCount(new QueryWrapper<TaskDefinition>().lambda().eq(TaskDefinition::getEnvironmentCode, code));
 
         if (relatedTaskNumber > 0) {
             putMsg(result, Status.DELETE_ENVIRONMENT_RELATED_TASK_EXISTS);
@@ -327,7 +334,7 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
         if (delete > 0) {
             relationMapper.delete(new QueryWrapper<EnvironmentWorkerGroupRelation>()
                     .lambda()
-                    .eq(EnvironmentWorkerGroupRelation::getEnvironmentCode,code));
+                    .eq(EnvironmentWorkerGroupRelation::getEnvironmentCode, code));
             putMsg(result, Status.SUCCESS);
         } else {
             putMsg(result, Status.DELETE_ENVIRONMENT_ERROR);
@@ -347,18 +354,19 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
      */
     @Transactional
     @Override
-    public Map<String, Object> updateEnvironmentByCode(User loginUser, Long code, String name, String config, String desc, String workerGroups) {
+    public Map<String, Object> updateEnvironmentByCode(User loginUser, Long code, String name, String config,
+                                                       String desc, String workerGroups) {
         Map<String, Object> result = new HashMap<>();
-        if (!canOperatorPermissions(loginUser,null, AuthorizationType.ENVIRONMENT,ENVIRONMENT_UPDATE)) {
+        if (!canOperatorPermissions(loginUser, null, AuthorizationType.ENVIRONMENT, ENVIRONMENT_UPDATE)) {
             putMsg(result, Status.USER_NO_OPERATION_PERM);
             return result;
         }
 
-        Map<String, Object> checkResult = checkParams(name,config,workerGroups);
+        Map<String, Object> checkResult = checkParams(name, config, workerGroups);
         if (checkResult.get(Constants.STATUS) != Status.SUCCESS) {
             return checkResult;
         }
-        if(checkDescriptionLength(desc)){
+        if (checkDescriptionLength(desc)) {
             putMsg(result, Status.DESCRIPTION_TOO_LONG_ERROR);
             return result;
         }
@@ -371,7 +379,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
 
         Set<String> workerGroupSet;
         if (!StringUtils.isEmpty(workerGroups)) {
-            workerGroupSet = JSONUtils.parseObject(workerGroups, new TypeReference<Set<String>>() {});
+            workerGroupSet = JSONUtils.parseObject(workerGroups, new TypeReference<Set<String>>() {
+            });
         } else {
             workerGroupSet = new TreeSet<>();
         }
@@ -382,8 +391,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
                 .map(item -> item.getWorkerGroup())
                 .collect(Collectors.toSet());
 
-        Set<String> deleteWorkerGroupSet = SetUtils.difference(existWorkerGroupSet,workerGroupSet).toSet();
-        Set<String> addWorkerGroupSet = SetUtils.difference(workerGroupSet,existWorkerGroupSet).toSet();
+        Set<String> deleteWorkerGroupSet = SetUtils.difference(existWorkerGroupSet, workerGroupSet).toSet();
+        Set<String> addWorkerGroupSet = SetUtils.difference(workerGroupSet, existWorkerGroupSet).toSet();
 
         // verify whether the relation of this environment and worker groups can be adjusted
         checkResult = checkUsedEnvironmentWorkerGroupRelation(deleteWorkerGroupSet, name, code);
@@ -399,7 +408,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
         env.setOperator(loginUser.getId());
         env.setUpdateTime(new Date());
 
-        int update = environmentMapper.update(env, new UpdateWrapper<Environment>().lambda().eq(Environment::getCode, code));
+        int update =
+                environmentMapper.update(env, new UpdateWrapper<Environment>().lambda().eq(Environment::getCode, code));
         if (update > 0) {
             deleteWorkerGroupSet.stream().forEach(key -> {
                 if (StringUtils.isNotEmpty(key)) {
@@ -427,8 +437,6 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
         return result;
     }
 
-
-
     /**
      * verify environment name
      *
@@ -454,17 +462,20 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
         return result;
     }
 
-    private Map<String, Object> checkUsedEnvironmentWorkerGroupRelation(Set<String> deleteKeySet,String environmentName, Long environmentCode) {
+    private Map<String, Object> checkUsedEnvironmentWorkerGroupRelation(Set<String> deleteKeySet,
+                                                                        String environmentName, Long environmentCode) {
         Map<String, Object> result = new HashMap<>();
         for (String workerGroup : deleteKeySet) {
             List<TaskDefinition> taskDefinitionList = taskDefinitionMapper
                     .selectList(new QueryWrapper<TaskDefinition>().lambda()
-                            .eq(TaskDefinition::getEnvironmentCode,environmentCode)
-                            .eq(TaskDefinition::getWorkerGroup,workerGroup));
+                            .eq(TaskDefinition::getEnvironmentCode, environmentCode)
+                            .eq(TaskDefinition::getWorkerGroup, workerGroup));
 
             if (Objects.nonNull(taskDefinitionList) && taskDefinitionList.size() != 0) {
-                Set<String> collect = taskDefinitionList.stream().map(TaskDefinition::getName).collect(Collectors.toSet());
-                putMsg(result, Status.UPDATE_ENVIRONMENT_WORKER_GROUP_RELATION_ERROR,workerGroup,environmentName, collect);
+                Set<String> collect =
+                        taskDefinitionList.stream().map(TaskDefinition::getName).collect(Collectors.toSet());
+                putMsg(result, Status.UPDATE_ENVIRONMENT_WORKER_GROUP_RELATION_ERROR, workerGroup, environmentName,
+                        collect);
                 return result;
             }
         }
@@ -483,7 +494,8 @@ public class EnvironmentServiceImpl extends BaseServiceImpl implements Environme
             return result;
         }
         if (!StringUtils.isEmpty(workerGroups)) {
-            List<String> workerGroupList = JSONUtils.parseObject(workerGroups, new TypeReference<List<String>>(){});
+            List<String> workerGroupList = JSONUtils.parseObject(workerGroups, new TypeReference<List<String>>() {
+            });
             if (Objects.isNull(workerGroupList)) {
                 putMsg(result, Status.ENVIRONMENT_WORKER_GROUPS_IS_INVALID);
                 return result;
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/TaskGroupServiceImpl.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/TaskGroupServiceImpl.java
index 90118baa84..7040706290 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/TaskGroupServiceImpl.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/TaskGroupServiceImpl.java
@@ -26,7 +26,6 @@ import org.apache.dolphinscheduler.api.utils.PageInfo;
 import org.apache.dolphinscheduler.common.Constants;
 import org.apache.dolphinscheduler.common.enums.AuthorizationType;
 import org.apache.dolphinscheduler.common.enums.Flag;
-import org.apache.dolphinscheduler.common.enums.UserType;
 import org.apache.dolphinscheduler.dao.entity.TaskGroup;
 import org.apache.dolphinscheduler.dao.entity.User;
 import org.apache.dolphinscheduler.dao.mapper.TaskGroupMapper;
@@ -82,14 +81,16 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
      */
     @Override
     @Transactional
-    public Map<String, Object> createTaskGroup(User loginUser, Long projectCode, String name, String description, int groupSize) {
+    public Map<String, Object> createTaskGroup(User loginUser, Long projectCode, String name, String description,
+                                               int groupSize) {
         Map<String, Object> result = new HashMap<>();
-        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP, ApiFuncIdentificationConstant.TASK_GROUP_CREATE);
-        if (!canOperatorPermissions){
+        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP,
+                ApiFuncIdentificationConstant.TASK_GROUP_CREATE);
+        if (!canOperatorPermissions) {
             putMsg(result, Status.NO_CURRENT_OPERATING_PERMISSION);
             return result;
         }
-        if(checkDescriptionLength(description)){
+        if (checkDescriptionLength(description)) {
             putMsg(result, Status.DESCRIPTION_TOO_LONG_ERROR);
             return result;
         }
@@ -112,7 +113,8 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
         taskGroup.setCreateTime(new Date());
         taskGroup.setUpdateTime(new Date());
         if (taskGroupMapper.insert(taskGroup) > 0) {
-            permissionPostHandle(AuthorizationType.TASK_GROUP, loginUser.getId(), Collections.singletonList(taskGroup.getId()),logger);
+            permissionPostHandle(AuthorizationType.TASK_GROUP, loginUser.getId(),
+                    Collections.singletonList(taskGroup.getId()), logger);
             putMsg(result, Status.SUCCESS);
         } else {
             putMsg(result, Status.CREATE_TASK_GROUP_ERROR);
@@ -134,12 +136,13 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
     @Override
     public Map<String, Object> updateTaskGroup(User loginUser, int id, String name, String description, int groupSize) {
         Map<String, Object> result = new HashMap<>();
-        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP, ApiFuncIdentificationConstant.TASK_GROUP_EDIT);
-        if (!canOperatorPermissions){
+        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP,
+                ApiFuncIdentificationConstant.TASK_GROUP_EDIT);
+        if (!canOperatorPermissions) {
             putMsg(result, Status.NO_CURRENT_OPERATING_PERMISSION);
             return result;
         }
-        if(checkDescriptionLength(description)){
+        if (checkDescriptionLength(description)) {
             putMsg(result, Status.DESCRIPTION_TOO_LONG_ERROR);
             return result;
         }
@@ -151,7 +154,7 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
             putMsg(result, Status.TASK_GROUP_SIZE_ERROR);
             return result;
         }
-        Integer exists = taskGroupMapper.selectCount(new QueryWrapper<TaskGroup>().lambda()
+        Long exists = taskGroupMapper.selectCount(new QueryWrapper<TaskGroup>().lambda()
                 .eq(TaskGroup::getName, name)
                 .eq(TaskGroup::getUserId, loginUser.getId())
                 .ne(TaskGroup::getId, id));
@@ -197,7 +200,8 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
      * @return the result code and msg
      */
     @Override
-    public Map<String, Object> queryAllTaskGroup(User loginUser, String name, Integer status, int pageNo, int pageSize) {
+    public Map<String, Object> queryAllTaskGroup(User loginUser, String name, Integer status, int pageNo,
+                                                 int pageSize) {
         return this.doQuery(loginUser, pageNo, pageSize, loginUser.getId(), name, status);
     }
 
@@ -229,18 +233,21 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
         Map<String, Object> result = new HashMap<>();
         Page<TaskGroup> page = new Page<>(pageNo, pageSize);
         PageInfo<TaskGroup> emptyPageInfo = new PageInfo<>(pageNo, pageSize);
-        Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.TASK_GROUP, loginUser.getId(), logger);
+        Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.TASK_GROUP,
+                loginUser.getId(), logger);
         if (ids.isEmpty()) {
             result.put(Constants.DATA_LIST, emptyPageInfo);
             putMsg(result, Status.SUCCESS);
             return result;
         }
-        IPage<TaskGroup> taskGroupPaging = taskGroupMapper.queryTaskGroupPagingByProjectCode(page, new ArrayList<>(ids), projectCode);
+        IPage<TaskGroup> taskGroupPaging =
+                taskGroupMapper.queryTaskGroupPagingByProjectCode(page, new ArrayList<>(ids), projectCode);
 
         return getStringObjectMap(pageNo, pageSize, result, taskGroupPaging);
     }
 
-    private Map<String, Object> getStringObjectMap(int pageNo, int pageSize, Map<String, Object> result, IPage<TaskGroup> taskGroupPaging) {
+    private Map<String, Object> getStringObjectMap(int pageNo, int pageSize, Map<String, Object> result,
+                                                   IPage<TaskGroup> taskGroupPaging) {
         PageInfo<TaskGroup> pageInfo = new PageInfo<>(pageNo, pageSize);
         int total = taskGroupPaging == null ? 0 : (int) taskGroupPaging.getTotal();
         List<TaskGroup> list = taskGroupPaging == null ? new ArrayList<TaskGroup>() : taskGroupPaging.getRecords();
@@ -279,17 +286,20 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
      * @return the result code and msg
      */
     @Override
-    public Map<String, Object> doQuery(User loginUser, int pageNo, int pageSize, int userId, String name, Integer status) {
+    public Map<String, Object> doQuery(User loginUser, int pageNo, int pageSize, int userId, String name,
+                                       Integer status) {
         Map<String, Object> result = new HashMap<>();
         Page<TaskGroup> page = new Page<>(pageNo, pageSize);
         PageInfo<TaskGroup> pageInfo = new PageInfo<>(pageNo, pageSize);
-        Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.TASK_GROUP, userId, logger);
+        Set<Integer> ids = resourcePermissionCheckService.userOwnedResourceIdsAcquisition(AuthorizationType.TASK_GROUP,
+                userId, logger);
         if (ids.isEmpty()) {
             result.put(Constants.DATA_LIST, pageInfo);
             putMsg(result, Status.SUCCESS);
             return result;
         }
-        IPage<TaskGroup> taskGroupPaging = taskGroupMapper.queryTaskGroupPaging(page, new ArrayList<>(ids), name, status);
+        IPage<TaskGroup> taskGroupPaging =
+                taskGroupMapper.queryTaskGroupPaging(page, new ArrayList<>(ids), name, status);
 
         return getStringObjectMap(pageNo, pageSize, result, taskGroupPaging);
     }
@@ -305,8 +315,9 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
     public Map<String, Object> closeTaskGroup(User loginUser, int id) {
         Map<String, Object> result = new HashMap<>();
 
-        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP, ApiFuncIdentificationConstant.TASK_GROUP_CLOSE);
-        if (!canOperatorPermissions){
+        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP,
+                ApiFuncIdentificationConstant.TASK_GROUP_CLOSE);
+        if (!canOperatorPermissions) {
             putMsg(result, Status.NO_CURRENT_OPERATING_PERMISSION);
             return result;
         }
@@ -332,8 +343,9 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
     public Map<String, Object> startTaskGroup(User loginUser, int id) {
         Map<String, Object> result = new HashMap<>();
 
-        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP, ApiFuncIdentificationConstant.TASK_GROUP_CLOSE);
-        if (!canOperatorPermissions){
+        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP,
+                ApiFuncIdentificationConstant.TASK_GROUP_CLOSE);
+        if (!canOperatorPermissions) {
             putMsg(result, Status.NO_CURRENT_OPERATING_PERMISSION);
             return result;
         }
@@ -359,8 +371,9 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
     @Override
     public Map<String, Object> forceStartTask(User loginUser, int queueId) {
         Map<String, Object> result = new HashMap<>();
-        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP, ApiFuncIdentificationConstant.TASK_GROUP_QUEUE_START);
-        if (!canOperatorPermissions){
+        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP,
+                ApiFuncIdentificationConstant.TASK_GROUP_QUEUE_START);
+        if (!canOperatorPermissions) {
             putMsg(result, Status.NO_CURRENT_OPERATING_PERMISSION);
             return result;
         }
@@ -371,8 +384,9 @@ public class TaskGroupServiceImpl extends BaseServiceImpl implements TaskGroupSe
     public Map<String, Object> modifyPriority(User loginUser, Integer queueId, Integer priority) {
         Map<String, Object> result = new HashMap<>();
 
-        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP, ApiFuncIdentificationConstant.TASK_GROUP_QUEUE_PRIORITY);
-        if (!canOperatorPermissions){
+        boolean canOperatorPermissions = canOperatorPermissions(loginUser, null, AuthorizationType.TASK_GROUP,
+                ApiFuncIdentificationConstant.TASK_GROUP_QUEUE_PRIORITY);
+        if (!canOperatorPermissions) {
             putMsg(result, Status.NO_CURRENT_OPERATING_PERMISSION);
             return result;
         }
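
The hunk above also changes the duplicate-name check from "Integer exists" to "Long exists", matching the Long return type that BaseMapper#selectCount has in recent MyBatis-Plus releases. A minimal sketch of the adjusted check under that assumption; it reuses the project's own TaskGroupMapper and TaskGroup types, and the helper name nameAlreadyUsed is illustrative only, not part of the commit:

    import org.apache.dolphinscheduler.dao.entity.TaskGroup;
    import org.apache.dolphinscheduler.dao.mapper.TaskGroupMapper;

    import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;

    public class TaskGroupNameCheckSketch {

        // Returns true when another task group owned by the same user already uses this name.
        // selectCount now yields a boxed Long, so null-check before comparing.
        static boolean nameAlreadyUsed(TaskGroupMapper taskGroupMapper, String name, int userId, int id) {
            Long exists = taskGroupMapper.selectCount(new QueryWrapper<TaskGroup>().lambda()
                    .eq(TaskGroup::getName, name)
                    .eq(TaskGroup::getUserId, userId)
                    .ne(TaskGroup::getId, id));
            return exists != null && exists > 0;
        }
    }
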
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/AlertPluginInstanceVO.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/AlertPluginInstanceVO.java
index d8b76f633d..d7c47185a4 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/AlertPluginInstanceVO.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/AlertPluginInstanceVO.java
@@ -59,7 +59,7 @@ public class AlertPluginInstanceVO {
      */
     private String alertPluginName;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/ScheduleVo.java b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/ScheduleVo.java
index 708eaa1082..873ad12ffd 100644
--- a/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/ScheduleVo.java
+++ b/dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/vo/ScheduleVo.java
@@ -112,7 +112,6 @@ public class ScheduleVo {
      */
     private int warningGroupId;
 
-
     /**
      * process instance priority
      */
@@ -270,7 +269,7 @@ public class ScheduleVo {
         this.userName = userName;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -305,27 +304,27 @@ public class ScheduleVo {
     @Override
     public String toString() {
         return "Schedule{"
-            + "id=" + id
-            + ", processDefinitionCode=" + processDefinitionCode
-            + ", processDefinitionName='" + processDefinitionName + '\''
-            + ", projectName='" + projectName + '\''
-            + ", description='" + definitionDescription + '\''
-            + ", startTime=" + startTime
-            + ", endTime=" + endTime
-            + ", timezoneId='" + timezoneId + +'\''
-            + ", crontab='" + crontab + '\''
-            + ", failureStrategy=" + failureStrategy
-            + ", warningType=" + warningType
-            + ", createTime=" + createTime
-            + ", updateTime=" + updateTime
-            + ", userId=" + userId
-            + ", userName='" + userName + '\''
-            + ", releaseState=" + releaseState
-            + ", warningGroupId=" + warningGroupId
-            + ", processInstancePriority=" + processInstancePriority
-            + ", workerGroup='" + workerGroup + '\''
-            + ", environmentCode='" + environmentCode + '\''
-            + '}';
+                + "id=" + id
+                + ", processDefinitionCode=" + processDefinitionCode
+                + ", processDefinitionName='" + processDefinitionName + '\''
+                + ", projectName='" + projectName + '\''
+                + ", description='" + definitionDescription + '\''
+                + ", startTime=" + startTime
+                + ", endTime=" + endTime
+                + ", timezoneId='" + timezoneId + +'\''
+                + ", crontab='" + crontab + '\''
+                + ", failureStrategy=" + failureStrategy
+                + ", warningType=" + warningType
+                + ", createTime=" + createTime
+                + ", updateTime=" + updateTime
+                + ", userId=" + userId
+                + ", userName='" + userName + '\''
+                + ", releaseState=" + releaseState
+                + ", warningGroupId=" + warningGroupId
+                + ", processInstancePriority=" + processInstancePriority
+                + ", workerGroup='" + workerGroup + '\''
+                + ", environmentCode='" + environmentCode + '\''
+                + '}';
     }
 
     public String getDefinitionDescription() {
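
Both AlertPluginInstanceVO and ScheduleVo above now expose getId() as the boxed Integer rather than the primitive int, so an unsaved value object can report a null id instead of a misleading 0. A small self-contained sketch of the difference (the class below is illustrative only, not a DolphinScheduler type), including the auto-unboxing pitfall callers need to avoid:

    public class BoxedIdSketch {

        // Mirrors the VO pattern: the id stays null until the row is persisted.
        private Integer id;

        public Integer getId() {
            return id;
        }

        public static void main(String[] args) {
            BoxedIdSketch vo = new BoxedIdSketch();
            // A boxed return type lets callers distinguish "not persisted yet" from id 0.
            System.out.println(vo.getId() == null ? "not persisted yet" : vo.getId());
            // Auto-unboxing a null Integer into an int would throw NullPointerException:
            // int raw = vo.getId();
        }
    }
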
diff --git a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/ClusterServiceTest.java b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/ClusterServiceTest.java
index ac2e261ce9..3659fbc937 100644
--- a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/ClusterServiceTest.java
+++ b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/ClusterServiceTest.java
@@ -48,10 +48,7 @@ import org.mockito.Mockito;
 import org.mockito.junit.MockitoJUnitRunner;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import org.springframework.boot.configurationprocessor.json.JSONException;
-import org.springframework.boot.configurationprocessor.json.JSONObject;
 
-import com.baomidou.mybatisplus.core.conditions.query.LambdaQueryWrapper;
 import com.baomidou.mybatisplus.core.metadata.IPage;
 import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
 
@@ -75,85 +72,85 @@ public class ClusterServiceTest {
     @Mock
     private K8sManager k8sManager;
 
-
     public static final String testUserName = "clusterServerTest";
 
     public static final String clusterName = "Env1";
 
     @Before
-    public void setUp(){
+    public void setUp() {
     }
 
     @After
-    public void after(){
+    public void after() {
     }
 
     @Test
     public void testCreateCluster() {
         User loginUser = getGeneralUser();
-        Map<String, Object> result = clusterService.createCluster(loginUser,clusterName,getConfig(),getDesc());
+        Map<String, Object> result = clusterService.createCluster(loginUser, clusterName, getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
         loginUser = getAdminUser();
-        result = clusterService.createCluster(loginUser,clusterName,"",getDesc());
+        result = clusterService.createCluster(loginUser, clusterName, "", getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_CONFIG_IS_NULL, result.get(Constants.STATUS));
 
-        result = clusterService.createCluster(loginUser,"",getConfig(),getDesc());
+        result = clusterService.createCluster(loginUser, "", getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_NAME_IS_NULL, result.get(Constants.STATUS));
 
         Mockito.when(clusterMapper.queryByClusterName(clusterName)).thenReturn(getCluster());
-        result = clusterService.createCluster(loginUser,clusterName,getConfig(),getDesc());
+        result = clusterService.createCluster(loginUser, clusterName, getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_NAME_EXISTS, result.get(Constants.STATUS));
 
         Mockito.when(clusterMapper.insert(Mockito.any(Cluster.class))).thenReturn(1);
-        result = clusterService.createCluster(loginUser,"testName","testConfig","testDesc");
+        result = clusterService.createCluster(loginUser, "testName", "testConfig", "testDesc");
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
 
     @Test
     public void testCheckParams() {
-        Map<String, Object> result = clusterService.checkParams(clusterName,getConfig());
+        Map<String, Object> result = clusterService.checkParams(clusterName, getConfig());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
-        result = clusterService.checkParams("",getConfig());
+        result = clusterService.checkParams("", getConfig());
         Assert.assertEquals(Status.CLUSTER_NAME_IS_NULL, result.get(Constants.STATUS));
-        result = clusterService.checkParams(clusterName,"");
+        result = clusterService.checkParams(clusterName, "");
         Assert.assertEquals(Status.CLUSTER_CONFIG_IS_NULL, result.get(Constants.STATUS));
     }
 
     @Test
     public void testUpdateClusterByCode() throws RemotingException {
         User loginUser = getGeneralUser();
-        Map<String, Object> result = clusterService.updateClusterByCode(loginUser,1L,clusterName,getConfig(),getDesc());
+        Map<String, Object> result =
+                clusterService.updateClusterByCode(loginUser, 1L, clusterName, getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
         loginUser = getAdminUser();
-        result = clusterService.updateClusterByCode(loginUser,1L,clusterName,"",getDesc());
+        result = clusterService.updateClusterByCode(loginUser, 1L, clusterName, "", getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_CONFIG_IS_NULL, result.get(Constants.STATUS));
 
-        result = clusterService.updateClusterByCode(loginUser,1L,"",getConfig(),getDesc());
+        result = clusterService.updateClusterByCode(loginUser, 1L, "", getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_NAME_IS_NULL, result.get(Constants.STATUS));
 
-        result = clusterService.updateClusterByCode(loginUser,2L,clusterName,getConfig(),getDesc());
+        result = clusterService.updateClusterByCode(loginUser, 2L, clusterName, getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_NOT_EXISTS, result.get(Constants.STATUS));
 
         Mockito.when(clusterMapper.queryByClusterName(clusterName)).thenReturn(getCluster());
-        result = clusterService.updateClusterByCode(loginUser,2L,clusterName,getConfig(),getDesc());
+        result = clusterService.updateClusterByCode(loginUser, 2L, clusterName, getConfig(), getDesc());
         logger.info(result.toString());
         Assert.assertEquals(Status.CLUSTER_NAME_EXISTS, result.get(Constants.STATUS));
 
         Mockito.when(clusterMapper.updateById(Mockito.any(Cluster.class))).thenReturn(1);
         Mockito.when(clusterMapper.queryByClusterCode(1L)).thenReturn(getCluster());
 
-        result = clusterService.updateClusterByCode(loginUser,1L,"testName",getConfig(),"test");
+        result = clusterService.updateClusterByCode(loginUser, 1L, "testName", getConfig(), "test");
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
 
@@ -162,12 +159,12 @@ public class ClusterServiceTest {
     @Test
     public void testQueryAllClusterList() {
         Mockito.when(clusterMapper.queryAllClusterList()).thenReturn(Lists.newArrayList(getCluster()));
-        Map<String, Object> result  = clusterService.queryAllClusterList();
+        Map<String, Object> result = clusterService.queryAllClusterList();
         logger.info(result.toString());
-        Assert.assertEquals(Status.SUCCESS,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
 
-        List<Cluster> list = (List<Cluster>)(result.get(Constants.DATA_LIST));
-        Assert.assertEquals(1,list.size());
+        List<Cluster> list = (List<Cluster>) (result.get(Constants.DATA_LIST));
+        Assert.assertEquals(1, list.size());
     }
 
     @Test
@@ -175,7 +172,8 @@ public class ClusterServiceTest {
         IPage<Cluster> page = new Page<>(1, 10);
         page.setRecords(getList());
         page.setTotal(1L);
-        Mockito.when(clusterMapper.queryClusterListPaging(Mockito.any(Page.class), Mockito.eq(clusterName))).thenReturn(page);
+        Mockito.when(clusterMapper.queryClusterListPaging(Mockito.any(Page.class), Mockito.eq(clusterName)))
+                .thenReturn(page);
 
         Result result = clusterService.queryClusterListPaging(1, 10, clusterName);
         logger.info(result.toString());
@@ -188,12 +186,12 @@ public class ClusterServiceTest {
         Mockito.when(clusterMapper.queryByClusterName(clusterName)).thenReturn(null);
         Map<String, Object> result = clusterService.queryClusterByName(clusterName);
         logger.info(result.toString());
-        Assert.assertEquals(Status.QUERY_CLUSTER_BY_NAME_ERROR,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.QUERY_CLUSTER_BY_NAME_ERROR, result.get(Constants.STATUS));
 
         Mockito.when(clusterMapper.queryByClusterName(clusterName)).thenReturn(getCluster());
         result = clusterService.queryClusterByName(clusterName);
         logger.info(result.toString());
-        Assert.assertEquals(Status.SUCCESS,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
 
     @Test
@@ -201,29 +199,29 @@ public class ClusterServiceTest {
         Mockito.when(clusterMapper.queryByClusterCode(1L)).thenReturn(null);
         Map<String, Object> result = clusterService.queryClusterByCode(1L);
         logger.info(result.toString());
-        Assert.assertEquals(Status.QUERY_CLUSTER_BY_CODE_ERROR,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.QUERY_CLUSTER_BY_CODE_ERROR, result.get(Constants.STATUS));
 
         Mockito.when(clusterMapper.queryByClusterCode(1L)).thenReturn(getCluster());
         result = clusterService.queryClusterByCode(1L);
         logger.info(result.toString());
-        Assert.assertEquals(Status.SUCCESS,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
 
     @Test
     public void testDeleteClusterByCode() {
         User loginUser = getGeneralUser();
-        Map<String, Object> result = clusterService.deleteClusterByCode(loginUser,1L);
+        Map<String, Object> result = clusterService.deleteClusterByCode(loginUser, 1L);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
         loginUser = getAdminUser();
         Mockito.when(clusterMapper.deleteByCode(1L)).thenReturn(1);
-        result = clusterService.deleteClusterByCode(loginUser,1L);
+        result = clusterService.deleteClusterByCode(loginUser, 1L);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
 
-        Mockito.when(k8sNamespaceMapper.selectCount(Mockito.any())).thenReturn(1);
-        result = clusterService.deleteClusterByCode(loginUser,1L);
+        Mockito.when(k8sNamespaceMapper.selectCount(Mockito.any())).thenReturn(1L);
+        result = clusterService.deleteClusterByCode(loginUser, 1L);
         logger.info(result.toString());
         Assert.assertEquals(Status.DELETE_CLUSTER_RELATED_NAMESPACE_EXISTS, result.get(Constants.STATUS));
     }
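
The test above now stubs k8sNamespaceMapper.selectCount(...) with 1L rather than 1, which follows from selectCount returning Long after the dependency bump. A standalone sketch of why the literal has to change; CountingMapper is a hypothetical stand-in, not a DolphinScheduler type:

    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    public class SelectCountStubSketch {

        // Hypothetical mapper with the same return type as BaseMapper#selectCount.
        interface CountingMapper {
            Long selectCount(Object queryWrapper);
        }

        public static void main(String[] args) {
            CountingMapper mapper = mock(CountingMapper.class);
            // thenReturn(1) would not compile here: an int literal is not widened to Long,
            // so the stubbed value must be written as 1L.
            when(mapper.selectCount(null)).thenReturn(1L);
            System.out.println(mapper.selectCount(null)); // prints 1
        }
    }
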
diff --git a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/EnvironmentServiceTest.java b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/EnvironmentServiceTest.java
index 3befde4c36..ebb47e788d 100644
--- a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/EnvironmentServiceTest.java
+++ b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/EnvironmentServiceTest.java
@@ -96,34 +96,35 @@ public class EnvironmentServiceTest {
     public void testCreateEnvironment() {
         User loginUser = getGeneralUser();
         Mockito.when(resourcePermissionCheckService.operationPermissionCheck(AuthorizationType.ENVIRONMENT, null,
-                loginUser.getId(),ENVIRONMENT_CREATE, baseServiceLogger)).thenReturn(true);
+                loginUser.getId(), ENVIRONMENT_CREATE, baseServiceLogger)).thenReturn(true);
         Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ENVIRONMENT, null,
                 0, baseServiceLogger)).thenReturn(true);
-        Map<String, Object> result = environmentService.createEnvironment(loginUser,environmentName,getConfig(),getDesc(),workerGroups);
+        Map<String, Object> result =
+                environmentService.createEnvironment(loginUser, environmentName, getConfig(), getDesc(), workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
         loginUser = getAdminUser();
-        result = environmentService.createEnvironment(loginUser,environmentName,"",getDesc(),workerGroups);
+        result = environmentService.createEnvironment(loginUser, environmentName, "", getDesc(), workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_CONFIG_IS_NULL, result.get(Constants.STATUS));
 
-        result = environmentService.createEnvironment(loginUser,"",getConfig(),getDesc(),workerGroups);
+        result = environmentService.createEnvironment(loginUser, "", getConfig(), getDesc(), workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_NAME_IS_NULL, result.get(Constants.STATUS));
 
-        result = environmentService.createEnvironment(loginUser,environmentName,getConfig(),getDesc(),"test");
+        result = environmentService.createEnvironment(loginUser, environmentName, getConfig(), getDesc(), "test");
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_WORKER_GROUPS_IS_INVALID, result.get(Constants.STATUS));
 
         Mockito.when(environmentMapper.queryByEnvironmentName(environmentName)).thenReturn(getEnvironment());
-        result = environmentService.createEnvironment(loginUser,environmentName,getConfig(),getDesc(),workerGroups);
+        result = environmentService.createEnvironment(loginUser, environmentName, getConfig(), getDesc(), workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_NAME_EXISTS, result.get(Constants.STATUS));
 
         Mockito.when(environmentMapper.insert(Mockito.any(Environment.class))).thenReturn(1);
         Mockito.when(relationMapper.insert(Mockito.any(EnvironmentWorkerGroupRelation.class))).thenReturn(1);
-        result = environmentService.createEnvironment(loginUser,"testName","test","test",workerGroups);
+        result = environmentService.createEnvironment(loginUser, "testName", "test", "test", workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
 
@@ -131,7 +132,7 @@ public class EnvironmentServiceTest {
 
     @Test
     public void testCheckParams() {
-        Map<String, Object> result = environmentService.checkParams(environmentName,getConfig(),"test");
+        Map<String, Object> result = environmentService.checkParams(environmentName, getConfig(), "test");
         Assert.assertEquals(Status.ENVIRONMENT_WORKER_GROUPS_IS_INVALID, result.get(Constants.STATUS));
     }
 
@@ -139,33 +140,38 @@ public class EnvironmentServiceTest {
     public void testUpdateEnvironmentByCode() {
         User loginUser = getGeneralUser();
         Mockito.when(resourcePermissionCheckService.operationPermissionCheck(AuthorizationType.ENVIRONMENT, null,
-                loginUser.getId(),ENVIRONMENT_UPDATE, baseServiceLogger)).thenReturn(true);
+                loginUser.getId(), ENVIRONMENT_UPDATE, baseServiceLogger)).thenReturn(true);
         Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ENVIRONMENT, null,
                 0, baseServiceLogger)).thenReturn(true);
-        Map<String, Object> result = environmentService.updateEnvironmentByCode(loginUser,1L,environmentName,getConfig(),getDesc(),workerGroups);
+        Map<String, Object> result = environmentService.updateEnvironmentByCode(loginUser, 1L, environmentName,
+                getConfig(), getDesc(), workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
         loginUser = getAdminUser();
-        result = environmentService.updateEnvironmentByCode(loginUser,1L,environmentName,"",getDesc(),workerGroups);
+        result = environmentService.updateEnvironmentByCode(loginUser, 1L, environmentName, "", getDesc(),
+                workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_CONFIG_IS_NULL, result.get(Constants.STATUS));
 
-        result = environmentService.updateEnvironmentByCode(loginUser,1L,"",getConfig(),getDesc(),workerGroups);
+        result = environmentService.updateEnvironmentByCode(loginUser, 1L, "", getConfig(), getDesc(), workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_NAME_IS_NULL, result.get(Constants.STATUS));
 
-        result = environmentService.updateEnvironmentByCode(loginUser,1L,environmentName,getConfig(),getDesc(),"test");
+        result = environmentService.updateEnvironmentByCode(loginUser, 1L, environmentName, getConfig(), getDesc(),
+                "test");
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_WORKER_GROUPS_IS_INVALID, result.get(Constants.STATUS));
 
         Mockito.when(environmentMapper.queryByEnvironmentName(environmentName)).thenReturn(getEnvironment());
-        result = environmentService.updateEnvironmentByCode(loginUser,2L,environmentName,getConfig(),getDesc(),workerGroups);
+        result = environmentService.updateEnvironmentByCode(loginUser, 2L, environmentName, getConfig(), getDesc(),
+                workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.ENVIRONMENT_NAME_EXISTS, result.get(Constants.STATUS));
 
-        Mockito.when(environmentMapper.update(Mockito.any(Environment.class),Mockito.any(Wrapper.class))).thenReturn(1);
-        result = environmentService.updateEnvironmentByCode(loginUser,1L,"testName","test","test",workerGroups);
+        Mockito.when(environmentMapper.update(Mockito.any(Environment.class), Mockito.any(Wrapper.class)))
+                .thenReturn(1);
+        result = environmentService.updateEnvironmentByCode(loginUser, 1L, "testName", "test", "test", workerGroups);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
@@ -178,12 +184,12 @@ public class EnvironmentServiceTest {
                 1, environmentServiceLogger)).thenReturn(ids);
         Mockito.when(environmentMapper.selectBatchIds(ids)).thenReturn(Lists.newArrayList(getEnvironment()));
 
-        Map<String, Object> result  = environmentService.queryAllEnvironmentList(getAdminUser());
+        Map<String, Object> result = environmentService.queryAllEnvironmentList(getAdminUser());
         logger.info(result.toString());
-        Assert.assertEquals(Status.SUCCESS,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
 
-        List<Environment> list = (List<Environment>)(result.get(Constants.DATA_LIST));
-        Assert.assertEquals(1,list.size());
+        List<Environment> list = (List<Environment>) (result.get(Constants.DATA_LIST));
+        Assert.assertEquals(1, list.size());
     }
 
     @Test
@@ -191,7 +197,8 @@ public class EnvironmentServiceTest {
         IPage<Environment> page = new Page<>(1, 10);
         page.setRecords(getList());
         page.setTotal(1L);
-        Mockito.when(environmentMapper.queryEnvironmentListPaging(Mockito.any(Page.class), Mockito.eq(environmentName))).thenReturn(page);
+        Mockito.when(environmentMapper.queryEnvironmentListPaging(Mockito.any(Page.class), Mockito.eq(environmentName)))
+                .thenReturn(page);
 
         Result result = environmentService.queryEnvironmentListPaging(getAdminUser(), 1, 10, environmentName);
         logger.info(result.toString());
@@ -204,12 +211,12 @@ public class EnvironmentServiceTest {
         Mockito.when(environmentMapper.queryByEnvironmentName(environmentName)).thenReturn(null);
         Map<String, Object> result = environmentService.queryEnvironmentByName(environmentName);
         logger.info(result.toString());
-        Assert.assertEquals(Status.QUERY_ENVIRONMENT_BY_NAME_ERROR,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.QUERY_ENVIRONMENT_BY_NAME_ERROR, result.get(Constants.STATUS));
 
         Mockito.when(environmentMapper.queryByEnvironmentName(environmentName)).thenReturn(getEnvironment());
         result = environmentService.queryEnvironmentByName(environmentName);
         logger.info(result.toString());
-        Assert.assertEquals(Status.SUCCESS,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
 
     @Test
@@ -217,12 +224,12 @@ public class EnvironmentServiceTest {
         Mockito.when(environmentMapper.queryByEnvironmentCode(1L)).thenReturn(null);
         Map<String, Object> result = environmentService.queryEnvironmentByCode(1L);
         logger.info(result.toString());
-        Assert.assertEquals(Status.QUERY_ENVIRONMENT_BY_CODE_ERROR,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.QUERY_ENVIRONMENT_BY_CODE_ERROR, result.get(Constants.STATUS));
 
         Mockito.when(environmentMapper.queryByEnvironmentCode(1L)).thenReturn(getEnvironment());
         result = environmentService.queryEnvironmentByCode(1L);
         logger.info(result.toString());
-        Assert.assertEquals(Status.SUCCESS,result.get(Constants.STATUS));
+        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
 
     @Test
@@ -232,19 +239,19 @@ public class EnvironmentServiceTest {
                 loginUser.getId(), ENVIRONMENT_DELETE, baseServiceLogger)).thenReturn(true);
         Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ENVIRONMENT, null,
                 0, baseServiceLogger)).thenReturn(true);
-        Map<String, Object> result = environmentService.deleteEnvironmentByCode(loginUser,1L);
+        Map<String, Object> result = environmentService.deleteEnvironmentByCode(loginUser, 1L);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
         loginUser = getAdminUser();
-        Mockito.when(taskDefinitionMapper.selectCount(Mockito.any(LambdaQueryWrapper.class))).thenReturn(1);
-        result = environmentService.deleteEnvironmentByCode(loginUser,1L);
+        Mockito.when(taskDefinitionMapper.selectCount(Mockito.any(LambdaQueryWrapper.class))).thenReturn(1L);
+        result = environmentService.deleteEnvironmentByCode(loginUser, 1L);
         logger.info(result.toString());
         Assert.assertEquals(Status.DELETE_ENVIRONMENT_RELATED_TASK_EXISTS, result.get(Constants.STATUS));
 
-        Mockito.when(taskDefinitionMapper.selectCount(Mockito.any(LambdaQueryWrapper.class))).thenReturn(0);
+        Mockito.when(taskDefinitionMapper.selectCount(Mockito.any(LambdaQueryWrapper.class))).thenReturn(0L);
         Mockito.when(environmentMapper.deleteByCode(1L)).thenReturn(1);
-        result = environmentService.deleteEnvironmentByCode(loginUser,1L);
+        result = environmentService.deleteEnvironmentByCode(loginUser, 1L);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
     }
diff --git a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/K8SNamespaceServiceTest.java b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/K8SNamespaceServiceTest.java
index 8732a59658..ed684ea914 100644
--- a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/K8SNamespaceServiceTest.java
+++ b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/K8SNamespaceServiceTest.java
@@ -18,6 +18,7 @@
 package org.apache.dolphinscheduler.api.service;
 
 import org.apache.dolphinscheduler.api.enums.Status;
+import org.apache.dolphinscheduler.api.k8s.K8sClientService;
 import org.apache.dolphinscheduler.api.service.impl.K8SNamespaceServiceImpl;
 import org.apache.dolphinscheduler.api.utils.PageInfo;
 import org.apache.dolphinscheduler.api.utils.Result;
@@ -29,10 +30,10 @@ import org.apache.dolphinscheduler.dao.entity.User;
 import org.apache.dolphinscheduler.dao.mapper.ClusterMapper;
 import org.apache.dolphinscheduler.dao.mapper.K8sNamespaceMapper;
 import org.apache.dolphinscheduler.dao.mapper.UserMapper;
-import org.apache.dolphinscheduler.api.k8s.K8sClientService;
 
 import org.apache.commons.collections.CollectionUtils;
 
+import java.io.Serializable;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.Map;
@@ -77,7 +78,9 @@ public class K8SNamespaceServiceTest {
 
     @Before
     public void setUp() throws Exception {
-        Mockito.when(k8sClientService.upsertNamespaceAndResourceToK8s(Mockito.any(K8sNamespace.class), Mockito.anyString())).thenReturn(null);
+        Mockito.when(
+                k8sClientService.upsertNamespaceAndResourceToK8s(Mockito.any(K8sNamespace.class), Mockito.anyString()))
+                .thenReturn(null);
         Mockito.when(k8sClientService.deleteNamespaceToK8s(Mockito.anyString(), Mockito.anyLong())).thenReturn(null);
     }
 
@@ -90,7 +93,8 @@ public class K8SNamespaceServiceTest {
         IPage<K8sNamespace> page = new Page<>(1, 10);
         page.setTotal(1L);
         page.setRecords(getNamespaceList());
-        Mockito.when(k8sNamespaceMapper.queryK8sNamespacePaging(Mockito.any(Page.class), Mockito.eq(namespace))).thenReturn(page);
+        Mockito.when(k8sNamespaceMapper.queryK8sNamespacePaging(Mockito.any(Page.class), Mockito.eq(namespace)))
+                .thenReturn(page);
         Result result = k8sNamespaceService.queryListPaging(getLoginUser(), namespace, 1, 10);
         logger.info(result.toString());
         PageInfo<K8sNamespace> pageInfo = (PageInfo<K8sNamespace>) result.getData();
@@ -100,7 +104,8 @@ public class K8SNamespaceServiceTest {
     @Test
     public void createK8sNamespace() {
         // namespace is null
-        Map<String, Object> result = k8sNamespaceService.createK8sNamespace(getLoginUser(), null, clusterCode, 10.0, 100);
+        Map<String, Object> result =
+                k8sNamespaceService.createK8sNamespace(getLoginUser(), null, clusterCode, 10.0, 100);
         logger.info(result.toString());
         Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
         // k8s is null
@@ -112,7 +117,7 @@ public class K8SNamespaceServiceTest {
         result = k8sNamespaceService.createK8sNamespace(getLoginUser(), namespace, clusterCode, 10.0, 100);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
-        //null limit cpu and mem
+        // null limit cpu and mem
         result = k8sNamespaceService.createK8sNamespace(getLoginUser(), namespace, clusterCode, null, null);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
@@ -140,22 +145,22 @@ public class K8SNamespaceServiceTest {
 
         Mockito.when(k8sNamespaceMapper.existNamespace(namespace, clusterCode)).thenReturn(true);
 
-        //namespace null
+        // namespace null
         Result result = k8sNamespaceService.verifyNamespaceK8s(null, clusterCode);
         logger.info(result.toString());
         Assert.assertEquals(result.getCode().intValue(), Status.REQUEST_PARAMS_NOT_VALID_ERROR.getCode());
 
-        //k8s null
+        // k8s null
         result = k8sNamespaceService.verifyNamespaceK8s(namespace, null);
         logger.info(result.toString());
         Assert.assertEquals(result.getCode().intValue(), Status.REQUEST_PARAMS_NOT_VALID_ERROR.getCode());
 
-        //exist
+        // exist
         result = k8sNamespaceService.verifyNamespaceK8s(namespace, clusterCode);
         logger.info(result.toString());
         Assert.assertEquals(result.getCode().intValue(), Status.K8S_NAMESPACE_EXIST.getCode());
 
-        //not exist
+        // not exist
         result = k8sNamespaceService.verifyNamespaceK8s(namespace, 9999L);
         logger.info(result.toString());
         Assert.assertEquals(result.getCode().intValue(), Status.SUCCESS.getCode());
@@ -163,7 +168,7 @@ public class K8SNamespaceServiceTest {
 
     @Test
     public void deleteNamespaceById() {
-        Mockito.when(k8sNamespaceMapper.deleteById(Mockito.any())).thenReturn(1);
+        Mockito.when(k8sNamespaceMapper.deleteById(Mockito.<Serializable>any())).thenReturn(1);
         Mockito.when(k8sNamespaceMapper.selectById(1)).thenReturn(getNamespace());
 
         Map<String, Object> result = k8sNamespaceService.deleteNamespaceById(getLoginUser(), 1);
@@ -216,7 +221,6 @@ public class K8SNamespaceServiceTest {
         Assert.assertTrue(CollectionUtils.isEmpty(namespaces));
     }
 
-
     private User getLoginUser() {
 
         User loginUser = new User();
@@ -248,4 +252,4 @@ public class K8SNamespaceServiceTest {
         cluster.setOperator(1);
         return cluster;
     }
-}
\ No newline at end of file
+}
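
The deleteNamespaceById hunk above replaces a bare Mockito.any() with Mockito.<Serializable>any() when stubbing deleteById, presumably because deleteById is overloaded in the upgraded MyBatis-Plus (one variant takes a Serializable id, another the entity), which can leave an untyped matcher ambiguous. A minimal standalone sketch of that pattern; OverloadedDao is hypothetical and only mimics the overload shape:

    import static org.mockito.Mockito.mock;

    import java.io.Serializable;

    import org.mockito.Mockito;

    public class TypedAnyMatcherSketch {

        // Hypothetical DAO whose deleteById is overloaded like BaseMapper's.
        interface OverloadedDao {
            int deleteById(Serializable id);
            int deleteById(Object entity);
        }

        public static void main(String[] args) {
            OverloadedDao dao = mock(OverloadedDao.class);
            // A bare Mockito.any() leaves the compiler unable to pick an overload;
            // the explicit type witness pins the stub to the Serializable variant.
            Mockito.when(dao.deleteById(Mockito.<Serializable>any())).thenReturn(1);
            System.out.println(dao.deleteById(42)); // the boxed Integer id hits the Serializable overload, prints 1
        }
    }
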
diff --git a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/UsersServiceTest.java b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/UsersServiceTest.java
index 1fc13c2198..cc32c621bb 100644
--- a/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/UsersServiceTest.java
+++ b/dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/UsersServiceTest.java
@@ -18,7 +18,6 @@
 package org.apache.dolphinscheduler.api.service;
 
 import static org.apache.dolphinscheduler.api.constants.ApiFuncIdentificationConstant.USER_MANAGER;
-
 import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.Mockito.when;
@@ -158,38 +157,39 @@ public class UsersServiceTest {
         String phone = "13456432345";
         int state = 1;
         try {
-            //userName error
-            Map<String, Object> result = usersService.createUser(user, userName, userPassword, email, tenantId, phone, queueName, state);
+            // userName error
+            Map<String, Object> result =
+                    usersService.createUser(user, userName, userPassword, email, tenantId, phone, queueName, state);
             logger.info(result.toString());
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
             userName = "userTest0001";
             userPassword = "userTest000111111111111111";
-            //password error
+            // password error
             result = usersService.createUser(user, userName, userPassword, email, tenantId, phone, queueName, state);
             logger.info(result.toString());
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
             userPassword = "userTest0001";
             email = "1q.com";
-            //email error
+            // email error
             result = usersService.createUser(user, userName, userPassword, email, tenantId, phone, queueName, state);
             logger.info(result.toString());
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
             email = "122222@qq.com";
             phone = "2233";
-            //phone error
+            // phone error
             result = usersService.createUser(user, userName, userPassword, email, tenantId, phone, queueName, state);
             logger.info(result.toString());
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
             phone = "13456432345";
-            //tenantId not exists
+            // tenantId not exists
             result = usersService.createUser(user, userName, userPassword, email, tenantId, phone, queueName, state);
             logger.info(result.toString());
             Assert.assertEquals(Status.TENANT_NOT_EXIST, result.get(Constants.STATUS));
-            //success
+            // success
             Mockito.when(tenantMapper.queryById(1)).thenReturn(getTenant());
             result = usersService.createUser(user, userName, userPassword, email, 1, phone, queueName, state);
             logger.info(result.toString());
@@ -205,7 +205,8 @@ public class UsersServiceTest {
     public void testQueryUser() {
         String userName = "userTest0001";
         String userPassword = "userTest0001";
-        when(userMapper.queryUserByNamePassword(userName, EncryptionUtils.getMd5(userPassword))).thenReturn(getGeneralUser());
+        when(userMapper.queryUserByNamePassword(userName, EncryptionUtils.getMd5(userPassword)))
+                .thenReturn(getGeneralUser());
         User queryUser = usersService.queryUser(userName, userPassword);
         logger.info(queryUser.toString());
         Assert.assertTrue(queryUser != null);
@@ -231,18 +232,18 @@ public class UsersServiceTest {
         user.setUserType(UserType.ADMIN_USER);
         user.setUserName("test_user");
 
-        //user name null
+        // user name null
         int userId = usersService.getUserIdByName("");
         Assert.assertEquals(0, userId);
 
-        //user not exist
+        // user not exist
         when(usersService.queryUser(user.getUserName())).thenReturn(null);
         int userNotExistId = usersService.getUserIdByName(user.getUserName());
         Assert.assertEquals(-1, userNotExistId);
 
-        //user exist
+        // user exist
         when(usersService.queryUser(user.getUserName())).thenReturn(user);
-        int userExistId = usersService.getUserIdByName(user.getUserName());
+        Integer userExistId = usersService.getUserIdByName(user.getUserName());
         Assert.assertEquals(user.getId(), userExistId);
     }
 
@@ -252,15 +253,19 @@ public class UsersServiceTest {
         user.setUserType(UserType.ADMIN_USER);
         user.setId(1);
 
-        Mockito.when(resourcePermissionCheckService.operationPermissionCheck(AuthorizationType.ACCESS_TOKEN,null, 1, USER_MANAGER, serviceLogger)).thenReturn(true);
-        Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ACCESS_TOKEN, null, 0, serviceLogger)).thenReturn(false);
+        Mockito.when(resourcePermissionCheckService.operationPermissionCheck(AuthorizationType.ACCESS_TOKEN, null, 1,
+                USER_MANAGER, serviceLogger)).thenReturn(true);
+        Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ACCESS_TOKEN, null, 0,
+                serviceLogger)).thenReturn(false);
         Map<String, Object> result = usersService.queryUserList(user);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
-        //success
-        Mockito.when(resourcePermissionCheckService.operationPermissionCheck(AuthorizationType.ACCESS_TOKEN,null, 1, USER_MANAGER, serviceLogger)).thenReturn(true);
-        Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ACCESS_TOKEN, null, 0, serviceLogger)).thenReturn(true);
+        // success
+        Mockito.when(resourcePermissionCheckService.operationPermissionCheck(AuthorizationType.ACCESS_TOKEN, null, 1,
+                USER_MANAGER, serviceLogger)).thenReturn(true);
+        Mockito.when(resourcePermissionCheckService.resourcePermissionCheck(AuthorizationType.ACCESS_TOKEN, null, 0,
+                serviceLogger)).thenReturn(true);
         user.setUserType(UserType.ADMIN_USER);
         when(userMapper.queryEnabledUsers()).thenReturn(getUserList());
         result = usersService.queryUserList(user);
@@ -275,12 +280,12 @@ public class UsersServiceTest {
         page.setRecords(getUserList());
         when(userMapper.queryUserPaging(any(Page.class), eq("userTest"))).thenReturn(page);
 
-        //no operate
+        // no operate
         Result result = usersService.queryUserList(user, "userTest", 1, 10);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM.getCode(), (int) result.getCode());
 
-        //success
+        // success
         user.setUserType(UserType.ADMIN_USER);
         result = usersService.queryUserList(user, "userTest", 1, 10);
         Assert.assertEquals(Status.SUCCESS.getCode(), (int) result.getCode());
@@ -293,14 +298,16 @@ public class UsersServiceTest {
         String userName = "userTest0001";
         String userPassword = "userTest0001";
         try {
-            //user not exist
-            Map<String, Object> result = usersService.updateUser(getLoginUser(), 0, userName, userPassword, "3443@qq.com", 1, "13457864543", "queue", 1, "Asia/Shanghai");
+            // user not exist
+            Map<String, Object> result = usersService.updateUser(getLoginUser(), 0, userName, userPassword,
+                    "3443@qq.com", 1, "13457864543", "queue", 1, "Asia/Shanghai");
             Assert.assertEquals(Status.USER_NOT_EXIST, result.get(Constants.STATUS));
             logger.info(result.toString());
 
-            //success
+            // success
             when(userMapper.selectById(1)).thenReturn(getUser());
-            result = usersService.updateUser(getLoginUser(), 1, userName, userPassword, "32222s@qq.com", 1, "13457864543", "queue", 1, "Asia/Shanghai");
+            result = usersService.updateUser(getLoginUser(), 1, userName, userPassword, "32222s@qq.com", 1,
+                    "13457864543", "queue", 1, "Asia/Shanghai");
             logger.info(result.toString());
             Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
         } catch (Exception e) {
@@ -316,7 +323,7 @@ public class UsersServiceTest {
             when(userMapper.queryTenantCodeByUserId(1)).thenReturn(getUser());
             when(userMapper.selectById(1)).thenReturn(getUser());
             when(accessTokenMapper.deleteAccessTokenByUserId(1)).thenReturn(0);
-            //no operate
+            // no operate
             Map<String, Object> result = usersService.deleteUserById(loginUser, 3);
             logger.info(result.toString());
             Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
@@ -332,7 +339,7 @@ public class UsersServiceTest {
             result = usersService.deleteUserById(loginUser, 1);
             Assert.assertEquals(Status.TRANSFORM_PROJECT_OWNERSHIP, result.get(Constants.STATUS));
 
-            //success
+            // success
             Mockito.when(projectMapper.queryProjectCreatedByUser(1)).thenReturn(null);
             result = usersService.deleteUserById(loginUser, 1);
             logger.info(result.toString());
@@ -350,7 +357,7 @@ public class UsersServiceTest {
         User loginUser = new User();
         int userId = 3;
 
-        //user not exist
+        // user not exist
         loginUser.setId(1);
         loginUser.setUserType(UserType.ADMIN_USER);
         when(userMapper.selectById(userId)).thenReturn(null);
@@ -358,7 +365,7 @@ public class UsersServiceTest {
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NOT_EXIST, result.get(Constants.STATUS));
 
-        //SUCCESS
+        // SUCCESS
         when(userMapper.selectById(userId)).thenReturn(getUser());
         result = usersService.grantProject(loginUser, userId, projectIds);
         logger.info(result.toString());
@@ -439,12 +446,12 @@ public class UsersServiceTest {
         when(userMapper.selectById(1)).thenReturn(getUser());
         User loginUser = new User();
 
-        //user not exist
+        // user not exist
         loginUser.setUserType(UserType.ADMIN_USER);
         Map<String, Object> result = usersService.grantResources(loginUser, 2, resourceIds);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NOT_EXIST, result.get(Constants.STATUS));
-        //success
+        // success
         when(resourceMapper.selectById(Mockito.anyInt())).thenReturn(getResource());
         when(resourceUserMapper.deleteResourceUser(1, 0)).thenReturn(1);
         result = usersService.grantResources(loginUser, 1, resourceIds);
@@ -459,12 +466,12 @@ public class UsersServiceTest {
         when(userMapper.selectById(1)).thenReturn(getUser());
         User loginUser = new User();
 
-        //user not exist
+        // user not exist
         loginUser.setUserType(UserType.ADMIN_USER);
         Map<String, Object> result = usersService.grantUDFFunction(loginUser, 2, udfIds);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NOT_EXIST, result.get(Constants.STATUS));
-        //success
+        // success
         when(udfUserMapper.deleteByUserId(1)).thenReturn(1);
         result = usersService.grantUDFFunction(loginUser, 1, udfIds);
         logger.info(result.toString());
@@ -477,12 +484,12 @@ public class UsersServiceTest {
         when(userMapper.selectById(1)).thenReturn(getUser());
         User loginUser = new User();
 
-        //user not exist
+        // user not exist
         loginUser.setUserType(UserType.ADMIN_USER);
         Map<String, Object> result = usersService.grantNamespaces(loginUser, 2, namespaceIds);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NOT_EXIST, result.get(Constants.STATUS));
-        //success
+        // success
         when(k8sNamespaceUserMapper.deleteNamespaceRelation(0, 1)).thenReturn(1);
         result = usersService.grantNamespaces(loginUser, 1, namespaceIds);
         logger.info(result.toString());
@@ -495,7 +502,7 @@ public class UsersServiceTest {
         User loginUser = new User();
         int userId = 3;
 
-        //user not exist
+        // user not exist
         loginUser.setId(1);
         loginUser.setUserType(UserType.ADMIN_USER);
         when(userMapper.selectById(userId)).thenReturn(null);
@@ -536,10 +543,10 @@ public class UsersServiceTest {
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
         User tempUser = (User) result.get(Constants.DATA_LIST);
-        //check userName
+        // check userName
         Assert.assertEquals("admin", tempUser.getUserName());
 
-        //get general user
+        // get general user
         loginUser.setUserType(null);
         loginUser.setId(1);
         when(userMapper.queryDetailsById(1)).thenReturn(getGeneralUser());
@@ -548,18 +555,18 @@ public class UsersServiceTest {
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
         tempUser = (User) result.get(Constants.DATA_LIST);
-        //check userName
+        // check userName
         Assert.assertEquals("userTest0001", tempUser.getUserName());
     }
 
     @Test
     public void testQueryAllGeneralUsers() {
         User loginUser = new User();
-        //no operate
+        // no operate
         Map<String, Object> result = usersService.queryAllGeneralUsers(loginUser);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
-        //success
+        // success
         loginUser.setUserType(UserType.ADMIN_USER);
         when(userMapper.queryAllGeneralUser()).thenReturn(getUserList());
         result = usersService.queryAllGeneralUsers(loginUser);
@@ -571,11 +578,11 @@ public class UsersServiceTest {
 
     @Test
     public void testVerifyUserName() {
-        //not exist user
+        // not exist user
         Result result = usersService.verifyUserName("admin89899");
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS.getMsg(), result.getMsg());
-        //exist user
+        // exist user
         when(userMapper.queryByUserNameAccurately("userTest0001")).thenReturn(getUser());
         result = usersService.verifyUserName("userTest0001");
         logger.info(result.toString());
@@ -587,12 +594,12 @@ public class UsersServiceTest {
         User loginUser = new User();
         when(userMapper.selectList(null)).thenReturn(getUserList());
         when(userMapper.queryUserListByAlertGroupId(2)).thenReturn(getUserList());
-        //no operate
+        // no operate
         Map<String, Object> result = usersService.unauthorizedUser(loginUser, 2);
         logger.info(result.toString());
         loginUser.setUserType(UserType.ADMIN_USER);
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
-        //success
+        // success
         result = usersService.unauthorizedUser(loginUser, 2);
         logger.info(result.toString());
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
@@ -602,11 +609,11 @@ public class UsersServiceTest {
     public void testAuthorizedUser() {
         User loginUser = new User();
         when(userMapper.queryUserListByAlertGroupId(2)).thenReturn(getUserList());
-        //no operate
+        // no operate
         Map<String, Object> result = usersService.authorizedUser(loginUser, 2);
         logger.info(result.toString());
         Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
-        //success
+        // success
         loginUser.setUserType(UserType.ADMIN_USER);
         result = usersService.authorizedUser(loginUser, 2);
         Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
@@ -622,29 +629,29 @@ public class UsersServiceTest {
         String repeatPassword = "userTest";
         String email = "123@qq.com";
         try {
-            //userName error
+            // userName error
             Map<String, Object> result = usersService.registerUser(userName, userPassword, repeatPassword, email);
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
             userName = "userTest0002";
             userPassword = "userTest000111111111111111";
-            //password error
+            // password error
             result = usersService.registerUser(userName, userPassword, repeatPassword, email);
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
             userPassword = "userTest0002";
             email = "1q.com";
-            //email error
+            // email error
             result = usersService.registerUser(userName, userPassword, repeatPassword, email);
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
-            //repeatPassword error
+            // repeatPassword error
             email = "7400@qq.com";
             repeatPassword = "userPassword";
             result = usersService.registerUser(userName, userPassword, repeatPassword, email);
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
-            //success
+            // success
             repeatPassword = "userTest0002";
             result = usersService.registerUser(userName, userPassword, repeatPassword, email);
             Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
@@ -660,27 +667,27 @@ public class UsersServiceTest {
         user.setUserType(UserType.GENERAL_USER);
         String userName = "userTest0002~";
         try {
-            //not admin
+            // not admin
             Map<String, Object> result = usersService.activateUser(user, userName);
             Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
-            //userName error
+            // userName error
             user.setUserType(UserType.ADMIN_USER);
             result = usersService.activateUser(user, userName);
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
-            //user not exist
+            // user not exist
             userName = "userTest10013";
             result = usersService.activateUser(user, userName);
             Assert.assertEquals(Status.USER_NOT_EXIST, result.get(Constants.STATUS));
 
-            //user state error
+            // user state error
             userName = "userTest0001";
             when(userMapper.queryByUserNameAccurately(userName)).thenReturn(getUser());
             result = usersService.activateUser(user, userName);
             Assert.assertEquals(Status.REQUEST_PARAMS_NOT_VALID_ERROR, result.get(Constants.STATUS));
 
-            //success
+            // success
             when(userMapper.queryByUserNameAccurately(userName)).thenReturn(getDisabledUser());
             result = usersService.activateUser(user, userName);
             Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
@@ -700,11 +707,11 @@ public class UsersServiceTest {
         userNames.add("userTest0004");
 
         try {
-            //not admin
+            // not admin
             Map<String, Object> result = usersService.batchActivateUser(user, userNames);
             Assert.assertEquals(Status.USER_NO_OPERATION_PERM, result.get(Constants.STATUS));
 
-            //batch activate user names
+            // batch activate user names
             user.setUserType(UserType.ADMIN_USER);
             when(userMapper.queryByUserNameAccurately("userTest0001")).thenReturn(getUser());
             when(userMapper.queryByUserNameAccurately("userTest0002")).thenReturn(getDisabledUser());
diff --git a/dolphinscheduler-bom/pom.xml b/dolphinscheduler-bom/pom.xml
index 0599d5510e..3658d56414 100644
--- a/dolphinscheduler-bom/pom.xml
+++ b/dolphinscheduler-bom/pom.xml
@@ -32,10 +32,10 @@
         <spring-boot.version>2.7.3</spring-boot.version>
         <spring.version>5.3.19</spring.version>
         <java-websocket.version>1.5.1</java-websocket.version>
-        <mybatis-plus.version>3.2.0</mybatis-plus.version>
+        <mybatis-plus.version>3.5.2</mybatis-plus.version>
         <quartz.version>2.3.2</quartz.version>
         <druid.version>1.2.4</druid.version>
-        <zookeeper.version>3.4.14</zookeeper.version>
+        <zookeeper.version>3.8.0</zookeeper.version>
         <curator.version>4.3.0</curator.version>
         <curator-test.version>2.12.0</curator-test.version>
         <commons-codec.version>1.11</commons-codec.version>
@@ -52,9 +52,9 @@
         <protostuff.version>1.7.2</protostuff.version>
         <byte-buddy.version>1.9.16</byte-buddy.version>
         <logback.version>1.2.11</logback.version>
-        <hadoop.version>2.7.3</hadoop.version>
-        <cron-utils.version>9.1.3</cron-utils.version>
-        <h2.version>1.4.200</h2.version>
+        <hadoop.version>2.7.7</hadoop.version>
+        <cron-utils.version>9.1.6</cron-utils.version>
+        <h2.version>2.1.210</h2.version>
         <mysql-connector.version>8.0.16</mysql-connector.version>
         <oracle-jdbc.version>21.5.0.0</oracle-jdbc.version>
         <slf4j.version>1.7.36</slf4j.version>
@@ -63,8 +63,8 @@
         <activation.version>1.1</activation.version>
         <javax-mail>1.6.2</javax-mail>
         <guava.version>24.1-jre</guava.version>
-        <postgresql.version>42.3.4</postgresql.version>
-        <hive-jdbc.version>2.1.0</hive-jdbc.version>
+        <postgresql.version>42.4.1</postgresql.version>
+        <hive-jdbc.version>2.3.3</hive-jdbc.version>
         <commons-io.version>2.11.0</commons-io.version>
         <oshi-core.version>6.1.1</oshi-core.version>
         <clickhouse-jdbc.version>0.1.52</clickhouse-jdbc.version>
@@ -85,6 +85,9 @@
         <okhttp.version>3.14.9</okhttp.version>
         <json-path.version>2.7.0</json-path.version>
         <spring-cloud-dependencies.version>2021.0.3</spring-cloud-dependencies.version>
+        <gson.version>2.9.1</gson.version>
+        <dropwizard.metrics-version>4.2.11</dropwizard.metrics-version>
+        <snappy.version>1.1.8.4</snappy.version>
     </properties>
 
     <dependencyManagement>
@@ -203,6 +206,16 @@
                     </exclusion>
                 </exclusions>
             </dependency>
+            <dependency>
+                <groupId>io.dropwizard.metrics</groupId>
+                <artifactId>metrics-core</artifactId>
+                <version>${dropwizard.metrics-version}</version>
+            </dependency>
+            <dependency>
+                <groupId>org.xerial.snappy</groupId>
+                <artifactId>snappy-java</artifactId>
+                <version>${snappy.version}</version>
+            </dependency>
             <dependency>
                 <groupId>org.apache.curator</groupId>
                 <artifactId>curator-framework</artifactId>
@@ -615,6 +628,12 @@
                 <artifactId>spring-ldap</artifactId>
                 <version>1.1.2</version>
             </dependency>
+
+            <dependency>
+                <groupId>com.google.code.gson</groupId>
+                <artifactId>gson</artifactId>
+                <version>${gson.version}</version>
+            </dependency>
         </dependencies>
 
     </dependencyManagement>
diff --git a/dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/model/WorkerServerModel.java b/dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/model/WorkerServerModel.java
index a1f7b479d3..30d4b1686c 100644
--- a/dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/model/WorkerServerModel.java
+++ b/dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/model/WorkerServerModel.java
@@ -60,7 +60,7 @@ public class WorkerServerModel {
      */
     private Date lastHeartbeatTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/PluginDao.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/PluginDao.java
index 57ff712bc8..0527cfd705 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/PluginDao.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/PluginDao.java
@@ -27,6 +27,7 @@ import org.springframework.stereotype.Component;
 
 @Component
 public class PluginDao {
+
     @Autowired
     private PluginDefineMapper pluginDefineMapper;
 
@@ -49,9 +50,10 @@ public class PluginDao {
         requireNonNull(pluginDefine.getPluginName(), "pluginName is null");
         requireNonNull(pluginDefine.getPluginType(), "pluginType is null");
 
-        PluginDefine currPluginDefine = pluginDefineMapper.queryByNameAndType(pluginDefine.getPluginName(), pluginDefine.getPluginType());
+        PluginDefine currPluginDefine =
+                pluginDefineMapper.queryByNameAndType(pluginDefine.getPluginName(), pluginDefine.getPluginType());
         if (currPluginDefine == null) {
-            if (pluginDefineMapper.insert(pluginDefine) == 1 && pluginDefine.getId() > 0) {
+            if (pluginDefineMapper.insert(pluginDefine) == 1 && pluginDefine.getId() != null) {
                 return pluginDefine.getId();
             }
             throw new IllegalStateException("Failed to insert plugin definition");
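
Note on the change above: once the entity id becomes a wrapper Integer (see the entity changes further down), "getId() > 0" would unbox a possibly-null generated key, so the success condition after insert() is rewritten as a null check; MyBatis-Plus backfills an @TableId(type = IdType.AUTO) field when the row is written, which is why "id != null" works as the success signal here. A minimal sketch of that pattern, using a hypothetical entity and mapper rather than the real PluginDefine classes:

    import com.baomidou.mybatisplus.annotation.IdType;
    import com.baomidou.mybatisplus.annotation.TableId;
    import com.baomidou.mybatisplus.core.mapper.BaseMapper;

    public class GeneratedKeySketch {

        public static class Plugin {
            @TableId(value = "id", type = IdType.AUTO)
            private Integer id;                  // stays null until the key is backfilled
            public Integer getId() { return id; }
        }

        public interface PluginMapper extends BaseMapper<Plugin> {
        }

        public static int insertOrFail(PluginMapper mapper, Plugin plugin) {
            // insert() returns the affected row count; on success MyBatis-Plus
            // writes the generated primary key back into plugin.id
            if (mapper.insert(plugin) == 1 && plugin.getId() != null) {
                return plugin.getId();
            }
            throw new IllegalStateException("insert did not produce a generated id");
        }
    }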
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/datasource/SpringConnectionFactory.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/datasource/SpringConnectionFactory.java
index 74ca570fb1..5a486ee828 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/datasource/SpringConnectionFactory.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/datasource/SpringConnectionFactory.java
@@ -24,21 +24,23 @@ import org.apache.ibatis.type.JdbcType;
 
 import java.util.Properties;
 
-import javax.annotation.Resource;
 import javax.sql.DataSource;
 
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.boot.jdbc.init.DataSourceScriptDatabaseInitializer;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;
+import org.springframework.context.annotation.Profile;
 import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
 import org.springframework.core.io.support.ResourcePatternResolver;
 import org.springframework.jdbc.datasource.DataSourceTransactionManager;
 
+import com.baomidou.mybatisplus.annotation.DbType;
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.core.MybatisConfiguration;
 import com.baomidou.mybatisplus.core.config.GlobalConfig;
-import com.baomidou.mybatisplus.extension.plugins.PaginationInterceptor;
+import com.baomidou.mybatisplus.extension.plugins.MybatisPlusInterceptor;
+import com.baomidou.mybatisplus.extension.plugins.inner.PaginationInnerInterceptor;
 import com.baomidou.mybatisplus.extension.spring.MybatisSqlSessionFactoryBean;
 
 @Configuration
@@ -51,8 +53,10 @@ public class SpringConnectionFactory {
     public DataSourceScriptDatabaseInitializer dataSourceScriptDatabaseInitializer;
 
     @Bean
-    public PaginationInterceptor paginationInterceptor() {
-        return new PaginationInterceptor();
+    public MybatisPlusInterceptor paginationInterceptor() {
+        MybatisPlusInterceptor interceptor = new MybatisPlusInterceptor();
+        interceptor.addInnerInterceptor(new PaginationInnerInterceptor(DbType.MYSQL));
+        return interceptor;
     }
 
     @Bean
@@ -61,7 +65,7 @@ public class SpringConnectionFactory {
     }
 
     @Bean
-    public SqlSessionFactory sqlSessionFactory(DataSource dataSource) throws Exception {
+    public SqlSessionFactory sqlSessionFactory(DataSource dataSource, GlobalConfig globalConfig) throws Exception {
         MybatisConfiguration configuration = new MybatisConfiguration();
         configuration.setMapUnderscoreToCamelCase(true);
         configuration.setCacheEnabled(false);
@@ -69,24 +73,25 @@ public class SpringConnectionFactory {
         configuration.setJdbcTypeForNull(JdbcType.NULL);
         configuration.addInterceptor(paginationInterceptor());
 
-        configuration.setGlobalConfig(new GlobalConfig().setBanner(false));
         MybatisSqlSessionFactoryBean sqlSessionFactoryBean = new MybatisSqlSessionFactoryBean();
         sqlSessionFactoryBean.setConfiguration(configuration);
         sqlSessionFactoryBean.setDataSource(dataSource);
 
-        GlobalConfig.DbConfig dbConfig = new GlobalConfig.DbConfig();
-        dbConfig.setIdType(IdType.AUTO);
-        GlobalConfig globalConfig = new GlobalConfig().setBanner(false);
-        globalConfig.setDbConfig(dbConfig);
         sqlSessionFactoryBean.setGlobalConfig(globalConfig);
         sqlSessionFactoryBean.setTypeAliasesPackage("org.apache.dolphinscheduler.dao.entity");
         ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
-        sqlSessionFactoryBean.setMapperLocations(resolver.getResources("org/apache/dolphinscheduler/dao/mapper/*Mapper.xml"));
-        sqlSessionFactoryBean.setTypeEnumsPackage("org.apache.dolphinscheduler.**.enums");
+        sqlSessionFactoryBean
+                .setMapperLocations(resolver.getResources("org/apache/dolphinscheduler/dao/mapper/*Mapper.xml"));
         sqlSessionFactoryBean.setDatabaseIdProvider(databaseIdProvider());
         return sqlSessionFactoryBean.getObject();
     }
 
+    @Bean
+    public GlobalConfig globalConfig() {
+        return new GlobalConfig().setDbConfig(new GlobalConfig.DbConfig()
+                .setIdType(IdType.AUTO)).setBanner(false);
+    }
+
     @Bean
     public DatabaseIdProvider databaseIdProvider() {
         DatabaseIdProvider databaseIdProvider = new VendorDatabaseIdProvider();
@@ -97,4 +102,22 @@ public class SpringConnectionFactory {
         databaseIdProvider.setProperties(properties);
         return databaseIdProvider;
     }
+
+    @Bean
+    @Profile("mysql")
+    public DbType mysql() {
+        return DbType.MYSQL;
+    }
+
+    @Bean
+    @Profile("h2")
+    public DbType h2() {
+        return DbType.H2;
+    }
+
+    @Bean
+    @Profile("postgresql")
+    public DbType postgresql() {
+        return DbType.POSTGRE_SQL;
+    }
 }
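
Note on the SpringConnectionFactory changes above: mybatis-plus 3.5.x drops the old PaginationInterceptor, so pagination is now wired through MybatisPlusInterceptor with a registered PaginationInnerInterceptor, the GlobalConfig (IdType.AUTO, banner off) moves into its own bean that the SqlSessionFactory receives, and the new @Profile beans expose the active database dialect as a DbType bean. Existing selectPage() callers keep working; the inner interceptor rewrites them into paged SQL and fills in the total count. A minimal caller-side sketch, assuming the existing ProcessDefinitionMapper (a BaseMapper<ProcessDefinition>) and the project_code column name:

    import java.util.List;

    import org.apache.dolphinscheduler.dao.entity.ProcessDefinition;
    import org.apache.dolphinscheduler.dao.mapper.ProcessDefinitionMapper;

    import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
    import com.baomidou.mybatisplus.core.metadata.IPage;
    import com.baomidou.mybatisplus.extension.plugins.pagination.Page;

    public class PaginationSketch {

        private final ProcessDefinitionMapper processDefinitionMapper;

        public PaginationSketch(ProcessDefinitionMapper processDefinitionMapper) {
            this.processDefinitionMapper = processDefinitionMapper;
        }

        public List<ProcessDefinition> firstPage(long projectCode) {
            // selectPage() is intercepted by PaginationInnerInterceptor, which
            // appends the dialect-specific LIMIT clause and runs the count query
            IPage<ProcessDefinition> page = processDefinitionMapper.selectPage(
                    new Page<>(1, 10),                                  // pageNo, pageSize
                    new QueryWrapper<ProcessDefinition>().eq("project_code", projectCode));
            return page.getRecords();
        }
    }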
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AccessToken.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AccessToken.java
index 56f976173b..a3ebdd3799 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AccessToken.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AccessToken.java
@@ -26,11 +26,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_access_token")
 public class AccessToken {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * user_id
      */
@@ -59,7 +60,7 @@ public class AccessToken {
     @TableField(exist = false)
     private String userName;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertGroup.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertGroup.java
index 174eb55a3a..6c50154cb6 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertGroup.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertGroup.java
@@ -26,11 +26,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_alertgroup")
 public class AlertGroup {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * group_name
      */
@@ -62,7 +63,7 @@ public class AlertGroup {
     @TableField(value = "create_user_id")
     private int createUserId;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -138,13 +139,15 @@ public class AlertGroup {
         if (groupName != null ? !groupName.equals(that.groupName) : that.groupName != null) {
             return false;
         }
-        if (alertInstanceIds != null ? !alertInstanceIds.equals(that.alertInstanceIds) : that.alertInstanceIds != null) {
+        if (alertInstanceIds != null ? !alertInstanceIds.equals(that.alertInstanceIds)
+                : that.alertInstanceIds != null) {
             return false;
         }
         if (description != null ? !description.equals(that.description) : that.description != null) {
             return false;
         }
-        return !(createTime != null ? !createTime.equals(that.createTime) : that.createTime != null) && !(updateTime != null ? !updateTime.equals(that.updateTime) : that.updateTime != null);
+        return !(createTime != null ? !createTime.equals(that.createTime) : that.createTime != null)
+                && !(updateTime != null ? !updateTime.equals(that.updateTime) : that.updateTime != null);
 
     }
 
@@ -163,12 +166,12 @@ public class AlertGroup {
     @Override
     public String toString() {
         return "AlertGroup{"
-            + "id=" + id
-            + "createUserId=" + createUserId
-            + ", groupName='" + groupName + '\''
-            + ", description='" + description + '\''
-            + ", createTime=" + createTime
-            + ", updateTime=" + updateTime
-            + '}';
+                + "id=" + id
+                + "createUserId=" + createUserId
+                + ", groupName='" + groupName + '\''
+                + ", description='" + description + '\''
+                + ", createTime=" + createTime
+                + ", updateTime=" + updateTime
+                + '}';
     }
 }
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertPluginInstance.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertPluginInstance.java
index f94d220fb2..12974110b7 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertPluginInstance.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertPluginInstance.java
@@ -35,7 +35,7 @@ public class AlertPluginInstance {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * plugin_define_id
@@ -87,7 +87,7 @@ public class AlertPluginInstance {
         this.instanceName = instanceName;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -135,4 +135,3 @@ public class AlertPluginInstance {
         this.instanceName = instanceName;
     }
 }
-
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertSendStatus.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertSendStatus.java
index c1876a581d..1e72c435dd 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertSendStatus.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AlertSendStatus.java
@@ -30,11 +30,12 @@ import com.google.common.base.Objects;
 
 @TableName("t_ds_alert_send_status")
 public class AlertSendStatus {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * alert id
@@ -66,7 +67,7 @@ public class AlertSendStatus {
     @TableField("create_time")
     private Date createTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -134,12 +135,12 @@ public class AlertSendStatus {
     @Override
     public String toString() {
         return new StringJoiner(", ", AlertSendStatus.class.getSimpleName() + "[", "]")
-            .add("id=" + id)
-            .add("alertId=" + alertId)
-            .add("alertPluginInstanceId=" + alertPluginInstanceId)
-            .add("sendStatus=" + sendStatus)
-            .add("log='" + log + "'")
-            .add("createTime=" + createTime)
-            .toString();
+                .add("id=" + id)
+                .add("alertId=" + alertId)
+                .add("alertPluginInstanceId=" + alertPluginInstanceId)
+                .add("sendStatus=" + sendStatus)
+                .add("log='" + log + "'")
+                .add("createTime=" + createTime)
+                .toString();
     }
 }
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AuditLog.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AuditLog.java
index 6cb9821bf0..10023d1547 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AuditLog.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/AuditLog.java
@@ -31,7 +31,7 @@ public class AuditLog {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * user id
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Cluster.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Cluster.java
index ee138f3c0d..ec1e7d9dd3 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Cluster.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Cluster.java
@@ -30,7 +30,7 @@ import com.baomidou.mybatisplus.annotation.TableName;
 public class Cluster {
 
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * cluster code
@@ -58,7 +58,7 @@ public class Cluster {
 
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Command.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Command.java
index ae2ff6258a..8c6a950ea3 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Command.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Command.java
@@ -40,7 +40,7 @@ public class Command {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * command type
@@ -146,22 +146,21 @@ public class Command {
     }
 
     public Command(
-            CommandType commandType,
-            TaskDependType taskDependType,
-            FailureStrategy failureStrategy,
-            int executorId,
-            long processDefinitionCode,
-            String commandParam,
-            WarningType warningType,
-            int warningGroupId,
-            Date scheduleTime,
-            String workerGroup,
-            Long environmentCode,
-            Priority processInstancePriority,
-            int dryRun,
-            int processInstanceId,
-            int processDefinitionVersion
-    ) {
+                   CommandType commandType,
+                   TaskDependType taskDependType,
+                   FailureStrategy failureStrategy,
+                   int executorId,
+                   long processDefinitionCode,
+                   String commandParam,
+                   WarningType warningType,
+                   int warningGroupId,
+                   Date scheduleTime,
+                   String workerGroup,
+                   Long environmentCode,
+                   Priority processInstancePriority,
+                   int dryRun,
+                   int processInstanceId,
+                   int processDefinitionVersion) {
         this.commandType = commandType;
         this.executorId = executorId;
         this.processDefinitionCode = processDefinitionCode;
@@ -189,7 +188,7 @@ public class Command {
         this.taskDependType = taskDependType;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -349,7 +348,8 @@ public class Command {
             return false;
         }
 
-        if (environmentCode != null ? environmentCode.equals(command.environmentCode) : command.environmentCode == null) {
+        if (environmentCode != null ? environmentCode.equals(command.environmentCode)
+                : command.environmentCode == null) {
             return false;
         }
 
@@ -437,4 +437,3 @@ public class Command {
     }
 
 }
-
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DataSource.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DataSource.java
index 68120cafe8..87b9d0eb77 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DataSource.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DataSource.java
@@ -28,11 +28,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_datasource")
 public class DataSource {
+
     /**
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * user id
@@ -78,7 +79,7 @@ public class DataSource {
     public DataSource() {
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DatasourceUser.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DatasourceUser.java
index 6f18d0c9ee..7c09da5210 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DatasourceUser.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DatasourceUser.java
@@ -33,7 +33,7 @@ public class DatasourceUser {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * user id
@@ -59,7 +59,7 @@ public class DatasourceUser {
      */
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqComparisonType.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqComparisonType.java
index cf35342ce8..4941249f79 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqComparisonType.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqComparisonType.java
@@ -27,11 +27,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_dq_comparison_type")
 public class DqComparisonType implements Serializable {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * type
      */
@@ -68,7 +69,7 @@ public class DqComparisonType implements Serializable {
     @TableField(value = "update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqExecuteResult.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqExecuteResult.java
index ed5de24d72..b0a2e07483 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqExecuteResult.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqExecuteResult.java
@@ -27,11 +27,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_dq_execute_result")
 public class DqExecuteResult implements Serializable {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * process defined id
      */
@@ -41,7 +42,7 @@ public class DqExecuteResult implements Serializable {
      * process definition name
      */
     @TableField(exist = false)
-    private String  processDefinitionName;
+    private String processDefinitionName;
     /**
      * process definition code
      */
@@ -153,7 +154,7 @@ public class DqExecuteResult implements Serializable {
     @TableField(value = "update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -355,7 +356,7 @@ public class DqExecuteResult implements Serializable {
 
     @Override
     public String toString() {
-        return "DqExecuteResult{" 
+        return "DqExecuteResult{"
                 + "id=" + id
                 + ", processDefinitionId=" + processDefinitionId
                 + ", processDefinitionName='" + processDefinitionName + '\''
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRule.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRule.java
index bb87db257c..77369f78e1 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRule.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRule.java
@@ -27,11 +27,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_dq_rule")
 public class DqRule implements Serializable {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * name
      */
@@ -68,7 +69,7 @@ public class DqRule implements Serializable {
     @TableField(value = "update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleExecuteSql.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleExecuteSql.java
index d349b63f34..bfaed0628e 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleExecuteSql.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleExecuteSql.java
@@ -32,11 +32,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
  */
 @TableName("t_ds_dq_rule_execute_sql")
 public class DqRuleExecuteSql implements Serializable {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * index,ensure the execution order of sql
      */
@@ -73,7 +74,7 @@ public class DqRuleExecuteSql implements Serializable {
     @TableField(value = "update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -150,4 +151,4 @@ public class DqRuleExecuteSql implements Serializable {
                 + ", updateTime=" + updateTime
                 + '}';
     }
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleInputEntry.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleInputEntry.java
index 63659f5688..ecb13d1fc9 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleInputEntry.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqRuleInputEntry.java
@@ -34,11 +34,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
  */
 @TableName("t_ds_dq_rule_input_entry")
 public class DqRuleInputEntry implements Serializable {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * form field name
      */
@@ -127,7 +128,7 @@ public class DqRuleInputEntry implements Serializable {
     @TableField(value = "update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -294,4 +295,4 @@ public class DqRuleInputEntry implements Serializable {
                 + ", updateTime=" + updateTime
                 + '}';
     }
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqTaskStatisticsValue.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqTaskStatisticsValue.java
index f76b97f98b..79b63d65d0 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqTaskStatisticsValue.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/DqTaskStatisticsValue.java
@@ -27,11 +27,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_dq_task_statistics_value")
 public class DqTaskStatisticsValue implements Serializable {
+
     /**
      * primary key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * process defined id
      */
@@ -41,7 +42,7 @@ public class DqTaskStatisticsValue implements Serializable {
      * process definition name
      */
     @TableField(exist = false)
-    private String  processDefinitionName;
+    private String processDefinitionName;
     /**
      * task instance id
      */
@@ -93,7 +94,7 @@ public class DqTaskStatisticsValue implements Serializable {
     @TableField(value = "update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Environment.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Environment.java
index 8aed0fe8df..ca6b569638 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Environment.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Environment.java
@@ -30,7 +30,7 @@ import com.baomidou.mybatisplus.annotation.TableName;
 public class Environment {
 
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * environment code
@@ -58,7 +58,7 @@ public class Environment {
 
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/EnvironmentWorkerGroupRelation.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/EnvironmentWorkerGroupRelation.java
index 14bce3959f..683d8a5be0 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/EnvironmentWorkerGroupRelation.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/EnvironmentWorkerGroupRelation.java
@@ -30,7 +30,7 @@ import com.baomidou.mybatisplus.annotation.TableName;
 public class EnvironmentWorkerGroupRelation {
 
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * environment code
@@ -51,7 +51,7 @@ public class EnvironmentWorkerGroupRelation {
 
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ErrorCommand.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ErrorCommand.java
index f050dd2b74..db77bb7739 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ErrorCommand.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ErrorCommand.java
@@ -121,7 +121,8 @@ public class ErrorCommand {
      */
     private int dryRun;
 
-    public ErrorCommand() {}
+    public ErrorCommand() {
+    }
 
     public ErrorCommand(Command command, String message) {
         this.id = command.getId();
@@ -150,7 +151,7 @@ public class ErrorCommand {
         this.taskDependType = taskDependType;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8s.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8s.java
index 9f2ef2ff77..06fe3f0b48 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8s.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8s.java
@@ -29,11 +29,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
  */
 @TableName("t_ds_k8s")
 public class K8s {
+
     /**
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * k8s name
      */
@@ -60,7 +61,7 @@ public class K8s {
 
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8sNamespaceUser.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8sNamespaceUser.java
index 87a7a3ecc4..adc97976e0 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8sNamespaceUser.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/K8sNamespaceUser.java
@@ -29,11 +29,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
  */
 @TableName("t_ds_relation_namespace_user")
 public class K8sNamespaceUser {
+
     /**
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * user id
@@ -76,7 +77,7 @@ public class K8sNamespaceUser {
     @TableField("update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -151,14 +152,14 @@ public class K8sNamespaceUser {
     @Override
     public String toString() {
         return "K8sNamespaceUser{" +
-            "id=" + id +
-            ", userId=" + userId +
-            ", namespaceId=" + namespaceId +
-            ", k8s=" + k8s +
-            ", namespaceName=" + namespaceName +
-            ", perm=" + perm +
-            ", createTime=" + createTime +
-            ", updateTime=" + updateTime +
-            '}';
+                "id=" + id +
+                ", userId=" + userId +
+                ", namespaceId=" + namespaceId +
+                ", k8s=" + k8s +
+                ", namespaceName=" + namespaceName +
+                ", perm=" + perm +
+                ", createTime=" + createTime +
+                ", updateTime=" + updateTime +
+                '}';
     }
 }
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/PluginDefine.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/PluginDefine.java
index 2be8988a08..7c73cdd816 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/PluginDefine.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/PluginDefine.java
@@ -34,7 +34,7 @@ public class PluginDefine {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * plugin name
@@ -74,7 +74,7 @@ public class PluginDefine {
         this.updateTime = new Date();
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -122,4 +122,3 @@ public class PluginDefine {
         this.updateTime = updateTime;
     }
 }
-
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessDefinition.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessDefinition.java
index 80331db34d..f0a9b11872 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessDefinition.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessDefinition.java
@@ -46,7 +46,7 @@ public class ProcessDefinition {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * code
@@ -171,7 +171,8 @@ public class ProcessDefinition {
      */
     private ProcessExecutionTypeEnum executionType;
 
-    public ProcessDefinition() { }
+    public ProcessDefinition() {
+    }
 
     public ProcessDefinition(long projectCode,
                              String name,
@@ -223,7 +224,7 @@ public class ProcessDefinition {
         this.version = version;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -418,44 +419,44 @@ public class ProcessDefinition {
         }
         ProcessDefinition that = (ProcessDefinition) o;
         return projectCode == that.projectCode
-            && userId == that.userId
-            && timeout == that.timeout
-            && tenantId == that.tenantId
-            && Objects.equals(name, that.name)
-            && releaseState == that.releaseState
-            && Objects.equals(description, that.description)
-            && Objects.equals(globalParams, that.globalParams)
-            && flag == that.flag
-            && executionType == that.executionType
-            && Objects.equals(locations, that.locations);
+                && userId == that.userId
+                && timeout == that.timeout
+                && tenantId == that.tenantId
+                && Objects.equals(name, that.name)
+                && releaseState == that.releaseState
+                && Objects.equals(description, that.description)
+                && Objects.equals(globalParams, that.globalParams)
+                && flag == that.flag
+                && executionType == that.executionType
+                && Objects.equals(locations, that.locations);
     }
 
     @Override
     public String toString() {
         return "ProcessDefinition{"
-            + "id=" + id
-            + ", code=" + code
-            + ", name='" + name + '\''
-            + ", version=" + version
-            + ", releaseState=" + releaseState
-            + ", projectCode=" + projectCode
-            + ", description='" + description + '\''
-            + ", globalParams='" + globalParams + '\''
-            + ", globalParamList=" + globalParamList
-            + ", globalParamMap=" + globalParamMap
-            + ", createTime=" + createTime
-            + ", updateTime=" + updateTime
-            + ", flag=" + flag
-            + ", userId=" + userId
-            + ", userName='" + userName + '\''
-            + ", projectName='" + projectName + '\''
-            + ", locations='" + locations + '\''
-            + ", scheduleReleaseState=" + scheduleReleaseState
-            + ", timeout=" + timeout
-            + ", tenantId=" + tenantId
-            + ", tenantCode='" + tenantCode + '\''
-            + ", modifyBy='" + modifyBy + '\''
-            + ", warningGroupId=" + warningGroupId
-            + '}';
+                + "id=" + id
+                + ", code=" + code
+                + ", name='" + name + '\''
+                + ", version=" + version
+                + ", releaseState=" + releaseState
+                + ", projectCode=" + projectCode
+                + ", description='" + description + '\''
+                + ", globalParams='" + globalParams + '\''
+                + ", globalParamList=" + globalParamList
+                + ", globalParamMap=" + globalParamMap
+                + ", createTime=" + createTime
+                + ", updateTime=" + updateTime
+                + ", flag=" + flag
+                + ", userId=" + userId
+                + ", userName='" + userName + '\''
+                + ", projectName='" + projectName + '\''
+                + ", locations='" + locations + '\''
+                + ", scheduleReleaseState=" + scheduleReleaseState
+                + ", timeout=" + timeout
+                + ", tenantId=" + tenantId
+                + ", tenantCode='" + tenantCode + '\''
+                + ", modifyBy='" + modifyBy + '\''
+                + ", warningGroupId=" + warningGroupId
+                + '}';
     }
 }
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstance.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstance.java
index ed2a383ba2..c3d42b1833 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstance.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstance.java
@@ -34,6 +34,9 @@ import java.util.Date;
 import java.util.List;
 import java.util.Objects;
 
+import lombok.Data;
+import lombok.NoArgsConstructor;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
@@ -56,7 +59,7 @@ public class ProcessInstance {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * process definition code
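
Note on the ProcessInstance change above: besides the id type, the hunk pulls in lombok.Data and lombok.NoArgsConstructor, which presumably back @Data/@NoArgsConstructor annotations further down the file so the accessors no longer need to be hand-written. A minimal sketch of what such an annotated entity expands to (illustration only, not the actual ProcessInstance definition):

    import lombok.Data;
    import lombok.NoArgsConstructor;

    // Lombok generates the no-arg constructor, a getter and setter for every
    // field, plus equals/hashCode/toString, at compile time.
    @Data
    @NoArgsConstructor
    public class InstanceSketch {
        private Integer id;        // wrapper type, so an unsaved instance has a null id
        private String name;
    }

    // roughly equivalent to hand-writing:
    //   public InstanceSketch() {}
    //   public Integer getId() { return id; }
    //   public void setId(Integer id) { this.id = id; }
    //   public String getName() { return name; }
    //   public void setName(String name) { this.name = name; }
    //   plus equals(), hashCode() and toString()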
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstanceMap.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstanceMap.java
index 4cd2186f18..817998bf66 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstanceMap.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessInstanceMap.java
@@ -31,7 +31,7 @@ public class ProcessInstanceMap {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * parent process instance id
@@ -48,7 +48,7 @@ public class ProcessInstanceMap {
      */
     private int processInstanceId;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessTaskRelation.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessTaskRelation.java
index 4f999b6c06..2946e38667 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessTaskRelation.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProcessTaskRelation.java
@@ -39,7 +39,7 @@ public class ProcessTaskRelation {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * name
@@ -140,7 +140,7 @@ public class ProcessTaskRelation {
         this.name = name;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -246,36 +246,37 @@ public class ProcessTaskRelation {
         }
         ProcessTaskRelation that = (ProcessTaskRelation) o;
         return processDefinitionVersion == that.processDefinitionVersion
-            && projectCode == that.projectCode
-            && processDefinitionCode == that.processDefinitionCode
-            && preTaskCode == that.preTaskCode
-            && preTaskVersion == that.preTaskVersion
-            && postTaskCode == that.postTaskCode
-            && postTaskVersion == that.postTaskVersion
-            && Objects.equals(name, that.name);
+                && projectCode == that.projectCode
+                && processDefinitionCode == that.processDefinitionCode
+                && preTaskCode == that.preTaskCode
+                && preTaskVersion == that.preTaskVersion
+                && postTaskCode == that.postTaskCode
+                && postTaskVersion == that.postTaskVersion
+                && Objects.equals(name, that.name);
     }
 
     @Override
     public int hashCode() {
-        return Objects.hash(name, processDefinitionVersion, projectCode, processDefinitionCode, preTaskCode, preTaskVersion, postTaskCode, postTaskVersion);
+        return Objects.hash(name, processDefinitionVersion, projectCode, processDefinitionCode, preTaskCode,
+                preTaskVersion, postTaskCode, postTaskVersion);
     }
 
     @Override
     public String toString() {
         return "ProcessTaskRelation{"
-            + "id=" + id
-            + ", name='" + name + '\''
-            + ", processDefinitionVersion=" + processDefinitionVersion
-            + ", projectCode=" + projectCode
-            + ", processDefinitionCode=" + processDefinitionCode
-            + ", preTaskCode=" + preTaskCode
-            + ", preTaskVersion=" + preTaskVersion
-            + ", postTaskCode=" + postTaskCode
-            + ", postTaskVersion=" + postTaskVersion
-            + ", conditionType=" + conditionType
-            + ", conditionParams='" + conditionParams + '\''
-            + ", createTime=" + createTime
-            + ", updateTime=" + updateTime
-            + '}';
+                + "id=" + id
+                + ", name='" + name + '\''
+                + ", processDefinitionVersion=" + processDefinitionVersion
+                + ", projectCode=" + projectCode
+                + ", processDefinitionCode=" + processDefinitionCode
+                + ", preTaskCode=" + preTaskCode
+                + ", preTaskVersion=" + preTaskVersion
+                + ", postTaskCode=" + postTaskCode
+                + ", postTaskVersion=" + postTaskVersion
+                + ", conditionType=" + conditionType
+                + ", conditionParams='" + conditionParams + '\''
+                + ", createTime=" + createTime
+                + ", updateTime=" + updateTime
+                + '}';
     }
 }
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Project.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Project.java
index 2a41c01306..ab0ec861ea 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Project.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Project.java
@@ -34,7 +34,7 @@ public class Project {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * user id
@@ -115,7 +115,7 @@ public class Project {
         this.instRunningCount = instRunningCount;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -226,6 +226,7 @@ public class Project {
     }
 
     public static final class Builder {
+
         private int id;
         private int userId;
         private String userName;
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProjectUser.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProjectUser.java
index 63c292ea10..e4de108250 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProjectUser.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/ProjectUser.java
@@ -26,11 +26,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_relation_project_user")
 public class ProjectUser {
+
     /**
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     @TableField("user_id")
     private int userId;
@@ -67,7 +68,7 @@ public class ProjectUser {
     @TableField("update_time")
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -142,15 +143,15 @@ public class ProjectUser {
     @Override
     public String toString() {
         return "ProjectUser{"
-               + "id=" + id
-               + ", userId=" + userId
-               + ", projectId=" + projectId
-               + ", projectCode=" + projectCode
-               + ", projectName='" + projectName + '\''
-               + ", userName='" + userName + '\''
-               + ", perm=" + perm
-               + ", createTime=" + createTime
-               + ", updateTime=" + updateTime
-               + '}';
+                + "id=" + id
+                + ", userId=" + userId
+                + ", projectId=" + projectId
+                + ", projectCode=" + projectCode
+                + ", projectName='" + projectName + '\''
+                + ", userName='" + userName + '\''
+                + ", perm=" + perm
+                + ", createTime=" + createTime
+                + ", updateTime=" + updateTime
+                + '}';
     }
 }
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Queue.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Queue.java
index cc423e13ca..6d7261d195 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Queue.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Queue.java
@@ -16,12 +16,12 @@
  */
 package org.apache.dolphinscheduler.dao.entity;
 
+import java.util.Date;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableId;
 import com.baomidou.mybatisplus.annotation.TableName;
 
-import java.util.Date;
-
 /**
  * queue
  */
@@ -31,8 +31,8 @@ public class Queue {
     /**
      * id
      */
-    @TableId(value="id", type=IdType.AUTO)
-    private int id;
+    @TableId(value = "id", type = IdType.AUTO)
+    private Integer id;
     /**
      * queue name
      */
@@ -71,7 +71,7 @@ public class Queue {
         this.updateTime = now;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Resource.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Resource.java
index 1cafe09291..95a9078f99 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Resource.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Resource.java
@@ -17,22 +17,23 @@
 
 package org.apache.dolphinscheduler.dao.entity;
 
-import com.baomidou.mybatisplus.annotation.TableField;
 import org.apache.dolphinscheduler.spi.enums.ResourceType;
 
 import java.util.Date;
 
 import com.baomidou.mybatisplus.annotation.IdType;
+import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
 import com.baomidou.mybatisplus.annotation.TableName;
 
 @TableName("t_ds_resources")
 public class Resource {
+
     /**
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * parent id
@@ -95,7 +96,6 @@ public class Resource {
     @TableField(exist = false)
     private String userName;
 
-
     public Resource() {
     }
 
@@ -121,7 +121,8 @@ public class Resource {
         this.isDirectory = isDirectory;
     }
 
-    public Resource(int pid, String alias, String fullName, boolean isDirectory, String description, String fileName, int userId, ResourceType type, long size, Date createTime, Date updateTime) {
+    public Resource(int pid, String alias, String fullName, boolean isDirectory, String description, String fileName,
+                    int userId, ResourceType type, long size, Date createTime, Date updateTime) {
         this.pid = pid;
         this.alias = alias;
         this.fullName = fullName;
@@ -135,7 +136,7 @@ public class Resource {
         this.updateTime = updateTime;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -199,7 +200,6 @@ public class Resource {
         this.userId = userId;
     }
 
-
     public ResourceType getType() {
         return type;
     }
@@ -243,20 +243,20 @@ public class Resource {
     @Override
     public String toString() {
         return "Resource{" +
-            "id=" + id +
-            ", pid=" + pid +
-            ", alias='" + alias + '\'' +
-            ", fullName='" + fullName + '\'' +
-            ", isDirectory=" + isDirectory +
-            ", description='" + description + '\'' +
-            ", fileName='" + fileName + '\'' +
-            ", userId=" + userId +
-            ", type=" + type +
-            ", size=" + size +
-            ", createTime=" + createTime +
-            ", updateTime=" + updateTime +
-            ",userName=" + userName +
-            '}';
+                "id=" + id +
+                ", pid=" + pid +
+                ", alias='" + alias + '\'' +
+                ", fullName='" + fullName + '\'' +
+                ", isDirectory=" + isDirectory +
+                ", description='" + description + '\'' +
+                ", fileName='" + fileName + '\'' +
+                ", userId=" + userId +
+                ", type=" + type +
+                ", size=" + size +
+                ", createTime=" + createTime +
+                ", updateTime=" + updateTime +
+                ",userName=" + userName +
+                '}';
     }
 
     @Override
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Schedule.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Schedule.java
index 4d21f25e25..bafd40df0a 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Schedule.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Schedule.java
@@ -37,7 +37,7 @@ import com.baomidou.mybatisplus.annotation.TableName;
 public class Schedule {
 
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * process definition code
@@ -124,7 +124,6 @@ public class Schedule {
      */
     private int warningGroupId;
 
-
     /**
      * process instance priority
      */
@@ -263,7 +262,7 @@ public class Schedule {
         this.userName = userName;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskDefinition.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskDefinition.java
index 2db49f9e07..e2b81b53b5 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskDefinition.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskDefinition.java
@@ -21,12 +21,13 @@ import org.apache.dolphinscheduler.common.Constants;
 import org.apache.dolphinscheduler.common.enums.Flag;
 import org.apache.dolphinscheduler.common.enums.Priority;
 import org.apache.dolphinscheduler.common.enums.TaskExecuteType;
-import org.apache.dolphinscheduler.plugin.task.api.enums.TaskTimeoutStrategy;
 import org.apache.dolphinscheduler.common.enums.TimeoutFlag;
 import org.apache.dolphinscheduler.common.utils.JSONUtils;
+import org.apache.dolphinscheduler.plugin.task.api.enums.TaskTimeoutStrategy;
 import org.apache.dolphinscheduler.plugin.task.api.model.Property;
 
 import org.apache.commons.collections4.CollectionUtils;
+
 import java.util.Date;
 import java.util.HashMap;
 import java.util.List;
@@ -53,7 +54,7 @@ public class TaskDefinition {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * code
@@ -241,7 +242,7 @@ public class TaskDefinition {
         this.name = name;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -326,7 +327,7 @@ public class TaskDefinition {
         if (taskParamMap == null && !Strings.isNullOrEmpty(taskParams)) {
             JsonNode localParams = JSONUtils.parseObject(taskParams).findValue("localParams");
 
-            //If a jsonNode is null, not only use !=null, but also it should use the isNull method to be estimated.
+            // If a jsonNode is null, not only use !=null, but also it should use the isNull method to be estimated.
             if (localParams != null && !localParams.isNull()) {
                 List<Property> propList = JSONUtils.toList(localParams.toString(), Property.class);
 
@@ -504,27 +505,27 @@ public class TaskDefinition {
         }
         TaskDefinition that = (TaskDefinition) o;
         return failRetryTimes == that.failRetryTimes
-            && failRetryInterval == that.failRetryInterval
-            && timeout == that.timeout
-            && delayTime == that.delayTime
-            && Objects.equals(name, that.name)
-            && Objects.equals(description, that.description)
-            && Objects.equals(taskType, that.taskType)
-            && Objects.equals(taskParams, that.taskParams)
-            && flag == that.flag
-            && taskPriority == that.taskPriority
-            && Objects.equals(workerGroup, that.workerGroup)
-            && timeoutFlag == that.timeoutFlag
-            && timeoutNotifyStrategy == that.timeoutNotifyStrategy
-            && (Objects.equals(resourceIds, that.resourceIds)
-            || ("".equals(resourceIds) && that.resourceIds == null)
-            || ("".equals(that.resourceIds) && resourceIds == null))
-            && environmentCode == that.environmentCode
-            && taskGroupId == that.taskGroupId
-            && taskGroupPriority == that.taskGroupPriority
-            && Objects.equals(cpuQuota, that.cpuQuota)
-            && Objects.equals(memoryMax, that.memoryMax)
-            && Objects.equals(taskExecuteType, that.taskExecuteType);
+                && failRetryInterval == that.failRetryInterval
+                && timeout == that.timeout
+                && delayTime == that.delayTime
+                && Objects.equals(name, that.name)
+                && Objects.equals(description, that.description)
+                && Objects.equals(taskType, that.taskType)
+                && Objects.equals(taskParams, that.taskParams)
+                && flag == that.flag
+                && taskPriority == that.taskPriority
+                && Objects.equals(workerGroup, that.workerGroup)
+                && timeoutFlag == that.timeoutFlag
+                && timeoutNotifyStrategy == that.timeoutNotifyStrategy
+                && (Objects.equals(resourceIds, that.resourceIds)
+                        || ("".equals(resourceIds) && that.resourceIds == null)
+                        || ("".equals(that.resourceIds) && resourceIds == null))
+                && environmentCode == that.environmentCode
+                && taskGroupId == that.taskGroupId
+                && taskGroupPriority == that.taskGroupPriority
+                && Objects.equals(cpuQuota, that.cpuQuota)
+                && Objects.equals(memoryMax, that.memoryMax)
+                && Objects.equals(taskExecuteType, that.taskExecuteType);
     }
 
     @Override
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroup.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroup.java
index cbb3b3008c..34576d5974 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroup.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroup.java
@@ -29,6 +29,7 @@ import com.baomidou.mybatisplus.annotation.TableName;
  */
 @TableName("t_ds_task_group")
 public class TaskGroup implements Serializable {
+
     /**
      * key
      */
@@ -69,7 +70,7 @@ public class TaskGroup implements Serializable {
      */
     private long projectCode;
 
-    public TaskGroup(String name,long projectCode, String description, int groupSize, int userId,int status) {
+    public TaskGroup(String name, long projectCode, String description, int groupSize, int userId, int status) {
         this.name = name;
         this.projectCode = projectCode;
         this.description = description;
@@ -104,7 +105,7 @@ public class TaskGroup implements Serializable {
                 + '}';
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroupQueue.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroupQueue.java
index b9959eb96e..2f50ab345a 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroupQueue.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskGroupQueue.java
@@ -32,11 +32,12 @@ import com.baomidou.mybatisplus.annotation.TableName;
  */
 @TableName("t_ds_task_group_queue")
 public class TaskGroupQueue implements Serializable {
+
     /**
      * key
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * taskInstanceId
      */
@@ -99,7 +100,8 @@ public class TaskGroupQueue implements Serializable {
 
     }
 
-    public TaskGroupQueue(int taskId, String taskName, int groupId, int processId, int priority, TaskGroupQueueStatus status) {
+    public TaskGroupQueue(int taskId, String taskName, int groupId, int processId, int priority,
+                          TaskGroupQueueStatus status) {
         this.taskId = taskId;
         this.taskName = taskName;
         this.groupId = groupId;
@@ -108,7 +110,7 @@ public class TaskGroupQueue implements Serializable {
         this.status = status;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -175,16 +177,16 @@ public class TaskGroupQueue implements Serializable {
     @Override
     public String toString() {
         return "TaskGroupQueue{"
-            + "id=" + id
-            + ", taskId=" + taskId
-            + ", taskName='" + taskName + '\''
-            + ", groupId=" + groupId
-            + ", processId=" + processId
-            + ", priority=" + priority
-            + ", status=" + status
-            + ", createTime=" + createTime
-            + ", updateTime=" + updateTime
-            + '}';
+                + "id=" + id
+                + ", taskId=" + taskId
+                + ", taskName='" + taskName + '\''
+                + ", groupId=" + groupId
+                + ", processId=" + processId
+                + ", priority=" + priority
+                + ", status=" + status
+                + ", createTime=" + createTime
+                + ", updateTime=" + updateTime
+                + '}';
     }
 
     public TaskGroupQueueStatus getStatus() {
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskInstance.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskInstance.java
index 91d82df089..08ab6d11ac 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskInstance.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskInstance.java
@@ -38,14 +38,14 @@ import java.io.Serializable;
 import java.util.Date;
 import java.util.Map;
 
+import lombok.Data;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
 import com.baomidou.mybatisplus.annotation.TableName;
 import com.fasterxml.jackson.core.type.TypeReference;
 
-import lombok.Data;
-
 /**
  * task instance
  */
@@ -57,7 +57,7 @@ public class TaskInstance implements Serializable {
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     /**
      * task name
@@ -371,7 +371,6 @@ public class TaskInstance implements Serializable {
         return endTime == null;
     }
 
-
     /**
      * determine if a task instance can retry
      * if subProcess,
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Tenant.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Tenant.java
index d6a548151e..6892db0afb 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Tenant.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/Tenant.java
@@ -16,14 +16,14 @@
  */
 package org.apache.dolphinscheduler.dao.entity;
 
+import java.util.Date;
+import java.util.Objects;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
 import com.baomidou.mybatisplus.annotation.TableName;
 
-import java.util.Date;
-import java.util.Objects;
-
 /**
  * tenant
  */
@@ -33,8 +33,8 @@ public class Tenant {
     /**
      * id
      */
-    @TableId(value="id", type=IdType.AUTO)
-    private int id;
+    @TableId(value = "id", type = IdType.AUTO)
+    private Integer id;
 
     /**
      * tenant code
@@ -94,7 +94,7 @@ public class Tenant {
         this.updateTime = now;
     }
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/UdfFunc.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/UdfFunc.java
index 8eb4d20033..6615d3394f 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/UdfFunc.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/UdfFunc.java
@@ -17,6 +17,12 @@
 
 package org.apache.dolphinscheduler.dao.entity;
 
+import org.apache.dolphinscheduler.common.enums.UdfType;
+import org.apache.dolphinscheduler.common.utils.JSONUtils;
+
+import java.io.IOException;
+import java.util.Date;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
@@ -24,22 +30,18 @@ import com.baomidou.mybatisplus.annotation.TableName;
 import com.fasterxml.jackson.databind.DeserializationContext;
 import com.fasterxml.jackson.databind.KeyDeserializer;
 import com.google.common.base.Strings;
-import org.apache.dolphinscheduler.common.enums.UdfType;
-import org.apache.dolphinscheduler.common.utils.JSONUtils;
-
-import java.io.IOException;
-import java.util.Date;
 
 /**
  * udf function
  */
 @TableName("t_ds_udfs")
 public class UdfFunc {
+
     /**
      * id
      */
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
     /**
      * user id
      */
@@ -111,7 +113,7 @@ public class UdfFunc {
     @TableField(exist = false)
     private String userName;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -199,7 +201,6 @@ public class UdfFunc {
         this.createTime = createTime;
     }
 
-
     public Date getUpdateTime() {
         return updateTime;
     }
@@ -246,7 +247,7 @@ public class UdfFunc {
         return JSONUtils.toJsonString(this);
     }
 
-    public static  class UdfFuncDeserializer extends KeyDeserializer {
+    public static class UdfFuncDeserializer extends KeyDeserializer {
 
         @Override
         public Object deserializeKey(String key, DeserializationContext ctxt) throws IOException {
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/User.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/User.java
index c0e61fffd7..a4aaa1f6d0 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/User.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/User.java
@@ -18,24 +18,25 @@
 package org.apache.dolphinscheduler.dao.entity;
 
 import org.apache.dolphinscheduler.common.enums.UserType;
+
+import java.util.Date;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
 import com.baomidou.mybatisplus.annotation.TableName;
 
-import java.util.Date;
-
 /**
  * user
  */
 @TableName("t_ds_user")
-public class  User {
+public class User {
 
     /**
      * id
      */
-    @TableId(value="id", type=IdType.AUTO)
-    private int id;
+    @TableId(value = "id", type = IdType.AUTO)
+    private Integer id;
 
     /**
      * user name
@@ -110,7 +111,7 @@ public class  User {
      */
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -174,7 +175,6 @@ public class  User {
         this.updateTime = updateTime;
     }
 
-
     public String getPhone() {
         return phone;
     }
@@ -270,7 +270,7 @@ public class  User {
                 ", tenantCode='" + tenantCode + '\'' +
                 ", queueName='" + queueName + '\'' +
                 ", alertGroup='" + alertGroup + '\'' +
-                ", queue='" + queue + '\''  +
+                ", queue='" + queue + '\'' +
                 ", timeZone='" + timeZone + '\'' +
                 ", createTime=" + createTime +
                 ", updateTime=" + updateTime +
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerGroup.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerGroup.java
index 2fdd711549..c7c50ccb33 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerGroup.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerGroup.java
@@ -19,11 +19,12 @@ package org.apache.dolphinscheduler.dao.entity;
 
 import java.util.Date;
 
+import lombok.Data;
+
 import com.baomidou.mybatisplus.annotation.IdType;
 import com.baomidou.mybatisplus.annotation.TableField;
 import com.baomidou.mybatisplus.annotation.TableId;
 import com.baomidou.mybatisplus.annotation.TableName;
-import lombok.Data;
 
 /**
  * worker group
@@ -33,7 +34,7 @@ import lombok.Data;
 public class WorkerGroup {
 
     @TableId(value = "id", type = IdType.AUTO)
-    private int id;
+    private Integer id;
 
     private String name;
 
diff --git a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerServer.java b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerServer.java
index 3fbbf71f74..28fd5b3749 100644
--- a/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerServer.java
+++ b/dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/WorkerServer.java
@@ -36,7 +36,6 @@ public class WorkerServer {
      */
     private int port;
 
-
     /**
      * zookeeper directory
      */
@@ -57,7 +56,7 @@ public class WorkerServer {
      */
     private Date lastHeartbeatTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/CommandMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/CommandMapperTest.java
index 975a910bf1..bab5d103a9 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/CommandMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/CommandMapperTest.java
@@ -64,7 +64,7 @@ public class CommandMapperTest extends BaseDaoTest {
     @Test
     public void testInsert() {
         Command command = createCommand();
-        assertThat(command.getId(),greaterThan(0));
+        assertThat(command.getId(), greaterThan(0));
     }
 
     /**
@@ -73,7 +73,7 @@ public class CommandMapperTest extends BaseDaoTest {
     @Test
     public void testSelectById() {
         Command expectedCommand = createCommand();
-        //query
+        // query
         Command actualCommand = commandMapper.selectById(expectedCommand.getId());
 
         assertNotNull(actualCommand);
@@ -96,7 +96,7 @@ public class CommandMapperTest extends BaseDaoTest {
         Command actualCommand = commandMapper.selectById(expectedCommand.getId());
 
         assertNotNull(actualCommand);
-        assertEquals(expectedCommand.getUpdateTime(),actualCommand.getUpdateTime());
+        assertEquals(expectedCommand.getUpdateTime(), actualCommand.getUpdateTime());
 
     }
 
@@ -114,8 +114,6 @@ public class CommandMapperTest extends BaseDaoTest {
         assertNull(actualCommand);
     }
 
-
-
     /**
      * test query all
      */
@@ -140,7 +138,7 @@ public class CommandMapperTest extends BaseDaoTest {
 
         createCommand(CommandType.START_PROCESS, processDefinition.getCode());
 
-        List<Command> actualCommand = commandMapper.queryCommandPage(1,0);
+        List<Command> actualCommand = commandMapper.queryCommandPage(1, 0);
 
         assertNotNull(actualCommand);
     }
@@ -164,7 +162,7 @@ public class CommandMapperTest extends BaseDaoTest {
 
         List<CommandCount> actualCommandCounts = commandMapper.countCommandState(startTime, endTime, projectCodeArray);
 
-        assertThat(actualCommandCounts.size(),greaterThanOrEqualTo(1));
+        assertThat(actualCommandCounts.size(), greaterThanOrEqualTo(1));
     }
 
     /**
@@ -175,19 +173,19 @@ public class CommandMapperTest extends BaseDaoTest {
         int masterCount = 4;
         int thisMasterSlot = 2;
         // for hit or miss
-        toTestQueryCommandPageBySlot(masterCount,thisMasterSlot);
-        toTestQueryCommandPageBySlot(masterCount,thisMasterSlot);
-        toTestQueryCommandPageBySlot(masterCount,thisMasterSlot);
-        toTestQueryCommandPageBySlot(masterCount,thisMasterSlot);
+        toTestQueryCommandPageBySlot(masterCount, thisMasterSlot);
+        toTestQueryCommandPageBySlot(masterCount, thisMasterSlot);
+        toTestQueryCommandPageBySlot(masterCount, thisMasterSlot);
+        toTestQueryCommandPageBySlot(masterCount, thisMasterSlot);
     }
 
     private boolean toTestQueryCommandPageBySlot(int masterCount, int thisMasterSlot) {
         Command command = createCommand();
-        int id = command.getId();
+        Integer id = command.getId();
         boolean hit = id % masterCount == thisMasterSlot;
         List<Command> commandList = commandMapper.queryCommandPageBySlot(1, 0, masterCount, thisMasterSlot);
         if (hit) {
-            assertEquals(id,commandList.get(0).getId());
+            assertEquals(id, commandList.get(0).getId());
         } else {
             commandList.forEach(o -> {
                 assertNotEquals(id, o.getId());
@@ -197,8 +195,6 @@ public class CommandMapperTest extends BaseDaoTest {
         return hit;
     }
 
-
-
     /**
      * create command map
      * @param count map count
@@ -207,13 +203,13 @@ public class CommandMapperTest extends BaseDaoTest {
      * @return command map
      */
     private CommandCount createCommandMap(
-            Integer count,
-            CommandType commandType,
-            long processDefinitionCode) {
+                                          Integer count,
+                                          CommandType commandType,
+                                          long processDefinitionCode) {
 
         CommandCount commandCount = new CommandCount();
 
-        for (int i = 0;i < count;i++) {
+        for (int i = 0; i < count; i++) {
             createCommand(commandType, processDefinitionCode);
         }
         commandCount.setCommandType(commandType);
@@ -246,12 +242,12 @@ public class CommandMapperTest extends BaseDaoTest {
      * @param count map count
      * @return command map
      */
-    private Map<Integer,Command> createCommandMap(Integer count) {
-        Map<Integer,Command> commandMap = new HashMap<>();
+    private Map<Integer, Command> createCommandMap(Integer count) {
+        Map<Integer, Command> commandMap = new HashMap<>();
 
-        for (int i = 0;i < count;i++) {
+        for (int i = 0; i < count; i++) {
             Command command = createCommand();
-            commandMap.put(command.getId(),command);
+            commandMap.put(command.getId(), command);
         }
         return commandMap;
     }
@@ -261,7 +257,7 @@ public class CommandMapperTest extends BaseDaoTest {
      * @return
      */
     private Command createCommand() {
-        return createCommand(CommandType.START_PROCESS,1);
+        return createCommand(CommandType.START_PROCESS, 1);
     }
 
     /**
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/DataSourceMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/DataSourceMapperTest.java
index 949bf7bbd5..606cec6816 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/DataSourceMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/DataSourceMapperTest.java
@@ -18,7 +18,6 @@
 package org.apache.dolphinscheduler.dao.mapper;
 
 import static java.util.stream.Collectors.toList;
-
 import static org.hamcrest.Matchers.greaterThan;
 import static org.hamcrest.Matchers.greaterThanOrEqualTo;
 import static org.junit.Assert.assertEquals;
@@ -86,7 +85,6 @@ public class DataSourceMapperTest extends BaseDaoTest {
         assertEquals(expectedDataSource, actualDataSource);
     }
 
-
     /**
      * test query
      */
@@ -108,7 +106,6 @@ public class DataSourceMapperTest extends BaseDaoTest {
         assertEquals(expectedDataSource, actualDataSource);
     }
 
-
     /**
      * test delete
      */
@@ -123,8 +120,6 @@ public class DataSourceMapperTest extends BaseDaoTest {
         assertNull(actualDataSource);
     }
 
-
-
     /**
      * test query datasource by type
      */
@@ -142,7 +137,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
         for (DataSource actualDataSource : actualDataSources) {
             DataSource expectedDataSource = datasourceMap.get(actualDataSource.getId());
             if (expectedDataSource != null) {
-                assertEquals(expectedDataSource,actualDataSource);
+                assertEquals(expectedDataSource, actualDataSource);
             }
         }
 
@@ -166,7 +161,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
         for (DataSource actualDataSource : actualDataSources) {
             DataSource expectedDataSource = expectedDataSourceMap.get(actualDataSource.getId());
             if (expectedDataSource != null) {
-                assertEquals(expectedDataSource,actualDataSource);
+                assertEquals(expectedDataSource, actualDataSource);
             }
         }
 
@@ -184,7 +179,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
 
         for (DataSource actualDataSource : actualDataSources) {
             if (expectedDataSource.getId() == actualDataSource.getId()) {
-                assertEquals(expectedDataSource,actualDataSource);
+                assertEquals(expectedDataSource, actualDataSource);
             }
         }
 
@@ -205,7 +200,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
         for (DataSource actualDataSource : actualDataSources) {
             DataSource expectedDataSource = expectedDataSourceMap.get(actualDataSource.getId());
             if (expectedDataSource != null) {
-                assertEquals(expectedDataSource,actualDataSource);
+                assertEquals(expectedDataSource, actualDataSource);
             }
         }
 
@@ -226,7 +221,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
         for (DataSource actualDataSource : actualDataSources) {
             DataSource expectedDataSource = expectedDataSourceMap.get(actualDataSource.getId());
             if (expectedDataSource != null) {
-                assertEquals(expectedDataSource,actualDataSource);
+                assertEquals(expectedDataSource, actualDataSource);
             }
         }
     }
@@ -247,48 +242,51 @@ public class DataSourceMapperTest extends BaseDaoTest {
         for (DataSource actualDataSource : actualDataSources) {
             DataSource expectedDataSource = expectedDataSourceMap.get(actualDataSource.getId());
             if (expectedDataSource != null) {
-                assertEquals(expectedDataSource,actualDataSource);
+                assertEquals(expectedDataSource, actualDataSource);
             }
         }
     }
 
     @Test
     public void testListAuthorizedDataSource() {
-        //create general user
+        // create general user
         User generalUser1 = createGeneralUser("user1");
         User generalUser2 = createGeneralUser("user2");
 
-        //create data source
+        // create data source
         DataSource dataSource = createDataSource(generalUser1.getId(), "ds-1");
         DataSource unauthorizdDataSource = createDataSource(generalUser2.getId(), "ds-2");
 
-        //data source ids
+        // data source ids
         Integer[] dataSourceIds = new Integer[]{dataSource.getId(), unauthorizdDataSource.getId()};
 
-        List<DataSource> authorizedDataSource = dataSourceMapper.listAuthorizedDataSource(generalUser1.getId(), dataSourceIds);
+        List<DataSource> authorizedDataSource =
+                dataSourceMapper.listAuthorizedDataSource(generalUser1.getId(), dataSourceIds);
 
-        assertEquals(generalUser1.getId(), dataSource.getUserId());
-        Assert.assertNotEquals(generalUser1.getId(), unauthorizdDataSource.getUserId());
-        Assert.assertFalse(authorizedDataSource.stream().map(t -> t.getId()).collect(toList()).containsAll(Arrays.asList(dataSourceIds)));
+        assertEquals(generalUser1.getId().intValue(), dataSource.getUserId());
+        Assert.assertNotEquals(generalUser1.getId().intValue(), unauthorizdDataSource.getUserId());
+        Assert.assertFalse(authorizedDataSource.stream().map(t -> t.getId()).collect(toList())
+                .containsAll(Arrays.asList(dataSourceIds)));
 
-        //authorize object unauthorizdDataSource to generalUser1
+        // authorize object unauthorizdDataSource to generalUser1
         createUserDataSource(generalUser1, unauthorizdDataSource);
         authorizedDataSource = dataSourceMapper.listAuthorizedDataSource(generalUser1.getId(), dataSourceIds);
 
-        Assert.assertTrue(authorizedDataSource.stream().map(t -> t.getId()).collect(toList()).containsAll(Arrays.asList(dataSourceIds)));
+        Assert.assertTrue(authorizedDataSource.stream().map(t -> t.getId()).collect(toList())
+                .containsAll(Arrays.asList(dataSourceIds)));
     }
 
     /**
      * create datasource relation
      * @param userId
      */
-    private Map<Integer,DataSource> createDataSourceMap(Integer userId,String name) {
+    private Map<Integer, DataSource> createDataSourceMap(Integer userId, String name) {
 
-        Map<Integer,DataSource> dataSourceMap = new HashMap<>();
+        Map<Integer, DataSource> dataSourceMap = new HashMap<>();
 
         DataSource dataSource = createDataSource(userId, name);
 
-        dataSourceMap.put(dataSource.getId(),dataSource);
+        dataSourceMap.put(dataSource.getId(), dataSource);
 
         DataSource otherDataSource = createDataSource(userId + 1, name + "1");
 
@@ -312,12 +310,12 @@ public class DataSourceMapperTest extends BaseDaoTest {
      * @param count datasource count
      * @return datasource map
      */
-    private Map<Integer,DataSource> createDataSourceMap(Integer count) {
-        Map<Integer,DataSource> dataSourceMap = new HashMap<>();
+    private Map<Integer, DataSource> createDataSourceMap(Integer count) {
+        Map<Integer, DataSource> dataSourceMap = new HashMap<>();
 
         for (int i = 0; i < count; i++) {
             DataSource dataSource = createDataSource("test");
-            dataSourceMap.put(dataSource.getId(),dataSource);
+            dataSourceMap.put(dataSource.getId(), dataSource);
         }
 
         return dataSourceMap;
@@ -328,7 +326,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
      * @return datasource
      */
     private DataSource createDataSource() {
-        return createDataSource(1,"test");
+        return createDataSource(1, "test");
     }
 
     /**
@@ -337,7 +335,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
      * @return datasource
      */
     private DataSource createDataSource(String name) {
-        return createDataSource(1,name);
+        return createDataSource(1, name);
     }
 
     /**
@@ -346,7 +344,7 @@ public class DataSourceMapperTest extends BaseDaoTest {
      * @param name name
      * @return datasource
      */
-    private DataSource createDataSource(Integer userId,String name) {
+    private DataSource createDataSource(Integer userId, String name) {
         Random random = new Random();
         DataSource dataSource = new DataSource();
         dataSource.setUserId(userId);
@@ -399,4 +397,4 @@ public class DataSourceMapperTest extends BaseDaoTest {
         return datasourceUser;
     }
 
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ProcessDefinitionLogMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ProcessDefinitionLogMapperTest.java
index 405bc8628b..1c2c08889a 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ProcessDefinitionLogMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ProcessDefinitionLogMapperTest.java
@@ -34,6 +34,7 @@ import com.baomidou.mybatisplus.core.metadata.IPage;
 import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
 
 public class ProcessDefinitionLogMapperTest extends BaseDaoTest {
+
     @Autowired
     private UserMapper userMapper;
 
@@ -49,7 +50,7 @@ public class ProcessDefinitionLogMapperTest extends BaseDaoTest {
      * @return ProcessDefinition
      */
     private ProcessDefinitionLog insertOne() {
-        //insertOne
+        // insertOne
         ProcessDefinitionLog processDefinitionLog = new ProcessDefinitionLog();
         processDefinitionLog.setCode(1L);
         processDefinitionLog.setName("def 1");
@@ -68,7 +69,7 @@ public class ProcessDefinitionLogMapperTest extends BaseDaoTest {
      * @return ProcessDefinition
      */
     private ProcessDefinitionLog insertTwo() {
-        //insertOne
+        // insertOne
         ProcessDefinitionLog processDefinitionLog = new ProcessDefinitionLog();
         processDefinitionLog.setCode(1L);
         processDefinitionLog.setName("def 2");
@@ -85,7 +86,7 @@ public class ProcessDefinitionLogMapperTest extends BaseDaoTest {
     @Test
     public void testInsert() {
         ProcessDefinitionLog processDefinitionLog = insertOne();
-        Assert.assertNotEquals(processDefinitionLog.getId(), 0);
+        Assert.assertNotEquals(processDefinitionLog.getId().intValue(), 0);
     }
 
     @Test
@@ -142,7 +143,8 @@ public class ProcessDefinitionLogMapperTest extends BaseDaoTest {
     public void testQueryProcessDefinitionVersionsPaging() {
         insertOne();
         Page<ProcessDefinitionLog> page = new Page(1, 3);
-        IPage<ProcessDefinitionLog> processDefinitionLogs = processDefinitionLogMapper.queryProcessDefinitionVersionsPaging(page, 1L,1L);
+        IPage<ProcessDefinitionLog> processDefinitionLogs =
+                processDefinitionLogMapper.queryProcessDefinitionVersionsPaging(page, 1L, 1L);
         Assert.assertNotEquals(processDefinitionLogs.getTotal(), 0);
     }
 
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ResourceMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ResourceMapperTest.java
index d78721587d..5698bffc72 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ResourceMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/ResourceMapperTest.java
@@ -23,13 +23,13 @@ import static org.junit.Assert.assertNotNull;
 import static org.junit.Assert.assertThat;
 
 import org.apache.dolphinscheduler.common.Constants;
-import org.apache.dolphinscheduler.spi.enums.ResourceType;
 import org.apache.dolphinscheduler.common.enums.UserType;
 import org.apache.dolphinscheduler.dao.BaseDaoTest;
 import org.apache.dolphinscheduler.dao.entity.Resource;
 import org.apache.dolphinscheduler.dao.entity.ResourcesUser;
 import org.apache.dolphinscheduler.dao.entity.Tenant;
 import org.apache.dolphinscheduler.dao.entity.User;
+import org.apache.dolphinscheduler.spi.enums.ResourceType;
 
 import org.apache.commons.collections.CollectionUtils;
 
@@ -65,7 +65,7 @@ public class ResourceMapperTest extends BaseDaoTest {
      * @return Resource
      */
     private Resource insertOne() {
-        //insertOne
+        // insertOne
         Resource resource = new Resource();
         resource.setAlias("ut-resource");
         resource.setFullName("/ut-resource");
@@ -86,8 +86,9 @@ public class ResourceMapperTest extends BaseDaoTest {
      * @param user user
      * @return Resource
      */
-    private Resource createResource(User user, boolean isDirectory, ResourceType resourceType, int pid, String alias, String fullName) {
-        //insertOne
+    private Resource createResource(User user, boolean isDirectory, ResourceType resourceType, int pid, String alias,
+                                    String fullName) {
+        // insertOne
         Resource resource = new Resource();
         resource.setDirectory(isDirectory);
         resource.setType(resourceType);
@@ -108,7 +109,7 @@ public class ResourceMapperTest extends BaseDaoTest {
      * @return Resource
      */
     private Resource createResource(User user) {
-        //insertOne
+        // insertOne
         String alias = String.format("ut-resource-%s", user.getUserName());
         String fullName = String.format("/%s", alias);
 
@@ -144,7 +145,7 @@ public class ResourceMapperTest extends BaseDaoTest {
      * @return ResourcesUser
      */
     private ResourcesUser createResourcesUser(Resource resource, User user) {
-        //insertOne
+        // insertOne
         ResourcesUser resourcesUser = new ResourcesUser();
         resourcesUser.setCreateTime(new Date());
         resourcesUser.setUpdateTime(new Date());
@@ -167,10 +168,10 @@ public class ResourceMapperTest extends BaseDaoTest {
      */
     @Test
     public void testUpdate() {
-        //insertOne
+        // insertOne
         Resource resource = insertOne();
         resource.setCreateTime(new Date());
-        //update
+        // update
         int update = resourceMapper.updateById(resource);
         Assert.assertEquals(1, update);
     }
@@ -191,7 +192,7 @@ public class ResourceMapperTest extends BaseDaoTest {
     @Test
     public void testQuery() {
         Resource resource = insertOne();
-        //query
+        // query
         List<Resource> resources = resourceMapper.selectList(null);
         Assert.assertNotEquals(resources.size(), 0);
     }
@@ -210,8 +211,7 @@ public class ResourceMapperTest extends BaseDaoTest {
         List<Resource> resources = resourceMapper.queryResourceList(
                 alias,
                 userId,
-                type
-        );
+                type);
 
         Assert.assertNotEquals(resources.size(), 0);
     }
@@ -246,15 +246,13 @@ public class ResourceMapperTest extends BaseDaoTest {
                 -1,
                 resource.getType().ordinal(),
                 "",
-                new ArrayList<>(resource.getId())
-        );
+                new ArrayList<>(resource.getId()));
         IPage<Resource> resourceIPage1 = resourceMapper.queryResourcePaging(
                 page,
                 -1,
                 resource.getType().ordinal(),
                 "",
-                null
-        );
+                null);
         Assert.assertEquals(resourceIPage.getTotal(), 1);
         Assert.assertEquals(resourceIPage1.getTotal(), 1);
 
@@ -267,8 +265,10 @@ public class ResourceMapperTest extends BaseDaoTest {
     public void testQueryResourceListAuthored() {
         Resource resource = insertOne();
 
-        List<Integer> resIds = resourceUserMapper.queryResourcesIdListByUserIdAndPerm(resource.getUserId(), Constants.AUTHORIZE_WRITABLE_PERM);
-        List<Resource> resources = CollectionUtils.isEmpty(resIds) ? new ArrayList<>() : resourceMapper.queryResourceListById(resIds);
+        List<Integer> resIds = resourceUserMapper.queryResourcesIdListByUserIdAndPerm(resource.getUserId(),
+                Constants.AUTHORIZE_WRITABLE_PERM);
+        List<Resource> resources =
+                CollectionUtils.isEmpty(resIds) ? new ArrayList<>() : resourceMapper.queryResourceListById(resIds);
 
         ResourcesUser resourcesUser = new ResourcesUser();
 
@@ -277,8 +277,10 @@ public class ResourceMapperTest extends BaseDaoTest {
         resourcesUser.setPerm(Constants.AUTHORIZE_WRITABLE_PERM);
         resourceUserMapper.insert(resourcesUser);
 
-        List<Integer> resIds1 = resourceUserMapper.queryResourcesIdListByUserIdAndPerm(1110, Constants.AUTHORIZE_WRITABLE_PERM);
-        List<Resource> resources1 = CollectionUtils.isEmpty(resIds1) ? new ArrayList<>() : resourceMapper.queryResourceListById(resIds1);
+        List<Integer> resIds1 =
+                resourceUserMapper.queryResourcesIdListByUserIdAndPerm(1110, Constants.AUTHORIZE_WRITABLE_PERM);
+        List<Resource> resources1 =
+                CollectionUtils.isEmpty(resIds1) ? new ArrayList<>() : resourceMapper.queryResourceListById(resIds1);
 
         Assert.assertEquals(0, resources.size());
         Assert.assertNotEquals(0, resources1.size());
@@ -292,8 +294,10 @@ public class ResourceMapperTest extends BaseDaoTest {
     public void testQueryAuthorizedResourceList() {
         Resource resource = insertOne();
 
-        List<Integer> resIds = resourceUserMapper.queryResourcesIdListByUserIdAndPerm(resource.getUserId(), Constants.AUTHORIZE_WRITABLE_PERM);
-        List<Resource> resources = CollectionUtils.isEmpty(resIds) ? new ArrayList<>() : resourceMapper.queryResourceListById(resIds);
+        List<Integer> resIds = resourceUserMapper.queryResourcesIdListByUserIdAndPerm(resource.getUserId(),
+                Constants.AUTHORIZE_WRITABLE_PERM);
+        List<Resource> resources =
+                CollectionUtils.isEmpty(resIds) ? new ArrayList<>() : resourceMapper.queryResourceListById(resIds);
 
         resourceMapper.deleteById(resource.getId());
         Assert.assertEquals(0, resources.size());
@@ -306,8 +310,7 @@ public class ResourceMapperTest extends BaseDaoTest {
     public void testQueryResourceExceptUserId() {
         Resource resource = insertOne();
         List<Resource> resources = resourceMapper.queryResourceExceptUserId(
-                11111
-        );
+                11111);
         Assert.assertNotEquals(resources.size(), 0);
     }
 
@@ -365,13 +368,15 @@ public class ResourceMapperTest extends BaseDaoTest {
 
         List<Resource> resources = resourceMapper.listAuthorizedResource(generalUser2.getId(), resNames);
 
-        Assert.assertEquals(generalUser2.getId(), resource.getUserId());
-        Assert.assertFalse(resources.stream().map(t -> t.getFullName()).collect(toList()).containsAll(Arrays.asList(resNames)));
+        Assert.assertEquals(generalUser2.getId().intValue(), resource.getUserId());
+        Assert.assertFalse(
+                resources.stream().map(t -> t.getFullName()).collect(toList()).containsAll(Arrays.asList(resNames)));
 
         // authorize object unauthorizedResource to generalUser
         createResourcesUser(unauthorizedResource, generalUser2);
         List<Resource> authorizedResources = resourceMapper.listAuthorizedResource(generalUser2.getId(), resNames);
-        Assert.assertTrue(authorizedResources.stream().map(t -> t.getFullName()).collect(toList()).containsAll(Arrays.asList(resource.getFullName())));
+        Assert.assertTrue(authorizedResources.stream().map(t -> t.getFullName()).collect(toList())
+                .containsAll(Arrays.asList(resource.getFullName())));
 
     }
 
@@ -400,7 +405,8 @@ public class ResourceMapperTest extends BaseDaoTest {
         Resource resource = createResource(generalUser1);
         createResourcesUser(resource, generalUser2);
 
-        List<Resource> resourceList = resourceMapper.queryResourceListAuthored(generalUser2.getId(), ResourceType.FILE.ordinal());
+        List<Resource> resourceList =
+                resourceMapper.queryResourceListAuthored(generalUser2.getId(), ResourceType.FILE.ordinal());
         Assert.assertNotNull(resourceList);
 
         resourceList = resourceMapper.queryResourceListAuthored(generalUser2.getId(), ResourceType.FILE.ordinal());
@@ -435,4 +441,3 @@ public class ResourceMapperTest extends BaseDaoTest {
         Assert.assertTrue(resourceMapper.existResource(fullName, type));
     }
 }
-
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionLogMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionLogMapperTest.java
index f61813be3a..f0307df57e 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionLogMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionLogMapperTest.java
@@ -57,7 +57,7 @@ public class TaskDefinitionLogMapperTest extends BaseDaoTest {
     @Test
     public void testInsert() {
         TaskDefinitionLog taskDefinitionLog = insertOne();
-        Assert.assertNotEquals(taskDefinitionLog.getId(), 0);
+        Assert.assertNotEquals(taskDefinitionLog.getId().intValue(), 0);
     }
 
     @Test
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionMapperTest.java
index 3d16e2e44d..4efe97ecce 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/TaskDefinitionMapperTest.java
@@ -85,15 +85,15 @@ public class TaskDefinitionMapperTest extends BaseDaoTest {
     @Test
     public void testInsert() {
         TaskDefinition taskDefinition = insertOne();
-        Assert.assertNotEquals(taskDefinition.getId(), 0);
+        Assert.assertNotEquals(taskDefinition.getId().intValue(), 0);
     }
 
     @Test
     public void testQueryByDefinitionName() {
         TaskDefinition taskDefinition = insertOne();
         ProcessTaskRelation processTaskRelation = insertTaskRelation(taskDefinition.getCode());
-        TaskDefinition result = taskDefinitionMapper.queryByName(taskDefinition.getProjectCode(), processTaskRelation.getProcessDefinitionCode()
-                , taskDefinition.getName());
+        TaskDefinition result = taskDefinitionMapper.queryByName(taskDefinition.getProjectCode(),
+                processTaskRelation.getProcessDefinitionCode(), taskDefinition.getName());
 
         Assert.assertNotNull(result);
     }
@@ -109,7 +109,8 @@ public class TaskDefinitionMapperTest extends BaseDaoTest {
     @Test
     public void testQueryAllDefinitionList() {
         TaskDefinition taskDefinition = insertOne();
-        List<TaskDefinition> taskDefinitions = taskDefinitionMapper.queryAllDefinitionList(taskDefinition.getProjectCode());
+        List<TaskDefinition> taskDefinitions =
+                taskDefinitionMapper.queryAllDefinitionList(taskDefinition.getProjectCode());
         Assert.assertNotEquals(taskDefinitions.size(), 0);
 
     }
@@ -122,7 +123,8 @@ public class TaskDefinitionMapperTest extends BaseDaoTest {
         User un = userMapper.queryByUserNameAccurately("un");
         TaskDefinition taskDefinition = insertOne(un.getId());
 
-        List<DefinitionGroupByUser> users = taskDefinitionMapper.countDefinitionGroupByUser(new Long[]{taskDefinition.getProjectCode()});
+        List<DefinitionGroupByUser> users =
+                taskDefinitionMapper.countDefinitionGroupByUser(new Long[]{taskDefinition.getProjectCode()});
         Assert.assertNotEquals(users.size(), 0);
 
     }
@@ -158,7 +160,8 @@ public class TaskDefinitionMapperTest extends BaseDaoTest {
 
     @Test
     public void testNullPropertyValueOfLocalParams() {
-        String definitionJson = "{\"failRetryTimes\":\"0\",\"timeoutNotifyStrategy\":\"\",\"code\":\"5195043558720\",\"flag\":\"YES\",\"environmentCode\":\"-1\",\"taskDefinitionIndex\":2,\"taskPriority\":\"MEDIUM\",\"taskParams\":\"{\\\"preStatements\\\":null,\\\"postStatements\\\":null,\\\"type\\\":\\\"ADB_MYSQL\\\",\\\"database\\\":\\\"lijia\\\",\\\"sql\\\":\\\"create table nation_${random_serial_number} as select * from nation\\\",\\\"localParams\\\":[{\\\"direct\\\":2,\\\"type\\\":3, [...]
+        String definitionJson =
+                "{\"failRetryTimes\":\"0\",\"timeoutNotifyStrategy\":\"\",\"code\":\"5195043558720\",\"flag\":\"YES\",\"environmentCode\":\"-1\",\"taskDefinitionIndex\":2,\"taskPriority\":\"MEDIUM\",\"taskParams\":\"{\\\"preStatements\\\":null,\\\"postStatements\\\":null,\\\"type\\\":\\\"ADB_MYSQL\\\",\\\"database\\\":\\\"lijia\\\",\\\"sql\\\":\\\"create table nation_${random_serial_number} as select * from nation\\\",\\\"localParams\\\":[{\\\"direct\\\":2,\\\"type\\\":3,\\\"prop\\\":\\\ [...]
         TaskDefinition definition = JSONUtils.parseObject(definitionJson, TaskDefinition.class);
 
         Map<String, String> taskParamsMap = definition.getTaskParamMap();
@@ -174,7 +177,8 @@ public class TaskDefinitionMapperTest extends BaseDaoTest {
 
     @Test
     public void testNullLocalParamsOfTaskParams() {
-        String definitionJson = "{\"failRetryTimes\":\"0\",\"timeoutNotifyStrategy\":\"\",\"code\":\"5195043558720\",\"flag\":\"YES\",\"environmentCode\":\"-1\",\"taskDefinitionIndex\":2,\"taskPriority\":\"MEDIUM\",\"taskParams\":\"{\\\"preStatements\\\":null,\\\"postStatements\\\":null,\\\"type\\\":\\\"ADB_MYSQL\\\",\\\"database\\\":\\\"lijia\\\",\\\"sql\\\":\\\"create table nation_${random_serial_number} as select * from nation\\\",\\\"localParams\\\":null,\\\"Name\\\":\\\"create_table [...]
+        String definitionJson =
+                "{\"failRetryTimes\":\"0\",\"timeoutNotifyStrategy\":\"\",\"code\":\"5195043558720\",\"flag\":\"YES\",\"environmentCode\":\"-1\",\"taskDefinitionIndex\":2,\"taskPriority\":\"MEDIUM\",\"taskParams\":\"{\\\"preStatements\\\":null,\\\"postStatements\\\":null,\\\"type\\\":\\\"ADB_MYSQL\\\",\\\"database\\\":\\\"lijia\\\",\\\"sql\\\":\\\"create table nation_${random_serial_number} as select * from nation\\\",\\\"localParams\\\":null,\\\"Name\\\":\\\"create_table_as_select_natio [...]
         TaskDefinition definition = JSONUtils.parseObject(definitionJson, TaskDefinition.class);
 
         Assert.assertNull("Serialize the task definition success", definition.getTaskParamMap());
diff --git a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/UdfFuncMapperTest.java b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/UdfFuncMapperTest.java
index 1adbe689ae..30fc721fe0 100644
--- a/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/UdfFuncMapperTest.java
+++ b/dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/mapper/UdfFuncMapperTest.java
@@ -163,13 +163,13 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testUpdate() {
-        //insertOne
+        // insertOne
         UdfFunc udfFunc = insertOne("func1");
         udfFunc.setResourceName("dolphin_resource_update");
         udfFunc.setResourceId(2);
         udfFunc.setClassName("org.apache.dolphinscheduler.test.mrUpdate");
         udfFunc.setUpdateTime(new Date());
-        //update
+        // update
         int update = udfFuncMapper.updateById(udfFunc);
         Assert.assertEquals(update, 1);
 
@@ -180,9 +180,9 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testDelete() {
-        //insertOne
+        // insertOne
         UdfFunc udfFunc = insertOne("func2");
-        //delete
+        // delete
         int delete = udfFuncMapper.deleteById(udfFunc.getId());
         Assert.assertEquals(delete, 1);
     }
@@ -192,12 +192,12 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testQueryUdfByIdStr() {
-        //insertOne
+        // insertOne
         UdfFunc udfFunc = insertOne("func3");
-        //insertOne
+        // insertOne
         UdfFunc udfFunc1 = insertOne("func4");
         Integer[] idArray = new Integer[]{udfFunc.getId(), udfFunc1.getId()};
-        //queryUdfByIdStr
+        // queryUdfByIdStr
         List<UdfFunc> udfFuncList = udfFuncMapper.queryUdfByIdStr(idArray, "");
         Assert.assertNotEquals(udfFuncList.size(), 0);
     }
@@ -207,14 +207,15 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testQueryUdfFuncPaging() {
-        //insertOneUser
+        // insertOneUser
         User user = insertOneUser();
-        //insertOne
+        // insertOne
         UdfFunc udfFunc = insertOne(user);
-        //queryUdfFuncPaging
+        // queryUdfFuncPaging
         Page<UdfFunc> page = new Page(1, 3);
 
-        IPage<UdfFunc> udfFuncIPage = udfFuncMapper.queryUdfFuncPaging(page, Collections.singletonList(udfFunc.getId()), "");
+        IPage<UdfFunc> udfFuncIPage =
+                udfFuncMapper.queryUdfFuncPaging(page, Collections.singletonList(udfFunc.getId()), "");
         Assert.assertNotEquals(udfFuncIPage.getTotal(), 0);
 
     }
@@ -224,12 +225,13 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testGetUdfFuncByType() {
-        //insertOneUser
+        // insertOneUser
         User user = insertOneUser();
-        //insertOne
+        // insertOne
         UdfFunc udfFunc = insertOne(user);
-        //getUdfFuncByType
-        List<UdfFunc> udfFuncList = udfFuncMapper.getUdfFuncByType(Collections.singletonList(udfFunc.getId()), udfFunc.getType().ordinal());
+        // getUdfFuncByType
+        List<UdfFunc> udfFuncList =
+                udfFuncMapper.getUdfFuncByType(Collections.singletonList(udfFunc.getId()), udfFunc.getType().ordinal());
         Assert.assertNotEquals(udfFuncList.size(), 0);
 
     }
@@ -239,10 +241,10 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testQueryUdfFuncExceptUserId() {
-        //insertOneUser
+        // insertOneUser
         User user1 = insertOneUser();
         User user2 = insertOneUser("user2");
-        //insertOne
+        // insertOne
         UdfFunc udfFunc1 = insertOne(user1);
         UdfFunc udfFunc2 = insertOne(user2);
         List<UdfFunc> udfFuncList = udfFuncMapper.queryUdfFuncExceptUserId(user1.getId());
@@ -255,48 +257,49 @@ public class UdfFuncMapperTest extends BaseDaoTest {
      */
     @Test
     public void testQueryAuthedUdfFunc() {
-        //insertOneUser
+        // insertOneUser
         User user = insertOneUser();
 
-        //insertOne
+        // insertOne
         UdfFunc udfFunc = insertOne(user);
 
-        //insertOneUDFUser
+        // insertOneUDFUser
         UDFUser udfUser = insertOneUDFUser(user, udfFunc);
-        //queryAuthedUdfFunc
+        // queryAuthedUdfFunc
         List<UdfFunc> udfFuncList = udfFuncMapper.queryAuthedUdfFunc(user.getId());
         Assert.assertNotEquals(udfFuncList.size(), 0);
     }
 
     @Test
     public void testListAuthorizedUdfFunc() {
-        //create general user
+        // create general user
         User generalUser1 = createGeneralUser("user1");
         User generalUser2 = createGeneralUser("user2");
 
-        //create udf function
+        // create udf function
         UdfFunc udfFunc = insertOne(generalUser1);
         UdfFunc unauthorizdUdfFunc = insertOne(generalUser2);
 
-        //udf function ids
+        // udf function ids
         Integer[] udfFuncIds = new Integer[]{udfFunc.getId(), unauthorizdUdfFunc.getId()};
 
         List<UdfFunc> authorizedUdfFunc = udfFuncMapper.listAuthorizedUdfFunc(generalUser1.getId(), udfFuncIds);
 
-        Assert.assertEquals(generalUser1.getId(), udfFunc.getUserId());
-        Assert.assertNotEquals(generalUser1.getId(), unauthorizdUdfFunc.getUserId());
-        Assert.assertFalse(authorizedUdfFunc.stream().map(t -> t.getId()).collect(toList()).containsAll(Arrays.asList(udfFuncIds)));
+        Assert.assertEquals(generalUser1.getId().intValue(), udfFunc.getUserId());
+        Assert.assertNotEquals(generalUser1.getId().intValue(), unauthorizdUdfFunc.getUserId());
+        Assert.assertFalse(authorizedUdfFunc.stream().map(t -> t.getId()).collect(toList())
+                .containsAll(Arrays.asList(udfFuncIds)));
 
-
-        //authorize object unauthorizdUdfFunc to generalUser1
+        // authorize object unauthorizdUdfFunc to generalUser1
         insertOneUDFUser(generalUser1, unauthorizdUdfFunc);
         authorizedUdfFunc = udfFuncMapper.listAuthorizedUdfFunc(generalUser1.getId(), udfFuncIds);
-        Assert.assertTrue(authorizedUdfFunc.stream().map(t -> t.getId()).collect(toList()).containsAll(Arrays.asList(udfFuncIds)));
+        Assert.assertTrue(authorizedUdfFunc.stream().map(t -> t.getId()).collect(toList())
+                .containsAll(Arrays.asList(udfFuncIds)));
     }
 
     @Test
     public void batchUpdateUdfFuncTest() {
-        //create general user
+        // create general user
         User generalUser1 = createGeneralUser("user1");
         UdfFunc udfFunc = insertOne(generalUser1);
         udfFunc.setResourceName("/updateTest");
diff --git a/dolphinscheduler-dist/release-docs/LICENSE b/dolphinscheduler-dist/release-docs/LICENSE
index 4b4ea79e77..54c59bccda 100644
--- a/dolphinscheduler-dist/release-docs/LICENSE
+++ b/dolphinscheduler-dist/release-docs/LICENSE
@@ -218,10 +218,11 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     accessors-smart 2.4.8: https://github.com/netplex/json-smart-v2, Apache 2.0
     apacheds-i18n 2.0.0-M15: https://mvnrepository.com/artifact/org.apache.directory.server/apacheds-i18n/2.0.0-M15, Apache 2.0
     apacheds-kerberos-codec 2.0.0-M15: https://mvnrepository.com/artifact/org.apache.directory.server/apacheds-kerberos-codec/2.0.0-M15, Apache 2.0
+    aircompressor 0.3: https://mvnrepository.com/artifact/io.airlift/aircompressor, Apache 2.0
     tomcat-embed-el 9.0.65: https://mvnrepository.com/artifact/org.apache.tomcat.embed/tomcat-embed-el/9.0.65, Apache 2.0
     api-asn1-api 1.0.0-M20: https://mvnrepository.com/artifact/org.apache.directory.api/api-asn1-api/1.0.0-M20, Apache 2.0
     api-util 1.0.0-M20: https://mvnrepository.com/artifact/org.apache.directory.api/api-util/1.0.0-M20, Apache 2.0
-    audience-annotations 0.5.0: https://mvnrepository.com/artifact/org.apache.yetus/audience-annotations/0.5.0, Apache 2.0
+    audience-annotations 0.12.0: https://mvnrepository.com/artifact/org.apache.yetus/audience-annotations/0.12.0, Apache 2.0
     avro 1.7.4: https://github.com/apache/avro, Apache 2.0
     bonecp 0.8.0.RELEASE: https://github.com/wwadge/bonecp, Apache 2.0
     byte-buddy 1.9.16: https://mvnrepository.com/artifact/net.bytebuddy/byte-buddy/1.9.16, Apache 2.0
@@ -243,43 +244,43 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     commons-math3 3.1.1: https://mvnrepository.com/artifact/org.apache.commons/commons-math3/3.1.1, Apache 2.0
     commons-net 3.1: https://github.com/apache/commons-net, Apache 2.0
     commons-pool 1.6: https://github.com/apache/commons-pool, Apache 2.0
-    cron-utils 9.1.3: https://mvnrepository.com/artifact/com.cronutils/cron-utils/9.1.3, Apache 2.0
+    cron-utils 9.1.6: https://mvnrepository.com/artifact/com.cronutils/cron-utils/9.1.6, Apache 2.0
     commons-lang3 3.12.0: https://mvnrepository.com/artifact/org.apache.commons/commons-lang3/3.12.0, Apache 2.0
     curator-client 4.3.0: https://mvnrepository.com/artifact/org.apache.curator/curator-client/4.3.0, Apache 2.0
     curator-framework 4.3.0: https://mvnrepository.com/artifact/org.apache.curator/curator-framework/4.3.0, Apache 2.0
     curator-recipes 4.3.0: https://mvnrepository.com/artifact/org.apache.curator/curator-recipes/4.3.0, Apache 2.0
     curator-test 2.12.0: https://mvnrepository.com/artifact/org.apache.curator/curator-test/2.12.0, Apache 2.0
-    datanucleus-api-jdo 4.2.1: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-api-jdo/4.2.1, Apache 2.0
-    datanucleus-core 4.1.6: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-core/4.1.6, Apache 2.0
-    datanucleus-rdbms 4.1.7: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-rdbms/4.1.7, Apache 2.0
+    datanucleus-api-jdo 4.2.4: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-api-jdo/4.2.4, Apache 2.0
+    datanucleus-core 4.1.17: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-core/4.1.17, Apache 2.0
+    datanucleus-rdbms 4.1.19: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-rdbms/4.1.19, Apache 2.0
     derby 10.14.2.0: https://github.com/apache/derby, Apache 2.0
     druid 1.1.14: https://mvnrepository.com/artifact/com.alibaba/druid/1.1.14, Apache 2.0
+    metrics-core 4.2.11: https://mvnrepository.com/artifact/io.dropwizard.metrics/metrics-core, Apache 2.0
     error_prone_annotations 2.1.3 https://mvnrepository.com/artifact/com.google.errorprone/error_prone_annotations/2.1.3, Apache 2.0
     gson 2.9.1: https://github.com/google/gson, Apache 2.0
     guava 24.1-jre: https://mvnrepository.com/artifact/com.google.guava/guava/24.1-jre, Apache 2.0
     guava-retrying 2.0.0: https://mvnrepository.com/artifact/com.github.rholder/guava-retrying/2.0.0, Apache 2.0
-    hadoop-annotations 2.7.3:https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-annotations/2.7.3, Apache 2.0
-    hadoop-auth 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-auth/2.7.3, Apache 2.0
-    hadoop-client 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client/2.7.3, Apache 2.0
-    hadoop-common 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common/2.7.3, Apache 2.0
-    hadoop-hdfs 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs/2.7.3, Apache 2.0
-    hadoop-mapreduce-client-app 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-app/2.7.3, Apache 2.0
-    hadoop-mapreduce-client-common 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-common/2.7.3, Apache 2.0
-    hadoop-mapreduce-client-core 2.7.3: https://mvnrepository.com/artifact/io.hops/hadoop-mapreduce-client-core/2.7.3, Apache 2.0
-    hadoop-mapreduce-client-jobclient 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-jobclient/2.7.3, Apache 2.0
-    hadoop-yarn-api 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-api/2.7.3, Apache 2.0
-    hadoop-yarn-client 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-client/2.7.3, Apache 2.0
-    hadoop-yarn-common 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-common/2.7.3, Apache 2.0
-    hadoop-yarn-server-common 2.7.3: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-server-common/2.7.3, Apache 2.0
+    hadoop-annotations 2.7.7:https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-annotations/2.7.7, Apache 2.0
+    hadoop-auth 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-auth/2.7.7, Apache 2.0
+    hadoop-client 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client/2.7.7, Apache 2.0
+    hadoop-common 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common/2.7.7, Apache 2.0
+    hadoop-hdfs 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs/2.7.7, Apache 2.0
+    hadoop-mapreduce-client-app 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-app/2.7.7, Apache 2.0
+    hadoop-mapreduce-client-common 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-common/2.7.7, Apache 2.0
+    hadoop-mapreduce-client-core 2.7.7: https://mvnrepository.com/artifact/io.hops/hadoop-mapreduce-client-core/2.7.7, Apache 2.0
+    hadoop-mapreduce-client-jobclient 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-jobclient/2.7.7, Apache 2.0
+    hadoop-yarn-api 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-api/2.7.7, Apache 2.0
+    hadoop-yarn-client 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-client/2.7.7, Apache 2.0
+    hadoop-yarn-common 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-common/2.7.7, Apache 2.0
+    hadoop-yarn-server-common 2.7.7: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-server-common/2.7.7, Apache 2.0
     HikariCP 4.0.3: https://mvnrepository.com/artifact/com.zaxxer/HikariCP/4.0.3, Apache 2.0
-    hive-common 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-common/2.1.0, Apache 2.0
-    hive-jdbc 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc/2.1.0, Apache 2.0
-    hive-metastore 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-metastore/2.1.0, Apache 2.0
-    hive-orc 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-orc/2.1.0, Apache 2.0
-    hive-serde 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-serde/2.1.0, Apache 2.0
-    hive-service 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-service/2.1.0, Apache 2.0
-    hive-service-rpc 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-service-rpc/2.1.0, Apache 2.0
-    hive-storage-api 2.1.0: https://mvnrepository.com/artifact/org.apache.hive/hive-storage-api/2.1.0, Apache 2.0
+    hive-common 2.3.3: https://mvnrepository.com/artifact/org.apache.hive/hive-common/2.3.3, Apache 2.0
+    hive-jdbc 2.3.3: https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc/2.3.3, Apache 2.0
+    hive-metastore 2.3.3: https://mvnrepository.com/artifact/org.apache.hive/hive-metastore/2.3.3, Apache 2.0
+    hive-serde 2.3.3: https://mvnrepository.com/artifact/org.apache.hive/hive-serde/2.3.3, Apache 2.0
+    hive-service 2.3.3: https://mvnrepository.com/artifact/org.apache.hive/hive-service/2.3.3, Apache 2.0
+    hive-service-rpc 2.3.3: https://mvnrepository.com/artifact/org.apache.hive/hive-service-rpc/2.3.3, Apache 2.0
+    hive-storage-api 2.4.0: https://mvnrepository.com/artifact/org.apache.hive/hive-storage-api/2.4.0, Apache 2.0
     htrace-core 3.1.0-incubating: https://mvnrepository.com/artifact/org.apache.htrace/htrace-core/3.1.0-incubating, Apache 2.0
     httpclient 4.5.13: https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient/4.5.13, Apache 2.0
     httpcore 4.4.15: https://mvnrepository.com/artifact/org.apache.httpcomponents/httpcore/4.4.15, Apache 2.0
@@ -306,6 +307,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     jetty-servlet 9.4.48.v20220622: https://mvnrepository.com/artifact/org.eclipse.jetty/jetty-servlet/9.4.48.v20220622, Apache 2.0 and EPL 1.0
     jetty-servlets 9.4.48.v20220622: https://mvnrepository.com/artifact/org.eclipse.jetty/jetty-servlets/9.4.48.v20220622, Apache 2.0 and EPL 1.0
     jetty-util 6.1.26: https://mvnrepository.com/artifact/org.mortbay.jetty/jetty-util/6.1.26, Apache 2.0 and EPL 1.0
+    jetty-sslengine 6.1.26: https://mvnrepository.com/artifact/org.mortbay.jetty/jetty-sslengine/6.1.26, Apache 2.0 and EPL 1.0
     jetty-util 9.4.48.v20220622: https://mvnrepository.com/artifact/org.eclipse.jetty/jetty-util/9.4.48.v20220622, Apache 2.0 and EPL 1.0
     jetty-util-ajax 9.4.48.v20220622: https://mvnrepository.com/artifact/org.eclipse.jetty/jetty-util-ajax/9.4.48.v20220622, Apache 2.0 and EPL 1.0
     jetty-webapp 9.4.48.v20220622: https://mvnrepository.com/artifact/org.eclipse.jetty/jetty-webapp/9.4.48.v20220622, Apache 2.0 and EPL 1.0
@@ -314,37 +316,37 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     jna-platform 5.10.0: https://mvnrepository.com/artifact/net.java.dev.jna/jna-platform/5.10.0, Apache 2.0 and LGPL 2.1
     joda-time 2.10.13: https://github.com/JodaOrg/joda-time, Apache 2.0
     jpam 1.1: https://mvnrepository.com/artifact/net.sf.jpam/jpam/1.1, Apache 2.0
+    json 1.8: https://mvnrepository.com/artifact/com.tdunning/json, Apache 2.0
     json-path 2.7.0: https://github.com/json-path/JsonPath, Apache 2.0
     json-smart 2.4.8: https://github.com/netplex/json-smart-v2, Apache 2.0
-    jsqlparser 2.1: https://github.com/JSQLParser/JSqlParser, Apache 2.0 or LGPL 2.1
+    jsqlparser 4.4: https://github.com/JSQLParser/JSqlParser, Apache 2.0 or LGPL 2.1
     jsr305 3.0.0: https://mvnrepository.com/artifact/com.google.code.findbugs/jsr305, Apache 2.0
     j2objc-annotations 1.1 https://mvnrepository.com/artifact/com.google.j2objc/j2objc-annotations/1.1, Apache 2.0
     libfb303 0.9.3: https://mvnrepository.com/artifact/org.apache.thrift/libfb303/0.9.3, Apache 2.0
     libthrift 0.9.3: https://mvnrepository.com/artifact/org.apache.thrift/libthrift/0.9.3, Apache 2.0
     log4j-api 2.11.2: https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-api/2.11.2, Apache 2.0
     log4j-core-2.11.2: https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-core/2.11.2, Apache 2.0
-    log4j 1.2.17: https://mvnrepository.com/artifact/log4j/log4j/1.2.17, Apache 2.0
     log4j-1.2-api 2.17.2: https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-1.2-api/2.17.2, Apache 2.0
     lz4 1.3.0: https://mvnrepository.com/artifact/net.jpountz.lz4/lz4/1.3.0, Apache 2.0
     mapstruct 1.3.1.Final: https://github.com/mapstruct/mapstruct, Apache 2.0
-    mybatis 3.5.2 https://mvnrepository.com/artifact/org.mybatis/mybatis/3.5.2, Apache 2.0
-    mybatis-plus 3.2.0: https://github.com/baomidou/mybatis-plus, Apache 2.0
-    mybatis-plus-annotation 3.2.0: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-annotation/3.2.0, Apache 2.0
-    mybatis-plus-boot-starter 3.2.0: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-boot-starter/3.2.0, Apache 2.0
-    mybatis-plus-core 3.2.0: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-core/3.2.0, Apache 2.0
-    mybatis-plus-extension 3.2.0: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-extension/3.2.0, Apache 2.0
-    mybatis-spring 2.0.2: https://mvnrepository.com/artifact/org.mybatis/mybatis-spring/2.0.2, Apache 2.0
+    mybatis 3.5.10 https://mvnrepository.com/artifact/org.mybatis/mybatis/3.5.10, Apache 2.0
+    mybatis-plus 3.5.2: https://github.com/baomidou/mybatis-plus, Apache 2.0
+    mybatis-plus-annotation 3.5.2: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-annotation/3.5.2, Apache 2.0
+    mybatis-plus-boot-starter 3.5.2: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-boot-starter/3.5.2, Apache 2.0
+    mybatis-plus-core 3.5.2: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-core/3.5.2, Apache 2.0
+    mybatis-plus-extension 3.5.2: https://mvnrepository.com/artifact/com.baomidou/mybatis-plus-extension/3.5.2, Apache 2.0
+    mybatis-spring 2.0.7: https://mvnrepository.com/artifact/org.mybatis/mybatis-spring/2.0.7, Apache 2.0
     netty 3.6.2.Final: https://github.com/netty/netty, Apache 2.0
     netty 4.1.53.Final: https://github.com/netty/netty/blob/netty-4.1.53.Final/LICENSE.txt, Apache 2.0
     opencsv 2.3: https://mvnrepository.com/artifact/net.sf.opencsv/opencsv/2.3, Apache 2.0
+    orc-core 1.3.3: https://mvnrepository.com/artifact/org.apache.orc/orc-core, Apache 2.0
     parquet-hadoop-bundle 1.8.1: https://mvnrepository.com/artifact/org.apache.parquet/parquet-hadoop-bundle/1.8.1, Apache 2.0
     poi 4.1.2: https://mvnrepository.com/artifact/org.apache.poi/poi/4.1.2, Apache 2.0
     poi-ooxml 4.1.2: https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml/4.1.2, Apache 2.0
     poi-ooxml-schemas-4.1.2: https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml-schemas/4.1.2, Apache 2.0
     quartz 2.3.2: https://mvnrepository.com/artifact/org.quartz-scheduler/quartz/2.3.2, Apache 2.0
     snakeyaml 1.30: https://mvnrepository.com/artifact/org.yaml/snakeyaml/1.30, Apache 2.0
-    snappy 0.2: https://mvnrepository.com/artifact/org.iq80.snappy/snappy/0.2, Apache 2.0
-    snappy-java 1.0.4.1: https://github.com/xerial/snappy-java, Apache 2.0
+    snappy-java 1.1.8.4: https://github.com/xerial/snappy-java, Apache 2.0
     SparseBitSet 1.2: https://mvnrepository.com/artifact/com.zaxxer/SparseBitSet/1.2, Apache 2.0
     spring-aop 5.3.13: https://mvnrepository.com/artifact/org.springframework/spring-aop/5.3.13, Apache 2.0
     spring-beans 5.3.19: https://mvnrepository.com/artifact/org.springframework/spring-beans/5.3.19, Apache 2.0
@@ -388,7 +390,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     xercesImpl 2.9.1: https://mvnrepository.com/artifact/xerces/xercesImpl/2.9.1, Apache 2.0
     xmlbeans 3.1.0: https://mvnrepository.com/artifact/org.apache.xmlbeans/xmlbeans/3.1.0, Apache 2.0
     xml-apis 1.3.04: https://mvnrepository.com/artifact/xml-apis/xml-apis/1.3.04, Apache 2.0 and W3C
-    zookeeper 3.4.14: https://mvnrepository.com/artifact/org.apache.zookeeper/zookeeper/3.4.14, Apache 2.0
+    zookeeper 3.8.0: https://mvnrepository.com/artifact/org.apache.zookeeper/zookeeper/3.8.0, Apache 2.0
     presto-jdbc 0.238.1 https://mvnrepository.com/artifact/com.facebook.presto/presto-jdbc/0.238.1
     protostuff-core 1.7.2: https://github.com/protostuff/protostuff/protostuff-core Apache-2.0
     protostuff-runtime 1.7.2: https://github.com/protostuff/protostuff/protostuff-core Apache-2.0
@@ -458,9 +460,9 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     click 8.0: https://github.com/pallets/click, BSD 3-Clause
     curvesapi 1.06: https://mvnrepository.com/artifact/com.github.virtuald/curvesapi/1.06, BSD 3-clause
     javolution 5.5.1: https://mvnrepository.com/artifact/javolution/javolution/5.5.1, BSD
-    jline 0.9.94: https://github.com/jline/jline3, BSD
+    jline 2.12: https://github.com/jline/jline3, BSD
     jsch 0.1.42: https://mvnrepository.com/artifact/com.jcraft/jsch/0.1.42, BSD
-    postgresql 42.3.4: https://mvnrepository.com/artifact/org.postgresql/postgresql/42.3.4, BSD 2-clause
+    postgresql 42.4.1: https://mvnrepository.com/artifact/org.postgresql/postgresql/42.4.1, BSD 2-clause
     protobuf-java 2.5.0: https://mvnrepository.com/artifact/com.google.protobuf/protobuf-java/2.5.0, BSD 2-clause
     paranamer 2.3: https://mvnrepository.com/artifact/com.thoughtworks.paranamer/paranamer/2.3, BSD
     threetenbp 1.3.6: https://mvnrepository.com/artifact/org.threeten/threetenbp/1.3.6,  BSD 3-clause
@@ -501,7 +503,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     aspectjweaver 1.9.7:https://mvnrepository.com/artifact/org.aspectj/aspectjweaver/1.9.7, EPL 1.0
     logback-classic 1.2.11: https://mvnrepository.com/artifact/ch.qos.logback/logback-classic/1.2.11, EPL 1.0 and LGPL 2.1
     logback-core 1.2.11: https://mvnrepository.com/artifact/ch.qos.logback/logback-core/1.2.11, EPL 1.0 and LGPL 2.1
-    h2-1.4.200 https://github.com/h2database/h2database/blob/master/LICENSE.txt, MPL 2.0 or EPL 1.0
+    h2-2.1.210 https://github.com/h2database/h2database/blob/master/LICENSE.txt, MPL 2.0 or EPL 1.0
 
 ========================================================================
 MIT licenses
diff --git a/dolphinscheduler-dist/release-docs/NOTICE b/dolphinscheduler-dist/release-docs/NOTICE
index 1411e7a78c..5e8cf2f532 100644
--- a/dolphinscheduler-dist/release-docs/NOTICE
+++ b/dolphinscheduler-dist/release-docs/NOTICE
@@ -1680,16 +1680,6 @@ Commons Lang 2.6,
 which has the following notices:
  * This product includes software from the Spring Framework,under the Apache License 2.0 (see: StringUtils.containsWhitespace())
 
-The binary distribution of this product bundles binaries of
-Apache Log4j 1.2.17,
-which has the following notices:
- * ResolverUtil.java
-    Copyright 2005-2006 Tim Fennell
-  Dumbster SMTP test server
-    Copyright 2004 Jason Paul Kitchen
-  TypeUtil.java
-    Copyright 2002-2012 Ramnivas Laddad, Juergen Hoeller, Chris Beams
-
 The binary distribution of this product bundles binaries of
 Jetty 6.1.26,
 which has the following notices:
@@ -1781,7 +1771,7 @@ which has the following notices:
    granted provided that the copyright notice appears in all copies./
 
 The binary distribution of this product bundles binaries of
-Snappy for Java 1.0.4.1,
+Snappy for Java 1.1.8.4,
 which has the following notices:
  * This product includes software developed by Google
     Snappy: http://code.google.com/p/snappy/ (New BSD License)
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-annotations.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-annotations.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-annotations.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-annotations.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-auth.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-auth.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-auth.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-auth.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-client.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-client.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-client.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-client.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-common.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-common.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-common.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-common.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-hdfs.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-hdfs.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-hdfs.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-hdfs.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-app.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-app.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-app.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-app.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-common.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-common.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-common.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-common.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-core.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-core.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-core.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-core.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-jobclient.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-jobclient.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-jobclient.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-mapreduce-client-jobclient.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-api.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-api.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-api.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-api.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-client.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-client.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-client.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-client.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-common.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-common.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-common.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-common.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-server-common.txt b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-server-common.txt
index b7d41e6553..7579c1159a 100644
--- a/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-server-common.txt
+++ b/dolphinscheduler-dist/release-docs/licenses/LICENSE-hadoop-yarn-server-common.txt
@@ -1506,7 +1506,7 @@ following license:
 ASM Core 3.2
 JSch 0.1.42
 ParaNamer Core 2.3
-JLine 0.9.94
+JLine 2.12
 leveldbjni-all 1.8
 Hamcrest Core 1.3
 xmlenc Library 0.52
diff --git a/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/MasterLogFilterTest.java b/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/MasterLogFilterTest.java
index 1a546951d6..a77a1002be 100644
--- a/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/MasterLogFilterTest.java
+++ b/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/MasterLogFilterTest.java
@@ -16,16 +16,14 @@
  */
 package org.apache.dolphinscheduler.server.log;
 
-import ch.qos.logback.classic.Level;
-import ch.qos.logback.classic.spi.ILoggingEvent;
-import ch.qos.logback.classic.spi.IThrowableProxy;
-import ch.qos.logback.classic.spi.LoggerContextVO;
-import ch.qos.logback.core.spi.FilterReply;
 import org.apache.dolphinscheduler.common.Constants;
+
 import org.junit.Assert;
 import org.junit.Test;
-import org.slf4j.Marker;
-import java.util.Map;
+
+import ch.qos.logback.classic.Level;
+import ch.qos.logback.classic.spi.LoggingEvent;
+import ch.qos.logback.core.spi.FilterReply;
 
 public class MasterLogFilterTest {
 
@@ -33,8 +31,8 @@ public class MasterLogFilterTest {
     public void decide() {
         MasterLogFilter masterLogFilter = new MasterLogFilter();
 
+        FilterReply filterReply = masterLogFilter.decide(new LoggingEvent() {
 
-        FilterReply filterReply = masterLogFilter.decide(new ILoggingEvent() {
             @Override
             public String getThreadName() {
                 return Constants.THREAD_NAME_MASTER_SERVER;
@@ -48,71 +46,11 @@ public class MasterLogFilterTest {
             @Override
             public String getMessage() {
                 return "master insert into queue success, task : shell2";
-//                return "consume tasks: [2_177_2_704_-1],there still have 0 tasks need to be executed";
-            }
-
-            @Override
-            public Object[] getArgumentArray() {
-                return new Object[0];
-            }
-
-            @Override
-            public String getFormattedMessage() {
-                return "master insert into queue success, task : shell2";
-            }
-
-            @Override
-            public String getLoggerName() {
-                return null;
-            }
-
-            @Override
-            public LoggerContextVO getLoggerContextVO() {
-                return null;
-            }
-
-            @Override
-            public IThrowableProxy getThrowableProxy() {
-                return null;
             }
 
-            @Override
-            public StackTraceElement[] getCallerData() {
-                return new StackTraceElement[0];
-            }
-
-            @Override
-            public boolean hasCallerData() {
-                return false;
-            }
-
-            @Override
-            public Marker getMarker() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMDCPropertyMap() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMdc() {
-                return null;
-            }
-
-            @Override
-            public long getTimeStamp() {
-                return 0;
-            }
-
-            @Override
-            public void prepareForDeferredProcessing() {
-
-            }
         });
 
         Assert.assertEquals(FilterReply.ACCEPT, filterReply);
 
     }
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogFilterTest.java b/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogFilterTest.java
index 34c939d7ce..a02b1acc41 100644
--- a/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogFilterTest.java
+++ b/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogFilterTest.java
@@ -16,20 +16,14 @@
  */
 package org.apache.dolphinscheduler.server.log;
 
-import ch.qos.logback.classic.Level;
-import ch.qos.logback.classic.spi.ILoggingEvent;
-import ch.qos.logback.classic.spi.IThrowableProxy;
-import ch.qos.logback.classic.spi.LoggerContextVO;
-import ch.qos.logback.core.spi.FilterReply;
-
 import org.apache.dolphinscheduler.plugin.task.api.TaskConstants;
 
 import org.junit.Assert;
 import org.junit.Test;
-import org.slf4j.Marker;
-
-import java.util.Map;
 
+import ch.qos.logback.classic.Level;
+import ch.qos.logback.classic.spi.LoggingEvent;
+import ch.qos.logback.core.spi.FilterReply;
 
 public class TaskLogFilterTest {
 
@@ -37,8 +31,8 @@ public class TaskLogFilterTest {
     public void decide() {
         TaskLogFilter taskLogFilter = new TaskLogFilter();
 
+        FilterReply filterReply = taskLogFilter.decide(new LoggingEvent() {
 
-        FilterReply filterReply = taskLogFilter.decide(new ILoggingEvent() {
             @Override
             public String getThreadName() {
                 return TaskConstants.TASK_APPID_LOG_FORMAT;
@@ -68,54 +62,9 @@ public class TaskLogFilterTest {
             public String getLoggerName() {
                 return TaskConstants.TASK_LOG_LOGGER_NAME;
             }
-
-            @Override
-            public LoggerContextVO getLoggerContextVO() {
-                return null;
-            }
-
-            @Override
-            public IThrowableProxy getThrowableProxy() {
-                return null;
-            }
-
-            @Override
-            public StackTraceElement[] getCallerData() {
-                return new StackTraceElement[0];
-            }
-
-            @Override
-            public boolean hasCallerData() {
-                return false;
-            }
-
-            @Override
-            public Marker getMarker() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMDCPropertyMap() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMdc() {
-                return null;
-            }
-
-            @Override
-            public long getTimeStamp() {
-                return 0;
-            }
-
-            @Override
-            public void prepareForDeferredProcessing() {
-
-            }
         });
 
         Assert.assertEquals(FilterReply.ACCEPT, filterReply);
 
     }
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/WorkerLogFilterTest.java b/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/WorkerLogFilterTest.java
index dbcd4b8633..9af6954e1b 100644
--- a/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/WorkerLogFilterTest.java
+++ b/dolphinscheduler-log-server/src/test/java/org/apache/dolphinscheduler/server/log/WorkerLogFilterTest.java
@@ -16,18 +16,14 @@
  */
 package org.apache.dolphinscheduler.server.log;
 
-import ch.qos.logback.classic.Level;
-import ch.qos.logback.classic.spi.ILoggingEvent;
-import ch.qos.logback.classic.spi.IThrowableProxy;
-import ch.qos.logback.classic.spi.LoggerContextVO;
-import ch.qos.logback.core.spi.FilterReply;
 import org.apache.dolphinscheduler.common.Constants;
+
 import org.junit.Assert;
 import org.junit.Test;
-import org.slf4j.Marker;
-
-import java.util.Map;
 
+import ch.qos.logback.classic.Level;
+import ch.qos.logback.classic.spi.LoggingEvent;
+import ch.qos.logback.core.spi.FilterReply;
 
 public class WorkerLogFilterTest {
 
@@ -35,8 +31,8 @@ public class WorkerLogFilterTest {
     public void decide() {
         WorkerLogFilter workerLogFilter = new WorkerLogFilter();
 
+        FilterReply filterReply = workerLogFilter.decide(new LoggingEvent() {
 
-        FilterReply filterReply = workerLogFilter.decide(new ILoggingEvent() {
             @Override
             public String getThreadName() {
                 return Constants.THREAD_NAME_WORKER_SERVER;
@@ -62,58 +58,9 @@ public class WorkerLogFilterTest {
                 return "consume tasks: [2_177_2_704_-1],there still have 0 tasks need to be executed";
             }
 
-            @Override
-            public String getLoggerName() {
-                return null;
-            }
-
-            @Override
-            public LoggerContextVO getLoggerContextVO() {
-                return null;
-            }
-
-            @Override
-            public IThrowableProxy getThrowableProxy() {
-                return null;
-            }
-
-            @Override
-            public StackTraceElement[] getCallerData() {
-                return new StackTraceElement[0];
-            }
-
-            @Override
-            public boolean hasCallerData() {
-                return false;
-            }
-
-            @Override
-            public Marker getMarker() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMDCPropertyMap() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMdc() {
-                return null;
-            }
-
-            @Override
-            public long getTimeStamp() {
-                return 0;
-            }
-
-            @Override
-            public void prepareForDeferredProcessing() {
-
-            }
         });
 
         Assert.assertEquals(FilterReply.ACCEPT, filterReply);
 
     }
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-master/src/main/java/org/apache/dolphinscheduler/server/master/runner/WorkflowExecuteRunnable.java b/dolphinscheduler-master/src/main/java/org/apache/dolphinscheduler/server/master/runner/WorkflowExecuteRunnable.java
index a868f9fa51..a064a2b9dc 100644
--- a/dolphinscheduler-master/src/main/java/org/apache/dolphinscheduler/server/master/runner/WorkflowExecuteRunnable.java
+++ b/dolphinscheduler-master/src/main/java/org/apache/dolphinscheduler/server/master/runner/WorkflowExecuteRunnable.java
@@ -17,11 +17,19 @@
 
 package org.apache.dolphinscheduler.server.master.runner;
 
-import com.google.common.collect.Lists;
-import lombok.NonNull;
-import org.apache.commons.collections.CollectionUtils;
-import org.apache.commons.lang3.StringUtils;
-import org.apache.commons.lang3.math.NumberUtils;
+import static org.apache.dolphinscheduler.common.Constants.CMDPARAM_COMPLEMENT_DATA_END_DATE;
+import static org.apache.dolphinscheduler.common.Constants.CMDPARAM_COMPLEMENT_DATA_SCHEDULE_DATE_LIST;
+import static org.apache.dolphinscheduler.common.Constants.CMDPARAM_COMPLEMENT_DATA_START_DATE;
+import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_RECOVERY_START_NODE_STRING;
+import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_RECOVER_PROCESS_ID_STRING;
+import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_START_NODES;
+import static org.apache.dolphinscheduler.common.Constants.COMMA;
+import static org.apache.dolphinscheduler.common.Constants.DEFAULT_WORKER_GROUP;
+import static org.apache.dolphinscheduler.common.Constants.YYYY_MM_DD_HH_MM_SS;
+import static org.apache.dolphinscheduler.plugin.task.api.TaskConstants.TASK_TYPE_BLOCKING;
+import static org.apache.dolphinscheduler.plugin.task.api.enums.DataType.VARCHAR;
+import static org.apache.dolphinscheduler.plugin.task.api.enums.Direct.IN;
+
 import org.apache.dolphinscheduler.common.Constants;
 import org.apache.dolphinscheduler.common.enums.CommandType;
 import org.apache.dolphinscheduler.common.enums.FailureStrategy;
@@ -77,9 +85,10 @@ import org.apache.dolphinscheduler.service.exceptions.CronParseException;
 import org.apache.dolphinscheduler.service.expand.CuringParamsService;
 import org.apache.dolphinscheduler.service.process.ProcessService;
 import org.apache.dolphinscheduler.service.queue.PeerTaskInstancePriorityQueue;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.beans.BeanUtils;
+
+import org.apache.commons.collections.CollectionUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.commons.lang3.math.NumberUtils;
 
 import java.util.ArrayList;
 import java.util.Arrays;
@@ -100,18 +109,13 @@ import java.util.concurrent.ConcurrentLinkedQueue;
 import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.stream.Collectors;
 
-import static org.apache.dolphinscheduler.common.Constants.CMDPARAM_COMPLEMENT_DATA_END_DATE;
-import static org.apache.dolphinscheduler.common.Constants.CMDPARAM_COMPLEMENT_DATA_SCHEDULE_DATE_LIST;
-import static org.apache.dolphinscheduler.common.Constants.CMDPARAM_COMPLEMENT_DATA_START_DATE;
-import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_RECOVERY_START_NODE_STRING;
-import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_RECOVER_PROCESS_ID_STRING;
-import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_START_NODES;
-import static org.apache.dolphinscheduler.common.Constants.COMMA;
-import static org.apache.dolphinscheduler.common.Constants.DEFAULT_WORKER_GROUP;
-import static org.apache.dolphinscheduler.common.Constants.YYYY_MM_DD_HH_MM_SS;
-import static org.apache.dolphinscheduler.plugin.task.api.TaskConstants.TASK_TYPE_BLOCKING;
-import static org.apache.dolphinscheduler.plugin.task.api.enums.DataType.VARCHAR;
-import static org.apache.dolphinscheduler.plugin.task.api.enums.Direct.IN;
+import lombok.NonNull;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.beans.BeanUtils;
+
+import com.google.common.collect.Lists;
 
 /**
  * Workflow execute task, used to execute a workflow instance.
@@ -1331,7 +1335,7 @@ public class WorkflowExecuteRunnable implements Callable<WorkflowSubmitStatue> {
                 continue;
             }
 
-            if (task.getId() > 0 && completeTaskMap.containsKey(task.getTaskCode())) {
+            if (task.getId() != null && completeTaskMap.containsKey(task.getTaskCode())) {
                 logger.info("task {} has already run success", task.getName());
                 continue;
             }
@@ -2014,4 +2018,4 @@ public class WorkflowExecuteRunnable implements Callable<WorkflowSubmitStatue> {
 
     }
 
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-registry/dolphinscheduler-registry-plugins/dolphinscheduler-registry-zookeeper/pom.xml b/dolphinscheduler-registry/dolphinscheduler-registry-plugins/dolphinscheduler-registry-zookeeper/pom.xml
index 028e5d6cb7..8f8bd7b645 100644
--- a/dolphinscheduler-registry/dolphinscheduler-registry-plugins/dolphinscheduler-registry-zookeeper/pom.xml
+++ b/dolphinscheduler-registry/dolphinscheduler-registry-plugins/dolphinscheduler-registry-zookeeper/pom.xml
@@ -15,15 +15,14 @@
   ~ See the License for the specific language governing permissions and
   ~ limitations under the License.
   -->
-<project xmlns="http://maven.apache.org/POM/4.0.0"
-         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
     <parent>
-        <artifactId>dolphinscheduler-registry-plugins</artifactId>
         <groupId>org.apache.dolphinscheduler</groupId>
+        <artifactId>dolphinscheduler-registry-plugins</artifactId>
         <version>dev-SNAPSHOT</version>
     </parent>
-    <modelVersion>4.0.0</modelVersion>
 
     <artifactId>dolphinscheduler-registry-zookeeper</artifactId>
 
@@ -57,6 +56,15 @@
             <artifactId>slf4j-api</artifactId>
         </dependency>
 
+        <dependency>
+            <groupId>io.dropwizard.metrics</groupId>
+            <artifactId>metrics-core</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.xerial.snappy</groupId>
+            <artifactId>snappy-java</artifactId>
+        </dependency>
+
         <dependency>
             <groupId>org.apache.curator</groupId>
             <artifactId>curator-test</artifactId>
diff --git a/dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogDiscriminatorTest.java b/dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogDiscriminatorTest.java
index 5b785448b7..3931bac99e 100644
--- a/dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogDiscriminatorTest.java
+++ b/dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/log/TaskLogDiscriminatorTest.java
@@ -18,16 +18,12 @@ package org.apache.dolphinscheduler.server.log;
 
 import org.apache.dolphinscheduler.plugin.task.api.TaskConstants;
 
-import ch.qos.logback.classic.Level;
-import ch.qos.logback.classic.spi.ILoggingEvent;
-import ch.qos.logback.classic.spi.IThrowableProxy;
-import ch.qos.logback.classic.spi.LoggerContextVO;
 import org.junit.Assert;
 import org.junit.Before;
 import org.junit.Test;
-import org.slf4j.Marker;
 
-import java.util.Map;
+import ch.qos.logback.classic.Level;
+import ch.qos.logback.classic.spi.LoggingEvent;
 
 public class TaskLogDiscriminatorTest {
 
@@ -39,7 +35,7 @@ public class TaskLogDiscriminatorTest {
     TaskLogDiscriminator taskLogDiscriminator;
 
     @Before
-    public void before(){
+    public void before() {
         taskLogDiscriminator = new TaskLogDiscriminator();
         taskLogDiscriminator.setLogBase("logs");
         taskLogDiscriminator.setKey("123");
@@ -47,7 +43,8 @@ public class TaskLogDiscriminatorTest {
 
     @Test
     public void getDiscriminatingValue() {
-       String result = taskLogDiscriminator.getDiscriminatingValue(new ILoggingEvent() {
+        String result = taskLogDiscriminator.getDiscriminatingValue(new LoggingEvent() {
+
             @Override
             public String getThreadName() {
                 return "taskAppId=TASK-20220105-101-1-1001";
@@ -77,51 +74,6 @@ public class TaskLogDiscriminatorTest {
             public String getLoggerName() {
                 return TaskConstants.TASK_LOG_LOGGER_NAME;
             }
-
-            @Override
-            public LoggerContextVO getLoggerContextVO() {
-                return null;
-            }
-
-            @Override
-            public IThrowableProxy getThrowableProxy() {
-                return null;
-            }
-
-            @Override
-            public StackTraceElement[] getCallerData() {
-                return new StackTraceElement[0];
-            }
-
-            @Override
-            public boolean hasCallerData() {
-                return false;
-            }
-
-            @Override
-            public Marker getMarker() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMDCPropertyMap() {
-                return null;
-            }
-
-            @Override
-            public Map<String, String> getMdc() {
-                return null;
-            }
-
-            @Override
-            public long getTimeStamp() {
-                return 0;
-            }
-
-            @Override
-            public void prepareForDeferredProcessing() {
-
-            }
         });
         Assert.assertEquals("20220105/101-1-1001", result);
     }
@@ -150,6 +102,6 @@ public class TaskLogDiscriminatorTest {
 
     @Test
     public void setLogBase() {
-       taskLogDiscriminator.setLogBase("logs");
+        taskLogDiscriminator.setLogBase("logs");
     }
 }
diff --git a/dolphinscheduler-service/src/test/java/org/apache/dolphinscheduler/service/process/ProcessServiceTest.java b/dolphinscheduler-service/src/test/java/org/apache/dolphinscheduler/service/process/ProcessServiceTest.java
index 5dd64fac9d..c19c4aaf71 100644
--- a/dolphinscheduler-service/src/test/java/org/apache/dolphinscheduler/service/process/ProcessServiceTest.java
+++ b/dolphinscheduler-service/src/test/java/org/apache/dolphinscheduler/service/process/ProcessServiceTest.java
@@ -20,7 +20,6 @@ package org.apache.dolphinscheduler.service.process;
 import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_RECOVER_PROCESS_ID_STRING;
 import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_START_PARAMS;
 import static org.apache.dolphinscheduler.common.Constants.CMD_PARAM_SUB_PROCESS_DEFINE_CODE;
-
 import static org.mockito.ArgumentMatchers.any;
 
 import org.apache.dolphinscheduler.common.Constants;
@@ -193,7 +192,7 @@ public class ProcessServiceTest {
         instanceMap.setParentTaskInstanceId(10);
         Command command;
 
-        //father history: start; child null == command type: start
+        // father history: start; child null == command type: start
         parentInstance.setHistoryCmd("START_PROCESS");
         parentInstance.setCommandType(CommandType.START_PROCESS);
         ProcessDefinition processDefinition = new ProcessDefinition();
@@ -203,19 +202,19 @@ public class ProcessServiceTest {
         command = processService.createSubProcessCommand(parentInstance, childInstance, instanceMap, task);
         Assert.assertEquals(CommandType.START_PROCESS, command.getCommandType());
 
-        //father history: start,start failure; child null == command type: start
+        // father history: start,start failure; child null == command type: start
         parentInstance.setCommandType(CommandType.START_FAILURE_TASK_PROCESS);
         parentInstance.setHistoryCmd("START_PROCESS,START_FAILURE_TASK_PROCESS");
         command = processService.createSubProcessCommand(parentInstance, childInstance, instanceMap, task);
         Assert.assertEquals(CommandType.START_PROCESS, command.getCommandType());
 
-        //father history: scheduler,start failure; child null == command type: scheduler
+        // father history: scheduler,start failure; child null == command type: scheduler
         parentInstance.setCommandType(CommandType.START_FAILURE_TASK_PROCESS);
         parentInstance.setHistoryCmd("SCHEDULER,START_FAILURE_TASK_PROCESS");
         command = processService.createSubProcessCommand(parentInstance, childInstance, instanceMap, task);
         Assert.assertEquals(CommandType.SCHEDULER, command.getCommandType());
 
-        //father history: complement,start failure; child null == command type: complement
+        // father history: complement,start failure; child null == command type: complement
 
         String startString = "2020-01-01 00:00:00";
         String endString = "2020-01-10 00:00:00";
@@ -234,7 +233,7 @@ public class ProcessServiceTest {
         Assert.assertEquals(startString, DateUtils.dateToString(start));
         Assert.assertEquals(endString, DateUtils.dateToString(end));
 
-        //father history: start,failure,start failure; child not null == command type: start failure
+        // father history: start,failure,start failure; child not null == command type: start failure
         childInstance = new ProcessInstance();
         parentInstance.setCommandType(CommandType.START_FAILURE_TASK_PROCESS);
         parentInstance.setHistoryCmd("START_PROCESS,START_FAILURE_TASK_PROCESS");
@@ -295,16 +294,16 @@ public class ProcessServiceTest {
     @Test
     public void testHandleCommand() throws CronParseException, CodeGenerateUtils.CodeGenerateException {
 
-        //cannot construct process instance, return null;
+        // cannot construct process instance, return null;
         String host = "127.0.0.1";
         Command command = new Command();
         command.setProcessDefinitionCode(222);
         command.setCommandType(CommandType.REPEAT_RUNNING);
         command.setCommandParam("{\""
-                                    + CMD_PARAM_RECOVER_PROCESS_ID_STRING
-                                    + "\":\"111\",\""
-                                    + CMD_PARAM_SUB_PROCESS_DEFINE_CODE
-                                    + "\":\"222\"}");
+                + CMD_PARAM_RECOVER_PROCESS_ID_STRING
+                + "\":\"111\",\""
+                + CMD_PARAM_SUB_PROCESS_DEFINE_CODE
+                + "\":\"222\"}");
         try {
             Assert.assertNull(processService.handleCommand(host, command));
         } catch (IllegalArgumentException illegalArgumentException) {
@@ -315,7 +314,7 @@ public class ProcessServiceTest {
         int definitionVersion = 1;
         long definitionCode = 123;
         int processInstanceId = 222;
-        //there is not enough thread for this command
+        // there is not enough thread for this command
         Command command1 = new Command();
         command1.setId(1);
         command1.setProcessDefinitionCode(definitionCode);
@@ -329,7 +328,8 @@ public class ProcessServiceTest {
         processDefinition.setName("test");
         processDefinition.setVersion(definitionVersion);
         processDefinition.setCode(definitionCode);
-        processDefinition.setGlobalParams("[{\"prop\":\"startParam1\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"\"}]");
+        processDefinition
+                .setGlobalParams("[{\"prop\":\"startParam1\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"\"}]");
         processDefinition.setExecutionType(ProcessExecutionTypeEnum.PARALLEL);
 
         ProcessInstance processInstance = new ProcessInstance();
@@ -341,7 +341,8 @@ public class ProcessServiceTest {
         processInstance.setProcessDefinitionCode(definitionCode);
         processInstance.setProcessDefinitionVersion(definitionVersion);
 
-        Mockito.when(processDefineMapper.queryByCode(command1.getProcessDefinitionCode())).thenReturn(processDefinition);
+        Mockito.when(processDefineMapper.queryByCode(command1.getProcessDefinitionCode()))
+                .thenReturn(processDefinition);
         Mockito.when(processDefineLogMapper.queryByDefinitionCodeAndVersion(processInstance.getProcessDefinitionCode(),
                 processInstance.getProcessDefinitionVersion())).thenReturn(new ProcessDefinitionLog(processDefinition));
         Mockito.when(processInstanceMapper.queryDetailById(222)).thenReturn(processInstance);
@@ -427,7 +428,8 @@ public class ProcessServiceTest {
         command6.setCommandParam("{\"ProcessInstanceId\":223}");
         command6.setCommandType(CommandType.RECOVER_SERIAL_WAIT);
         command6.setProcessDefinitionVersion(1);
-        Mockito.when(processInstanceMapper.queryByProcessDefineCodeAndProcessDefinitionVersionAndStatusAndNextId(11L, 1, Constants.RUNNING_PROCESS_STATE, 223)).thenReturn(lists);
+        Mockito.when(processInstanceMapper.queryByProcessDefineCodeAndProcessDefinitionVersionAndStatusAndNextId(11L, 1,
+                Constants.RUNNING_PROCESS_STATE, 223)).thenReturn(lists);
         Mockito.when(processInstanceMapper.updateNextProcessIdById(223, 222)).thenReturn(true);
         Mockito.when(commandMapper.deleteById(6)).thenReturn(1);
         ProcessInstance processInstance6 = processService.handleCommand(host, command6);
@@ -448,7 +450,8 @@ public class ProcessServiceTest {
         command7.setCommandType(CommandType.RECOVER_SERIAL_WAIT);
         command7.setProcessDefinitionVersion(1);
         Mockito.when(commandMapper.deleteById(7)).thenReturn(1);
-        Mockito.when(processInstanceMapper.queryByProcessDefineCodeAndProcessDefinitionVersionAndStatusAndNextId(11L, 1, Constants.RUNNING_PROCESS_STATE, 224)).thenReturn(null);
+        Mockito.when(processInstanceMapper.queryByProcessDefineCodeAndProcessDefinitionVersionAndStatusAndNextId(11L, 1,
+                Constants.RUNNING_PROCESS_STATE, 224)).thenReturn(null);
         ProcessInstance processInstance8 = processService.handleCommand(host, command7);
         Assert.assertTrue(processInstance8 != null);
 
@@ -470,7 +473,8 @@ public class ProcessServiceTest {
         command9.setCommandType(CommandType.RECOVER_SERIAL_WAIT);
         command9.setProcessDefinitionVersion(1);
         Mockito.when(processInstanceMapper.queryDetailById(225)).thenReturn(processInstance9);
-        Mockito.when(processInstanceMapper.queryByProcessDefineCodeAndProcessDefinitionVersionAndStatusAndNextId(12L, 1, Constants.RUNNING_PROCESS_STATE, 0)).thenReturn(lists);
+        Mockito.when(processInstanceMapper.queryByProcessDefineCodeAndProcessDefinitionVersionAndStatusAndNextId(12L, 1,
+                Constants.RUNNING_PROCESS_STATE, 0)).thenReturn(lists);
         Mockito.when(processInstanceMapper.updateById(processInstance)).thenReturn(1);
         Mockito.when(commandMapper.deleteById(9)).thenReturn(1);
         ProcessInstance processInstance10 = processService.handleCommand(host, command9);
@@ -496,7 +500,8 @@ public class ProcessServiceTest {
         processDefinition.setName("test");
         processDefinition.setVersion(definitionVersion);
         processDefinition.setCode(definitionCode);
-        processDefinition.setGlobalParams("[{\"prop\":\"startParam1\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"\"}]");
+        processDefinition
+                .setGlobalParams("[{\"prop\":\"startParam1\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"\"}]");
         processDefinition.setExecutionType(ProcessExecutionTypeEnum.PARALLEL);
 
         ProcessInstance processInstance = new ProcessInstance();
@@ -508,7 +513,8 @@ public class ProcessServiceTest {
         processInstance.setProcessDefinitionCode(definitionCode);
         processInstance.setProcessDefinitionVersion(definitionVersion);
 
-        Mockito.when(processDefineMapper.queryByCode(command1.getProcessDefinitionCode())).thenReturn(processDefinition);
+        Mockito.when(processDefineMapper.queryByCode(command1.getProcessDefinitionCode()))
+                .thenReturn(processDefinition);
         Mockito.when(processDefineLogMapper.queryByDefinitionCodeAndVersion(processInstance.getProcessDefinitionCode(),
                 processInstance.getProcessDefinitionVersion())).thenReturn(new ProcessDefinitionLog(processDefinition));
         Mockito.when(processInstanceMapper.queryDetailById(222)).thenReturn(processInstance);
@@ -539,7 +545,8 @@ public class ProcessServiceTest {
         processInstance.setId(222);
         processInstance.setProcessDefinitionVersion(1);
         processInstance.setProcessDefinitionCode(1L);
-        Mockito.when(processService.findProcessInstanceById(taskInstance.getProcessInstanceId())).thenReturn(processInstance);
+        Mockito.when(processService.findProcessInstanceById(taskInstance.getProcessInstanceId()))
+                .thenReturn(processInstance);
         Assert.assertEquals("", processService.formatTaskAppId(taskInstance));
     }
 
@@ -562,8 +569,8 @@ public class ProcessServiceTest {
         processTaskRelationLog.setPostTaskCode(postTaskCode);
         processTaskRelationLog.setPostTaskVersion(postTaskVersion);
         relationLogList.add(processTaskRelationLog);
-        Mockito.when(processTaskRelationLogMapper.queryByProcessCodeAndVersion(parentProcessDefineCode
-                , parentProcessDefineVersion)).thenReturn(relationLogList);
+        Mockito.when(processTaskRelationLogMapper.queryByProcessCodeAndVersion(parentProcessDefineCode,
+                parentProcessDefineVersion)).thenReturn(relationLogList);
 
         List<TaskDefinitionLog> taskDefinitionLogs = new ArrayList<>();
         TaskDefinitionLog taskDefinitionLog1 = new TaskDefinitionLog();
@@ -622,7 +629,8 @@ public class ProcessServiceTest {
         srcConnectorType.setValue("JDBC");
         srcConnectorType.setPlaceholder("Please select the source connector type");
         srcConnectorType.setOptionSourceType(OptionSourceType.DEFAULT.getCode());
-        srcConnectorType.setOptions("[{\"label\":\"HIVE\",\"value\":\"HIVE\"},{\"label\":\"JDBC\",\"value\":\"JDBC\"}]");
+        srcConnectorType
+                .setOptions("[{\"label\":\"HIVE\",\"value\":\"HIVE\"},{\"label\":\"JDBC\",\"value\":\"JDBC\"}]");
         srcConnectorType.setInputType(InputType.DEFAULT.getCode());
         srcConnectorType.setValueType(ValueType.NUMBER.getCode());
         srcConnectorType.setEmit(true);
@@ -695,15 +703,16 @@ public class ProcessServiceTest {
         operator.setId(-1);
         operator.setUserType(UserType.GENERAL_USER);
         long projectCode = 751485690568704L;
-        String taskJson = "[{\"code\":751500437479424,\"name\":\"aa\",\"version\":1,\"description\":\"\",\"delayTime\":0,"
-                + "\"taskType\":\"SHELL\",\"taskParams\":{\"resourceList\":[],\"localParams\":[],\"rawScript\":\"sleep 1s\\necho 11\","
-                + "\"dependence\":{},\"conditionResult\":{\"successNode\":[\"\"],\"failedNode\":[\"\"]},\"waitStartTimeout\":{}},"
-                + "\"flag\":\"YES\",\"taskPriority\":\"MEDIUM\",\"workerGroup\":\"yarn\",\"failRetryTimes\":0,\"failRetryInterval\":1,"
-                + "\"timeoutFlag\":\"OPEN\",\"timeoutNotifyStrategy\":\"FAILED\",\"timeout\":1,\"environmentCode\":751496815697920},"
-                + "{\"code\":751516889636864,\"name\":\"bb\",\"description\":\"\",\"taskType\":\"SHELL\",\"taskParams\":{\"resourceList\":[],"
-                + "\"localParams\":[],\"rawScript\":\"echo 22\",\"dependence\":{},\"conditionResult\":{\"successNode\":[\"\"],\"failedNode\":[\"\"]},"
-                + "\"waitStartTimeout\":{}},\"flag\":\"YES\",\"taskPriority\":\"MEDIUM\",\"workerGroup\":\"default\",\"failRetryTimes\":\"0\","
-                + "\"failRetryInterval\":\"1\",\"timeoutFlag\":\"CLOSE\",\"timeoutNotifyStrategy\":\"\",\"timeout\":0,\"delayTime\":\"0\",\"environmentCode\":-1}]";
+        String taskJson =
+                "[{\"code\":751500437479424,\"name\":\"aa\",\"version\":1,\"description\":\"\",\"delayTime\":0,"
+                        + "\"taskType\":\"SHELL\",\"taskParams\":{\"resourceList\":[],\"localParams\":[],\"rawScript\":\"sleep 1s\\necho 11\","
+                        + "\"dependence\":{},\"conditionResult\":{\"successNode\":[\"\"],\"failedNode\":[\"\"]},\"waitStartTimeout\":{}},"
+                        + "\"flag\":\"YES\",\"taskPriority\":\"MEDIUM\",\"workerGroup\":\"yarn\",\"failRetryTimes\":0,\"failRetryInterval\":1,"
+                        + "\"timeoutFlag\":\"OPEN\",\"timeoutNotifyStrategy\":\"FAILED\",\"timeout\":1,\"environmentCode\":751496815697920},"
+                        + "{\"code\":751516889636864,\"name\":\"bb\",\"description\":\"\",\"taskType\":\"SHELL\",\"taskParams\":{\"resourceList\":[],"
+                        + "\"localParams\":[],\"rawScript\":\"echo 22\",\"dependence\":{},\"conditionResult\":{\"successNode\":[\"\"],\"failedNode\":[\"\"]},"
+                        + "\"waitStartTimeout\":{}},\"flag\":\"YES\",\"taskPriority\":\"MEDIUM\",\"workerGroup\":\"default\",\"failRetryTimes\":\"0\","
+                        + "\"failRetryInterval\":\"1\",\"timeoutFlag\":\"CLOSE\",\"timeoutNotifyStrategy\":\"\",\"timeout\":0,\"delayTime\":\"0\",\"environmentCode\":-1}]";
         List<TaskDefinitionLog> taskDefinitionLogs = JSONUtils.toList(taskJson, TaskDefinitionLog.class);
         TaskDefinitionLog taskDefinition = new TaskDefinitionLog();
         taskDefinition.setCode(751500437479424L);
@@ -715,9 +724,11 @@ public class ProcessServiceTest {
         taskDefinition.setCreateTime(new Date());
         taskDefinition.setUpdateTime(new Date());
         Mockito.when(taskPluginManager.getParameters(any())).thenReturn(null);
-        Mockito.when(taskDefinitionLogMapper.queryByDefinitionCodeAndVersion(taskDefinition.getCode(), taskDefinition.getVersion())).thenReturn(taskDefinition);
+        Mockito.when(taskDefinitionLogMapper.queryByDefinitionCodeAndVersion(taskDefinition.getCode(),
+                taskDefinition.getVersion())).thenReturn(taskDefinition);
         Mockito.when(taskDefinitionLogMapper.queryMaxVersionForDefinition(taskDefinition.getCode())).thenReturn(1);
-        Mockito.when(taskDefinitionMapper.queryByCodeList(Collections.singletonList(taskDefinition.getCode()))).thenReturn(Collections.singletonList(taskDefinition));
+        Mockito.when(taskDefinitionMapper.queryByCodeList(Collections.singletonList(taskDefinition.getCode())))
+                .thenReturn(Collections.singletonList(taskDefinition));
         int result = processService.saveTaskDefine(operator, projectCode, taskDefinitionLogs, Boolean.TRUE);
         Assert.assertEquals(0, result);
     }
@@ -768,9 +779,11 @@ public class ProcessServiceTest {
         taskDefinitionLogs.add(td2);
 
         Mockito.when(taskDefinitionLogMapper.queryByTaskDefinitions(any())).thenReturn(taskDefinitionLogs);
-        Mockito.when(processTaskRelationLogMapper.queryByProcessCodeAndVersion(Mockito.anyLong(), Mockito.anyInt())).thenReturn(list);
+        Mockito.when(processTaskRelationLogMapper.queryByProcessCodeAndVersion(Mockito.anyLong(), Mockito.anyInt()))
+                .thenReturn(list);
 
-        DAG<String, TaskNode, TaskNodeRelation> stringTaskNodeTaskNodeRelationDAG = processService.genDagGraph(processDefinition);
+        DAG<String, TaskNode, TaskNodeRelation> stringTaskNodeTaskNodeRelationDAG =
+                processService.genDagGraph(processDefinition);
         Assert.assertEquals(1, stringTaskNodeTaskNodeRelationDAG.getNodesCount());
     }
 
@@ -830,7 +843,7 @@ public class ProcessServiceTest {
                 "updateResourceInfo",
                 resourceInfoNormal);
 
-        Assert.assertEquals(1, updatedResourceInfo3.getId());
+        Assert.assertEquals(1, updatedResourceInfo3.getId().intValue());
         Assert.assertEquals("test.txt", updatedResourceInfo3.getRes());
         Assert.assertEquals("/test.txt", updatedResourceInfo3.getResourceName());
 
@@ -839,7 +852,8 @@ public class ProcessServiceTest {
     @Test
     public void testCreateTaskGroupQueue() {
         Mockito.when(taskGroupQueueMapper.insert(Mockito.any(TaskGroupQueue.class))).thenReturn(1);
-        TaskGroupQueue taskGroupQueue = processService.insertIntoTaskGroupQueue(1, "task name", 1, 1, 1, TaskGroupQueueStatus.WAIT_QUEUE);
+        TaskGroupQueue taskGroupQueue =
+                processService.insertIntoTaskGroupQueue(1, "task name", 1, 1, 1, TaskGroupQueueStatus.WAIT_QUEUE);
         Assert.assertNotNull(taskGroupQueue);
     }
 
@@ -882,7 +896,8 @@ public class ProcessServiceTest {
         int pageNumber = 0;
         int masterCount = 0;
         int thisMasterSlot = 2;
-        List<Command> commandList = processService.findCommandPageBySlot(pageSize, pageNumber, masterCount, thisMasterSlot);
+        List<Command> commandList =
+                processService.findCommandPageBySlot(pageSize, pageNumber, masterCount, thisMasterSlot);
         Assert.assertEquals(0, commandList.size());
     }
 
diff --git a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/model/ResourceInfo.java b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/model/ResourceInfo.java
index f86b01daa7..7cde602bcf 100644
--- a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/model/ResourceInfo.java
+++ b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/model/ResourceInfo.java
@@ -21,6 +21,7 @@ package org.apache.dolphinscheduler.plugin.task.api.model;
  * resource info
  */
 public class ResourceInfo {
+
     /**
      * res the name of the resource that was uploaded
      */
@@ -33,7 +34,7 @@ public class ResourceInfo {
 
     private String res;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParameters.java b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParameters.java
index 8c7725ccf8..2bb7822a87 100644
--- a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParameters.java
+++ b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParameters.java
@@ -18,6 +18,7 @@
 package org.apache.dolphinscheduler.plugin.task.api.parameters;
 
 import org.apache.dolphinscheduler.plugin.task.api.SQLTaskExecutionContext;
+import org.apache.dolphinscheduler.plugin.task.api.enums.DataType;
 import org.apache.dolphinscheduler.plugin.task.api.enums.ResourceType;
 import org.apache.dolphinscheduler.plugin.task.api.enums.UdfType;
 import org.apache.dolphinscheduler.plugin.task.api.model.Property;
@@ -25,11 +26,10 @@ import org.apache.dolphinscheduler.plugin.task.api.model.ResourceInfo;
 import org.apache.dolphinscheduler.plugin.task.api.parameters.resource.DataSourceParameters;
 import org.apache.dolphinscheduler.plugin.task.api.parameters.resource.ResourceParametersHelper;
 import org.apache.dolphinscheduler.plugin.task.api.parameters.resource.UdfFuncParameters;
-import org.apache.dolphinscheduler.plugin.task.api.enums.DataType;
 import org.apache.dolphinscheduler.spi.utils.JSONUtils;
 import org.apache.dolphinscheduler.spi.utils.StringUtils;
 
-import org.apache.commons.collections.CollectionUtils;
+import org.apache.commons.collections4.CollectionUtils;
 
 import java.util.ArrayList;
 import java.util.HashMap;
@@ -45,6 +45,7 @@ import com.google.common.base.Strings;
  * Sql/Hql parameter
  */
 public class SqlParameters extends AbstractParameters {
+
     /**
      * data source type,eg  MYSQL, POSTGRES, HIVE ...
      */
@@ -269,10 +270,10 @@ public class SqlParameters extends AbstractParameters {
         if (CollectionUtils.isEmpty(sqlResult)) {
             return;
         }
-        //if sql return more than one line
+        // if sql return more than one line
         if (sqlResult.size() > 1) {
             Map<String, List<String>> sqlResultFormat = new HashMap<>();
-            //init sqlResultFormat
+            // init sqlResultFormat
             Set<String> keySet = sqlResult.get(0).keySet();
             for (String key : keySet) {
                 sqlResultFormat.put(key, new ArrayList<>());
@@ -289,7 +290,7 @@ public class SqlParameters extends AbstractParameters {
                 }
             }
         } else {
-            //result only one line
+            // result only one line
             Map<String, String> firstRow = sqlResult.get(0);
             for (Property info : outProperty) {
                 info.setValue(String.valueOf(firstRow.get(info.getProp())));
@@ -346,7 +347,8 @@ public class SqlParameters extends AbstractParameters {
     public SQLTaskExecutionContext generateExtendedContext(ResourceParametersHelper parametersHelper) {
         SQLTaskExecutionContext sqlTaskExecutionContext = new SQLTaskExecutionContext();
 
-        DataSourceParameters dbSource = (DataSourceParameters) parametersHelper.getResourceParameters(ResourceType.DATASOURCE, datasource);
+        DataSourceParameters dbSource =
+                (DataSourceParameters) parametersHelper.getResourceParameters(ResourceType.DATASOURCE, datasource);
         sqlTaskExecutionContext.setConnectionParams(dbSource.getConnectionParams());
 
         // whether udf type
@@ -354,7 +356,8 @@ public class SqlParameters extends AbstractParameters {
                 && !StringUtils.isEmpty(this.getUdfs());
 
         if (udfTypeFlag) {
-            List<UdfFuncParameters> collect = parametersHelper.getResourceMap(ResourceType.UDF).entrySet().stream().map(entry -> (UdfFuncParameters) entry.getValue()).collect(Collectors.toList());
+            List<UdfFuncParameters> collect = parametersHelper.getResourceMap(ResourceType.UDF).entrySet().stream()
+                    .map(entry -> (UdfFuncParameters) entry.getValue()).collect(Collectors.toList());
             sqlTaskExecutionContext.setUdfFuncParametersList(collect);
         }
 
diff --git a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/resource/UdfFuncParameters.java b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/resource/UdfFuncParameters.java
index f0ce772935..4cc99f7cb5 100644
--- a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/resource/UdfFuncParameters.java
+++ b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/main/java/org/apache/dolphinscheduler/plugin/task/api/parameters/resource/UdfFuncParameters.java
@@ -17,16 +17,18 @@
 
 package org.apache.dolphinscheduler.plugin.task.api.parameters.resource;
 
-import com.fasterxml.jackson.annotation.JsonProperty;
 import org.apache.dolphinscheduler.plugin.task.api.enums.UdfType;
 import org.apache.dolphinscheduler.spi.utils.JSONUtils;
 
 import java.util.Date;
 
+import com.fasterxml.jackson.annotation.JsonProperty;
+
 /**
  * udf function
  */
 public class UdfFuncParameters extends AbstractResourceParameters {
+
     /**
      * id
      */
@@ -102,7 +104,7 @@ public class UdfFuncParameters extends AbstractResourceParameters {
      */
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
diff --git a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/test/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParametersTest.java b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/test/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParametersTest.java
index 8aded957b6..6fa1148795 100644
--- a/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/test/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParametersTest.java
+++ b/dolphinscheduler-task-plugin/dolphinscheduler-task-api/src/test/java/org/apache/dolphinscheduler/plugin/task/api/parameters/SqlParametersTest.java
@@ -19,11 +19,11 @@ package org.apache.dolphinscheduler.plugin.task.api.parameters;
 
 import static org.junit.Assert.assertNotNull;
 
+import org.apache.dolphinscheduler.plugin.task.api.enums.DataType;
 import org.apache.dolphinscheduler.plugin.task.api.enums.Direct;
 import org.apache.dolphinscheduler.plugin.task.api.model.Property;
-import org.apache.dolphinscheduler.plugin.task.api.enums.DataType;
 
-import org.apache.commons.collections.CollectionUtils;
+import org.apache.commons.collections4.CollectionUtils;
 
 import java.util.ArrayList;
 import java.util.List;
diff --git a/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleExecuteSql.java b/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleExecuteSql.java
index 8e49df09e1..c0a2255370 100644
--- a/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleExecuteSql.java
+++ b/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleExecuteSql.java
@@ -26,6 +26,7 @@ import java.util.Date;
  * RuleExecuteSql
  */
 public class DqRuleExecuteSql implements Serializable {
+
     /**
      * primary key
      */
@@ -59,7 +60,7 @@ public class DqRuleExecuteSql implements Serializable {
      */
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -136,4 +137,4 @@ public class DqRuleExecuteSql implements Serializable {
                 + ", updateTime=" + updateTime
                 + '}';
     }
-}
\ No newline at end of file
+}
diff --git a/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleInputEntry.java b/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleInputEntry.java
index 0e88b6e430..246acf20b3 100644
--- a/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleInputEntry.java
+++ b/dolphinscheduler-task-plugin/dolphinscheduler-task-dataquality/src/main/java/org/apache/dolphinscheduler/plugin/task/dq/rule/entity/DqRuleInputEntry.java
@@ -28,6 +28,7 @@ import java.util.Date;
  * RuleInputEntry
  */
 public class DqRuleInputEntry implements Serializable {
+
     /**
      * primary key
      */
@@ -102,7 +103,7 @@ public class DqRuleInputEntry implements Serializable {
      */
     private Date updateTime;
 
-    public int getId() {
+    public Integer getId() {
         return id;
     }
 
@@ -269,4 +270,4 @@ public class DqRuleInputEntry implements Serializable {
                 + ", updateTime=" + updateTime
                 + '}';
     }
-}
\ No newline at end of file
+}
diff --git a/pom.xml b/pom.xml
index 1e50eb29c9..87c9d588fd 100644
--- a/pom.xml
+++ b/pom.xml
@@ -450,6 +450,7 @@
                                 <workingDirectory>${project.basedir}</workingDirectory>
                                 <arguments>
                                     <argument>build</argument>
+                                    <argument>--load</argument>
                                     <argument>--no-cache</argument>
                                     <argument>-t</argument>
                                     <argument>${docker.hub}/${docker.repo}:${docker.tag}</argument>
diff --git a/tools/dependencies/known-dependencies.txt b/tools/dependencies/known-dependencies.txt
index 57c4fb1c55..427c87c4bd 100755
--- a/tools/dependencies/known-dependencies.txt
+++ b/tools/dependencies/known-dependencies.txt
@@ -5,6 +5,7 @@ LatencyUtils-2.0.3.jar
 SparseBitSet-1.2.jar
 accessors-smart-2.4.8.jar
 activation-1.1.jar
+aircompressor-0.3.jar
 animal-sniffer-annotations-1.14.jar
 annotations-13.0.jar
 apacheds-i18n-2.0.0-M15.jar
@@ -13,7 +14,7 @@ api-asn1-api-1.0.0-M20.jar
 api-util-1.0.0-M20.jar
 asm-9.1.jar
 aspectjweaver-1.9.7.jar
-audience-annotations-0.5.0.jar
+audience-annotations-0.12.0.jar
 automaton-1.11-8.jar
 avro-1.7.4.jar
 aws-java-sdk-core-1.12.160.jar
@@ -51,15 +52,15 @@ commons-math3-3.1.1.jar
 commons-net-3.1.jar
 commons-pool-1.6.jar
 commons-text-1.8.jar
-cron-utils-9.1.3.jar
+cron-utils-9.1.6.jar
 curator-client-4.3.0.jar
 curator-framework-4.3.0.jar
 curator-recipes-4.3.0.jar
 curator-test-2.12.0.jar
 curvesapi-1.06.jar
-datanucleus-api-jdo-4.2.1.jar
-datanucleus-core-4.1.6.jar
-datanucleus-rdbms-4.1.7.jar
+datanucleus-api-jdo-4.2.4.jar
+datanucleus-core-4.1.17.jar
+datanucleus-rdbms-4.1.19.jar
 derby-10.14.2.0.jar
 druid-1.2.4.jar
 error_prone_annotations-2.5.1.jar
@@ -67,29 +68,28 @@ generex-1.0.2.jar
 gson-2.9.1.jar
 guava-24.1-jre.jar
 guava-retrying-2.0.0.jar
-h2-1.4.200.jar
-hadoop-annotations-2.7.3.jar
-hadoop-auth-2.7.3.jar
-hadoop-client-2.7.3.jar
-hadoop-common-2.7.3.jar
-hadoop-hdfs-2.7.3.jar
-hadoop-mapreduce-client-app-2.7.3.jar
-hadoop-mapreduce-client-common-2.7.3.jar
-hadoop-mapreduce-client-core-2.7.3.jar
-hadoop-mapreduce-client-jobclient-2.7.3.jar
-hadoop-yarn-api-2.7.3.jar
-hadoop-yarn-client-2.7.3.jar
-hadoop-yarn-common-2.7.3.jar
-hadoop-yarn-server-common-2.7.3.jar
+h2-2.1.210.jar
+hadoop-annotations-2.7.7.jar
+hadoop-auth-2.7.7.jar
+hadoop-client-2.7.7.jar
+hadoop-common-2.7.7.jar
+hadoop-hdfs-2.7.7.jar
+hadoop-mapreduce-client-app-2.7.7.jar
+hadoop-mapreduce-client-common-2.7.7.jar
+hadoop-mapreduce-client-core-2.7.7.jar
+hadoop-mapreduce-client-jobclient-2.7.7.jar
+hadoop-yarn-api-2.7.7.jar
+hadoop-yarn-client-2.7.7.jar
+hadoop-yarn-common-2.7.7.jar
+hadoop-yarn-server-common-2.7.7.jar
 hibernate-validator-6.2.2.Final.jar
-hive-common-2.1.0.jar
-hive-jdbc-2.1.0.jar
-hive-metastore-2.1.0.jar
-hive-orc-2.1.0.jar
-hive-serde-2.1.0.jar
-hive-service-2.1.0.jar
-hive-service-rpc-2.1.0.jar
-hive-storage-api-2.1.0.jar
+hive-common-2.3.3.jar
+hive-jdbc-2.3.3.jar
+hive-metastore-2.3.3.jar
+hive-serde-2.3.3.jar
+hive-service-2.3.3.jar
+hive-service-rpc-2.3.3.jar
+hive-storage-api-2.4.0.jar
 htrace-core-3.1.0-incubating.jar
 httpasyncclient-4.1.5.jar
 httpclient-4.5.13.jar
@@ -135,22 +135,24 @@ jetty-security-9.4.48.v20220622.jar
 jetty-server-9.4.48.v20220622.jar
 jetty-servlet-9.4.48.v20220622.jar
 jetty-servlets-9.4.48.v20220622.jar
+jetty-sslengine-6.1.26.jar
 jetty-util-6.1.26.jar
 jetty-util-9.4.48.v20220622.jar
 jetty-util-ajax-9.4.48.v20220622.jar
 jetty-webapp-9.4.48.v20220622.jar
 jetty-xml-9.4.48.v20220622.jar
-jline-0.9.94.jar
+jline-2.12.jar
 jmespath-java-1.12.160.jar
 jna-5.10.0.jar
 jna-platform-5.10.0.jar
 joda-time-2.10.13.jar
 jpam-1.1.jar
 jsch-0.1.55.jar
+json-1.8.jar
 json-path-2.7.0.jar
 json-smart-2.4.8.jar
 jsp-api-2.1.jar
-jsqlparser-2.1.jar
+jsqlparser-4.4.jar
 jsr305-3.0.0.jar
 jta-1.1.jar
 jul-to-slf4j-1.7.36.jar
@@ -182,34 +184,45 @@ kubernetes-model-storageclass-5.10.2.jar
 libfb303-0.9.3.jar
 libthrift-0.9.3.jar
 log4j-1.2-api-2.17.2.jar
-log4j-1.2.17.jar
 logback-classic-1.2.11.jar
 logback-core-1.2.11.jar
 logging-interceptor-4.9.3.jar
 lz4-1.3.0.jar
 mapstruct-1.3.1.Final.jar
+metrics-core-4.2.11.jar
 micrometer-core-1.9.3.jar
 micrometer-registry-prometheus-1.9.3.jar
 mssql-jdbc-6.1.0.jre8.jar
-mybatis-3.5.2.jar
-mybatis-plus-3.2.0.jar
-mybatis-plus-annotation-3.2.0.jar
-mybatis-plus-boot-starter-3.2.0.jar
-mybatis-plus-core-3.2.0.jar
-mybatis-plus-extension-3.2.0.jar
-mybatis-spring-2.0.2.jar
+mybatis-3.5.10.jar
+mybatis-plus-3.5.2.jar
+mybatis-plus-annotation-3.5.2.jar
+mybatis-plus-boot-starter-3.5.2.jar
+mybatis-plus-core-3.5.2.jar
+mybatis-plus-extension-3.5.2.jar
+mybatis-spring-2.0.7.jar
 netty-3.6.2.Final.jar
 netty-all-4.1.53.Final.jar
+netty-buffer-4.1.53.Final.jar
+netty-codec-4.1.53.Final.jar
+netty-common-4.1.53.Final.jar
+netty-handler-4.1.53.Final.jar
+netty-resolver-4.1.53.Final.jar
+netty-tcnative-2.0.48.Final.jar
+netty-tcnative-classes-2.0.53.Final.jar
+netty-transport-4.1.53.Final.jar
+netty-transport-native-epoll-4.1.53.Final.jar
+netty-transport-native-unix-common-4.1.53.Final.jar
 okhttp-3.14.9.jar
 okio-1.17.2.jar
 opencsv-2.3.jar
+orc-core-1.3.3.jar
 oshi-core-6.1.1.jar
 paranamer-2.3.jar
 parquet-hadoop-bundle-1.8.1.jar
 poi-4.1.2.jar
 poi-ooxml-4.1.2.jar
 poi-ooxml-schemas-4.1.2.jar
-postgresql-42.3.4.jar
+postgresql-42.4.1.jar
 presto-jdbc-0.238.1.jar
 protobuf-java-2.5.0.jar
 protostuff-api-1.7.2.jar
@@ -226,8 +239,7 @@ simpleclient_tracer_otel-0.15.0.jar
 simpleclient_tracer_otel_agent-0.15.0.jar
 slf4j-api-1.7.36.jar
 snakeyaml-1.30.jar
-snappy-0.2.jar
-snappy-java-1.0.4.1.jar
+snappy-java-1.1.8.4.jar
 spring-aop-5.3.22.jar
 spring-beans-5.3.19.jar
 spring-boot-2.7.3.jar
@@ -299,4 +311,5 @@ xmlenc-0.52.jar
 zeppelin-client-0.10.1.jar
 zeppelin-common-0.10.1.jar
 zjsonpatch-0.3.0.jar
-zookeeper-3.4.14.jar
+zookeeper-3.8.0.jar
+zookeeper-jute-3.8.0.jar