Posted to commits@kylin.apache.org by xx...@apache.org on 2023/03/24 02:46:35 UTC

[kylin] branch kylin5-alpha updated (e4d40d990d -> 0017ab1b54)

This is an automated email from the ASF dual-hosted git repository.

xxyu pushed a change to branch kylin5-alpha
in repository https://gitbox.apache.org/repos/asf/kylin.git


 discard e4d40d990d Create first release of Kylin 5.X
     add 8c213bdab1 KYLIN-5407 KYLIN-5467 KYLIN-5468 KYLIN-5469 KYLIN-5470 KYLIN-5471 KYLIN-5476  (#2103)
     new 0017ab1b54 Create first release of Kylin 5.X

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (e4d40d990d)
            \
             N -- N -- N   refs/heads/kylin5-alpha (0017ab1b54)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
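The divergent history described above can be reproduced locally. The following is a minimal sketch in a throwaway repository (commit messages B, O, and N are placeholders mirroring the diagram, not revisions from this push):

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email "dev@example.com"
git config user.name "dev"
git commit -q --allow-empty -m "B"   # common base B
git commit -q --allow-empty -m "O"   # old revision O, later discarded
old=$(git rev-parse HEAD)
git reset -q --hard HEAD~1           # undo O: back to the common base B
git commit -q --allow-empty -m "N"   # new revision N replaces O
# a subsequent `git push --force` would move the remote branch to N;
# O is then no longer reachable from the branch:
git merge-base --is-ancestor "$old" HEAD \
  && echo "O still reachable" || echo "O discarded"
```

After the force push, O survives only if some other reference (a tag, another branch) still points at it; otherwise it is eventually garbage-collected, which is exactly the "discard ... gone forever" case above.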


Summary of changes:
 kystudio/src/assets/styles/main.less               |    5 +-
 .../common/GlobalDialog/dialog/detail_dialog.vue   |    6 +-
 .../common/ModelERDiagram/ModelERDiagram.vue       |  278 ++-
 .../components/common/ModelERDiagram/handler.js    |  114 +-
 .../components/common/ModelERDiagram/locales.js    |    9 +-
 .../common/ModelTools/ModelNavigationTools.vue     |  187 ++
 .../guide/modelEditPage/ActionUpdateGuide.vue      |  235 ++-
 .../components/layout/layout_left_right_top.vue    |   14 -
 kystudio/src/components/monitor/batchJobs/jobs.vue |    4 +
 .../studio/StudioModel/ModelEdit/config.js         |   44 +-
 .../studio/StudioModel/ModelEdit/index.vue         | 1914 ++++++++++----------
 .../studio/StudioModel/ModelEdit/layout.js         |  151 +-
 .../studio/StudioModel/ModelEdit/locales.js        |   10 +-
 .../studio/StudioModel/ModelEdit/model.js          |  108 +-
 .../studio/StudioModel/ModelEdit/table.js          |    1 +
 .../StudioModel/ModelList/AggregateModal/index.vue |   75 +-
 .../ModelList/Components/ModelTitleDescription.vue |  217 +++
 .../ModelList/GuideModal/GuideModal.vue            |    2 +-
 .../ModelList/ModelActions/modelActions.vue        |   55 +-
 .../StudioModel/ModelList/ModelAggregate/index.vue |   21 +-
 .../ModelAggregateView/AggAdvancedModal/index.vue  |    5 -
 .../ModelList/ModelBuildModal/build.vue            |   45 +-
 .../ModelList/ModelLayout/modelLayout.vue          |   63 +-
 .../ModelList/ModelOverview/ModelOverview.vue      |   30 +-
 .../studio/StudioModel/ModelList/index.vue         |  150 +-
 .../studio/StudioModel/ModelList/locales.js        |   14 -
 .../StudioModel/TableIndexEdit/tableindex_edit.vue |   39 +-
 kystudio/src/directive/index.js                    |   28 +-
 kystudio/src/router/index.js                       |    2 +-
 kystudio/src/service/model.js                      |    3 +
 kystudio/src/service/monitor.js                    |    6 -
 kystudio/src/store/model.js                        |    3 +
 kystudio/src/store/monitor.js                      |    3 -
 kystudio/src/store/types.js                        |    2 +-
 kystudio/src/util/dataGenerator.js                 |   15 +-
 kystudio/src/util/domHelper.js                     |    5 +-
 kystudio/src/util/plumb.js                         |   25 +-
 37 files changed, 2373 insertions(+), 1515 deletions(-)
 create mode 100644 kystudio/src/components/common/ModelTools/ModelNavigationTools.vue
 create mode 100644 kystudio/src/components/studio/StudioModel/ModelList/Components/ModelTitleDescription.vue


[kylin] 01/01: Create first release of Kylin 5.X

Posted by xx...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

xxyu pushed a commit to branch kylin5-alpha
in repository https://gitbox.apache.org/repos/asf/kylin.git

commit 0017ab1b54902147e3e42baa1235aec7f0dec108
Author: XiaoxiangYu <xx...@apache.org>
AuthorDate: Fri Mar 24 10:29:13 2023 +0800

    Create first release of Kylin 5.X
---
 build/bin/upgrade.sh                               |  10 +-
 build/release/compress.sh                          |   8 +-
 build/release/package.sh                           |  14 +-
 build/release/prepare.sh                           |   2 +-
 build/release/release-pipeline-docker/README.md    |  92 ++++++++++
 .../release-pipeline-docker/release-in-docker.sh   |  51 ++++++
 .../release-machine/Dockerfile                     |  67 ++++++++
 .../release-machine/conf/setenv.sh                 |  37 ++++
 .../release-machine/conf/settings.xml              |  64 +++++++
 .../release-machine/create-release-machine.sh      |  27 +++
 .../release-machine/script/entrypoint.sh           |  34 ++++
 .../release-machine/script/release-publish.sh      | 188 +++++++++++++++++++++
 build/release/release.sh                           |   2 +-
 ...influxdb.sh => disabled-check-1900-influxdb.sh} |   0
 ...khouse.sh => disabled-check-3000-clickhouse.sh} |   0
 build/sbin/download-spark-user.sh                  |  65 +++++++
 build/sbin/prepare-hadoop-conf-dir.sh              |   2 +-
 .../src/components/setting/SettingModel/locales.js |   2 +-
 pom.xml                                            |  64 +++----
 src/assembly/source-assembly.xml                   | 114 +++++++++++++
 .../src/main/resources/config/config_library.csv   |  10 +-
 .../org/apache/kylin/rest/cache/RedisCache.java    |   2 +-
 .../apache/kylin/tool/garbage/StorageCleaner.java  |   2 +-
 .../java/org/apache/kylin/common/KylinVersion.java |   2 +-
 .../org/apache/kylin/common/msg/CnMessage.java     | 109 +-----------
 .../java/org/apache/kylin/common/msg/Message.java  |  73 +++-----
 .../resources/kylin_error_msg_conf_cn.properties   |   2 +-
 .../resources/kylin_error_msg_conf_en.properties   |   2 +-
 .../kylin/common/exception/code/ErrorCodeTest.java |   6 +-
 .../job/impl/threadpool/NDefaultScheduler.java     |   3 -
 .../org/apache/kylin/job/lock/ZookeeperUtil.java   |   1 -
 .../apache/kylin/measure/topn/TopNCounterTest.java |   2 +-
 .../metadata/cube/model/NDataflowManagerTest.java  |   1 -
 .../cube/planner/CostBasePlannerUtilsTest.java     |  17 ++
 .../src/main/resources/config/config_library.csv   |  10 +-
 src/examples/LICENSE                               |  10 --
 src/jdbc/pom.xml                                   |  15 --
 .../java/org/apache/kylin/jdbc/KylinClient.java    |   2 +-
 .../initialize/JobSchedulerListenerTest.java       |   2 +-
 .../apache/kylin/query/engine/QueryExecTest.java   |   2 -
 .../src/main/resources/config/config_library.csv   |  10 +-
 .../apache/kylin/query/relnode/ContextUtil.java    |   1 -
 .../kap/query/optrule/ExtensionOlapJoinRule.java   |   1 -
 .../kyligence/kap/query/optrule/KapJoinRule.java   |   2 +-
 .../kap/query/optrule/SumBasicOperatorRule.java    |   1 -
 .../apache/kylin/query/engine/SQLConverter.java    |   1 -
 .../kylin/query/relnode/ContextUtilTest.java       |   1 -
 .../kylin/query/util/ExpressionComparatorTest.java |   1 -
 .../org/apache/kylin/query/util/QueryUtilTest.java |   4 +-
 .../org/apache/kylin/rest/BootstrapServer.java     |   2 +-
 .../org/apache/kylin/rest/InitConfiguration.java   |   2 +-
 .../apache/kylin/rest/config/SwaggerConfig.java    |   2 +-
 .../src/main/resources/config/config_library.csv   |  10 +-
 src/systools/pom.xml                               |   0
 src/systools/src/test/resources/ehcache.xml        |   0
 .../kylin/tool/AbstractInfoExtractorToolTest.java  |   4 +-
 .../kylin/tool/upgrade/UpdateUserAclToolTest.java  |   8 +-
 .../apache/kylin/tool/util/ServerInfoUtilTest.java |   2 +-
 src/tool/src/test/resources/diag/eventlog          |   2 +-
 59 files changed, 877 insertions(+), 293 deletions(-)

diff --git a/build/bin/upgrade.sh b/build/bin/upgrade.sh
index 36c6ae0017..7a98a17d5a 100644
--- a/build/bin/upgrade.sh
+++ b/build/bin/upgrade.sh
@@ -20,7 +20,7 @@
 function help() {
     echo "Usage: upgrade.sh <OLD_KYLIN_HOME> [--silent]"
     echo
-    echo "<OLD_KYLIN_HOME>    Specify the old version of the Kyligence Enterprise"
+    echo "<OLD_KYLIN_HOME>    Specify the old version of the Kylin 5.0"
     echo "                    installation directory."
     echo
     echo "--silent            Optional, don't enter interactive mode, automatically complete the upgrade."
@@ -52,7 +52,7 @@ function logging() {
 
 function fail() {
     error "...................................................[FAIL]"
-    error "Upgrade Kyligence Enterprise failed."
+    error "Upgrade Kylin 5.0 failed."
     recordKylinUpgradeResult "${START_TIME}" "false" "${NEW_KYLIN_HOME}"
     exit 1
 }
@@ -94,7 +94,7 @@ function upgrade() {
     if [[ -f ${OLD_KYLIN_HOME}/pid ]]; then
         PID=`cat ${OLD_KYLIN_HOME}/pid`
         if ps -p $PID > /dev/null; then
-          error "Please stop the Kyligence Enterprise during the upgrade process."
+          error "Please stop the Kylin 5.0 during the upgrade process."
           exit 1
         fi
     fi
@@ -111,7 +111,7 @@ function upgrade() {
     origin_version=$(awk '{print $NF}' ${OLD_KYLIN_HOME}/VERSION)
     target_version=$(awk '{print $NF}' ${NEW_KYLIN_HOME}/VERSION)
     echo
-    logging "warn" "Upgrade Kyligence Enterprise from ${origin_version} to ${target_version}"
+    logging "warn" "Upgrade Kylin 5.0 from ${origin_version} to ${target_version}"
     warn "Old KYLIN_HOME is ${OLD_KYLIN_HOME}, log is at ${upgrade_log}"
     echo
 
@@ -307,7 +307,7 @@ if [[ -z $OLD_KYLIN_HOME ]] || [[ ! -d $OLD_KYLIN_HOME ]]; then
 fi
 
 if [[ $OLD_KYLIN_HOME == $NEW_KYLIN_HOME ]]; then
-    error "Please specify the old version of the Kyligence Enterprise installation directory."
+    error "Please specify the old version of the Kylin 5.0 installation directory."
     help
 fi
 
diff --git a/build/release/compress.sh b/build/release/compress.sh
index bf9d584b92..079295db75 100755
--- a/build/release/compress.sh
+++ b/build/release/compress.sh
@@ -76,13 +76,17 @@ cp -rf conf/setenv.sh ${package_name}/conf/setenv.sh.template
 cp -rf bin/ ${package_name}/bin/
 cp -rf sbin/ ${package_name}/sbin/
 
+spark_version_pom=`mvn -f ../pom.xml help:evaluate -Dexpression=spark.version | grep -E '^[0-9]+\.[0-9]+\.[0-9]+' `
+echo "Prepare download spark scripts for end user with version ${spark_version_pom}."
+sed -i "s/SPARK_VERSION_IN_BINARY/${spark_version_pom}/g" ../build/sbin/download-spark-user.sh
+
 rm -rf ext lib commit_SHA1 VERSION # keep the spark folder on purpose
 
 cp -rf server/webapp/dist ${package_name}/server/public
 cp -rf server/newten.jar ${package_name}/server/
 cp -rf server/jars ${package_name}/server/
 cp -rf deploy/.keystore ${package_name}/server/
-mv ${package_name}/server/jars/log4j* ${package_name}/spark/jars/
+# mv ${package_name}/server/jars/log4j* ${package_name}/spark/jars/
 rm -rf server/
 
 ## comment all default properties, and append them to the user visible kylin.properties
@@ -92,7 +96,7 @@ sed '1,21d' ../src/core-common/src/main/resources/kylin-defaults0.properties | a
 find ${package_name} -type d -exec chmod 755 {} \;
 find ${package_name} -type f -exec chmod 644 {} \;
 find ${package_name} -type f -name "*.sh" -exec chmod 755 {} \;
-find ${package_name}/spark -type f -exec chmod 755 {} \;
+# find ${package_name}/spark -type f -exec chmod 755 {} \;
 
 if [[ -d "${package_name}/postgresql" ]]; then
     find ${package_name}/influxdb -type f -exec chmod 755 {} \;
diff --git a/build/release/package.sh b/build/release/package.sh
index 758ab0f490..2befcb8415 100755
--- a/build/release/package.sh
+++ b/build/release/package.sh
@@ -49,24 +49,24 @@ echo "${KYLIN_VERSION_NAME}" > build/VERSION
 echo "VERSION file content:" ${KYLIN_VERSION_NAME}
 
 echo "BUILD STAGE 2 - Build binaries..."
-sh build/release/build.sh $@             || { exit 1; }
+bash build/release/build.sh $@             || { exit 1; }
 
 if [[ "${WITH_SPARK}" = "1" ]]; then
     echo "BUILD STAGE 3 - Prepare spark..."
-    sh -x build/release/download-spark.sh      || { exit 1; }
+    bash -x build/release/download-spark.sh      || { exit 1; }
 else
     rm -rf build/spark
 fi
 
 if [[ "${WITH_THIRDPARTY}" = "1" ]]; then
     echo "BUILD STAGE 4 - Prepare influxdb..."
-    sh build/release/download-influxdb.sh      || { exit 1; }
+    bash build/release/download-influxdb.sh      || { exit 1; }
 
     echo "BUILD STAGE 5 - Prepare grafana..."
-    sh build/release/download-grafana.sh      || { exit 1; }
+    bash build/release/download-grafana.sh      || { exit 1; }
 
     echo "BUILD STAGE 6 - Prepare postgresql..."
-    sh build/release/download-postgresql.sh      || { exit 1; }
+    bash build/release/download-postgresql.sh      || { exit 1; }
 else
     echo "BUILD STAGE 4-6 is skipped ..."
     rm -rf build/influxdb
@@ -75,8 +75,8 @@ else
 fi
 
 echo "BUILD STAGE 7 - Prepare and compress package..."
-sh build/release/prepare.sh                || { exit 1; }
-sh build/release/compress.sh               || { exit 1; }
+bash build/release/prepare.sh                || { exit 1; }
+bash build/release/compress.sh               || { exit 1; }
 
 echo "BUILD STAGE 8 - Clean up..."
 
diff --git a/build/release/prepare.sh b/build/release/prepare.sh
index 77093bc98e..08129bcc5c 100755
--- a/build/release/prepare.sh
+++ b/build/release/prepare.sh
@@ -24,7 +24,7 @@ cd ${dir}/../..
 source build/release/functions.sh
 exportProjectVersions
 
-sh build/release/prepare-libs.sh || { exit 1; }
+bash build/release/prepare-libs.sh || { exit 1; }
 
 #create ext dir
 mkdir -p build/ext
diff --git a/build/release/release-pipeline-docker/README.md b/build/release/release-pipeline-docker/README.md
new file mode 100644
index 0000000000..6f900c3616
--- /dev/null
+++ b/build/release/release-pipeline-docker/README.md
@@ -0,0 +1,92 @@
+## Background
+
+These scripts and this Docker image provide an **easy and standard way** for the release manager to complete the [Apache release process](https://www.apache.org/legal/release-policy.html).
+
+Some of the source code is adapted from the [Apache Spark release scripts](https://github.com/apache/spark/tree/master/dev/create-release).
+
+## How to release
+
+### What you need to prepare
+
+| Item                                                                     | Used for                                                                   | Reference                                                                                             |
+|--------------------------------------------------------------------------|----------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------|
+| Apache Account<br/>(Should belong to a PMC member) <br> (id and password)  | 1. Write access to ASF's Gitbox service and SVN service <br> 2. Send email | https://id.apache.org                                                                                 |
+| GPG Key <br> (key files and GPG_PASSPHRASE)                              | Sign your release files (binary and compressed source files)               | https://infra.apache.org/release-signing.html <br> https://infra.apache.org/release-distribution.html |
+| A laptop with Docker installed                                            | Where you run the release scripts                                          | N/A                                                                                                   |
+
+### Step 1 : Configure Basic Info and Copy GPG Private Key
+
+- Start the Docker container
+
+```bash
+# you may use a custom container name other than 'release-machine'
+docker run --name release-machine --hostname release-machine -i -t apachekylin/release-machine:5-alpha  bash
+# docker ps -f name=release-machine
+```
+
+- Copy the GPG Private Key from your laptop into the container
+
+```bash
+# ~/xxyu-release-manager.private.key is your private key
+docker cp ~/xxyu-release-manager.private.key release-machine:/root
+```
+
+### Step 2 : Configure setenv.sh
+
+- Set correct values for all variables in `/root/scripts/setenv.sh`, such as **ASF_PASSWORD** and **GPG_PASSPHRASE**.
+
+#### Variables in setenv.sh
+
+| Name            | Comment                                                                                                                                                                          |
+|-----------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| ASF_USERNAME    | ID of Apache Account                                                                                                                                                             |
+| ASF_PASSWORD    | (**Never leak this**) Password of the Apache Account                                                                                                                             |
+| GPG_PASSPHRASE  | (**Never leak this**) Passphrase of the GPG Key                                                                                                                                  |
+| GIT_BRANCH      | The branch you release from, default is **kylin5**                                                                                                                               |
+| RELEASE_VERSION | The version you want to release, default is **kylin5.0.0-alpha**                                                                                                                 |
+| NEXT_VERSION    | The next version to use after the release, default is **kylin5.0.0-beta**                                                                                                        |
+| RELEASE_STEP    | (default is **publish-rc**)<br/>Legal values are <br/> publish-rc : upload the binary to the release candidate folder <br> publish : publish the release binary officially after the vote passes |
+
+
+
+#### Add **servers** to `~/.m2/settings.xml`
+
+Otherwise, the maven-deploy-plugin will fail when deploying artifacts.
+
+### Step 3 : Install GPG Private Key
+
+```bash
+gpg --import xxyu-release-manager.private.key
+```
+
+```bash
+# verify the import; use the name associated with your private key
+gpg --list-sigs Xiaoxiang Yu
+```
+
+### Step 4 : Publish Release Candidate
+
+```bash
+export RELEASE_STEP=publish-rc
+bash release-publish.sh
+```
+
+### Step 5 : Vote for Release Candidate
+
+- Prepare vote template for voting
+
+### Step 6 : Publish Release
+
+```bash
+export RELEASE_STEP=publish
+bash release-publish.sh
+```
+
+- Prepare the announcement template
+- Close the Maven staging repository
+
+### Step 7 : Remove Docker container
+
+```bash
+docker rm release-machine
+```
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-in-docker.sh b/build/release/release-pipeline-docker/release-in-docker.sh
new file mode 100644
index 0000000000..fdc14146db
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-in-docker.sh
@@ -0,0 +1,51 @@
+#!/bin/bash
+
+#
+# /*
+#  * Licensed to the Apache Software Foundation (ASF) under one
+#  * or more contributor license agreements.  See the NOTICE file
+#  * distributed with this work for additional information
+#  * regarding copyright ownership.  The ASF licenses this file
+#  * to you under the Apache License, Version 2.0 (the
+#  * "License"); you may not use this file except in compliance
+#  * with the License.  You may obtain a copy of the License at
+#  *
+#  *     http://www.apache.org/licenses/LICENSE-2.0
+#  *
+#  * Unless required by applicable law or agreed to in writing, software
+#  * distributed under the License is distributed on an "AS IS" BASIS,
+#  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  * See the License for the specific language governing permissions and
+#  * limitations under the License.
+#  */
+#
+
+## Refer to https://github.com/apache/spark/tree/master/dev/create-release
+
+ENVFILE="env.list"
+cat > $ENVFILE <<EOF
+DRY_RUN=$DRY_RUN
+GIT_BRANCH=$GIT_BRANCH
+NEXT_RELEASE_VERSION=$NEXT_RELEASE_VERSION
+RELEASE_VERSION=$RELEASE_VERSION
+RELEASE_TAG=$RELEASE_TAG
+GIT_REF=$GIT_REF
+GIT_REPO_URL=$GIT_REPO_URL
+GIT_NAME=$GIT_NAME
+GIT_EMAIL=$GIT_EMAIL
+GPG_KEY=$GPG_KEY
+ASF_USERNAME=$ASF_USERNAME
+ASF_PASSWORD=$ASF_PASSWORD
+GPG_PASSPHRASE=$GPG_PASSPHRASE
+USER=$USER
+EOF
+
+docker stop kylin-release-machine
+docker rm kylin-release-machine
+
+docker run -i \
+  --env-file "$ENVFILE" \
+  --name kylin-release-machine \
+  apachekylin/release-machine:jdk8-slim
+
+docker cp kylin-release-machine:/root/ci/apache-kylin-bin.tar.gz ../../apache-kylin-bin.tar.gz
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-machine/Dockerfile b/build/release/release-pipeline-docker/release-machine/Dockerfile
new file mode 100644
index 0000000000..c212eb9410
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-machine/Dockerfile
@@ -0,0 +1,67 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Docker image for Kylin 5.X release
+FROM openjdk:8-slim
+
+ENV M2_HOME /root/apache-maven-3.8.8
+ENV PATH $PATH:$M2_HOME/bin
+USER root
+
+WORKDIR /root
+
+# install tools
+RUN set -eux; \
+	apt-get update; \
+	apt-get install -y --no-install-recommends lsof wget tar
+
+RUN set -eux; \
+	apt-get update; \
+	apt-get install -y --no-install-recommends git unzip subversion
+
+RUN set -eux; \
+	apt-get update; \
+	apt-get install -y --no-install-recommends curl vim ssh
+
+RUN set -eux; \
+	apt-get update; \
+	apt-get install -y --no-install-recommends gcc g++ make
+
+# install Node JS
+RUN curl -sL https://deb.nodesource.com/setup_12.x | bash - \
+    && apt-get update \
+    && apt-get install -y --no-install-recommends nodejs
+
+## =====================================
+
+# install maven
+RUN wget https://archive.apache.org/dist/maven/maven-3/3.8.8/binaries/apache-maven-3.8.8-bin.tar.gz \
+    && tar -zxvf apache-maven-3.8.8-bin.tar.gz \
+    && rm -f apache-maven-3.8.8-bin.tar.gz
+
+RUN mkdir /root/.m2
+COPY conf/settings.xml /root/.m2/settings.xml
+
+COPY script/entrypoint.sh /root/scripts/entrypoint.sh
+RUN chmod u+x /root/scripts/entrypoint.sh
+
+COPY script/release-publish.sh /root/release-publish.sh
+RUN chmod u+x /root/release-publish.sh
+
+COPY conf/setenv.sh /root/scripts/setenv.sh
+
+#ENTRYPOINT ["/root/scripts/entrypoint.sh"]
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-machine/conf/setenv.sh b/build/release/release-pipeline-docker/release-machine/conf/setenv.sh
new file mode 100644
index 0000000000..40f2d6b331
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-machine/conf/setenv.sh
@@ -0,0 +1,37 @@
+#
+# /*
+#  * Licensed to the Apache Software Foundation (ASF) under one
+#  * or more contributor license agreements.  See the NOTICE file
+#  * distributed with this work for additional information
+#  * regarding copyright ownership.  The ASF licenses this file
+#  * to you under the Apache License, Version 2.0 (the
+#  * "License"); you may not use this file except in compliance
+#  * with the License.  You may obtain a copy of the License at
+#  *
+#  *     http://www.apache.org/licenses/LICENSE-2.0
+#  *
+#  * Unless required by applicable law or agreed to in writing, software
+#  * distributed under the License is distributed on an "AS IS" BASIS,
+#  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  * See the License for the specific language governing permissions and
+#  * limitations under the License.
+#  */
+#
+
+## Basic Info
+export DRY_RUN=0 # use in maven-release-plugin
+export RELEASE_VERSION=5.0.0-alpha
+export NEXT_VERSION=5.0.0-beta
+export GIT_BRANCH=kylin5
+export GIT_USERNAME=XiaoxiangYu
+# publish-rc / publish
+export RELEASE_STEP=publish-rc
+
+## Confidential Info
+export ASF_USERNAME=
+export ASF_PASSWORD=
+export GPG_PASSPHRASE=
+
+# https://stackoverflow.com/questions/57591432/gpg-signing-failed-inappropriate-ioctl-for-device-on-macos-with-maven
+GPG_TTY=$(tty)
+export GPG_TTY
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-machine/conf/settings.xml b/build/release/release-pipeline-docker/release-machine/conf/settings.xml
new file mode 100644
index 0000000000..cd2e3511cf
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-machine/conf/settings.xml
@@ -0,0 +1,64 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ /*
+  ~  * Licensed to the Apache Software Foundation (ASF) under one
+  ~  * or more contributor license agreements.  See the NOTICE file
+  ~  * distributed with this work for additional information
+  ~  * regarding copyright ownership.  The ASF licenses this file
+  ~  * to you under the Apache License, Version 2.0 (the
+  ~  * "License"); you may not use this file except in compliance
+  ~  * with the License.  You may obtain a copy of the License at
+  ~  *
+  ~  *     http://www.apache.org/licenses/LICENSE-2.0
+  ~  *
+  ~  * Unless required by applicable law or agreed to in writing, software
+  ~  * distributed under the License is distributed on an "AS IS" BASIS,
+  ~  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~  * See the License for the specific language governing permissions and
+  ~  * limitations under the License.
+  ~  */
+  -->
+<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
+          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
+
+    <mirrors>
+        <mirror>
+            <id>nexus-aliyun</id>
+            <mirrorOf>central</mirrorOf>
+            <name>Nexus Aliyun</name>
+            <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
+        </mirror>
+    </mirrors>
+
+    <profiles>
+        <profile>
+            <repositories>
+                <repository>
+                    <id>nexus</id>
+                    <name>local private nexus</name>
+                    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
+                    <releases>
+                        <enabled>true</enabled>
+                    </releases>
+                    <snapshots>
+                        <enabled>false</enabled>
+                    </snapshots>
+                </repository>
+            </repositories>
+            <pluginRepositories>
+                <pluginRepository>
+                    <id>nexus</id>
+                    <name>local private nexus</name>
+                    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
+                    <releases>
+                        <enabled>true</enabled>
+                    </releases>
+                    <snapshots>
+                        <enabled>true</enabled>
+                    </snapshots>
+                </pluginRepository>
+            </pluginRepositories>
+        </profile>
+    </profiles>
+</settings>
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-machine/create-release-machine.sh b/build/release/release-pipeline-docker/release-machine/create-release-machine.sh
new file mode 100644
index 0000000000..1305e0eb13
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-machine/create-release-machine.sh
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+#
+# /*
+#  * Licensed to the Apache Software Foundation (ASF) under one
+#  * or more contributor license agreements.  See the NOTICE file
+#  * distributed with this work for additional information
+#  * regarding copyright ownership.  The ASF licenses this file
+#  * to you under the Apache License, Version 2.0 (the
+#  * "License"); you may not use this file except in compliance
+#  * with the License.  You may obtain a copy of the License at
+#  *
+#  *     http://www.apache.org/licenses/LICENSE-2.0
+#  *
+#  * Unless required by applicable law or agreed to in writing, software
+#  * distributed under the License is distributed on an "AS IS" BASIS,
+#  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  * See the License for the specific language governing permissions and
+#  * limitations under the License.
+#  */
+#
+
+docker build -f Dockerfile -t release-machine:5-alpha .
+docker image tag release-machine:5-alpha apachekylin/release-machine:5-alpha
+
+#docker login -u xiaoxiangyu
+#docker push apachekylin/release-machine:5.0
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-machine/script/entrypoint.sh b/build/release/release-pipeline-docker/release-machine/script/entrypoint.sh
new file mode 100644
index 0000000000..43cdebed8c
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-machine/script/entrypoint.sh
@@ -0,0 +1,34 @@
+#!/bin/bash
+
+#
+# /*
+#  * Licensed to the Apache Software Foundation (ASF) under one
+#  * or more contributor license agreements.  See the NOTICE file
+#  * distributed with this work for additional information
+#  * regarding copyright ownership.  The ASF licenses this file
+#  * to you under the Apache License, Version 2.0 (the
+#  * "License"); you may not use this file except in compliance
+#  * with the License.  You may obtain a copy of the License at
+#  *
+#  *     http://www.apache.org/licenses/LICENSE-2.0
+#  *
+#  * Unless required by applicable law or agreed to in writing, software
+#  * distributed under the License is distributed on an "AS IS" BASIS,
+#  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  * See the License for the specific language governing permissions and
+#  * limitations under the License.
+#  */
+#
+
+# https://kylin.apache.org/5.0/docs/development/how_to_package
+# https://kylin.apache.org/5.0/docs/development/how_to_release
+
+echo "Checking env for package and release ..."
+
+node -v
+java -version
+mvn -v
+
+echo "Hello, release manager."
+
+# bash -x /root/release-publish.sh
\ No newline at end of file
diff --git a/build/release/release-pipeline-docker/release-machine/script/release-publish.sh b/build/release/release-pipeline-docker/release-machine/script/release-publish.sh
new file mode 100644
index 0000000000..51df8617dd
--- /dev/null
+++ b/build/release/release-pipeline-docker/release-machine/script/release-publish.sh
@@ -0,0 +1,188 @@
+#!/bin/bash
+
+#
+# /*
+#  * Licensed to the Apache Software Foundation (ASF) under one
+#  * or more contributor license agreements.  See the NOTICE file
+#  * distributed with this work for additional information
+#  * regarding copyright ownership.  The ASF licenses this file
+#  * to you under the Apache License, Version 2.0 (the
+#  * "License"); you may not use this file except in compliance
+#  * with the License.  You may obtain a copy of the License at
+#  *
+#  *     http://www.apache.org/licenses/LICENSE-2.0
+#  *
+#  * Unless required by applicable law or agreed to in writing, software
+#  * distributed under the License is distributed on an "AS IS" BASIS,
+#  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  * See the License for the specific language governing permissions and
+#  * limitations under the License.
+#  */
+#
+
+export LC_ALL=C.UTF-8
+export LANG=C.UTF-8
+set -e
+
+function run_command {
+  local BANNER="$1"
+  shift 1
+
+  echo "========================"
+  echo "==> $BANNER"
+  echo "Command: $@"
+
+  "$@" 2>&1
+
+  local EC=$?
+  if [ $EC != 0 ]; then
+    echo "Command FAILED : $@, please check!!!"
+    exit $EC
+  fi
+}
+
+####################################################
+####################################################
+#### Release Configuration
+
+source /root/scripts/setenv.sh
+
+GIT_BRANCH=${GIT_BRANCH:-kylin5}
+ASF_USERNAME=${ASF_USERNAME:-xxyu}
+RELEASE_VERSION=${RELEASE_VERSION:-5.0.0-alpha}
+NEXT_RELEASE_VERSION=${NEXT_RELEASE_VERSION:-5.0.0-beta}
+
+export working_dir=/root/kylin-release-folder
+export source_code_folder=$working_dir/kylin
+export svn_folder=$working_dir/svn
+export rc_name=apache-kylin-"${RELEASE_VERSION}"-rc
+export release_candidate_folder=$svn_folder/$rc_name
+
+export ASF_KYLIN_REPO="gitbox.apache.org/repos/asf/kylin.git"
+# GITHUB_REPO_URL=${GIT_REPO_URL:-https://github.com/apache/kylin.git}
+export RELEASE_STAGING_LOCATION="https://dist.apache.org/repos/dist/dev/kylin"
+export RELEASE_LOCATION="https://dist.apache.org/repos/dist/release/kylin"
+
+####################################################
+####################################################
+#### ASF Confidential
+
+echo "==> Check ASF confidential"
+
+if [[ -z "$ASF_PASSWORD" ]]; then
+  echo 'The environment variable ASF_PASSWORD is not set. Enter the password.'
+  echo
+  stty -echo && printf "ASF password: " && read ASF_PASSWORD && printf '\n' && stty echo
+fi
+
+if [[ -z "$GPG_PASSPHRASE" ]]; then
+  echo 'The environment variable GPG_PASSPHRASE is not set. Enter the passphrase to'
+  echo 'unlock the GPG signing key that will be used to sign the release!'
+  echo
+  stty -echo && printf "GPG passphrase: " && read GPG_PASSPHRASE && printf '\n' && stty echo
+fi
+
+echo "==> Init Git Configuration"
+mkdir -p $working_dir
+git config --global user.name "${GIT_USERNAME}"
+git config --global user.email "${ASF_USERNAME}"@apache.org
+git config --global user.password ${ASF_PASSWORD}
+
+####################################################
+####################################################
+#### Prepare source code
+
+if [[ "$RELEASE_STEP" == "publish-rc" ]]; then
+  echo "==> Clone kylin source for $RELEASE_VERSION"
+
+  cd $working_dir
+
+  if [ ! -d "${source_code_folder}" ]
+  then
+      echo "Clone source code to ${source_code_folder} ."
+      run_command "Clone Gitbox" git clone "https://$ASF_USERNAME:$ASF_PASSWORD@$ASF_KYLIN_REPO" -b "$GIT_BRANCH"
+  fi
+
+  if [ ! -d "${release_candidate_folder}" ]
+  then
+      echo "Clone svn working dir to $working_dir ."
+      run_command "Clone ASF SVN" svn co $RELEASE_STAGING_LOCATION $svn_folder
+  fi
+fi
+
+####################################################
+####################################################
+#### Publish maven artifact and source package
+
+if [[ "$RELEASE_STEP" == "publish-rc" ]]; then
+  echo "==> publish-release-candidate source code"
+  # Go to source directory
+  cd ${source_code_folder}
+
+  tag_exist=`git tag --list | grep kylin-"${RELEASE_VERSION}" | wc -l`
+  if [[ $tag_exist != 0 ]]; then
+     echo "Delete local tag"
+     git tag --delete kylin-"${RELEASE_VERSION}"
+  fi
+
+  ## Prepare tag & source tarball & upload maven artifact
+  # Use release-plugin to check license & build source package & build and upload maven artifact
+  # https://maven.apache.org/maven-release/maven-release-plugin/examples/prepare-release.html
+  # https://infra.apache.org/publishing-maven-artifacts.html
+  # Use `mvn release:clean`  if you want to prepare again
+  run_command "Maven Release Prepare" mvn -DskipTests -DreleaseVersion="${RELEASE_VERSION}" \
+    -DdevelopmentVersion="${NEXT_RELEASE_VERSION}"-SNAPSHOT -Papache-release,nexus -DdryRun=${DRY_RUN} \
+    -Darguments="-Dmaven.javadoc.skip=true -Dgpg.passphrase=${GPG_PASSPHRASE} -DskipTests" \
+    release:prepare
+  run_command "Maven Release Perform" mvn -DskipTests -Papache-release,nexus \
+    -Darguments="-Dmaven.javadoc.skip=true -Dgpg.passphrase=${GPG_PASSPHRASE} -DskipTests" \
+    release:perform
+
+  # Create a directory for this release candidate
+  mkdir -p ${release_candidate_folder}
+  # rm -rf target/apache-kylin-*ource-release.zip.asc.sha256
+
+  # Copy source code and signature of source code to release candidate directory
+  cp target/apache-kylin-*source-release.zip* "${release_candidate_folder}"
+fi
+
+####################################################
+####################################################
+#### Build Binary
+
+if [[ "$RELEASE_STEP" == "publish-rc" ]]; then
+  echo "==> Building kylin binary for $RELEASE_VERSION"
+  cd $source_code_folder
+  git pull -r origin "$GIT_BRANCH"
+
+  export release_version=$RELEASE_VERSION
+  run_command "Build binary" bash build/release/release.sh -official -noSpark
+
+  cp dist/apache-kylin-*.tar.gz "${release_candidate_folder}"
+fi
+
+####################################################
+####################################################
+#### Publish binary to release candidate folder
+
+if [[ "$RELEASE_STEP" == "publish-rc" ]]; then
+  ## Sign binary
+  echo "==> publish-release-candidate binary"
+  cd "${release_candidate_folder}"
+  run_command "Sign binary" gpg --armor --output apache-kylin-"${RELEASE_VERSION}"-bin.tar.gz.asc --detach-sig apache-kylin-${RELEASE_VERSION}-bin.tar.gz
+  shasum -a 256 apache-kylin-"${RELEASE_VERSION}"-bin.tar.gz > apache-kylin-${RELEASE_VERSION}-bin.tar.gz.sha256
+
+  ## Upload to svn repository
+  cd ${svn_folder}
+  svn add ${rc_name}
+  run_command "Publish release candidate dir" svn commit -m 'Check in release artifacts for '${rc_name}
+fi
+
+####################################################
+####################################################
+#### Publish binary to release folder after vote passed
+
+if [[ "$RELEASE_STEP" == "publish-release" ]]; then
+  echo "==> publish-release"
+  # todo
+fi
\ No newline at end of file
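For reference outside the patch context, the `run_command` wrapper introduced at the top of release-publish.sh can be exercised standalone. The sketch below (not part of the patch) reproduces the helper; note that the original script runs under `set -e`, so a failing step would abort before the explicit failure branch is reached, which is why this sketch omits `set -e` and uses `return` instead of `exit`:

```shell
#!/bin/bash
# Standalone sketch of the run_command pattern from release-publish.sh.
# Difference from the patch: no `set -e`, and `return` instead of `exit`,
# so the failure branch is reachable and the function can be sourced safely.
run_command() {
  local BANNER="$1"
  shift 1

  echo "========================"
  echo "==> $BANNER"
  echo "Command: $*"

  "$@" 2>&1

  local EC=$?
  if [ $EC != 0 ]; then
    echo "Command FAILED : $*, please check!!!"
    return $EC
  fi
}

run_command "Show current directory" pwd
```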
diff --git a/build/release/release.sh b/build/release/release.sh
index 9ddd41d35c..d880ec54ea 100755
--- a/build/release/release.sh
+++ b/build/release/release.sh
@@ -78,7 +78,7 @@ if [[ "${PACKAGE_OFFICIAL}" = "0" ]]; then
 fi
 export package_name="apache-kylin-${release_version}"
 
-sh build/release/package.sh $@ || { echo "package failed!"; exit 1; }
+bash build/release/package.sh $@ || { echo "package failed!"; exit 1; }
 
 echo "Release Version: ${release_version}"
 
diff --git a/build/sbin/check-1900-influxdb.sh b/build/sbin/disabled-check-1900-influxdb.sh
similarity index 100%
rename from build/sbin/check-1900-influxdb.sh
rename to build/sbin/disabled-check-1900-influxdb.sh
diff --git a/build/sbin/check-3000-clickhouse.sh b/build/sbin/disabled-check-3000-clickhouse.sh
similarity index 100%
rename from build/sbin/check-3000-clickhouse.sh
rename to build/sbin/disabled-check-3000-clickhouse.sh
diff --git a/build/sbin/download-spark-user.sh b/build/sbin/download-spark-user.sh
new file mode 100755
index 0000000000..80a6194ded
--- /dev/null
+++ b/build/sbin/download-spark-user.sh
@@ -0,0 +1,65 @@
+#!/bin/bash
+
+#
+# /*
+#  * Licensed to the Apache Software Foundation (ASF) under one
+#  * or more contributor license agreements.  See the NOTICE file
+#  * distributed with this work for additional information
+#  * regarding copyright ownership.  The ASF licenses this file
+#  * to you under the Apache License, Version 2.0 (the
+#  * "License"); you may not use this file except in compliance
+#  * with the License.  You may obtain a copy of the License at
+#  *
+#  *     http://www.apache.org/licenses/LICENSE-2.0
+#  *
+#  * Unless required by applicable law or agreed to in writing, software
+#  * distributed under the License is distributed on an "AS IS" BASIS,
+#  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  * See the License for the specific language governing permissions and
+#  * limitations under the License.
+#  */
+#
+
+## Because Apache Infra limits the size of a single file on the download site
+## (currently 650 MB), the release manager has to remove the Spark binary,
+## so Kylin end users should execute this script to download and unzip the Spark binary.
+## After removing the Spark binary, the Kylin binary is 476 MB as of 2023/03/09.
+## Check details at https://issues.apache.org/jira/browse/INFRA-24053
+
+source $(cd -P -- "$(dirname -- "$0")" && pwd -P)/header.sh
+
+if [[ -d ${KYLIN_HOME}/spark ]]; then
+    echo "Spark already exists, please check."
+    exit 1
+fi
+
+spark_version_in_binary=3.2.0-4.6.3.0
+spark_pkg_name=spark-newten-"`echo ${spark_version_in_binary}| sed "s/-kylin//g"`"
+spark_pkg_file_name="${spark_pkg_name}.tgz"
+
+echo "spark_pkg_file_name : "${spark_pkg_file_name}
+wget --directory-prefix=${KYLIN_HOME} https://s3.cn-north-1.amazonaws.com.cn/download-resource/kyspark/${spark_pkg_file_name} || echo "Download spark failed"
+
+mkdir -p ${KYLIN_HOME}/${spark_pkg_name}
+tar -zxf ${KYLIN_HOME}/${spark_pkg_file_name} -C ${KYLIN_HOME}/${spark_pkg_name} --strip-components 1 || { exit 1; }
+mv ${KYLIN_HOME}/${spark_pkg_name} ${KYLIN_HOME}/spark
+
+# Remove unused components in Spark
+rm -rf ${KYLIN_HOME}/spark/lib/spark-examples-*
+rm -rf ${KYLIN_HOME}/spark/examples
+rm -rf ${KYLIN_HOME}/spark/data
+rm -rf ${KYLIN_HOME}/spark/R
+rm -rf ${KYLIN_HOME}/spark/hive_1_2_2
+
+if [ ! -f ${KYLIN_HOME}/"hive_1_2_2.tar.gz" ]
+then
+    echo "no binary file found"
+    wget --directory-prefix=${KYLIN_HOME} https://s3.cn-north-1.amazonaws.com.cn/download-resource/kyspark/hive_1_2_2.tar.gz || echo "Download hive1 failed"
+else
+    if [ `calMd5 ${KYLIN_HOME}/hive_1_2_2.tar.gz | awk '{print $1}'` != "e8e86e086fb7e821d724ad0c19457a36" ]
+    then
+        echo "md5 check failed"
+        rm ${KYLIN_HOME}/hive_1_2_2.tar.gz
+        wget --directory-prefix=${KYLIN_HOME} https://s3.cn-north-1.amazonaws.com.cn/download-resource/kyspark/hive_1_2_2.tar.gz  || echo "Download hive1 failed"
+    fi
+fi
+tar -zxf ${KYLIN_HOME}/hive_1_2_2.tar.gz -C ${KYLIN_HOME}/spark/ || { exit 1; }
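The md5 branch above follows a verify-or-redownload pattern (via the `calMd5` helper defined elsewhere in the build scripts). A hypothetical standalone sketch of the same pattern, using `sha256sum` rather than the script's md5 helper:

```shell
#!/bin/bash
# Hypothetical sketch (not from the patch) of the verify-or-redownload
# pattern in download-spark-user.sh. Uses sha256sum; the script itself
# uses an md5 helper named calMd5 from header.sh.
verify_checksum() {
  local file="$1" expected="$2"
  local actual
  actual=$(sha256sum "$file" | awk '{print $1}')
  if [ "$actual" != "$expected" ]; then
    echo "checksum mismatch for $file, removing so it can be re-downloaded"
    rm -f "$file"
    return 1
  fi
  return 0
}

# Demo with a locally created file standing in for the downloaded tarball.
printf 'demo payload' > /tmp/demo_pkg.tgz
expected=$(sha256sum /tmp/demo_pkg.tgz | awk '{print $1}')
verify_checksum /tmp/demo_pkg.tgz "$expected" && echo "checksum OK"
```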
diff --git a/build/sbin/prepare-hadoop-conf-dir.sh b/build/sbin/prepare-hadoop-conf-dir.sh
index fa35b5a587..d6f3418ab9 100755
--- a/build/sbin/prepare-hadoop-conf-dir.sh
+++ b/build/sbin/prepare-hadoop-conf-dir.sh
@@ -155,7 +155,7 @@ function fetchKylinHadoopConf() {
           then
               echo "Hadoop conf directory currently generated based on manual mode."
           else
-              echo "Missing hadoop conf files. Please contact Kyligence technical support for more details."
+              echo "Missing hadoop conf files. Please contact the Apache Kylin community for more details."
               exit 1
           fi
         fi
diff --git a/kystudio/src/components/setting/SettingModel/locales.js b/kystudio/src/components/setting/SettingModel/locales.js
index 5391bd8455..59f49b4008 100644
--- a/kystudio/src/components/setting/SettingModel/locales.js
+++ b/kystudio/src/components/setting/SettingModel/locales.js
@@ -52,7 +52,7 @@ export default {
     sparkShuffle: 'The number of partitions to use when shuffling data for joins or aggregations.',
     baseCuboidConfig: 'According to your business scenario, you can decide whether to add an index that contains dimensions and measures defined in all aggregate groups. The index can answer queries across multiple aggregate groups, but this will impact query performance. In addition to this, there are some storage and building costs by adding this index.',
     customSettings: 'Custom Settings',
-    customOptions: 'Besides the defined configurations, you can also add some advanced settings.<br/><i class="el-icon-ksd-alert"></i>Note: It\'s highly recommended to use this feature with the support of Kyligence Service Team.',
+    customOptions: 'Besides the defined configurations, you can also add some advanced settings.<br/><i class="el-icon-ksd-alert"></i>Note: It\'s highly recommended to use this feature with the support of Kylin 5 Team.',
     customSettingKeyPlaceholder: 'Configuration Name',
     customSettingValuePlaceholder: 'Value',
     delCustomConfigTip: 'Are you sure you want to delete custom setting item {name}?'
diff --git a/pom.xml b/pom.xml
index 516eae722b..4dfba2dd28 100644
--- a/pom.xml
+++ b/pom.xml
@@ -17,6 +17,14 @@
 <project xmlns="http://maven.apache.org/POM/4.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+    <parent>
+        <!-- https://infra.apache.org/publishing-maven-artifacts.html -->
+        <groupId>org.apache</groupId>
+        <artifactId>apache</artifactId>
+        <version>23</version>
+    </parent>
+
     <modelVersion>4.0.0</modelVersion>
 
     <groupId>org.apache.kylin</groupId>
@@ -35,10 +43,10 @@
     </organization>
 
     <scm>
-        <connection>scm:git:git://github.com/apache/kylin.git</connection>
-        <developerConnection>scm:git:ssh://git@github.com/apache/kylin.git
-        </developerConnection>
-        <url>https://github.com/apache/kylin</url>
+        <connection>scm:git:https://gitbox.apache.org/repos/asf/kylin.git</connection>
+        <!-- developerConnection is used by the maven-release-plugin: https://maven.apache.org/guides/mini/guide-releasing.html -->
+        <developerConnection>scm:git:https://gitbox.apache.org/repos/asf/kylin.git</developerConnection>
+        <url>https://gitbox.apache.org/repos/asf/kylin.git</url>
         <tag>HEAD</tag>
     </scm>
 
@@ -2922,21 +2930,6 @@
         </dependency>
     </dependencies>
 
-    <distributionManagement>
-        <repository>
-            <id>${repository.id}</id>
-            <url>${repository.url}</url>
-            <name>${repository.name}</name>
-            <layout>default</layout>
-        </repository>
-        <snapshotRepository>
-            <id>${repository.id.snapshots}</id>
-            <url>${repository.url.snapshots}</url>
-            <name>${repository.name.snapshots}</name>
-            <layout>default</layout>
-        </snapshotRepository>
-    </distributionManagement>
-
     <build>
         <pluginManagement>
             <plugins>
@@ -3067,17 +3060,17 @@
                         <includePom>true</includePom>
                     </configuration>
                 </plugin>
-                <plugin>
-                    <groupId>org.apache.maven.plugins</groupId>
-                    <artifactId>maven-javadoc-plugin</artifactId>
-                    <version>3.3.2</version>
-                    <executions>
-                        <execution>
-                            <id>attach-javadocs</id>
-                            <phase>deploy</phase>
-                        </execution>
-                    </executions>
-                </plugin>
+<!--                <plugin>-->
+<!--                    <groupId>org.apache.maven.plugins</groupId>-->
+<!--                    <artifactId>maven-javadoc-plugin</artifactId>-->
+<!--                    <version>3.3.2</version>-->
+<!--                    <executions>-->
+<!--                        <execution>-->
+<!--                            <id>attach-javadocs</id>-->
+<!--                            <phase>deploy</phase>-->
+<!--                        </execution>-->
+<!--                    </executions>-->
+<!--                </plugin>-->
                 <plugin>
                     <groupId>org.apache.maven.plugins</groupId>
                     <artifactId>maven-deploy-plugin</artifactId>
@@ -3560,6 +3553,13 @@
             </activation>
             <build>
                 <plugins>
+                    <plugin>
+                        <groupId>org.apache.maven.plugins</groupId>
+                        <artifactId>maven-release-plugin</artifactId>
+                        <configuration>
+                            <dryRun>false</dryRun>
+                        </configuration>
+                    </plugin>
                     <plugin>
                         <groupId>org.apache.maven.plugins</groupId>
                         <artifactId>maven-gpg-plugin</artifactId>
@@ -3579,12 +3579,12 @@
                                     <goal>single</goal>
                                 </goals>
                                 <configuration>
-                                    <tarLongFileMode>posix</tarLongFileMode>
+                                    <tarLongFileMode>gnu</tarLongFileMode>
                                     <runOnlyAtExecutionRoot>true</runOnlyAtExecutionRoot>
                                     <appendAssemblyId>true</appendAssemblyId>
                                     <descriptors>
                                         <descriptor>
-                                            assembly/src/main/config/assemblies/source-assembly.xml
+                                            src/assembly/source-assembly.xml
                                         </descriptor>
                                     </descriptors>
                                     <finalName>apache-kylin-${project.version}</finalName>
diff --git a/src/assembly/source-assembly.xml b/src/assembly/source-assembly.xml
new file mode 100644
index 0000000000..07fcc967d2
--- /dev/null
+++ b/src/assembly/source-assembly.xml
@@ -0,0 +1,114 @@
+<?xml version='1.0' encoding='UTF-8'?>
+<!--
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+http://www.apache.org/licenses/LICENSE-2.0
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+<assembly>
+    <id>src</id>
+    <formats>
+        <format>zip</format>
+        <format>tar.gz</format>
+    </formats>
+
+    <fileSets>
+        <!-- main project directory structure -->
+        <fileSet>
+            <directory>.</directory>
+            <outputDirectory>.</outputDirectory>
+            <useDefaultExcludes>true</useDefaultExcludes>
+            <excludes>
+                <!-- build output -->
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/).*${project.build.directory}.*]
+                </exclude>
+
+                <!-- NOTE: Most of the following excludes should not be required
+                  if the standard release process is followed. This is because the release
+                  plugin checks out project sources into a location like target/checkout, then
+                  runs the build from there. The result is a source-release archive that comes
+                  from a pretty clean directory structure. HOWEVER, if the release plugin is
+                  configured to run extra goals or generate a project website, it's definitely
+                  possible that some of these files will be present. So, it's safer to exclude
+                  them. -->
+
+                <!-- IDEs -->
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?maven-eclipse\.xml]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.project]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.classpath]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?[^/]*\.iws]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.idea(/.*)?]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?out(/.*)?]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?[^/]*\.ipr]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?[^/]*\.iml]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.settings(/.*)?]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.externalToolBuilders(/.*)?]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.deployables(/.*)?]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.wtpmodules(/.*)?]
+                </exclude>
+
+
+                <!-- scm -->
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?\.gitignore(/.*)?]
+                </exclude>
+
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?docs/website(/.*)?]
+                </exclude>
+
+                <!-- release-plugin temp files -->
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?pom\.xml\.releaseBackup]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?pom\.xml\.next]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?pom\.xml\.tag]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?release\.properties]
+                </exclude>
+
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?dist(/.*)?]
+                </exclude>
+
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?lib(/.*)?]
+                </exclude>
+                <exclude>%regex[(?!((?!${project.build.directory}/)[^/]+/)*src/)(.*/)?docs(/.*)?]
+                </exclude>
+
+            </excludes>
+        </fileSet>
+        <!-- LICENSE, NOTICE, DEPENDENCIES, git.properties, etc. calculated at build time -->
+        <fileSet>
+            <directory>${project.build.directory}/maven-shared-archive-resources/META-INF
+            </directory>
+            <outputDirectory>.</outputDirectory>
+        </fileSet>
+        <fileSet>
+            <directory>${project.build.directory}</directory>
+            <includes>
+                <include>git.properties</include>
+            </includes>
+            <outputDirectory>.</outputDirectory>
+        </fileSet>
+    </fileSets>
+</assembly>
\ No newline at end of file
diff --git a/src/common-booter/src/main/resources/config/config_library.csv b/src/common-booter/src/main/resources/config/config_library.csv
index 5f9429c661..631c673653 100644
--- a/src/common-booter/src/main/resources/config/config_library.csv
+++ b/src/common-booter/src/main/resources/config/config_library.csv
@@ -120,7 +120,7 @@ kylin.engine.streaming-segment-merge-ratio,Double,流式 segment 合并比率,St
 kylin.engine.streaming-segment-merge-threshold,Int,实时OLAP构建触发合并segment的条件,Real -time OLAP constructs the condition for triggering the combination of segment,TRUE,TRUE
 kylin.engine.streaming-trigger-once,boolean,实时OLAP构建仅触发一次,Real -time OLAP construction only triggers once,TRUE,TRUE
 kylin.engine.submit-hadoop-conf-dir,String,hadoop conf 目录,Hadoop Conf directory,TRUE,TRUE
-kylin.env,String,Kyligence Enterprise 部署模式,Kyligence Enterprise deployment mode,TRUE,TRUE
+kylin.env,String,Kylin 5.0 部署模式,Kylin 5.0 deployment mode,TRUE,TRUE
 kylin.env.channel,String,判断当前部署环境,Determine the current deployment environment,TRUE,TRUE
 kylin.env.engine-write-fs,String,配置构建写集群,Configuration Construction Writing Cluster,TRUE,TRUE
 kylin.env.hdfs-data-working-dir,String,数据存储路径,可以和working-dir分开,Data storage paths can be separated from Working-DIR,TRUE,TRUE
@@ -256,7 +256,7 @@ kylin.metadata.key-case-insensitive,boolean,当本地用户、用户组较多影
 kylin.metadata.ops-cron,boolean,定时备份元数据、垃圾清理的定时任务 cron 表达式,默认值是 0 0 0 * * *,"Timing backup metad data, the timing task of garbage clearance CRON expression, the default value is 0 0 0 * * * *",TRUE,TRUE
 kylin.metadata.random-admin-password.enabled,boolean,admin使用随机密码,admin uses random passwords,TRUE,TRUE
 kylin.metadata.semi-automatic-mode,boolean,前端:当前project 是否含有半自动的标志;后端:是否默认使用半自动档,Front end: whether the current Project contains semi -automatic signs; back -end: Whether the semi -automatic file is used by default,TRUE,TRUE
-kylin.metadata.url,String,Kyligence Enterprise 元数据库路径。,Kyligence Enterprise metad database path.,TRUE,TRUE
+kylin.metadata.url,String,Kylin 5.0 元数据库路径。,Kylin 5.0 metadata database path.,TRUE,TRUE
 kylin.metadata.validate-computed-column,boolean,是否检验可计算列,Whether to test can be calculated,TRUE,TRUE
 kylin.metrics.daily-influx-db,String,按天归档influxdb数据的数据库,Database of InfluxDB data by the day,TRUE,TRUE
 kylin.metrics.daily-max-retry-times,Int,按天归档influxdb数据时异常后重试次数,Archive by the day when INFLUXDB data is abnormal,TRUE,TRUE
@@ -323,7 +323,7 @@ kylin.query.engine.sparder-enabled,boolean,是否启用sparder引擎,Whether to
 kylin.query.engine.spark-scheduler-mode,String,spark 调度模式,spark scheduling mode,TRUE,TRUE
 kylin.query.engine.spark-sql-shuffle-partitions,Int,spark sql 中 join 或 aggregations 时的分区数,The number of partitions in spark sql when join or aggregations,TRUE,TRUE
 kylin.query.engine.split-group-sets-into-union,boolean,是否对 grouping sets 进行拆开处理,Whether the Grouping Sets is disassembled and processed,TRUE,TRUE
-kylin.query.escape-default-keyword,boolean,"Kyligence Enterprise 会自动为 default 转为""DEFAULT""。","Kyligence Enterprise will automatically convert to ""Default"" for Default.",TRUE,TRUE
+kylin.query.escape-default-keyword,boolean,"Kylin 5.0 会自动为 default 转为""DEFAULT""。","Kylin 5.0 will automatically convert default to ""DEFAULT"".",TRUE,TRUE
 kylin.query.exception-cache-enabled,boolean,是否缓存失败的查询,Whether to cache failure query,TRUE,TRUE
 kylin.query.exception-cache-threshold-duration,Int,需要缓存异常查询的最小查询时间,The minimum query time that needs to be cached abnormal query,TRUE,TRUE
 kylin.query.exception-cache-threshold-times,Int,需要缓存异常查询的最大失败次数,The maximum failure of cache abnormal query,TRUE,TRUE
@@ -447,7 +447,7 @@ kylin.server.https.port,Int,https端口号,https port number,TRUE,TRUE
 kylin.server.leader-race.enabled,boolean,是否打开HA开关,Whether to turn on the HA switch,TRUE,TRUE
 kylin.server.leader-race.heart-beat-interval,Int,HA更新心跳频率,单位s,"HA update heartbeat frequency, unit S",TRUE,TRUE
 kylin.server.leader-race.heart-beat-timeout,Int,HA心跳超时时长,单位s,"HA's heartbeat time timeout, unit S",TRUE,TRUE
-kylin.server.mode,String,Kyligence Enterprise 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kyligence Enterprise instance can be one of the parameter values ​​in ALL or Query, and the default value is ALL.",TRUE,TRUE
+kylin.server.mode,String,Kylin 5.0 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kylin 5.0 instance; the value can be all or query, and the default value is all.",TRUE,TRUE
 kylin.server.renew-batch-size,Int,epoch renew批量更新一次最多更新的个数,EPOCH Renew Batch Update Most Updated Number,TRUE,TRUE
 kylin.server.renew-epoch-pool-size,Int,renew epoch 的线程池大小,Renew epoch's thread pool size,TRUE,TRUE
 kylin.server.streaming-change-meta,boolean,流式事务可以修改元数据,Streaming transactions can modify metadata data,TRUE,TRUE
@@ -540,7 +540,7 @@ kylin.web.export-allow-other,boolean,是否允许非 Admin 用户导出查询结
 kylin.web.properties.whitelist,String,提供给前端使用的配置项列表,Configuration items provided to the front -end use,TRUE,TRUE
 kylin.web.session.jdbc-encode-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
 kylin.web.session.secure-random-create-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
-kylin.web.timezone,String,Kyligence Enterprise 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kyligence Enterprise's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
+kylin.web.timezone,String,Kylin 5.0 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kylin 5.0's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
 server.port,Int,服务启动的端口,The port of the service start,TRUE,TRUE
 spring.session.store-type,String,spring session 存储类型,Spring session storage type,TRUE,TRUE
 kylin.engine.dynamic-resource-plan-enabled,boolean,在FI平台使用动态资源计划时候,有可能导致无法分配资源,从而构建失败,"When using a dynamic resource plan on the FI platform, it may lead to the failure to allocate resources and build a failure",TRUE,TRUE
diff --git a/src/common-service/src/main/java/org/apache/kylin/rest/cache/RedisCache.java b/src/common-service/src/main/java/org/apache/kylin/rest/cache/RedisCache.java
index e2f9a48e70..9cb2e43b8c 100644
--- a/src/common-service/src/main/java/org/apache/kylin/rest/cache/RedisCache.java
+++ b/src/common-service/src/main/java/org/apache/kylin/rest/cache/RedisCache.java
@@ -64,7 +64,7 @@ public class RedisCache implements KylinCache {
 
     private static final String NX = "NX";
     private static final String XX = "XX";
-    private static final String PREFIX = "Kyligence-";
+    private static final String PREFIX = "Kylin5-";
     private static final String CHARSET_NAME = "UTF-8";
     private static final Charset CHARSET = StandardCharsets.UTF_8;
     private static final String SCAN_POINTER_START_STR = new String(ScanParams.SCAN_POINTER_START_BINARY, CHARSET);
diff --git a/src/common-service/src/main/java/org/apache/kylin/tool/garbage/StorageCleaner.java b/src/common-service/src/main/java/org/apache/kylin/tool/garbage/StorageCleaner.java
index ff8e833bcf..2452305000 100644
--- a/src/common-service/src/main/java/org/apache/kylin/tool/garbage/StorageCleaner.java
+++ b/src/common-service/src/main/java/org/apache/kylin/tool/garbage/StorageCleaner.java
@@ -205,7 +205,7 @@ public class StorageCleaner {
     }
 
     public void printConsole(boolean success, long duration) {
-        System.out.println(ANSI_BLUE + "Kyligence Enterprise garbage report: (cleanup=" + cleanup + ")" + ANSI_RESET);
+        System.out.println(ANSI_BLUE + "Kylin 5.0 garbage report: (cleanup=" + cleanup + ")" + ANSI_RESET);
         for (StorageItem item : outdatedItems) {
             System.out.println("  Storage File: " + item.getPath());
         }
diff --git a/src/core-common/src/main/java/org/apache/kylin/common/KylinVersion.java b/src/core-common/src/main/java/org/apache/kylin/common/KylinVersion.java
index d2a577cbdc..48461e155e 100644
--- a/src/core-common/src/main/java/org/apache/kylin/common/KylinVersion.java
+++ b/src/core-common/src/main/java/org/apache/kylin/common/KylinVersion.java
@@ -92,7 +92,7 @@ public class KylinVersion implements Comparable {
     /**
      * Require MANUAL updating kylin version per ANY upgrading.
      */
-    private static final KylinVersion CURRENT_KYLIN_VERSION = new KylinVersion("4.0.0");
+    private static final KylinVersion CURRENT_KYLIN_VERSION = new KylinVersion("5.0.0");
 
     private static final KylinVersion VERSION_200 = new KylinVersion("2.0.0");
 
diff --git a/src/core-common/src/main/java/org/apache/kylin/common/msg/CnMessage.java b/src/core-common/src/main/java/org/apache/kylin/common/msg/CnMessage.java
index 1f7923622b..8403f70fa5 100644
--- a/src/core-common/src/main/java/org/apache/kylin/common/msg/CnMessage.java
+++ b/src/core-common/src/main/java/org/apache/kylin/common/msg/CnMessage.java
@@ -540,96 +540,6 @@ public class CnMessage extends Message {
         return "配置列表不能为空。请检查后重试。";
     }
 
-    // Query statistics
-
-    //license
-    @Override
-    public String getLicenseErrorPre() {
-        return "无法更新许可证:\n";
-    }
-
-    @Override
-    public String getLicenseErrorSuff() {
-        return "\n请重新上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseOverdueTrial() {
-        return "许可证已过期,当前有效期为[%s - %s]。请重新上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseNodesExceed() {
-        return "您使用的节点数已超过许可证范围,请联系您的客户经理。";
-    }
-
-    @Override
-    public String getLicenseNodesNotMatch() {
-        return "当前许可证的节点数与集群信息不匹配,请重新上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getlicenseWrongCategory() {
-        return "当前许可证的版本与产品不匹配,请重新上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseNoLicense() {
-        return "没有许可证文件。请联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseMismatchLicense() {
-        return "该许可证适用的集群信息与当前不符。请上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseNotEffective() {
-        return "许可证尚未生效,请重新申请。";
-    }
-
-    @Override
-    public String getLicenseExpired() {
-        return "该许可证已过期。请上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseOverVolume() {
-        return "当前系统已使用容量超过该许可证允许的容量。请上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseInvalidLicense() {
-        return "无效许可证。请上传新的许可证或联系 Kyligence 销售人员。";
-    }
-
-    @Override
-    public String getLicenseSourceOverCapacity() {
-        return "当前已使用数据量(%s/%s)超过许可证上限。系统无法进行构建或数据加载任务。\n" + "请联系 Kyligence 销售人员,或尝试删除一些 Segment 以解除限制。";
-    }
-
-    @Override
-    public String getLicenseProjectSourceOverCapacity() {
-        return "当前项目已使用数据量(%s/%s)超过配置上限。系统无法进行构建或数据加载任务。\n" + "请联系 Kyligence 销售人员,或尝试删除一些 Segment 以解除限制。";
-    }
-
-    @Override
-    public String getLicenseNodesOverCapacity() {
-        return "当前已使用节点数(%s/%s)超过许可证上限。系统无法进行构建或数据加载任务。\n" + "请联系 Kyligence 销售人员,或尝试停止部分节点以解除限制。";
-    }
-
-    @Override
-    public String getLicenseSourceNodesOverCapacity() {
-        return "当前已使用数据量(%s/%s)和节点数(%s/%s)均超过许可证上限。\n" + "系统无法进行构建或数据加载任务。\n"
-                + "请联系 Kyligence 销售人员,或尝试删除一些 segments 并停止部分节点以解除限制。";
-    }
-
-    @Override
-    public String getlicenseProjectSourceNodesOverCapacity() {
-        return "当前项目已使用数据量(%s/%s)和节点数(%s/%s)均超过配置上限。\n" + "系统无法进行构建或数据加载任务。\n"
-                + "请联系 Kyligence 销售人员,或尝试删除一些 segments 并停止部分节点以解除限制。";
-    }
-
     @Override
     public String saveModelFail() {
         return "模型 “%s” 保存失败。请确保模型中使用的列 “%s” 在源表 “%s” 中存在。";
@@ -915,21 +825,6 @@ public class CnMessage extends Message {
         return "请输入 Segment ID 或名称。";
     }
 
-    @Override
-    public String getContentIsEmpty() {
-        return "许可证内容为空";
-    }
-
-    @Override
-    public String getIllegalEmail() {
-        return "不允许使用个人电子邮件或非法电子邮件";
-    }
-
-    @Override
-    public String getLicenseError() {
-        return "获取许可证失败";
-    }
-
     @Override
     public String getEmailUsernameCompanyCanNotEmpty() {
         return "邮箱, 用户名, 公司不能为空";
@@ -1067,7 +962,7 @@ public class CnMessage extends Message {
 
     @Override
     public String getDefaultSuggest() {
-        return "更多详情请联系 Kyligence 技术支持。";
+        return "更多详情请联系 Kylin 5 技术支持。";
     }
 
     @Override
@@ -1551,7 +1446,7 @@ public class CnMessage extends Message {
 
     @Override
     public String getStreamingDisabled() {
-        return "只有 Kyligence 高级版才能使用批流一体功能,请联系 Kyligence 客户经理升级 License。";
+        return "开发中。";
     }
 
     @Override
diff --git a/src/core-common/src/main/java/org/apache/kylin/common/msg/Message.java b/src/core-common/src/main/java/org/apache/kylin/common/msg/Message.java
index 805caa5688..98bd73637f 100644
--- a/src/core-common/src/main/java/org/apache/kylin/common/msg/Message.java
+++ b/src/core-common/src/main/java/org/apache/kylin/common/msg/Message.java
@@ -22,10 +22,9 @@ import java.util.List;
 import java.util.Locale;
 
 import org.apache.kylin.common.Singletons;
-import org.apache.kylin.common.annotation.Clarification;
 
-@Clarification(priority = Clarification.Priority.MAJOR, msg = "Part message is for enterprise.")
 public class Message {
+    private static final String UNKNOWN_ERROR = "UNKNOWN ERROR";
     private static final String SECOND_STORAGE_PROJECT_ENABLED = "The project %s does not have tiered storage enabled.";
     private static final String SECOND_STORAGE_MODEL_ENABLED = "The model %s does not have tiered storage enabled.";
     private static final String SECOND_STORAGE_SEGMENT_WITHOUT_BASE_INDEX = "The base table index is missing in the segments, please add and try again.";
@@ -48,18 +47,6 @@ public class Message {
     private static final String PROFILING_COLLECT_TIMEOUT = "Async profiler timeout";
 
     private static final String PARAMETER_EMPTY = "Please enter the value for the parameter '%s'.";
-
-    private static final String LICENSE_ERROR_PRE = "The license couldn’t be updated:\n";
-    private static final String LICENSE_ERROR_SUFF = "\nPlease upload a new license, or contact Kyligence.";
-    private static final String LICENSE_OVERDUE_TRIAL = "The license has expired and the validity period is [%s - %s]. Please upload a new license or contact Kyligence.";
-    private static final String LICENSE_NODES_EXCEED = "The number of nodes which you are using is higher than the allowable number. Please contact your Kyligence account manager.";
-    private static final String LICENSE_NODES_NOT_MATCH = "The cluster information dose not match the license. Please upload a new license or contact Kyligence.";
-    private static final String LICENSE_OVER_VOLUME = "The current used system capacity exceeds the license’s limit. Please upload a new license, or contact Kyligence.";
-    private static final String LICENSE_NO_LICENSE = "No license file. Please contact Kyligence.";
-    private static final String LICENSE_WRONG_CATEGORY = "The current version of Kyligence Enterprise does not match the license. Please upload a new license or contact Kyligence.";
-    private static final String LICENSE_MISMATCH_LICENSE = "The license doesn’t match the current cluster information. Please upload a new license, or contact Kyligence.";
-    private static final String LICENSE_NOT_EFFECTIVE = "License is not effective yet, please apply for a new license.";
-    private static final String LICENSE_EXPIRED = "The license has expired. Please upload a new license, or contact Kyligence.";
     private static final String DDL_UNSUPPORTED = "Unsupported DDL syntax, only support single `create view`, `drop view`,  `alter view`, `show create table`";
     private static final String DDL_VIEW_NAME_ERROR = "View names need to start with KE_";
     private static final String DDL_VIEW_NAME_DUPLICATE_ERROR = "Logical View names is duplicate";
@@ -574,78 +561,71 @@ public class Message {
 
     // License
     public String getLicenseErrorPre() {
-        return LICENSE_ERROR_PRE;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseErrorSuff() {
-        return LICENSE_ERROR_SUFF;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseOverdueTrial() {
-        return LICENSE_OVERDUE_TRIAL;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseNodesExceed() {
-        return LICENSE_NODES_EXCEED;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseNodesNotMatch() {
-        return LICENSE_NODES_NOT_MATCH;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseOverVolume() {
-        return LICENSE_OVER_VOLUME;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseNoLicense() {
-        return LICENSE_NO_LICENSE;
+        return UNKNOWN_ERROR;
     }
 
     public String getlicenseWrongCategory() {
-        return LICENSE_WRONG_CATEGORY;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseInvalidLicense() {
-        return "The license is invalid. Please upload a new license, or contact Kyligence.";
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseMismatchLicense() {
-        return LICENSE_MISMATCH_LICENSE;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseNotEffective() {
-        return LICENSE_NOT_EFFECTIVE;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseExpired() {
-        return LICENSE_EXPIRED;
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseSourceOverCapacity() {
-        return "The amount of data volume used(%s/%s) exceeds the license’s limit. Build index and load data is unavailable.\n"
-                + "Please contact Kyligence, or try deleting some segments.";
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseProjectSourceOverCapacity() {
-        return "The amount of data volume used(%s/%s) exceeds the project’s limit. Build index and load data is unavailable.\n"
-                + "Please contact Kyligence, or try deleting some segments.";
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseNodesOverCapacity() {
-        return "The amount of nodes used (%s/%s) exceeds the license’s limit. Build index and load data is unavailable.\n"
-                + "Please contact Kyligence, or try stopping some nodes.";
+        return UNKNOWN_ERROR;
     }
 
     public String getLicenseSourceNodesOverCapacity() {
-        return "The amount of data volume used (%s/%s)  and nodes used (%s/%s) exceeds license’s limit.\n"
-                + "Build index and load data is unavailable.\n"
-                + "Please contact Kyligence, or try deleting some segments and stopping some nodes.";
+        return UNKNOWN_ERROR;
     }
 
     public String getlicenseProjectSourceNodesOverCapacity() {
-        return "The amount of data volume used (%s/%s)  and nodes used (%s/%s) exceeds project’s limit.\n"
-                + "Build index and load data is unavailable.\n"
-                + "Please contact Kyligence, or try deleting some segments and stopping some nodes.";
+        return UNKNOWN_ERROR;
     }
 
     public String saveModelFail() {
@@ -846,18 +826,6 @@ public class Message {
         return "Please enter segment ID or name.";
     }
 
-    public String getContentIsEmpty() {
-        return "license content is empty";
-    }
-
-    public String getIllegalEmail() {
-        return "A personal email or illegal email is not allowed";
-    }
-
-    public String getLicenseError() {
-        return "Get license error";
-    }
-
     public String getEmailUsernameCompanyCanNotEmpty() {
         return "Email, username, company can not be empty";
     }
@@ -961,7 +929,7 @@ public class Message {
     }
 
     public String getDefaultSuggest() {
-        return "Please contact Kyligence technical support for more details.";
+        return "Please contact Kylin community support for more details.";
     }
 
     public String getUnexpectedToken() {
@@ -1417,8 +1385,7 @@ public class Message {
     }
 
     public String getStreamingDisabled() {
-        return "The Real-time functions can only be used under Kyligence Premium Version, "
-                + "please contact Kyligence customer manager to upgrade your license.";
+        return "The Real-time functions are under development.";
     }
 
     public String getNoStreamingModelFound() {
diff --git a/src/core-common/src/main/resources/kylin_error_msg_conf_cn.properties b/src/core-common/src/main/resources/kylin_error_msg_conf_cn.properties
index 77070d8924..11f10b5af9 100644
--- a/src/core-common/src/main/resources/kylin_error_msg_conf_cn.properties
+++ b/src/core-common/src/main/resources/kylin_error_msg_conf_cn.properties
@@ -212,4 +212,4 @@ KE-050041202=该路径已存在:%s。
 
 # Common
 ## KE-060100201
-KE-060100201=Kyligence Enterprise 外部发生异常。
+KE-060100201=Kylin 5.0 外部发生异常。
diff --git a/src/core-common/src/main/resources/kylin_error_msg_conf_en.properties b/src/core-common/src/main/resources/kylin_error_msg_conf_en.properties
index 7c67ccdae4..028f593bce 100644
--- a/src/core-common/src/main/resources/kylin_error_msg_conf_en.properties
+++ b/src/core-common/src/main/resources/kylin_error_msg_conf_en.properties
@@ -211,4 +211,4 @@ KE-050041202=The path already exists: %s.
 
 # Common
 ## KE-060100201
-KE-060100201=An Exception occurred outside Kyligence Enterprise.
+KE-060100201=An Exception occurred outside Kylin 5.0.
diff --git a/src/core-common/src/test/java/org/apache/kylin/common/exception/code/ErrorCodeTest.java b/src/core-common/src/test/java/org/apache/kylin/common/exception/code/ErrorCodeTest.java
index 0da0fc095c..476f1d4b13 100644
--- a/src/core-common/src/test/java/org/apache/kylin/common/exception/code/ErrorCodeTest.java
+++ b/src/core-common/src/test/java/org/apache/kylin/common/exception/code/ErrorCodeTest.java
@@ -29,12 +29,12 @@ public class ErrorCodeTest {
         Assert.assertEquals("KE-060100201", nonKeException.getErrorCode().getCode());
         Assert.assertEquals("Please check whether the external environment(other systems, components, etc.) is normal.",
                 nonKeException.getErrorSuggest().getLocalizedString());
-        Assert.assertEquals("An Exception occurred outside Kyligence Enterprise.", nonKeException.getMsg());
-        Assert.assertEquals("An Exception occurred outside Kyligence Enterprise.",
+        Assert.assertEquals("An Exception occurred outside Kylin 5.0.", nonKeException.getMsg());
+        Assert.assertEquals("An Exception occurred outside Kylin 5.0.",
                 nonKeException.getErrorMsg().getLocalizedString());
         Assert.assertEquals("Please check whether the external environment(other systems, components, etc.) is normal.",
                 nonKeException.getErrorSuggestion().getLocalizedString());
-        Assert.assertEquals("KE-060100201: An Exception occurred outside Kyligence Enterprise.",
+        Assert.assertEquals("KE-060100201: An Exception occurred outside Kylin 5.0.",
                 nonKeException.getCodeMsg());
     }
 }
diff --git a/src/core-job/src/main/java/org/apache/kylin/job/impl/threadpool/NDefaultScheduler.java b/src/core-job/src/main/java/org/apache/kylin/job/impl/threadpool/NDefaultScheduler.java
index bcac0074f6..c574b649c6 100644
--- a/src/core-job/src/main/java/org/apache/kylin/job/impl/threadpool/NDefaultScheduler.java
+++ b/src/core-job/src/main/java/org/apache/kylin/job/impl/threadpool/NDefaultScheduler.java
@@ -43,7 +43,6 @@ import org.apache.kylin.job.execution.ExecutableContext;
 import org.apache.kylin.job.execution.NExecutableManager;
 import org.apache.kylin.job.runners.FetcherRunner;
 import org.apache.kylin.job.runners.JobCheckRunner;
-import org.apache.kylin.job.runners.LicenseCapacityCheckRunner;
 import org.apache.kylin.job.runners.QuotaStorageCheckRunner;
 import org.apache.kylin.metadata.project.NProjectManager;
 import org.apache.kylin.metadata.project.ProjectInstance;
@@ -186,8 +185,6 @@ public class NDefaultScheduler implements Scheduler<AbstractExecutable> {
 
         fetcherPool.scheduleWithFixedDelay(new JobCheckRunner(this), RandomUtils.nextInt(0, pollSecond), pollSecond,
                 TimeUnit.SECONDS);
-        fetcherPool.scheduleWithFixedDelay(new LicenseCapacityCheckRunner(this), RandomUtils.nextInt(0, pollSecond),
-                pollSecond, TimeUnit.SECONDS);
         fetcherPool.scheduleWithFixedDelay(fetcher, RandomUtils.nextInt(0, pollSecond), pollSecond, TimeUnit.SECONDS);
         hasStarted.set(true);
     }
diff --git a/src/core-job/src/main/java/org/apache/kylin/job/lock/ZookeeperUtil.java b/src/core-job/src/main/java/org/apache/kylin/job/lock/ZookeeperUtil.java
index 6f3722fbda..42d5f90b71 100644
--- a/src/core-job/src/main/java/org/apache/kylin/job/lock/ZookeeperUtil.java
+++ b/src/core-job/src/main/java/org/apache/kylin/job/lock/ZookeeperUtil.java
@@ -24,7 +24,6 @@ import org.apache.kylin.common.annotation.ThirdPartyDependencies;
 /**
  *   DO NOT DELETE.
  *
- *   Used By Kyligence Cloud.
  */
 @ThirdPartyDependencies({ @ThirdPartyDependencies.ThirdPartyDependent(repository = "static-user-manager", classes = {
         "AuthenticationClient" }) })
diff --git a/src/core-metadata/src/test/java/org/apache/kylin/measure/topn/TopNCounterTest.java b/src/core-metadata/src/test/java/org/apache/kylin/measure/topn/TopNCounterTest.java
index e641a9847a..be87df121b 100644
--- a/src/core-metadata/src/test/java/org/apache/kylin/measure/topn/TopNCounterTest.java
+++ b/src/core-metadata/src/test/java/org/apache/kylin/measure/topn/TopNCounterTest.java
@@ -196,7 +196,7 @@ public class TopNCounterTest {
     }
 
     /**
-     * https://github.com/Kyligence/KAP/issues/16933
+     *
      *
      * the error of “Comparison method violates its general contract!”
      * are deep in the timsort algorithm and there are two necessary
diff --git a/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/model/NDataflowManagerTest.java b/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/model/NDataflowManagerTest.java
index 67796a1eb5..015cc06c23 100644
--- a/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/model/NDataflowManagerTest.java
+++ b/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/model/NDataflowManagerTest.java
@@ -558,7 +558,6 @@ public class NDataflowManagerTest extends NLocalFileMetadataTestCase {
     @Test
     @Ignore
     public void testConcurrency() throws IOException, InterruptedException {
-        // this test case merge from PR <https://github.com/Kyligence/KAP/pull/4744>
         final KylinConfig testConfig = getTestConfig();
         final NDataflowManager mgr = NDataflowManager.getInstance(testConfig, projectDefault);
         NIndexPlanManager indePlanMgr = NIndexPlanManager.getInstance(testConfig, projectDefault);
diff --git a/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/planner/CostBasePlannerUtilsTest.java b/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/planner/CostBasePlannerUtilsTest.java
index 647d43ef7c..c9d461643d 100644
--- a/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/planner/CostBasePlannerUtilsTest.java
+++ b/src/core-metadata/src/test/java/org/apache/kylin/metadata/cube/planner/CostBasePlannerUtilsTest.java
@@ -1,3 +1,20 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
 package org.apache.kylin.metadata.cube.planner;
 
 import static org.junit.jupiter.api.Assertions.assertEquals;
diff --git a/src/data-loading-booter/src/main/resources/config/config_library.csv b/src/data-loading-booter/src/main/resources/config/config_library.csv
index 5f9429c661..631c673653 100644
--- a/src/data-loading-booter/src/main/resources/config/config_library.csv
+++ b/src/data-loading-booter/src/main/resources/config/config_library.csv
@@ -120,7 +120,7 @@ kylin.engine.streaming-segment-merge-ratio,Double,流式 segment 合并比率,St
 kylin.engine.streaming-segment-merge-threshold,Int,实时OLAP构建触发合并segment的条件,Real -time OLAP constructs the condition for triggering the combination of segment,TRUE,TRUE
 kylin.engine.streaming-trigger-once,boolean,实时OLAP构建仅触发一次,Real -time OLAP construction only triggers once,TRUE,TRUE
 kylin.engine.submit-hadoop-conf-dir,String,hadoop conf 目录,Hadoop Conf directory,TRUE,TRUE
-kylin.env,String,Kyligence Enterprise 部署模式,Kyligence Enterprise deployment mode,TRUE,TRUE
+kylin.env,String,Kylin 5.0 部署模式,Kylin 5.0 deployment mode,TRUE,TRUE
 kylin.env.channel,String,判断当前部署环境,Determine the current deployment environment,TRUE,TRUE
 kylin.env.engine-write-fs,String,配置构建写集群,Configuration Construction Writing Cluster,TRUE,TRUE
 kylin.env.hdfs-data-working-dir,String,数据存储路径,可以和working-dir分开,Data storage paths can be separated from Working-DIR,TRUE,TRUE
@@ -256,7 +256,7 @@ kylin.metadata.key-case-insensitive,boolean,当本地用户、用户组较多影
 kylin.metadata.ops-cron,boolean,定时备份元数据、垃圾清理的定时任务 cron 表达式,默认值是 0 0 0 * * *,"Timing backup metad data, the timing task of garbage clearance CRON expression, the default value is 0 0 0 * * * *",TRUE,TRUE
 kylin.metadata.random-admin-password.enabled,boolean,admin使用随机密码,admin uses random passwords,TRUE,TRUE
 kylin.metadata.semi-automatic-mode,boolean,前端:当前project 是否含有半自动的标志;后端:是否默认使用半自动档,Front end: whether the current Project contains semi -automatic signs; back -end: Whether the semi -automatic file is used by default,TRUE,TRUE
-kylin.metadata.url,String,Kyligence Enterprise 元数据库路径。,Kyligence Enterprise metad database path.,TRUE,TRUE
+kylin.metadata.url,String,Kylin 5.0 元数据库路径。,Kylin 5.0 metadata database path.,TRUE,TRUE
 kylin.metadata.validate-computed-column,boolean,是否检验可计算列,Whether to test can be calculated,TRUE,TRUE
 kylin.metrics.daily-influx-db,String,按天归档influxdb数据的数据库,Database of InfluxDB data by the day,TRUE,TRUE
 kylin.metrics.daily-max-retry-times,Int,按天归档influxdb数据时异常后重试次数,Archive by the day when INFLUXDB data is abnormal,TRUE,TRUE
@@ -323,7 +323,7 @@ kylin.query.engine.sparder-enabled,boolean,是否启用sparder引擎,Whether to
 kylin.query.engine.spark-scheduler-mode,String,spark 调度模式,spark scheduling mode,TRUE,TRUE
 kylin.query.engine.spark-sql-shuffle-partitions,Int,spark sql 中 join 或 aggregations 时的分区数,The number of partitions in spark sql when join or aggregations,TRUE,TRUE
 kylin.query.engine.split-group-sets-into-union,boolean,是否对 grouping sets 进行拆开处理,Whether the Grouping Sets is disassembled and processed,TRUE,TRUE
-kylin.query.escape-default-keyword,boolean,"Kyligence Enterprise 会自动为 default 转为""DEFAULT""。","Kyligence Enterprise will automatically convert to ""Default"" for Default.",TRUE,TRUE
+kylin.query.escape-default-keyword,boolean,"Kylin 5.0 会自动为 default 转为""DEFAULT""。","Kylin 5.0 will automatically convert to ""Default"" for Default.",TRUE,TRUE
 kylin.query.exception-cache-enabled,boolean,是否缓存失败的查询,Whether to cache failure query,TRUE,TRUE
 kylin.query.exception-cache-threshold-duration,Int,需要缓存异常查询的最小查询时间,The minimum query time that needs to be cached abnormal query,TRUE,TRUE
 kylin.query.exception-cache-threshold-times,Int,需要缓存异常查询的最大失败次数,The maximum failure of cache abnormal query,TRUE,TRUE
@@ -447,7 +447,7 @@ kylin.server.https.port,Int,https端口号,https port number,TRUE,TRUE
 kylin.server.leader-race.enabled,boolean,是否打开HA开关,Whether to turn on the HA switch,TRUE,TRUE
 kylin.server.leader-race.heart-beat-interval,Int,HA更新心跳频率,单位s,"HA update heartbeat frequency, unit S",TRUE,TRUE
 kylin.server.leader-race.heart-beat-timeout,Int,HA心跳超时时长,单位s,"HA's heartbeat time timeout, unit S",TRUE,TRUE
-kylin.server.mode,String,Kyligence Enterprise 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kyligence Enterprise instance can be one of the parameter values ​​in ALL or Query, and the default value is ALL.",TRUE,TRUE
+kylin.server.mode,String,Kylin 5.0 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kylin 5.0 instance can be one of the parameter values ​​in ALL or Query, and the default value is ALL.",TRUE,TRUE
 kylin.server.renew-batch-size,Int,epoch renew批量更新一次最多更新的个数,EPOCH Renew Batch Update Most Updated Number,TRUE,TRUE
 kylin.server.renew-epoch-pool-size,Int,renew epoch 的线程池大小,Renew epoch's thread pool size,TRUE,TRUE
 kylin.server.streaming-change-meta,boolean,流式事务可以修改元数据,Streaming transactions can modify metadata data,TRUE,TRUE
@@ -540,7 +540,7 @@ kylin.web.export-allow-other,boolean,是否允许非 Admin 用户导出查询结
 kylin.web.properties.whitelist,String,提供给前端使用的配置项列表,Configuration items provided to the front -end use,TRUE,TRUE
 kylin.web.session.jdbc-encode-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
 kylin.web.session.secure-random-create-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
-kylin.web.timezone,String,Kyligence Enterprise 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kyligence Enterprise's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
+kylin.web.timezone,String,Kylin 5.0 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kylin 5.0's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
 server.port,Int,服务启动的端口,The port of the service start,TRUE,TRUE
 spring.session.store-type,String,spring session 存储类型,Spring session storage type,TRUE,TRUE
 kylin.engine.dynamic-resource-plan-enabled,boolean,在FI平台使用动态资源计划时候,有可能导致无法分配资源,从而构建失败,"When using a dynamic resource plan on the FI platform, it may lead to the failure to allocate resources and build a failure",TRUE,TRUE
diff --git a/src/examples/LICENSE b/src/examples/LICENSE
deleted file mode 100644
index e2652ae583..0000000000
--- a/src/examples/LICENSE
+++ /dev/null
@@ -1,10 +0,0 @@
-Evaluation license for Kyligence Enterprise
-Category: 4.x
-SLA Service: NO
-Volume: 1
-Level: professional
-Insight License: 5 users; evaluation; 2019-06-01,2019-07-30
-====
-Kyligence Enterprise
-2019-06-01,2019-07-30
-19d4801b6dardchr83bp3i7wadbdvycs8ay7ibicu2msfogl6kiwz7z3dmdizepmicl3bgqznn34794jt5g51sutofcfpn9jeiw5k3cvt2750faxw7ip1fp08mt3og6xijt4x02euf1zkrn5m7huwal8lqms3gmn0d5i8y2dqlvkvpqtwz3m9tqcnq6n4lznthbdtfncdqsly7a8v9pndh1cav2tdcczzs17ns6e0d4izeatwybr25lir5f5s6qe4ry10x2fkqco7unb4h4ivx8jo6vdb5sp3r4738zhlvrbdwfa38s3wh82lrnugrhxq8eap3rebq9dz8xka713aui4v2acquulicdadt63cv0biz7y7eccfh1tri60526b2bmon71k29n6p29tsbhyl2wdx5hsjuxg2wd993hcndot1fc5oz8kebopqrudyf4o7tjc5ca0bvtysnw3gn64c1sd2iw2rlhlxk7c5szp6kde [...]
\ No newline at end of file
diff --git a/src/jdbc/pom.xml b/src/jdbc/pom.xml
index be5048b52c..979fa50454 100644
--- a/src/jdbc/pom.xml
+++ b/src/jdbc/pom.xml
@@ -237,19 +237,4 @@
         </plugins>
     </build>
 
-    <distributionManagement>
-        <repository>
-            <id>${repository.id}</id>
-            <url>${repository.url}</url>
-            <name>${repository.name}</name>
-            <layout>default</layout>
-        </repository>
-        <snapshotRepository>
-            <id>${repository.id.snapshots}</id>
-            <url>${repository.url.snapshots}</url>
-            <name>${repository.name.snapshots}</name>
-            <layout>default</layout>
-        </snapshotRepository>
-    </distributionManagement>
-
 </project>
diff --git a/src/jdbc/src/main/java/org/apache/kylin/jdbc/KylinClient.java b/src/jdbc/src/main/java/org/apache/kylin/jdbc/KylinClient.java
index f211d960f0..57e1fc120b 100644
--- a/src/jdbc/src/main/java/org/apache/kylin/jdbc/KylinClient.java
+++ b/src/jdbc/src/main/java/org/apache/kylin/jdbc/KylinClient.java
@@ -633,7 +633,7 @@ public class KylinClient implements IRemoteClient {
         String responseStr = EntityUtils.toString(response.getEntity());
         if (responseStr.contains("Error occured while trying to proxy to")) {
             return new IOException("FAILED!\n"
-                    + "[Kyligence][JDBCDriver]  Unsupported Apache Kylin instance, please contact Apache Kylin Community.");
+                    + "[Kylin 5][JDBCDriver] Unsupported Apache Kylin instance, please contact Apache Kylin Community.");
         } else {
             return new IOException(
                     request.getMethod() + " failed, error code " + statusCode + " and response: " + responseStr);
diff --git a/src/job-service/src/test/java/org/apache/kylin/rest/config/initialize/JobSchedulerListenerTest.java b/src/job-service/src/test/java/org/apache/kylin/rest/config/initialize/JobSchedulerListenerTest.java
index f64e878136..ce0f57f526 100644
--- a/src/job-service/src/test/java/org/apache/kylin/rest/config/initialize/JobSchedulerListenerTest.java
+++ b/src/job-service/src/test/java/org/apache/kylin/rest/config/initialize/JobSchedulerListenerTest.java
@@ -430,7 +430,7 @@ public class JobSchedulerListenerTest extends NLocalFileMetadataTestCase {
         Assert.assertEquals("KE-060100201", jobInfo.getErrorCode());
         Assert.assertEquals("Please check whether the external environment(other systems, components, etc.) is normal.",
                 jobInfo.getSuggestion());
-        Assert.assertEquals("KE-060100201: An Exception occurred outside Kyligence Enterprise.", jobInfo.getMsg());
+        Assert.assertEquals("KE-060100201: An Exception occurred outside Kylin 5.0.", jobInfo.getMsg());
         Assert.assertTrue(jobInfo.getStacktrace().startsWith("java.lang.RuntimeException"));
     }
 
diff --git a/src/kylin-it/src/test/java/org/apache/kylin/query/engine/QueryExecTest.java b/src/kylin-it/src/test/java/org/apache/kylin/query/engine/QueryExecTest.java
index 7d7fcaba7c..a13cd7fb6a 100644
--- a/src/kylin-it/src/test/java/org/apache/kylin/query/engine/QueryExecTest.java
+++ b/src/kylin-it/src/test/java/org/apache/kylin/query/engine/QueryExecTest.java
@@ -75,8 +75,6 @@ public class QueryExecTest extends NLocalFileMetadataTestCase {
      */
     @Test
     public void testWorkWithoutKapAggregateReduceFunctionsRule() throws SQLException {
-        // Can not reproduce https://github.com/Kyligence/KAP/issues/15261 at 4.x
-        // we needn't introduce KapAggregateReduceFunctionsRule as we did in 3.x
         overwriteSystemProp("kylin.query.convert-sum-expression-enabled", "true");
         String SQL = "select sum(t.a1 * 2)  from ("
                 + "select sum(price/2) as a1, sum(ITEM_COUNT) as a2 from TEST_KYLIN_FACT group by LSTG_FORMAT_NAME"
diff --git a/src/query-booter/src/main/resources/config/config_library.csv b/src/query-booter/src/main/resources/config/config_library.csv
index 5f9429c661..631c673653 100644
--- a/src/query-booter/src/main/resources/config/config_library.csv
+++ b/src/query-booter/src/main/resources/config/config_library.csv
@@ -120,7 +120,7 @@ kylin.engine.streaming-segment-merge-ratio,Double,流式 segment 合并比率,St
 kylin.engine.streaming-segment-merge-threshold,Int,实时OLAP构建触发合并segment的条件,Real -time OLAP constructs the condition for triggering the combination of segment,TRUE,TRUE
 kylin.engine.streaming-trigger-once,boolean,实时OLAP构建仅触发一次,Real -time OLAP construction only triggers once,TRUE,TRUE
 kylin.engine.submit-hadoop-conf-dir,String,hadoop conf 目录,Hadoop Conf directory,TRUE,TRUE
-kylin.env,String,Kyligence Enterprise 部署模式,Kyligence Enterprise deployment mode,TRUE,TRUE
+kylin.env,String,Kylin 5.0 部署模式,Kylin 5.0 deployment mode,TRUE,TRUE
 kylin.env.channel,String,判断当前部署环境,Determine the current deployment environment,TRUE,TRUE
 kylin.env.engine-write-fs,String,配置构建写集群,Configuration Construction Writing Cluster,TRUE,TRUE
 kylin.env.hdfs-data-working-dir,String,数据存储路径,可以和working-dir分开,Data storage paths can be separated from Working-DIR,TRUE,TRUE
@@ -256,7 +256,7 @@ kylin.metadata.key-case-insensitive,boolean,当本地用户、用户组较多影
 kylin.metadata.ops-cron,boolean,定时备份元数据、垃圾清理的定时任务 cron 表达式,默认值是 0 0 0 * * *,"Timing backup metad data, the timing task of garbage clearance CRON expression, the default value is 0 0 0 * * * *",TRUE,TRUE
 kylin.metadata.random-admin-password.enabled,boolean,admin使用随机密码,admin uses random passwords,TRUE,TRUE
 kylin.metadata.semi-automatic-mode,boolean,前端:当前project 是否含有半自动的标志;后端:是否默认使用半自动档,Front end: whether the current Project contains semi -automatic signs; back -end: Whether the semi -automatic file is used by default,TRUE,TRUE
-kylin.metadata.url,String,Kyligence Enterprise 元数据库路径。,Kyligence Enterprise metad database path.,TRUE,TRUE
+kylin.metadata.url,String,Kylin 5.0 元数据库路径。,Kylin 5.0 metadata database path.,TRUE,TRUE
 kylin.metadata.validate-computed-column,boolean,是否检验可计算列,Whether to test can be calculated,TRUE,TRUE
 kylin.metrics.daily-influx-db,String,按天归档influxdb数据的数据库,Database of InfluxDB data by the day,TRUE,TRUE
 kylin.metrics.daily-max-retry-times,Int,按天归档influxdb数据时异常后重试次数,Archive by the day when INFLUXDB data is abnormal,TRUE,TRUE
@@ -323,7 +323,7 @@ kylin.query.engine.sparder-enabled,boolean,是否启用sparder引擎,Whether to
 kylin.query.engine.spark-scheduler-mode,String,spark 调度模式,spark scheduling mode,TRUE,TRUE
 kylin.query.engine.spark-sql-shuffle-partitions,Int,spark sql 中 join 或 aggregations 时的分区数,The number of partitions in spark sql when join or aggregations,TRUE,TRUE
 kylin.query.engine.split-group-sets-into-union,boolean,是否对 grouping sets 进行拆开处理,Whether the Grouping Sets is disassembled and processed,TRUE,TRUE
-kylin.query.escape-default-keyword,boolean,"Kyligence Enterprise 会自动为 default 转为""DEFAULT""。","Kyligence Enterprise will automatically convert to ""Default"" for Default.",TRUE,TRUE
+kylin.query.escape-default-keyword,boolean,"Kylin 5.0 会自动为 default 转为""DEFAULT""。","Kylin 5.0 will automatically convert to ""Default"" for Default.",TRUE,TRUE
 kylin.query.exception-cache-enabled,boolean,是否缓存失败的查询,Whether to cache failure query,TRUE,TRUE
 kylin.query.exception-cache-threshold-duration,Int,需要缓存异常查询的最小查询时间,The minimum query time that needs to be cached abnormal query,TRUE,TRUE
 kylin.query.exception-cache-threshold-times,Int,需要缓存异常查询的最大失败次数,The maximum failure of cache abnormal query,TRUE,TRUE
@@ -447,7 +447,7 @@ kylin.server.https.port,Int,https端口号,https port number,TRUE,TRUE
 kylin.server.leader-race.enabled,boolean,是否打开HA开关,Whether to turn on the HA switch,TRUE,TRUE
 kylin.server.leader-race.heart-beat-interval,Int,HA更新心跳频率,单位s,"HA update heartbeat frequency, unit S",TRUE,TRUE
 kylin.server.leader-race.heart-beat-timeout,Int,HA心跳超时时长,单位s,"HA's heartbeat time timeout, unit S",TRUE,TRUE
-kylin.server.mode,String,Kyligence Enterprise 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kyligence Enterprise instance can be one of the parameter values ​​in ALL or Query, and the default value is ALL.",TRUE,TRUE
+kylin.server.mode,String,Kylin 5.0 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kylin 5.0 instance can be one of the parameter values in ALL or Query, and the default value is ALL.",TRUE,TRUE
 kylin.server.renew-batch-size,Int,epoch renew批量更新一次最多更新的个数,EPOCH Renew Batch Update Most Updated Number,TRUE,TRUE
 kylin.server.renew-epoch-pool-size,Int,renew epoch 的线程池大小,Renew epoch's thread pool size,TRUE,TRUE
 kylin.server.streaming-change-meta,boolean,流式事务可以修改元数据,Streaming transactions can modify metadata data,TRUE,TRUE
@@ -540,7 +540,7 @@ kylin.web.export-allow-other,boolean,是否允许非 Admin 用户导出查询结
 kylin.web.properties.whitelist,String,提供给前端使用的配置项列表,Configuration items provided to the front -end use,TRUE,TRUE
 kylin.web.session.jdbc-encode-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
 kylin.web.session.secure-random-create-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
-kylin.web.timezone,String,Kyligence Enterprise 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kyligence Enterprise's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
+kylin.web.timezone,String,Kylin 5.0 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kylin 5.0's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
 server.port,Int,服务启动的端口,The port of the service start,TRUE,TRUE
 spring.session.store-type,String,spring session 存储类型,Spring session storage type,TRUE,TRUE
 kylin.engine.dynamic-resource-plan-enabled,boolean,在FI平台使用动态资源计划时候,有可能导致无法分配资源,从而构建失败,"When using a dynamic resource plan on the FI platform, it may lead to the failure to allocate resources and build a failure",TRUE,TRUE
diff --git a/src/query-common/src/main/java/org/apache/kylin/query/relnode/ContextUtil.java b/src/query-common/src/main/java/org/apache/kylin/query/relnode/ContextUtil.java
index 766335ce7a..ea7c417d2b 100644
--- a/src/query-common/src/main/java/org/apache/kylin/query/relnode/ContextUtil.java
+++ b/src/query-common/src/main/java/org/apache/kylin/query/relnode/ContextUtil.java
@@ -149,7 +149,6 @@ public class ContextUtil {
                     hasCountConstant);
 
         } else {
-            //https://github.com/Kyligence/KAP/issues/9952
             //do not support agg pushdown if WindowRel, SortRel, LimitRel, ValueRel is met
             return false;
         }
diff --git a/src/query/src/main/java/io/kyligence/kap/query/optrule/ExtensionOlapJoinRule.java b/src/query/src/main/java/io/kyligence/kap/query/optrule/ExtensionOlapJoinRule.java
index 6732429376..6264572b08 100644
--- a/src/query/src/main/java/io/kyligence/kap/query/optrule/ExtensionOlapJoinRule.java
+++ b/src/query/src/main/java/io/kyligence/kap/query/optrule/ExtensionOlapJoinRule.java
@@ -66,7 +66,6 @@ public class ExtensionOlapJoinRule extends ConverterRule {
 
         final JoinInfo info = JoinInfo.of(left, right, join.getCondition());
 
-        // handle powerbi inner join, see https://github.com/Kyligence/KAP/issues/1823
         Join tmpJoin = transformJoinCondition(join, info, traitSet, left, right);
         if (tmpJoin instanceof OLAPJoinRel) {
             return tmpJoin;
diff --git a/src/query/src/main/java/io/kyligence/kap/query/optrule/KapJoinRule.java b/src/query/src/main/java/io/kyligence/kap/query/optrule/KapJoinRule.java
index 8e82375827..5b364b3663 100644
--- a/src/query/src/main/java/io/kyligence/kap/query/optrule/KapJoinRule.java
+++ b/src/query/src/main/java/io/kyligence/kap/query/optrule/KapJoinRule.java
@@ -81,7 +81,7 @@ public class KapJoinRule extends ConverterRule {
 
         final JoinInfo info = JoinInfo.of(left, right, join.getCondition());
 
-        // handle powerbi inner join, see https://github.com/Kyligence/KAP/issues/1823
+        // handle powerbi inner join
         Join tmpJoin = transformJoinCondition(join, info, traitSet, left, right);
         if (tmpJoin instanceof KapJoinRel) {
             return tmpJoin;
diff --git a/src/query/src/main/java/io/kyligence/kap/query/optrule/SumBasicOperatorRule.java b/src/query/src/main/java/io/kyligence/kap/query/optrule/SumBasicOperatorRule.java
index 8671b0d805..e3570cb730 100644
--- a/src/query/src/main/java/io/kyligence/kap/query/optrule/SumBasicOperatorRule.java
+++ b/src/query/src/main/java/io/kyligence/kap/query/optrule/SumBasicOperatorRule.java
@@ -361,7 +361,6 @@ public class SumBasicOperatorRule extends RelOptRule {
 
     private void verifyPlusOrMinus(RexCall exprCall) {
         // plus or minus does not support SUM EXPRESSION caused by null values
-        // please see https://github.com/Kyligence/KAP/issues/14627
         throw new SumExprUnSupportException("That PLUS/MINUS of the columns is not supported for sum expression");
     }
 
diff --git a/src/query/src/main/java/org/apache/kylin/query/engine/SQLConverter.java b/src/query/src/main/java/org/apache/kylin/query/engine/SQLConverter.java
index 201bd469b8..e80bdfbb83 100644
--- a/src/query/src/main/java/org/apache/kylin/query/engine/SQLConverter.java
+++ b/src/query/src/main/java/org/apache/kylin/query/engine/SQLConverter.java
@@ -184,7 +184,6 @@ public class SQLConverter {
         }
 
         /* OVERRIDE POINT */
-        // https://github.com/Kyligence/KAP/issues/10964
         RelNode rel = root.rel;
         if (connectionConfig.projectUnderRelRoot() && !root.isRefTrivial()) {
             final List<RexNode> projects = new ArrayList<>();
diff --git a/src/query/src/test/java/org/apache/kylin/query/relnode/ContextUtilTest.java b/src/query/src/test/java/org/apache/kylin/query/relnode/ContextUtilTest.java
index 11608c5827..85859077ae 100644
--- a/src/query/src/test/java/org/apache/kylin/query/relnode/ContextUtilTest.java
+++ b/src/query/src/test/java/org/apache/kylin/query/relnode/ContextUtilTest.java
@@ -30,7 +30,6 @@ import org.mockito.Mockito;
 
 public class ContextUtilTest {
 
-    //https://github.com/Kyligence/KAP/issues/9952
     //do not support agg pushdown if WindowRel, SortRel, LimitRel, ValueRel is met
     @Test
     public void testDerivedFromSameContextWhenMetWindowOrSort() throws Exception {
diff --git a/src/query/src/test/java/org/apache/kylin/query/util/ExpressionComparatorTest.java b/src/query/src/test/java/org/apache/kylin/query/util/ExpressionComparatorTest.java
index 60fedb60a3..b7be5e5aba 100644
--- a/src/query/src/test/java/org/apache/kylin/query/util/ExpressionComparatorTest.java
+++ b/src/query/src/test/java/org/apache/kylin/query/util/ExpressionComparatorTest.java
@@ -139,7 +139,6 @@ class ExpressionComparatorTest {
 
     @Test
     void testNoNPE() {
-        //https://github.com/Kyligence/KAP/issues/10934
         String sql0 = "select a.a + a.b + a.c from t as a";
         String sql1 = "select a.a + a.b + a.c from t as a";
         String sql2 = "select 1";
diff --git a/src/query/src/test/java/org/apache/kylin/query/util/QueryUtilTest.java b/src/query/src/test/java/org/apache/kylin/query/util/QueryUtilTest.java
index 8a703d262a..3d0ac0f1c5 100644
--- a/src/query/src/test/java/org/apache/kylin/query/util/QueryUtilTest.java
+++ b/src/query/src/test/java/org/apache/kylin/query/util/QueryUtilTest.java
@@ -264,10 +264,10 @@ public class QueryUtilTest extends NLocalFileMetadataTestCase {
 
         final Exception exception = new IllegalStateException(
                 "\tThere is no column\t'age' in table 'test_kylin_fact'.\n"
-                        + "Please contact Kyligence Enterprise technical support for more details.\n");
+                        + "Please contact Kylin 5.0 technical support for more details.\n");
         final String errorMsg = QueryUtil.makeErrorMsgUserFriendly(exception);
         Assert.assertEquals("There is no column\t'age' in table 'test_kylin_fact'.\n"
-                + "Please contact Kyligence Enterprise technical support for more details.", errorMsg);
+                + "Please contact Kylin 5.0 technical support for more details.", errorMsg);
     }
 
     @Test
diff --git a/src/server/src/main/java/org/apache/kylin/rest/BootstrapServer.java b/src/server/src/main/java/org/apache/kylin/rest/BootstrapServer.java
index 474ccad787..fd436c4c73 100644
--- a/src/server/src/main/java/org/apache/kylin/rest/BootstrapServer.java
+++ b/src/server/src/main/java/org/apache/kylin/rest/BootstrapServer.java
@@ -130,7 +130,7 @@ public class BootstrapServer implements ISmartApplicationListenerForSystem {
         if (event instanceof ApplicationReadyEvent) {
             logger.info("init backend end...");
         } else if (event instanceof ContextClosedEvent) {
-            logger.info("Stop Kyligence node...");
+            logger.info("Stop Kylin 5.0 node...");
             EpochManager.getInstance().releaseOwnedEpochs();
         }
     }
diff --git a/src/server/src/main/java/org/apache/kylin/rest/InitConfiguration.java b/src/server/src/main/java/org/apache/kylin/rest/InitConfiguration.java
index e1fa6928aa..99e675154c 100644
--- a/src/server/src/main/java/org/apache/kylin/rest/InitConfiguration.java
+++ b/src/server/src/main/java/org/apache/kylin/rest/InitConfiguration.java
@@ -46,7 +46,7 @@ public class InitConfiguration {
     public void init() {
         if (KylinConfig.getInstanceFromEnv().isCheckHostname() && hostInfoFetcher.getHostname().indexOf("_") != -1) {
             throw new KylinRuntimeException("The hostname does not support containing '_' characters,"
-                    + " please modify the hostname of Kyligence Enterprise nodes.");
+                    + " please modify the hostname of Kylin 5.0 nodes.");
         }
     }
 
diff --git a/src/server/src/main/java/org/apache/kylin/rest/config/SwaggerConfig.java b/src/server/src/main/java/org/apache/kylin/rest/config/SwaggerConfig.java
index c153787696..a9270561b0 100644
--- a/src/server/src/main/java/org/apache/kylin/rest/config/SwaggerConfig.java
+++ b/src/server/src/main/java/org/apache/kylin/rest/config/SwaggerConfig.java
@@ -37,7 +37,7 @@ public class SwaggerConfig {
 
     public static final String LICENSE = "Apache 2.0";
     public static final String SWAGGER_LICENSE_URL = "http://www.apache.org/licenses/LICENSE-2.0.html";
-    public static final String TITLE = "Kyligence Enterprise API";
+    public static final String TITLE = "Kylin 5.0 API";
 
     @Order(2)
     @Bean(value = "v4 public")
diff --git a/src/server/src/main/resources/config/config_library.csv b/src/server/src/main/resources/config/config_library.csv
index 5f9429c661..631c673653 100644
--- a/src/server/src/main/resources/config/config_library.csv
+++ b/src/server/src/main/resources/config/config_library.csv
@@ -120,7 +120,7 @@ kylin.engine.streaming-segment-merge-ratio,Double,流式 segment 合并比率,St
 kylin.engine.streaming-segment-merge-threshold,Int,实时OLAP构建触发合并segment的条件,Real -time OLAP constructs the condition for triggering the combination of segment,TRUE,TRUE
 kylin.engine.streaming-trigger-once,boolean,实时OLAP构建仅触发一次,Real -time OLAP construction only triggers once,TRUE,TRUE
 kylin.engine.submit-hadoop-conf-dir,String,hadoop conf 目录,Hadoop Conf directory,TRUE,TRUE
-kylin.env,String,Kyligence Enterprise 部署模式,Kyligence Enterprise deployment mode,TRUE,TRUE
+kylin.env,String,Kylin 5.0 部署模式,Kylin 5.0 deployment mode,TRUE,TRUE
 kylin.env.channel,String,判断当前部署环境,Determine the current deployment environment,TRUE,TRUE
 kylin.env.engine-write-fs,String,配置构建写集群,Configuration Construction Writing Cluster,TRUE,TRUE
 kylin.env.hdfs-data-working-dir,String,数据存储路径,可以和working-dir分开,Data storage paths can be separated from Working-DIR,TRUE,TRUE
@@ -256,7 +256,7 @@ kylin.metadata.key-case-insensitive,boolean,当本地用户、用户组较多影
 kylin.metadata.ops-cron,boolean,定时备份元数据、垃圾清理的定时任务 cron 表达式,默认值是 0 0 0 * * *,"Timing backup metad data, the timing task of garbage clearance CRON expression, the default value is 0 0 0 * * * *",TRUE,TRUE
 kylin.metadata.random-admin-password.enabled,boolean,admin使用随机密码,admin uses random passwords,TRUE,TRUE
 kylin.metadata.semi-automatic-mode,boolean,前端:当前project 是否含有半自动的标志;后端:是否默认使用半自动档,Front end: whether the current Project contains semi -automatic signs; back -end: Whether the semi -automatic file is used by default,TRUE,TRUE
-kylin.metadata.url,String,Kyligence Enterprise 元数据库路径。,Kyligence Enterprise metad database path.,TRUE,TRUE
+kylin.metadata.url,String,Kylin 5.0 元数据库路径。,Kylin 5.0 metadata database path.,TRUE,TRUE
 kylin.metadata.validate-computed-column,boolean,是否检验可计算列,Whether to test can be calculated,TRUE,TRUE
 kylin.metrics.daily-influx-db,String,按天归档influxdb数据的数据库,Database of InfluxDB data by the day,TRUE,TRUE
 kylin.metrics.daily-max-retry-times,Int,按天归档influxdb数据时异常后重试次数,Archive by the day when INFLUXDB data is abnormal,TRUE,TRUE
@@ -323,7 +323,7 @@ kylin.query.engine.sparder-enabled,boolean,是否启用sparder引擎,Whether to
 kylin.query.engine.spark-scheduler-mode,String,spark 调度模式,spark scheduling mode,TRUE,TRUE
 kylin.query.engine.spark-sql-shuffle-partitions,Int,spark sql 中 join 或 aggregations 时的分区数,The number of partitions in spark sql when join or aggregations,TRUE,TRUE
 kylin.query.engine.split-group-sets-into-union,boolean,是否对 grouping sets 进行拆开处理,Whether the Grouping Sets is disassembled and processed,TRUE,TRUE
-kylin.query.escape-default-keyword,boolean,"Kyligence Enterprise 会自动为 default 转为""DEFAULT""。","Kyligence Enterprise will automatically convert to ""Default"" for Default.",TRUE,TRUE
+kylin.query.escape-default-keyword,boolean,"Kylin 5.0 会自动为 default 转为""DEFAULT""。","Kylin 5.0 will automatically convert to ""Default"" for Default.",TRUE,TRUE
 kylin.query.exception-cache-enabled,boolean,是否缓存失败的查询,Whether to cache failure query,TRUE,TRUE
 kylin.query.exception-cache-threshold-duration,Int,需要缓存异常查询的最小查询时间,The minimum query time that needs to be cached abnormal query,TRUE,TRUE
 kylin.query.exception-cache-threshold-times,Int,需要缓存异常查询的最大失败次数,The maximum failure of cache abnormal query,TRUE,TRUE
@@ -447,7 +447,7 @@ kylin.server.https.port,Int,https端口号,https port number,TRUE,TRUE
 kylin.server.leader-race.enabled,boolean,是否打开HA开关,Whether to turn on the HA switch,TRUE,TRUE
 kylin.server.leader-race.heart-beat-interval,Int,HA更新心跳频率,单位s,"HA update heartbeat frequency, unit S",TRUE,TRUE
 kylin.server.leader-race.heart-beat-timeout,Int,HA心跳超时时长,单位s,"HA's heartbeat time timeout, unit S",TRUE,TRUE
-kylin.server.mode,String,Kyligence Enterprise 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kyligence Enterprise instance can be one of the parameter values ​​in ALL or Query, and the default value is ALL.",TRUE,TRUE
+kylin.server.mode,String,Kylin 5.0 实例的运行模式,参数值可以是 all 或 query 中的一个,默认值为 all。,"The running mode of the Kylin 5.0 instance can be one of the parameter values in ALL or Query, and the default value is ALL.",TRUE,TRUE
 kylin.server.renew-batch-size,Int,epoch renew批量更新一次最多更新的个数,EPOCH Renew Batch Update Most Updated Number,TRUE,TRUE
 kylin.server.renew-epoch-pool-size,Int,renew epoch 的线程池大小,Renew epoch's thread pool size,TRUE,TRUE
 kylin.server.streaming-change-meta,boolean,流式事务可以修改元数据,Streaming transactions can modify metadata data,TRUE,TRUE
@@ -540,7 +540,7 @@ kylin.web.export-allow-other,boolean,是否允许非 Admin 用户导出查询结
 kylin.web.properties.whitelist,String,提供给前端使用的配置项列表,Configuration items provided to the front -end use,TRUE,TRUE
 kylin.web.session.jdbc-encode-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
 kylin.web.session.secure-random-create-enabled,boolean,session id存入数据库中是否加密,Session ID is encrypted into the database,TRUE,TRUE
-kylin.web.timezone,String,Kyligence Enterprise 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kyligence Enterprise's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
+kylin.web.timezone,String,Kylin 5.0 的 REST 服务所使用的时区。默认值为本地机器的系统的时区。,The time zone used by Kylin 5.0's REST service. The default value is the time zone of the system of the local machine.,TRUE,TRUE
 server.port,Int,服务启动的端口,The port of the service start,TRUE,TRUE
 spring.session.store-type,String,spring session 存储类型,Spring session storage type,TRUE,TRUE
 kylin.engine.dynamic-resource-plan-enabled,boolean,在FI平台使用动态资源计划时候,有可能导致无法分配资源,从而构建失败,"When using a dynamic resource plan on the FI platform, it may lead to the failure to allocate resources and build a failure",TRUE,TRUE
diff --git a/src/systools/pom.xml b/src/systools/pom.xml
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/src/systools/src/test/resources/ehcache.xml b/src/systools/src/test/resources/ehcache.xml
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/src/tool/src/test/java/org/apache/kylin/tool/AbstractInfoExtractorToolTest.java b/src/tool/src/test/java/org/apache/kylin/tool/AbstractInfoExtractorToolTest.java
index 641bed2d75..ac18d5b841 100644
--- a/src/tool/src/test/java/org/apache/kylin/tool/AbstractInfoExtractorToolTest.java
+++ b/src/tool/src/test/java/org/apache/kylin/tool/AbstractInfoExtractorToolTest.java
@@ -102,9 +102,9 @@ public class AbstractInfoExtractorToolTest extends NLocalFileMetadataTestCase {
         MockInfoExtractorTool mock = new MockInfoExtractorTool();
 
         File license = new File(ToolUtil.getKylinHome(), "LICENSE");
-        String licenseInfo = "Evaluation license for Kyligence Enterprise\n" + "Category: 4.x\n" + "SLA Service: NO\n"
+        String licenseInfo = "Evaluation license for Kylin 5.0\n" + "Category: 4.x\n" + "SLA Service: NO\n"
                 + "Volume: 1\n" + "Level: professional\n"
-                + "Insight License: 5 users; evaluation; 2019-06-01,2019-07-30\n" + "====\n" + "Kyligence Enterprise\n"
+                + "Insight License: 5 users; evaluation; 2019-06-01,2019-07-30\n" + "====\n" + "Kylin 5.0\n"
                 + "2019-06-01,2019-07-30\n"
                 + "19d4801b6dardchr83bp3i7wadbdvycs8ay7ibicu2msfogl6kiwz7z3dmdizepmicl3bgqznn34794jt5g51sutofcfpn9jeiw5k3cvt2750faxw7ip1fp08mt3og6xijt4x02euf1zkrn5m7huwal8lqms3gmn0d5i8y2dqlvkvpqtwz3m9tqcnq6n4lznthbdtfncdqsly7a8v9pndh1cav2tdcczzs17ns6e0d4izeatwybr25lir5f5s6qe4ry10x2fkqco7unb4h4ivx8jo6vdb5sp3r4738zhlvrbdwfa38s3wh82lrnugrhxq8eap3rebq9dz8xka713aui4v2acquulicdadt63cv0biz7y7eccfh1tri60526b2bmon71k29n6p29tsbhyl2wdx5hsjuxg2wd993hcndot1fc5oz8kebopqrudyf4o7tjc5ca0bvtysnw3gn64c1sd2 [...]
         FileUtils.writeStringToFile(license, licenseInfo);
diff --git a/src/tool/src/test/java/org/apache/kylin/tool/upgrade/UpdateUserAclToolTest.java b/src/tool/src/test/java/org/apache/kylin/tool/upgrade/UpdateUserAclToolTest.java
index eec36ce0f3..a82b34cde4 100644
--- a/src/tool/src/test/java/org/apache/kylin/tool/upgrade/UpdateUserAclToolTest.java
+++ b/src/tool/src/test/java/org/apache/kylin/tool/upgrade/UpdateUserAclToolTest.java
@@ -117,10 +117,10 @@ public class UpdateUserAclToolTest extends NLocalFileMetadataTestCase {
 
     @Test
     public void testParseVersion() {
-        Assert.assertEquals("4.5.9", tool.parseVersion("Kyligence Enterprise 4.5.9"));
-        Assert.assertEquals("4.5.9", tool.parseVersion("Kyligence Enterprise 4.5.9-dev"));
-        Assert.assertEquals("4.5.9.0", tool.parseVersion("Kyligence Enterprise 4.5.9.0"));
-        Assert.assertEquals("4.5.9.3299", tool.parseVersion("Kyligence Enterprise 4.5.9.3299"));
+        Assert.assertEquals("4.5.9", tool.parseVersion("Kylin 5.0 4.5.9"));
+        Assert.assertEquals("4.5.9", tool.parseVersion("Kylin 5.0 4.5.9-dev"));
+        Assert.assertEquals("4.5.9.0", tool.parseVersion("Kylin 5.0 4.5.9.0"));
+        Assert.assertEquals("4.5.9.3299", tool.parseVersion("Kylin 5.0 4.5.9.3299"));
     }
 
     @Test
diff --git a/src/tool/src/test/java/org/apache/kylin/tool/util/ServerInfoUtilTest.java b/src/tool/src/test/java/org/apache/kylin/tool/util/ServerInfoUtilTest.java
index 7022338e10..3cb4617e87 100644
--- a/src/tool/src/test/java/org/apache/kylin/tool/util/ServerInfoUtilTest.java
+++ b/src/tool/src/test/java/org/apache/kylin/tool/util/ServerInfoUtilTest.java
@@ -54,7 +54,7 @@ public class ServerInfoUtilTest extends NLocalFileMetadataTestCase {
             clearFileList.add(sha1File);
         }
 
-        String version = "Kyligence Enterprise 4.0.0-SNAPSHOT";
+        String version = "Kylin 5.0 4.0.0-SNAPSHOT";
         File versionFile = new File(KylinConfig.getKylinHome(), "VERSION");
         if (!versionFile.exists()) {
             FileUtils.writeStringToFile(versionFile, version);
diff --git a/src/tool/src/test/resources/diag/eventlog b/src/tool/src/test/resources/diag/eventlog
index f3c39b9851..20faf87073 100644
--- a/src/tool/src/test/resources/diag/eventlog
+++ b/src/tool/src/test/resources/diag/eventlog
@@ -1,5 +1,5 @@
 {"Event":"SparkListenerLogStart","Spark Version":"3.1.1-kylin-4.x-r21"}
 {"Event":"SparkListenerResourceProfileAdded","Resource Profile Id":0,"Executor Resource Requests":{"memoryOverhead":{"Resource Name":"memoryOverhead","Amount":512,"Discovery Script":"","Vendor":""},"cores":{"Resource Name":"cores","Amount":1,"Discovery Script":"","Vendor":""},"memory":{"Resource Name":"memory","Amount":512,"Discovery Script":"","Vendor":""},"offHeap":{"Resource Name":"offHeap","Amount":0,"Discovery Script":"","Vendor":""}},"Task Resource Requests":{"cpus":{"Resource Name [...]
 {"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"driver","Host":"sandbox.hortonworks.com","Port":42844},"Maximum Memory":4965217075,"Timestamp":1617822026015,"Maximum Onheap Memory":4965217075,"Maximum Offheap Memory":0}
-{"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java Home":"/usr/lib/jvm/jdk1.8.0_171/jre","Java Version":"1.8.0_171 (Oracle Corporation)","Scala Version":"version 2.12.10"},"Spark Properties":{"spark.port.maxRetries":"128","spark.yarn.dist.files":"/root/zhoujiazhi/test/server/conf/spark-executor-log4j.xml,/root/zhoujiazhi/test/server/conf/spark-appmaster-log4j.xml","spark.scheduler.allocation.file":"/root/zhoujiazhi/test/conf/fairscheduler.xml","spark.executor.extraJavaOpt [...]
+{"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java Home":"/usr/lib/jvm/jdk1.8.0_171/jre","Java Version":"1.8.0_171 (Oracle Corporation)","Scala Version":"version 2.12.10"},"Spark Properties":{"spark.port.maxRetries":"128","spark.yarn.dist.files":"/root/zhoujiazhi/test/server/conf/spark-executor-log4j.xml,/root/zhoujiazhi/test/server/conf/spark-appmaster-log4j.xml","spark.scheduler.allocation.file":"/root/zhoujiazhi/test/conf/fairscheduler.xml","spark.executor.extraJavaOpt [...]
 {"Event":"SparkListenerApplicationStart","App Name":"sparder-root-sandbox.hortonworks.com","App ID":"application_1617666536623_0144","Timestamp":1617822012674,"User":"root"}