Posted to commits@dolphinscheduler.apache.org by zh...@apache.org on 2022/11/09 13:20:29 UTC

[dolphinscheduler] branch dev updated: [chore] Separate Python API into another repository (#12779)

This is an automated email from the ASF dual-hosted git repository.

zhongjiajie pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/dolphinscheduler.git


The following commit(s) were added to refs/heads/dev by this push:
     new 1347a8f94c [chore] Separate Python API into another repository (#12779)
1347a8f94c is described below

commit 1347a8f94c45c447df9c5e1d0fcecfe17966b68b
Author: Jay Chung <zh...@gmail.com>
AuthorDate: Wed Nov 9 21:20:15 2022 +0800

    [chore] Separate Python API into another repository (#12779)
    
    Currently, our Python API code is a module in the apache/dolphinscheduler codebase,
    so each time users change Python API code they have to run the full CI checks for
    both dolphinscheduler and the Python API. But if a user only changes Python code,
    the change could be merged once the Python API CI passes, without depending on the
    other CI.
    
    Besides, we release the Python API with the same version number as dolphinscheduler,
    which makes it easy for users to match Python API versions. But when the Python API
    has no code changes and dolphinscheduler releases a bugfix version, the Python API
    still has to publish a new version just to match dolphinscheduler. This happened
    when we released Python API 2.0.6 and 2.0.7: both were bugfix releases with no
    Python API code changes, so the PyPI packages were identical.
    
    Separating the Python API also makes the code easier to reason about: dolphinscheduler
    and the new Python API repository will each hold more clearly delineated code, with a
    separate issue tracker and changelog to keep users informed.
    
    ref PR in other repository: apache/dolphinscheduler-sdk-python#1
    
    see more detail in mail thread: https://lists.apache.org/thread/4z7l5l54c4d81smjlk1n8nq380p9f0oo
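    
    After the split, users consume the Python API from PyPI rather than from this
    repository, and a minimal workflow definition stays the same. The sketch below is
    loosely based on the bundled tutorial example being removed here; the tenant name,
    task name, and a running API server with the Python gateway enabled are assumptions:
    
        # pip install apache-dolphinscheduler
        from pydolphinscheduler.core.process_definition import ProcessDefinition
        from pydolphinscheduler.tasks.shell import Shell
    
        # "tenant_exists" is a placeholder tenant that must already exist on the server
        with ProcessDefinition(name="tutorial", tenant="tenant_exists") as pd:
            # a single shell task; more tasks and dependencies can be added the same way
            Shell(name="say_hello", command="echo hello pydolphinscheduler")
            # submit() registers the workflow through the Python gateway
            pd.submit()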
---
 .../pydolphinscheduler/.flake8 => .flake8          |  12 -
 .github/CODEOWNERS                                 |   1 -
 .github/actions/labeler/labeler.yml                |   4 -
 .github/workflows/py-ci.yml                        | 205 ------
 .github/workflows/unit-test.yml                    |   1 -
 .gitignore                                         |  15 -
 .pre-commit-config.yaml                            |   5 -
 README.md                                          |   7 +-
 docs/docs/en/contribute/release/release-post.md    |   7 +-
 docs/docs/en/contribute/release/release-prepare.md |   1 -
 docs/docs/en/contribute/release/release.md         |  34 +-
 docs/docs/zh/contribute/release/release-post.md    |   7 +-
 docs/docs/zh/contribute/release/release-prepare.md |   1 -
 docs/docs/zh/contribute/release/release.md         |  34 +-
 dolphinscheduler-api/pom.xml                       |   2 +-
 dolphinscheduler-dist/pom.xml                      |  35 -
 .../main/assembly/dolphinscheduler-python-api.xml  |  34 -
 .../src/main/assembly/dolphinscheduler-src.xml     |   7 -
 dolphinscheduler-python/pom.xml                    | 165 -----
 .../pydolphinscheduler/.coveragerc                 |  34 -
 .../pydolphinscheduler/.isort.cfg                  |  19 -
 .../pydolphinscheduler/DEVELOP.md                  | 265 -------
 dolphinscheduler-python/pydolphinscheduler/LICENSE | 228 ------
 dolphinscheduler-python/pydolphinscheduler/NOTICE  |   5 -
 .../pydolphinscheduler/README.md                   |  90 ---
 .../pydolphinscheduler/RELEASE.md                  |  35 -
 .../pydolphinscheduler/UPDATING.md                 |  40 --
 .../pydolphinscheduler/docs/Makefile               |  44 --
 .../pydolphinscheduler/docs/make.bat               |  54 --
 .../docs/source/_static/.gitkeep                   |   0
 .../docs/source/_templates/versioning.html         |  27 -
 .../docs/source/_templates/versions.html           |  46 --
 .../pydolphinscheduler/docs/source/api.rst         |  47 --
 .../pydolphinscheduler/docs/source/cli.rst         |  36 -
 .../pydolphinscheduler/docs/source/concept.rst     | 151 ----
 .../pydolphinscheduler/docs/source/conf.py         | 121 ----
 .../pydolphinscheduler/docs/source/config.rst      | 218 ------
 .../pydolphinscheduler/docs/source/howto/index.rst |  30 -
 .../docs/source/howto/remote-submit.rst            |  51 --
 .../pydolphinscheduler/docs/source/index.rst       |  46 --
 .../docs/source/resources_plugin/develop.rst       |  46 --
 .../docs/source/resources_plugin/github.rst        |  35 -
 .../docs/source/resources_plugin/gitlab.rst        |  46 --
 .../docs/source/resources_plugin/index.rst         |  32 -
 .../docs/source/resources_plugin/local.rst         |  32 -
 .../docs/source/resources_plugin/oss.rst           |  44 --
 .../source/resources_plugin/resource-plugin.rst    |  75 --
 .../docs/source/resources_plugin/s3.rst            |  36 -
 .../pydolphinscheduler/docs/source/start.rst       | 171 -----
 .../docs/source/tasks/condition.rst                |  40 --
 .../pydolphinscheduler/docs/source/tasks/datax.rst |  46 --
 .../docs/source/tasks/dependent.rst                |  47 --
 .../pydolphinscheduler/docs/source/tasks/dvc.rst   |  41 --
 .../pydolphinscheduler/docs/source/tasks/flink.rst |  40 --
 .../docs/source/tasks/func_wrap.rst                |  33 -
 .../pydolphinscheduler/docs/source/tasks/http.rst  |  29 -
 .../pydolphinscheduler/docs/source/tasks/index.rst |  48 --
 .../docs/source/tasks/map_reduce.rst               |  42 --
 .../docs/source/tasks/mlflow.rst                   |  42 --
 .../docs/source/tasks/openmldb.rst                 |  42 --
 .../docs/source/tasks/procedure.rst                |  29 -
 .../docs/source/tasks/python.rst                   |  29 -
 .../docs/source/tasks/pytorch.rst                  |  42 --
 .../docs/source/tasks/sagemaker.rst                |  46 --
 .../pydolphinscheduler/docs/source/tasks/shell.rst |  41 --
 .../pydolphinscheduler/docs/source/tasks/spark.rst |  41 --
 .../pydolphinscheduler/docs/source/tasks/sql.rst   |  35 -
 .../docs/source/tasks/sub_process.rst              |  38 -
 .../docs/source/tasks/switch.rst                   |  42 --
 .../pydolphinscheduler/docs/source/tutorial.rst    | 319 ---------
 .../examples/yaml_define/Condition.yaml            |  43 --
 .../examples/yaml_define/DataX.yaml                |  33 -
 .../examples/yaml_define/Dependent.yaml            |  76 --
 .../examples/yaml_define/Dependent_External.yaml   |  26 -
 .../examples/yaml_define/Dvc.yaml                  |  46 --
 .../examples/yaml_define/Flink.yaml                |  29 -
 .../examples/yaml_define/Http.yaml                 |  37 -
 .../examples/yaml_define/MapReduce.yaml            |  29 -
 .../examples/yaml_define/MoreConfiguration.yaml    |  40 --
 .../examples/yaml_define/OpenMLDB.yaml             |  33 -
 .../examples/yaml_define/Procedure.yaml            |  27 -
 .../examples/yaml_define/Python.yaml               |  30 -
 .../examples/yaml_define/Pytorch.yaml              |  53 --
 .../examples/yaml_define/Sagemaker.yaml            |  28 -
 .../examples/yaml_define/Shell.yaml                |  40 --
 .../examples/yaml_define/Spark.yaml                |  29 -
 .../examples/yaml_define/Sql.yaml                  |  45 --
 .../examples/yaml_define/SubProcess.yaml           |  27 -
 .../examples/yaml_define/Switch.yaml               |  39 -
 .../examples/yaml_define/example_datax.json        |  62 --
 .../yaml_define/example_sagemaker_params.json      |  18 -
 .../examples/yaml_define/example_sql.sql           |  22 -
 .../examples/yaml_define/example_sub_workflow.yaml |  26 -
 .../examples/yaml_define/mlflow.yaml               |  69 --
 .../examples/yaml_define/tutorial.yaml             |  46 --
 .../pydolphinscheduler/pytest.ini                  |  21 -
 .../pydolphinscheduler/setup.cfg                   |  16 -
 .../pydolphinscheduler/setup.py                    | 198 -----
 .../src/pydolphinscheduler/__init__.py             |  22 -
 .../src/pydolphinscheduler/cli/__init__.py         |  18 -
 .../src/pydolphinscheduler/cli/commands.py         | 106 ---
 .../src/pydolphinscheduler/configuration.py        | 193 -----
 .../src/pydolphinscheduler/constants.py            | 122 ----
 .../src/pydolphinscheduler/core/__init__.py        |  30 -
 .../src/pydolphinscheduler/core/database.py        |  62 --
 .../src/pydolphinscheduler/core/engine.py          |  94 ---
 .../pydolphinscheduler/core/process_definition.py  | 424 -----------
 .../src/pydolphinscheduler/core/resource.py        |  73 --
 .../src/pydolphinscheduler/core/resource_plugin.py |  58 --
 .../src/pydolphinscheduler/core/task.py            | 384 ----------
 .../pydolphinscheduler/core/yaml_process_define.py | 466 ------------
 .../src/pydolphinscheduler/default_config.yaml     |  58 --
 .../src/pydolphinscheduler/examples/__init__.py    |  18 -
 .../examples/bulk_create_example.py                |  55 --
 .../examples/task_condition_example.py             |  59 --
 .../examples/task_datax_example.py                 |  95 ---
 .../examples/task_dependent_example.py             |  74 --
 .../examples/task_dvc_example.py                   |  52 --
 .../examples/task_flink_example.py                 |  33 -
 .../examples/task_map_reduce_example.py            |  34 -
 .../examples/task_mlflow_example.py                |  93 ---
 .../examples/task_openmldb_example.py              |  43 --
 .../examples/task_pytorch_example.py               |  62 --
 .../examples/task_sagemaker_example.py             |  46 --
 .../examples/task_spark_example.py                 |  33 -
 .../examples/task_switch_example.py                |  51 --
 .../src/pydolphinscheduler/examples/tutorial.py    |  68 --
 .../examples/tutorial_decorator.py                 |  91 ---
 .../examples/tutorial_resource_plugin.py           |  64 --
 .../src/pydolphinscheduler/exceptions.py           |  46 --
 .../src/pydolphinscheduler/java_gateway.py         | 308 --------
 .../src/pydolphinscheduler/models/__init__.py      |  36 -
 .../src/pydolphinscheduler/models/base.py          |  74 --
 .../src/pydolphinscheduler/models/base_side.py     |  48 --
 .../src/pydolphinscheduler/models/project.py       |  72 --
 .../src/pydolphinscheduler/models/queue.py         |  34 -
 .../src/pydolphinscheduler/models/tenant.py        |  80 ---
 .../src/pydolphinscheduler/models/user.py          | 130 ----
 .../src/pydolphinscheduler/models/worker_group.py  |  30 -
 .../resources_plugin/__init__.py                   |  25 -
 .../resources_plugin/base/__init__.py              |  18 -
 .../resources_plugin/base/bucket.py                |  86 ---
 .../resources_plugin/base/git.py                   | 115 ---
 .../pydolphinscheduler/resources_plugin/github.py  | 106 ---
 .../pydolphinscheduler/resources_plugin/gitlab.py  | 112 ---
 .../pydolphinscheduler/resources_plugin/local.py   |  56 --
 .../src/pydolphinscheduler/resources_plugin/oss.py |  76 --
 .../src/pydolphinscheduler/resources_plugin/s3.py  |  74 --
 .../src/pydolphinscheduler/tasks/__init__.py       |  69 --
 .../src/pydolphinscheduler/tasks/condition.py      | 204 ------
 .../src/pydolphinscheduler/tasks/datax.py          | 127 ----
 .../src/pydolphinscheduler/tasks/dependent.py      | 273 -------
 .../src/pydolphinscheduler/tasks/dvc.py            | 124 ----
 .../src/pydolphinscheduler/tasks/flink.py          |  93 ---
 .../src/pydolphinscheduler/tasks/func_wrap.py      |  61 --
 .../src/pydolphinscheduler/tasks/http.py           | 101 ---
 .../src/pydolphinscheduler/tasks/map_reduce.py     |  52 --
 .../src/pydolphinscheduler/tasks/mlflow.py         | 256 -------
 .../src/pydolphinscheduler/tasks/openmldb.py       |  48 --
 .../src/pydolphinscheduler/tasks/procedure.py      |  60 --
 .../src/pydolphinscheduler/tasks/python.py         | 105 ---
 .../src/pydolphinscheduler/tasks/pytorch.py        |  95 ---
 .../src/pydolphinscheduler/tasks/sagemaker.py      |  40 --
 .../src/pydolphinscheduler/tasks/shell.py          |  58 --
 .../src/pydolphinscheduler/tasks/spark.py          |  84 ---
 .../src/pydolphinscheduler/tasks/sql.py            | 122 ----
 .../src/pydolphinscheduler/tasks/sub_process.py    |  54 --
 .../src/pydolphinscheduler/tasks/switch.py         | 166 -----
 .../src/pydolphinscheduler/utils/__init__.py       |  18 -
 .../src/pydolphinscheduler/utils/date.py           |  82 ---
 .../src/pydolphinscheduler/utils/file.py           |  57 --
 .../src/pydolphinscheduler/utils/string.py         |  39 -
 .../src/pydolphinscheduler/utils/yaml_parser.py    | 159 -----
 .../pydolphinscheduler/tests/__init__.py           |  18 -
 .../pydolphinscheduler/tests/cli/__init__.py       |  18 -
 .../pydolphinscheduler/tests/cli/test_config.py    | 198 -----
 .../pydolphinscheduler/tests/cli/test_version.py   |  67 --
 .../pydolphinscheduler/tests/core/__init__.py      |  18 -
 .../tests/core/test_configuration.py               | 272 -------
 .../pydolphinscheduler/tests/core/test_database.py |  54 --
 .../tests/core/test_default_config_yaml.py         |  39 -
 .../pydolphinscheduler/tests/core/test_engine.py   | 148 ----
 .../tests/core/test_process_definition.py          | 502 -------------
 .../tests/core/test_resource_definition.py         |  68 --
 .../pydolphinscheduler/tests/core/test_task.py     | 470 ------------
 .../tests/core/test_yaml_process_define.py         | 191 -----
 .../pydolphinscheduler/tests/example/__init__.py   |  18 -
 .../tests/example/test_example.py                  | 176 -----
 .../tests/integration/__init__.py                  |  18 -
 .../tests/integration/conftest.py                  |  51 --
 .../tests/integration/test_java_gateway.py         |  53 --
 .../tests/integration/test_process_definition.py   |  50 --
 .../tests/integration/test_project.py              |  78 --
 .../tests/integration/test_submit_examples.py      |  56 --
 .../tests/integration/test_tenant.py               |  86 ---
 .../tests/integration/test_user.py                 | 107 ---
 .../tests/resources_plugin/__init__.py             |  18 -
 .../tests/resources_plugin/test_github.py          | 195 -----
 .../tests/resources_plugin/test_gitlab.py          | 116 ---
 .../tests/resources_plugin/test_local.py           | 108 ---
 .../tests/resources_plugin/test_oss.py             | 112 ---
 .../tests/resources_plugin/test_resource_plugin.py |  75 --
 .../tests/resources_plugin/test_s3.py              |  79 --
 .../pydolphinscheduler/tests/tasks/__init__.py     |  18 -
 .../tests/tasks/test_condition.py                  | 461 ------------
 .../pydolphinscheduler/tests/tasks/test_datax.py   | 213 ------
 .../tests/tasks/test_dependent.py                  | 794 ---------------------
 .../pydolphinscheduler/tests/tasks/test_dvc.py     | 173 -----
 .../pydolphinscheduler/tests/tasks/test_flink.py   |  83 ---
 .../tests/tasks/test_func_wrap.py                  | 169 -----
 .../pydolphinscheduler/tests/tasks/test_http.py    | 145 ----
 .../tests/tasks/test_map_reduce.py                 |  76 --
 .../pydolphinscheduler/tests/tasks/test_mlflow.py  | 205 ------
 .../tests/tasks/test_openmldb.py                   |  73 --
 .../tests/tasks/test_procedure.py                  | 107 ---
 .../pydolphinscheduler/tests/tasks/test_python.py  | 201 ------
 .../pydolphinscheduler/tests/tasks/test_pytorch.py | 124 ----
 .../tests/tasks/test_sagemaker.py                  | 102 ---
 .../pydolphinscheduler/tests/tasks/test_shell.py   | 133 ----
 .../pydolphinscheduler/tests/tasks/test_spark.py   |  82 ---
 .../pydolphinscheduler/tests/tasks/test_sql.py     | 208 ------
 .../tests/tasks/test_sub_process.py                | 115 ---
 .../pydolphinscheduler/tests/tasks/test_switch.py  | 299 --------
 .../pydolphinscheduler/tests/test_docs.py          |  59 --
 .../pydolphinscheduler/tests/testing/__init__.py   |  18 -
 .../pydolphinscheduler/tests/testing/cli.py        |  87 ---
 .../pydolphinscheduler/tests/testing/constants.py  |  48 --
 .../pydolphinscheduler/tests/testing/decorator.py  |  32 -
 .../tests/testing/docker_wrapper.py                |  98 ---
 .../pydolphinscheduler/tests/testing/file.py       |  34 -
 .../pydolphinscheduler/tests/testing/path.py       |  58 --
 .../pydolphinscheduler/tests/testing/task.py       |  47 --
 .../pydolphinscheduler/tests/utils/__init__.py     |  18 -
 .../pydolphinscheduler/tests/utils/test_date.py    |  78 --
 .../pydolphinscheduler/tests/utils/test_file.py    |  85 ---
 .../pydolphinscheduler/tests/utils/test_string.py  |  87 ---
 .../tests/utils/test_yaml_parser.py                | 255 -------
 dolphinscheduler-python/pydolphinscheduler/tox.ini |  79 --
 pom.xml                                            |   8 -
 239 files changed, 19 insertions(+), 20447 deletions(-)

diff --git a/dolphinscheduler-python/pydolphinscheduler/.flake8 b/.flake8
similarity index 81%
rename from dolphinscheduler-python/pydolphinscheduler/.flake8
rename to .flake8
index 120b42fb68..f6829fc382 100644
--- a/dolphinscheduler-python/pydolphinscheduler/.flake8
+++ b/.flake8
@@ -19,15 +19,6 @@
 max-line-length = 110
 exclude =
     .git,
-    __pycache__,
-    .pytest_cache,
-    *.egg-info,
-    docs/source/conf.py
-    old,
-    build,
-    dist,
-    htmlcov,
-    .tox,
     dist,
 ignore = 
     # It's clear and not need to add docstring
@@ -35,6 +26,3 @@ ignore =
     D105,  # D105: Missing docstring in magic method
     # Conflict to Black
     W503   # W503: Line breaks before binary operators
-per-file-ignores =
-    */pydolphinscheduler/side/__init__.py:F401
-    */pydolphinscheduler/tasks/__init__.py:F401
diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index b58a97f831..eceda6a97a 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -38,7 +38,6 @@
 /dolphinscheduler-task-plugin/ @caishunfeng @SbloodyS @zhuangchong
 /dolphinscheduler-tools/ @caishunfeng @SbloodyS @zhongjiajie @EricGao888
 /script/ @caishunfeng @SbloodyS @zhongjiajie @EricGao888
-/dolphinscheduler-python/ @zhongjiajie
 /dolphinscheduler-ui/ @songjianet @Amy0104
 /docs/ @zhongjiajie @Tianqi-Dotes @EricGao888
 /licenses/ @kezhenxu94 @zhongjiajie
diff --git a/.github/actions/labeler/labeler.yml b/.github/actions/labeler/labeler.yml
index 4bb724fed2..fbfcb098fe 100644
--- a/.github/actions/labeler/labeler.yml
+++ b/.github/actions/labeler/labeler.yml
@@ -15,9 +15,6 @@
 # limitations under the License.
 #
 
-Python:
-  - any: ['dolphinscheduler-python/**/*']
-
 backend:
   - 'dolphinscheduler-alert/**/*'
   - 'dolphinscheduler-api/**/*'
@@ -40,7 +37,6 @@ backend:
 
 document:
   - 'docs/**/*'
-  - 'dolphinscheduler-python/pydolphinscheduler/docs/**/*'
 
 CI&CD:
   - any: ['.github/**/*']
diff --git a/.github/workflows/py-ci.yml b/.github/workflows/py-ci.yml
deleted file mode 100644
index 7e0333efd8..0000000000
--- a/.github/workflows/py-ci.yml
+++ /dev/null
@@ -1,205 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-name: Python API
-
-on:
-  push:
-    branches:
-      - dev
-    paths:
-      - 'dolphinscheduler-python/**'
-  pull_request:
-
-concurrency:
-  group: py-${{ github.event.pull_request.number || github.ref }}
-  cancel-in-progress: true
-
-defaults:
-  run:
-    working-directory: dolphinscheduler-python/pydolphinscheduler
-
-# We have to update setuptools wheel to package with package_data, LICENSE, NOTICE
-env:
-  DEPENDENCES: pip setuptools wheel tox
-
-jobs:
-  paths-filter:
-    name: Python-Path-Filter
-    runs-on: ubuntu-latest
-    outputs:
-      not-docs: ${{ steps.filter.outputs.not-docs }}
-      py-change: ${{ steps.filter.outputs.py-change }}
-    steps:
-      - uses: actions/checkout@v2
-      - uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
-        id: filter
-        with:
-          filters: |
-            not-docs:
-              - '!(docs/**)'
-            py-change:
-              - 'dolphinscheduler-python/pydolphinscheduler/**'
-  lint:
-    name: Lint
-    if: ${{ (needs.paths-filter.outputs.py-change == 'true') || (github.event_name == 'push') }}
-    timeout-minutes: 15
-    needs: paths-filter
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python 3.7
-        uses: actions/setup-python@v4
-        with:
-          python-version: 3.7
-      - name: Install Dependences
-        run: |
-          python -m pip install --upgrade ${{ env.DEPENDENCES }}
-      - name: Run All Lint Check
-        run: |
-          python -m tox -vv -e lint
-  pytest:
-    name: Pytest
-    timeout-minutes: 15
-    needs: lint
-    runs-on: ${{ matrix.os }}
-    strategy:
-      fail-fast: false
-      matrix:
-        # YAML parse `3.10` to `3.1`, so we have to add quotes for `'3.10'`, see also:
-        # https://github.com/actions/setup-python/issues/160#issuecomment-724485470
-        python-version: [3.6, 3.7, 3.8, 3.9, '3.10', 3.11-dev]
-        os: [ubuntu-latest, macOS-latest, windows-latest]
-        # Skip because dependence [py4j](https://pypi.org/project/py4j/) not work on those environments
-        exclude:
-          - os: windows-latest
-            python-version: '3.10'
-          - os: windows-latest
-            python-version: 3.11-dev
-    steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
-        with:
-          python-version: ${{ matrix.python-version }}
-      - name: Install Dependences
-        run: |
-          python -m pip install --upgrade ${{ env.DEPENDENCES }}
-      - name: Run All Tests
-        run: |
-          python -m tox -vv -e code-test
-  doc-build:
-    name: Docs Build Test
-    timeout-minutes: 15
-    needs: lint
-    runs-on: ubuntu-latest
-    strategy:
-      fail-fast: false
-      matrix:
-        env-list: [doc-build, doc-build-multi]
-    steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python 3.7
-        uses: actions/setup-python@v4
-        with:
-          python-version: 3.7
-      - name: Install Dependences
-        run: |
-          python -m pip install --upgrade ${{ env.DEPENDENCES }}
-      - name: Run Build Docs Tests ${{ matrix.env-list }}
-        run: |
-          python -m tox -vv -e ${{ matrix.env-list }}
-  local-ci:
-    name: Local CI
-    timeout-minutes: 15
-    needs:
-      - pytest
-      - doc-build
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python 3.7
-        uses: actions/setup-python@v4
-        with:
-          python-version: 3.7
-      - name: Install Dependences
-        run: |
-          python -m pip install --upgrade ${{ env.DEPENDENCES }}
-      - name: Run Tests Build Docs
-        run: |
-          python -m tox -vv -e local-ci
-  integrate-test:
-    name: Integrate Test
-    if: ${{ (needs.paths-filter.outputs.not-docs == 'true') || (github.event_name == 'push') }}
-    runs-on: ubuntu-latest
-    needs: paths-filter
-    timeout-minutes: 30
-    steps:
-      - uses: actions/checkout@v2
-        with:
-          submodules: true
-      - name: Sanity Check
-        uses: ./.github/actions/sanity-check
-        with:
-          token: ${{ secrets.GITHUB_TOKEN }}
-      - name: Cache local Maven repository
-        uses: actions/cache@v3
-        with:
-          path: ~/.m2/repository
-          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
-          restore-keys: ${{ runner.os }}-maven-
-      # Switch to project root directory to run mvnw command
-      - name: Build Image
-        working-directory: ./
-        run: |
-          ./mvnw -B clean install \
-          -Dmaven.test.skip \
-          -Dmaven.javadoc.skip \
-          -Dcheckstyle.skip=true \
-          -Pdocker,release -Ddocker.tag=ci \
-          -pl dolphinscheduler-standalone-server -am
-      - name: Set up Python 3.7
-        uses: actions/setup-python@v4
-        with:
-          python-version: 3.7
-      - name: Install Dependences
-        run: |
-          python -m pip install --upgrade ${{ env.DEPENDENCES }}
-      - name: Run Integrate Tests
-        run: |
-          python -m tox -vv -e integrate-test
-  result:
-    name: Python
-    runs-on: ubuntu-latest
-    timeout-minutes: 30
-    needs: [ paths-filter, local-ci, integrate-test ]
-    if: always()
-    steps:
-      - name: Status
-        # We need change CWD to current directory to avoid global default working directory not exists
-        working-directory: ./
-        run: |
-          if [[ ${{ needs.paths-filter.outputs.not-docs }} == 'false' && ${{ github.event_name }} == 'pull_request' ]]; then
-            echo "Only document change, skip both python unit and integrate test!"
-            exit 0
-          fi
-          if [[ ${{ needs.paths-filter.outputs.py-change }} == 'false' && ${{ needs.integrate-test.result }} == 'success' && ${{ github.event_name }} == 'pull_request' ]]; then
-            echo "No python code change, and integrate test pass!"
-            exit 0
-          fi
-          if [[ ${{ needs.integrate-test.result }} != 'success' || ${{ needs.local-ci.result }} != 'success' ]]; then
-            echo "py-ci Failed!"
-            exit -1
-          fi
diff --git a/.github/workflows/unit-test.yml b/.github/workflows/unit-test.yml
index 441672839b..6acfa1fc4b 100644
--- a/.github/workflows/unit-test.yml
+++ b/.github/workflows/unit-test.yml
@@ -23,7 +23,6 @@ on:
     paths-ignore:
       - '**/*.md'
       - 'dolphinscheduler-ui'
-      - 'dolphinscheduler-python/pydolphinscheduler'
     branches:
       - dev
 
diff --git a/.gitignore b/.gitignore
index 1082e4b155..e5eccc1308 100644
--- a/.gitignore
+++ b/.gitignore
@@ -50,18 +50,3 @@ dolphinscheduler-common/test
 dolphinscheduler-worker/logs
 dolphinscheduler-master/logs
 dolphinscheduler-api/logs
-
-# ------------------
-# pydolphinscheduler
-# ------------------
-# Cache
-__pycache__/
-.tox/
-
-# Build
-build/
-*egg-info/
-
-# Test coverage
-.coverage
-htmlcov/
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 6b36749392..e51d15a16e 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -41,11 +41,6 @@ repos:
           'flake8-docstrings>=1.6',
           'flake8-black>=0.2',
         ]
-        # pre-commit run in the root, so we have to point out the full path of configuration
-        args: [
-          --config,
-          dolphinscheduler-python/pydolphinscheduler/.flake8
-        ]
   - repo: https://github.com/pycqa/autoflake
     rev: v1.4
     hooks:
diff --git a/README.md b/README.md
index 8c49415866..0b25dda917 100644
--- a/README.md
+++ b/README.md
@@ -19,10 +19,11 @@ Apache DolphinScheduler is the modern data workflow orchestration platform with
 
 The key features for DolphinScheduler are as follows:
 - Easy to deploy, we provide 4 ways to deploy, such as Standalone deployment,Cluster deployment,Docker / Kubernetes deployment and Rainbond deployment
-- Easy to use, there are 3 ways to create workflows:
+- Easy to use, there are four ways to create workflows:
   - Visually, create tasks by dragging and dropping tasks
-  - Creating workflows by PyDolphinScheduler(Python way)
-  - Creating workflows through Open API
+  - [PyDolphinScheduler](https://dolphinscheduler.apache.org/python/dev/index.html), Creating workflows via Python API, aka workflow-as-code
+  - Yaml definition, mapping yaml into workflow(have to install PyDolphinScheduler currently)
+  - Open API, Creating workflows
 
 - Highly Reliable,
 DolphinScheduler uses a decentralized multi-master and multi-worker architecture, which naturally supports horizontal scaling and high availability
diff --git a/docs/docs/en/contribute/release/release-post.md b/docs/docs/en/contribute/release/release-post.md
index 8d24b3a80f..20a8e43008 100644
--- a/docs/docs/en/contribute/release/release-post.md
+++ b/docs/docs/en/contribute/release/release-post.md
@@ -1,7 +1,7 @@
 # Release Post
 
 We still have some publish task to do after we send the announcement mail, currently we have to publish Docker images to
-Docker Hub and also publish pydolphinscheduler to PyPI.
+Docker Hub.
 
 ## Publish Docker Image
 
@@ -20,11 +20,6 @@ We could reuse the main command the CI run and publish our Docker images to Dock
     -Pdocker,release
 ```
 
-## Publish pydolphinscheduler to PyPI
-
-Python API need to release to PyPI for easier download and use, you can see more detail in [Python API release](https://github.com/apache/dolphinscheduler/blob/dev/dolphinscheduler-python/pydolphinscheduler/RELEASE.md#to-pypi)
-to finish PyPI release.
-
 ## Get All Contributors
 
 You might need all contributors in current release when you want to publish the release news or announcement, you could
diff --git a/docs/docs/en/contribute/release/release-prepare.md b/docs/docs/en/contribute/release/release-prepare.md
index e7fb41c5a1..30bcaae52f 100644
--- a/docs/docs/en/contribute/release/release-prepare.md
+++ b/docs/docs/en/contribute/release/release-prepare.md
@@ -23,7 +23,6 @@ For example, to release `x.y.z`, the following updates are required:
   - `deploy/kubernetes/dolphinscheduler`:
     - `Chart.yaml`: `appVersion` needs to be updated to x.y.z (`version` is helm chart version,incremented and different from x.y.z)
     - `values.yaml`: `image.tag` needs to be updated to x.y.z
-  - `dolphinscheduler-python/pydolphinscheduler/setup.py`: change `version` to x.y.z
 - Version in the docs:
   - Change the placeholder `<version>`(except `pom`)  to the `x.y.z` in directory `docs`
   - Add new history version
diff --git a/docs/docs/en/contribute/release/release.md b/docs/docs/en/contribute/release/release.md
index ff2b9fe1e3..dffb5fb4fc 100644
--- a/docs/docs/en/contribute/release/release.md
+++ b/docs/docs/en/contribute/release/release.md
@@ -10,8 +10,6 @@ all conditions are met, if any or them are missing, you should install them and
 java -version
 # Maven requests
 mvn -version
-# Python 3.6 above is requests, and you have to make keyword `python` work in your terminal and version match
-python --version
 ```
 
 ## GPG Settings
@@ -166,13 +164,10 @@ git push origin "${VERSION}"-release
 ### Pre-Release Check
 
 ```shell
-# make gpg command could be run in maven correct
-export GPG_TTY=$(tty)
-
-mvn release:prepare -Prelease,python -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DdryRun=true -Dusername="${GH_USERNAME}"
+mvn release:prepare -Prelease -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DdryRun=true -Dusername="${GH_USERNAME}"
 ```
 
-* `-Prelease,python`: choose release and python profile, which will pack all the source codes, jar files and executable binary packages, and Python distribute package.
+* `-Prelease`: choose release profile, which will pack all the source codes, jar files and executable binary packages.
 * `-DautoVersionSubmodules=true`: it can make the version number is inputted only once and not for each sub-module.
 * `-DdryRun=true`: dry run which means not to generate or submit new version number and new tag.
 
@@ -187,7 +182,7 @@ mvn release:clean
 Then, prepare to execute the release.
 
 ```shell
-mvn release:prepare -Prelease,python -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DpushChanges=false -Dusername="${GH_USERNAME}"
+mvn release:prepare -Prelease -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DpushChanges=false -Dusername="${GH_USERNAME}"
 ```
 
 It is basically the same as the previous rehearsal command, but deleting `-DdryRun=true` parameter.
@@ -219,7 +214,7 @@ git push origin --tags
 ### Deploy the Release
 
 ```shell
-mvn release:perform -Prelease,python -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -Dusername="${GH_USERNAME}"
+mvn release:perform -Prelease -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -Dusername="${GH_USERNAME}"
 ```
 
 After that command is executed, the version to be released will be uploaded to Apache staging repository automatically.
@@ -267,7 +262,6 @@ Create folder by version number.
 
 ```shell
 mkdir -p ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
-mkdir -p ~/ds_svn/dev/dolphinscheduler/"${VERSION}"/python
 cd ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
 ```
 
@@ -277,9 +271,6 @@ Add source code packages, binary packages and executable binary packages to SVN
 # Source and binary tarball for main code
 cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/*.tar.gz ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
 cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/*.tar.gz.asc ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
-
-# Source and binary tarball for Python API
-cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/python/* ~/ds_svn/dev/dolphinscheduler/"${VERSION}"/python
 ```
 
 ### Generate sign files
@@ -287,10 +278,6 @@ cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/python/* ~/ds_svn/dev/dolp
 ```shell
 shasum -a 512 apache-dolphinscheduler-"${VERSION}"-src.tar.gz >> apache-dolphinscheduler-"${VERSION}"-src.tar.gz.sha512
 shasum -b -a 512 apache-dolphinscheduler-"${VERSION}"-bin.tar.gz >> apache-dolphinscheduler-"${VERSION}"-bin.tar.gz.sha512
-cd python
-shasum -a 512 apache-dolphinscheduler-python-"${VERSION}".tar.gz >> apache-dolphinscheduler-python-"${VERSION}".tar.gz.sha512
-shasum -b -a 512 apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl >> apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl.sha512
-cd ../
 ```
 
 ### Commit to Apache SVN
@@ -308,10 +295,6 @@ svn --username="${A_USERNAME}" commit -m "release ${VERSION}"
 ```shell
 shasum -c apache-dolphinscheduler-"${VERSION}"-src.tar.gz.sha512
 shasum -c apache-dolphinscheduler-"${VERSION}"-bin.tar.gz.sha512
-cd python
-shasum -c apache-dolphinscheduler-python-"${VERSION}".tar.gz.sha512
-shasum -c apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl.sha512
-cd ../
 ```
 
 ### Check gpg Signature
@@ -345,10 +328,6 @@ Then, check the gpg signature.
 ```shell
 gpg --verify apache-dolphinscheduler-"${VERSION}"-src.tar.gz.asc
 gpg --verify apache-dolphinscheduler-"${VERSION}"-bin.tar.gz.asc
-cd python
-gpg --verify apache-dolphinscheduler-python-"${VERSION}".tar.gz.asc
-gpg --verify apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl.asc
-cd ../
 ```
 
 > Note: You have to create gpg signature manually when you can not find your `asc` file, the command
@@ -359,7 +338,7 @@ cd ../
 
 #### Check source package
 
-Decompress `apache-dolphinscheduler-<VERSION>-src.tar.gz` and `python/apache-dolphinscheduler-python-<VERSION>.tar.gz` then check the following items:
+Decompress `apache-dolphinscheduler-<VERSION>-src.tar.gz` then check the following items:
 
 * Check whether source tarball is oversized for including nonessential files
 * `LICENSE` and `NOTICE` files exist
@@ -372,8 +351,7 @@ Decompress `apache-dolphinscheduler-<VERSION>-src.tar.gz` and `python/apache-dol
 
 #### Check binary packages
 
-Decompress `apache-dolphinscheduler-<VERSION>-src.tar.gz` and `python/apache-dolphinscheduler-python-<VERSION>-bin.tar.gz`
-to check the following items:
+Decompress `apache-dolphinscheduler-<VERSION>-src.tar.gz` to check the following items:
 
 - `LICENSE` and `NOTICE` files exist
 - Correct year in `NOTICE` file
diff --git a/docs/docs/zh/contribute/release/release-post.md b/docs/docs/zh/contribute/release/release-post.md
index 783503f659..fe1f7e323f 100644
--- a/docs/docs/zh/contribute/release/release-post.md
+++ b/docs/docs/zh/contribute/release/release-post.md
@@ -1,6 +1,6 @@
 # 发版后续
 
-发送公告邮件后,我们还有一些发布任务要做,目前我们必须将 Docker 镜像发布到 Docker Hub 和 并且需要将 pydolphinscheduler 发布到 PyPI。
+发送公告邮件后,我们还有一些发布任务要做,目前我们必须将 Docker 镜像发布到 Docker Hub。
 
 ## 发布 Docker 镜像
 
@@ -19,11 +19,6 @@
     -Pdocker,release
 ```
 
-## 发布 pydolphinscheduler 到 PyPI
-
-需要将 Python API 发布到 PyPI,请参考 [Python API release](https://github.com/apache/dolphinscheduler/blob/dev/dolphinscheduler-python/pydolphinscheduler/RELEASE.md#to-pypi)
-完成 PyPI 的发版
-
 ## 获取全部的贡献者
 
 当您想要发布新版本的新闻或公告时,您可能需要当前版本的所有贡献者,您可以在 `tools/release` 中使用命令 `python release.py contributor` 自动生成贡献者 Github id。
diff --git a/docs/docs/zh/contribute/release/release-prepare.md b/docs/docs/zh/contribute/release/release-prepare.md
index 9fd8d9dfed..85eea69e6b 100644
--- a/docs/docs/zh/contribute/release/release-prepare.md
+++ b/docs/docs/zh/contribute/release/release-prepare.md
@@ -23,7 +23,6 @@
   - `deploy/kubernetes/dolphinscheduler`:
     - `Chart.yaml`: `appVersion` 版本更新为 x.y.z (`version` 为 helm chart 版本, 增量更新但不要设置为 x.y.z)
     - `values.yaml`: `image.tag` 版本更新为 x.y.z
-  - `dolphinscheduler-python/pydolphinscheduler/setup.py`: 修改其中的 `version` 为 x.y.z
 - 修改文档(docs模块)中的版本号:
   - 将 `docs` 文件夹下文件的占位符 `<version>` (除了 pom.xml 相关的) 修改成 `x.y.z`
   - 新增历史版本
diff --git a/docs/docs/zh/contribute/release/release.md b/docs/docs/zh/contribute/release/release.md
index 5b00867b77..f8137ef78a 100644
--- a/docs/docs/zh/contribute/release/release.md
+++ b/docs/docs/zh/contribute/release/release.md
@@ -9,8 +9,6 @@
 java -version
 # 需要 Maven 
 mvn -version
-# 需要 Python 3.6 及以上的版本,并且需要 `python` 关键字能在命令行中运行,且版本符合条件。
-python --version
 ```
 
 ## GPG设置
@@ -172,14 +170,11 @@ git push origin ${RELEASE.VERSION}-release
 ### 发布预校验
 
 ```shell
-# 保证 python profile 的 gpg 可以正常运行
-export GPG_TTY=$(tty)
-
 # 运行发版校验
-mvn release:prepare -Prelease,python -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DdryRun=true -Dusername="${GH_USERNAME}"
+mvn release:prepare -Prelease -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DdryRun=true -Dusername="${GH_USERNAME}"
 ```
 
-* `-Prelease,python`: 选择release和python的profile,这个profile会打包所有源码、jar文件以及可执行二进制包,以及Python的二进制包。
+* `-Prelease`: 选择release的profile,这个profile会打包所有源码、jar文件以及可执行二进制包。
 * `-DautoVersionSubmodules=true`: 作用是发布过程中版本号只需要输入一次,不必为每个子模块都输入一次。
 * `-DdryRun=true`: 演练,即不产生版本号提交,不生成新的tag。
 
@@ -194,7 +189,7 @@ mvn release:clean
 然后准备执行发布。
 
 ```shell
-mvn release:prepare -Prelease,python -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DpushChanges=false -Dusername="${GH_USERNAME}"
+mvn release:prepare -Prelease -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -DpushChanges=false -Dusername="${GH_USERNAME}"
 ```
 
 和上一步演练的命令基本相同,去掉了 `-DdryRun=true` 参数。
@@ -223,7 +218,7 @@ git push origin --tags
 ### 部署发布
 
 ```shell
-mvn release:perform -Prelease,python -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -Dusername="${GH_USERNAME}"
+mvn release:perform -Prelease -Darguments="-Dmaven.test.skip=true -Dcheckstyle.skip=true -Dmaven.javadoc.skip=true" -DautoVersionSubmodules=true -Dusername="${GH_USERNAME}"
 ```
 
 执行完该命令后,待发布版本会自动上传到Apache的临时筹备仓库(staging repository)。你可以通过访问 [apache staging repositories](https://repository.apache.org/#stagingRepositories)
@@ -270,7 +265,6 @@ svn --username="${A_USERNAME}" commit -m "new key <YOUR-GPG-KEY-ID> add"
 
 ```shell
 mkdir -p ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
-mkdir -p ~/ds_svn/dev/dolphinscheduler/"${VERSION}"/python
 cd ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
 ```
 
@@ -280,9 +274,6 @@ cd ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
 # 主程序源码包和二进制包
 cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/*.tar.gz ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
 cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/*.tar.gz.asc ~/ds_svn/dev/dolphinscheduler/"${VERSION}"
-
-# Python API 源码和二进制包
-cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/python/* ~/ds_svn/dev/dolphinscheduler/"${VERSION}"/python
 ```
 
 ### 生成文件签名
@@ -290,10 +281,6 @@ cp -f ~/dolphinscheduler/dolphinscheduler-dist/target/python/* ~/ds_svn/dev/dolp
 ```shell
 shasum -a 512 apache-dolphinscheduler-"${VERSION}"-src.tar.gz >> apache-dolphinscheduler-"${VERSION}"-src.tar.gz.sha512
 shasum -b -a 512 apache-dolphinscheduler-"${VERSION}"-bin.tar.gz >> apache-dolphinscheduler-"${VERSION}"-bin.tar.gz.sha512
-cd python
-shasum -a 512 apache-dolphinscheduler-python-"${VERSION}".tar.gz >> apache-dolphinscheduler-python-"${VERSION}".tar.gz.sha512
-shasum -b -a 512 apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl >> apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl.sha512
-cd ../
 ```
 
 ### 提交Apache SVN
@@ -311,10 +298,6 @@ svn --username="${A_USERNAME}" commit -m "release ${VERSION}"
 ```shell
 shasum -c apache-dolphinscheduler-"${VERSION}"-src.tar.gz.sha512
 shasum -c apache-dolphinscheduler-"${VERSION}"-bin.tar.gz.sha512
-cd python
-shasum -c apache-dolphinscheduler-python-"${VERSION}".tar.gz.sha512
-shasum -c apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl.sha512
-cd ../
 ```
 
 ### 检查gpg签名
@@ -347,10 +330,6 @@ Your decision? 5
 ```shell
 gpg --verify apache-dolphinscheduler-"${VERSION}"-src.tar.gz.asc
 gpg --verify apache-dolphinscheduler-"${VERSION}"-bin.tar.gz.asc
-cd python
-gpg --verify apache-dolphinscheduler-python-"${VERSION}".tar.gz.asc
-gpg --verify apache_dolphinscheduler-python-"${VERSION}"-py3-none-any.whl.asc
-cd ../
 ```
 
 > 注意:当你找不到你的 `asc` 文件时,你必须手动创建 gpg 签名,命令
@@ -361,7 +340,7 @@ cd ../
 
 #### 检查源码包的文件内容
 
-解压缩`apache-dolphinscheduler-<VERSION>-src.tar.gz`以及Python文件夹下的`apache-dolphinscheduler-python-<VERSION>.tar.gz`,进行如下检查:
+解压缩`apache-dolphinscheduler-<VERSION>-src.tar.gz`,进行如下检查:
 
 - 检查源码包是否包含由于包含不必要文件,致使tarball过于庞大
 - 存在`LICENSE`和`NOTICE`文件
@@ -373,8 +352,7 @@ cd ../
 
 #### 检查二进制包的文件内容
 
-解压缩`apache-dolphinscheduler-<VERSION>-src.tar.gz`和`apache-dolphinscheduler-python-<VERSION>-bin.tar.gz`
-进行如下检查:
+解压缩`apache-dolphinscheduler-<VERSION>-src.tar.gz`进行如下检查:
 
 - 存在`LICENSE`和`NOTICE`文件
 - 所有文本文件开头都有ASF许可证
diff --git a/dolphinscheduler-api/pom.xml b/dolphinscheduler-api/pom.xml
index cc34dcba1e..facd4e4e69 100644
--- a/dolphinscheduler-api/pom.xml
+++ b/dolphinscheduler-api/pom.xml
@@ -163,7 +163,7 @@
             </exclusions>
         </dependency>
 
-        <!-- Python -->
+        <!-- Python API's Gateway server -->
         <dependency>
             <groupId>net.sf.py4j</groupId>
             <artifactId>py4j</artifactId>
diff --git a/dolphinscheduler-dist/pom.xml b/dolphinscheduler-dist/pom.xml
index b202cdd281..ee4c85589e 100644
--- a/dolphinscheduler-dist/pom.xml
+++ b/dolphinscheduler-dist/pom.xml
@@ -73,11 +73,6 @@
             <groupId>org.apache.dolphinscheduler</groupId>
             <artifactId>dolphinscheduler-tools</artifactId>
         </dependency>
-
-        <dependency>
-            <groupId>org.apache.dolphinscheduler</groupId>
-            <artifactId>dolphinscheduler-python</artifactId>
-        </dependency>
     </dependencies>
 
     <build>
@@ -126,35 +121,5 @@
                 </plugins>
             </build>
         </profile>
-
-        <profile>
-            <id>python</id>
-            <build>
-                <plugins>
-                    <plugin>
-                        <artifactId>maven-assembly-plugin</artifactId>
-                        <executions>
-
-                            <execution>
-                                <id>python</id>
-                                <goals>
-                                    <goal>single</goal>
-                                </goals>
-                                <phase>package</phase>
-                                <configuration>
-                                    <!-- Make final directory with simple name `python`, and without any addtion information -->
-                                    <finalName>python</finalName>
-                                    <appendAssemblyId>false</appendAssemblyId>
-                                    <descriptors>
-                                        <descriptor>src/main/assembly/dolphinscheduler-python-api.xml</descriptor>
-                                    </descriptors>
-                                </configuration>
-                            </execution>
-
-                        </executions>
-                    </plugin>
-                </plugins>
-            </build>
-        </profile>
     </profiles>
 </project>
diff --git a/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-python-api.xml b/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-python-api.xml
deleted file mode 100644
index cd37acee62..0000000000
--- a/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-python-api.xml
+++ /dev/null
@@ -1,34 +0,0 @@
-<!--
-  ~ Licensed to the Apache Software Foundation (ASF) under one or more
-  ~ contributor license agreements.  See the NOTICE file distributed with
-  ~ this work for additional information regarding copyright ownership.
-  ~ The ASF licenses this file to You under the Apache License, Version 2.0
-  ~ (the "License"); you may not use this file except in compliance with
-  ~ the License.  You may obtain a copy of the License at
-  ~
-  ~     http://www.apache.org/licenses/LICENSE-2.0
-  ~
-  ~ Unless required by applicable law or agreed to in writing, software
-  ~ distributed under the License is distributed on an "AS IS" BASIS,
-  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  ~ See the License for the specific language governing permissions and
-  ~ limitations under the License.
-  -->
-
-<assembly
-        xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
-        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-        xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
-    <id>python-api</id>
-    <formats>
-        <format>dir</format>
-    </formats>
-    <includeBaseDirectory>false</includeBaseDirectory>
-
-    <fileSets>
-        <fileSet>
-            <directory>${basedir}/../dolphinscheduler-python/pydolphinscheduler/dist</directory>
-            <outputDirectory>.</outputDirectory>
-        </fileSet>
-    </fileSets>
-</assembly>
diff --git a/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-src.xml b/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-src.xml
index 05d54871c7..3ccc60ef0a 100644
--- a/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-src.xml
+++ b/dolphinscheduler-dist/src/main/assembly/dolphinscheduler-src.xml
@@ -57,13 +57,6 @@
                 <exclude>**/dolphinscheduler-ui/node/**</exclude>
                 <exclude>**/dolphinscheduler-ui/node_modules/**</exclude>
 
-                <!-- python ignore -->
-                <exclude>**/dolphinscheduler-python/pydolphinscheduler/.pytest_cache/**</exclude>
-                <exclude>**/dolphinscheduler-python/pydolphinscheduler/build/**</exclude>
-                <exclude>**/dolphinscheduler-python/pydolphinscheduler/dist/**</exclude>
-                <exclude>**/dolphinscheduler-python/pydolphinscheduler/dist/**</exclude>
-                <exclude>**/dolphinscheduler-python/pydolphinscheduler/htmlcov/**</exclude>
-
                 <!-- eclipse ignore -->
                 <exclude>**/.settings/**</exclude>
                 <exclude>**/.project</exclude>
diff --git a/dolphinscheduler-python/pom.xml b/dolphinscheduler-python/pom.xml
deleted file mode 100644
index a3133a52e7..0000000000
--- a/dolphinscheduler-python/pom.xml
+++ /dev/null
@@ -1,165 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
-  ~ Licensed to the Apache Software Foundation (ASF) under one or more
-  ~ contributor license agreements.  See the NOTICE file distributed with
-  ~ this work for additional information regarding copyright ownership.
-  ~ The ASF licenses this file to You under the Apache License, Version 2.0
-  ~ (the "License"); you may not use this file except in compliance with
-  ~ the License.  You may obtain a copy of the License at
-  ~
-  ~     http://www.apache.org/licenses/LICENSE-2.0
-  ~
-  ~ Unless required by applicable law or agreed to in writing, software
-  ~ distributed under the License is distributed on an "AS IS" BASIS,
-  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  ~ See the License for the specific language governing permissions and
-  ~ limitations under the License.
-  -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-    <modelVersion>4.0.0</modelVersion>
-    <parent>
-        <groupId>org.apache.dolphinscheduler</groupId>
-        <artifactId>dolphinscheduler</artifactId>
-        <version>dev-SNAPSHOT</version>
-    </parent>
-    <artifactId>dolphinscheduler-python</artifactId>
-    <packaging>jar</packaging>
-    <name>${project.artifactId}</name>
-
-    <profiles>
-        <profile>
-            <id>release</id>
-            <properties>
-                <python.sign.skip>false</python.sign.skip>
-            </properties>
-        </profile>
-        <profile>
-            <id>python</id>
-            <build>
-                <plugins>
-                    <plugin>
-                        <groupId>org.codehaus.mojo</groupId>
-                        <artifactId>exec-maven-plugin</artifactId>
-                        <executions>
-                            <execution>
-                                <id>python-api-prepare</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <executable>python</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>-m</argument>
-                                        <argument>pip</argument>
-                                        <argument>install</argument>
-                                        <argument>--upgrade</argument>
-                                        <argument>pip</argument>
-                                        <argument>.[build]</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                            <execution>
-                                <id>python-api-clean</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <executable>python</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>setup.py</argument>
-                                        <argument>pre_clean</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                            <execution>
-                                <id>python-api-build</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <executable>python</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>-m</argument>
-                                        <argument>build</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                            <!-- Rename Python dist package to avoid confusion with dolphinscheduler main package -->
-                            <execution>
-                                <id>python-pkg-rename-tar</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <executable>bash</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>-c</argument>
-                                        <argument>mv dist/apache-dolphinscheduler-*.tar.gz dist/apache-dolphinscheduler-python-${project.version}.tar.gz</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                            <execution>
-                                <id>python-pkg-rename-whl</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <executable>bash</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>-c</argument>
-                                        <argument>mv dist/apache_dolphinscheduler-*py3-none-any.whl dist/apache_dolphinscheduler-python-${project.version}-py3-none-any.whl</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                            <execution>
-                                <id>sign-source</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <skip>${python.sign.skip}</skip>
-                                    <executable>bash</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>-c</argument>
-                                        <!-- We use `bash -c` here cause plugin exec-maven-plugin do not support wildcard-->
-                                        <argument>gpg --armor --detach-sign --digest-algo=SHA512 dist/*.tar.gz</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                            <execution>
-                                <id>sign-wheel</id>
-                                <goals>
-                                    <goal>exec</goal>
-                                </goals>
-                                <phase>prepare-package</phase>
-                                <configuration>
-                                    <skip>${python.sign.skip}</skip>
-                                    <executable>bash</executable>
-                                    <workingDirectory>${project.basedir}/pydolphinscheduler</workingDirectory>
-                                    <arguments>
-                                        <argument>-c</argument>
-                                        <!-- We use `bash -c` here because the exec-maven-plugin does not support wildcards -->
-                                        <argument>gpg --armor --detach-sign --digest-algo=SHA512 dist/*.whl</argument>
-                                    </arguments>
-                                </configuration>
-                            </execution>
-                        </executions>
-                    </plugin>
-                </plugins>
-            </build>
-        </profile>
-    </profiles>
-</project>
diff --git a/dolphinscheduler-python/pydolphinscheduler/.coveragerc b/dolphinscheduler-python/pydolphinscheduler/.coveragerc
deleted file mode 100644
index 16205094c2..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/.coveragerc
+++ /dev/null
@@ -1,34 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-[run]
-command_line = -m pytest
-omit = 
-    # Ignore all test cases in tests/
-    tests/*
-    # Ignore examples directory
-    */pydolphinscheduler/examples/*
-    # TODO: Temporarily ignore the java_gateway file, because we have not found a good way to test it.
-    */pydolphinscheduler/java_gateway.py
-
-[report]
-# Don’t report files that are 100% covered
-skip_covered = True
-show_missing = True
-precision = 2
-# Report will fail when coverage under 90.00%
-fail_under = 90
diff --git a/dolphinscheduler-python/pydolphinscheduler/.isort.cfg b/dolphinscheduler-python/pydolphinscheduler/.isort.cfg
deleted file mode 100644
index 70fa2e05bd..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/.isort.cfg
+++ /dev/null
@@ -1,19 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-[settings]
-profile=black
diff --git a/dolphinscheduler-python/pydolphinscheduler/DEVELOP.md b/dolphinscheduler-python/pydolphinscheduler/DEVELOP.md
deleted file mode 100644
index eac4b3678a..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/DEVELOP.md
+++ /dev/null
@@ -1,265 +0,0 @@
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-
-# Develop
-
-pydolphinscheduler is the Python API for Apache DolphinScheduler; it only defines what a workflow looks like and does
-not store or execute it. We use [py4j][py4j] to dynamically access the Java Virtual Machine.
-
-## Setup Develop Environment
-
-**PyDolphinScheduler** uses GitHub to host all source code; clone the code before you make any change.
-
-```shell
-git clone git@github.com:apache/dolphinscheduler.git
-```
-
-Now, install all dependencies so that you can run tests and check code style locally:
-
-```shell
-cd dolphinscheduler/dolphinscheduler-python/pydolphinscheduler
-python -m pip install -e '.[dev]'
-```
-
-Next, open the pydolphinscheduler project in your editor. We recommend [pycharm][pycharm]
-rather than [IntelliJ IDEA][idea], and opening the directory
-`dolphinscheduler-python/pydolphinscheduler` instead of `dolphinscheduler-python`.
-
-## Brief Concept
-
-Apache DolphinScheduler is designed to define workflows through the UI, while pydolphinscheduler defines them in code.
-When defining workflows in code, users usually do not care whether the user, tenant, or queue exists or not; all they
-care about is creating a new workflow from their own definition. So we have some **side objects** in the
-`pydolphinscheduler/side` directory, which only check whether an object exists and create it if it does not.
-
-### Process Definition
-
-Process definition is the name of the pydolphinscheduler workflow object; it shares its name with the corresponding
-Java object (it may be renamed to something simpler in the future).
-
-### Tasks
-
-Tasks are the pydolphinscheduler objects we use to define the exact job we want DolphinScheduler to do for us. For now,
-we only support the `shell` task to execute shell commands. [This link][all-task] lists all tasks supported in
-DolphinScheduler, which will be implemented in the future.
-
-## Test Your Code
-
-Linting and tests are very important for an open source project, so we pay close attention to them. We have a continuous
-integration service run by GitHub Action to test whether a patch is good or not; jump to
-section [With GitHub Action](#with-github-action) for more detail.
-
-To make local testing more convenient, we also provide a way to run your [tests automated with tox](#automated-testing-with-tox)
-locally (*running all tests except the integration tests, which need a Docker environment*). It is helpful when you try to find out
-why continuous integration in GitHub Action failed, or when you have a great patch and want to test it locally first.
-
-Besides [automated testing with tox](#automated-testing-with-tox) locally, we also have a [manual way](#manually) to
-run tests, which is a set of separate commands reproducing each step of the integration test described above.
-
-* Remote
-  * [With GitHub Action](#with-github-action)
-* Local
-  * [Automated Testing With tox](#automated-testing-with-tox) (including all but the integration tests)
-  * [Manually](#manually) (with the integration tests)
-
-### With GitHub Action
-
-GitHub Action tests pydolphinscheduler in various environments, including Python versions
-`3.6|3.7|3.8|3.9` and operating systems `linux|macOS|windows`. It is triggered and runs automatically when you
-submit pull requests to `apache/dolphinscheduler`.
-
-### Automated Testing With tox
-
-[tox](https://tox.wiki) is a package that aims to automate and standardize testing in Python; both our continuous
-integration and local tests use it to run the actual tasks. To use it, install it first:
-
-```shell
-python -m pip install --upgrade tox
-```
-
-After installation, you can run all the tests with a single command; it is almost the same as the tests in GitHub Action,
-just without the full matrix of environments.
-
-```shell
-tox -e local-ci
-```
-
-It will take a while the first time you run it, because it has to install dependencies and do some preparation;
-subsequent runs will be faster.
-
-If the `lint` section fails when you run `tox -e local-ci`, you can try running `tox -e auto-lint`,
-which we provide to fix as many lint issues as possible. When it finishes, run `tox -e local-ci` again to see
-whether the linter passes; you have to fix the remaining issues yourself if the linter still fails.
-
-### Manually
-
-#### Code Style
-
-We use [isort][isort] to automatically keep Python imports sorted alphabetically, [Black][black] as the code
-formatter, and [Flake8][flake8] as the pep8 checker. If you use [pycharm][pycharm] or [IntelliJ IDEA][idea],
-you can follow [Black-integration][black-editor] to configure them in your environment.
-
-Our Python API CI automatically runs the code style checkers and unit tests when you submit a pull request on
-GitHub, and you can also run the static checks locally.
-
-We recommend [pre-commit](https://pre-commit.com/) to run the checkers mentioned above before you commit locally.
-Install `pre-commit` by running
-
-```shell
-python -m pip install pre-commit 
-```
-
-in your development environment and then run `pre-commit install` to set up the git hook scripts. After finishing
-the steps above, each `git commit` or `git push` will run the pre-commit checks, catching basic issues before
-you create pull requests on GitHub.
-
-```shell
-# We recommend running isort and Black before Flake8, because Black can auto-fix some code style issues
-# while Flake8 only reports when the code style does not match pep8
-
-# Run Isort
-python -m isort .
-
-# Run Black
-python -m black .
-
-# Run Flake8
-python -m flake8
-```
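For illustration, here is a hedged sketch of the import layout that the isort `black` profile (see `.isort.cfg` above) produces: standard library, third-party, and first-party imports in separate, alphabetized groups. The concrete modules are only examples, and the `pydolphinscheduler` import path is an assumption based on the module layout shown in the API docs below.

```python
# Illustrative only: the grouping isort enforces with profile "black".
import os  # standard library group

import click  # third-party group (click is a declared dependency, see the project LICENSE)

from pydolphinscheduler.tasks import Shell  # first-party group (assumed import path)
```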
-
-#### Testing
-
-## Build Document
-
-We use [sphinx][sphinx] to build the docs. The DolphinScheduler Python API CI automatically builds the docs when you submit a pull request on
-GitHub. You may want to make sure the docs build successfully locally so that a failure does not block CI; you can build them via tox or manually.
-
-### Build Document Automatically with tox
-
-We integrated the document build process into tox; you can build the latest documents or all documents (including historical versions) with a
-single command:
-
-```shell
-# Build the latest document in dev branch
-tox -e doc-build
-# Build all documents, which including the latest and all history documents
-tox -e doc-build-multi
-```
-
-### Build Document Manually
-
-To build docs locally, install sphinx and related python modules first via:
-
-```shell
-python -m pip install '.[doc]'
-```
-
-Then go to document directory and execute the build command
-
-```shell
-cd pydolphinscheduler/docs/
-make clean && make html
-```
-
-> NOTE: We support building multiple versions of the documents with [sphinx-multiversion](https://holzhaus.github.io/sphinx-multiversion/master/index.html);
-> you can build them with the command `git fetch --tags && make clean && make multiversion`
-
-## Testing
-
-pydolphinscheduler uses [pytest][pytest] to test the codebase. GitHub Action runs the tests when you create a
-pull request or commit to the dev branch, with Python versions `3.6|3.7|3.8|3.9` and operating systems `linux|macOS|windows`.
-
-pytest runs all tests in the directory `tests`. You can run the tests with the command
-
-```shell
-python -m pytest --cov=pydolphinscheduler --cov-config=.coveragerc tests/
-```
-
-Besides running the tests, it also checks the unit test [coverage][coverage] threshold; for now, coverage below 90%
-fails the check, both locally and in our GitHub Action.
-
-The command above checks test coverage automatically, and you can also check the coverage with the command:
-
-```shell
-python -m coverage run && python -m  coverage report
-```
-
-It not only runs the unit tests but also shows the coverage of each file whose coverage rate is less than 100%, and the `TOTAL`
-line shows the total coverage of your code. If your CI failed on coverage, you can find the reason in
-this command's output.
-
-### Integrate Test
-
-Integration tests do not run when you execute `tox -e local-ci`, because they need an external environment
-including [Docker](https://docs.docker.com/get-docker/) and a specific image built by [maven](https://maven.apache.org/install.html).
-Here we show the steps to run the integration tests in the directory `dolphinscheduler-python/pydolphinscheduler/tests/integration`.
-There are two ways to run the integration tests.
-
-#### Method 1: Launch Docker Container Locally
-
-```shell
-# Go to project root directory and build Docker image
-cd ../../
-
-# Build Docker image
-./mvnw -B clean install \
-    -Dmaven.test.skip \
-    -Dmaven.javadoc.skip \
-    -Dmaven.checkstyle.skip \
-    -Pdocker,release -Ddocker.tag=ci \
-    -pl dolphinscheduler-standalone-server -am
-
-# Go to pydolphinscheduler root directory and run integrate tests
-tox -e integrate-test
-```
-
-#### Method 2: Start Standalone Server in IntelliJ IDEA
-
-```shell
-# Start the standalone server in IDEA
-
-# Go to pydolphinscheduler root directory and run integrate tests
-tox -e local-integrate-test
-```
-
-## Add a LICENSE When Adding New Dependencies
-
-When you add a new dependency to pydolphinscheduler, you should also add the package's LICENSE to the directory
-`dolphinscheduler-dist/release-docs/licenses/python-api-licenses`, and add a short description to
-`dolphinscheduler-dist/release-docs/LICENSE`.
-
-## Update `UPDATING.md` When a Public Class, Method, or Interface Changes
-
-When you change a public class, method, or interface, you should update [UPDATING.md](./UPDATING.md) to notify
-users who may rely on it.
-
-## Reference
-
-[py4j]: https://www.py4j.org/index.html
-[pycharm]: https://www.jetbrains.com/pycharm
-[idea]: https://www.jetbrains.com/idea/
-[all-task]: https://dolphinscheduler.apache.org/en-us/docs/dev/user_doc/guide/task/shell.html
-[pytest]: https://docs.pytest.org/en/latest/
-[black]: https://black.readthedocs.io/en/stable/index.html
-[flake8]: https://flake8.pycqa.org/en/latest/index.html
-[black-editor]: https://black.readthedocs.io/en/stable/integrations/editors.html#pycharm-intellij-idea
-[coverage]: https://coverage.readthedocs.io/en/stable/
-[isort]: https://pycqa.github.io/isort/index.html
-[sphinx]: https://www.sphinx-doc.org/en/master
-
diff --git a/dolphinscheduler-python/pydolphinscheduler/LICENSE b/dolphinscheduler-python/pydolphinscheduler/LICENSE
deleted file mode 100644
index a7359fad35..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/LICENSE
+++ /dev/null
@@ -1,228 +0,0 @@
-                                 Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   APPENDIX: How to apply the Apache License to your work.
-
-      To apply the Apache License to your work, attach the following
-      boilerplate notice, with the fields enclosed by brackets "{}"
-      replaced with your own identifying information. (Don't include
-      the brackets!)  The text should be enclosed in the appropriate
-      comment syntax for the file format. We also recommend that a
-      file or class name and description of purpose be included on the
-      same "printed page" as the copyright notice for easier
-      identification within third-party archives.
-
-   Copyright {yyyy} {name of copyright owner}
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-============================================================================
-Apache DolphinScheduler Python API SUBCOMPONENTS:
-
-The Apache DolphinScheduler Python API project contains subcomponents
-with separate copyright notices and license terms. Your use of the source
-code for the these subcomponents is subject to the terms and conditions
-of the following licenses.
-
-========================================================================
-BSD licenses
-========================================================================
-
-The following components are provided under a BSD license. See project link for details.
-The text of each license is also included at licenses/LICENSE-[project].txt.
-
-    py4j v0.10 (https://github.com/py4j/py4j)
-    click v8.0 (https://github.com/pallets/click)
-
-========================================================================
-MIT licenses
-========================================================================
-
-The following components are provided under the MIT License. See project link for details.
-The text of each license is also included at licenses/LICENSE-[project].txt.
-
-    ruamel.yaml v0.17 (https://sourceforge.net/projects/ruamel-yaml/)
diff --git a/dolphinscheduler-python/pydolphinscheduler/NOTICE b/dolphinscheduler-python/pydolphinscheduler/NOTICE
deleted file mode 100644
index 61acdab5d8..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/NOTICE
+++ /dev/null
@@ -1,5 +0,0 @@
-Apache DolphinScheduler
-Copyright 2017-2022 The Apache Software Foundation
-
-This product includes software developed at
-The Apache Software Foundation (http://www.apache.org/).
diff --git a/dolphinscheduler-python/pydolphinscheduler/README.md b/dolphinscheduler-python/pydolphinscheduler/README.md
deleted file mode 100644
index 7fc73d6a29..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/README.md
+++ /dev/null
@@ -1,90 +0,0 @@
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-
-# pydolphinscheduler
-
-[![PyPi Version](https://img.shields.io/pypi/v/apache-dolphinscheduler.svg?style=flat-square&logo=PyPi)](https://pypi.org/project/apache-dolphinscheduler/)
-[![PyPi Python Versions](https://img.shields.io/pypi/pyversions/apache-dolphinscheduler.svg?style=flat-square&logo=python)](https://pypi.org/project/apache-dolphinscheduler/)
-[![PyPi License](https://img.shields.io/pypi/l/apache-dolphinscheduler.svg?style=flat-square)](https://pypi.org/project/apache-dolphinscheduler/)
-[![PyPi Status](https://img.shields.io/pypi/status/apache-dolphinscheduler.svg?style=flat-square)](https://pypi.org/project/apache-dolphinscheduler/)
-[![PyPi Downloads](https://img.shields.io/pypi/dm/apache-dolphinscheduler?style=flat-square)](https://pypi.org/project/apache-dolphinscheduler/)
-
-[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/psf/black)
-[![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat-square&labelColor=ef8336)](https://pycqa.github.io/isort)
-[![GitHub Build](https://github.com/apache/dolphinscheduler/actions/workflows/py-ci.yml/badge.svg?branch=dev)](https://github.com/apache/dolphinscheduler/actions?query=workflow%3A%22Python+API%22)
-
-**PyDolphinScheduler** is the Python API for Apache DolphinScheduler, which allows you to define
-your workflow in Python code, aka workflow-as-code.
-
-## Quick Start
-
-### Installation
-
-```shell
-# Install
-python -m pip install apache-dolphinscheduler
-
-# Verify the installation succeeded; it shows the version of apache-dolphinscheduler, here we use 0.1.0 as an example
-pydolphinscheduler version
-# 0.1.0
-```
-
-> NOTE: the package apache-dolphinscheduler does not work on Python 3.10 and above on the Windows operating system,
-> because its dependency [py4j](https://pypi.org/project/py4j/) does not work in those environments.
-
-Below we show how to run a simple example of pydolphinscheduler.
-
-### Start Server And Run Example
-
-Before you run an example, you have to start the backend server. You can follow the
-[development setup](../../docs/docs/en/contribute/development-environment-setup.md)
-section "DolphinScheduler Standalone Quick Start" to set up the developer environment. You have to start both the backend
-and the frontend server in this step, which means you can view the DolphinScheduler UI in your browser at the URL
-http://localhost:12345/dolphinscheduler
-
-After the backend server is started, all requests from `pydolphinscheduler` are sent to it.
-For now we can run a simple example by:
-
-<!-- TODO Add examples directory to dist package later. -->
-
-```shell
-# Please make sure your terminal can access the internet to download the example
-curl https://raw.githubusercontent.com/apache/dolphinscheduler/dev/dolphinscheduler-python/pydolphinscheduler/examples/tutorial.py -o ./tutorial.py
-python ./tutorial.py
-```
-
-> **_NOTICE:_** Since Apache DolphinScheduler requires a tenant to run commands, you might need to change the
-> tenant value in `example/tutorial.py`. For now the value is `tenant_exists`; please change it to a username that exists
-> in your environment.
-
-After the command executes, you can see a new project with a single process definition named *tutorial* in the
-[UI-project list](https://dolphinscheduler.apache.org/en-us/docs/latest/user_doc/guide/project/project-list.html).
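If you just want to see the shape of such a workflow without downloading the tutorial, the following is a minimal, hedged sketch (it is not the shipped `tutorial.py`, and the import paths are assumptions based on the package layout). It requires the backend server started above and a tenant that exists in your environment:

```python
# Minimal workflow-as-code sketch (illustrative only).
from pydolphinscheduler.core import ProcessDefinition  # assumed import path
from pydolphinscheduler.tasks import Shell             # assumed import path

# Tasks created inside the context manager are attached to `pd` automatically.
with ProcessDefinition(name="hello-pydolphinscheduler", tenant="tenant_exists") as pd:
    Shell(name="say-hello", command="echo hello pydolphinscheduler")
    pd.run()  # submit the workflow to the backend and trigger a run
```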
-
-## Develop
-
-So far, we have finished the quick start by running an example of pydolphinscheduler. If you want to inspect or join
-pydolphinscheduler development, take a look at [develop](./DEVELOP.md)
-
-## Release
-
-If you are interested in how to release **PyDolphinScheduler**, see [release](./RELEASE.md)
-
-## What's more
-
-For more detailed information, please see the latest (unreleased) **PyDolphinScheduler** [documentation](https://dolphinscheduler.apache.org/python/dev/index.html)
diff --git a/dolphinscheduler-python/pydolphinscheduler/RELEASE.md b/dolphinscheduler-python/pydolphinscheduler/RELEASE.md
deleted file mode 100644
index e00ef05beb..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/RELEASE.md
+++ /dev/null
@@ -1,35 +0,0 @@
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-
-# Release
-
-The official **PyDolphinScheduler** releases are in the [ASF Distribution Directory](https://downloads.apache.org/dolphinscheduler/),
-and they should be released together with [apache-dolphinscheduler](https://github.com/apache/dolphinscheduler).
-
-## To ASF Distribution Directory
-
-You can release to the [ASF Distribution Directory](https://downloads.apache.org/dolphinscheduler/) according to the
-[release guide](../../docs/docs/en/contribute/release/release-prepare.md) on the DolphinScheduler
-website.
-
-## To PyPi
-
-[PyPI](https://pypi.org), the Python Package Index, is a repository of software for the Python programming language.
-Users can install Python packages from it. Releasing to PyPI makes it easier for users to install and try PyDolphinScheduler.
-There is an official way to package a project from [PyPA](https://packaging.python.org/en/latest/tutorials/packaging-projects)
diff --git a/dolphinscheduler-python/pydolphinscheduler/UPDATING.md b/dolphinscheduler-python/pydolphinscheduler/UPDATING.md
deleted file mode 100644
index b298c3b1ad..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/UPDATING.md
+++ /dev/null
@@ -1,40 +0,0 @@
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-
-# UPDATING
-
-This document records non-backward compatible updates and notifies users of the detailed changes in pydolphinscheduler.
-It started after version 2.0.5 was released.
-
-## dev
-
-* Remove parameter ``task_location`` in process definition and Java Gateway service ([#11681](https://github.com/apache/dolphinscheduler/pull/11681))
-* Remove the spark version of spark task ([#11860](https://github.com/apache/dolphinscheduler/pull/11860)).
-
-## 3.0.0
-
-* Integrate the Python gateway server into the DolphinScheduler API server; you can start the Python gateway service with the command
-  `./bin/dolphinscheduler-daemon.sh start api-server` instead of the independent command
-  `./bin/dolphinscheduler-daemon.sh start python-gateway-server`.
-* Remove the parameter `queue` from class `ProcessDefinition` to avoid confusing users when changing it has no effect.
-* Change the `yaml_parser.py` method `to_string` to the magic method `__str__` to make it more Pythonic.
-* Use the package ``ruamel.yaml`` instead of ``pyyaml`` to write YAML files with comments.
-* Rename the variable that sets where the pydolphinscheduler configuration is kept from ``PYDOLPHINSCHEDULER_HOME`` to
-  ``PYDS_HOME``, consistent with the other environment variable names.
-
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/Makefile b/dolphinscheduler-python/pydolphinscheduler/docs/Makefile
deleted file mode 100644
index ff2c4ebb44..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/Makefile
+++ /dev/null
@@ -1,44 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Minimal makefile for Sphinx documentation
-#
-
-# You can set these variables from the command line, and also
-# from the environment for the first two.
-
-# Add the option to turn warnings into errors for strict sphinx-build behavior
-SPHINXOPTS    ?= -W
-SPHINXBUILD   ?= sphinx-build
-SPHINXMULTIVERSION   ?= sphinx-multiversion
-SOURCEDIR     = source
-BUILDDIR      = build
-
-# Put it first so that "make" without argument is like "make help".
-help:
-	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
-
-.PHONY: help Makefile
-
-# Catch-all target: route all unknown targets to Sphinx using the new
-# "make mode" option.  $(O) is meant as a shortcut for $(SPHINXOPTS).
-%: Makefile
-	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
-
-# Create multiple version of docs
-multiversion:
-	@$(SPHINXMULTIVERSION) "$(SOURCEDIR)" "$(BUILDDIR)/html"
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/make.bat b/dolphinscheduler-python/pydolphinscheduler/docs/make.bat
deleted file mode 100644
index feac4c92c0..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/make.bat
+++ /dev/null
@@ -1,54 +0,0 @@
-REM Licensed to the Apache Software Foundation (ASF) under one
-REM or more contributor license agreements.  See the NOTICE file
-REM distributed with this work for additional information
-REM regarding copyright ownership.  The ASF licenses this file
-REM to you under the Apache License, Version 2.0 (the
-REM "License"); you may not use this file except in compliance
-REM with the License.  You may obtain a copy of the License at
-REM 
-REM   http://www.apache.org/licenses/LICENSE-2.0
-REM 
-REM Unless required by applicable law or agreed to in writing,
-REM software distributed under the License is distributed on an
-REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-REM KIND, either express or implied.  See the License for the
-REM specific language governing permissions and limitations
-REM under the License.
-
-@ECHO OFF
-
-pushd %~dp0
-
-REM Command file for Sphinx documentation
-
-if "%SPHINXBUILD%" == "" (
-	set SPHINXBUILD=sphinx-build
-)
-set SOURCEDIR=source
-set BUILDDIR=build
-REM Add the option to turn warnings into errors for strict sphinx-build behavior
-set SPHINXOPTS=-W
-
-if "%1" == "" goto help
-
-%SPHINXBUILD% >NUL 2>NUL
-if errorlevel 9009 (
-	echo.
-	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
-	echo.installed, then set the SPHINXBUILD environment variable to point
-	echo.to the full path of the 'sphinx-build' executable. Alternatively you
-	echo.may add the Sphinx directory to PATH.
-	echo.
-	echo.If you don't have Sphinx installed, grab it from
-	echo.https://www.sphinx-doc.org/
-	exit /b 1
-)
-
-%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
-goto end
-
-:help
-%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
-
-:end
-popd
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/_static/.gitkeep b/dolphinscheduler-python/pydolphinscheduler/docs/source/_static/.gitkeep
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/_templates/versioning.html b/dolphinscheduler-python/pydolphinscheduler/docs/source/_templates/versioning.html
deleted file mode 100644
index 47136c45cf..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/_templates/versioning.html
+++ /dev/null
@@ -1,27 +0,0 @@
-{#
- Licensed to the Apache Software Foundation (ASF) under one
- or more contributor license agreements.  See the NOTICE file
- distributed with this work for additional information
- regarding copyright ownership.  The ASF licenses this file
- to you under the Apache License, Version 2.0 (the
- "License"); you may not use this file except in compliance
- with the License.  You may obtain a copy of the License at
-
-   http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing,
- software distributed under the License is distributed on an
- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- KIND, either express or implied.  See the License for the
- specific language governing permissions and limitations
- under the License.
-#}
-
-{% if versions %}
-<h3>{{ _('Versions') }}</h3>
-<ul>
-  {%- for item in versions %}
-  <li><a href="{{ item.url }}">{{ item.name }}</a></li>
-  {%- endfor %}
-</ul>
-{% endif %}
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/_templates/versions.html b/dolphinscheduler-python/pydolphinscheduler/docs/source/_templates/versions.html
deleted file mode 100644
index 51b7271e9c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/_templates/versions.html
+++ /dev/null
@@ -1,46 +0,0 @@
-{#
- Licensed to the Apache Software Foundation (ASF) under one
- or more contributor license agreements.  See the NOTICE file
- distributed with this work for additional information
- regarding copyright ownership.  The ASF licenses this file
- to you under the Apache License, Version 2.0 (the
- "License"); you may not use this file except in compliance
- with the License.  You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing,
- software distributed under the License is distributed on an
- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- KIND, either express or implied.  See the License for the
- specific language governing permissions and limitations
- under the License.
-#}
-
-{%- if current_version %}
-<div class="rst-versions" data-toggle="rst-versions" role="note" aria-label="versions">
-  <span class="rst-current-version" data-toggle="rst-current-version">
-    <span class="fa fa-book"> Other Versions</span>
-    v: {{ current_version.name }}
-    <span class="fa fa-caret-down"></span>
-  </span>
-  <div class="rst-other-versions">
-    {%- if versions.tags %}
-    <dl>
-      <dt>Tags</dt>
-      {%- for item in versions.tags %}
-      <dd><a href="{{ item.url }}">{{ item.name }}</a></dd>
-      {%- endfor %}
-    </dl>
-    {%- endif %}
-    {%- if versions.branches %}
-    <dl>
-      <dt>Branches</dt>
-      {%- for item in versions.branches %}
-      <dd><a href="{{ item.url }}">{{ item.name }}</a></dd>
-      {%- endfor %}
-    </dl>
-    {%- endif %}
-  </div>
-</div>
-{%- endif %}
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/api.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/api.rst
deleted file mode 100644
index b170b6f870..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/api.rst
+++ /dev/null
@@ -1,47 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-API
-===
-
-Core
-----
-
-.. automodule:: pydolphinscheduler.core
-  :inherited-members:
-
-Models
-------
-
-.. automodule:: pydolphinscheduler.models
-  :inherited-members:
-
-Tasks
------
-
-.. automodule:: pydolphinscheduler.tasks
-  :inherited-members:
-
-Constants
----------
-
-.. automodule:: pydolphinscheduler.constants
-
-Exceptions
-----------
-
-.. automodule:: pydolphinscheduler.exceptions
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/cli.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/cli.rst
deleted file mode 100644
index 60e8231abf..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/cli.rst
+++ /dev/null
@@ -1,36 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Command Line Interface
-======================
-
-*PyDolphinScheduler* has a mechanism called the CLI (command line interface) to help users control it from the shell.
-
-Prepare
--------
-
-You have to :ref:`install PyDolphinScheduler <start:installing pydolphinscheduler>` before using
-its CLI.
-
-Usage
------
-
-Here is the basic usage of the *PyDolphinScheduler* command line:
-
-.. click:: pydolphinscheduler.cli.commands:cli
-   :prog: pydolphinscheduler
-   :nested: full
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/concept.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/concept.rst
deleted file mode 100644
index 9a9527df1d..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/concept.rst
+++ /dev/null
@@ -1,151 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Concepts
-========
-
-In this section, you will learn the core concepts of *PyDolphinScheduler*.
-
-Process Definition
-------------------
-
-Process definition describes everything about the workflow except `tasks`_ and `tasks dependence`_, including the
-name, schedule interval, and schedule start and end time. You will learn more about scheduling in `Schedule`_ below.
-
-A process definition can be initialized with a normal assignment statement or with a context manager.
-
-.. code-block:: python
-
-   # Initialization with an assignment statement
-   pd = ProcessDefinition(name="my first process definition")
-
-   # Or with a context manager
-   with ProcessDefinition(name="my first process definition") as pd:
-       pd.submit()
-
-Process definition is the main object that communicates between *PyDolphinScheduler* and the DolphinScheduler daemon.
-After the process definition and its tasks are declared, you can use `submit` and `run` to notify the server of your definition.
-
-If you just want to submit your definition and create the workflow without running it, use the method `submit`.
-But if you want to run the workflow after you submit it, use the method `run`.
-
-.. code-block:: python
-
-   # Just submit the definition, without running it
-   pd.submit()
-
-   # Both submit and run the definition
-   pd.run()
-
-Schedule
-~~~~~~~~
-
-We use the parameter `schedule` to determine the schedule interval of the workflow. *PyDolphinScheduler* supports a seven-field
-cron expression, and the meaning of each position is as below:
-
-.. code-block:: text
-
-    * * * * * * *
-    ┬ ┬ ┬ ┬ ┬ ┬ ┬
-    │ │ │ │ │ │ │
-    │ │ │ │ │ │ └─── year
-    │ │ │ │ │ └───── day of week (0 - 7) (0 to 6 are Sunday to Saturday, or use names; 7 is Sunday, the same as 0)
-    │ │ │ │ └─────── month (1 - 12)
-    │ │ │ └───────── day of month (1 - 31)
-    │ │ └─────────── hour (0 - 23)
-    │ └───────────── min (0 - 59)
-    └─────────────── second (0 - 59)
-
-Here are some example crontab expressions (a usage sketch follows the list):
-
-- `0 0 0 * * ? *`: the workflow executes every day at 00:00:00.
-- `10 2 * * * ? *`: the workflow executes every hour at two minutes and ten seconds past the hour.
-- `10,11 20 0 1,2 * ? *`: the workflow executes on the first and second day of each month at 00:20:10 and 00:20:11.
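As a hedged usage sketch (only the parameters `name` and `schedule`, which are described in this document, are shown; how the definition is submitted is covered in `Process Definition`_ above):

.. code-block:: python

   # Illustrative only: run the workflow every day at 00:00:00.
   pd = ProcessDefinition(
       name="daily_workflow",
       schedule="0 0 0 * * ? *",
   )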
-
-Tenant
-~~~~~~
-
-Tenant is the user who runs the task command on the machine or the virtual machine. It can be assigned with a simple string.
-
-.. code-block:: python
-
-   # 
-   pd = ProcessDefinition(name="process definition tenant", tenant="tenant_exists")
-
-.. note::
-
-   Make sure the tenant exists on the target machine, otherwise it will raise an error when you try to run the command.
-
-Tasks
------
-
-A task is the minimum unit that runs an actual job, and tasks are the nodes of the DAG, aka directed acyclic graph. You can define
-what you want to do in the task. It has some required parameters that make it unique and well defined.
-
-Here we use :py:meth:`pydolphinscheduler.tasks.Shell` as an example; the parameters `name` and `command` are required and must be provided. The parameter
-`name` sets the name of the task, and the parameter `command` declares the command you wish to run in this task.
-
-.. code-block:: python
-
-   # We name this task "shell", and just run the command `echo shell task`
-   shell_task = Shell(name="shell", command="echo shell task")
-
-If you want to see all types of tasks, see :doc:`tasks/index`.
-
-Tasks Dependence
-~~~~~~~~~~~~~~~~
-
-You can define many tasks in one single `Process Definition`_. If all those tasks run in parallel,
-you can leave them alone without adding any additional information. But if some tasks should
-not run until their predecessor tasks in the workflow are done, we should set task dependence on them. There are two main
-ways to set task dependence, and both of them are easy: you can use the bitwise operators `>>` and `<<`, or the task methods
-`set_downstream` and `set_upstream`.
-
-.. code-block:: python
-
-   # Set task1 as task2 upstream
-   task1 >> task2
-   # You could use the method `set_downstream` too, which is the same as `task1 >> task2`
-   task1.set_downstream(task2)
-   
-   # Set task1 as task2 downstream
-   task1 << task2
-   # It is the same as the method `set_upstream`
-   task1.set_upstream(task2)
-   
-   # Besides, we can set dependence between a task and a sequence of tasks:
-   # here we set `task1` as upstream of both `task2` and `task3`. It is useful
-   # when several tasks share the same dependence.
-   task1 >> [task2, task3]
-
-Task With Process Definition
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-In most data orchestration cases, you should assign the attribute `process_definition` to the task instance to
-decide which workflow the task belongs to. You can set `process_definition` either by normal assignment or in context manager mode:
-
-.. code-block:: python
-
-   # Normal assignment: explicitly declare the `ProcessDefinition` instance and pass it to the task
-   pd = ProcessDefinition(name="my first process definition")
-   shell_task = Shell(name="shell", command="echo shell task", process_definition=pd)
-
-   # Context manager: the `ProcessDefinition` instance pd is implicitly attached to the task
-   with ProcessDefinition(name="my first process definition") as pd:
-       shell_task = Shell(name="shell", command="echo shell task")
-
-With `Process Definition`_, `Tasks`_ and `Tasks Dependence`_, we can build a workflow with multiple tasks, as in the sketch below.
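As a hedged end-to-end sketch combining the three concepts (import paths are assumptions based on the package layout, and `submit` needs a running DolphinScheduler backend):

.. code-block:: python

   from pydolphinscheduler.core import ProcessDefinition  # assumed import path
   from pydolphinscheduler.tasks import Shell             # assumed import path

   with ProcessDefinition(
       name="etl_example",
       tenant="tenant_exists",    # the tenant must exist on the target machine
       schedule="0 0 0 * * ? *",  # every day at 00:00:00
   ) as pd:
       extract = Shell(name="extract", command="echo extract data")
       transform = Shell(name="transform", command="echo transform data")
       load = Shell(name="load", command="echo load data")

       # extract runs first, then transform, then load
       extract >> transform
       transform >> load

       pd.submit()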
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/conf.py b/dolphinscheduler-python/pydolphinscheduler/docs/source/conf.py
deleted file mode 100644
index 23fc117fb7..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/conf.py
+++ /dev/null
@@ -1,121 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Configuration file for the Sphinx documentation builder.
-#
-# This file only contains a selection of the most common options. For a full
-# list see the documentation:
-# https://www.sphinx-doc.org/en/master/usage/configuration.html
-
-# -- Path setup --------------------------------------------------------------
-
-# If extensions (or modules to document with autodoc) are in another directory,
-# add these directories to sys.path here. If the directory is relative to the
-# documentation root, use os.path.abspath to make it absolute, like shown here.
-
-import os
-import sys
-from pathlib import Path
-
-# For sphinx-multiversion, we need to build API docs of the corresponding package version, related issue:
-# https://github.com/Holzhaus/sphinx-multiversion/issues/42
-pkg_src_dir = (
-    Path(os.environ.get("SPHINX_MULTIVERSION_SOURCEDIR", default="."))
-    .joinpath("../../src")
-    .resolve()
-)
-sys.path.insert(0, str(pkg_src_dir))
-# Debug: uncomment this to see the source path
-# print("=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=")
-# print(pkg_src_dir)
-# [print(p) for p in sys.path]
-# print("=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=")
-
-
-# -- Project information -----------------------------------------------------
-
-project = "pydolphinscheduler"
-copyright = "2022, apache"
-author = "apache dolphinscheduler contributors"
-
-# The full version, including alpha/beta/rc tags
-release = "0.0.1"
-
-
-# -- General configuration ---------------------------------------------------
-
-# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
-# ones.
-extensions = [
-    # Measures durations of Sphinx processing
-    "sphinx.ext.duration",
-    # Semi-automatically turn docstrings into documentation
-    "sphinx.ext.autodoc",
-    "sphinx.ext.viewcode",
-    "sphinx.ext.autosectionlabel",
-    "sphinx_rtd_theme",
-    # Documenting command line interface
-    "sphinx_click.ext",
-    # Add inline tabbed content
-    "sphinx_inline_tabs",
-    "sphinx_copybutton",
-    "sphinx_multiversion",
-]
-
-# Add any paths that contain templates here, relative to this directory.
-templates_path = ["_templates"]
-
-# sphinx_multiversion configuration
-html_sidebars = {
-    "**": [
-        "versioning.html",
-    ],
-}
-# Match all existing tags for pydolphinscheduler except version 2.0.4 (not released with apache dolphinscheduler)
-smv_tag_whitelist = r"^(?!2.0.4)\d+\.\d+\.\d+$"
-smv_branch_whitelist = "dev"
-smv_remote_whitelist = r"^(origin|upstream)$"
-smv_released_pattern = "^refs/tags/.*$"
-smv_outputdir_format = "versions/{ref.name}"
-
-# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-# This pattern also affects html_static_path and html_extra_path.
-exclude_patterns = []
-
-autodoc_default_options = {
-    "members": True,
-    "show-inheritance": True,
-    "private-members": True,
-    "undoc-members": True,
-    "member-order": "groupwise",
-}
-
-autosectionlabel_prefix_document = True
-
-# -- Options for HTML output -------------------------------------------------
-
-# The theme to use for HTML and HTML Help pages.  See the documentation for
-# a list of builtin themes.
-#
-html_theme = "sphinx_rtd_theme"
-
-# Add any paths that contain custom static files (such as style sheets) here,
-# relative to this directory. They are copied after the builtin static files,
-# so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ["_static"]
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/config.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/config.rst
deleted file mode 100644
index 29a143d713..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/config.rst
+++ /dev/null
@@ -1,218 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Configuration
-=============
-
-pydolphinscheduler has a built-in module that sets the configuration necessary to start and run your workflow code.
-You can use it directly if you only want a quick start or a simple job like a POC. But if you
-want to use pydolphinscheduler in depth, or even in production, you will probably need to modify and
-change the built-in configuration.
-
-We have two ways to modify the configuration:
-
-- `Using Environment Variables`_: The more lightweight way to modify the configuration. It is useful in
-  containerization scenarios, like docker and k8s, or when you want to temporarily override values from the
-  configuration file.
-- `Using Configuration File`_: The more general way to modify the configuration. It is useful when you want
-  to persist and manage configuration files in one single file.
-
-Using Environment Variables
----------------------------
-
-You can change the configuration by adding or modifying the operating system's environment variables. It does
-not matter how you do it, as long as the environment variables are successfully modified. We use two common
-ways, `Bash <by bash>`_ and `Python OS Module <by python os module>`_, as examples:
-
-By Bash
-^^^^^^^
-
-Setting environment variables via `Bash` is the most straightforward and easiest way. We give some examples of
-how to change them in Bash.
-
-.. code-block:: bash
-
-   # Modify Java Gateway Address
-   export PYDS_JAVA_GATEWAY_ADDRESS="192.168.1.1"
-
-   # Modify Workflow Default User
-   export PYDS_WORKFLOW_USER="custom-user"
-
-After executing the commands above, both ``PYDS_JAVA_GATEWAY_ADDRESS`` and ``PYDS_WORKFLOW_USER`` will be changed.
-The next time you execute and submit your workflow, it will be submitted to the host `192.168.1.1`, with the workflow's
-user named `custom-user`.
-
-By Python OS Module
-^^^^^^^^^^^^^^^^^^^
-
-pydolphinscheduler is a Python API for Apache DolphinScheduler, and you can modify or add system environment
-variables via the Python ``os`` module. In this example, we set the variables to the same values as in
-`Bash <by bash>`_. The change takes effect the next time you run your workflow and call the workflow's ``run``
-or ``submit`` method after the ``os.environ`` statements.
-
-.. code-block:: python
-
-   import os
-   # Modify Java Gateway Address
-   os.environ["PYDS_JAVA_GATEWAY_ADDRESS"] = "192.168.1.1"
-
-   # Modify Workflow Default User
-   os.environ["PYDS_WORKFLOW_USER"] = "custom-user"
-
-All Configurations in Environment Variables
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-All environment variables are listed below, and you can modify their values via `Bash <by bash>`_ or `Python OS Module <by python os module>`_
-
-+------------------+------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-| Variable Section | Variable Name                      | description                                                                                                        |
-+==================+====================================+====================================================================================================================+
-|                  | ``PYDS_JAVA_GATEWAY_ADDRESS``      | Default Java gateway address, will use its value when it is set.                                                   |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|   Java Gateway   | ``PYDS_JAVA_GATEWAY_PORT``         | Default Java gateway port, will use its value when it is set.                                                      |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_JAVA_GATEWAY_AUTO_CONVERT`` | Default boolean Java gateway auto convert, will use its value when it is set.                                      |
-+------------------+------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_USER_NAME``                 | Default user name, will use when user's ``name`` when does not specify.                                            |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_USER_PASSWORD``             | Default user password, will use when user's ``password`` when does not specify.                                    |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|   Default User   | ``PYDS_USER_EMAIL``                | Default user email, will use when user's ``email`` when does not specify.                                          |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_USER_PHONE``                | Default user phone, will use when user's ``phone`` when does not specify.                                          |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_USER_STATE``                | Default user state, will use when user's ``state`` when does not specify.                                          |
-+------------------+------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_PROJECT``          | Default workflow project name, will use its value when workflow does not specify the attribute ``project``.        |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_TENANT``           | Default workflow tenant, will use its value when workflow does not specify the attribute ``tenant``.               |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-| Default Workflow | ``PYDS_WORKFLOW_USER``             | Default workflow user, will use its value when workflow does not specify the attribute ``user``.                   |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_QUEUE``            | Default workflow queue, will use its value when workflow does not specify the attribute ``queue``.                 |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_WORKER_GROUP``     | Default workflow worker group, will use its value when workflow does not specify the attribute ``worker_group``.   |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_RELEASE_STATE``    | Default workflow release state, will use its value when workflow does not specify the attribute ``release_state``. |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_TIME_ZONE``        | Default workflow worker group, will use its value when workflow does not specify the attribute ``timezone``.       |
-+                  +------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-|                  | ``PYDS_WORKFLOW_WARNING_TYPE``     | Default workflow warning type, will use its value when workflow does not specify the attribute ``warning_type``.   |
-+------------------+------------------------------------+--------------------------------------------------------------------------------------------------------------------+
-
-.. note::
-
-   Configuration set via environment variables only applies within the workflow run; it does not change the
-   value in the configuration file. The :doc:`CLI <cli>` commands ``config --get`` and ``config --set`` operate
-   on the configuration file, so ``config --get`` may return a different value from what you set in the
-   environment variable, and neither command will ever change your environment variables.
-
-Using Configuration File
-------------------------
-
-If you want to persist and manage configuration in a file instead of environment variables, or you want
-to save your configuration file in a version control system, like Git or SVN, then changing
-configuration by file is the best choice.
-
-Export Configuration File
-^^^^^^^^^^^^^^^^^^^^^^^^^
-
-pydolphinscheduler allows you to change the built-in configuration via the CLI or any editor you like. The built-in
-configuration is integrated into the pydolphinscheduler package, but you can also export it locally via the CLI
-
-.. code-block:: bash
-
-   pydolphinscheduler config --init
-
-It will create a new YAML file at the path `~/pydolphinscheduler/config.yaml` by default. If you want to export
-it to another path, you should set `PYDS_HOME` before you run the command :code:`pydolphinscheduler config --init`.
-
-.. code-block:: bash
-
-    export PYDS_HOME=<CUSTOM_PATH>
-    pydolphinscheduler config --init
-
-After that, your configuration file will be exported to `<CUSTOM_PATH>/config.yaml` instead of the default path.
-
-Change Configuration
-^^^^^^^^^^^^^^^^^^^^
-
-In section `export configuration file`_ you exported the configuration file locally; as a local file, you can
-edit it with any editor you like. After you save your changes in your editor, the latest configuration will take
-effect when you run your workflow code.
-
-You can also query or change the configuration via the CLI with :code:`config --get <config>` or :code:`config --set <config> <val>`.
-Both `--get` and `--set` can be used one or more times in a single command. You can only set leaf
-nodes of the configuration, but you can also get a parent configuration; there are simple examples below:
-
-.. code-block:: bash
-
-   # Get single configuration in the leaf node,
-   # The output look like below:
-   # java_gateway.address = 127.0.0.1
-   pydolphinscheduler config --get java_gateway.address
-
-   # Get multiple configuration in the leaf node,
-   # The output look like below:
-   # java_gateway.address = 127.0.0.1
-   # java_gateway.port = 25333
-   pydolphinscheduler config --get java_gateway.address --get java_gateway.port
-
-
-   # Get parent configuration which contain multiple leaf nodes,
-   # The output look like below:
-   # java_gateway = ordereddict([('address', '127.0.0.1'), ('port', 25333), ('auto_convert', True)])
-   pydolphinscheduler config --get java_gateway
-
-   # Set single configuration,
-   # The output look like below:
-   # Set configuration done.
-   pydolphinscheduler config --set java_gateway.address 192.168.1.1
-
-   # Set multiple configuration
-   # The output look like below:
-   # Set configuration done.
-   pydolphinscheduler config --set java_gateway.address 192.168.1.1 --set java_gateway.port 25334
-
-   # Set configuration not in leaf node will fail
-   # The output look like below:
-   # Raise error.
-   pydolphinscheduler config --set java_gateway 192.168.1.1,25334,True
-
-For more information about our CLI, you could see document :doc:`cli`.
-
-All Configurations in File
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Here are all our configurations for pydolphinscheduler.
-
-.. literalinclude:: ../../src/pydolphinscheduler/default_config.yaml
-   :language: yaml
-   :lines: 18-
-
-Priority
---------
-
-We have two ways to modify the configuration, and there is also a built-in config in pydolphinscheduler. It is
-very important to understand the priority of the configuration when you use them. The overview of configuration
-priority is:
-
-``Environment Variables > Configuration File > Built-in Configurations``
-
-This means that your settings in environment variables or the configuration file will overwrite the built-in ones.
-You can also temporarily modify configurations by setting environment variables without modifying the global
-config in the configuration file.
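-
-As an illustrative sketch (using only the environment variable and values already shown above), an environment
-variable set just before you submit a workflow wins over both the configuration file and the built-in value
-for that run:
-
-.. code-block:: python
-
-   import os
-
-   # Even if the configuration file says ``java_gateway.address = 127.0.0.1``,
-   # this run will connect to 192.168.1.1, because environment variables have
-   # the highest priority. The configuration file itself is not changed.
-   os.environ["PYDS_JAVA_GATEWAY_ADDRESS"] = "192.168.1.1"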
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/howto/index.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/howto/index.rst
deleted file mode 100644
index a0b3c29c0c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/howto/index.rst
+++ /dev/null
@@ -1,30 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-HOWTOs
-======
-
-pydolphinscheduler HOWTOs are documents that cover a single, specific topic, and attempt to cover it fairly
-completely. This collection is an effort to foster documentation that is more detailed than the :doc:`../concept`
-and :doc:`../tutorial`.
-
-Currently, the HOWTOs are:
-
-.. toctree::
-   :maxdepth: 2
-   
-   remote-submit
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/howto/remote-submit.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/howto/remote-submit.rst
deleted file mode 100644
index b7efdf4fc0..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/howto/remote-submit.rst
+++ /dev/null
@@ -1,51 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Submit Your Code from a Different Machine
-==========================================
-
-Generally, we use pydolphinscheduler as a client to DolphinScheduler, and since we may change our workflow
-code frequently, the best practice is to run the :ref:`python gateway service <start:start python gateway service>`
-on your server machine and submit the workflow code from your development machine, like a laptop or PC. This behavior
-is supported by pydolphinscheduler out of the box with one or two command lines.
-
-Export Configuration File
--------------------------
-
-.. code-block:: bash
-
-   pydolphinscheduler config --init
-
-You can find more detail in :ref:`configuration exporting <config:export configuration file>`.
-
-Run API Server on Another Host
---------------------------------
-
-.. code-block:: bash
-
-   pydolphinscheduler config --set java_gateway.address <your-api-server-ip-or-hostname>
-
-You can find more detail in :ref:`configuration setting <config:change configuration>`.
-
-Run API Server on Another Port
---------------------------------
-
-.. code-block:: bash
-
-   pydolphinscheduler config --set java_gateway.port <your-python-gateway-service-port>
-
-You can find more detail in :ref:`configuration setting <config:change configuration>`.
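-
-If you prefer not to touch the configuration file at all, the same settings can be overridden per run via the
-environment variables described in :doc:`../config`. The snippet below is a sketch only, with placeholder values:
-
-.. code-block:: python
-
-   import os
-
-   # Override the gateway address and port for this run only; the configuration
-   # file keeps its original values.
-   os.environ["PYDS_JAVA_GATEWAY_ADDRESS"] = "<your-api-server-ip-or-hostname>"
-   os.environ["PYDS_JAVA_GATEWAY_PORT"] = "<your-python-gateway-service-port>"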
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/index.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/index.rst
deleted file mode 100644
index 4dc0a949c9..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/index.rst
+++ /dev/null
@@ -1,46 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-PyDolphinScheduler
-==================
-
-**PyDolphinScheduler** is the Python API for `Apache DolphinScheduler <https://dolphinscheduler.apache.org>`_,
-which allows you to define your workflow in Python code, aka workflow-as-code.
-
-You can go and find how to :ref:`install <start:getting started>` the project, or if you want to see a simple
-example, go and see :doc:`tutorial` for more detail.
-
-
-.. toctree::
-   :maxdepth: 2
-
-   start
-   tutorial
-   concept
-   tasks/index
-   howto/index
-   cli
-   config
-   api
-   resources_plugin/index
-
-Indices and tables
-==================
-
-* :ref:`genindex`
-* :ref:`modindex`
-* :ref:`search`
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/develop.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/develop.rst
deleted file mode 100644
index e7d90ea03c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/develop.rst
+++ /dev/null
@@ -1,46 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-How to develop
-==============
-
-When you want to create a new resource plugin, you need to add a new class in the module `resources_plugin`.
-
-The resource plugin class needs to inherit the abstract class `ResourcePlugin` and implement its abstract method `read_file`.
-
-The `__init__` function of `ResourcePlugin` takes a `prefix` parameter of type `str`. You can override this function when necessary.
-
-The `read_file` method of `ResourcePlugin` takes the file suffix (type `str`) as its parameter, and its return value is the file content, if it exists and is readable.
-
-
-Example
--------
-- Method `__init__`: Initialization method with the parameter `prefix`
-
-.. literalinclude:: ../../../src/pydolphinscheduler/resources_plugin/local.py
-    :start-after: [start init_method]
-    :end-before: [end init_method]
-
-- Method `read_file`: Get content from the given URI; the function parameter is the suffix of the file path.
-
-The file prefix has been initialized in the `__init__` of the resource plugin, and the prefix plus the suffix
-is the absolute path of the file in this resource.
-
-.. literalinclude:: ../../../src/pydolphinscheduler/resources_plugin/local.py
-    :start-after: [start read_file_method]
-    :end-before: [end read_file_method]
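-
-Putting the two methods together, a hypothetical plugin might look like the sketch below. The class name
-``MyPlugin`` is made up for illustration, and the exact signatures are assumptions based on the
-`ResourcePlugin` code shown in :doc:`resource-plugin`.
-
-.. code-block:: python
-
-   from pydolphinscheduler.core.resource_plugin import ResourcePlugin
-
-
-   class MyPlugin(ResourcePlugin):
-       """Hypothetical resource plugin reading files from a local prefix."""
-
-       def __init__(self, prefix: str):
-           # ``prefix`` is the prefix of the resource; we assume the base class
-           # stores it as ``self.prefix``.
-           super().__init__(prefix)
-
-       def read_file(self, suf: str) -> str:
-           # The prefix plus the suffix is the absolute path of the file.
-           path = self.prefix + suf
-           with open(path) as f:
-               return f.read()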
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/github.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/github.rst
deleted file mode 100644
index b3023377de..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/github.rst
+++ /dev/null
@@ -1,35 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-GitHub
-======
-
-`GitHub` is a GitHub resource plugin for pydolphinscheduler.
-
-When using the GitHub resource plugin, you only need to add the `resource_plugin` parameter to the task subclass or workflow definition,
-such as `resource_plugin=GitHub(prefix="https://github.com/xxx", access_token="ghpxx")`.
-The token parameter is optional; you only need to add it when your repository is private.
-
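-For illustration only, a task reading its script from a private repository could look like the sketch below;
-the import paths, the repository URL, and the interaction between `command` and the plugin prefix are
-assumptions here, see :doc:`resource-plugin` for the authoritative behaviour.
-
-.. code-block:: python
-
-   from pydolphinscheduler.resources_plugin.github import GitHub
-   from pydolphinscheduler.tasks.shell import Shell
-
-   # The command is treated as a file suffix and resolved against the plugin prefix,
-   # so the script content is fetched from the (private) GitHub repository.
-   shell_task = Shell(
-       name="run-remote-script",
-       command="demo.sh",
-       resource_plugin=GitHub(
-           prefix="https://github.com/xxx",
-           access_token="ghpxx",
-       ),
-   )
-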
-You can view this `document <https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token>`_
-when creating a token.
-
-For the specific use of resource plugins, you can see `How to use` in :doc:`resource-plugin`
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.resources_plugin.github
\ No newline at end of file
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/gitlab.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/gitlab.rst
deleted file mode 100644
index fdf43c9d2f..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/gitlab.rst
+++ /dev/null
@@ -1,46 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-GitLab
-======
-
-`GitLab` is a gitlab resource plugin for pydolphinscheduler.
-
-When using a gitlab resource plugin, you only need to add the `resource_plugin` parameter in the task subclass or workflow definition,
-such as `resource_plugin=GitLab(prefix="xxx")`, if it is a public repository.
-
-If it is a private or internal repository, you can use one of three ways to authenticate.
-
-The first is `Personal Access Tokens`, using `resource_plugin=GitLab(prefix="xxx", private_token="xxx")`.
-
-The second method is to obtain authentication through `username` and `password`:
-
-using `resource_plugin=GitLab(prefix="xxx", username="username", password="pwd")`.
-
-The third method is to obtain authentication through `OAuth Token`:
-
-using `resource_plugin=GitLab(prefix="xxx", oauth_token="xx")`.
-
-You can view this `document <https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token>`_
-when creating a `Personal Access Token`.
-
-For the specific use of resource plugins, you can see `How to use` in :doc:`resource-plugin`
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.resources_plugin.gitlab
\ No newline at end of file
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/index.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/index.rst
deleted file mode 100644
index c984f06048..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/index.rst
+++ /dev/null
@@ -1,32 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Resource Plugins
-================
-
-In this section
-
-.. toctree::
-   :maxdepth: 1
-
-   develop
-   resource-plugin
-   local
-   github
-   gitlab
-   oss
-   s3
\ No newline at end of file
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/local.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/local.rst
deleted file mode 100644
index 5da025a5c7..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/local.rst
+++ /dev/null
@@ -1,32 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Local
-=====
-
-`Local` is a local resource plugin for pydolphinscheduler.
-
-When using a local resource plugin, you only need to add the `resource_plugin` parameter in the task subclass or workflow definition,
-such as `resource_plugin=Local("/tmp")`.
-
-
-For the specific use of resource plugins, you can see `How to use` in :doc:`./resource-plugin`
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.resources_plugin.local
\ No newline at end of file
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/oss.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/oss.rst
deleted file mode 100644
index fbb6785d1d..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/oss.rst
+++ /dev/null
@@ -1,44 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-OSS
-===
-
-`OSS` is an Aliyun OSS resource plugin for pydolphinscheduler.
-
-When using an OSS resource plugin, you only need to add the `resource_plugin` parameter in the task subclass or workflow definition,
-such as `resource_plugin=OSS(prefix="xxx")`, if the file is publicly readable.
-
-When the file is private, use `resource_plugin=OSS(prefix="xxx", access_key_id="xxx", access_key_secret="xxx")`.
-
-Notice: the read permission of files in a bucket is inherited from the bucket by default. In other words, if the bucket is private,
-the files in it are also private.
-
-However, the read permission of files in the bucket can be changed; in other words, files in a private bucket can also be made publicly readable.
-
-So whether an `AccessKey` is needed depends on whether the file is private or not.
-
-You can view this `document <https://www.alibabacloud.com/help/en/tablestore/latest/how-can-i-obtain-an-accesskey-pair>`_
-when creating an `AccessKey` pair.
-
-For the specific use of resource plugins, you can see `How to use` in :doc:`resource-plugin`
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.resources_plugin.OSS
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/resource-plugin.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/resource-plugin.rst
deleted file mode 100644
index 2a32526208..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/resource-plugin.rst
+++ /dev/null
@@ -1,75 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-ResourcePlugin
-==============
-
-`ResourcePlugin` is the abstract class behind the resource plugin parameter of task subclasses and workflows.
-All resource plugins need to inherit it and override its abstract methods.
-
-Code
-----
-.. literalinclude:: ../../../src/pydolphinscheduler/core/resource_plugin.py
-   :start-after: [start resource_plugin_definition]
-   :end-before: [end resource_plugin_definition]
-
-Dive Into
----------
-It has the following key functions.
-
-- Method `__init__`: The `__init__` function has a `str` parameter `prefix`, which is the prefix of the resource.
-
-You can override this function if necessary.
-
-.. literalinclude:: ../../../src/pydolphinscheduler/core/resource_plugin.py
-    :start-after: [start init_method]
-    :end-before: [end init_method]
-
-- Method `read_file`: Get content from the given URI; the function parameter is the suffix of the file path.
-
-The file prefix has been initialized in the `__init__` of the resource plugin, and the prefix plus the suffix
-is the absolute path of the file in this resource.
-
-It is an abstract method; you must override it.
-
-.. literalinclude:: ../../../src/pydolphinscheduler/core/resource_plugin.py
-    :start-after: [start abstractmethod read_file]
-    :end-before: [end abstractmethod read_file]
-
-.. automodule:: pydolphinscheduler.core.resource_plugin
-
-How to use
-----------
-Resource plugins can be used in task subclasses and workflows. You can use a resource plugin by adding the `resource_plugin` parameter when they are initialized.
-For example, for the local resource plugin, add `resource_plugin=Local("/tmp")`.
-
-The resource plugins we currently support are `local`, `github`, `gitlab`, `OSS`, and `S3`.
-
-Here is an example.
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/tutorial_resource_plugin.py
-   :start-after: [start workflow_declare]
-   :end-before: [end task_declare]
-
-When the `resource_plugin` parameter is defined in both the task subclass and the workflow, the `resource_plugin` defined in the task subclass is used first.
-
-If the task subclass does not define `resource_plugin`, but `resource_plugin` is defined in the workflow, the `resource_plugin` of the workflow is used.
-
-Of course, if neither the task subclass nor the workflow specifies `resource_plugin`, the command is executed as a script;
-in other words, we are forward compatible.
\ No newline at end of file
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/s3.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/s3.rst
deleted file mode 100644
index f5bc1d37fe..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/resources_plugin/s3.rst
+++ /dev/null
@@ -1,36 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-S3
-==
-
-`S3` is an Amazon S3 resource plugin for pydolphinscheduler.
-
-When using an Amazon S3 resource plugin, you only need to add the `resource_plugin` parameter in the task subclass or workflow definition,
-such as `resource_plugin=S3(prefix="xxx")`, if the file is publicly readable.
-
-When the file is private, use `resource_plugin=S3(prefix="xxx", access_key_id="xxx", access_key_secret="xxx")`.
-
-You can view this `document <https://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html>`_
-when creating an `AccessKey` pair.
-
-For the specific use of resource plugins, you can see `How to use` in :doc:`resource-plugin`
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.resources_plugin.S3
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/start.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/start.rst
deleted file mode 100644
index 270b5b855d..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/start.rst
+++ /dev/null
@@ -1,171 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Getting Started
-===============
-
-To get started with *PyDolphinScheduler*, you must ensure python and pip
-are installed on your machine. If you're already set up, you can skip straight
-to `Installing PyDolphinScheduler`_; otherwise, please continue with
-`Installing Python`_.
-
-Installing Python
------------------
-
-How to install `python` and `pip` depends on the operating system
-you're using. The python wiki provides up-to-date
-`instructions for all platforms here`_. When you enter the website
-and choose your operating system, you will be offered a choice of
-python versions. *PyDolphinScheduler* recommends a version above
-Python 3.6, and we highly recommend you install a *Stable Release* instead
-of a *Pre-release*.
-
-After you have downloaded and installed Python, open your terminal and
-run :code:`python --version` to check whether the installation
-is correct. If all is well, you will see the version in the console
-without error (here is an example after Python 3.8.7 is installed)
-
-.. code-block:: bash
-
-   python --version
-
-You will see the Python version in detail, such as *Python 3.8.7*
-
-Installing PyDolphinScheduler
------------------------------
-
-After Python is installed on your machine following the section
-`installing Python`_, it is easy to install *PyDolphinScheduler* by pip.
-
-.. code-block:: bash
-
-   python -m pip install apache-dolphinscheduler
-
-The latest version of *PyDolphinScheduler* will be installed after you run the above
-command in your terminal. You can go and `start Python Gateway Service`_ to finish
-the preparation, and then go to :doc:`tutorial` to get your hands dirty. But if you
-want to install the unreleased version of *PyDolphinScheduler*, you can go and see
-section `installing PyDolphinScheduler in dev branch`_ for more detail.
-
-.. note::
-
-   Currently, we have released multiple pre-release packages on PyPI; you can see all released packages,
-   including pre-releases, in the `release history <https://pypi.org/project/apache-dolphinscheduler/#history>`_.
-   You can pin the package version if you want to install a pre-release package. For example, if
-   you want to install the version `3.0.0-beta-2` package, you can run the command
-   :code:`python -m pip install apache-dolphinscheduler==3.0.0b2`.
-
-Installing PyDolphinScheduler In DEV Branch
--------------------------------------------
-
-Because the project is under development, some of the features are not released yet.
-If you want to try something unreleased, you can install from the source code
-which we host on GitHub
-
-.. code-block:: bash
-
-   # Clone Apache DolphinScheduler repository
-   git clone git@github.com:apache/dolphinscheduler.git
-   # Install PyDolphinScheduler in develop mode
-   cd dolphinscheduler-python/pydolphinscheduler && python -m pip install -e .
-
-After you have installed *PyDolphinScheduler*, please remember to `start Python Gateway Service`_,
-which waits for *PyDolphinScheduler*'s workflow definition requests.
-
-The above command clones the whole dolphinscheduler source code locally. If you want to install the latest pydolphinscheduler
-package directly and do not care about other code (including the Python gateway service code), you can execute the command
-
-.. code-block:: bash
-
-   # Must escape the '&' character by adding '\' 
-   pip install -e "git+https://github.com/apache/dolphinscheduler.git#egg=apache-dolphinscheduler&subdirectory=dolphinscheduler-python/pydolphinscheduler"
-
-Start Python Gateway Service
-----------------------------
-
-Since **PyDolphinScheduler** is a Python API for `Apache DolphinScheduler`_, it
-can define workflow and task structures, but it cannot run them unless you
-`install Apache DolphinScheduler`_ and start its API server, which includes the
-Python gateway service. We only list some key steps here; you can go to
-`install Apache DolphinScheduler`_ for more detail
-
-.. code-block:: bash
-
-   # Start DolphinScheduler api-server which including python gateway service
-   ./bin/dolphinscheduler-daemon.sh start api-server
-
-To check whether the server is alive or not, you can run :code:`jps`. The
-server is healthy if the keyword `ApiApplicationServer` appears in the console output.
-
-.. code-block:: bash
-
-   jps
-   # ....
-   # 201472 ApiApplicationServer
-   # ....
-
-.. note::
-
-   Please make sure the Python gateway service is enabled and started along with the `api-server`. The configuration is
-   the YAML key `python-gateway.enabled: true` in the api-server's configuration file `api-server/conf/application.yaml`.
-   The default value is true, and the Python gateway service starts when the api-server is started.
-
-Run an Example
---------------
-
-Before running an example for pydolphinscheduler, you should get the example code from its source code. You can run
-a single bash command to get it
-
-.. code-block:: bash
-
-   wget https://raw.githubusercontent.com/apache/dolphinscheduler/dev/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial.py
-
-or you can copy-paste the content from `tutorial source code`_. Then you can run the example in your
-terminal
-
-.. code-block:: bash
-
-   python tutorial.py
-
-If you want to submit your workflow to a remote API server, which means that your workflow script runs on a different
-machine from the API server, you should first change the pydolphinscheduler configuration and then submit the workflow script
-
-.. code-block:: bash
-
-   pydolphinscheduler config --init
-   pydolphinscheduler config --set java_gateway.address <YOUR-API-SERVER-IP-OR-HOSTNAME>
-   python tutorial.py
-
-.. note::
-
-   You can see more information in :doc:`config` about all the configurations pydolphinscheduler supports.
-
-After that, you can go to your DolphinScheduler web UI to find the new workflow created by pydolphinscheduler;
-the path in the web UI is `Project -> Workflow -> Workflow Definition`.
-
-
-What's More
------------
-
-If you are not familiar with *PyDolphinScheduler*, you can go to :doc:`tutorial` and see how it works. But
-if you already know the basic usage or concepts of *PyDolphinScheduler*, you can go and play with all the
-:doc:`tasks/index` *PyDolphinScheduler* supports, or see our :doc:`howto/index` for useful cases.
-
-.. _`instructions for all platforms here`: https://wiki.python.org/moin/BeginnersGuide/Download
-.. _`Apache DolphinScheduler`: https://dolphinscheduler.apache.org
-.. _`install Apache DolphinScheduler`: https://dolphinscheduler.apache.org/en-us/docs/latest/user_doc/guide/installation/standalone.html
-.. _`tutorial source code`: https://raw.githubusercontent.com/apache/dolphinscheduler/dev/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial.py
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/condition.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/condition.rst
deleted file mode 100644
index f6d7e6ad8f..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/condition.rst
+++ /dev/null
@@ -1,40 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Condition
-=========
-
-A condition task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_condition_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.condition
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Condition.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/datax.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/datax.rst
deleted file mode 100644
index cb67a2fa9e..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/datax.rst
+++ /dev/null
@@ -1,46 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-DataX
-=====
-
-A DataX task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_datax_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.datax
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/DataX.yaml
-   :start-after: # under the License.
-   :language: yaml
-
-
-example_datax.json:
-
-.. literalinclude:: ../../../examples/yaml_define/example_datax.json
-   :language: json
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/dependent.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/dependent.rst
deleted file mode 100644
index d8e1599b2d..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/dependent.rst
+++ /dev/null
@@ -1,47 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Dependent
-=========
-
-A dependent task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_dependent_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.dependent
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Dependent.yaml
-   :start-after: # under the License.
-   :language: yaml
-
-Dependent_External.yaml:
-
-.. literalinclude:: ../../../examples/yaml_define/Dependent_External.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/dvc.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/dvc.rst
deleted file mode 100644
index 0127a982f3..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/dvc.rst
+++ /dev/null
@@ -1,41 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-DVC
-===
-
-A DVC task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_dvc_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.dvc
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Dvc.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/flink.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/flink.rst
deleted file mode 100644
index 76eb484718..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/flink.rst
+++ /dev/null
@@ -1,40 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Flink
-=====
-
-A flink task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_flink_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.flink
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Flink.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/func_wrap.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/func_wrap.rst
deleted file mode 100644
index a4a2972933..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/func_wrap.rst
+++ /dev/null
@@ -1,33 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Python Function Wrapper
-=======================
-
-A decorator that converts a Python function into a pydolphinscheduler task.
-
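-A minimal sketch of how the decorator is applied (the function name and body here are
-only illustrative; see the full example below):
-
-.. code-block:: python
-
-   from pydolphinscheduler.tasks.func_wrap import task
-
-   @task
-   def say_hello():
-       # any plain Python function can be wrapped into a task
-       print("hello pydolphinscheduler")
-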
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/tutorial_decorator.py
-   :start-after: [start tutorial]
-   :end-before: [end tutorial]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.func_wrap
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/http.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/http.rst
deleted file mode 100644
index 4e138c9989..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/http.rst
+++ /dev/null
@@ -1,29 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-HTTP
-====
-
-.. automodule:: pydolphinscheduler.tasks.http
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Http.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/index.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/index.rst
deleted file mode 100644
index 3f83f92675..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/index.rst
+++ /dev/null
@@ -1,48 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Tasks
-=====
-
-In this section, we list all task types supported by *PyDolphinScheduler*.
-
-.. toctree::
-   :maxdepth: 1
-   
-   func_wrap
-   shell
-   sql
-   python
-   http
-
-   switch
-   condition
-   dependent
-
-   spark
-   flink
-   map_reduce
-   procedure
-
-   datax
-   sub_process
-
-   sagemaker
-   mlflow
-   openmldb
-   pytorch
-   dvc
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/map_reduce.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/map_reduce.rst
deleted file mode 100644
index 7356880b26..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/map_reduce.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Map Reduce
-==========
-
-
-A Map Reduce task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_map_reduce_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.map_reduce
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/MapReduce.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/mlflow.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/mlflow.rst
deleted file mode 100644
index b83903c26f..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/mlflow.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-MLflow
-=========
-
-
-An MLflow task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_mlflow_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.mlflow
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/mlflow.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/openmldb.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/openmldb.rst
deleted file mode 100644
index 125313dc21..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/openmldb.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-OpenMLDB
-=========
-
-
-An OpenMLDB task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_openmldb_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.openmldb
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/OpenMLDB.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/procedure.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/procedure.rst
deleted file mode 100644
index 2f28efc526..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/procedure.rst
+++ /dev/null
@@ -1,29 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Procedure
-=========
-
-.. automodule:: pydolphinscheduler.tasks.procedure
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Procedure.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/python.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/python.rst
deleted file mode 100644
index 1bf6210018..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/python.rst
+++ /dev/null
@@ -1,29 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Python
-======
-
-.. automodule:: pydolphinscheduler.tasks.python
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Python.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/pytorch.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/pytorch.rst
deleted file mode 100644
index 4c7a5521fb..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/pytorch.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Pytorch
-=======
-
-
-A Pytorch task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_pytorch_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.pytorch
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Pytorch.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sagemaker.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sagemaker.rst
deleted file mode 100644
index 36880d91d2..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sagemaker.rst
+++ /dev/null
@@ -1,46 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-SageMaker
-=========
-
-
-A SageMaker task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_sagemaker_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.sagemaker
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Sagemaker.yaml
-   :start-after: # under the License.
-   :language: yaml
-
-example_sagemaker_params.json:
-
-.. literalinclude:: ../../../examples/yaml_define/example_sagemaker_params.json
-   :language: json
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/shell.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/shell.rst
deleted file mode 100644
index 2dd106a3b8..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/shell.rst
+++ /dev/null
@@ -1,41 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Shell
-=====
-
-A shell task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/tutorial.py
-   :start-after: [start workflow_declare]
-   :end-before: [end task_relation_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.shell
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Shell.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/spark.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/spark.rst
deleted file mode 100644
index d5a51db91a..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/spark.rst
+++ /dev/null
@@ -1,41 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Spark
-=====
-
-A spark task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_spark_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.spark
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Spark.yaml
-   :start-after: # under the License.
-   :language: yaml
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sql.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sql.rst
deleted file mode 100644
index 52df042b74..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sql.rst
+++ /dev/null
@@ -1,35 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-SQL
-===
-
-.. automodule:: pydolphinscheduler.tasks.sql
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Sql.yaml
-   :start-after: # under the License.
-   :language: yaml
-
-example_sql.sql:
-
-.. literalinclude:: ../../../examples/yaml_define/example_sql.sql
-   :start-after: */
-   :language: sql
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sub_process.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sub_process.rst
deleted file mode 100644
index 894dd0fbad..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/sub_process.rst
+++ /dev/null
@@ -1,38 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Sub Process
-===========
-
-.. automodule:: pydolphinscheduler.tasks.sub_process
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/SubProcess.yaml
-   :start-after: # under the License.
-   :language: yaml
-
-
-
-example_subprocess.yaml:
-
-.. literalinclude:: ../../../examples/yaml_define/example_sub_workflow.yaml
-   :start-after: # under the License.
-   :language: yaml
-
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/switch.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/switch.rst
deleted file mode 100644
index 2fef589efb..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tasks/switch.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Switch
-======
-
-A switch task type's example and dive into information of **PyDolphinScheduler**.
-
-Example
--------
-
-.. literalinclude:: ../../../src/pydolphinscheduler/examples/task_switch_example.py
-   :start-after: [start workflow_declare]
-   :end-before: [end workflow_declare]
-
-Dive Into
----------
-
-.. automodule:: pydolphinscheduler.tasks.switch
-
-
-YAML file example
------------------
-
-.. literalinclude:: ../../../examples/yaml_define/Switch.yaml
-   :start-after: # under the License.
-   :language: yaml
-
diff --git a/dolphinscheduler-python/pydolphinscheduler/docs/source/tutorial.rst b/dolphinscheduler-python/pydolphinscheduler/docs/source/tutorial.rst
deleted file mode 100644
index 57d21b2d29..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/docs/source/tutorial.rst
+++ /dev/null
@@ -1,319 +0,0 @@
-.. Licensed to the Apache Software Foundation (ASF) under one
-   or more contributor license agreements.  See the NOTICE file
-   distributed with this work for additional information
-   regarding copyright ownership.  The ASF licenses this file
-   to you under the Apache License, Version 2.0 (the
-   "License"); you may not use this file except in compliance
-   with the License.  You may obtain a copy of the License at
-
-..   http://www.apache.org/licenses/LICENSE-2.0
-
-.. Unless required by applicable law or agreed to in writing,
-   software distributed under the License is distributed on an
-   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-   KIND, either express or implied.  See the License for the
-   specific language governing permissions and limitations
-   under the License.
-
-Tutorial
-========
-
-This tutorial shows you the basic concepts of *PyDolphinScheduler* and everything
-you should know before you submit or run your first workflow. If you have not yet
-installed *PyDolphinScheduler* and started DolphinScheduler, please see
-:ref:`how to get started with PyDolphinScheduler <start:getting started>` first.
-
-Overview of Tutorial
---------------------
-
-Here is an overview of our tutorial. It may look a little complex, but do not
-worry, because we explain this example below in as much detail as possible.
-
-There are three ways to write a workflow: traditional, task decorator, and YAML file.
-
-- **Traditional Way**: More general and supports many :doc:`built-in task types <tasks/index>`; it is convenient
-  when you build your workflow from scratch.
-- **Task Decorator**: A Python decorator that allows you to wrap your function into a pydolphinscheduler task. Less
-  versatile than the traditional way because it only supports Python functions and does not support the built-in
-  task types. But it is helpful if your workflow is built entirely with Python or if you already have some Python
-  workflow code and want to migrate it to pydolphinscheduler.
-- **YAML File**: We can use the pydolphinscheduler CLI to create a process from a YAML file: :code:`pydolphinscheduler yaml -f tutorial.yaml`.
-  We can find more YAML file examples in `examples/yaml_define <https://github.com/apache/dolphinscheduler/tree/dev/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define>`_
-
-.. tab:: Tradition
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-      :dedent: 0
-      :start-after: [start tutorial]
-      :end-before: [end tutorial]
-
-.. tab:: Task Decorator
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial_decorator.py
-      :dedent: 0
-      :start-after: [start tutorial]
-      :end-before: [end tutorial]
-
-.. tab:: YAML File
-
-   .. literalinclude:: ../../examples/yaml_define/tutorial.yaml
-      :start-after: # under the License.
-      :language: yaml
-
-Import Necessary Module
------------------------
-
-First of all, we should import the necessary modules which we will use later, just like with other Python packages.
-
-.. tab:: Tradition
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-      :dedent: 0
-      :start-after: [start package_import]
-      :end-before: [end package_import]
-
-   In the traditional tutorial we import :class:`pydolphinscheduler.core.process_definition.ProcessDefinition` and
-   :class:`pydolphinscheduler.tasks.shell.Shell`.
-
-   If you want to use another task type, you can :doc:`see all tasks we support <tasks/index>`.
-
-.. tab:: Task Decorator
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial_decorator.py
-      :dedent: 0
-      :start-after: [start package_import]
-      :end-before: [end package_import]
-
-   In the task decorator tutorial we import :class:`pydolphinscheduler.core.process_definition.ProcessDefinition` and
-   :func:`pydolphinscheduler.tasks.func_wrap.task`.
-
-Process Definition Declaration
-------------------------------
-
-We should instantiate a :class:`pydolphinscheduler.core.process_definition.ProcessDefinition` object after
-importing it as shown in `import necessary module`_. Here we declare the basic arguments for the process definition (aka workflow).
-We define the name of the :code:`ProcessDefinition` using a `Python context manager`_; the name is **the only required argument**
-for `ProcessDefinition`. Besides that, we also declare the arguments :code:`schedule` and :code:`start_time`,
-which set the workflow schedule interval and schedule start time, and the argument :code:`tenant`, which defines which tenant
-will run this task in the DolphinScheduler worker. See :ref:`section tenant <concept:tenant>` in the
-*PyDolphinScheduler* :doc:`concept` documentation for more information.
-
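-As a minimal sketch (the schedule, start time and tenant values below are only
-illustrative), the declaration looks roughly like this:
-
-.. code-block:: python
-
-   from pydolphinscheduler.core.process_definition import ProcessDefinition
-
-   # name is the only required argument; schedule, start_time and tenant are optional
-   with ProcessDefinition(
-       name="tutorial",
-       schedule="0 0 0 * * ? *",
-       start_time="2021-01-01",
-       tenant="tenant_exists",
-   ) as pd:
-       ...
-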
-.. tab:: Tradition
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-      :dedent: 0
-      :start-after: [start workflow_declare]
-      :end-before: [end workflow_declare]
-
-.. tab:: Task Decorator
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial_decorator.py
-      :dedent: 0
-      :start-after: [start workflow_declare]
-      :end-before: [end workflow_declare]
-
-.. tab:: YAML File
-
-   .. literalinclude:: ../../examples/yaml_define/tutorial.yaml
-      :start-after: # under the License.
-      :end-before: # Define the tasks under the workflow
-      :language: yaml
-
-You can find more detail about :code:`ProcessDefinition` in :ref:`concept about process definition <concept:process definition>`
-if you are interested. All arguments of the process definition object are listed in the
-:class:`pydolphinscheduler.core.process_definition` API documentation.
-
-Task Declaration
-----------------
-
-.. tab:: Tradition
-
-   We declare four tasks to show how to create tasks, and all of them are simple
-   :class:`pydolphinscheduler.tasks.shell` tasks which run the `echo` command in the terminal. Besides the argument
-   `command` with an :code:`echo` command, we also need to set the argument `name` for each task
-   *(not only for shell tasks; `name` is required for every type of task)*.
-   
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-      :dedent: 0
-      :start-after: [start task_declare]
-      :end-before: [end task_declare]
-
-   Besides the shell task, *PyDolphinScheduler* supports many other task types, which you can find in :doc:`tasks/index`.
-
-.. tab:: Task Decorator
-
-   We declare four tasks to show how to create tasks, and all of them are created by the task decorator
-   :func:`pydolphinscheduler.tasks.func_wrap.task`. All we have to do is add the decorator
-   :code:`@task` to an existing Python function, and then use it inside :class:`pydolphinscheduler.core.process_definition`
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial_decorator.py
-      :dedent: 0
-      :start-after: [start task_declare]
-      :end-before: [end task_declare]
-
-   It makes our workflow more Pythonic, but be careful: when we use task decorator mode we can only use
-   Python functions as tasks and cannot use the :doc:`built-in tasks <tasks/index>` in most cases.
-
-.. tab:: YAML File
-
-   .. literalinclude:: ../../examples/yaml_define/tutorial.yaml
-      :start-after: # Define the tasks under the workflow 
-      :language: yaml
-
-Setting Task Dependence
------------------------
-
-After we declare both the process definition and the tasks, we have four tasks that are independent and will run
-in parallel. If you want one task to start only after another task has finished, you have to set dependence on those
-tasks.
-
-Setting task dependence is quite easy via the task attributes :code:`set_downstream` and :code:`set_upstream`, or via the
-bitwise operators :code:`>>` and :code:`<<`.
-
-In this tutorial, task `task_parent` is the leading task of the whole workflow, and tasks `task_child_one` and
-`task_child_two` are its downstream tasks. Task `task_union` will not run until both `task_child_one`
-and `task_child_two` are done, because both tasks are `task_union`'s upstream.
-
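-For instance, a minimal sketch of the dependence described above, using the bitwise
-operators (the attribute form :code:`set_downstream` / :code:`set_upstream` is equivalent):
-
-.. code-block:: python
-
-   # task_parent leads the workflow; its two children run in parallel after it,
-   # and task_union runs only after both children are done
-   task_parent >> task_child_one
-   task_parent >> task_child_two
-   task_child_one >> task_union
-   task_child_two >> task_union
-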
-.. tab:: Tradition
-   
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-      :dedent: 0
-      :start-after: [start task_relation_declare]
-      :end-before: [end task_relation_declare]
-
-.. tab:: Task Decorator
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial_decorator.py
-      :dedent: 0
-      :start-after: [start task_relation_declare]
-      :end-before: [end task_relation_declare]
-
-.. tab:: YAML File
-
-   We can use :code:`deps:[]` to set task dependence
-
-   .. literalinclude:: ../../examples/yaml_define/tutorial.yaml
-      :start-after: # Define the tasks under the workflow 
-      :language: yaml
-
-.. note::
-
-   We can set task dependence in batch mode if tasks share the same downstream or upstream by declaring those
-   tasks as a task group. In the tutorial, we declare tasks `task_child_one` and `task_child_two` as a task group named
-   `task_group`, then set `task_group` as a downstream of task `task_parent`. See
-   :ref:`concept:Tasks Dependence` for more detail about how to set task dependence.
-
-Submit Or Run Workflow
-----------------------
-
-At this point we have finished our workflow definition, with four tasks and their dependence, but all of this is
-still local; we should let the DolphinScheduler daemon know about the workflow definition. So the last thing we
-have to do is submit the workflow to the DolphinScheduler daemon.
-
-Fortunately, we have a convenient way to submit the workflow via the `ProcessDefinition` attribute :code:`run`, which
-creates the workflow definition as well as the workflow schedule.
-
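-A minimal sketch (assuming the context manager variable from the declaration above is
-named :code:`pd`):
-
-.. code-block:: python
-
-   # inside the `with ProcessDefinition(...) as pd:` block
-   pd.run()      # create the workflow definition as well as its schedule
-   # pd.submit() # or only submit the definition, without setting schedule information
-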
-.. tab:: Tradition
-   
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-      :dedent: 0
-      :start-after: [start submit_or_run]
-      :end-before: [end submit_or_run]
-
-.. tab:: Task Decorator
-
-   .. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial_decorator.py
-      :dedent: 0
-      :start-after: [start submit_or_run]
-      :end-before: [end submit_or_run]
-
-.. tab:: YAML File
-
-   The pydolphinscheduler YAML CLI always submits the workflow. We can also run the workflow if we set :code:`run: true`
-
-   .. code-block:: yaml
-
-     # Define the workflow
-     workflow:
-       name: "tutorial"
-       run: true
-
-Finally, we can execute this workflow code in the terminal like any other Python script, running
-:code:`python tutorial.py` to trigger and execute it.
-
-.. note::
-
-   If you have not started your DolphinScheduler API server, you can find how to start it in
-   :ref:`start:start Python gateway service`. Besides the attribute :code:`run`, `ProcessDefinition` also has the attribute
-   :code:`submit`, which just submits the workflow to the daemon but does not set
-   the workflow schedule information. For more detail, see :ref:`concept:process definition`.
-
-DAG Graph After Tutorial Run
-----------------------------
-
-After we run the tutorial code, you can log in to the DolphinScheduler web UI and go to the
-`DolphinScheduler project page`_. There is a new process definition created by *PyDolphinScheduler*,
-named "tutorial" or "tutorial_decorator". The task graph of the workflow looks like below:
-
-.. literalinclude:: ../../src/pydolphinscheduler/examples/tutorial.py
-   :language: text
-   :lines: 24-28
-
-Create Process Using YAML File
-------------------------------
-
-We can use the pydolphinscheduler CLI to create a process from a YAML file:
-
-.. code-block:: bash
-
-   pydolphinscheduler yaml -f Shell.yaml
-
-We can use the following four special grammars to define workflows more flexibly.
-
-- :code:`$FILE{"file_name"}`: Read the contents of the file (:code:`file_name`) and place them at that location.
-- :code:`$WORKFLOW{"other_workflow.yaml"}`: Refer to another process defined in a YAML file (:code:`other_workflow.yaml`) and place that process name at this location.
-- :code:`$ENV{env_name}`: Read the environment variable (:code:`env_name`) and place its value at that location.
-- :code:`${CONFIG.key_name}`: Read the configuration value of the key (:code:`key_name`) and place it at that location.
-
-
-In addition, when loading a file path with :code:`$FILE{"file_name"}` or :code:`$WORKFLOW{"other_workflow.yaml"}`, pydolphinscheduler will also search in the directory of the YAML file if the file does not exist at the given path.
-
-For example, our file directory structure is as follows:
-
-.. code-block:: bash
-
-   .
-   └── yaml_define
-       ├── Condition.yaml
-       ├── DataX.yaml
-       ├── Dependent_External.yaml
-       ├── Dependent.yaml
-       ├── example_datax.json
-       ├── example_sql.sql
-       ├── example_subprocess.yaml
-       ├── Flink.yaml
-       ├── Http.yaml
-       ├── MapReduce.yaml
-       ├── MoreConfiguration.yaml
-       ├── Procedure.yaml
-       ├── Python.yaml
-       ├── Shell.yaml
-       ├── Spark.yaml
-       ├── Sql.yaml
-       ├── SubProcess.yaml
-       └── Switch.yaml
-
-After we run
-
-.. code-block:: bash
-
-   pydolphinscheduler yaml -f yaml_define/SubProcess.yaml
-
-
-the :code:`$WORKFLOW{"example_sub_workflow.yaml"}` will be resolved to :code:`$WORKFLOW{"yaml_define/example_sub_workflow.yaml"}`, because :code:`./example_sub_workflow.yaml` does not exist while :code:`yaml_define/example_sub_workflow.yaml` does.
-
-Furthermore, this feature supports recursion all the way down.
-
-
-.. _`DolphinScheduler project page`: https://dolphinscheduler.apache.org/en-us/docs/latest/user_doc/guide/project.html
-.. _`Python context manager`: https://docs.python.org/3/library/stdtypes.html#context-manager-types
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Condition.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Condition.yaml
deleted file mode 100644
index c65b8c7aeb..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Condition.yaml
+++ /dev/null
@@ -1,43 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Condition"
-
-# Define the tasks under the workflow
-tasks:
-  - { "task_type": "Shell", "name": "pre_task_1", "command": "echo pre_task_1" }
-  - { "task_type": "Shell", "name": "pre_task_2", "command": "echo pre_task_2" }
-  - { "task_type": "Shell", "name": "pre_task_3", "command": "echo pre_task_3" }
-  - { "task_type": "Shell", "name": "success_branch", "command": "echo success_branch" }
-  - { "task_type": "Shell", "name": "fail_branch", "command": "echo fail_branch" }
-
-  - name: condition
-    task_type: Condition
-    success_task: success_branch
-    failed_task: fail_branch
-    op: AND
-    groups:
-      - op: AND
-        groups:
-          - task: pre_task_1
-            flag: true
-          - task: pre_task_2
-            flag: true
-          - task: pre_task_3
-            flag: false
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/DataX.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/DataX.yaml
deleted file mode 100644
index 00ecd54685..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/DataX.yaml
+++ /dev/null
@@ -1,33 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "DataX"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task
-    task_type: DataX
-    datasource_name: db
-    datatarget_name: db
-    sql: show tables;
-    target_table: table_test
-
-  - name: task_custon_config
-    task_type: CustomDataX
-    json: $FILE{"example_datax.json"}
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dependent.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dependent.yaml
deleted file mode 100644
index d69fac05da..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dependent.yaml
+++ /dev/null
@@ -1,76 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-workflow:
-  name: "Dependent"
-
-# Define the tasks under the workflow
-tasks:
-  - name: dependent
-    task_type: Dependent
-    denpendence:
-    op: and
-    groups:
-      - op: or
-        groups:
-          - project_name: pydolphin
-            process_definition_name: task_dependent_external
-            dependent_task_name: task_1
-
-          - project_name: pydolphin
-            process_definition_name: task_dependent_external
-            dependent_task_name: task_2
-
-      - op: and
-        groups:
-          - project_name: pydolphin
-            process_definition_name: task_dependent_external
-            dependent_task_name: task_1
-            dependent_date: LAST_WEDNESDAY 
-
-          - project_name: pydolphin
-            process_definition_name: task_dependent_external
-            dependent_task_name: task_2
-            dependent_date: last24Hours 
-
-  - name: dependent_var
-    task_type: Dependent
-    denpendence:
-    op: and
-    groups:
-      - op: or
-        # we can use ${CONFIG.WORKFLOW_PROJECT} to set the value to configuration.WORKFLOW_PROJECT
-        # we can use $WORKFLOW{"Dependent_External.yaml"} to create or update a workflow from dependent_external.yaml and set the value to that workflow name
-        groups:
-          - project_name: ${CONFIG.WORKFLOW_PROJECT} 
-            process_definition_name: $WORKFLOW{"Dependent_External.yaml"} 
-            dependent_task_name: task_1
-
-          - project_name: ${CONFIG.WORKFLOW_PROJECT} 
-            process_definition_name: $WORKFLOW{"Dependent_External.yaml"} 
-            dependent_task_name: task_2
-      - op: and
-        groups:
-          - project_name: ${CONFIG.WORKFLOW_PROJECT} 
-            process_definition_name: $WORKFLOW{"Dependent_External.yaml"} 
-            dependent_task_name: task_1
-            dependent_date: LAST_WEDNESDAY 
-
-          - project_name: ${CONFIG.WORKFLOW_PROJECT} 
-            process_definition_name: $WORKFLOW{"Dependent_External.yaml"} 
-            dependent_task_name: task_2
-            dependent_date: last24Hours 
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dependent_External.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dependent_External.yaml
deleted file mode 100644
index 577ff6a807..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dependent_External.yaml
+++ /dev/null
@@ -1,26 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "task_dependent_external"
-
-# Define the tasks under the workflow
-tasks:
-  - { "task_type": "Shell", "name": "task_1", "command": "echo task 1" }
-  - { "task_type": "Shell", "name": "task_2", "command": "echo task 2" }
-  - { "task_type": "Shell", "name": "task_3", "command": "echo task 3" }
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dvc.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dvc.yaml
deleted file mode 100644
index a6ec18c372..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Dvc.yaml
+++ /dev/null
@@ -1,46 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define variable `repository`
-repository: &repository "git@github.com:<YOUR-NAME-OR-ORG>/dvc-data-repository-example.git" 
-
-# Define the workflow
-workflow:
-  name: "DVC"
-  release_state: "offline"
-
-# Define the tasks under the process
-tasks:
-  - name: init_dvc 
-    task_type: DVCInit
-    repository: *repository
-    store_url: ~/dvc_data
-
-  - name: upload_data
-    task_type: DVCUpload
-    repository: *repository
-    data_path_in_dvc_repository: "iris"
-    data_path_in_worker: ~/source/iris
-    version: v1
-    message: upload iris data v1
-
-  - name: download_data
-    task_type: DVCDownload
-    repository: *repository
-    data_path_in_dvc_repository: "iris"
-    data_path_in_worker: ~/target/iris
-    version: v1
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Flink.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Flink.yaml
deleted file mode 100644
index 2449d435a3..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Flink.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Flink"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task
-    task_type: Flink
-    main_class: org.apache.flink.streaming.examples.wordcount.WordCount
-    main_package: test_java.jar
-    program_type: JAVA
-    deploy_mode: local
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Http.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Http.yaml
deleted file mode 100644
index 1483aeb3d8..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Http.yaml
+++ /dev/null
@@ -1,37 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Http"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task
-    task_type: Http
-    url: "https://httpbin.org/get"
-    http_method: "GET"
-    http_params:
-      - { "prop": "a", "httpParametersType": "PARAMETER", "value": "1" }
-      - { "prop": "b", "httpParametersType": "PARAMETER", "value": "2" }
-      - {
-          "prop": "Content-Type",
-          "httpParametersType": "header",
-          "value": "test",
-        }
-    http_check_condition: "STATUS_CODE_CUSTOM"
-    condition: "404"
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/MapReduce.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/MapReduce.yaml
deleted file mode 100644
index e1a2b5709c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/MapReduce.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "MapReduce"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task
-    task_type: MR
-    main_class: wordcount
-    main_package: test_java.jar
-    program_type: SCALA
-    main_args: /dolphinscheduler/tenant_exists/resources/file.txt /output/ds
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/MoreConfiguration.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/MoreConfiguration.yaml
deleted file mode 100644
index 258aa33433..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/MoreConfiguration.yaml
+++ /dev/null
@@ -1,40 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "MoreConfiguration"
-  param:
-    n: 1
-
-# Define the tasks under the workflow
-tasks:
-  - name: shell_0
-    task_type: Shell
-    description: "yaml define task"
-    flag: "YES"
-    command: |
-      echo "$ENV{HOME}"
-      echo "${n}"
-    task_priority: "HIGH"
-    delay_time: 20
-    fail_retry_times: 30
-    fail_retry_interval: 5
-    timeout_flag: "CLOSE"
-    timeout: 60
-    local_params:
-      - { "prop": "n", "direct": "IN", "type": "VARCHAR", "value": "${n}" }
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/OpenMLDB.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/OpenMLDB.yaml
deleted file mode 100644
index b455cb0768..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/OpenMLDB.yaml
+++ /dev/null
@@ -1,33 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "OpenMLDB"
-
-# Define the tasks under the workflow
-tasks:
-  - name: OpenMLDB
-    task_type: OpenMLDB
-    zookeeper: "127.0.0.1:2181"
-    zookeeper_path: "/openmldb"
-    execute_mode: "online"
-    sql: |
-      USE demo_db;
-      set @@job_timeout=200000;
-      LOAD DATA INFILE 'file:///tmp/train_sample.csv'
-      INTO TABLE talkingdata OPTIONS(mode='overwrite');
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Procedure.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Procedure.yaml
deleted file mode 100644
index 829a961c1a..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Procedure.yaml
+++ /dev/null
@@ -1,27 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Procedure"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task
-    task_type: Procedure
-    datasource_name: db
-    method: show tables;
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Python.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Python.yaml
deleted file mode 100644
index 728b5c928e..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Python.yaml
+++ /dev/null
@@ -1,30 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Python"
-
-# Define the tasks under the workflow
-tasks:
-  - name: python
-    task_type: Python
-    definition: |
-      import os
-      print(os)
-      print("1")
-      print("2")
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Pytorch.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Pytorch.yaml
deleted file mode 100644
index 8706824245..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Pytorch.yaml
+++ /dev/null
@@ -1,53 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Pytorch"
-
-# Define the tasks under the workflow
-tasks:
-
-  # run project with existing environment
-  - name: task_existing_env
-    task_type: pytorch
-    script: main.py
-    script_params: --dry-run --no-cuda
-    project_path: https://github.com/pytorch/examples#mnist
-    python_command: /home/anaconda3/envs/pytorch/bin/python3
-
-
-  # run project with creating conda environment
-  - name: task_conda_env
-    task_type: pytorch
-    script: main.py
-    script_params: --dry-run --no-cuda
-    project_path: https://github.com/pytorch/examples#mnist
-    is_create_environment: True
-    python_env_tool: conda
-    requirements: requirements.txt
-    conda_python_version: 3.7
-
-  # run project with creating virtualenv environment
-  - name: task_virtualenv_env
-    task_type: pytorch
-    script: main.py
-    script_params: --dry-run --no-cuda
-    project_path: https://github.com/pytorch/examples#mnist
-    is_create_environment: True
-    python_env_tool: virtualenv
-    requirements: requirements.txt
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Sagemaker.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Sagemaker.yaml
deleted file mode 100644
index 9f77a3caa8..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Sagemaker.yaml
+++ /dev/null
@@ -1,28 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Sagemaker"
-  release_state: "offline"
-
-# Define the tasks under the process
-tasks:
-  - name: sagemaker
-    task_type: Sagemaker
-    sagemaker_request_json: $FILE{"example_sagemaker_params.json"}
-
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Shell.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Shell.yaml
deleted file mode 100644
index fdbe126327..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Shell.yaml
+++ /dev/null
@@ -1,40 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Shell"
-  release_state: "offline"
-  run: true
-
-# Define the tasks under the process
-tasks:
-  - name: task_parent
-    task_type: Shell
-    command: |
-      echo hello pydolphinscheduler 
-      echo run task parent
-
-  - name: task_child_one
-    task_type: Shell
-    deps: [task_parent]
-    command: echo "child one"
-
-  - name: task_child_two
-    task_type: Shell
-    deps: [task_parent]
-    command: echo "child two"
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Spark.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Spark.yaml
deleted file mode 100644
index e45514bbf1..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Spark.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Spark"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task
-    task_type: Spark
-    main_class: org.apache.spark.examples.SparkPi
-    main_package: test_java.jar
-    program_type: SCALA
-    deploy_mode: local
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Sql.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Sql.yaml
deleted file mode 100644
index c3c7e88ee1..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Sql.yaml
+++ /dev/null
@@ -1,45 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Sql"
-
-# Define the tasks under the workflow
-tasks:
-  - name: task_base
-    task_type: Sql
-    datasource_name: "db"
-    sql: show tables;
-
-  - name: task_multi_line
-    task_type: Sql
-    datasource_name: "db"
-    sql: |
-      show tables;
-      select id from version where id=1;
-
-  - name: task_file
-    task_type: Sql
-    datasource_name: "db"
-    sql: $FILE{"example_sql.sql"}
-
-  # Or you can define task "task_union" it with one line
-  - { "task_type": "Sql", "name": "task_base_one_line", "datasource_name": "db", "sql": "select id from version where id=1;"}
-
-  # Or you can define task "task_union" it with one line
-  - { "task_type": "Sql", "name": "task_file_one_line", "datasource_name": "db", "sql": '$FILE{"example_sql.sql"}'}
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/SubProcess.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/SubProcess.yaml
deleted file mode 100644
index 0ea7549db4..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/SubProcess.yaml
+++ /dev/null
@@ -1,27 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "SubWorkflow"
-
-tasks:
-  - name: example_workflow
-    task_type: SubProcess
-    process_definition_name: $WORKFLOW{"example_sub_workflow.yaml"}
-
-  - { "task_type": "Shell", "deps": [example_workflow], "name": "task_3", "command": "echo task 3" }
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Switch.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Switch.yaml
deleted file mode 100644
index 33ed68813e..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/Switch.yaml
+++ /dev/null
@@ -1,39 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "Switch"
-  param:
-    var: 1
-
-# Define the tasks under the workflow
-tasks:
-  - name: switch_child_1
-    task_type: Shell
-    command: echo switch_child_1
-
-  - name: switch_child_2
-    task_type: Shell
-    command: echo switch_child_2
-
-  - name: switch
-    task_type: Switch
-    condition:
-      - task: switch_child_1
-        condition: "${var} > 1"
-      - task: switch_child_2
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_datax.json b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_datax.json
deleted file mode 100644
index 3db8092cb6..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_datax.json
+++ /dev/null
@@ -1,62 +0,0 @@
-{
-  "job": {
-    "content": [
-      {
-        "reader": {
-          "name": "mysqlreader",
-          "parameter": {
-            "username": "usr",
-            "password": "pwd",
-            "column": [
-              "id",
-              "name",
-              "code",
-              "description"
-            ],
-            "splitPk": "id",
-            "connection": [
-              {
-                "table": [
-                  "source_table"
-                ],
-                "jdbcUrl": [
-                  "jdbc:mysql://127.0.0.1:3306/source_db"
-                ]
-              }
-            ]
-          }
-        },
-        "writer": {
-          "name": "mysqlwriter",
-          "parameter": {
-            "writeMode": "insert",
-            "username": "usr",
-            "password": "pwd",
-            "column": [
-              "id",
-              "name"
-            ],
-            "connection": [
-              {
-                "jdbcUrl": "jdbc:mysql://127.0.0.1:3306/target_db",
-                "table": [
-                  "target_table"
-                ]
-              }
-            ]
-          }
-        }
-      }
-    ],
-    "setting": {
-      "errorLimit": {
-        "percentage": 0,
-        "record": 0
-      },
-      "speed": {
-        "channel": 1,
-        "record": 1000
-      }
-    }
-  }
-}
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sagemaker_params.json b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sagemaker_params.json
deleted file mode 100644
index 9403320355..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sagemaker_params.json
+++ /dev/null
@@ -1,18 +0,0 @@
-{
-    "ParallelismConfiguration":{
-        "MaxParallelExecutionSteps":1
-    },
-    "PipelineExecutionDescription":"run pipeline using ds",
-    "PipelineExecutionDisplayName":"ds-sagemaker-pipeline",
-    "PipelineName":"DsSagemakerPipeline",
-    "PipelineParameters":[
-        {
-            "Name":"InputData",
-            "Value": "s3://sagemaker/dataset/dataset.csv"
-        },
-        {
-            "Name":"InferenceData",
-            "Value": "s3://sagemaker/dataset/inference.csv"
-        }
-    ]
-}
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sql.sql b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sql.sql
deleted file mode 100644
index 06b5c4c16c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sql.sql
+++ /dev/null
@@ -1,22 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
-*/
-
-select id from version where id=1;
-select id from version where id=2;
-select id from version where id=3;
-select id from version where id=4;
-select id from version where id=5;
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sub_workflow.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sub_workflow.yaml
deleted file mode 100644
index af3a863da9..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/example_sub_workflow.yaml
+++ /dev/null
@@ -1,26 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "example_workflow_for_sub_workflow"
-
-# Define the tasks under the workflow
-tasks:
-  - { "task_type": "Shell", "name": "task_1", "command": "echo task 1" }
-  - { "task_type": "Shell", "deps": [task_1], "name": "task_2", "command": "echo task 2" }
-  - { "task_type": "Shell", "deps": [task_2], "name": "task_3", "command": "echo task 3" }
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/mlflow.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/mlflow.yaml
deleted file mode 100644
index 45e56726e1..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/mlflow.yaml
+++ /dev/null
@@ -1,69 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-
-# Define variable `mlflow_tracking_uri`
-mlflow_tracking_uri: &mlflow_tracking_uri "http://127.0.0.1:5000" 
-
-# Define the workflow
-workflow:
-  name: "MLflow"
-
-# Define the tasks under the workflow
-tasks:
-  - name: train_xgboost_native
-    task_type: MLFlowProjectsCustom 
-    repository: https://github.com/mlflow/mlflow#examples/xgboost/xgboost_native
-    mlflow_tracking_uri: *mlflow_tracking_uri
-    parameters: -P learning_rate=0.2 -P colsample_bytree=0.8 -P subsample=0.9
-    experiment_name: xgboost
-
-  - name: train_automl
-    task_type: MLFlowProjectsAutoML 
-    mlflow_tracking_uri: *mlflow_tracking_uri
-    parameters: time_budget=30;estimator_list=['lgbm']
-    experiment_name: automl_iris
-    model_name: iris_A
-    automl_tool: flaml
-    data_path: /data/examples/iris
-
-  - name: deploy_docker
-    task_type: MLflowModels 
-    deps: [train_automl]
-    model_uri: models:/iris_A/Production
-    mlflow_tracking_uri: *mlflow_tracking_uri
-    deploy_mode: DOCKER
-    port: 7002
-
-  - name: train_basic_algorithm
-    task_type: MLFlowProjectsBasicAlgorithm 
-    mlflow_tracking_uri: *mlflow_tracking_uri
-    parameters: n_estimators=200;learning_rate=0.2
-    experiment_name: basic_algorithm_iris
-    model_name: iris_B
-    algorithm: lightgbm
-    data_path: /data/examples/iris
-    search_params: max_depth=[5, 10];n_estimators=[100, 200]
-
-  - name: deploy_mlflow
-    deps: [train_basic_algorithm]
-    task_type: MLflowModels
-    model_uri: models:/iris_B/Production
-    mlflow_tracking_uri: *mlflow_tracking_uri
-    deploy_mode: MLFLOW
-    port: 7001
-
diff --git a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/tutorial.yaml b/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/tutorial.yaml
deleted file mode 100644
index 104a8c367b..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/examples/yaml_define/tutorial.yaml
+++ /dev/null
@@ -1,46 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Define the workflow
-workflow:
-  name: "tutorial"
-  schedule: "0 0 0 * * ? *"
-  start_time: "2021-01-01"
-  tenant: "tenant_exists"
-  release_state: "offline"
-  run: true
-
-# Define the tasks under the workflow
-tasks:
-  - name: task_parent
-    task_type: Shell
-    command: echo hello pydolphinscheduler
-
-  - name: task_child_one
-    task_type: Shell
-    deps: [task_parent]
-    command: echo "child one"
-
-  - name: task_child_two
-    task_type: Shell
-    deps: [task_parent]
-    command: echo "child two"
-
-  - name: task_union
-    task_type: Shell
-    deps: [task_child_one, task_child_two]
-    command: echo "union"
diff --git a/dolphinscheduler-python/pydolphinscheduler/pytest.ini b/dolphinscheduler-python/pydolphinscheduler/pytest.ini
deleted file mode 100644
index b1aa850346..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/pytest.ini
+++ /dev/null
@@ -1,21 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-[pytest]
-# add path here to skip pytest scan it
-norecursedirs =
-    tests/testing
-    # Integration tests run separately and do not count toward coverage; they run in `tox -e integrate-test`
-    tests/integration
diff --git a/dolphinscheduler-python/pydolphinscheduler/setup.cfg b/dolphinscheduler-python/pydolphinscheduler/setup.cfg
deleted file mode 100644
index 13a83393a9..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/setup.cfg
+++ /dev/null
@@ -1,16 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
diff --git a/dolphinscheduler-python/pydolphinscheduler/setup.py b/dolphinscheduler-python/pydolphinscheduler/setup.py
deleted file mode 100644
index 66a1ffc86c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/setup.py
+++ /dev/null
@@ -1,198 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""The script for setting up pydolphinscheduler."""
-import logging
-import os
-import sys
-from distutils.dir_util import remove_tree
-from os.path import dirname, join
-from typing import List
-
-from setuptools import Command, find_packages, setup
-
-if sys.version_info[0] < 3:
-    raise Exception(
-        "pydolphinscheduler does not support Python 2. Please upgrade to Python 3."
-    )
-
-logger = logging.getLogger(__name__)
-
-version = "dev"
-
-# Start package required
-prod = [
-    "boto3>=1.23.10",
-    "oss2>=2.16.0",
-    "python-gitlab>=2.10.1",
-    "click>=8.0.0",
-    "py4j~=0.10",
-    "ruamel.yaml",
-]
-
-build = [
-    "build",
-    "setuptools>=42",
-    "wheel",
-]
-
-doc = [
-    "sphinx>=4.3",
-    "sphinx_rtd_theme>=1.0",
-    "sphinx-click>=3.0",
-    "sphinx-inline-tabs",
-    "sphinx-copybutton>=0.4.0",
-    # The unreleased package has a feature we want (use the correct package version for API refs), so we install from
-    # GitHub directly, see also:
-    # https://github.com/Holzhaus/sphinx-multiversion/issues/42#issuecomment-1210539786
-    "sphinx-multiversion @ git+https://github.com/Holzhaus/sphinx-multiversion#egg=sphinx-multiversion",
-]
-
-test = [
-    "pytest>=6.2",
-    "freezegun>=1.1",
-    "coverage>=6.1",
-    "pytest-cov>=3.0",
-    "docker>=5.0.3",
-]
-
-style = [
-    "flake8>=4.0",
-    "flake8-docstrings>=1.6",
-    "flake8-black>=0.2",
-    "isort>=5.10",
-    "autoflake>=1.4",
-]
-
-dev = style + test + doc + build
-
-all_dep = prod + dev
-# End package required
-
-
-def read(*names, **kwargs):
-    """Read file content from given file path."""
-    return open(
-        join(dirname(__file__), *names), encoding=kwargs.get("encoding", "utf8")
-    ).read()
-
-
-class CleanCommand(Command):
-    """Command to clean up python api before setup by running `python setup.py pre_clean`."""
-
-    description = "Clean up project root"
-    user_options: List[str] = []
-    clean_list = [
-        "build",
-        "htmlcov",
-        "dist",
-        ".pytest_cache",
-        ".coverage",
-    ]
-
-    def initialize_options(self) -> None:
-        """Set default values for options."""
-
-    def finalize_options(self) -> None:
-        """Set final values for options."""
-
-    def run(self) -> None:
-        """Run and remove temporary files."""
-        for cl in self.clean_list:
-            if not os.path.exists(cl):
-                logger.info("Path %s do not exists.", cl)
-            elif os.path.isdir(cl):
-                remove_tree(cl)
-            else:
-                os.remove(cl)
-        logger.info("Finish pre_clean process.")
-
-
-setup(
-    name="apache-dolphinscheduler",
-    version=version,
-    license="Apache License 2.0",
-    description="Apache DolphinScheduler Python API",
-    long_description=read("README.md"),
-    # Make sure pypi is expecting markdown
-    long_description_content_type="text/markdown",
-    author="Apache Software Foundation",
-    author_email="dev@dolphinscheduler.apache.org",
-    url="https://dolphinscheduler.apache.org/",
-    python_requires=">=3.6",
-    keywords=[
-        "dolphinscheduler",
-        "workflow",
-        "scheduler",
-        "taskflow",
-    ],
-    project_urls={
-        "Homepage": "https://dolphinscheduler.apache.org",
-        "Documentation": "https://dolphinscheduler.apache.org/python/dev/index.html",
-        "Source": "https://github.com/apache/dolphinscheduler/tree/dev/dolphinscheduler-python/"
-        "pydolphinscheduler",
-        "Issue Tracker": "https://github.com/apache/dolphinscheduler/issues?"
-        "q=is%3Aissue+is%3Aopen+label%3APython",
-        "Discussion": "https://github.com/apache/dolphinscheduler/discussions",
-        "Twitter": "https://twitter.com/dolphinschedule",
-    },
-    packages=find_packages(where="src"),
-    package_dir={"": "src"},
-    include_package_data=True,
-    package_data={
-        "pydolphinscheduler": ["default_config.yaml"],
-    },
-    platforms=["any"],
-    classifiers=[
-        # complete classifier list: http://pypi.python.org/pypi?%3Aaction=list_classifiers
-        "Development Status :: 4 - Beta",
-        "Environment :: Console",
-        "Intended Audience :: Developers",
-        "License :: OSI Approved :: Apache Software License",
-        "Operating System :: Unix",
-        "Operating System :: POSIX",
-        "Operating System :: Microsoft :: Windows",
-        "Programming Language :: Python",
-        "Programming Language :: Python :: 3",
-        "Programming Language :: Python :: 3.6",
-        "Programming Language :: Python :: 3.7",
-        "Programming Language :: Python :: 3.8",
-        "Programming Language :: Python :: 3.9",
-        "Programming Language :: Python :: 3.10",
-        "Programming Language :: Python :: 3.11",
-        "Programming Language :: Python :: Implementation :: CPython",
-        "Programming Language :: Python :: Implementation :: PyPy",
-        "Topic :: Software Development :: User Interfaces",
-    ],
-    install_requires=prod,
-    extras_require={
-        "all": all_dep,
-        "dev": dev,
-        "style": style,
-        "test": test,
-        "doc": doc,
-        "build": build,
-    },
-    cmdclass={
-        "pre_clean": CleanCommand,
-    },
-    entry_points={
-        "console_scripts": [
-            "pydolphinscheduler = pydolphinscheduler.cli.commands:cli",
-        ],
-    },
-)
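
The optional dependency groups declared above (style, test, doc, build, dev, all) are exposed as pip extras of the apache-dolphinscheduler distribution. A minimal sketch of installing one of them from Python, assuming pip is available in the running interpreter:

    # Install the package together with its "dev" extra; equivalent to
    # running `pip install apache-dolphinscheduler[dev]` on the command line.
    import subprocess
    import sys

    subprocess.check_call(
        [sys.executable, "-m", "pip", "install", "apache-dolphinscheduler[dev]"]
    )
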
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/__init__.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/__init__.py
deleted file mode 100644
index 2a7b55430c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/__init__.py
+++ /dev/null
@@ -1,22 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Init root of pydolphinscheduler."""
-
-from pkg_resources import get_distribution
-
-__version__ = get_distribution("apache-dolphinscheduler").version
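
Side note: on Python 3.8+ the same version lookup can be done with the standard library instead of pkg_resources, which is slow to import and deprecated by recent setuptools. A possible alternative, shown only as a sketch and not what the removed file does:

    # Standard-library equivalent of the pkg_resources lookup above (Python 3.8+).
    from importlib.metadata import version

    __version__ = version("apache-dolphinscheduler")
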
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/cli/__init__.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/cli/__init__.py
deleted file mode 100644
index 5f30c83241..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/cli/__init__.py
+++ /dev/null
@@ -1,18 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Commands line interface of pydolphinscheduler."""
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/cli/commands.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/cli/commands.py
deleted file mode 100644
index 8d923f1406..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/cli/commands.py
+++ /dev/null
@@ -1,106 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Commands line interface's command of pydolphinscheduler."""
-
-import click
-from click import echo
-
-import pydolphinscheduler
-from pydolphinscheduler.configuration import (
-    get_single_config,
-    init_config_file,
-    set_single_config,
-)
-from pydolphinscheduler.core.yaml_process_define import create_process_definition
-
-version_option_val = ["major", "minor", "micro"]
-
-
-@click.group()
-def cli():
-    """Apache DolphinScheduler Python API's command line interface."""
-
-
-@cli.command()
-@click.option(
-    "--part",
-    "-p",
-    required=False,
-    type=click.Choice(version_option_val, case_sensitive=False),
-    multiple=False,
-    help="The part of version your want to get.",
-)
-def version(part: str) -> None:
-    """Show current version of pydolphinscheduler."""
-    if part:
-        idx = version_option_val.index(part)
-        echo(f"{pydolphinscheduler.__version__.split('.')[idx]}")
-    else:
-        echo(f"{pydolphinscheduler.__version__}")
-
-
-@cli.command()
-@click.option(
-    "--init",
-    "-i",
-    is_flag=True,
-    help="Initialize and create configuration file to `PYDS_HOME`.",
-)
-@click.option(
-    "--set",
-    "-s",
-    "setter",
-    multiple=True,
-    type=click.Tuple([str, str]),
-    help="Set specific setting to config file."
-    "Use multiple ``--set <KEY> <VAL>`` options to set multiple configs",
-)
-@click.option(
-    "--get",
-    "-g",
-    "getter",
-    multiple=True,
-    type=str,
-    help="Get specific setting from config file."
-    "Use multiple ``--get <KEY>`` options to get multiple configs",
-)
-def config(getter, setter, init) -> None:
-    """Manage the configuration for pydolphinscheduler."""
-    if init:
-        init_config_file()
-    elif getter:
-        click.echo("The configuration query as below:\n")
-        configs_kv = [f"{key} = {get_single_config(key)}" for key in getter]
-        click.echo("\n".join(configs_kv))
-    elif setter:
-        for key, val in setter:
-            set_single_config(key, val)
-        click.echo("Set configuration done.")
-
-
-@cli.command()
-@click.option(
-    "--yaml_file",
-    "-f",
-    required=True,
-    help="YAML file path",
-    type=click.Path(exists=True),
-)
-def yaml(yaml_file) -> None:
-    """Create process definition using YAML file."""
-    create_process_definition(yaml_file)
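
The ``version`` sub-command above only reads package metadata, so it can be exercised without a running DolphinScheduler backend. A minimal sketch using click's test runner, assuming the package is installed:

    # Invoke the `version` sub-command in-process via click's CliRunner.
    from click.testing import CliRunner

    from pydolphinscheduler.cli.commands import cli

    runner = CliRunner()
    result = runner.invoke(cli, ["version", "--part", "major"])
    print(result.output.strip())  # e.g. "3" for a 3.x.y release
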
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/configuration.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/configuration.py
deleted file mode 100644
index 860f9869f3..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/configuration.py
+++ /dev/null
@@ -1,193 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Configuration module for pydolphinscheduler."""
-import os
-from pathlib import Path
-from typing import Any
-
-from pydolphinscheduler.exceptions import PyDSConfException
-from pydolphinscheduler.utils import file
-from pydolphinscheduler.utils.yaml_parser import YamlParser
-
-BUILD_IN_CONFIG_PATH = Path(__file__).resolve().parent.joinpath("default_config.yaml")
-
-
-def config_path() -> Path:
-    """Get the path of pydolphinscheduler configuration file."""
-    pyds_home = os.environ.get("PYDS_HOME", "~/pydolphinscheduler")
-    config_file_path = Path(pyds_home).joinpath("config.yaml").expanduser()
-    return config_file_path
-
-
-def get_configs() -> YamlParser:
-    """Get all configuration settings from configuration file.
-
-    The custom configuration file will be used first if it exists, otherwise the built-in default
-    configuration file is used.
-    """
-    path = str(config_path()) if config_path().exists() else BUILD_IN_CONFIG_PATH
-    with open(path, mode="r") as f:
-        return YamlParser(f.read())
-
-
-def init_config_file() -> None:
-    """Initialize configuration file by default configs."""
-    if config_path().exists():
-        raise PyDSConfException(
-            "Initialize configuration false to avoid overwrite configure by accident, file already exists "
-            "in %s, if you wan to overwrite the exists configure please remove the exists file manually.",
-            str(config_path()),
-        )
-    file.write(content=str(get_configs()), to_path=str(config_path()))
-
-
-def get_single_config(key: str) -> Any:
-    """Get single config to configuration file.
-
-    Support get from nested keys by delimiter ``.``.
-
-    For example, yaml config as below:
-
-    .. code-block:: yaml
-
-        one:
-          two1:
-            three: value1
-          two2: value2
-
-    you could get ``value1`` and ``value2`` by nested path
-
-    .. code-block:: python
-
-        value1 = get_single_config("one.two1.three")
-        value2 = get_single_config("one.two2")
-
-    :param key: The config key whose value you want to get.
-    """
-    config = get_configs()
-    if key not in config:
-        raise PyDSConfException(
-            "Configuration path %s do not exists. Can not get configuration.", key
-        )
-    return config[key]
-
-
-def set_single_config(key: str, value: Any) -> None:
-    """Change single config to configuration file.
-
-    For example, yaml config as below:
-
-    .. code-block:: yaml
-
-        one:
-          two1:
-            three: value1
-          two2: value2
-
-    you could change ``value1`` to ``value3``, and ``value2`` to ``value4``, by the assigned nested path
-
-    .. code-block:: python
-
-        set_single_config["one.two1.three"] = "value3"
-        set_single_config["one.two2"] = "value4"
-
-    :param key: The config key you want to change.
-    :param value: The new value you want to set.
-    """
-    config = get_configs()
-    if key not in config:
-        raise PyDSConfException(
-            "Configuration path %s do not exists. Can not set configuration.", key
-        )
-    config[key] = value
-    file.write(content=str(config), to_path=str(config_path()), overwrite=True)
-
-
-def get_int(val: Any) -> int:
-    """Covert value to int."""
-    return int(val)
-
-
-def get_bool(val: Any) -> bool:
-    """Covert value to boolean."""
-    if isinstance(val, str):
-        return val.lower() in {"true", "t"}
-    elif isinstance(val, int):
-        return val == 1
-    else:
-        return bool(val)
-
-
-# Start Common Configuration Settings
-
-# Add configs as module variables to avoid reading the configuration multiple times when
-#  getting common configuration settings
-#  setting or getting multiple configs at a time
-configs: YamlParser = get_configs()
-
-# Java Gateway Settings
-JAVA_GATEWAY_ADDRESS = os.environ.get(
-    "PYDS_JAVA_GATEWAY_ADDRESS", configs.get("java_gateway.address")
-)
-JAVA_GATEWAY_PORT = get_int(
-    os.environ.get("PYDS_JAVA_GATEWAY_PORT", configs.get("java_gateway.port"))
-)
-JAVA_GATEWAY_AUTO_CONVERT = get_bool(
-    os.environ.get(
-        "PYDS_JAVA_GATEWAY_AUTO_CONVERT", configs.get("java_gateway.auto_convert")
-    )
-)
-
-# User Settings
-USER_NAME = os.environ.get("PYDS_USER_NAME", configs.get("default.user.name"))
-USER_PASSWORD = os.environ.get(
-    "PYDS_USER_PASSWORD", configs.get("default.user.password")
-)
-USER_EMAIL = os.environ.get("PYDS_USER_EMAIL", configs.get("default.user.email"))
-USER_PHONE = str(os.environ.get("PYDS_USER_PHONE", configs.get("default.user.phone")))
-USER_STATE = get_int(
-    os.environ.get("PYDS_USER_STATE", configs.get("default.user.state"))
-)
-
-# Workflow Settings
-WORKFLOW_PROJECT = os.environ.get(
-    "PYDS_WORKFLOW_PROJECT", configs.get("default.workflow.project")
-)
-WORKFLOW_TENANT = os.environ.get(
-    "PYDS_WORKFLOW_TENANT", configs.get("default.workflow.tenant")
-)
-WORKFLOW_USER = os.environ.get(
-    "PYDS_WORKFLOW_USER", configs.get("default.workflow.user")
-)
-WORKFLOW_QUEUE = os.environ.get(
-    "PYDS_WORKFLOW_QUEUE", configs.get("default.workflow.queue")
-)
-WORKFLOW_RELEASE_STATE = os.environ.get(
-    "PYDS_WORKFLOW_RELEASE_STATE", configs.get("default.workflow.release_state")
-)
-WORKFLOW_WORKER_GROUP = os.environ.get(
-    "PYDS_WORKFLOW_WORKER_GROUP", configs.get("default.workflow.worker_group")
-)
-WORKFLOW_TIME_ZONE = os.environ.get(
-    "PYDS_WORKFLOW_TIME_ZONE", configs.get("default.workflow.time_zone")
-)
-WORKFLOW_WARNING_TYPE = os.environ.get(
-    "PYDS_WORKFLOW_WARNING_TYPE", configs.get("default.workflow.warning_type")
-)
-
-# End Common Configuration Setting
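
Every setting above can be overridden either through the corresponding PYDS_* environment variable or through the user configuration file under PYDS_HOME. A minimal usage sketch, assuming the package is installed and default key names such as `java_gateway.address` are unchanged:

    # Read and change configuration values programmatically.
    from pydolphinscheduler import configuration
    from pydolphinscheduler.configuration import get_single_config, set_single_config

    print(configuration.JAVA_GATEWAY_ADDRESS)         # resolved from env var or config file
    print(get_single_config("java_gateway.address"))  # read one nested key with dot notation
    set_single_config("java_gateway.port", 25333)     # persist a change to the user config file
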
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/constants.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/constants.py
deleted file mode 100644
index bedbbf2f5e..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/constants.py
+++ /dev/null
@@ -1,122 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Constants for pydolphinscheduler."""
-
-
-class TaskPriority(str):
-    """Constants for task priority."""
-
-    HIGHEST = "HIGHEST"
-    HIGH = "HIGH"
-    MEDIUM = "MEDIUM"
-    LOW = "LOW"
-    LOWEST = "LOWEST"
-
-
-class TaskFlag(str):
-    """Constants for task flag."""
-
-    YES = "YES"
-    NO = "NO"
-
-
-class TaskTimeoutFlag(str):
-    """Constants for task timeout flag."""
-
-    CLOSE = "CLOSE"
-
-
-class TaskType(str):
-    """Constants for task type, it will also show you which kind we support up to now."""
-
-    SHELL = "SHELL"
-    HTTP = "HTTP"
-    PYTHON = "PYTHON"
-    SQL = "SQL"
-    SUB_PROCESS = "SUB_PROCESS"
-    PROCEDURE = "PROCEDURE"
-    DATAX = "DATAX"
-    DEPENDENT = "DEPENDENT"
-    CONDITIONS = "CONDITIONS"
-    SWITCH = "SWITCH"
-    FLINK = "FLINK"
-    SPARK = "SPARK"
-    MR = "MR"
-    SAGEMAKER = "SAGEMAKER"
-    MLFLOW = "MLFLOW"
-    OPENMLDB = "OPENMLDB"
-    PYTORCH = "PYTORCH"
-    DVC = "DVC"
-
-
-class DefaultTaskCodeNum(str):
-    """Constants and default value for default task code number."""
-
-    DEFAULT = 1
-
-
-class JavaGatewayDefault(str):
-    """Constants and default value for java gateway."""
-
-    RESULT_MESSAGE_KEYWORD = "msg"
-    RESULT_MESSAGE_SUCCESS = "success"
-
-    RESULT_STATUS_KEYWORD = "status"
-    RESULT_STATUS_SUCCESS = "SUCCESS"
-
-    RESULT_DATA = "data"
-
-
-class Delimiter(str):
-    """Constants for delimiter."""
-
-    BAR = "-"
-    DASH = "/"
-    COLON = ":"
-    UNDERSCORE = "_"
-    DIRECTION = "->"
-
-
-class Time(str):
-    """Constants for date."""
-
-    FMT_STD_DATE = "%Y-%m-%d"
-    LEN_STD_DATE = 10
-
-    FMT_DASH_DATE = "%Y/%m/%d"
-
-    FMT_SHORT_DATE = "%Y%m%d"
-    LEN_SHORT_DATE = 8
-
-    FMT_STD_TIME = "%H:%M:%S"
-    FMT_NO_COLON_TIME = "%H%M%S"
-
-
-class ResourceKey(str):
-    """Constants for key of resource."""
-
-    ID = "id"
-
-
-class Symbol(str):
-    """Constants for symbol."""
-
-    SLASH = "/"
-    POINT = "."
-    COMMA = ","
-    UNDERLINE = "_"
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/__init__.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/__init__.py
deleted file mode 100644
index b997c3e9de..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/__init__.py
+++ /dev/null
@@ -1,30 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Init pydolphinscheduler.core package."""
-
-from pydolphinscheduler.core.database import Database
-from pydolphinscheduler.core.engine import Engine
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.core.task import Task
-
-__all__ = [
-    "Database",
-    "Engine",
-    "ProcessDefinition",
-    "Task",
-]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/database.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/database.py
deleted file mode 100644
index 4a93f22f3f..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/database.py
+++ /dev/null
@@ -1,62 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Module database."""
-
-from typing import Dict
-
-from py4j.protocol import Py4JJavaError
-
-from pydolphinscheduler.exceptions import PyDSParamException
-from pydolphinscheduler.java_gateway import JavaGate
-
-
-class Database(dict):
-    """database object, get information about database.
-
-    You provider database_name contain connection information, it decisions which
-    database type and database instance would run task.
-    """
-
-    def __init__(self, database_name: str, type_key, database_key, *args, **kwargs):
-        super().__init__(*args, **kwargs)
-        self._database = {}
-        self.database_name = database_name
-        self[type_key] = self.database_type
-        self[database_key] = self.database_id
-
-    @property
-    def database_type(self) -> str:
-        """Get database type from java gateway, a wrapper for :func:`get_database_info`."""
-        return self.get_database_info(self.database_name).get("type")
-
-    @property
-    def database_id(self) -> str:
-        """Get database id from java gateway, a wrapper for :func:`get_database_info`."""
-        return self.get_database_info(self.database_name).get("id")
-
-    def get_database_info(self, name) -> Dict:
-        """Get database info from java gateway, contains database id, type, name."""
-        if self._database:
-            return self._database
-        else:
-            try:
-                self._database = JavaGate().get_datasource_info(name)
-            # Handle the error when the database source does not exist; for now we just terminate the process.
-            except Py4JJavaError as ex:
-                raise PyDSParamException(str(ex.java_exception))
-            return self._database
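
The two key arguments passed to ``Database`` decide under which names the resolved datasource type and id are stored, which is how tasks such as Sql embed them into their task_params. A minimal sketch, assuming a datasource named "db" is already registered and the Java gateway is reachable:

    # Resolve an existing DolphinScheduler datasource by name.
    from pydolphinscheduler.core.database import Database

    database = Database(database_name="db", type_key="type", database_key="datasource")
    print(database["type"])        # e.g. "MYSQL"
    print(database["datasource"])  # the datasource id known to DolphinScheduler
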
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/engine.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/engine.py
deleted file mode 100644
index 41021ed474..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/engine.py
+++ /dev/null
@@ -1,94 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Module engine."""
-
-from typing import Dict, Optional
-
-from py4j.protocol import Py4JJavaError
-
-from pydolphinscheduler.core.task import Task
-from pydolphinscheduler.exceptions import PyDSParamException
-from pydolphinscheduler.java_gateway import JavaGate
-
-
-class ProgramType(str):
-    """Type of program engine runs, for now it just contain `JAVA`, `SCALA` and `PYTHON`."""
-
-    JAVA = "JAVA"
-    SCALA = "SCALA"
-    PYTHON = "PYTHON"
-
-
-class Engine(Task):
-    """Task engine object, declare behavior for engine task to dolphinscheduler.
-
-    This is the parent class of spark, flink and mr tasks,
-    and is used to provide the programType, mainClass and mainJar task parameters for reuse.
-    """
-
-    def __init__(
-        self,
-        name: str,
-        task_type: str,
-        main_class: str,
-        main_package: str,
-        program_type: Optional[ProgramType] = ProgramType.SCALA,
-        *args,
-        **kwargs
-    ):
-        super().__init__(name, task_type, *args, **kwargs)
-        self.main_class = main_class
-        self.main_package = main_package
-        self.program_type = program_type
-        self._resource = {}
-
-    def get_resource_info(self, program_type, main_package):
-        """Get resource info from java gateway, contains resource id, name."""
-        if self._resource:
-            return self._resource
-        else:
-            try:
-                self._resource = JavaGate().get_resources_file_info(
-                    program_type, main_package
-                )
-            # Handle the error when the resource does not exist; for now we just terminate the process.
-            except Py4JJavaError as ex:
-                raise PyDSParamException(str(ex.java_exception))
-            return self._resource
-
-    def get_jar_id(self) -> int:
-        """Get jar id from java gateway, a wrapper for :func:`get_resource_info`."""
-        return self.get_resource_info(self.program_type, self.main_package).get("id")
-
-    @property
-    def task_params(self, camel_attr: bool = True, custom_attr: set = None) -> Dict:
-        """Override Task.task_params for engine children task.
-
-        children task have some specials attribute for task_params, and is odd if we
-        directly set as python property, so we Override Task.task_params here.
-        """
-        params = super().task_params
-        custom_params = {
-            "programType": self.program_type,
-            "mainClass": self.main_class,
-            "mainJar": {
-                "id": self.get_jar_id(),
-            },
-        }
-        params.update(custom_params)
-        return params
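
Concrete engine tasks such as Spark, Flink and MR subclass ``Engine`` and only add their own parameters on top of ``programType``, ``mainClass`` and ``mainJar``. A sketch mirroring the removed Spark.yaml example earlier in this diff, assuming `test_java.jar` has been uploaded as a resource and a Java gateway is reachable:

    # Python-API counterpart of the Spark.yaml example, built on the Engine base class.
    from pydolphinscheduler.core.process_definition import ProcessDefinition
    from pydolphinscheduler.tasks.spark import DeployMode, ProgramType, Spark

    with ProcessDefinition(name="Spark") as pd:
        task = Spark(
            name="task",
            main_class="org.apache.spark.examples.SparkPi",
            main_package="test_java.jar",
            program_type=ProgramType.SCALA,
            deploy_mode=DeployMode.LOCAL,
        )
        pd.submit()
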
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/process_definition.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/process_definition.py
deleted file mode 100644
index 62de7ed1b4..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/process_definition.py
+++ /dev/null
@@ -1,424 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Module process definition, core class for workflow define."""
-
-import json
-from datetime import datetime
-from typing import Any, Dict, List, Optional, Set
-
-from pydolphinscheduler import configuration
-from pydolphinscheduler.constants import TaskType
-from pydolphinscheduler.core.resource import Resource
-from pydolphinscheduler.core.resource_plugin import ResourcePlugin
-from pydolphinscheduler.exceptions import PyDSParamException, PyDSTaskNoFoundException
-from pydolphinscheduler.java_gateway import JavaGate
-from pydolphinscheduler.models import Base, Project, Tenant, User
-from pydolphinscheduler.utils.date import MAX_DATETIME, conv_from_str, conv_to_schedule
-
-
-class ProcessDefinitionContext:
-    """Class process definition context, use when task get process definition from context expression."""
-
-    _context_managed_process_definition: Optional["ProcessDefinition"] = None
-
-    @classmethod
-    def set(cls, pd: "ProcessDefinition") -> None:
-        """Set attribute self._context_managed_process_definition."""
-        cls._context_managed_process_definition = pd
-
-    @classmethod
-    def get(cls) -> Optional["ProcessDefinition"]:
-        """Get attribute self._context_managed_process_definition."""
-        return cls._context_managed_process_definition
-
-    @classmethod
-    def delete(cls) -> None:
-        """Delete attribute self._context_managed_process_definition."""
-        cls._context_managed_process_definition = None
-
-
-class ProcessDefinition(Base):
-    """process definition object, will define process definition attribute, task, relation.
-
-    TODO: maybe we should rename this class, currently use DS object name.
-
-    :param user: The user for the current process definition. A new one will be created if it does not
-        exist. If your parameter ``project`` already exists but was not created by ``user``, ``project``
-        will be granted to ``user`` automatically.
-    :param project: The project for the current process definition. You can see the workflow in this project
-        through the Web UI after :func:`submit` or :func:`run`. A new project belonging to ``user`` will be
-        created if it does not exist, and when ``project`` exists but was not created by ``user``,
-        ``project`` will be granted to ``user`` automatically.
-    :param resource_list: Resource files required by the current process definition. You can create and
-        modify resource files from this field. When the process definition is submitted, these resource
-        files are also submitted along with it.
-    """
-
-    # key attribute for identify ProcessDefinition object
-    _KEY_ATTR = {
-        "name",
-        "project",
-        "tenant",
-        "release_state",
-        "param",
-    }
-
-    _DEFINE_ATTR = {
-        "name",
-        "description",
-        "_project",
-        "_tenant",
-        "worker_group",
-        "warning_type",
-        "warning_group_id",
-        "timeout",
-        "release_state",
-        "param",
-        "tasks",
-        "task_definition_json",
-        "task_relation_json",
-        "resource_list",
-    }
-
-    def __init__(
-        self,
-        name: str,
-        description: Optional[str] = None,
-        schedule: Optional[str] = None,
-        start_time: Optional[str] = None,
-        end_time: Optional[str] = None,
-        timezone: Optional[str] = configuration.WORKFLOW_TIME_ZONE,
-        user: Optional[str] = configuration.WORKFLOW_USER,
-        project: Optional[str] = configuration.WORKFLOW_PROJECT,
-        tenant: Optional[str] = configuration.WORKFLOW_TENANT,
-        worker_group: Optional[str] = configuration.WORKFLOW_WORKER_GROUP,
-        warning_type: Optional[str] = configuration.WORKFLOW_WARNING_TYPE,
-        warning_group_id: Optional[int] = 0,
-        timeout: Optional[int] = 0,
-        release_state: Optional[str] = configuration.WORKFLOW_RELEASE_STATE,
-        param: Optional[Dict] = None,
-        resource_plugin: Optional[ResourcePlugin] = None,
-        resource_list: Optional[List[Resource]] = None,
-    ):
-        super().__init__(name, description)
-        self.schedule = schedule
-        self._start_time = start_time
-        self._end_time = end_time
-        self.timezone = timezone
-        self._user = user
-        self._project = project
-        self._tenant = tenant
-        self.worker_group = worker_group
-        # Normalize and validate warning_type; only FAILURE, SUCCESS, ALL and NONE are accepted
-        if warning_type.strip().upper() not in ("FAILURE", "SUCCESS", "ALL", "NONE"):
-            raise PyDSParamException(
-                "Parameter `warning_type` with unexpected value `%s`", warning_type
-            )
-        else:
-            self.warning_type = warning_type.strip().upper()
-        self.warning_group_id = warning_group_id
-        self.timeout = timeout
-        self._release_state = release_state
-        self.param = param
-        self.tasks: dict = {}
-        self.resource_plugin = resource_plugin
-        # TODO how to fix circle import
-        self._task_relations: set["TaskRelation"] = set()  # noqa: F821
-        self._process_definition_code = None
-        self.resource_list = resource_list or []
-
-    def __enter__(self) -> "ProcessDefinition":
-        ProcessDefinitionContext.set(self)
-        return self
-
-    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
-        ProcessDefinitionContext.delete()
-
-    @property
-    def tenant(self) -> Tenant:
-        """Get attribute tenant."""
-        return Tenant(self._tenant)
-
-    @tenant.setter
-    def tenant(self, tenant: Tenant) -> None:
-        """Set attribute tenant."""
-        self._tenant = tenant.name
-
-    @property
-    def project(self) -> Project:
-        """Get attribute project."""
-        return Project(self._project)
-
-    @project.setter
-    def project(self, project: Project) -> None:
-        """Set attribute project."""
-        self._project = project.name
-
-    @property
-    def user(self) -> User:
-        """Get user object.
-
-        For now we just get it from the Python models, not the java gateway, so it may not be correct.
-        """
-        return User(name=self._user, tenant=self._tenant)
-
-    @staticmethod
-    def _parse_datetime(val: Any) -> Any:
-        if val is None or isinstance(val, datetime):
-            return val
-        elif isinstance(val, str):
-            return conv_from_str(val)
-        else:
-            raise PyDSParamException("Do not support value type %s for now", type(val))
-
-    @property
-    def start_time(self) -> Any:
-        """Get attribute start_time."""
-        return self._parse_datetime(self._start_time)
-
-    @start_time.setter
-    def start_time(self, val) -> None:
-        """Set attribute start_time."""
-        self._start_time = val
-
-    @property
-    def end_time(self) -> Any:
-        """Get attribute end_time."""
-        return self._parse_datetime(self._end_time)
-
-    @end_time.setter
-    def end_time(self, val) -> None:
-        """Set attribute end_time."""
-        self._end_time = val
-
-    @property
-    def release_state(self) -> int:
-        """Get attribute release_state."""
-        rs_ref = {
-            "online": 1,
-            "offline": 0,
-        }
-        if self._release_state not in rs_ref:
-            raise PyDSParamException(
-                "Parameter release_state only support `online` or `offline` but get %",
-                self._release_state,
-            )
-        return rs_ref[self._release_state]
-
-    @release_state.setter
-    def release_state(self, val: str) -> None:
-        """Set attribute release_state."""
-        self._release_state = val.lower()
-
-    @property
-    def param_json(self) -> Optional[List[Dict]]:
-        """Return param json base on self.param."""
-        # Handle empty dict and None value
-        if not self.param:
-            return []
-        return [
-            {
-                "prop": k,
-                "direct": "IN",
-                "type": "VARCHAR",
-                "value": v,
-            }
-            for k, v in self.param.items()
-        ]
-
-    @property
-    def task_definition_json(self) -> List[Dict]:
-        """Return all tasks definition in list of dict."""
-        if not self.tasks:
-            return [self.tasks]
-        else:
-            return [task.get_define() for task in self.tasks.values()]
-
-    @property
-    def task_relation_json(self) -> List[Dict]:
-        """Return all relation between tasks pair in list of dict."""
-        if not self.tasks:
-            return [self.tasks]
-        else:
-            self._handle_root_relation()
-            return [tr.get_define() for tr in self._task_relations]
-
-    @property
-    def schedule_json(self) -> Optional[Dict]:
-        """Get schedule parameter json object. This is requests from java gateway interface."""
-        if not self.schedule:
-            return None
-        else:
-            start_time = conv_to_schedule(
-                self.start_time if self.start_time else datetime.now()
-            )
-            end_time = conv_to_schedule(
-                self.end_time if self.end_time else MAX_DATETIME
-            )
-            return {
-                "startTime": start_time,
-                "endTime": end_time,
-                "crontab": self.schedule,
-                "timezoneId": self.timezone,
-            }
-
-    @property
-    def task_list(self) -> List["Task"]:  # noqa: F821
-        """Return list of tasks objects."""
-        return list(self.tasks.values())
-
-    def _handle_root_relation(self):
-        """Handle root task property :class:`pydolphinscheduler.core.task.TaskRelation`.
-
-        Root tasks in the DAG do not have an upstream node, but we have to add a default
-        upstream task with task_code equal to `0`. This is required by the java gateway interface.
-        """
-        from pydolphinscheduler.core.task import TaskRelation
-
-        post_relation_code = set()
-        for relation in self._task_relations:
-            post_relation_code.add(relation.post_task_code)
-        for task in self.task_list:
-            if task.code not in post_relation_code:
-                root_relation = TaskRelation(pre_task_code=0, post_task_code=task.code)
-                self._task_relations.add(root_relation)
-
-    def add_task(self, task: "Task") -> None:  # noqa: F821
-        """Add a single task to process definition."""
-        self.tasks[task.code] = task
-        task._process_definition = self
-
-    def add_tasks(self, tasks: List["Task"]) -> None:  # noqa: F821
-        """Add task sequence to process definition, it a wrapper of :func:`add_task`."""
-        for task in tasks:
-            self.add_task(task)
-
-    def get_task(self, code: str) -> "Task":  # noqa: F821
-        """Get task object from process definition by given code."""
-        if code not in self.tasks:
-            raise PyDSTaskNoFoundException(
-                "Task with code %s can not found in process definition %",
-                (code, self.name),
-            )
-        return self.tasks[code]
-
-    # TODO which typing should be returned in this case
-    def get_tasks_by_name(self, name: str) -> Set["Task"]:  # noqa: F821
-        """Get tasks object by given name, if will return all tasks with this name."""
-        find = set()
-        for task in self.tasks.values():
-            if task.name == name:
-                find.add(task)
-        return find
-
-    def get_one_task_by_name(self, name: str) -> "Task":  # noqa: F821
-        """Get exact one task from process definition by given name.
-
-        The function always returns exactly one task even though this process definition may have more than
-        one task with this name.
-        """
-        tasks = self.get_tasks_by_name(name)
-        if not tasks:
-            raise PyDSTaskNoFoundException(f"Can not find task with name {name}.")
-        return tasks.pop()
-
-    def run(self):
-        """Submit and Start ProcessDefinition instance.
-
-        Shortcut for function :func:`submit` and function :func:`start`. Only manual workflow start is
-        supported for now; scheduled run is coming soon.
-        :return:
-        """
-        self.submit()
-        self.start()
-
-    def _ensure_side_model_exists(self):
-        """Ensure process definition models model exists.
-
-        For now, models object including :class:`pydolphinscheduler.models.project.Project`,
-        :class:`pydolphinscheduler.models.tenant.Tenant`, :class:`pydolphinscheduler.models.user.User`.
-        If these model not exists, would create default value in
-        :class:`pydolphinscheduler.constants.ProcessDefinitionDefault`.
-        """
-        # TODO used metaclass for more pythonic
-        self.user.create_if_not_exists()
-        # Project model need User object exists
-        self.project.create_if_not_exists(self._user)
-
-    def _pre_submit_check(self):
-        """Check specific condition satisfy before.
-
-        This method should be called before process definition submit to java gateway
-        For now, we have below checker:
-        * `self.param` or at least one local param of task should be set if task `switch` in this workflow.
-        """
-        if (
-            any([task.task_type == TaskType.SWITCH for task in self.tasks.values()])
-            and self.param is None
-            and all([len(task.local_params) == 0 for task in self.tasks.values()])
-        ):
-            raise PyDSParamException(
-                "Parameter param or at least one local_param of task must "
-                "be provider if task Switch in process definition."
-            )
-
-    def submit(self) -> int:
-        """Submit ProcessDefinition instance to java gateway."""
-        self._ensure_side_model_exists()
-        self._pre_submit_check()
-
-        self._process_definition_code = JavaGate().create_or_update_process_definition(
-            self._user,
-            self._project,
-            self.name,
-            str(self.description) if self.description else "",
-            json.dumps(self.param_json),
-            self.warning_type,
-            self.warning_group_id,
-            self.timeout,
-            self.worker_group,
-            self._tenant,
-            self.release_state,
-            # TODO add serialization function
-            json.dumps(self.task_relation_json),
-            json.dumps(self.task_definition_json),
-            json.dumps(self.schedule_json) if self.schedule_json else None,
-            None,
-            None,
-        )
-        if len(self.resource_list) > 0:
-            for res in self.resource_list:
-                res.user_name = self._user
-                res.create_or_update_resource()
-        return self._process_definition_code
-
-    def start(self) -> None:
-        """Create and start ProcessDefinition instance.
-
-        which posts to `start-process-instance` on the java gateway.
-        """
-        JavaGate().exec_process_instance(
-            self._user,
-            self._project,
-            self.name,
-            "",
-            self.worker_group,
-            self.warning_type,
-            self.warning_group_id,
-            24 * 3600,
-        )
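
For orientation, a minimal sketch of how the ProcessDefinition API above is typically used, assuming a reachable DolphinScheduler Python gateway and an existing tenant; the names, tenant, and crontab below are illustrative only:

    from pydolphinscheduler.core.process_definition import ProcessDefinition
    from pydolphinscheduler.tasks.shell import Shell

    with ProcessDefinition(
        name="demo-workflow",
        tenant="exists_tenant",          # must exist on the worker host
        schedule="0 0 0 * * ? *",        # optional crontab
    ) as pd:
        extract = Shell(name="extract", command="echo extract")
        load = Shell(name="load", command="echo load")
        extract >> load                  # load runs after extract
        pd.run()                         # shortcut for submit() then start()
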
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/resource.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/resource.py
deleted file mode 100644
index ea811915e2..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/resource.py
+++ /dev/null
@@ -1,73 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Module resource."""
-
-from typing import Optional
-
-from pydolphinscheduler.exceptions import PyDSParamException
-from pydolphinscheduler.java_gateway import JavaGate
-from pydolphinscheduler.models import Base
-
-
-class Resource(Base):
-    """resource object, will define the resources that you want to create or update.
-
-    :param name: The fullname of resource.Includes path and suffix.
-    :param content: The description of resource.
-    :param description: The description of resource.
-    :param user_name: The user name of resource.
-    """
-
-    _DEFINE_ATTR = {"name", "content", "description", "user_name"}
-
-    def __init__(
-        self,
-        name: str,
-        content: Optional[str] = None,
-        description: Optional[str] = None,
-        user_name: Optional[str] = None,
-    ):
-        super().__init__(name, description)
-        self.content = content
-        self.user_name = user_name
-        self._resource_code = None
-
-    def get_info_from_database(self):
-        """Get resource info from java gateway, contains resource id, name."""
-        if not self.user_name:
-            raise PyDSParamException(
-                "`user_name` is required when querying resources from python gate."
-            )
-        return JavaGate().query_resources_file_info(self.user_name, self.name)
-
-    def get_id_from_database(self):
-        """Get resource id from java gateway."""
-        return self.get_info_from_database().getId()
-
-    def create_or_update_resource(self):
-        """Create or update resource via java gateway."""
-        if not self.content or not self.user_name:
-            raise PyDSParamException(
-                "`user_name` and `content` are required when create or update resource from python gate."
-            )
-        JavaGate().create_or_update_resource(
-            self.user_name,
-            self.name,
-            self.content,
-            self.description,
-        )
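
A short sketch of how the Resource class above pairs with ProcessDefinition.resource_list; the file name and content are illustrative, and a running Python gateway is assumed:

    from pydolphinscheduler.core.process_definition import ProcessDefinition
    from pydolphinscheduler.core.resource import Resource
    from pydolphinscheduler.tasks.shell import Shell

    # The resource name carries its path and suffix; content is the file body.
    helper = Resource(name="scripts/helper.sh", content="echo from resource")

    with ProcessDefinition(name="resource-demo", resource_list=[helper]) as pd:
        Shell(name="use-helper", command="echo placeholder")
        pd.submit()  # submit() also creates or updates every resource in resource_list
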
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/resource_plugin.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/resource_plugin.py
deleted file mode 100644
index 8b500d165f..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/resource_plugin.py
+++ /dev/null
@@ -1,58 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""DolphinScheduler ResourcePlugin object."""
-
-from abc import ABCMeta, abstractmethod
-
-from pydolphinscheduler.exceptions import PyResPluginException
-
-
-# [start resource_plugin_definition]
-class ResourcePlugin(object, metaclass=ABCMeta):
-    """ResourcePlugin object, declare resource plugin for task and workflow to dolphinscheduler.
-
-    :param prefix: A string representing the prefix of ResourcePlugin.
-
-    """
-
-    # [start init_method]
-    def __init__(self, prefix: str, *args, **kwargs):
-        self.prefix = prefix
-
-    # [end init_method]
-
-    # [start abstractmethod read_file]
-    @abstractmethod
-    def read_file(self, suf: str):
-        """Get the content of the file.
-
-        The address of the file is the prefix of the resource plugin plus the parameter suf.
-        """
-
-    # [end abstractmethod read_file]
-
-    def get_index(self, s: str, x, n):
-        """Find the subscript of the nth occurrence of the X character in the string s."""
-        if n <= s.count(x):
-            all_index = [key for key, value in enumerate(s) if value == x]
-            return all_index[n - 1]
-        else:
-            raise PyResPluginException("Incomplete path.")
-
-
-# [end resource_plugin_definition]
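
A hedged sketch of a concrete plugin built on the abstract base above; the local-filesystem behaviour shown is an assumption for illustration, not necessarily one of the bundled plugins:

    from pathlib import Path

    from pydolphinscheduler.core.resource_plugin import ResourcePlugin

    class LocalFile(ResourcePlugin):
        """Hypothetical plugin that reads files relative to ``prefix`` on local disk."""

        def read_file(self, suf: str):
            # The file address is the plugin prefix plus the parameter suf.
            return Path(self.prefix).joinpath(suf).read_text()

    # The instance can then be passed as resource_plugin= to a task or a ProcessDefinition.
    plugin = LocalFile(prefix="/opt/scripts")
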
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/task.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/task.py
deleted file mode 100644
index 3fec31fd67..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/task.py
+++ /dev/null
@@ -1,384 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""DolphinScheduler Task and TaskRelation object."""
-import copy
-import types
-from logging import getLogger
-from typing import Dict, List, Optional, Sequence, Set, Tuple, Union
-
-from pydolphinscheduler import configuration
-from pydolphinscheduler.constants import (
-    Delimiter,
-    ResourceKey,
-    Symbol,
-    TaskFlag,
-    TaskPriority,
-    TaskTimeoutFlag,
-)
-from pydolphinscheduler.core.process_definition import (
-    ProcessDefinition,
-    ProcessDefinitionContext,
-)
-from pydolphinscheduler.core.resource import Resource
-from pydolphinscheduler.core.resource_plugin import ResourcePlugin
-from pydolphinscheduler.exceptions import PyDSParamException, PyResPluginException
-from pydolphinscheduler.java_gateway import JavaGate
-from pydolphinscheduler.models import Base
-
-logger = getLogger(__name__)
-
-
-class TaskRelation(Base):
-    """TaskRelation object, describe the relation of exactly two tasks."""
-
-    # Add attr `_KEY_ATTR` to overwrite :func:`__eq__`; it makes the set
-    # `Task.process_definition._task_relations` work correctly.
-    _KEY_ATTR = {
-        "pre_task_code",
-        "post_task_code",
-    }
-
-    _DEFINE_ATTR = {
-        "pre_task_code",
-        "post_task_code",
-    }
-
-    _DEFAULT_ATTR = {
-        "name": "",
-        "preTaskVersion": 1,
-        "postTaskVersion": 1,
-        "conditionType": 0,
-        "conditionParams": {},
-    }
-
-    def __init__(
-        self,
-        pre_task_code: int,
-        post_task_code: int,
-        name: Optional[str] = None,
-    ):
-        super().__init__(name)
-        self.pre_task_code = pre_task_code
-        self.post_task_code = post_task_code
-
-    def __hash__(self):
-        return hash(f"{self.pre_task_code} {Delimiter.DIRECTION} {self.post_task_code}")
-
-
-class Task(Base):
-    """Task object, parent class for all exactly task type."""
-
-    _DEFINE_ATTR = {
-        "name",
-        "code",
-        "version",
-        "task_type",
-        "task_params",
-        "description",
-        "flag",
-        "task_priority",
-        "worker_group",
-        "environment_code",
-        "delay_time",
-        "fail_retry_times",
-        "fail_retry_interval",
-        "timeout_flag",
-        "timeout_notify_strategy",
-        "timeout",
-    }
-
-    # task default attribute will into `task_params` property
-    _task_default_attr = {
-        "local_params",
-        "resource_list",
-        "dependence",
-        "wait_start_timeout",
-        "condition_result",
-    }
-    # task attribute ignore from _task_default_attr and will not into `task_params` property
-    _task_ignore_attr: set = set()
-    # task custom attribute define in sub class and will append to `task_params` property
-    _task_custom_attr: set = set()
-
-    ext: set = None
-    ext_attr: Union[str, types.FunctionType] = None
-
-    DEFAULT_CONDITION_RESULT = {"successNode": [""], "failedNode": [""]}
-
-    def __init__(
-        self,
-        name: str,
-        task_type: str,
-        description: Optional[str] = None,
-        flag: Optional[str] = TaskFlag.YES,
-        task_priority: Optional[str] = TaskPriority.MEDIUM,
-        worker_group: Optional[str] = configuration.WORKFLOW_WORKER_GROUP,
-        environment_name: Optional[str] = None,
-        delay_time: Optional[int] = 0,
-        fail_retry_times: Optional[int] = 0,
-        fail_retry_interval: Optional[int] = 1,
-        timeout_flag: Optional[int] = TaskTimeoutFlag.CLOSE,
-        timeout_notify_strategy: Optional = None,
-        timeout: Optional[int] = 0,
-        process_definition: Optional[ProcessDefinition] = None,
-        local_params: Optional[List] = None,
-        resource_list: Optional[List] = None,
-        dependence: Optional[Dict] = None,
-        wait_start_timeout: Optional[Dict] = None,
-        condition_result: Optional[Dict] = None,
-        resource_plugin: Optional[ResourcePlugin] = None,
-    ):
-
-        super().__init__(name, description)
-        self.task_type = task_type
-        self.flag = flag
-        self.task_priority = task_priority
-        self.worker_group = worker_group
-        self._environment_name = environment_name
-        self.fail_retry_times = fail_retry_times
-        self.fail_retry_interval = fail_retry_interval
-        self.delay_time = delay_time
-        self.timeout_flag = timeout_flag
-        self.timeout_notify_strategy = timeout_notify_strategy
-        self.timeout = timeout
-        self._process_definition = None
-        self.process_definition: ProcessDefinition = (
-            process_definition or ProcessDefinitionContext.get()
-        )
-        self._upstream_task_codes: Set[int] = set()
-        self._downstream_task_codes: Set[int] = set()
-        self._task_relation: Set[TaskRelation] = set()
-        # move attribute code and version after _process_definition and process_definition declare
-        self.code, self.version = self.gen_code_and_version()
-        # Add task to process definition; maybe we could put this into the property process_definition later
-
-        if (
-            self.process_definition is not None
-            and self.code not in self.process_definition.tasks
-        ):
-            self.process_definition.add_task(self)
-        else:
-            logger.warning(
-                "Task code %d already in process definition, prohibit re-add task.",
-                self.code,
-            )
-
-        # Attribute for task param
-        self.local_params = local_params or []
-        self._resource_list = resource_list or []
-        self.dependence = dependence or {}
-        self.wait_start_timeout = wait_start_timeout or {}
-        self._condition_result = condition_result or self.DEFAULT_CONDITION_RESULT
-        self.resource_plugin = resource_plugin
-        self.get_content()
-
-    @property
-    def process_definition(self) -> Optional[ProcessDefinition]:
-        """Get attribute process_definition."""
-        return self._process_definition
-
-    @process_definition.setter
-    def process_definition(self, process_definition: Optional[ProcessDefinition]):
-        """Set attribute process_definition."""
-        self._process_definition = process_definition
-
-    @property
-    def resource_list(self) -> List:
-        """Get task define attribute `resource_list`."""
-        resources = set()
-        for res in self._resource_list:
-            if type(res) == str:
-                resources.add(
-                    Resource(name=res, user_name=self.user_name).get_id_from_database()
-                )
-            elif type(res) == dict and res.get(ResourceKey.ID) is not None:
-                logger.warning(
-                    """`resource_list` should be defined using List[str] with resource paths,
-                       the use of ids to define resources will be removed in version 3.2.0.
-                    """
-                )
-                resources.add(res.get(ResourceKey.ID))
-        return [{ResourceKey.ID: r} for r in resources]
-
-    @property
-    def user_name(self) -> Optional[str]:
-        """Return user name of process definition."""
-        if self.process_definition:
-            return self.process_definition.user.name
-        else:
-            raise PyDSParamException("`user_name` cannot be empty.")
-
-    @property
-    def condition_result(self) -> Dict:
-        """Get attribute condition_result."""
-        return self._condition_result
-
-    @condition_result.setter
-    def condition_result(self, condition_result: Optional[Dict]):
-        """Set attribute condition_result."""
-        self._condition_result = condition_result
-
-    def _get_attr(self) -> Set[str]:
-        """Get final task task_params attribute.
-
-        Base on `_task_default_attr`, append attribute from `_task_custom_attr` and subtract attribute from
-        `_task_ignore_attr`.
-        """
-        attr = copy.deepcopy(self._task_default_attr)
-        attr -= self._task_ignore_attr
-        attr |= self._task_custom_attr
-        return attr
-
-    @property
-    def task_params(self) -> Optional[Dict]:
-        """Get task parameter object.
-
-        The result combines the attributes resolved by :func:`_get_attr`.
-        """
-        custom_attr = self._get_attr()
-        return self.get_define_custom(custom_attr=custom_attr)
-
-    def get_plugin(self):
-        """Return the resource plug-in.
-
-        according to parameter resource_plugin and parameter
-        process_definition.resource_plugin.
-        """
-        if self.resource_plugin is None:
-            if self.process_definition.resource_plugin is not None:
-                return self.process_definition.resource_plugin
-            else:
-                raise PyResPluginException(
-                    "The execution command of this task is a file, but the resource plugin is empty"
-                )
-        else:
-            return self.resource_plugin
-
-    def get_content(self):
-        """Get the file content according to the resource plugin."""
-        if self.ext_attr is None and self.ext is None:
-            return
-        _ext_attr = getattr(self, self.ext_attr)
-        if _ext_attr is not None:
-            if isinstance(_ext_attr, str) and _ext_attr.endswith(tuple(self.ext)):
-                res = self.get_plugin()
-                content = res.read_file(_ext_attr)
-                setattr(self, self.ext_attr.lstrip(Symbol.UNDERLINE), content)
-            else:
-                if self.resource_plugin is not None or (
-                    self.process_definition is not None
-                    and self.process_definition.resource_plugin is not None
-                ):
-                    index = _ext_attr.rfind(Symbol.POINT)
-                    if index != -1:
-                        raise ValueError(
-                            "This task does not support files with suffix {}, only supports {}".format(
-                                _ext_attr[index:],
-                                Symbol.COMMA.join(str(suf) for suf in self.ext),
-                            )
-                        )
-                setattr(self, self.ext_attr.lstrip(Symbol.UNDERLINE), _ext_attr)
-
-    def __hash__(self):
-        return hash(self.code)
-
-    def __lshift__(self, other: Union["Task", Sequence["Task"]]):
-        """Implement Task << Task."""
-        self.set_upstream(other)
-        return other
-
-    def __rshift__(self, other: Union["Task", Sequence["Task"]]):
-        """Implement Task >> Task."""
-        self.set_downstream(other)
-        return other
-
-    def __rrshift__(self, other: Union["Task", Sequence["Task"]]):
-        """Call for Task >> [Task] because list don't have __rshift__ operators."""
-        self.__lshift__(other)
-        return self
-
-    def __rlshift__(self, other: Union["Task", Sequence["Task"]]):
-        """Call for Task << [Task] because list don't have __lshift__ operators."""
-        self.__rshift__(other)
-        return self
-
-    def _set_deps(
-        self, tasks: Union["Task", Sequence["Task"]], upstream: bool = True
-    ) -> None:
-        """
-        Set the parameter tasks as dependencies of the current task.
-
-        It is a wrapper for :func:`set_upstream` and :func:`set_downstream`.
-        """
-        if not isinstance(tasks, Sequence):
-            tasks = [tasks]
-
-        for task in tasks:
-            if upstream:
-                self._upstream_task_codes.add(task.code)
-                task._downstream_task_codes.add(self.code)
-
-                if self._process_definition:
-                    task_relation = TaskRelation(
-                        pre_task_code=task.code,
-                        post_task_code=self.code,
-                        name=f"{task.name} {Delimiter.DIRECTION} {self.name}",
-                    )
-                    self.process_definition._task_relations.add(task_relation)
-            else:
-                self._downstream_task_codes.add(task.code)
-                task._upstream_task_codes.add(self.code)
-
-                if self._process_definition:
-                    task_relation = TaskRelation(
-                        pre_task_code=self.code,
-                        post_task_code=task.code,
-                        name=f"{self.name} {Delimiter.DIRECTION} {task.name}",
-                    )
-                    self.process_definition._task_relations.add(task_relation)
-
-    def set_upstream(self, tasks: Union["Task", Sequence["Task"]]) -> None:
-        """Set parameter tasks as upstream to current task."""
-        self._set_deps(tasks, upstream=True)
-
-    def set_downstream(self, tasks: Union["Task", Sequence["Task"]]) -> None:
-        """Set parameter tasks as downstream to current task."""
-        self._set_deps(tasks, upstream=False)
-
-    # TODO code should better generate in bulk mode when :ref: processDefinition run submit or start
-    def gen_code_and_version(self) -> Tuple:
-        """
-        Generate task code and version from java gateway.
-
-        If the task name does not already exist in the process definition, the java gateway will generate a
-        new code and a version equal to 0; otherwise it will return the existing code and version.
-        """
-        # TODO get code from specific project process definition and task name
-        result = JavaGate().get_code_and_version(
-            self.process_definition._project, self.process_definition.name, self.name
-        )
-        # result = gateway.entry_point.genTaskCodeList(DefaultTaskCodeNum.DEFAULT)
-        # gateway_result_checker(result)
-        return result.get("code"), result.get("version")
-
-    @property
-    def environment_code(self) -> str:
-        """Convert environment name to code."""
-        if self._environment_name is None:
-            return None
-        return JavaGate().query_environment_info(self._environment_name)
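
A small sketch of the dependency operators defined above, using the Shell task that appears elsewhere in this diff; task names and commands are illustrative and a running Python gateway is assumed:

    from pydolphinscheduler.core.process_definition import ProcessDefinition
    from pydolphinscheduler.tasks.shell import Shell

    with ProcessDefinition(name="deps-demo") as pd:
        start = Shell(name="start", command="echo start")
        branch_a = Shell(name="branch_a", command="echo a")
        branch_b = Shell(name="branch_b", command="echo b")
        end = Shell(name="end", command="echo end")

        start >> [branch_a, branch_b]    # fan out: both branches run after start
        [branch_a, branch_b] >> end      # fan in: end waits for both (via __rrshift__)
        # Equivalent explicit form: end.set_upstream([branch_a, branch_b])
        pd.submit()
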
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/yaml_process_define.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/yaml_process_define.py
deleted file mode 100644
index 0944925a48..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/core/yaml_process_define.py
+++ /dev/null
@@ -1,466 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Parse YAML file to create process."""
-
-import logging
-import os
-import re
-from pathlib import Path
-from typing import Any, Dict
-
-from pydolphinscheduler import configuration, tasks
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.core.task import Task
-from pydolphinscheduler.exceptions import PyDSTaskNoFoundException
-from pydolphinscheduler.utils.yaml_parser import YamlParser
-
-logger = logging.getLogger(__file__)
-
-KEY_PROCESS = "workflow"
-KEY_TASK = "tasks"
-KEY_TASK_TYPE = "task_type"
-KEY_DEPS = "deps"
-KEY_OP = "op"
-
-TASK_SPECIAL_KEYS = [KEY_TASK_TYPE, KEY_DEPS]
-
-
-class ParseTool:
-    """Enhanced parsing tools."""
-
-    @staticmethod
-    def parse_string_param_if_file(string_param: str, **kwargs):
-        """Use $FILE{"data_path"} to load file from "data_path"."""
-        if string_param.startswith("$FILE"):
-            path = re.findall(r"\$FILE\{\"(.*?)\"\}", string_param)[0]
-            base_folder = kwargs.get("base_folder", ".")
-            path = ParseTool.get_possible_path(path, base_folder)
-            with open(path, "r") as read_file:
-                string_param = "".join(read_file)
-        return string_param
-
-    @staticmethod
-    def parse_string_param_if_env(string_param: str, **kwargs):
-        """Use $ENV{env_name} to load environment variable "env_name"."""
-        if "$ENV" in string_param:
-            key = re.findall(r"\$ENV\{(.*?)\}", string_param)[0]
-            env_value = os.environ.get(key, "$%s" % key)
-            string_param = string_param.replace("$ENV{%s}" % key, env_value)
-        return string_param
-
-    @staticmethod
-    def parse_string_param_if_config(string_param: str, **kwargs):
-        """Use ${CONFIG.var_name} to load variable "var_name" from configuration."""
-        if "${CONFIG" in string_param:
-            key = re.findall(r"\$\{CONFIG\.(.*?)\}", string_param)[0]
-            if hasattr(configuration, key):
-                string_param = getattr(configuration, key)
-            else:
-                string_param = configuration.get_single_config(key)
-
-        return string_param
-
-    @staticmethod
-    def get_possible_path(file_path, base_folder):
-        """Get file possible path.
-
-        Return new path if file_path is not exists, but base_folder + file_path exists
-        """
-        possible_path = file_path
-        if not Path(file_path).exists():
-            new_path = Path(base_folder).joinpath(file_path)
-            if new_path.exists():
-                possible_path = new_path
-                logger.info(f"{file_path} not exists, convert to {possible_path}")
-
-        return possible_path
-
-
-def get_task_cls(task_type) -> Task:
-    """Get the task class object by task_type (case compatible)."""
-    # only get task class from tasks.__all__
-    all_task_types = {type_.capitalize(): type_ for type_ in tasks.__all__}
-    task_type_cap = task_type.capitalize()
-    if task_type_cap not in all_task_types:
-        raise PyDSTaskNoFoundException("cant not find task %s" % task_type)
-
-    standard_name = all_task_types[task_type_cap]
-    return getattr(tasks, standard_name)
-
-
-class YamlProcess(YamlParser):
-    """Yaml parser for create process.
-
-    :param yaml_file: yaml file path.
-
-        example 1 ::
-
-            parser = YamlProcess(yaml_file=...)
-            parser.create_process_definition()
-
-        example 2 ::
-
-            YamlProcess(yaml_file=...).create_process_definition()
-
-    """
-
-    _parse_rules = [
-        ParseTool.parse_string_param_if_file,
-        ParseTool.parse_string_param_if_env,
-        ParseTool.parse_string_param_if_config,
-    ]
-
-    def __init__(self, yaml_file: str):
-        with open(yaml_file, "r") as f:
-            content = f.read()
-
-        self._base_folder = Path(yaml_file).parent
-        content = self.prepare_refer_process(content)
-        super().__init__(content)
-
-    def create_process_definition(self):
-        """Create process main function."""
-        # get process parameters with key "workflow"
-        process_params = self[KEY_PROCESS]
-
-        # pop "run" parameter, used at the end
-        is_run = process_params.pop("run", False)
-
-        # use YamlProcess._parse_rules to parse special value of yaml file
-        process_params = self.parse_params(process_params)
-
-        process_name = process_params["name"]
-        logger.info(f"Create Process: {process_name}")
-        with ProcessDefinition(**process_params) as pd:
-
-            # save dependencies between tasks
-            dependencies = {}
-
-            # save name and task mapping
-            name2task = {}
-
-            # get task data with key "tasks"
-            for task_data in self[KEY_TASK]:
-                task = self.parse_task(task_data, name2task)
-
-                deps = task_data.get(KEY_DEPS, [])
-                if deps:
-                    dependencies[task.name] = deps
-                name2task[task.name] = task
-
-            # build dependencies between task
-            for downstream_task_name, deps in dependencies.items():
-                downstream_task = name2task[downstream_task_name]
-                for upstream_task_name in deps:
-                    upstream_task = name2task[upstream_task_name]
-                    upstream_task >> downstream_task
-
-            pd.submit()
-            # if set is_run, run the process after submit
-            if is_run:
-                logger.info(f"run workflow: {pd}")
-                pd.run()
-
-        return process_name
-
-    def parse_params(self, params: Any):
-        """Recursively resolves the parameter values.
-
-        The function operates params only when it encounters a string; other types continue recursively.
-        """
-        if isinstance(params, str):
-            for parse_rule in self._parse_rules:
-                params_ = params
-                params = parse_rule(params, base_folder=self._base_folder)
-                if params_ != params:
-                    logger.info(f"parse {params_} -> {params}")
-
-        elif isinstance(params, list):
-            for index in range(len(params)):
-                params[index] = self.parse_params(params[index])
-
-        elif isinstance(params, dict):
-            for key, value in params.items():
-                params[key] = self.parse_params(value)
-
-        return params
-
-    @classmethod
-    def parse(cls, yaml_file: str):
-        """Recursively resolves the parameter values.
-
-        The function operates params only when it encounters a string; other types continue recursively.
-        """
-        process_name = cls(yaml_file).create_process_definition()
-        return process_name
-
-    def prepare_refer_process(self, content):
-        """Allow YAML files to reference process derived from other YAML files."""
-        process_paths = re.findall(r"\$WORKFLOW\{\"(.*?)\"\}", content)
-        for process_path in process_paths:
-            logger.info(
-                f"find special token {process_path}, load process form {process_path}"
-            )
-            possible_path = ParseTool.get_possible_path(process_path, self._base_folder)
-            process_name = YamlProcess.parse(possible_path)
-            content = content.replace('$WORKFLOW{"%s"}' % process_path, process_name)
-
-        return content
-
-    def parse_task(self, task_data: dict, name2task: Dict[str, Task]):
-        """Parse various types of tasks.
-
-        :param task_data: dict.
-                {
-                    "task_type": "Shell",
-                    "name": "shell_task", "command": "echo hello"
-                }
-
-        :param name2task: Dict[str, Task], mapping of task_name to task
-
-
-        Some task types have a special parse function:
-            if task type is Switch, use parse_switch;
-            if task type is Condition, use parse_condition;
-            if task type is Dependent, use parse_dependent;
-            otherwise, we pass all task_params as input to the task class, like "task_cls(**task_params)".
-        """
-        task_type = task_data["task_type"]
-        # get params without special key
-        task_params = {k: v for k, v in task_data.items() if k not in TASK_SPECIAL_KEYS}
-
-        task_cls = get_task_cls(task_type)
-
-        # use YamlProcess._parse_rules to parse special value of yaml file
-        task_params = self.parse_params(task_params)
-
-        if task_cls == tasks.Switch:
-            task = self.parse_switch(task_params, name2task)
-
-        elif task_cls == tasks.Condition:
-            task = self.parse_condition(task_params, name2task)
-
-        elif task_cls == tasks.Dependent:
-            task = self.parse_dependent(task_params, name2task)
-
-        else:
-            task = task_cls(**task_params)
-        logger.info(f"Create task ({task_type}): {task}")
-        return task
-
-    def parse_switch(self, task_params, name2task):
-        """Parse Switch Task.
-
-        This is an example Yaml fragment of task_params
-
-        name: switch
-        condition:
-          - ["${var} > 1", switch_child_1]
-          - switch_child_2
-        """
-        from pydolphinscheduler.tasks.switch import (
-            Branch,
-            Default,
-            Switch,
-            SwitchCondition,
-        )
-
-        condition_datas = task_params["condition"]
-        conditions = []
-        for condition_data in condition_datas:
-            assert "task" in condition_data, "task must be in %s" % condition_data
-            task_name = condition_data["task"]
-            condition_string = condition_data.get("condition", None)
-
-            # if condition_string is None, for example: {"task": "switch_child_2"}, set it to Default branch
-            if condition_string is None:
-                conditions.append(Default(task=name2task.get(task_name)))
-
-            # if condition_string is not None, for example:
-            # {"task": "switch_child_2", "condition": "${var} > 1"} set it to Branch
-            else:
-                conditions.append(
-                    Branch(condition_string, task=name2task.get(task_name))
-                )
-
-        switch = Switch(
-            name=task_params["name"], condition=SwitchCondition(*conditions)
-        )
-        return switch
-
-    def parse_condition(self, task_params, name2task):
-        """Parse Condition Task.
-
-        This is an example Yaml fragment of task_params
-
-        name: condition
-        success_task: success_branch
-        failed_task: fail_branch
-        OP: AND
-        groups:
-          -
-            OP: AND
-            groups:
-              - [pre_task_1, true]
-              - [pre_task_2, true]
-              - [pre_task_3, false]
-          -
-            OP: AND
-            groups:
-              - [pre_task_1, false]
-              - [pre_task_2, true]
-              - [pre_task_3, true]
-
-        """
-        from pydolphinscheduler.tasks.condition import (
-            FAILURE,
-            SUCCESS,
-            And,
-            Condition,
-            Or,
-        )
-
-        def get_op_cls(op):
-            cls = None
-            if op.lower() == "and":
-                cls = And
-            elif op.lower() == "or":
-                cls = Or
-            else:
-                raise Exception("OP must be in And or Or, but get: %s" % op)
-            return cls
-
-        second_cond_ops = []
-        for first_group in task_params["groups"]:
-            second_op = first_group["op"]
-            task_ops = []
-            for condition_data in first_group["groups"]:
-                assert "task" in condition_data, "task must be in %s" % condition_data
-                assert "flag" in condition_data, "flag must be in %s" % condition_data
-                task_name = condition_data["task"]
-                flag = condition_data["flag"]
-                task = name2task[task_name]
-
-                # for example: task = pre_task_1, flag = true
-                if flag:
-                    task_ops.append(SUCCESS(task))
-                else:
-                    task_ops.append(FAILURE(task))
-
-            second_cond_ops.append(get_op_cls(second_op)(*task_ops))
-
-        first_op = task_params["op"]
-        cond_operator = get_op_cls(first_op)(*second_cond_ops)
-
-        condition = Condition(
-            name=task_params["name"],
-            condition=cond_operator,
-            success_task=name2task[task_params["success_task"]],
-            failed_task=name2task[task_params["failed_task"]],
-        )
-        return condition
-
-    def parse_dependent(self, task_params, name2task):
-        """Parse Dependent Task.
-
-        This is an example Yaml fragment of task_params
-
-        name: dependent
-        dependence:
-        OP: AND
-        groups:
-          -
-            OP: Or
-            groups:
-              - [pydolphin, task_dependent_external, task_1]
-              - [pydolphin, task_dependent_external, task_2]
-          -
-            OP: And
-            groups:
-              - [pydolphin, task_dependent_external, task_1, LAST_WEDNESDAY]
-              - [pydolphin, task_dependent_external, task_2, last24Hours]
-
-        """
-        from pydolphinscheduler.tasks.dependent import (
-            And,
-            Dependent,
-            DependentDate,
-            DependentItem,
-            Or,
-        )
-
-        def process_dependent_date(dependent_date):
-            """Parse dependent date (Compatible with key and value of DependentDate)."""
-            dependent_date_upper = dependent_date.upper()
-            if hasattr(DependentDate, dependent_date_upper):
-                dependent_date = getattr(DependentDate, dependent_date_upper)
-            return dependent_date
-
-        def get_op_cls(op):
-            cls = None
-            if op.lower() == "and":
-                cls = And
-            elif op.lower() == "or":
-                cls = Or
-            else:
-                raise Exception("OP must be in And or Or, but get: %s" % op)
-            return cls
-
-        def create_dependent_item(source_items):
-            """Parse dependent item.
-
-            project_name: pydolphin
-            process_definition_name: task_dependent_external
-            dependent_task_name: task_1
-            dependent_date: LAST_WEDNESDAY
-            """
-            project_name = source_items["project_name"]
-            process_definition_name = source_items["process_definition_name"]
-            dependent_task_name = source_items["dependent_task_name"]
-            dependent_date = source_items.get("dependent_date", DependentDate.TODAY)
-            dependent_item = DependentItem(
-                project_name=project_name,
-                process_definition_name=process_definition_name,
-                dependent_task_name=dependent_task_name,
-                dependent_date=process_dependent_date(dependent_date),
-            )
-
-            return dependent_item
-
-        second_dependences = []
-        for first_group in task_params["groups"]:
-            second_op = first_group[KEY_OP]
-            dependence_items = []
-            for source_items in first_group["groups"]:
-                dependence_items.append(create_dependent_item(source_items))
-
-            second_dependences.append(get_op_cls(second_op)(*dependence_items))
-
-        first_op = task_params[KEY_OP]
-        dependence = get_op_cls(first_op)(*second_dependences)
-
-        task = Dependent(
-            name=task_params["name"],
-            dependence=dependence,
-        )
-        return task
-
-
-def create_process_definition(yaml_file):
-    """CLI."""
-    YamlProcess.parse(yaml_file)
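
A hedged, self-contained sketch of the YAML layout this parser expects, driven from Python so the example stays in one language; the workflow name, commands, and file path are illustrative only:

    import textwrap
    from pathlib import Path

    from pydolphinscheduler.core.yaml_process_define import create_process_definition

    Path("demo.yaml").write_text(textwrap.dedent(
        """\
        workflow:
          name: yaml-demo
        tasks:
          - task_type: Shell
            name: first
            command: echo first
          - task_type: Shell
            name: second
            command: echo second
            deps: [first]
        """
    ))
    create_process_definition("demo.yaml")  # parses the file, builds the workflow and submits it
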
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/default_config.yaml b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/default_config.yaml
deleted file mode 100644
index 98d7b99fdc..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/default_config.yaml
+++ /dev/null
@@ -1,58 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Setting about Java gateway server
-java_gateway:
-  # The address the Python gateway server starts on. Set its value to `0.0.0.0` if your Python API runs on a
-  # different host than the Python gateway server. It could also be a specific address like `127.0.0.1` or `localhost`
-  address: 127.0.0.1
-
-  # The port the Python gateway server starts on. It defines which port you connect to the Python gateway
-  # server from in the Python API models.
-  port: 25333
-
-  # Whether to automatically convert Python objects to Java objects. The default value is ``true``. There is some
-  # performance loss when set to ``true``, but for now pydolphinscheduler does not handle the conversion issue between
-  # Java and Python, so it is marked as a TODO item for the future.
-  auto_convert: true
-
-# Settings for dolphinscheduler default values; the values set below are used if a property is not set,
-# including ``user`` and ``workflow``
-default:
-  # Default value for dolphinscheduler's user object
-  user:
-    name: userPythonGateway
-    password: userPythonGateway
-    email: userPythonGateway@dolphinscheduler.com
-    tenant: tenant_pydolphin
-    phone: 11111111111
-    state: 1
-  # Default value for dolphinscheduler's workflow object
-  workflow:
-    project: project-pydolphin
-    tenant: tenant_pydolphin
-    user: userPythonGateway
-    queue: queuePythonGateway
-    worker_group: default
-    # Release state of the workflow. The default value is ``online``, which means the workflow is set online when
-    # it is submitted to the Java gateway; if you want to keep the workflow offline, set its value to ``offline``
-    release_state: online
-    time_zone: Asia/Shanghai
-    # Warning type of the workflow. The default value is ``NONE``, which means users are not warned in any case of
-    # workflow state; change it to ``FAILURE`` if you want to warn users when the workflow fails. All available enum
-    # values are ``NONE``, ``SUCCESS``, ``FAILURE``, ``ALL``
-    warning_type: NONE
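
These defaults surface in Python as module-level constants on pydolphinscheduler.configuration, which the ProcessDefinition constructor earlier in this diff uses as fallbacks; a hedged sketch, where the printed values depend on local overrides:

    from pydolphinscheduler import configuration

    # Values come from default_config.yaml unless overridden in a local configuration.
    print(configuration.WORKFLOW_PROJECT)       # e.g. "project-pydolphin"
    print(configuration.WORKFLOW_WORKER_GROUP)  # e.g. "default"
    print(configuration.WORKFLOW_WARNING_TYPE)  # e.g. "NONE"
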
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/__init__.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/__init__.py
deleted file mode 100644
index 37b2e5b61c..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/__init__.py
+++ /dev/null
@@ -1,18 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Init examples package which provides users with pydolphinscheduler examples."""
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/bulk_create_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/bulk_create_example.py
deleted file mode 100644
index 72bdb02243..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/bulk_create_example.py
+++ /dev/null
@@ -1,55 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""
-This example shows you how to create workflows in batch mode.
-
-After this example runs, we will create 10 workflows named `workflow:<workflow_num>`, each with 5 tasks
-named `task:<task_num>-workflow:<workflow_num>`. The task shape is as below
-
-task:1-workflow:1 -> task:2-workflow:1 -> ... -> task:5-workflow:1
-
-Each workflow is linear since we set `IS_CHAIN=True`; you could make the tasks run in parallel by setting it to `False`.
-"""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.shell import Shell
-
-NUM_WORKFLOWS = 10
-NUM_TASKS = 5
-# Make sure your tenant exists in your operating system
-TENANT = "exists_tenant"
-# Whether a task should depend on the previous one or not
-# False will create workflows with independent tasks, while with True each task will depend on its pre-task, with a
-# dependence link like `pre_task -> current_task -> next_task`; default is True
-IS_CHAIN = True
-
-for wf in range(0, NUM_WORKFLOWS):
-    workflow_name = f"workflow:{wf}"
-
-    with ProcessDefinition(name=workflow_name, tenant=TENANT) as pd:
-        for t in range(0, NUM_TASKS):
-            task_name = f"task:{t}-{workflow_name}"
-            command = f"echo This is task {task_name}"
-            task = Shell(name=task_name, command=command)
-
-            if IS_CHAIN and t > 0:
-                pre_task_name = f"task:{t-1}-{workflow_name}"
-                pd.get_one_task_by_name(pre_task_name) >> task
-
-        # We just submit workflow and task definition without set schedule time or run it manually
-        pd.submit()
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_condition_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_condition_example.py
deleted file mode 100644
index 2d73df4b40..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_condition_example.py
+++ /dev/null
@@ -1,59 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-r"""
-An example workflow for task condition.
-
-This example will create six tasks in a single workflow: five shell tasks and one condition task. The condition
-task has three upstream tasks whose results are combined through the parameter `condition`, and two downstream
-tasks, a success branch and a fail branch, set through the parameters `success_task` and `failed_task`. The graph
-of this workflow looks like:
-pre_task_1 ->                     -> success_branch
-             \                  /
-pre_task_2 ->  -> conditions ->
-             /                  \
-pre_task_3 ->                     -> fail_branch
-.
-"""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.condition import FAILURE, SUCCESS, And, Condition
-from pydolphinscheduler.tasks.shell import Shell
-
-with ProcessDefinition(name="task_condition_example", tenant="tenant_exists") as pd:
-    pre_task_1 = Shell(name="pre_task_1", command="echo pre_task_1")
-    pre_task_2 = Shell(name="pre_task_2", command="echo pre_task_2")
-    pre_task_3 = Shell(name="pre_task_3", command="echo pre_task_3")
-    cond_operator = And(
-        And(
-            SUCCESS(pre_task_1, pre_task_2),
-            FAILURE(pre_task_3),
-        ),
-    )
-
-    success_branch = Shell(name="success_branch", command="echo success_branch")
-    fail_branch = Shell(name="fail_branch", command="echo fail_branch")
-
-    condition = Condition(
-        name="condition",
-        condition=cond_operator,
-        success_task=success_branch,
-        failed_task=fail_branch,
-    )
-    pd.submit()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_datax_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_datax_example.py
deleted file mode 100644
index 94bd449cf7..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_datax_example.py
+++ /dev/null
@@ -1,95 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""
-An example workflow for task datax.
-
-This example will create a workflow named `task_datax_example`.
-`task_datax_example` is the workflow that defines and runs the task `task_datax`.
-You can create the data sources `first_mysql` and `second_mysql` through the UI.
-It creates a task that uses DataX to synchronize data from the source database to the target database.
-"""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.datax import CustomDataX, DataX
-
-# datax json template
-JSON_TEMPLATE = {
-    "job": {
-        "content": [
-            {
-                "reader": {
-                    "name": "mysqlreader",
-                    "parameter": {
-                        "username": "usr",
-                        "password": "pwd",
-                        "column": ["id", "name", "code", "description"],
-                        "splitPk": "id",
-                        "connection": [
-                            {
-                                "table": ["source_table"],
-                                "jdbcUrl": ["jdbc:mysql://127.0.0.1:3306/source_db"],
-                            }
-                        ],
-                    },
-                },
-                "writer": {
-                    "name": "mysqlwriter",
-                    "parameter": {
-                        "writeMode": "insert",
-                        "username": "usr",
-                        "password": "pwd",
-                        "column": ["id", "name"],
-                        "connection": [
-                            {
-                                "jdbcUrl": "jdbc:mysql://127.0.0.1:3306/target_db",
-                                "table": ["target_table"],
-                            }
-                        ],
-                    },
-                },
-            }
-        ],
-        "setting": {
-            "errorLimit": {"percentage": 0, "record": 0},
-            "speed": {"channel": 1, "record": 1000},
-        },
-    }
-}
-
-with ProcessDefinition(
-    name="task_datax_example",
-    tenant="tenant_exists",
-) as pd:
-    # This task synchronizes data from the source table of the `first_mysql` database
-    # to the target table of the `second_mysql` database.
-    # You have to make sure data sources named `first_mysql` and `second_mysql` exist
-    # in your environment.
-    task1 = DataX(
-        name="task_datax",
-        datasource_name="first_mysql",
-        datatarget_name="second_mysql",
-        sql="select id, name, code, description from source_table",
-        target_table="target_table",
-    )
-
-    # You can customize the json_template of DataX to sync data. This task creates a new
-    # DataX job similar to task1, transferring records from `first_mysql` to `second_mysql`
-    task2 = CustomDataX(name="task_custom_datax", json=str(JSON_TEMPLATE))
-    pd.run()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_dependent_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_dependent_example.py
deleted file mode 100644
index db53bcc9f3..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_dependent_example.py
+++ /dev/null
@@ -1,74 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-r"""
-An example workflow for task dependent.
-
-This example will create two workflows named `task_dependent_example` and `task_dependent_external`.
-`task_dependent_example` is the workflow that defines and runs the dependent task, while `task_dependent_external`
-defines the outside workflow and tasks that the dependent task relies on.
-
-After this script is submitted, we would get workflows as below:
-
-task_dependent_external:
-
-task_1
-task_2
-task_3
-
-task_dependent:
-
-task_dependent (this task depends on task_dependent_external.task_1 and task_dependent_external.task_2).
-"""
-from pydolphinscheduler import configuration
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.dependent import And, Dependent, DependentItem, Or
-from pydolphinscheduler.tasks.shell import Shell
-
-with ProcessDefinition(
-    name="task_dependent_external",
-    tenant="tenant_exists",
-) as pd:
-    task_1 = Shell(name="task_1", command="echo task 1")
-    task_2 = Shell(name="task_2", command="echo task 2")
-    task_3 = Shell(name="task_3", command="echo task 3")
-    pd.submit()
-
-with ProcessDefinition(
-    name="task_dependent_example",
-    tenant="tenant_exists",
-) as pd:
-    task = Dependent(
-        name="task_dependent",
-        dependence=And(
-            Or(
-                DependentItem(
-                    project_name=configuration.WORKFLOW_PROJECT,
-                    process_definition_name="task_dependent_external",
-                    dependent_task_name="task_1",
-                ),
-                DependentItem(
-                    project_name=configuration.WORKFLOW_PROJECT,
-                    process_definition_name="task_dependent_external",
-                    dependent_task_name="task_2",
-                ),
-            )
-        ),
-    )
-    pd.submit()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_dvc_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_dvc_example.py
deleted file mode 100644
index 2b93cd14b7..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_dvc_example.py
+++ /dev/null
@@ -1,52 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task dvc."""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks import DVCDownload, DVCInit, DVCUpload
-
-repository = "git@github.com:<YOUR-NAME-OR-ORG>/dvc-data-repository-example.git"
-
-with ProcessDefinition(
-    name="task_dvc_example",
-    tenant="tenant_exists",
-) as pd:
-    init_task = DVCInit(name="init_dvc", repository=repository, store_url="~/dvc_data")
-    upload_task = DVCUpload(
-        name="upload_data",
-        repository=repository,
-        data_path_in_dvc_repository="iris",
-        data_path_in_worker="~/source/iris",
-        version="v1",
-        message="upload iris data v1",
-    )
-
-    download_task = DVCDownload(
-        name="download_data",
-        repository=repository,
-        data_path_in_dvc_repository="iris",
-        data_path_in_worker="~/target/iris",
-        version="v1",
-    )
-
-    init_task >> upload_task >> download_task
-
-    pd.run()
-
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_flink_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_flink_example.py
deleted file mode 100644
index 1e8a040c65..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_flink_example.py
+++ /dev/null
@@ -1,33 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task flink."""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.flink import DeployMode, Flink, ProgramType
-
-with ProcessDefinition(name="task_flink_example", tenant="tenant_exists") as pd:
-    task = Flink(
-        name="task_flink",
-        main_class="org.apache.flink.streaming.examples.wordcount.WordCount",
-        main_package="WordCount.jar",
-        program_type=ProgramType.JAVA,
-        deploy_mode=DeployMode.LOCAL,
-    )
-    pd.run()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_map_reduce_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_map_reduce_example.py
deleted file mode 100644
index 39b204f82a..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_map_reduce_example.py
+++ /dev/null
@@ -1,34 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task mr."""
-
-from pydolphinscheduler.core.engine import ProgramType
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.map_reduce import MR
-
-with ProcessDefinition(name="task_map_reduce_example", tenant="tenant_exists") as pd:
-    task = MR(
-        name="task_mr",
-        main_class="wordcount",
-        main_package="hadoop-mapreduce-examples-3.3.1.jar",
-        program_type=ProgramType.JAVA,
-        main_args="/dolphinscheduler/tenant_exists/resources/file.txt /output/ds",
-    )
-    pd.run()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_mlflow_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_mlflow_example.py
deleted file mode 100644
index c2734bcf81..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_mlflow_example.py
+++ /dev/null
@@ -1,93 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task mlflow."""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.mlflow import (
-    MLflowDeployType,
-    MLflowModels,
-    MLFlowProjectsAutoML,
-    MLFlowProjectsBasicAlgorithm,
-    MLFlowProjectsCustom,
-)
-
-mlflow_tracking_uri = "http://127.0.0.1:5000"
-
-with ProcessDefinition(
-    name="task_mlflow_example",
-    tenant="tenant_exists",
-) as pd:
-
-    # run custom mlflow project to train model
-    train_custom = MLFlowProjectsCustom(
-        name="train_xgboost_native",
-        repository="https://github.com/mlflow/mlflow#examples/xgboost/xgboost_native",
-        mlflow_tracking_uri=mlflow_tracking_uri,
-        parameters="-P learning_rate=0.2 -P colsample_bytree=0.8 -P subsample=0.9",
-        experiment_name="xgboost",
-    )
-
-    # run automl to train model
-    train_automl = MLFlowProjectsAutoML(
-        name="train_automl",
-        mlflow_tracking_uri=mlflow_tracking_uri,
-        parameters="time_budget=30;estimator_list=['lgbm']",
-        experiment_name="automl_iris",
-        model_name="iris_A",
-        automl_tool="flaml",
-        data_path="/data/examples/iris",
-    )
-
-    # Using DOCKER to deploy model from train_automl
-    deploy_docker = MLflowModels(
-        name="deploy_docker",
-        model_uri="models:/iris_A/Production",
-        mlflow_tracking_uri=mlflow_tracking_uri,
-        deploy_mode=MLflowDeployType.DOCKER,
-        port=7002,
-    )
-
-    train_automl >> deploy_docker
-
-    # run lightgbm to train model
-    train_basic_algorithm = MLFlowProjectsBasicAlgorithm(
-        name="train_basic_algorithm",
-        mlflow_tracking_uri=mlflow_tracking_uri,
-        parameters="n_estimators=200;learning_rate=0.2",
-        experiment_name="basic_algorithm_iris",
-        model_name="iris_B",
-        algorithm="lightgbm",
-        data_path="/data/examples/iris",
-        search_params="max_depth=[5, 10];n_estimators=[100, 200]",
-    )
-
-    # Using MLFLOW to deploy model from training lightgbm project
-    deploy_mlflow = MLflowModels(
-        name="deploy_mlflow",
-        model_uri="models:/iris_B/Production",
-        mlflow_tracking_uri=mlflow_tracking_uri,
-        deploy_mode=MLflowDeployType.MLFLOW,
-        port=7001,
-    )
-
-    train_basic_algorithm >> deploy_mlflow
-
-    pd.submit()
-
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_openmldb_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_openmldb_example.py
deleted file mode 100644
index 5b90091ecf..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_openmldb_example.py
+++ /dev/null
@@ -1,43 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task openmldb."""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.openmldb import OpenMLDB
-
-sql = """USE demo_db;
-set @@job_timeout=200000;
-LOAD DATA INFILE 'file:///tmp/train_sample.csv'
-INTO TABLE talkingdata OPTIONS(mode='overwrite');
-"""
-
-with ProcessDefinition(
-    name="task_openmldb_example",
-    tenant="tenant_exists",
-) as pd:
-    task_openmldb = OpenMLDB(
-        name="task_openmldb",
-        zookeeper="127.0.0.1:2181",
-        zookeeper_path="/openmldb",
-        execute_mode="offline",
-        sql=sql,
-    )
-
-    pd.run()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_pytorch_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_pytorch_example.py
deleted file mode 100644
index 6559c9ac65..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_pytorch_example.py
+++ /dev/null
@@ -1,62 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task pytorch."""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.pytorch import Pytorch
-
-with ProcessDefinition(
-    name="task_pytorch_example",
-    tenant="tenant_exists",
-) as pd:
-
-    # run project with existing environment
-    task_existing_env = Pytorch(
-        name="task_existing_env",
-        script="main.py",
-        script_params="--dry-run --no-cuda",
-        project_path="https://github.com/pytorch/examples#mnist",
-        python_command="/home/anaconda3/envs/pytorch/bin/python3",
-    )
-
-    # run project with creating conda environment
-    task_conda_env = Pytorch(
-        name="task_conda_env",
-        script="main.py",
-        script_params="--dry-run --no-cuda",
-        project_path="https://github.com/pytorch/examples#mnist",
-        is_create_environment=True,
-        python_env_tool="conda",
-        requirements="requirements.txt",
-        conda_python_version="3.7",
-    )
-
-    # run project with creating virtualenv environment
-    task_virtualenv_env = Pytorch(
-        name="task_virtualenv_env",
-        script="main.py",
-        script_params="--dry-run --no-cuda",
-        project_path="https://github.com/pytorch/examples#mnist",
-        is_create_environment=True,
-        python_env_tool="virtualenv",
-        requirements="requirements.txt",
-    )
-
-    pd.submit()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_sagemaker_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_sagemaker_example.py
deleted file mode 100644
index b056f61a63..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_sagemaker_example.py
+++ /dev/null
@@ -1,46 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task sagemaker."""
-import json
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.sagemaker import SageMaker
-
-sagemaker_request_data = {
-    "ParallelismConfiguration": {"MaxParallelExecutionSteps": 1},
-    "PipelineExecutionDescription": "test Pipeline",
-    "PipelineExecutionDisplayName": "AbalonePipeline",
-    "PipelineName": "AbalonePipeline",
-    "PipelineParameters": [
-        {"Name": "ProcessingInstanceType", "Value": "ml.m4.xlarge"},
-        {"Name": "ProcessingInstanceCount", "Value": "2"},
-    ],
-}
-
-with ProcessDefinition(
-    name="task_sagemaker_example",
-    tenant="tenant_exists",
-) as pd:
-    task_sagemaker = SageMaker(
-        name="task_sagemaker",
-        sagemaker_request_json=json.dumps(sagemaker_request_data, indent=2),
-    )
-
-    pd.run()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_spark_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_spark_example.py
deleted file mode 100644
index 594d95f55a..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_spark_example.py
+++ /dev/null
@@ -1,33 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-"""A example workflow for task spark."""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.spark import DeployMode, ProgramType, Spark
-
-with ProcessDefinition(name="task_spark_example", tenant="tenant_exists") as pd:
-    task = Spark(
-        name="task_spark",
-        main_class="org.apache.spark.examples.SparkPi",
-        main_package="spark-examples_2.12-3.2.0.jar",
-        program_type=ProgramType.JAVA,
-        deploy_mode=DeployMode.LOCAL,
-    )
-    pd.run()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_switch_example.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_switch_example.py
deleted file mode 100644
index 7966af320e..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/task_switch_example.py
+++ /dev/null
@@ -1,51 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# [start workflow_declare]
-r"""
-An example workflow for task switch.
-
-This example will create four tasks in a single workflow: three shell tasks and one switch task. The switch task
-has one upstream task which we declare explicitly with the syntax `parent >> switch`, and two downstream tasks whose
-dependence is automatically set by the switch task via the parameter `condition`. The graph of this workflow looks like:
-                      --> switch_child_1
-                    /
-parent -> switch ->
-                    \
-                      --> switch_child_2
-.
-"""
-
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-from pydolphinscheduler.tasks.shell import Shell
-from pydolphinscheduler.tasks.switch import Branch, Default, Switch, SwitchCondition
-
-with ProcessDefinition(
-    name="task_switch_example", tenant="tenant_exists", param={"var": "1"}
-) as pd:
-    parent = Shell(name="parent", command="echo parent")
-    switch_child_1 = Shell(name="switch_child_1", command="echo switch_child_1")
-    switch_child_2 = Shell(name="switch_child_2", command="echo switch_child_2")
-    switch_condition = SwitchCondition(
-        Branch(condition="${var} > 1", task=switch_child_1),
-        Default(task=switch_child_2),
-    )
-
-    switch = Switch(name="switch", condition=switch_condition)
-    parent >> switch
-    pd.submit()
-# [end workflow_declare]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial.py
deleted file mode 100644
index 0478e68519..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial.py
+++ /dev/null
@@ -1,68 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-r"""
-A tutorial example that takes you through pydolphinscheduler.
-
-After the tutorial.py file is submitted to the Apache DolphinScheduler server, a DAG will be created,
-and the workflow DAG graph is as below:
-
-                  --> task_child_one
-                /                    \
-task_parent -->                        -->  task_union
-                \                    /
-                  --> task_child_two
-
-It will instantiate and run all the tasks it has.
-"""
-
-# [start tutorial]
-# [start package_import]
-# Import ProcessDefinition object to define your workflow attributes
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-
-# Import the task Shell object because we will create some shell tasks later
-from pydolphinscheduler.tasks.shell import Shell
-
-# [end package_import]
-
-# [start workflow_declare]
-with ProcessDefinition(
-    name="tutorial",
-    schedule="0 0 0 * * ? *",
-    start_time="2021-01-01",
-    tenant="tenant_exists",
-) as pd:
-    # [end workflow_declare]
-    # [start task_declare]
-    task_parent = Shell(name="task_parent", command="echo hello pydolphinscheduler")
-    task_child_one = Shell(name="task_child_one", command="echo 'child one'")
-    task_child_two = Shell(name="task_child_two", command="echo 'child two'")
-    task_union = Shell(name="task_union", command="echo union")
-    # [end task_declare]
-
-    # [start task_relation_declare]
-    task_group = [task_child_one, task_child_two]
-    task_parent.set_downstream(task_group)
-
-    task_union << task_group
-    # [end task_relation_declare]
-
-    # [start submit_or_run]
-    pd.run()
-    # [end submit_or_run]
-# [end tutorial]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial_decorator.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial_decorator.py
deleted file mode 100644
index 986c1bbb6e..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial_decorator.py
+++ /dev/null
@@ -1,91 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-r"""
-A tutorial example that takes you through pydolphinscheduler.
-
-After this file is submitted to the Apache DolphinScheduler server, a DAG will be created,
-and the workflow DAG graph is as below:
-
-                  --> task_child_one
-                /                    \
-task_parent -->                        -->  task_union
-                \                    /
-                  --> task_child_two
-
-It will instantiate and run all the tasks it has.
-"""
-
-# [start tutorial]
-# [start package_import]
-# Import ProcessDefinition object to define your workflow attributes
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-
-# Import task Shell object cause we would create some shell tasks later
-from pydolphinscheduler.tasks.func_wrap import task
-
-# [end package_import]
-
-
-# [start task_declare]
-@task
-def task_parent():
-    """First task in this workflow."""
-    print("echo hello pydolphinscheduler")
-
-
-@task
-def task_child_one():
-    """Child task will be run parallel after task ``task_parent`` finished."""
-    print("echo 'child one'")
-
-
-@task
-def task_child_two():
-    """Child task will be run parallel after task ``task_parent`` finished."""
-    print("echo 'child two'")
-
-
-@task
-def task_union():
-    """Last task in this workflow."""
-    print("echo union")
-
-
-# [end task_declare]
-
-
-# [start workflow_declare]
-with ProcessDefinition(
-    name="tutorial_decorator",
-    schedule="0 0 0 * * ? *",
-    start_time="2021-01-01",
-    tenant="tenant_exists",
-) as pd:
-    # [end workflow_declare]
-
-    # [start task_relation_declare]
-    task_group = [task_child_one(), task_child_two()]
-    task_parent().set_downstream(task_group)
-
-    task_union() << task_group
-    # [end task_relation_declare]
-
-    # [start submit_or_run]
-    pd.run()
-    # [end submit_or_run]
-# [end tutorial]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial_resource_plugin.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial_resource_plugin.py
deleted file mode 100644
index 5b02022ee9..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/examples/tutorial_resource_plugin.py
+++ /dev/null
@@ -1,64 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-r"""
-A tutorial example that takes you through the pydolphinscheduler resource plugin.
-
-Resource plugins can be defined in workflows and tasks.
-
-It will instantiate and run all the tasks it has.
-"""
-import os
-from pathlib import Path
-
-# [start tutorial_resource_plugin]
-# [start package_import]
-# Import ProcessDefinition object to define your workflow attributes
-from pydolphinscheduler.core.process_definition import ProcessDefinition
-
-# Import the Local resource plugin and the task Shell object because we will create a shell task later
-from pydolphinscheduler.resources_plugin.local import Local
-from pydolphinscheduler.tasks.shell import Shell
-
-# [end package_import]
-
-# [start workflow_declare]
-with ProcessDefinition(
-    name="tutorial_resource_plugin",
-    schedule="0 0 0 * * ? *",
-    start_time="2021-01-01",
-    tenant="tenant_exists",
-    resource_plugin=Local("/tmp"),
-) as process_definition:
-    # [end workflow_declare]
-    # [start task_declare]
-    file = "resource.sh"
-    path = Path("/tmp").joinpath(file)
-    with open(str(path), "w") as f:
-        f.write("echo tutorial resource plugin")
-    task_parent = Shell(
-        name="local-resource-example",
-        command=file,
-    )
-    print(task_parent.task_params)
-    os.remove(path)
-    # [end task_declare]
-
-    # [start submit_or_run]
-    process_definition.run()
-    # [end submit_or_run]
-# [end tutorial_resource_plugin]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/exceptions.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/exceptions.py
deleted file mode 100644
index 5b0d1bb61f..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/exceptions.py
+++ /dev/null
@@ -1,46 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Exceptions for pydolphinscheduler."""
-
-
-class PyDSBaseException(Exception):
-    """Base exception for pydolphinscheduler."""
-
-
-class PyDSParamException(PyDSBaseException):
-    """Exception for pydolphinscheduler parameter verify error."""
-
-
-class PyDSTaskNoFoundException(PyDSBaseException):
-    """Exception for pydolphinscheduler workflow task no found error."""
-
-
-class PyDSJavaGatewayException(PyDSBaseException):
-    """Exception for pydolphinscheduler Java gateway error."""
-
-
-class PyDSProcessDefinitionNotAssignException(PyDSBaseException):
-    """Exception for pydolphinscheduler process definition not assign error."""
-
-
-class PyDSConfException(PyDSBaseException):
-    """Exception for pydolphinscheduler configuration error."""
-
-
-class PyResPluginException(PyDSBaseException):
-    """Exception for pydolphinscheduler resource plugin error."""
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/java_gateway.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/java_gateway.py
deleted file mode 100644
index 54bb0a38b2..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/java_gateway.py
+++ /dev/null
@@ -1,308 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Module java gateway, contain gateway behavior."""
-
-import contextlib
-from logging import getLogger
-from typing import Any, Optional
-
-from py4j.java_collections import JavaMap
-from py4j.java_gateway import GatewayParameters, JavaGateway
-from py4j.protocol import Py4JError
-
-from pydolphinscheduler import __version__, configuration
-from pydolphinscheduler.constants import JavaGatewayDefault
-from pydolphinscheduler.exceptions import PyDSJavaGatewayException
-
-logger = getLogger(__name__)
-
-
-def launch_gateway(
-    address: Optional[str] = None,
-    port: Optional[int] = None,
-    auto_convert: Optional[bool] = True,
-) -> JavaGateway:
-    """Launch java gateway to pydolphinscheduler.
-
-    TODO Note that automatic conversion makes calling Java methods slightly less efficient because
-    in the worst case, Py4J needs to go through all registered converters for all parameters.
-    This is why automatic conversion is disabled by default.
-    """
-    gateway_parameters = GatewayParameters(
-        address=address or configuration.JAVA_GATEWAY_ADDRESS,
-        port=port or configuration.JAVA_GATEWAY_PORT,
-        auto_convert=auto_convert or configuration.JAVA_GATEWAY_AUTO_CONVERT,
-    )
-    gateway = JavaGateway(gateway_parameters=gateway_parameters)
-    return gateway
-
-
-def gateway_result_checker(
-    result: JavaMap,
-    msg_check: Optional[str] = JavaGatewayDefault.RESULT_MESSAGE_SUCCESS,
-) -> Any:
-    """Check weather java gateway result success or not."""
-    if (
-        result[JavaGatewayDefault.RESULT_STATUS_KEYWORD].toString()
-        != JavaGatewayDefault.RESULT_STATUS_SUCCESS
-    ):
-        raise PyDSJavaGatewayException("Failed when try to got result for java gateway")
-    if (
-        msg_check is not None
-        and result[JavaGatewayDefault.RESULT_MESSAGE_KEYWORD] != msg_check
-    ):
-        raise PyDSJavaGatewayException("Get result state not success.")
-    return result
-
-
-class JavaGate:
-    """Launch java gateway to pydolphin scheduler."""
-
-    def __init__(
-        self,
-        address: Optional[str] = None,
-        port: Optional[int] = None,
-        auto_convert: Optional[bool] = True,
-    ):
-        self.java_gateway = launch_gateway(address, port, auto_convert)
-        gateway_version = "unknown"
-        with contextlib.suppress(Py4JError):
-            # 1. Java gateway version is too old: doesn't have method 'getGatewayVersion()'
-            # 2. Error connecting to Java gateway
-            gateway_version = self.get_gateway_version()
-        if gateway_version != __version__:
-            logger.warning(
-                f"Using unmatched version of pydolphinscheduler (version {__version__}) "
-                f"and Java gateway (version {gateway_version}) may cause errors. "
-                "We strongly recommend you to find the matched version "
-                "(check: https://pypi.org/project/apache-dolphinscheduler)"
-            )
-
-    def get_gateway_version(self):
-        """Get the java gateway version, expected to be equal with pydolphinscheduler."""
-        return self.java_gateway.entry_point.getGatewayVersion()
-
-    def get_datasource_info(self, name: str):
-        """Get datasource info through java gateway."""
-        return self.java_gateway.entry_point.getDatasourceInfo(name)
-
-    def get_resources_file_info(self, program_type: str, main_package: str):
-        """Get resources file info through java gateway."""
-        return self.java_gateway.entry_point.getResourcesFileInfo(
-            program_type, main_package
-        )
-
-    def create_or_update_resource(
-        self, user_name: str, name: str, content: str, description: Optional[str] = None
-    ):
-        """Create or update resource through java gateway."""
-        return self.java_gateway.entry_point.createOrUpdateResource(
-            user_name, name, description, content
-        )
-
-    def query_resources_file_info(self, user_name: str, name: str):
-        """Get resources file info through java gateway."""
-        return self.java_gateway.entry_point.queryResourcesFileInfo(user_name, name)
-
-    def query_environment_info(self, name: str):
-        """Get environment info through java gateway."""
-        return self.java_gateway.entry_point.getEnvironmentInfo(name)
-
-    def get_code_and_version(
-        self, project_name: str, process_definition_name: str, task_name: str
-    ):
-        """Get code and version through java gateway."""
-        return self.java_gateway.entry_point.getCodeAndVersion(
-            project_name, process_definition_name, task_name
-        )
-
-    def create_or_grant_project(
-        self, user: str, name: str, description: Optional[str] = None
-    ):
-        """Create or grant project through java gateway."""
-        return self.java_gateway.entry_point.createOrGrantProject(
-            user, name, description
-        )
-
-    def query_project_by_name(self, user: str, name: str):
-        """Query project through java gateway."""
-        return self.java_gateway.entry_point.queryProjectByName(user, name)
-
-    def update_project(
-        self, user: str, project_code: int, project_name: str, description: str
-    ):
-        """Update project through java gateway."""
-        return self.java_gateway.entry_point.updateProject(
-            user, project_code, project_name, description
-        )
-
-    def delete_project(self, user: str, code: int):
-        """Delete project through java gateway."""
-        return self.java_gateway.entry_point.deleteProject(user, code)
-
-    def create_tenant(
-        self, tenant_name: str, queue_name: str, description: Optional[str] = None
-    ):
-        """Create tenant through java gateway."""
-        return self.java_gateway.entry_point.createTenant(
-            tenant_name, description, queue_name
-        )
-
-    def query_tenant(self, tenant_code: str):
-        """Query tenant through java gateway."""
-        return self.java_gateway.entry_point.queryTenantByCode(tenant_code)
-
-    def grant_tenant_to_user(self, user_name: str, tenant_code: str):
-        """Grant tenant to user through java gateway."""
-        return self.java_gateway.entry_point.grantTenantToUser(user_name, tenant_code)
-
-    def update_tenant(
-        self,
-        user: str,
-        tenant_id: int,
-        code: str,
-        queue_id: int,
-        description: Optional[str] = None,
-    ):
-        """Update tenant through java gateway."""
-        return self.java_gateway.entry_point.updateTenant(
-            user, tenant_id, code, queue_id, description
-        )
-
-    def delete_tenant(self, user: str, tenant_id: int):
-        """Delete tenant through java gateway."""
-        return self.java_gateway.entry_point.deleteTenantById(user, tenant_id)
-
-    def create_user(
-        self,
-        name: str,
-        password: str,
-        email: str,
-        phone: str,
-        tenant: str,
-        queue: str,
-        status: int,
-    ):
-        """Create user through java gateway."""
-        return self.java_gateway.entry_point.createUser(
-            name, password, email, phone, tenant, queue, status
-        )
-
-    def query_user(self, user_id: int):
-        """Query user through java gateway."""
-        return self.java_gateway.entry_point.queryUser(user_id)
-
-    def update_user(
-        self,
-        name: str,
-        password: str,
-        email: str,
-        phone: str,
-        tenant: str,
-        queue: str,
-        status: int,
-    ):
-        """Update user through java gateway."""
-        return self.java_gateway.entry_point.updateUser(
-            name, password, email, phone, tenant, queue, status
-        )
-
-    def delete_user(self, name: str, user_id: int):
-        """Delete user through java gateway."""
-        return self.java_gateway.entry_point.deleteUser(name, user_id)
-
-    def get_dependent_info(
-        self,
-        project_name: str,
-        process_definition_name: str,
-        task_name: Optional[str] = None,
-    ):
-        """Get dependent info through java gateway."""
-        return self.java_gateway.entry_point.getDependentInfo(
-            project_name, process_definition_name, task_name
-        )
-
-    def get_process_definition_info(
-        self, user_name: str, project_name: str, process_definition_name: str
-    ):
-        """Get process definition info through java gateway."""
-        return self.java_gateway.entry_point.getProcessDefinitionInfo(
-            user_name, project_name, process_definition_name
-        )
-
-    def create_or_update_process_definition(
-        self,
-        user_name: str,
-        project_name: str,
-        name: str,
-        description: str,
-        global_params: str,
-        warning_type: str,
-        warning_group_id: int,
-        timeout: int,
-        worker_group: str,
-        tenant_code: str,
-        release_state: int,
-        task_relation_json: str,
-        task_definition_json: str,
-        schedule: Optional[str] = None,
-        other_params_json: Optional[str] = None,
-        execution_type: Optional[str] = None,
-    ):
-        """Create or update process definition through java gateway."""
-        return self.java_gateway.entry_point.createOrUpdateProcessDefinition(
-            user_name,
-            project_name,
-            name,
-            description,
-            global_params,
-            schedule,
-            warning_type,
-            warning_group_id,
-            timeout,
-            worker_group,
-            tenant_code,
-            release_state,
-            task_relation_json,
-            task_definition_json,
-            other_params_json,
-            execution_type,
-        )
-
-    def exec_process_instance(
-        self,
-        user_name: str,
-        project_name: str,
-        process_definition_name: str,
-        cron_time: str,
-        worker_group: str,
-        warning_type: str,
-        warning_group_id: int,
-        timeout: int,
-    ):
-        """Exec process instance through java gateway."""
-        return self.java_gateway.entry_point.execProcessInstance(
-            user_name,
-            project_name,
-            process_definition_name,
-            cron_time,
-            worker_group,
-            warning_type,
-            warning_group_id,
-            timeout,
-        )
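``JavaGate`` above is the single wrapper the Python API uses to talk to the DolphinScheduler backend over py4j. A minimal usage sketch, assuming a gateway is reachable with the defaults from the configuration file earlier in this diff (the user and project names below are simply those defaults):

    from pydolphinscheduler.java_gateway import JavaGate

    # Connecting warns if the gateway version does not match the installed
    # pydolphinscheduler version, then exposes the wrapper methods.
    gate = JavaGate()

    # Query an existing project for a given user via the gateway entry point.
    project = gate.query_project_by_name(user="userPythonGateway", name="project-pydolphin")
    print(project)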
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/__init__.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/__init__.py
deleted file mode 100644
index b289954caa..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/__init__.py
+++ /dev/null
@@ -1,36 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Init Models package, keeping object related to DolphinScheduler covert from Java Gateway Service."""
-
-from pydolphinscheduler.models.base import Base
-from pydolphinscheduler.models.base_side import BaseSide
-from pydolphinscheduler.models.project import Project
-from pydolphinscheduler.models.queue import Queue
-from pydolphinscheduler.models.tenant import Tenant
-from pydolphinscheduler.models.user import User
-from pydolphinscheduler.models.worker_group import WorkerGroup
-
-__all__ = [
-    "Base",
-    "BaseSide",
-    "Project",
-    "Tenant",
-    "User",
-    "Queue",
-    "WorkerGroup",
-]
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/base.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/base.py
deleted file mode 100644
index 2647714af0..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/base.py
+++ /dev/null
@@ -1,74 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""DolphinScheduler Base object."""
-
-from typing import Dict, Optional
-
-# from pydolphinscheduler.models.user import User
-from pydolphinscheduler.utils.string import attr2camel
-
-
-class Base:
-    """DolphinScheduler Base object."""
-
-    # Object key attributes, used to test object equality and the like.
-    _KEY_ATTR: set = {"name", "description"}
-
-    # Object definition attributes, used when the object communicates with the Java gateway server.
-    _DEFINE_ATTR: set = set()
-
-    # Object default attributes; their values are merged into the definition when not already present.
-    _DEFAULT_ATTR: Dict = {}
-
-    def __init__(self, name: str, description: Optional[str] = None):
-        self.name = name
-        self.description = description
-
-    def __repr__(self) -> str:
-        return f'<{type(self).__name__}: name="{self.name}">'
-
-    def __eq__(self, other):
-        return type(self) == type(other) and all(
-            getattr(self, a, None) == getattr(other, a, None) for a in self._KEY_ATTR
-        )
-
-    def get_define_custom(
-        self, camel_attr: bool = True, custom_attr: set = None
-    ) -> Dict:
-        """Get object definition attribute by given attr set."""
-        content = {}
-        for attr in custom_attr:
-            val = getattr(self, attr, None)
-            if camel_attr:
-                content[attr2camel(attr)] = val
-            else:
-                content[attr] = val
-        return content
-
-    def get_define(self, camel_attr: bool = True) -> Dict:
-        """Get object definition attribute communicate to Java gateway server.
-
-        Use attribute `self._DEFINE_ATTR` to determine which attributes should be included when
-        the object communicates with the Java gateway server.
-        """
-        content = self.get_define_custom(camel_attr, self._DEFINE_ATTR)
-        update_default = {
-            k: self._DEFAULT_ATTR.get(k) for k in self._DEFAULT_ATTR if k not in content
-        }
-        content.update(update_default)
-        return content
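
To make the `Base` contract above concrete, here is a small hypothetical sketch (the `Demo` subclass and its values are invented for illustration only):

    from pydolphinscheduler.models.base import Base

    class Demo(Base):
        # Hypothetical subclass: expose two attributes and one default value.
        _DEFINE_ATTR = {"name", "description"}
        _DEFAULT_ATTR = {"flag": "YES"}

    demo = Demo(name="demo", description="an example")
    # Declared attribute names are camel-cased by attr2camel (a no-op for the
    # single-word names here); defaults are merged in when absent, giving
    # {"name": "demo", "description": "an example", "flag": "YES"}
    print(demo.get_define())
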
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/base_side.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/base_side.py
deleted file mode 100644
index 99b4007a85..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/base_side.py
+++ /dev/null
@@ -1,48 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""Module for models object."""
-
-from typing import Optional
-
-from pydolphinscheduler import configuration
-from pydolphinscheduler.models import Base
-
-
-class BaseSide(Base):
-    """Base class for models object, it declare base behavior for them."""
-
-    def __init__(self, name: str, description: Optional[str] = None):
-        super().__init__(name, description)
-
-    @classmethod
-    def create_if_not_exists(
-        cls,
-        # TODO: keep the parameter type below commented out to avoid a cyclic import
-        # user: Optional[User] = ProcessDefinitionDefault.USER
-        user=configuration.WORKFLOW_USER,
-    ):
-        """Create Base if not exists."""
-        raise NotImplementedError
-
-    def delete_all(self):
-        """Delete all method."""
-        if not self:
-            return
-        list_pro = [key for key in self.__dict__.keys()]
-        for key in list_pro:
-            self.__delattr__(key)
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/project.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/project.py
deleted file mode 100644
index 678332ba3b..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/project.py
+++ /dev/null
@@ -1,72 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-"""DolphinScheduler Project object."""
-
-from typing import Optional
-
-from pydolphinscheduler import configuration
-from pydolphinscheduler.java_gateway import JavaGate
-from pydolphinscheduler.models import BaseSide
-
-
-class Project(BaseSide):
-    """DolphinScheduler Project object."""
-
-    def __init__(
-        self,
-        name: str = configuration.WORKFLOW_PROJECT,
-        description: Optional[str] = None,
-        code: Optional[int] = None,
-    ):
-        super().__init__(name, description)
-        self.code = code
-
-    def create_if_not_exists(self, user=configuration.USER_NAME) -> None:
-        """Create Project if not exists."""
-        JavaGate().create_or_grant_project(user, self.name, self.description)
-        # TODO recover result checker
-        # gateway_result_checker(result, None)
-
-    @classmethod
-    def get_project_by_name(cls, user=configuration.USER_NAME, name=None) -> "Project":
-        """Get Project by name."""
-        project = JavaGate().query_project_by_name(user, name)
-        if project is None:
-            return cls()
-        return cls(
-            name=project.getName(),
-            description=project.getDescription(),
-            code=project.getCode(),
-        )
-
-    def update(
-        self,
-        user=configuration.USER_NAME,
-        project_code=None,
-        project_name=None,
-        description=None,
-    ) -> None:
-        """Update Project."""
-        JavaGate().update_project(user, project_code, project_name, description)
-        self.name = project_name
-        self.description = description
-
-    def delete(self, user=configuration.USER_NAME) -> None:
-        """Delete Project."""
-        JavaGate().delete_project(user, self.code)
-        self.delete_all()
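
The removed `Project` model ties these gateway calls together. A minimal usage sketch, assuming a reachable Python gateway and placeholder user and project names, might be:

    from pydolphinscheduler.models import Project

    # Placeholder names; requires a running DolphinScheduler API server
    # with the Python gateway enabled.
    project = Project(name="project-pydolphin", description="demo project")
    project.create_if_not_exists(user="admin")

    # Fetch it back by name; a default Project() is returned if the lookup
    # yields nothing on the server side.
    fetched = Project.get_project_by_name(user="admin", name="project-pydolphin")
    print(fetched.code)
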
diff --git a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/queue.py b/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/queue.py
deleted file mode 100644
index e6da2594c8..0000000000
--- a/dolphinscheduler-python/pydolphinscheduler/src/pydolphinscheduler/models/queue.py
+++ /dev/null
@@ -1,34 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
... 12448 lines suppressed ...