Posted to commits@airflow.apache.org by po...@apache.org on 2021/04/05 21:47:59 UTC

[airflow] branch v2-0-test updated (663985d -> 5f5a914)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


    from 663985d  Fix bug in airflow.stats timing that broke dogstatsd mode (#15132)
     new 530abe5  The PYTHON_MAJOR_MINOR build arg has been deprecated (#15054)
     new d7c45f3  The --force-pull-images is restored in breeze (#15063)
     new 17b89f3  Parallelize build of documentation. (#15062)
     new 7a5c26c  Add timeout to test jobs to prevent hanging docker containers (#15078)
     new f840d16  Better handling of docker command (#15080)
     new 6e17675  Mark the test_scheduler_task_start_date as quarantined (#15086)
     new 30e5584  Fixes failing docs upload on master (#15148)
     new 321237a  Increase timeout for building the docs (#15157)
     new 3c24a31  Merges prepare/test provider packages into two jobs (#15152)
     new 657e707  Finish quarantine for test_should_force_kill_process (#15081)
     new 65c3ecf  Adds Blinker dependency which is missing after recent changes (#15182)
     new 3e9633e  Bump K8S versions to latest supported ones. (#15156)
     new 25caba7  Fixes problem when Pull Request is `weird` - has null head_repo (#15189)
     new 3d17216  Removes unused CI feature of printing output on error (#15190)
     new e87cd1f  Merges quarantined tests into single job (#15153)
     new 5f5a914  Updates 3.6 limits for latest versions of a few libraries (#15209)

The 16 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/actions/cancel-workflow-runs               |   2 +-
 .github/workflows/ci.yml                           |  89 +++--
 .gitignore                                         |   5 +
 BREEZE.rst                                         |   8 +-
 IMAGES.rst                                         |  15 +-
 README.md                                          |   2 +-
 airflow/models/baseoperator.py                     |   2 +-
 .../providers/google/cloud/operators/dataflow.py   |   6 +-
 breeze                                             |   2 +-
 breeze-complete                                    |   6 +-
 .../enter_breeze_provider_package_tests.sh         |   2 +-
 docs/apache-airflow/installation.rst               |   2 +-
 docs/build_docs.py                                 | 417 ++++++++++++++++++---
 docs/exts/docs_build/code_utils.py                 |  32 ++
 docs/exts/docs_build/docs_builder.py               | 317 ++++++++++++----
 docs/exts/docs_build/errors.py                     |  39 +-
 docs/exts/docs_build/github_action_utils.py        |   1 +
 docs/exts/docs_build/spelling_checks.py            |  47 ++-
 provider_packages/README.rst                       |  53 +++
 scripts/ci/docs/ci_docs.sh                         |  15 +-
 scripts/ci/kubernetes/kind-cluster-conf.yaml       |  15 +-
 scripts/ci/libraries/_all_libs.sh                  |   2 +
 scripts/ci/libraries/_build_images.sh              |  45 ++-
 scripts/ci/libraries/_docker_engine_resources.sh   |  12 +-
 scripts/ci/libraries/_initialization.sh            |   9 +-
 scripts/ci/libraries/_kind.sh                      |   2 +-
 scripts/ci/libraries/_parallel.sh                  |  35 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |  28 +-
 scripts/ci/libraries/_runs.sh                      |  10 +-
 scripts/ci/libraries/_start_end.sh                 |   2 +-
 scripts/ci/libraries/_testing.sh                   | 116 ++++++
 scripts/ci/libraries/_verbosity.sh                 |   6 +-
 scripts/ci/libraries/_verify_image.sh              |   8 +-
 .../ci_install_and_test_provider_packages.sh       |   2 +-
 scripts/ci/static_checks/bats_tests.sh             |   2 +-
 scripts/ci/static_checks/check_license.sh          |   2 +-
 scripts/ci/static_checks/flake8.sh                 |   4 +-
 .../ci/static_checks/in_container_bats_tests.sh    |   4 +-
 scripts/ci/static_checks/lint_dockerfile.sh        |   4 +-
 scripts/ci/static_checks/mypy.sh                   |   2 +-
 scripts/ci/static_checks/pylint.sh                 |   4 +-
 scripts/ci/static_checks/refresh_pylint_todo.sh    |   2 +-
 scripts/ci/testing/ci_run_airflow_testing.sh       | 143 +------
 scripts/ci/testing/ci_run_quarantined_tests.sh     |  87 +++++
 .../ci_run_single_airflow_test_in_docker.sh        |   6 +-
 scripts/ci/tools/ci_clear_tmp.sh                   |   2 +-
 scripts/ci/tools/ci_fix_ownership.sh               |   2 +-
 scripts/ci/tools/ci_free_space_on_ci.sh            |   2 +-
 scripts/in_container/_in_container_utils.sh        |  39 +-
 .../in_container/run_anything.sh                   |   2 +
 scripts/in_container/run_fix_ownership.sh          |   4 +-
 setup.cfg                                          |  10 +-
 setup.py                                           |  11 +-
 tests/jobs/test_scheduler_job.py                   |   1 +
 tests/utils/test_process_utils.py                  |  12 +-
 55 files changed, 1200 insertions(+), 499 deletions(-)
 create mode 100644 provider_packages/README.rst
 create mode 100644 scripts/ci/libraries/_testing.sh
 create mode 100755 scripts/ci/testing/ci_run_quarantined_tests.sh
 copy airflow/api_connexion/__init__.py => scripts/in_container/run_anything.sh (96%)
 mode change 100644 => 100755

[airflow] 09/16: Merges prepare/test provider packages into two jobs (#15152)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3c24a314128935d7b2b4ec821ee2c8efc8fbd016
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Apr 3 12:48:54 2021 +0200

    Merges prepare/test provider packages into two jobs (#15152)
    
    The 'wheel' package installation tests all options
    comprehensively - including preparing documentation
    and installing on Airflow 2.0.
    
    The 'sdist' package installation takes longer (because
    the packages are converted to wheels on-the-fly by
    pip), so only basic installation is tested (the rest
    is the same as in the case of wheel packages).
    
    (cherry picked from commit 2e8aa0d1094a661674456dfefda2e1c8e8b134d4)
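
A rough illustration of the difference (the file names below are hypothetical
examples, not the exact artifacts produced by the CI scripts):

    # Wheels install directly, so the wheel job can afford the fuller set of checks:
    pip install ./dist/apache_airflow_providers_http-1.1.1-py3-none-any.whl

    # An sdist has to be built into a wheel by pip first, which is slower,
    # so the sdist job only runs the basic installation test:
    pip install ./dist/apache-airflow-providers-http-1.1.1.tar.gz
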
---
 .github/workflows/ci.yml | 64 +++++++++++++++++++++++++-----------------------
 1 file changed, 34 insertions(+), 30 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index d96eeb5..ddc985b 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -502,23 +502,18 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
           github.event_name == 'push'
         run: aws s3 sync --delete ./docs/_build s3://apache-airflow-docs
 
-  prepare-provider-packages:
+  prepare-test-provider-packages-wheel:
     timeout-minutes: 40
-    name: "Provider packages ${{ matrix.package-format }}"
+    name: "Build and test provider packages wheel"
     runs-on: ${{ fromJson(needs.build-info.outputs.runsOn) }}
     needs: [build-info, ci-images]
     env:
       RUNS_ON: ${{ fromJson(needs.build-info.outputs.runsOn) }}
-      INSTALL_AIRFLOW_VERSION: "${{ matrix.package-format }}"
       AIRFLOW_EXTRAS: "all"
       PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}}
       VERSION_SUFFIX_FOR_PYPI: "dev"
       VERSION_SUFFIX_FOR_SVN: "dev"
-      PACKAGE_FORMAT: ${{ matrix.package-format }}
       GITHUB_REGISTRY: ${{ needs.ci-images.outputs.githubRegistry }}
-    strategy:
-      matrix:
-        package-format: ['wheel', 'sdist']
     if: needs.build-info.outputs.image-build == 'true' && needs.build-info.outputs.default-branch == 'master'
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
@@ -536,43 +531,44 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
         run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
       - name: "Prepare provider documentation"
         run: ./scripts/ci/provider_packages/ci_prepare_provider_documentation.sh
-      - name: "Prepare provider packages: ${{ matrix.package-format }}"
+      - name: "Prepare provider packages: wheel"
         run: ./scripts/ci/provider_packages/ci_prepare_provider_packages.sh
-      - name: "Prepare airflow packages: ${{ matrix.package-format }}"
+        env:
+          PACKAGE_FORMAT: "wheel"
+      - name: "Prepare airflow package: wheel"
         run: ./scripts/ci/build_airflow/ci_build_airflow_package.sh
-      - name: "Install and test provider packages and airflow via ${{ matrix.package-format }} files"
+        env:
+          PACKAGE_FORMAT: "wheel"
+      - name: "Install and test provider packages and airflow via wheel files"
         run: ./scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh
-      - name: "Upload package artifacts"
-        uses: actions/upload-artifact@v2
-        if: always()
-        with:
-          name: airflow-provider-packages
-          path: "./dist/apache-*"
-          retention-days: 7
+        env:
+          INSTALL_AIRFLOW_VERSION: "wheel"
+          PACKAGE_FORMAT: "wheel"
+      - name: "Install and test provider packages and airflow on Airflow 2.0 files"
+        run: ./scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh
+        env:
+          INSTALL_AIRFLOW_VERSION: "2.0.0"
+          PACKAGE_FORMAT: "wheel"
 
-  test-provider-packages-released-airflow:
-    timeout-minutes: 30
-    name: "Test Provider packages with 2.0.0 version ${{ matrix.package-format }}"
+  prepare-test-provider-packages-sdist:
+    timeout-minutes: 40
+    name: "Build and test provider packages sdist"
     runs-on: ${{ fromJson(needs.build-info.outputs.runsOn) }}
     needs: [build-info, ci-images]
     env:
       RUNS_ON: ${{ fromJson(needs.build-info.outputs.runsOn) }}
-      INSTALL_AIRFLOW_VERSION: "2.0.0"
       AIRFLOW_EXTRAS: "all"
       PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}}
       VERSION_SUFFIX_FOR_PYPI: "dev"
       VERSION_SUFFIX_FOR_SVN: "dev"
-      PACKAGE_FORMAT: ${{ matrix.package-format }}
       GITHUB_REGISTRY: ${{ needs.ci-images.outputs.githubRegistry }}
-    strategy:
-      matrix:
-        package-format: ['wheel', 'sdist']
     if: needs.build-info.outputs.image-build == 'true' && needs.build-info.outputs.default-branch == 'master'
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v2
         with:
           persist-credentials: false
+        if: needs.build-info.outputs.default-branch == 'master'
       - name: "Setup python"
         uses: actions/setup-python@v2
         with:
@@ -581,12 +577,19 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
         run: ./scripts/ci/tools/ci_free_space_on_ci.sh
       - name: "Prepare CI image ${{env.PYTHON_MAJOR_MINOR_VERSION}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
         run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
-      - name: "Prepare provider documentation"
-        run: ./scripts/ci/provider_packages/ci_prepare_provider_documentation.sh
-      - name: "Prepare provider packages: ${{ matrix.package-format }}"
+      - name: "Prepare provider packages: sdist"
         run: ./scripts/ci/provider_packages/ci_prepare_provider_packages.sh
-      - name: "Install and test provider packages and airflow via ${{ matrix.package-format }} files"
+        env:
+          PACKAGE_FORMAT: "sdist"
+      - name: "Prepare airflow package: sdist"
+        run: ./scripts/ci/build_airflow/ci_build_airflow_package.sh
+        env:
+          PACKAGE_FORMAT: "sdist"
+      - name: "Install and test provider packages and airflow via sdist files"
         run: ./scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh
+        env:
+          INSTALL_AIRFLOW_VERSION: "sdist"
+          PACKAGE_FORMAT: "sdist"
 
   tests-helm:
     timeout-minutes: 20
@@ -1240,7 +1243,8 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
       - tests-mysql
       - tests-kubernetes
       - constraints-push
-      - prepare-provider-packages
+      - prepare-test-provider-packages-wheel
+      - prepare-test-provider-packages-sdist
     if: github.event_name == 'schedule' &&  github.repository == 'apache/airflow'
     env:
       RUNS_ON: ${{ fromJson(needs.build-info.outputs.runsOn) }}

[airflow] 01/16: The PYTHON_MAJOR_MINOR build arg has been deprecated (#15054)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 530abe5bbb254f0ec9a2331bc212bc570edc5ef5
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Mar 28 20:58:52 2021 +0200

    The PYTHON_MAJOR_MINOR build arg has been deprecated (#15054)
    
    The python version is now auto-detected and the former
    build-arg is deprecated.
    
    Time to remove it.
    
    (cherry picked from commit c0ceb10326ddfdd23d69dddff83a9a525fa89453)
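
A minimal sketch of the idea behind the removal (this is an assumption about how
the auto-detection can work, not the exact Dockerfile logic): the major.minor
version can be derived from PYTHON_BASE_IMAGE instead of being passed explicitly:

    PYTHON_BASE_IMAGE="python:3.7-slim-buster"
    # "python:3.7-slim-buster" -> "3.7"
    PYTHON_MAJOR_MINOR_VERSION=$(echo "${PYTHON_BASE_IMAGE}" | cut -d ':' -f 2 | cut -d '-' -f 1)
    echo "${PYTHON_MAJOR_MINOR_VERSION}"
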
---
 IMAGES.rst                            | 15 ++++++---------
 scripts/ci/libraries/_build_images.sh |  3 ---
 2 files changed, 6 insertions(+), 12 deletions(-)

diff --git a/IMAGES.rst b/IMAGES.rst
index 3e00f41..40299b6 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -450,7 +450,6 @@ additional apt dev and runtime dependencies.
 
   docker build . -f Dockerfile.ci \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
     --build-arg AIRFLOW_VERSION="2.0.0" \
     --build-arg AIRFLOW_VERSION_SPECIFICATION="==2.0.0" \
@@ -484,7 +483,6 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
 
   docker build . -f Dockerfile.ci \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
     --build-arg AIRFLOW_VERSION="2.0.0" \
     --build-arg AIRFLOW_VERSION_SPECIFICATION="==2.0.0" \
@@ -684,8 +682,7 @@ This builds the CI image in version 3.7 with default extras ("all").
 
 .. code-block:: bash
 
-  docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7
+  docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster"
 
 
 This builds the CI image in version 3.6 with "gcp" extra only.
@@ -693,7 +690,7 @@ This builds the CI image in version 3.6 with "gcp" extra only.
 .. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg AIRFLOW_EXTRAS=gcp
+    --build-arg AIRFLOW_EXTRAS=gcp
 
 
 This builds the CI image in version 3.6 with "apache-beam" extra added.
@@ -701,28 +698,28 @@ This builds the CI image in version 3.6 with "apache-beam" extra added.
 .. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_AIRFLOW_EXTRAS="apache-beam"
+    --build-arg ADDITIONAL_AIRFLOW_EXTRAS="apache-beam"
 
 This builds the CI image in version 3.6 with "mssql" additional package added.
 
 .. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_PYTHON_DEPS="mssql"
+    --build-arg ADDITIONAL_PYTHON_DEPS="mssql"
 
 This builds the CI image in version 3.6 with "gcc" and "g++" additional apt dev dependencies added.
 
 .. code-block::
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_DEV_APT_DEPS="gcc g++"
+    --build-arg ADDITIONAL_DEV_APT_DEPS="gcc g++"
 
 This builds the CI image in version 3.6 with "jdbc" extra and "default-jre-headless" additional apt runtime dependencies added.
 
 .. code-block::
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
-    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg AIRFLOW_EXTRAS=jdbc --build-arg ADDITIONAL_RUNTIME_DEPS="default-jre-headless"
+    --build-arg AIRFLOW_EXTRAS=jdbc --build-arg ADDITIONAL_RUNTIME_DEPS="default-jre-headless"
 
 Production images
 -----------------
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 55801e2..7d7d180 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -748,7 +748,6 @@ Docker building ${AIRFLOW_CI_IMAGE}.
     docker build \
         "${EXTRA_DOCKER_CI_BUILD_FLAGS[@]}" \
         --build-arg PYTHON_BASE_IMAGE="${AIRFLOW_PYTHON_BASE_IMAGE}" \
-        --build-arg PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}" \
         --build-arg AIRFLOW_VERSION="${AIRFLOW_VERSION}" \
         --build-arg AIRFLOW_BRANCH="${BRANCH_NAME}" \
         --build-arg AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS}" \
@@ -902,7 +901,6 @@ function build_images::build_prod_images() {
     docker build \
         "${EXTRA_DOCKER_PROD_BUILD_FLAGS[@]}" \
         --build-arg PYTHON_BASE_IMAGE="${AIRFLOW_PYTHON_BASE_IMAGE}" \
-        --build-arg PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}" \
         --build-arg INSTALL_MYSQL_CLIENT="${INSTALL_MYSQL_CLIENT}" \
         --build-arg AIRFLOW_VERSION="${AIRFLOW_VERSION}" \
         --build-arg AIRFLOW_BRANCH="${AIRFLOW_BRANCH_FOR_PYPI_PRELOADING}" \
@@ -939,7 +937,6 @@ function build_images::build_prod_images() {
     docker build \
         "${EXTRA_DOCKER_PROD_BUILD_FLAGS[@]}" \
         --build-arg PYTHON_BASE_IMAGE="${AIRFLOW_PYTHON_BASE_IMAGE}" \
-        --build-arg PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}" \
         --build-arg INSTALL_MYSQL_CLIENT="${INSTALL_MYSQL_CLIENT}" \
         --build-arg ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS}" \

[airflow] 16/16: Updates 3.6 limits for latest versions of a few libraries (#15209)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5f5a914db9352453b2ca11dc415a7837a53be327
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Mon Apr 5 20:25:11 2021 +0200

    Updates 3.6 limits for latest versions of a few libraries (#15209)
    
    This PR sets Python 3.6 specific limits for some of the packages
    that recently dropped support for Python 3.6 binary packages
    released via PyPI. Even though those packages did not drop
    Python 3.6 support entirely, it gets more and more difficult to
    get them installed (both locally and in the Docker image),
    because they have to be compiled from source and they often
    require a number of external dependencies to do so.
    
    This makes it difficult to automatically upgrade dependencies,
    because such an upgrade fails for the Python 3.6 images when we
    attempt it.
    
    This PR limits several of those dependencies (dask/pandas/numpy)
    so that they do not use the latest major releases on Python 3.6,
    but stay on the most recent versions that still ship 3.6 binaries.
    
    A comment/clarification was also added to the recently introduced
    (#15114) limit for `pandas-gbq`. That limit was added because the
    new release broke the bigquery provider's import, but the comment
    explaining it was missing, so it is added now.
    
    (cherry picked from commit e49722859b81cfcdd7e4bb8e8aba4efb049a8590)
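
The limits rely on PEP 508 environment markers, so a single requirements list
resolves differently per interpreter. A hedged example (the exact versions
resolved will vary with the pip and Python versions used):

    # On Python 3.6 the first spec applies; on 3.7+ the second one does.
    python3.6 -m pip install 'numpy<1.20; python_version<"3.7"' 'numpy; python_version>="3.7"'
    python3.8 -m pip install 'numpy<1.20; python_version<"3.7"' 'numpy; python_version>="3.7"'
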
---
 airflow/models/baseoperator.py                       |  2 +-
 airflow/providers/google/cloud/operators/dataflow.py |  6 +++---
 setup.cfg                                            |  7 ++++++-
 setup.py                                             | 11 +++++++++--
 4 files changed, 19 insertions(+), 7 deletions(-)

diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
index 06094a1..8bda785 100644
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -1493,7 +1493,7 @@ def cross_downstream(
 class BaseOperatorLink(metaclass=ABCMeta):
     """Abstract base class that defines how we get an operator link."""
 
-    operators: ClassVar[List[Type[BaseOperator]]] = []
+    operators: ClassVar[List[Type[BaseOperator]]] = []  # pylint: disable=invalid-name
     """
     This property will be used by Airflow Plugins to find the Operators to which you want
     to assign this Operator Link
diff --git a/airflow/providers/google/cloud/operators/dataflow.py b/airflow/providers/google/cloud/operators/dataflow.py
index 92ae77e..513fea3 100644
--- a/airflow/providers/google/cloud/operators/dataflow.py
+++ b/airflow/providers/google/cloud/operators/dataflow.py
@@ -43,9 +43,9 @@ class CheckJobRunning(Enum):
     WaitForRun - wait for job to finish and then continue with new job
     """
 
-    IgnoreJob = 1
-    FinishIfRunning = 2
-    WaitForRun = 3
+    IgnoreJob = 1  # pylint: disable=invalid-name
+    FinishIfRunning = 2  # pylint: disable=invalid-name
+    WaitForRun = 3  # pylint: disable=invalid-name
 
 
 class DataflowConfiguration:
diff --git a/setup.cfg b/setup.cfg
index 9c051af..1dcdef9 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -110,7 +110,12 @@ install_requires =
     markdown>=2.5.2, <4.0
     markupsafe>=1.1.1, <2.0
     marshmallow-oneofschema>=2.0.1
-    pandas>=0.17.1, <2.0
+    # Numpy stopped releasing 3.6 binaries for 1.20.* series.
+    numpy<1.20;python_version<"3.7"
+    numpy;python_version>="3.7"
+    # Pandas stopped releasing 3.6 binaries for 1.2.* series.
+    pandas>=0.17.1, <1.2;python_version<"3.7"
+    pandas>=0.17.1, <2.0;python_version>="3.7"
     pendulum~=2.0
     pep562~=1.0;python_version<"3.7"
     psutil>=4.2.0, <6.0.0
diff --git a/setup.py b/setup.py
index 0f421bc..51bd9be 100644
--- a/setup.py
+++ b/setup.py
@@ -237,7 +237,12 @@ cgroups = [
 cloudant = [
     'cloudant>=2.0',
 ]
-dask = ['cloudpickle>=1.4.1, <1.5.0', 'distributed>=2.11.1, <2.20']
+dask = [
+    'cloudpickle>=1.4.1, <1.5.0',
+    'dask<2021.3.1;python_version<"3.7"',  # dask stopped supporting python 3.6 in 2021.3.1 version
+    'dask>=2.9.0;python_version>="3.7"',
+    'distributed>=2.11.1, <2.20',
+]
 databricks = [
     'requests>=2.20.0, <3',
 ]
@@ -313,7 +318,9 @@ google = [
     'google-cloud-workflows>=0.1.0,<2.0.0',
     'grpcio-gcp>=0.2.2',
     'json-merge-patch~=0.2',
-    'pandas-gbq',
+    # pandas-gbq 0.15.0 release broke google provider's bigquery import
+    # _check_google_client_version (airflow/providers/google/cloud/hooks/bigquery.py:49)
+    'pandas-gbq<0.15.0',
     'plyvel',
 ]
 grpc = [

[airflow] 11/16: Adds Blinker dependency which is missing after recent changes (#15182)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 65c3ecf1f13411091e0bbfa669f69c89fe796964
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Apr 4 01:57:56 2021 +0200

    Adds Blinker dependency which is missing after recent changes (#15182)
    
    This PR fixes a problem introduced by #14144
    
    This is a very weird and unforeseen issue. The change introduced a
    new import from flask, `before_render_template`, and this caused
    flask to require the `blinker` dependency, even though blinker was
    not previously listed as 'required' by flask. We had not seen it
    before, because changes to this part of the code do not trigger
    K8S tests; however, subsequent PRs started to fail because
    setup.py did not list `blinker` as a dependency.
    
    In the CI image `blinker` was installed anyway because it is
    needed by sentry, so the problem was only detectable in the
    production image.
    
    This is ultimate proof that our test harness is really good at
    catching this kind of error.
    
    The root cause is described in
    https://stackoverflow.com/questions/38491075/flask-testing-signals-not-supported-error
    
    Flask support for signals is optional and flask does not list
    blinker as a dependency, but importing some parts of flask
    triggers the need for signals.
    
    (cherry picked from commit 437850bd16ea71421613ce9ab361bafec90b7ece)
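
A hedged sketch of how the problem shows up in a bare environment (the exact
behaviour and wording differ between Flask versions): without blinker, Flask
falls back to fake signals and using them fails at runtime.

    # Importing the signal works, but connecting to it fails without blinker:
    python -c "from flask import before_render_template; before_render_template.connect(lambda *a, **kw: None)"
    # -> RuntimeError: signalling support is unavailable because the blinker
    #    library is not installed
    pip install blinker   # adding the dependency makes signal support available
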
---
 setup.cfg | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/setup.cfg b/setup.cfg
index ed533ca..9c051af 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -82,7 +82,8 @@ install_requires =
     alembic>=1.2, <2.0
     argcomplete~=1.10
     attrs>=20.0, <21.0
-    cached_property~=1.5
+    blinker
+    cached_property~=1.5;python_version<="3.7"
     # cattrs >= 1.1.0 dropped support for Python 3.6
     cattrs>=1.0, <1.1.0;python_version<="3.6"
     cattrs~=1.1;python_version>"3.6"

[airflow] 04/16: Add timeout to test jobs to prevent hanging docker containers (#15078)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7a5c26cdde2dcd58378ff133ac7411e5a84aef55
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Mar 31 01:01:47 2021 +0200

    Add timeout to test jobs to prevent hanging docker containers (#15078)
    
    Some of the test jobs are hanging - either because of some
    weird race conditions in docker or because the tests themselves
    hang (this happens for quarantined tests). This change sets the
    maximum time we let the test suite execute to 25 minutes.
    
    (cherry picked from commit a4aee3f1d0c27f7c6010e784611ee943009c7498)
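
For reference, GNU parallel's --timeout kills jobs that exceed the given number
of seconds. A small standalone illustration (the durations here are examples,
not the CI values):

    # Jobs sleeping 1s and 2s finish; the 60s job is terminated after 5 seconds.
    parallel --timeout 5 --jobs 3 'sleep {} && echo {} done' ::: 1 2 60
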
---
 scripts/ci/testing/ci_run_airflow_testing.sh | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/scripts/ci/testing/ci_run_airflow_testing.sh b/scripts/ci/testing/ci_run_airflow_testing.sh
index 8286874..0867e3c 100755
--- a/scripts/ci/testing/ci_run_airflow_testing.sh
+++ b/scripts/ci/testing/ci_run_airflow_testing.sh
@@ -160,9 +160,10 @@ function run_test_types_in_parallel() {
         mkdir -p "${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${TEST_TYPE}"
         export JOB_LOG="${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${TEST_TYPE}/stdout"
         export PARALLEL_JOB_STATUS="${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${TEST_TYPE}/status"
+        # Each test job will get SIGTERM followed by SIGTERM 200ms later and SIGKILL 200ms later after 25 mins
         # shellcheck disable=SC2086
         parallel --ungroup --bg --semaphore --semaphorename "${SEMAPHORE_NAME}" \
-            --jobs "${MAX_PARALLEL_TEST_JOBS}" \
+            --jobs "${MAX_PARALLEL_TEST_JOBS}" --timeout 1500 \
             "$( dirname "${BASH_SOURCE[0]}" )/ci_run_single_airflow_test_in_docker.sh" "${@}" >${JOB_LOG} 2>&1
     done
     parallel --semaphore --semaphorename "${SEMAPHORE_NAME}" --wait

[airflow] 03/16: Parallelize build of documentation. (#15062)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 17b89f3514e65c9671c722d40278ca8a0e8f2220
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Mar 31 00:24:09 2021 +0200

    Parallelize build of documentation. (#15062)
    
    This is far more complex than it should be because of
    autoapi problems with parallel execution. Unfortunately autoapi
    does not cope well when several autoapi builds run in parallel on
    the same code - even if they run in separate processes and for
    different packages. Autoapi uses common _doctree and _api
    directories generated in the source code, and they overwrite
    each other if two or more builds run in parallel.
    
    The solution in this PR is mostly applicable to the CI environment.
    There we have docker images that have already been built from the
    current sources, so we can safely run separate docker containers
    without mapping the sources and generate the documentation
    separately and independently in each container.
    
    This seems to work really well, speeding up docs generation
    2x in public GitHub runners and 8x in self-hosted runners.
    
    Public runners:
    
    * 27m -> 15m
    
    Self-hosted runners:
    
    * 27m -> < 8m
    
    (cherry picked from commit 741a54502f0fcb3ad57c17d18edd9a6745b4b78b)
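
Typical local usage of the new parallel mode, as described in the script's help
text (the CI image must be built first, because parallel builds run inside
docker containers and will not pick up later local changes):

    ./breeze build-image --python 3.6
    ./docs/build_docs.py -j 0    # 0 means "use all available CPUs"
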
---
 .github/workflows/ci.yml                           |  14 +
 .gitignore                                         |   5 +
 docs/build_docs.py                                 | 417 ++++++++++++++++++---
 docs/exts/docs_build/code_utils.py                 |  32 ++
 docs/exts/docs_build/docs_builder.py               | 317 ++++++++++++----
 docs/exts/docs_build/errors.py                     |  39 +-
 docs/exts/docs_build/github_action_utils.py        |   1 +
 docs/exts/docs_build/spelling_checks.py            |  47 ++-
 provider_packages/README.rst                       |  53 +++
 scripts/ci/docs/ci_docs.sh                         |  15 +-
 scripts/in_container/_in_container_utils.sh        |   9 +-
 .../{run_fix_ownership.sh => run_anything.sh}      |   5 +-
 scripts/in_container/run_fix_ownership.sh          |   4 +-
 13 files changed, 777 insertions(+), 181 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index a70632c..49fb2e7 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -463,12 +463,26 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
     env:
       RUNS_ON: ${{ fromJson(needs.build-info.outputs.runsOn) }}
       GITHUB_REGISTRY: ${{ needs.ci-images.outputs.githubRegistry }}
+      PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}}
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v2
         with:
           persist-credentials: false
           submodules: recursive
+      - name: "Setup python"
+        uses: actions/setup-python@v2
+        with:
+          python-version: ${{needs.build-info.outputs.defaultPythonVersion}}
+      - uses: actions/cache@v2
+        id: cache-venv-docs
+        with:
+          path: ./.docs-venv/
+          key: ${{ runner.os }}-docs-venv-${{ hashFiles('setup.py', 'setup.cfg') }}
+          restore-keys: |
+            ${{ runner.os }}-docs-venv-
+      - name: "Free space"
+        run: ./scripts/ci/tools/ci_free_space_on_ci.sh
       - name: "Prepare CI image ${{env.PYTHON_MAJOR_MINOR_VERSION}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
         run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
       - name: "Build docs"
diff --git a/.gitignore b/.gitignore
index 67dfd80..5da0b18 100644
--- a/.gitignore
+++ b/.gitignore
@@ -215,3 +215,8 @@ Chart.lock
 pip-wheel-metadata
 
 .pypirc
+/.docs-venv
+
+# Dev files
+/dev/packages.txt
+/dev/Dockerfile.pmc
diff --git a/docs/build_docs.py b/docs/build_docs.py
index 4e4786f..59e1681 100755
--- a/docs/build_docs.py
+++ b/docs/build_docs.py
@@ -16,14 +16,25 @@
 # specific language governing permissions and limitations
 # under the License.
 import argparse
+import multiprocessing
 import os
+import platform
 import sys
 from collections import defaultdict
-from typing import Dict, List, Optional, Tuple
+from subprocess import run
+from typing import Dict, List, NamedTuple, Optional, Tuple
 
+from rich.console import Console
 from tabulate import tabulate
 
 from docs.exts.docs_build import dev_index_generator, lint_checks  # pylint: disable=no-name-in-module
+from docs.exts.docs_build.code_utils import (
+    CONSOLE_WIDTH,
+    DOCKER_PROJECT_DIR,
+    ROOT_PROJECT_DIR,
+    TEXT_RED,
+    TEXT_RESET,
+)
 from docs.exts.docs_build.docs_builder import (  # pylint: disable=no-name-in-module
     DOCS_DIR,
     AirflowDocsBuilder,
@@ -60,32 +71,32 @@ ERRORS_ELIGIBLE_TO_REBUILD = [
 ]
 
 ON_GITHUB_ACTIONS = os.environ.get('GITHUB_ACTIONS', 'false') == "true"
-TEXT_BLUE = '\033[94m'
-TEXT_RESET = '\033[0m'
+
+console = Console(force_terminal=True, color_system="standard", width=CONSOLE_WIDTH)
 
 
 def _promote_new_flags():
-    print(TEXT_BLUE)
-    print("Tired of waiting for documentation to be built?")
-    print()
+    console.print()
+    console.print("[yellow]Still tired of waiting for documentation to be built?[/]")
+    console.print()
     if ON_GITHUB_ACTIONS:
-        print("You can quickly build documentation locally with just one command.")
-        print("    ./breeze build-docs")
-        print()
-        print("Still too slow?")
-        print()
-    print("You can only build one documentation package:")
-    print("    ./breeze build-docs -- --package-filter <PACKAGE-NAME>")
-    print()
-    print("This usually takes from 20 seconds to 2 minutes.")
-    print()
-    print("You can also use other extra flags to iterate faster:")
-    print("   --docs-only       - Only build documentation")
-    print("   --spellcheck-only - Only perform spellchecking")
-    print()
-    print("For more info:")
-    print("   ./breeze build-docs --help")
-    print(TEXT_RESET)
+        console.print("You can quickly build documentation locally with just one command.")
+        console.print("    [blue]./breeze build-docs[/]")
+        console.print()
+        console.print("[yellow]Still too slow?[/]")
+        console.print()
+    console.print("You can only build one documentation package:")
+    console.print("    [blue]./breeze build-docs -- --package-filter <PACKAGE-NAME>[/]")
+    console.print()
+    console.print("This usually takes from [yellow]20 seconds[/] to [yellow]2 minutes[/].")
+    console.print()
+    console.print("You can also use other extra flags to iterate faster:")
+    console.print("   [blue]--docs-only       - Only build documentation[/]")
+    console.print("   [blue]--spellcheck-only - Only perform spellchecking[/]")
+    console.print()
+    console.print("For more info:")
+    console.print("   [blue]./breeze build-docs --help[/]")
+    console.print()
 
 
 def _get_parser():
@@ -116,6 +127,34 @@ def _get_parser():
         help='Builds documentation for official release i.e. all links point to stable version',
     )
     parser.add_argument(
+        "-j",
+        "--jobs",
+        dest='jobs',
+        type=int,
+        default=1,
+        help=(
+            """
+    Number of parallel processes that will be spawned to build the docs.
+
+    This is usually used in the CI system only, though you can also use it to run a complete check
+    of the documentation locally if you have a powerful local machine.
+    Default is 1, which means the doc check runs sequentially. This is the default behaviour
+    because the autoapi extension we use is not capable of running parallel builds at the same time
+    using the same source files.
+
+    In parallel builds we use a dockerised version of the image built from local sources, but the image
+    has to be prepared locally (similarly to how it is done in CI) before you run the docs build. Any
+    changes you have made locally after building the image will not be checked.
+
+    Typically you run a parallel build in this way if you want to quickly run a complete check for all docs:
+
+         ./breeze build-image --python 3.6
+         ./docs/build_docs.py -j 0
+
+"""
+        ),
+    )
+    parser.add_argument(
         "-v",
         "--verbose",
         dest='verbose',
@@ -129,31 +168,291 @@ def _get_parser():
     return parser
 
 
+class BuildSpecification(NamedTuple):
+    """Specification of single build."""
+
+    package_name: str
+    for_production: bool
+    verbose: bool
+    dockerized: bool
+
+
+class BuildDocsResult(NamedTuple):
+    """Result of building documentation."""
+
+    package_name: str
+    log_file_name: str
+    errors: List[DocBuildError]
+
+
+class SpellCheckResult(NamedTuple):
+    """Result of spellcheck."""
+
+    package_name: str
+    log_file_name: str
+    errors: List[SpellingError]
+
+
+def perform_docs_build_for_single_package(build_specification: BuildSpecification) -> BuildDocsResult:
+    """Performs single package docs build."""
+    builder = AirflowDocsBuilder(
+        package_name=build_specification.package_name, for_production=build_specification.for_production
+    )
+    console.print(f"[blue]{build_specification.package_name:60}:[/] Building documentation")
+    result = BuildDocsResult(
+        package_name=build_specification.package_name,
+        errors=builder.build_sphinx_docs(
+            dockerized=build_specification.dockerized,
+            verbose=build_specification.verbose,
+        ),
+        log_file_name=builder.log_build_filename,
+    )
+    return result
+
+
+def perform_spell_check_for_single_package(build_specification: BuildSpecification) -> SpellCheckResult:
+    """Performs single package spell check."""
+    builder = AirflowDocsBuilder(
+        package_name=build_specification.package_name, for_production=build_specification.for_production
+    )
+    console.print(f"[blue]{build_specification.package_name:60}:[/] Checking spelling started")
+    result = SpellCheckResult(
+        package_name=build_specification.package_name,
+        errors=builder.check_spelling(
+            dockerized=build_specification.dockerized,
+            verbose=build_specification.verbose,
+        ),
+        log_file_name=builder.log_spelling_filename,
+    )
+    console.print(f"[blue]{build_specification.package_name:60}:[/] Checking spelling completed")
+    return result
+
+
 def build_docs_for_packages(
-    current_packages: List[str], docs_only: bool, spellcheck_only: bool, for_production: bool, verbose: bool
+    current_packages: List[str],
+    docs_only: bool,
+    spellcheck_only: bool,
+    for_production: bool,
+    jobs: int,
+    verbose: bool,
 ) -> Tuple[Dict[str, List[DocBuildError]], Dict[str, List[SpellingError]]]:
-    """Builds documentation for single package and returns errors"""
+    """Builds documentation for all packages and combines errors."""
     all_build_errors: Dict[str, List[DocBuildError]] = defaultdict(list)
     all_spelling_errors: Dict[str, List[SpellingError]] = defaultdict(list)
-    for package_no, package_name in enumerate(current_packages, start=1):
-        print("#" * 20, f"[{package_no}/{len(current_packages)}] {package_name}", "#" * 20)
-        builder = AirflowDocsBuilder(package_name=package_name, for_production=for_production)
-        builder.clean_files()
-        if not docs_only:
-            with with_group(f"Check spelling: {package_name}"):
-                spelling_errors = builder.check_spelling(verbose=verbose)
-            if spelling_errors:
-                all_spelling_errors[package_name].extend(spelling_errors)
-
-        if not spellcheck_only:
-            with with_group(f"Building docs: {package_name}"):
-                docs_errors = builder.build_sphinx_docs(verbose=verbose)
-            if docs_errors:
-                all_build_errors[package_name].extend(docs_errors)
-
+    with with_group("Cleaning documentation files"):
+        for package_name in current_packages:
+            console.print(f"[blue]{package_name:60}:[/] Cleaning files")
+            builder = AirflowDocsBuilder(package_name=package_name, for_production=for_production)
+            builder.clean_files()
+    if jobs > 1:
+        if os.getenv('CI', '') == '':
+            console.print("[yellow] PARALLEL DOCKERIZED EXECUTION REQUIRES IMAGE TO BE BUILD BEFORE !!!![/]")
+            console.print("[yellow] Make sure that you've build the image before runnning docs build.[/]")
+            console.print("[yellow] otherwise local changes you've done will not be used during the check[/]")
+            console.print()
+        run_in_parallel(
+            all_build_errors,
+            all_spelling_errors,
+            current_packages,
+            docs_only,
+            for_production,
+            jobs,
+            spellcheck_only,
+            verbose,
+        )
+    else:
+        run_sequentially(
+            all_build_errors,
+            all_spelling_errors,
+            current_packages,
+            docs_only,
+            for_production,
+            spellcheck_only,
+            verbose,
+        )
     return all_build_errors, all_spelling_errors
 
 
+def run_sequentially(
+    all_build_errors,
+    all_spelling_errors,
+    current_packages,
+    docs_only,
+    for_production,
+    spellcheck_only,
+    verbose,
+):
+    """Run both - spellcheck and docs build sequentially without multiprocessing"""
+    if not spellcheck_only:
+        for package_name in current_packages:
+            build_result = perform_docs_build_for_single_package(
+                build_specification=BuildSpecification(
+                    package_name=package_name,
+                    for_production=for_production,
+                    dockerized=False,
+                    verbose=verbose,
+                )
+            )
+            if build_result.errors:
+                all_build_errors[package_name].extend(build_result.errors)
+                print_build_output(build_result)
+    if not docs_only:
+        for package_name in current_packages:
+            spellcheck_result = perform_spell_check_for_single_package(
+                build_specification=BuildSpecification(
+                    package_name=package_name,
+                    for_production=for_production,
+                    dockerized=False,
+                    verbose=verbose,
+                )
+            )
+            if spellcheck_result.errors:
+                all_spelling_errors[package_name].extend(spellcheck_result.errors)
+                print_spelling_output(spellcheck_result)
+
+
+def run_in_parallel(
+    all_build_errors,
+    all_spelling_errors,
+    current_packages,
+    docs_only,
+    for_production,
+    jobs,
+    spellcheck_only,
+    verbose,
+):
+    """Run both - spellcheck and docs build sequentially without multiprocessing"""
+    pool = multiprocessing.Pool(processes=jobs)
+    # until we fix autoapi, we need to run parallel builds as dockerized images
+    dockerized = True
+    if not spellcheck_only:
+        run_docs_build_in_parallel(
+            all_build_errors=all_build_errors,
+            for_production=for_production,
+            current_packages=current_packages,
+            verbose=verbose,
+            dockerized=dockerized,
+            pool=pool,
+        )
+    if not docs_only:
+        run_spell_check_in_parallel(
+            all_spelling_errors=all_spelling_errors,
+            for_production=for_production,
+            current_packages=current_packages,
+            verbose=verbose,
+            dockerized=dockerized,
+            pool=pool,
+        )
+    fix_ownership()
+
+
+def fix_ownership():
+    """Fixes ownership for all files created with root user,"""
+    console.print("Fixing ownership for generated files")
+    python_version = os.getenv('PYTHON_MAJOR_MINOR_VERSION', "3.6")
+    fix_cmd = [
+        "docker",
+        "run",
+        "--entrypoint",
+        "/bin/bash",
+        "--rm",
+        "-e",
+        f"HOST_OS={platform.system()}",
+        "-e" f"HOST_USER_ID={os.getuid()}",
+        "-e",
+        f"HOST_GROUP_ID={os.getgid()}",
+        "-v",
+        f"{ROOT_PROJECT_DIR}:{DOCKER_PROJECT_DIR}",
+        f"apache/airflow:master-python{python_version}-ci",
+        "-c",
+        "/opt/airflow/scripts/in_container/run_fix_ownership.sh",
+    ]
+    run(fix_cmd, check=True)
+
+
+def print_build_output(result: BuildDocsResult):
+    """Prints output of docs build job."""
+    with with_group(f"{TEXT_RED}Output for documentation build {result.package_name}{TEXT_RESET}"):
+        console.print()
+        console.print(f"[blue]{result.package_name:60}: " + "#" * 80)
+        with open(result.log_file_name) as output:
+            for line in output.read().splitlines():
+                console.print(f"{result.package_name:60} {line}")
+        console.print(f"[blue]{result.package_name:60}: " + "#" * 80)
+
+
+def run_docs_build_in_parallel(
+    all_build_errors: Dict[str, List[DocBuildError]],
+    for_production: bool,
+    current_packages: List[str],
+    verbose: bool,
+    dockerized: bool,
+    pool,
+):
+    """Runs documentation building in parallel."""
+    doc_build_specifications: List[BuildSpecification] = []
+    with with_group("Scheduling documentation to build"):
+        for package_name in current_packages:
+            console.print(f"[blue]{package_name:60}:[/] Scheduling documentation to build")
+            doc_build_specifications.append(
+                BuildSpecification(
+                    package_name=package_name,
+                    for_production=for_production,
+                    verbose=verbose,
+                    dockerized=dockerized,
+                )
+            )
+    with with_group("Running docs building"):
+        console.print()
+        result_list = pool.map(perform_docs_build_for_single_package, doc_build_specifications)
+    for result in result_list:
+        if result.errors:
+            all_build_errors[result.package_name].extend(result.errors)
+            print_build_output(result)
+
+
+def print_spelling_output(result: SpellCheckResult):
+    """Prints output of spell check job."""
+    with with_group(f"{TEXT_RED}Output for spelling check: {result.package_name}{TEXT_RESET}"):
+        console.print()
+        console.print(f"[blue]{result.package_name:60}: " + "#" * 80)
+        with open(result.log_file_name) as output:
+            for line in output.read().splitlines():
+                console.print(f"{result.package_name:60} {line}")
+        console.print(f"[blue]{result.package_name:60}: " + "#" * 80)
+        console.print()
+
+
+def run_spell_check_in_parallel(
+    all_spelling_errors: Dict[str, List[SpellingError]],
+    for_production: bool,
+    current_packages: List[str],
+    verbose: bool,
+    dockerized: bool,
+    pool,
+):
+    """Runs spell check in parallel."""
+    spell_check_specifications: List[BuildSpecification] = []
+    with with_group("Scheduling spell checking of documentation"):
+        for package_name in current_packages:
+            console.print(f"[blue]{package_name:60}:[/] Scheduling spellchecking")
+            spell_check_specifications.append(
+                BuildSpecification(
+                    package_name=package_name,
+                    for_production=for_production,
+                    verbose=verbose,
+                    dockerized=dockerized,
+                )
+            )
+    with with_group("Running spell checking of documentation"):
+        console.print()
+        result_list = pool.map(perform_spell_check_for_single_package, spell_check_specifications)
+    for result in result_list:
+        if result.errors:
+            all_spelling_errors[result.package_name].extend(result.errors)
+            print_spelling_output(result)
+
+
 def display_packages_summary(
     build_errors: Dict[str, List[DocBuildError]], spelling_errors: Dict[str, List[SpellingError]]
 ):
@@ -161,15 +460,15 @@ def display_packages_summary(
     packages_names = {*build_errors.keys(), *spelling_errors.keys()}
     tabular_data = [
         {
-            "Package name": package_name,
+            "Package name": f"[blue]{package_name}[/]",
             "Count of doc build errors": len(build_errors.get(package_name, [])),
             "Count of spelling errors": len(spelling_errors.get(package_name, [])),
         }
         for package_name in sorted(packages_names, key=lambda k: k or '')
     ]
-    print("#" * 20, "Packages errors summary", "#" * 20)
-    print(tabulate(tabular_data=tabular_data, headers="keys"))
-    print("#" * 50)
+    console.print("#" * 20, " Packages errors summary ", "#" * 20)
+    console.print(tabulate(tabular_data=tabular_data, headers="keys"))
+    console.print("#" * 50)
 
 
 def print_build_errors_and_exit(
@@ -180,15 +479,17 @@ def print_build_errors_and_exit(
     if build_errors or spelling_errors:
         if build_errors:
             display_errors_summary(build_errors)
-            print()
+            console.print()
         if spelling_errors:
             display_spelling_error_summary(spelling_errors)
-            print()
-        print("The documentation has errors.")
+            console.print()
+        console.print("The documentation has errors.")
         display_packages_summary(build_errors, spelling_errors)
-        print()
-        print(CHANNEL_INVITATION)
+        console.print()
+        console.print(CHANNEL_INVITATION)
         sys.exit(1)
+    else:
+        console.print("[green]Documentation build is successful[/]")
 
 
 def main():
@@ -201,15 +502,12 @@ def main():
     package_filters = args.package_filter
     for_production = args.for_production
 
-    if not package_filters:
-        _promote_new_flags()
-
     with with_group("Available packages"):
         for pkg in sorted(available_packages):
-            print(f" - {pkg}")
+            console.print(f" - {pkg}")
 
     if package_filters:
-        print("Current package filters: ", package_filters)
+        console.print("Current package filters: ", package_filters)
     current_packages = process_package_filters(available_packages, package_filters)
 
     with with_group("Fetching inventories"):
@@ -218,9 +516,12 @@ def main():
         priority_packages = fetch_inventories()
     current_packages = sorted(current_packages, key=lambda d: -1 if d in priority_packages else 1)
 
-    with with_group(f"Documentation will be built for {len(current_packages)} package(s)"):
+    jobs = args.jobs if args.jobs != 0 else os.cpu_count()
+    with with_group(
+        f"Documentation will be built for {len(current_packages)} package(s) with {jobs} parallel jobs"
+    ):
         for pkg_no, pkg in enumerate(current_packages, start=1):
-            print(f"{pkg_no}. {pkg}")
+            console.print(f"{pkg_no}. {pkg}")
 
     all_build_errors: Dict[Optional[str], List[DocBuildError]] = {}
     all_spelling_errors: Dict[Optional[str], List[SpellingError]] = {}
@@ -229,6 +530,7 @@ def main():
         docs_only=docs_only,
         spellcheck_only=spellcheck_only,
         for_production=for_production,
+        jobs=jobs,
         verbose=args.verbose,
     )
     if package_build_errors:
@@ -252,6 +554,7 @@ def main():
             docs_only=docs_only,
             spellcheck_only=spellcheck_only,
             for_production=for_production,
+            jobs=jobs,
             verbose=args.verbose,
         )
         if package_build_errors:
diff --git a/docs/exts/docs_build/code_utils.py b/docs/exts/docs_build/code_utils.py
index e77d0b6..07fe8d0 100644
--- a/docs/exts/docs_build/code_utils.py
+++ b/docs/exts/docs_build/code_utils.py
@@ -17,6 +17,38 @@
 import os
 from contextlib import suppress
 
+from docs.exts.provider_yaml_utils import load_package_data
+
+ROOT_PROJECT_DIR = os.path.abspath(
+    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir, os.pardir, os.pardir)
+)
+DOCS_DIR = os.path.join(ROOT_PROJECT_DIR, "docs")
+AIRFLOW_DIR = os.path.join(ROOT_PROJECT_DIR, "airflow")
+
+DOCKER_PROJECT_DIR = "/opt/airflow"
+DOCKER_DOCS_DIR = os.path.join(DOCKER_PROJECT_DIR, "docs")
+DOCKER_AIRFLOW_DIR = os.path.join(DOCKER_PROJECT_DIR, "/airflow")
+ALL_PROVIDER_YAMLS = load_package_data()
+AIRFLOW_SITE_DIR = os.environ.get('AIRFLOW_SITE_DIRECTORY')
+PROCESS_TIMEOUT = 4 * 60
+
+TEXT_RED = '\033[31m'
+TEXT_RESET = '\033[0m'
+
+CONSOLE_WIDTH = 180
+
+
+def remap_from_docker(file_name: str, dockerized: bool):
+    """
+    Remaps filename from Docker to Host.
+    :param file_name: name of file
+    :param dockerized: whether builds were running in docker environment.
+    :return:
+    """
+    if dockerized and file_name.startswith(DOCKER_PROJECT_DIR):
+        return file_name.replace(DOCKER_PROJECT_DIR, ROOT_PROJECT_DIR)
+    return file_name
+
 
 def prepare_code_snippet(file_path: str, line_no: int, context_lines_count: int = 5) -> str:
     """
diff --git a/docs/exts/docs_build/docs_builder.py b/docs/exts/docs_build/docs_builder.py
index 71e4acb..0669c75 100644
--- a/docs/exts/docs_build/docs_builder.py
+++ b/docs/exts/docs_build/docs_builder.py
@@ -20,24 +20,27 @@ import shlex
 import shutil
 from glob import glob
 from subprocess import run
-from tempfile import NamedTemporaryFile, TemporaryDirectory
 from typing import List
 
-# pylint: disable=no-name-in-module
-from docs.exts.docs_build.code_utils import pretty_format_path
+from rich.console import Console
+
+from docs.exts.docs_build.code_utils import (
+    AIRFLOW_SITE_DIR,
+    ALL_PROVIDER_YAMLS,
+    CONSOLE_WIDTH,
+    DOCKER_DOCS_DIR,
+    DOCS_DIR,
+    PROCESS_TIMEOUT,
+    pretty_format_path,
+)
 from docs.exts.docs_build.errors import DocBuildError, parse_sphinx_warnings
+
+# pylint: disable=no-name-in-module
 from docs.exts.docs_build.spelling_checks import SpellingError, parse_spelling_warnings
-from docs.exts.provider_yaml_utils import load_package_data
 
 # pylint: enable=no-name-in-module
 
-ROOT_PROJECT_DIR = os.path.abspath(
-    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir, os.pardir, os.pardir)
-)
-DOCS_DIR = os.path.join(ROOT_PROJECT_DIR, "docs")
-ALL_PROVIDER_YAMLS = load_package_data()
-AIRFLOW_SITE_DIR = os.environ.get('AIRFLOW_SITE_DIRECTORY')
-PROCESS_TIMEOUT = 4 * 60
+console = Console(force_terminal=True, color_system="standard", width=CONSOLE_WIDTH)
 
 
 class AirflowDocsBuilder:
@@ -52,6 +55,18 @@ class AirflowDocsBuilder:
         return f"{DOCS_DIR}/_doctrees/docs/{self.package_name}"
 
     @property
+    def _docker_doctree_dir(self) -> str:
+        return f"{DOCKER_DOCS_DIR}/_doctrees/docs/{self.package_name}"
+
+    @property
+    def _inventory_cache_dir(self) -> str:
+        return f"{DOCS_DIR}/_inventory_cache"
+
+    @property
+    def _docker_inventory_cache_dir(self) -> str:
+        return f"{DOCKER_DOCS_DIR}/_inventory_cache"
+
+    @property
     def is_versioned(self):
         """Is current documentation package versioned?"""
         # Disable versioning. This documentation does not apply to any released product and we can update
@@ -67,6 +82,54 @@ class AirflowDocsBuilder:
             return f"{DOCS_DIR}/_build/docs/{self.package_name}"
 
     @property
+    def log_spelling_filename(self) -> str:
+        """Log from spelling job."""
+        return os.path.join(self._build_dir, f"output-spelling-{self.package_name}.log")
+
+    @property
+    def docker_log_spelling_filename(self) -> str:
+        """Log from spelling job in docker."""
+        return os.path.join(self._docker_build_dir, f"output-spelling-{self.package_name}.log")
+
+    @property
+    def log_spelling_output_dir(self) -> str:
+        """Results from spelling job."""
+        return os.path.join(self._build_dir, f"output-spelling-results-{self.package_name}")
+
+    @property
+    def docker_log_spelling_output_dir(self) -> str:
+        """Results from spelling job in docker."""
+        return os.path.join(self._docker_build_dir, f"output-spelling-results-{self.package_name}")
+
+    @property
+    def log_build_filename(self) -> str:
+        """Log from build job."""
+        return os.path.join(self._build_dir, f"output-build-{self.package_name}.log")
+
+    @property
+    def docker_log_build_filename(self) -> str:
+        """Log from build job in docker."""
+        return os.path.join(self._docker_build_dir, f"output-build-{self.package_name}.log")
+
+    @property
+    def log_build_warning_filename(self) -> str:
+        """Warnings from build job."""
+        return os.path.join(self._build_dir, f"warning-build-{self.package_name}.log")
+
+    @property
+    def docker_log_warning_filename(self) -> str:
+        """Warnings from build job in docker."""
+        return os.path.join(self._docker_build_dir, f"warning-build-{self.package_name}.log")
+
+    @property
+    def _docker_build_dir(self) -> str:
+        if self.is_versioned:
+            version = "stable" if self.for_production else "latest"
+            return f"{DOCKER_DOCS_DIR}/_build/docs/{self.package_name}/{version}"
+        else:
+            return f"{DOCKER_DOCS_DIR}/_build/docs/{self.package_name}"
+
+    @property
     def _current_version(self):
         if not self.is_versioned:
             raise Exception("This documentation package is not versioned")
@@ -90,6 +153,10 @@ class AirflowDocsBuilder:
     def _src_dir(self) -> str:
         return f"{DOCS_DIR}/{self.package_name}"
 
+    @property
+    def _docker_src_dir(self) -> str:
+        return f"{DOCKER_DOCS_DIR}/{self.package_name}"
+
     def clean_files(self) -> None:
         """Cleanup all artifacts generated by previous builds."""
         api_dir = os.path.join(self._src_dir, "_api")
@@ -99,11 +166,42 @@ class AirflowDocsBuilder:
         os.makedirs(api_dir, exist_ok=True)
         os.makedirs(self._build_dir, exist_ok=True)
 
-    def check_spelling(self, verbose):
-        """Checks spelling."""
+    def check_spelling(self, verbose: bool, dockerized: bool) -> List[SpellingError]:
+        """
+        Checks spelling
+
+        :param verbose: whether to show output while running
+        :param dockerized: whether to run dockerized build (required for parallel processing on CI)
+        :return: list of errors
+        """
         spelling_errors = []
-        with TemporaryDirectory() as tmp_dir, NamedTemporaryFile() as output:
+        os.makedirs(self._build_dir, exist_ok=True)
+        shutil.rmtree(self.log_spelling_output_dir, ignore_errors=True)
+        os.makedirs(self.log_spelling_output_dir, exist_ok=True)
+        if dockerized:
+            python_version = os.getenv('PYTHON_MAJOR_MINOR_VERSION', "3.6")
             build_cmd = [
+                "docker",
+                "run",
+                "--rm",
+                "-e",
+                "AIRFLOW_FOR_PRODUCTION",
+                "-e",
+                "AIRFLOW_PACKAGE_NAME",
+                "-v",
+                f"{self._build_dir}:{self._docker_build_dir}",
+                "-v",
+                f"{self._inventory_cache_dir}:{self._docker_inventory_cache_dir}",
+                "-w",
+                DOCKER_DOCS_DIR,
+                f"apache/airflow:master-python{python_version}-ci",
+                "/opt/airflow/scripts/in_container/run_anything.sh",
+            ]
+        else:
+            build_cmd = []
+
+        build_cmd.extend(
+            [
                 "sphinx-build",
                 "-W",  # turn warnings into errors
                 "--color",  # do emit colored output
@@ -111,19 +209,26 @@ class AirflowDocsBuilder:
                 "-b",  # builder to use
                 "spelling",
                 "-c",
-                DOCS_DIR,
+                DOCS_DIR if not dockerized else DOCKER_DOCS_DIR,
                 "-d",  # path for the cached environment and doctree files
-                self._doctree_dir,
-                self._src_dir,  # path to documentation source files
-                tmp_dir,
+                self._doctree_dir if not dockerized else self._docker_doctree_dir,
+                self._src_dir
+                if not dockerized
+                else self._docker_src_dir,  # path to documentation source files
+                self.log_spelling_output_dir if not dockerized else self.docker_log_spelling_output_dir,
             ]
-            print("Executing cmd: ", " ".join([shlex.quote(c) for c in build_cmd]))
-            if not verbose:
-                print("The output is hidden until an error occurs.")
-            env = os.environ.copy()
-            env['AIRFLOW_PACKAGE_NAME'] = self.package_name
-            if self.for_production:
-                env['AIRFLOW_FOR_PRODUCTION'] = 'true'
+        )
+        env = os.environ.copy()
+        env['AIRFLOW_PACKAGE_NAME'] = self.package_name
+        if self.for_production:
+            env['AIRFLOW_FOR_PRODUCTION'] = 'true'
+        if verbose:
+            console.print(
+                f"[blue]{self.package_name:60}:[/] Executing cmd: ",
+                " ".join([shlex.quote(c) for c in build_cmd]),
+            )
+            console.print(f"[blue]{self.package_name:60}:[/] The output is hidden until an error occurs.")
+        with open(self.log_spelling_filename, "wt") as output:
             completed_proc = run(  # pylint: disable=subprocess-run-check
                 build_cmd,
                 cwd=self._src_dir,
@@ -132,58 +237,101 @@ class AirflowDocsBuilder:
                 stderr=output if not verbose else None,
                 timeout=PROCESS_TIMEOUT,
             )
-            if completed_proc.returncode != 0:
-                output.seek(0)
-                print(output.read().decode())
-
-                spelling_errors.append(
-                    SpellingError(
-                        file_path=None,
-                        line_no=None,
-                        spelling=None,
-                        suggestion=None,
-                        context_line=None,
-                        message=(
-                            f"Sphinx spellcheck returned non-zero exit status: {completed_proc.returncode}."
-                        ),
-                    )
+        if completed_proc.returncode != 0:
+            spelling_errors.append(
+                SpellingError(
+                    file_path=None,
+                    line_no=None,
+                    spelling=None,
+                    suggestion=None,
+                    context_line=None,
+                    message=(
+                        f"Sphinx spellcheck returned non-zero exit status: " f"{completed_proc.returncode}."
+                    ),
+                )
+            )
+            warning_text = ""
+            for filepath in glob(f"{self.log_spelling_output_dir}/**/*.spelling", recursive=True):
+                with open(filepath) as spelling_file:
+                    warning_text += spelling_file.read()
+            spelling_errors.extend(parse_spelling_warnings(warning_text, self._src_dir, dockerized))
+            console.print(f"[blue]{self.package_name:60}:[/] [red]Finished spell-checking with errors[/]")
+        else:
+            if spelling_errors:
+                console.print(
+                    f"[blue]{self.package_name:60}:[/] [yellow]Finished spell-checking " f"with warnings[/]"
+                )
+            else:
+                console.print(
+                    f"[blue]{self.package_name:60}:[/] [green]Finished spell-checking " f"successfully[/]"
                 )
-                warning_text = ""
-                for filepath in glob(f"{tmp_dir}/**/*.spelling", recursive=True):
-                    with open(filepath) as speeling_file:
-                        warning_text += speeling_file.read()
-
-                spelling_errors.extend(parse_spelling_warnings(warning_text, self._src_dir))
         return spelling_errors
 
-    def build_sphinx_docs(self, verbose) -> List[DocBuildError]:
-        """Build Sphinx documentation"""
+    def build_sphinx_docs(self, verbose: bool, dockerized: bool) -> List[DocBuildError]:
+        """
+        Build Sphinx documentation.
+
+        :param verbose: whether to show output while running
+        :param dockerized: whether to run dockerized build (required for parallel processing on CI)
+        :return: list of errors
+        """
         build_errors = []
-        with NamedTemporaryFile() as tmp_file, NamedTemporaryFile() as output:
+        os.makedirs(self._build_dir, exist_ok=True)
+        if dockerized:
+            python_version = os.getenv('PYTHON_MAJOR_MINOR_VERSION', "3.6")
             build_cmd = [
+                "docker",
+                "run",
+                "--rm",
+                "-e",
+                "AIRFLOW_FOR_PRODUCTION",
+                "-e",
+                "AIRFLOW_PACKAGE_NAME",
+                "-v",
+                f"{self._build_dir}:{self._docker_build_dir}",
+                "-v",
+                f"{self._inventory_cache_dir}:{self._docker_inventory_cache_dir}",
+                "-w",
+                DOCKER_DOCS_DIR,
+                f"apache/airflow:master-python{python_version}-ci",
+                "/opt/airflow/scripts/in_container/run_anything.sh",
+            ]
+        else:
+            build_cmd = []
+        build_cmd.extend(
+            [
                 "sphinx-build",
                 "-T",  # show full traceback on exception
                 "--color",  # do emit colored output
                 "-b",  # builder to use
                 "html",
                 "-d",  # path for the cached environment and doctree files
-                self._doctree_dir,
+                self._doctree_dir if not dockerized else self._docker_doctree_dir,
                 "-c",
-                DOCS_DIR,
+                DOCS_DIR if not dockerized else DOCKER_DOCS_DIR,
                 "-w",  # write warnings (and errors) to given file
-                tmp_file.name,
-                self._src_dir,  # path to documentation source files
-                self._build_dir,  # path to output directory
+                self.log_build_warning_filename if not dockerized else self.docker_log_warning_filename,
+                self._src_dir
+                if not dockerized
+                else self._docker_src_dir,  # path to documentation source files
+                self._build_dir if not dockerized else self._docker_build_dir,  # path to output directory
             ]
-            print("Executing cmd: ", " ".join([shlex.quote(c) for c in build_cmd]))
-            if not verbose:
-                print("The output is hidden until an error occurs.")
-
-            env = os.environ.copy()
-            env['AIRFLOW_PACKAGE_NAME'] = self.package_name
-            if self.for_production:
-                env['AIRFLOW_FOR_PRODUCTION'] = 'true'
-
+        )
+        env = os.environ.copy()
+        env['AIRFLOW_PACKAGE_NAME'] = self.package_name
+        if self.for_production:
+            env['AIRFLOW_FOR_PRODUCTION'] = 'true'
+        if verbose:
+            console.print(
+                f"[blue]{self.package_name:60}:[/] Executing cmd: ",
+                " ".join([shlex.quote(c) for c in build_cmd]),
+            )
+        else:
+            console.print(
+                f"[blue]{self.package_name:60}:[/] Running sphinx. "
+                f"The output is hidden until an error occurs."
+            )
+        with open(self.log_build_filename, "wt") as output:
             completed_proc = run(  # pylint: disable=subprocess-run-check
                 build_cmd,
                 cwd=self._src_dir,
@@ -192,35 +340,48 @@ class AirflowDocsBuilder:
                 stderr=output if not verbose else None,
                 timeout=PROCESS_TIMEOUT,
             )
-            if completed_proc.returncode != 0:
-                output.seek(0)
-                print(output.read().decode())
-                build_errors.append(
-                    DocBuildError(
-                        file_path=None,
-                        line_no=None,
-                        message=f"Sphinx returned non-zero exit status: {completed_proc.returncode}.",
-                    )
+        if completed_proc.returncode != 0:
+            build_errors.append(
+                DocBuildError(
+                    file_path=None,
+                    line_no=None,
+                    message=f"Sphinx returned non-zero exit status: {completed_proc.returncode}.",
                 )
-            tmp_file.seek(0)
-            warning_text = tmp_file.read().decode()
+            )
+        if os.path.isfile(self.log_build_warning_filename):
+            with open(self.log_build_warning_filename) as warning_file:
+                warning_text = warning_file.read()
             # Remove 7-bit C1 ANSI escape sequences
             warning_text = re.sub(r"\x1B[@-_][0-?]*[ -/]*[@-~]", "", warning_text)
-            build_errors.extend(parse_sphinx_warnings(warning_text, self._src_dir))
+            build_errors.extend(parse_sphinx_warnings(warning_text, self._src_dir, dockerized))
+        if build_errors:
+            console.print(f"[blue]{self.package_name:60}:[/] [red]Finished docs building with errors[/]")
+        else:
+            console.print(f"[blue]{self.package_name:60}:[/] [green]Finished docs building successfully[/]")
         return build_errors
 
     def publish(self):
         """Copy documentation packages files to airflow-site repository."""
-        print(f"Publishing docs for {self.package_name}")
+        console.print(f"Publishing docs for {self.package_name}")
         output_dir = os.path.join(AIRFLOW_SITE_DIR, self._publish_dir)
         pretty_source = pretty_format_path(self._build_dir, os.getcwd())
         pretty_target = pretty_format_path(output_dir, AIRFLOW_SITE_DIR)
-        print(f"Copy directory: {pretty_source} => {pretty_target}")
+        console.print(f"Copy directory: {pretty_source} => {pretty_target}")
+        if os.path.exists(output_dir):
+            if self.is_versioned:
+                console.print(
+                    f"Skipping previously existing {output_dir}! "
+                    f"Delete it manually if you want to regenerate it!"
+                )
+                console.print()
+                return
+            else:
+                shutil.rmtree(output_dir)
         shutil.copytree(self._build_dir, output_dir)
         if self.is_versioned:
             with open(os.path.join(output_dir, "..", "stable.txt"), "w") as stable_file:
                 stable_file.write(self._current_version)
-        print()
+        console.print()
 
 
 def get_available_providers_packages():
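
For orientation, the dockerized branch above assembles a plain docker run invocation that mounts the host _build and _inventory_cache directories into the CI image and executes sphinx-build through scripts/in_container/run_anything.sh. A hand-expanded sketch of the resulting command, assuming DOCKER_DOCS_DIR resolves to /opt/airflow/docs (it is defined in code_utils.py, which is not shown here), that the host working directory is the Airflow source root, and using "apache-airflow" and Python 3.6 as placeholder package and version, could look like this:

    # AIRFLOW_PACKAGE_NAME and AIRFLOW_FOR_PRODUCTION are expected in the caller's
    # environment - build_docs.py sets them before running the command.
    docker run --rm \
        -e AIRFLOW_FOR_PRODUCTION -e AIRFLOW_PACKAGE_NAME \
        -v "$(pwd)/docs/_build/docs/apache-airflow/latest:/opt/airflow/docs/_build/docs/apache-airflow/latest" \
        -v "$(pwd)/docs/_inventory_cache:/opt/airflow/docs/_inventory_cache" \
        -w /opt/airflow/docs \
        apache/airflow:master-python3.6-ci \
        /opt/airflow/scripts/in_container/run_anything.sh \
        sphinx-build -W --color -b spelling \
            -c /opt/airflow/docs \
            -d /opt/airflow/docs/_doctrees/docs/apache-airflow \
            /opt/airflow/docs/apache-airflow \
            /opt/airflow/docs/_build/docs/apache-airflow/latest/output-spelling-results-apache-airflow
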
diff --git a/docs/exts/docs_build/errors.py b/docs/exts/docs_build/errors.py
index 21106ce..954262d 100644
--- a/docs/exts/docs_build/errors.py
+++ b/docs/exts/docs_build/errors.py
@@ -18,11 +18,16 @@ import os
 from functools import total_ordering
 from typing import Dict, List, NamedTuple, Optional
 
+from rich.console import Console
+
 from airflow.utils.code_utils import prepare_code_snippet
+from docs.exts.docs_build.code_utils import CONSOLE_WIDTH, remap_from_docker
 
 CURRENT_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__)))
 DOCS_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir, os.pardir))
 
+console = Console(force_terminal=True, color_system="standard", width=CONSOLE_WIDTH)
+
 
 @total_ordering
 class DocBuildError(NamedTuple):
@@ -52,28 +57,32 @@ class DocBuildError(NamedTuple):
 
 def display_errors_summary(build_errors: Dict[str, List[DocBuildError]]) -> None:
     """Displays summary of errors"""
-    print("#" * 20, "Docs build errors summary", "#" * 20)
-
+    console.print()
+    console.print("[red]" + "#" * 30 + " Start docs build errors summary " + "#" * 30 + "[/]")
+    console.print()
     for package_name, errors in build_errors.items():
         if package_name:
-            print("=" * 20, package_name, "=" * 20)
+            console.print("=" * 30 + f" [blue]{package_name}[/] " + "=" * 30)
         else:
-            print("=" * 20, "General", "=" * 20)
+            console.print("=" * 30, " [blue]General[/] ", "=" * 30)
         for warning_no, error in enumerate(sorted(errors), 1):
-            print("-" * 20, f"Error {warning_no:3}", "-" * 20)
-            print(error.message)
-            print()
+            console.print("-" * 30, f"[red]Error {warning_no:3}[/]", "-" * 20)
+            console.print(error.message)
+            console.print()
             if error.file_path and error.file_path != "<unknown>" and error.line_no:
-                print(f"File path: {os.path.relpath(error.file_path, start=DOCS_DIR)} ({error.line_no})")
-                print()
-                print(prepare_code_snippet(error.file_path, error.line_no))
+                console.print(
+                    f"File path: {os.path.relpath(error.file_path, start=DOCS_DIR)} ({error.line_no})"
+                )
+                console.print()
+                console.print(prepare_code_snippet(error.file_path, error.line_no))
             elif error.file_path:
-                print(f"File path: {error.file_path}")
-
-    print("#" * 50)
+                console.print(f"File path: {error.file_path}")
+    console.print()
+    console.print("[red]" + "#" * 30 + " End docs build errors summary " + "#" * 30 + "[/]")
+    console.print()
 
 
-def parse_sphinx_warnings(warning_text: str, docs_dir: str) -> List[DocBuildError]:
+def parse_sphinx_warnings(warning_text: str, docs_dir: str, dockerized: bool) -> List[DocBuildError]:
     """
     Parses warnings from Sphinx.
 
@@ -89,7 +98,7 @@ def parse_sphinx_warnings(warning_text: str, docs_dir: str) -> List[DocBuildErro
             try:
                 sphinx_build_errors.append(
                     DocBuildError(
-                        file_path=os.path.join(docs_dir, warning_parts[0]),
+                        file_path=remap_from_docker(os.path.join(docs_dir, warning_parts[0]), dockerized),
                         line_no=int(warning_parts[1]),
                         message=warning_parts[2],
                     )
diff --git a/docs/exts/docs_build/github_action_utils.py b/docs/exts/docs_build/github_action_utils.py
index 4b21b03..f0fc483 100644
--- a/docs/exts/docs_build/github_action_utils.py
+++ b/docs/exts/docs_build/github_action_utils.py
@@ -33,6 +33,7 @@ def with_group(title):
         yield
         return
     print(f"::group::{title}")
+    print()
     yield
     print("\033[0m")
     print("::endgroup::")
diff --git a/docs/exts/docs_build/spelling_checks.py b/docs/exts/docs_build/spelling_checks.py
index c2b7ca9..2be9cca 100644
--- a/docs/exts/docs_build/spelling_checks.py
+++ b/docs/exts/docs_build/spelling_checks.py
@@ -20,11 +20,16 @@ import re
 from functools import total_ordering
 from typing import Dict, List, NamedTuple, Optional
 
+from rich.console import Console
+
 from airflow.utils.code_utils import prepare_code_snippet
+from docs.exts.docs_build.code_utils import CONSOLE_WIDTH, remap_from_docker
 
 CURRENT_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__)))
 DOCS_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir, os.pardir))
 
+console = Console(force_terminal=True, color_system="standard", width=CONSOLE_WIDTH)
+
 
 @total_ordering
 class SpellingError(NamedTuple):
@@ -75,7 +80,7 @@ class SpellingError(NamedTuple):
         return left < right
 
 
-def parse_spelling_warnings(warning_text: str, docs_dir) -> List[SpellingError]:
+def parse_spelling_warnings(warning_text: str, docs_dir: str, dockerized: bool) -> List[SpellingError]:
     """
     Parses warnings from Sphinx.
 
@@ -94,7 +99,7 @@ def parse_spelling_warnings(warning_text: str, docs_dir) -> List[SpellingError]:
             try:
                 sphinx_spelling_errors.append(
                     SpellingError(
-                        file_path=os.path.join(docs_dir, warning_parts[0]),
+                        file_path=remap_from_docker(os.path.join(docs_dir, warning_parts[0]), dockerized),
                         line_no=int(warning_parts[1]) if warning_parts[1] not in ('None', '') else None,
                         spelling=warning_parts[2],
                         suggestion=warning_parts[3] if warning_parts[3] else None,
@@ -130,43 +135,47 @@ def parse_spelling_warnings(warning_text: str, docs_dir) -> List[SpellingError]:
 
 def display_spelling_error_summary(spelling_errors: Dict[str, List[SpellingError]]) -> None:
     """Displays summary of Spelling errors"""
-    print("#" * 20, "Spelling errors summary", "#" * 20)
+    console.print()
+    console.print("[red]" + "#" * 30 + " Start spelling errors summary " + "#" * 30 + "[/]")
+    console.print()
 
     for package_name, errors in sorted(spelling_errors.items()):
         if package_name:
-            print("=" * 20, package_name, "=" * 20)
+            console.print("=" * 30, f" [blue]{package_name}[/] ", "=" * 30)
         else:
-            print("=" * 20, "General", "=" * 20)
+            console.print("=" * 30, " [blue]General[/] ", "=" * 30)
 
         for warning_no, error in enumerate(sorted(errors), 1):
-            print("-" * 20, f"Error {warning_no:3}", "-" * 20)
+            console.print("-" * 30, f"Error {warning_no:3}", "-" * 30)
 
             _display_error(error)
 
-    print("=" * 50)
-    print()
+    console.print("=" * 100)
+    console.print()
     msg = """
 If the spelling is correct, add the spelling to docs/spelling_wordlist.txt
 or use the spelling directive.
 Check https://sphinxcontrib-spelling.readthedocs.io/en/latest/customize.html#private-dictionaries
 for more details.
     """
-    print(msg)
-    print()
-    print("#" * 50)
+    console.print(msg)
+    console.print()
+    console.print()
+    console.print("[red]" + "#" * 30 + " End spelling errors summary " + "#" * 30 + "[/]")
+    console.print()
 
 
 def _display_error(error: SpellingError):
-    print(error.message)
-    print()
+    console.print(error.message)
+    console.print()
     if error.file_path:
-        print(f"File path: {os.path.relpath(error.file_path, start=DOCS_DIR)}")
+        console.print(f"File path: {os.path.relpath(error.file_path, start=DOCS_DIR)}")
         if error.spelling:
-            print(f"Incorrect Spelling: '{error.spelling}'")
+            console.print(f"Incorrect Spelling: '{error.spelling}'")
         if error.suggestion:
-            print(f"Suggested Spelling: '{error.suggestion}'")
+            console.print(f"Suggested Spelling: '{error.suggestion}'")
         if error.context_line:
-            print(f"Line with Error: '{error.context_line}'")
+            console.print(f"Line with Error: '{error.context_line}'")
         if error.line_no:
-            print(f"Line Number: {error.line_no}")
-            print(prepare_code_snippet(error.file_path, error.line_no))
+            console.print(f"Line Number: {error.line_no}")
+            console.print(prepare_code_snippet(error.file_path, error.line_no))
diff --git a/provider_packages/README.rst b/provider_packages/README.rst
new file mode 100644
index 0000000..9761c5e
--- /dev/null
+++ b/provider_packages/README.rst
@@ -0,0 +1,53 @@
+
+.. Licensed to the Apache Software Foundation (ASF) under one
+   or more contributor license agreements.  See the NOTICE file
+   distributed with this work for additional information
+   regarding copyright ownership.  The ASF licenses this file
+   to you under the Apache License, Version 2.0 (the
+   "License"); you may not use this file except in compliance
+   with the License.  You may obtain a copy of the License at
+
+..   http://www.apache.org/licenses/LICENSE-2.0
+
+.. Unless required by applicable law or agreed to in writing,
+   software distributed under the License is distributed on an
+   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+   KIND, either express or implied.  See the License for the
+   specific language governing permissions and limitations
+   under the License.
+
+
+Package ``apache-airflow-providers-ssh``
+
+Release: ``1.0.0dev``
+
+
+`Secure Shell (SSH) <https://tools.ietf.org/html/rfc4251>`__
+
+
+Provider package
+================
+
+This is a provider package for the ``ssh`` provider. All classes for this provider package
+are in the ``airflow.providers.ssh`` Python package.
+
+You can find package information and changelog for the provider
+in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-ssh/1.0.0/>`_.
+
+
+Installation
+============
+
+You can install this package on top of an existing airflow 2.* installation via
+``pip install apache-airflow-providers-ssh``
+
+PIP requirements
+================
+
+=============  ==================
+PIP package    Version required
+=============  ==================
+``paramiko``   ``>=2.6.0``
+``pysftp``     ``>=0.2.9``
+``sshtunnel``  ``>=0.1.4,<0.2``
+=============  ==================
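
As a usage sketch, installing the package described above together with Airflow's published constraints could look like the following (the constraint URL pattern mirrors the one used by ci_docs.sh below; the Airflow 2.0.1 and Python 3.6 values are placeholders, not something this README mandates):

    # hypothetical target versions - substitute your own Airflow and Python versions
    pip install apache-airflow-providers-ssh \
        --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.1/constraints-3.6.txt"
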
diff --git a/scripts/ci/docs/ci_docs.sh b/scripts/ci/docs/ci_docs.sh
index be0d2ed..003a8c2 100755
--- a/scripts/ci/docs/ci_docs.sh
+++ b/scripts/ci/docs/ci_docs.sh
@@ -22,4 +22,17 @@ build_images::prepare_ci_build
 
 build_images::rebuild_ci_image_if_needed_with_group
 
-runs::run_docs "${@}"
+start_end::group_start "Preparing venv for doc building"
+
+python3 -m venv .docs-venv
+source .docs-venv/bin/activate
+export PYTHONPATH=${AIRFLOW_SOURCES}
+
+pip install --upgrade pip==20.2.4
+
+pip install .[doc] --upgrade --constraint \
+    "https://raw.githubusercontent.com/apache/airflow/constraints-${DEFAULT_BRANCH}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt"
+
+start_end::group_end
+
+"${AIRFLOW_SOURCES}/docs/build_docs.py" -j 0 "${@}"
diff --git a/scripts/in_container/_in_container_utils.sh b/scripts/in_container/_in_container_utils.sh
index f0006e1..7d80f00 100644
--- a/scripts/in_container/_in_container_utils.sh
+++ b/scripts/in_container/_in_container_utils.sh
@@ -137,17 +137,16 @@ function in_container_cleanup_pycache() {
 function in_container_fix_ownership() {
     if [[ ${HOST_OS:=} == "Linux" ]]; then
         DIRECTORIES_TO_FIX=(
-            "/tmp"
             "/files"
             "/root/.aws"
             "/root/.azure"
             "/root/.config/gcloud"
             "/root/.docker"
-            "${AIRFLOW_SOURCES}"
+            "/opt/airflow/logs"
+            "/opt/airflow/docs"
         )
-        sudo find "${DIRECTORIES_TO_FIX[@]}" -print0 -user root 2>/dev/null |
-            sudo xargs --null chown "${HOST_USER_ID}.${HOST_GROUP_ID}" --no-dereference ||
-            true >/dev/null 2>&1
+        find "${DIRECTORIES_TO_FIX[@]}" -print0 -user root 2>/dev/null |
+            xargs --null chown "${HOST_USER_ID}.${HOST_GROUP_ID}" --no-dereference || true >/dev/null 2>&1
     fi
 }
 
diff --git a/scripts/in_container/run_fix_ownership.sh b/scripts/in_container/run_anything.sh
similarity index 83%
copy from scripts/in_container/run_fix_ownership.sh
copy to scripts/in_container/run_anything.sh
index eaaee77..233cb47 100755
--- a/scripts/in_container/run_fix_ownership.sh
+++ b/scripts/in_container/run_anything.sh
@@ -15,7 +15,4 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-# shellcheck source=scripts/in_container/_in_container_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/_in_container_script_init.sh"
-
-in_container_fix_ownership
+"${@}"
diff --git a/scripts/in_container/run_fix_ownership.sh b/scripts/in_container/run_fix_ownership.sh
index eaaee77..d9e98ff 100755
--- a/scripts/in_container/run_fix_ownership.sh
+++ b/scripts/in_container/run_fix_ownership.sh
@@ -15,7 +15,7 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-# shellcheck source=scripts/in_container/_in_container_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/_in_container_script_init.sh"
+# shellcheck source=scripts/in_container/_in_container_utils.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/_in_container_utils.sh"
 
 in_container_fix_ownership

[airflow] 14/16: Removes unused CI feature of printing output on error (#15190)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3d172162837f9327135e4e051e303c827d70a708
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Apr 4 22:20:11 2021 +0200

    Removes unused CI feature of printing output on error (#15190)
    
    Fixes: #13924
    (cherry picked from commit 7c17bf0d1e828b454a6b2c7245ded275b313c792)
---
 scripts/in_container/_in_container_utils.sh | 30 +++--------------------------
 1 file changed, 3 insertions(+), 27 deletions(-)

diff --git a/scripts/in_container/_in_container_utils.sh b/scripts/in_container/_in_container_utils.sh
index 7d80f00..ad3083e 100644
--- a/scripts/in_container/_in_container_utils.sh
+++ b/scripts/in_container/_in_container_utils.sh
@@ -54,16 +54,6 @@ function assert_in_container() {
 }
 
 function in_container_script_start() {
-    OUTPUT_PRINTED_ONLY_ON_ERROR=$(mktemp)
-    export OUTPUT_PRINTED_ONLY_ON_ERROR
-    readonly OUTPUT_PRINTED_ONLY_ON_ERROR
-
-    if [[ ${VERBOSE=} == "true" && ${GITHUB_ACTIONS=} != "true" ]]; then
-        echo
-        echo "Output is redirected to ${OUTPUT_PRINTED_ONLY_ON_ERROR} and will be printed on error only"
-        echo
-    fi
-
     if [[ ${VERBOSE_COMMANDS:="false"} == "true" ]]; then
         set -x
     fi
@@ -74,23 +64,9 @@ function in_container_script_end() {
     EXIT_CODE=$?
     if [[ ${EXIT_CODE} != 0 ]]; then
         if [[ "${PRINT_INFO_FROM_SCRIPTS="true"}" == "true" ]]; then
-            if [[ -f "${OUTPUT_PRINTED_ONLY_ON_ERROR}" ]]; then
-                echo "###########################################################################################"
-                echo
-                echo "${COLOR_BLUE} EXIT CODE: ${EXIT_CODE} in container (See above for error message). Below is the output of the last action! ${COLOR_RESET}"
-                echo
-                echo "${COLOR_BLUE}***  BEGINNING OF THE LAST COMMAND OUTPUT *** ${COLOR_RESET}"
-                cat "${OUTPUT_PRINTED_ONLY_ON_ERROR}"
-                echo "${COLOR_BLUE}***  END OF THE LAST COMMAND OUTPUT ***  ${COLOR_RESET}"
-                echo
-                echo "${COLOR_BLUE} EXIT CODE: ${EXIT_CODE} in container. The actual error might be above the output!  ${COLOR_RESET}"
-                echo
-                echo "###########################################################################################"
-            else
-                echo "########################################################################################################################"
-                echo "${COLOR_BLUE} [IN CONTAINER]   EXITING ${0} WITH EXIT CODE ${EXIT_CODE}  ${COLOR_RESET}"
-                echo "########################################################################################################################"
-            fi
+            echo "########################################################################################################################"
+            echo "${COLOR_BLUE} [IN CONTAINER]   EXITING ${0} WITH EXIT CODE ${EXIT_CODE}  ${COLOR_RESET}"
+            echo "########################################################################################################################"
         fi
     fi
 

[airflow] 13/16: Fixes problem when Pull Request is `weird` - has null head_repo (#15189)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 25caba7e31baf6f6ae066a6dbe766b3afd08d5cc
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Apr 4 20:30:02 2021 +0200

    Fixes problem when Pull Request is `weird` - has null head_repo (#15189)
    
    Fixes: #15188
    (cherry picked from commit 041a09f3ee6bc447c3457b108bd5431a2fd70ad9)
---
 .github/actions/cancel-workflow-runs | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/actions/cancel-workflow-runs b/.github/actions/cancel-workflow-runs
index 953e057..8248bc1 160000
--- a/.github/actions/cancel-workflow-runs
+++ b/.github/actions/cancel-workflow-runs
@@ -1 +1 @@
-Subproject commit 953e057dc81d3458935a18d1184c386b0f6b5738
+Subproject commit 8248bc1feff049e98c0e6a96889b147199c38203

[airflow] 12/16: Bump K8S versions to latest supported ones. (#15156)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3e9633e720b3eefa774087a4b8c9bacbbb22615a
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Apr 4 15:08:18 2021 +0200

    Bump K8S versions to latest supported ones. (#15156)
    
    K8S has a one-year support policy. This PR updates the
    K8S versions we use to test to the latest available in three
    supported versions of K8S as of now: 1.20, 1.19 and 1.18.
    
    The 1.16 and 1.17 versions are not supported any more as of today.
    
    https://en.wikipedia.org/wiki/Kubernetes
    
    This change also bumps kind to the latest version (we use kind for
    K8S testing) and fixes the configuration to match this version.
    
    (cherry picked from commit 36ab9dd7c4188278068c9b8c280d874760f02c5b)
---
 BREEZE.rst                                   |  8 ++++----
 README.md                                    |  2 +-
 breeze-complete                              |  4 ++--
 docs/apache-airflow/installation.rst         |  2 +-
 scripts/ci/kubernetes/kind-cluster-conf.yaml | 15 ++++-----------
 scripts/ci/libraries/_initialization.sh      |  4 ++--
 6 files changed, 14 insertions(+), 21 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index 293cb37..2a8a74a 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -2485,17 +2485,17 @@ This is the current syntax for  `./breeze <./breeze>`_:
           Kubernetes version - only used in case one of kind-cluster commands is used.
           One of:
 
-                 v1.18.6 v1.17.5 v1.16.9
+                 v1.20.2 v1.19.7 v1.18.15
 
-          Default: v1.18.6
+          Default: v1.20.2
 
   --kind-version KIND_VERSION
           Kind version - only used in case one of kind-cluster commands is used.
           One of:
 
-                 v0.8.0
+                 v0.10.0
 
-          Default: v0.8.0
+          Default: v0.10.0
 
   --helm-version HELM_VERSION
           Helm version - only used in case one of kind-cluster commands is used.
diff --git a/README.md b/README.md
index 7385ed2..0270131 100644
--- a/README.md
+++ b/README.md
@@ -84,7 +84,7 @@ Apache Airflow is tested with:
 | PostgreSQL   | 9.6, 10, 11, 12, 13       | 9.6, 10, 11, 12, 13      | 9.6, 10, 11, 12, 13        |
 | MySQL        | 5.7, 8                    | 5.7, 8                   | 5.6, 5.7                   |
 | SQLite       | 3.15.0+                   | 3.15.0+                  | 3.15.0+                    |
-| Kubernetes   | 1.16.9, 1.17.5, 1.18.6    | 1.16.9, 1.17.5, 1.18.6   | 1.16.9, 1.17.5, 1.18.6     |
+| Kubernetes   | 1.20, 1.19, 1.18          | 1.20, 1.19, 1.18         | 1.18, 1.17, 1.16           |
 
 **Note:** MySQL 5.x versions are unable to or have limitations with
 running multiple schedulers -- please see the "Scheduler" docs. MariaDB is not tested/recommended.
diff --git a/breeze-complete b/breeze-complete
index a75b267..83dfe9f 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -30,9 +30,9 @@ _breeze_allowed_generate_constraints_modes="source-providers pypi-providers no-p
 # registrys is good here even if it is not correct english. We are adding s automatically to all variables
 _breeze_allowed_github_registrys="docker.pkg.github.com ghcr.io"
 _breeze_allowed_kubernetes_modes="image"
-_breeze_allowed_kubernetes_versions="v1.18.6 v1.17.5 v1.16.9"
+_breeze_allowed_kubernetes_versions="v1.20.2 v1.19.7 v1.18.15"
 _breeze_allowed_helm_versions="v3.2.4"
-_breeze_allowed_kind_versions="v0.8.0"
+_breeze_allowed_kind_versions="v0.10.0"
 _breeze_allowed_mysql_versions="5.7 8"
 _breeze_allowed_postgres_versions="9.6 10 11 12 13"
 _breeze_allowed_kind_operations="start stop restart status deploy test shell k9s"
diff --git a/docs/apache-airflow/installation.rst b/docs/apache-airflow/installation.rst
index 0184216..a348334 100644
--- a/docs/apache-airflow/installation.rst
+++ b/docs/apache-airflow/installation.rst
@@ -42,7 +42,7 @@ Airflow is tested with:
   * MySQL: 5.7, 8
   * SQLite: 3.15.0+
 
-* Kubernetes: 1.16.9, 1.17.5, 1.18.6
+* Kubernetes: 1.18.15, 1.19.7, 1.20.2
 
 **Note:** MySQL 5.x versions are unable to or have limitations with
 running multiple schedulers -- please see: :doc:`/scheduler`. MariaDB is not tested/recommended.
diff --git a/scripts/ci/kubernetes/kind-cluster-conf.yaml b/scripts/ci/kubernetes/kind-cluster-conf.yaml
index df60820..f03c1b7 100644
--- a/scripts/ci/kubernetes/kind-cluster-conf.yaml
+++ b/scripts/ci/kubernetes/kind-cluster-conf.yaml
@@ -16,9 +16,10 @@
 # under the License.
 ---
 kind: Cluster
-apiVersion: kind.sigs.k8s.io/v1alpha3
+apiVersion: kind.x-k8s.io/v1alpha4
 networking:
-  apiServerAddress: 0.0.0.0
+  ipFamily: ipv4
+  apiServerAddress: "127.0.0.1"
   apiServerPort: 19090
 nodes:
   - role: control-plane
@@ -26,13 +27,5 @@ nodes:
     extraPortMappings:
       - containerPort: 30007
         hostPort: 8080
-        listenAddress: "0.0.0.0"
+        listenAddress: "127.0.0.1"
         protocol: TCP
-kubeadmConfigPatchesJson6902:
-  - group: kubeadm.k8s.io
-    version: v1beta2
-    kind: ClusterConfiguration
-    patch: |
-      - op: add
-        path: /apiServer/certSANs/-
-        value: docker
diff --git a/scripts/ci/libraries/_initialization.sh b/scripts/ci/libraries/_initialization.sh
index cb42693..f924962 100644
--- a/scripts/ci/libraries/_initialization.sh
+++ b/scripts/ci/libraries/_initialization.sh
@@ -476,13 +476,13 @@ function initialization::initialize_provider_package_building() {
 # Determine versions of kubernetes cluster and tools used
 function initialization::initialize_kubernetes_variables() {
     # Currently supported versions of Kubernetes
-    CURRENT_KUBERNETES_VERSIONS+=("v1.18.6" "v1.17.5" "v1.16.9")
+    CURRENT_KUBERNETES_VERSIONS+=("v1.20.2" "v1.19.7" "v1.18.15")
     export CURRENT_KUBERNETES_VERSIONS
     # Currently supported modes of Kubernetes
     CURRENT_KUBERNETES_MODES+=("image")
     export CURRENT_KUBERNETES_MODES
     # Currently supported versions of Kind
-    CURRENT_KIND_VERSIONS+=("v0.8.0")
+    CURRENT_KIND_VERSIONS+=("v0.10.0")
     export CURRENT_KIND_VERSIONS
     # Currently supported versions of Helm
     CURRENT_HELM_VERSIONS+=("v3.2.4")
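
To try one of the newly supported versions locally, a kind cluster can be created from the updated config in roughly the following way. This is only a sketch: Breeze normally drives this through the --kubernetes-version and --kind-version flags shown above, and the cluster name and node image tag below are just examples.

    # requires kind v0.10.0 on PATH
    kind create cluster \
        --name airflow \
        --config scripts/ci/kubernetes/kind-cluster-conf.yaml \
        --image kindest/node:v1.20.2
    kubectl cluster-info --context kind-airflow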

[airflow] 15/16: Merges quarantined tests into single job (#15153)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e87cd1f259f591aeaa0521ef79b9d6249739d411
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Mon Apr 5 19:58:10 2021 +0200

    Merges quarantined tests into single job (#15153)
    
    (cherry picked from commit 1087226f756b3ff9ea48398e53f9074b0ed4c1cc)
---
 .github/workflows/ci.yml                           |   9 +-
 scripts/ci/libraries/_all_libs.sh                  |   2 +
 scripts/ci/libraries/_initialization.sh            |   3 +-
 scripts/ci/libraries/_parallel.sh                  |  35 +++++-
 scripts/ci/libraries/_testing.sh                   | 116 +++++++++++++++++
 scripts/ci/testing/ci_run_airflow_testing.sh       | 140 +++------------------
 scripts/ci/testing/ci_run_quarantined_tests.sh     |  87 +++++++++++++
 .../ci_run_single_airflow_test_in_docker.sh        |   6 +-
 8 files changed, 259 insertions(+), 139 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index ddc985b..dc98f5c 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -817,15 +817,8 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
     runs-on: ${{ fromJson(needs.build-info.outputs.runsOn) }}
     continue-on-error: true
     needs: [build-info, ci-images]
-    strategy:
-      matrix:
-        include:
-          - backend: mysql
-          - backend: postgres
-          - backend: sqlite
     env:
       RUNS_ON: ${{ fromJson(needs.build-info.outputs.runsOn) }}
-      BACKEND: ${{ matrix.backend }}
       PYTHON_MAJOR_MINOR_VERSION: ${{ needs.build-info.outputs.defaultPythonVersion }}
       MYSQL_VERSION: ${{needs.build-info.outputs.defaultMySQLVersion}}
       POSTGRES_VERSION: ${{needs.build-info.outputs.defaultPostgresVersion}}
@@ -860,7 +853,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
       - name: "Prepare CI image ${{env.PYTHON_MAJOR_MINOR_VERSION}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
         run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
       - name: "Tests: Quarantined"
-        run: ./scripts/ci/testing/ci_run_airflow_testing.sh
+        run: ./scripts/ci/testing/ci_run_quarantined_tests.sh
       - name: "Upload Quarantine test results"
         uses: actions/upload-artifact@v2
         if: always()
diff --git a/scripts/ci/libraries/_all_libs.sh b/scripts/ci/libraries/_all_libs.sh
index 09a147d..04e25e8 100755
--- a/scripts/ci/libraries/_all_libs.sh
+++ b/scripts/ci/libraries/_all_libs.sh
@@ -60,6 +60,8 @@ readonly SCRIPTS_CI_DIR
 . "${LIBRARIES_DIR}"/_spinner.sh
 # shellcheck source=scripts/ci/libraries/_start_end.sh
 . "${LIBRARIES_DIR}"/_start_end.sh
+# shellcheck source=scripts/ci/libraries/_testing.sh
+. "${LIBRARIES_DIR}"/_testing.sh
 # shellcheck source=scripts/ci/libraries/_verbosity.sh
 . "${LIBRARIES_DIR}"/_verbosity.sh
 # shellcheck source=scripts/ci/libraries/_verify_image.sh
diff --git a/scripts/ci/libraries/_initialization.sh b/scripts/ci/libraries/_initialization.sh
index f924962..f82cb55 100644
--- a/scripts/ci/libraries/_initialization.sh
+++ b/scripts/ci/libraries/_initialization.sh
@@ -710,7 +710,7 @@ Initialization variables:
 
 Test variables:
 
-    TEST_TYPE: '${TEST_TYPE}'
+    TEST_TYPE: '${TEST_TYPE=}'
 
 EOF
     if [[ "${CI}" == "true" ]]; then
@@ -776,7 +776,6 @@ function initialization::make_constants_read_only() {
     readonly HELM_VERSION
     readonly KUBECTL_VERSION
 
-    readonly BACKEND
     readonly POSTGRES_VERSION
     readonly MYSQL_VERSION
 
diff --git a/scripts/ci/libraries/_parallel.sh b/scripts/ci/libraries/_parallel.sh
index dfe1c4d..739bae1 100644
--- a/scripts/ci/libraries/_parallel.sh
+++ b/scripts/ci/libraries/_parallel.sh
@@ -73,7 +73,7 @@ function parallel::monitor_loop() {
         do
             parallel_process=$(basename "${directory}")
 
-            echo "${COLOR_BLUE}### The last lines for ${parallel_process} process ###${COLOR_RESET}"
+            echo "${COLOR_BLUE}### The last lines for ${parallel_process} process: ${directory}/stdout ###${COLOR_RESET}"
             echo
             tail -2 "${directory}/stdout" || true
             echo
@@ -160,3 +160,36 @@ function parallel::print_job_summary_and_return_status_code() {
     done
     return "${return_code}"
 }
+
+function parallel::kill_all_running_docker_containers() {
+    echo
+    echo "${COLOR_BLUE}Kill all running docker containers${COLOR_RESET}"
+    echo
+    # shellcheck disable=SC2046
+    docker kill $(docker ps -q) || true
+}
+
+function parallel::system_prune_docker() {
+    echo
+    echo "${COLOR_BLUE}System-prune docker${COLOR_RESET}"
+    echo
+    docker_v system prune --force --volumes
+    echo
+}
+
+# Cleans up runner before test execution.
+#  * Kills all running docker containers
+#  * System prune to clean all the temporary/unnamed images and left-over volumes
+#  * Print information about available space and memory
+#  * Kills stale semaphore locks
+function parallel::cleanup_runner() {
+    start_end::group_start "Cleanup runner"
+    parallel::kill_all_running_docker_containers
+    parallel::system_prune_docker
+    docker_engine_resources::get_available_memory_in_docker
+    docker_engine_resources::get_available_cpus_in_docker
+    docker_engine_resources::get_available_disk_space_in_docker
+    docker_engine_resources::print_overall_stats
+    parallel::kill_stale_semaphore_locks
+    start_end::group_end
+}
diff --git a/scripts/ci/libraries/_testing.sh b/scripts/ci/libraries/_testing.sh
new file mode 100644
index 0000000..28d1fc6
--- /dev/null
+++ b/scripts/ci/libraries/_testing.sh
@@ -0,0 +1,116 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+export MEMORY_REQUIRED_FOR_INTEGRATION_TEST_PARALLEL_RUN=33000
+
+function testing::skip_tests_if_requested(){
+    if [[ -f ${BUILD_CACHE_DIR}/.skip_tests ]]; then
+        echo
+        echo "Skipping running tests !!!!!"
+        echo
+        exit
+    fi
+}
+
+function testing::get_docker_compose_local() {
+    DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/files.yml")
+    if [[ ${MOUNT_SELECTED_LOCAL_SOURCES} == "true" ]]; then
+        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/local.yml")
+    fi
+    if [[ ${MOUNT_ALL_LOCAL_SOURCES} == "true" ]]; then
+        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/local-all-sources.yml")
+    fi
+
+    if [[ ${GITHUB_ACTIONS} == "true" ]]; then
+        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/ga.yml")
+    fi
+
+    if [[ ${FORWARD_CREDENTIALS} == "true" ]]; then
+        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/forward-credentials.yml")
+    fi
+
+    if [[ -n ${INSTALL_AIRFLOW_VERSION=} || -n ${INSTALL_AIRFLOW_REFERENCE} ]]; then
+        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/remove-sources.yml")
+    fi
+    readonly DOCKER_COMPOSE_LOCAL
+}
+
+function testing::get_maximum_parallel_test_jobs() {
+    docker_engine_resources::get_available_cpus_in_docker
+    if [[ ${RUNS_ON} != *"self-hosted"* ]]; then
+        echo
+        echo "${COLOR_YELLOW}This is a Github Public runner - for now we are forcing max parallel Quarantined tests jobs to 1 for those${COLOR_RESET}"
+        echo
+        export MAX_PARALLEL_QUARANTINED_TEST_JOBS="1"
+    else
+        if [[ ${MAX_PARALLEL_QUARANTINED_TEST_JOBS=} != "" ]]; then
+            echo
+            echo "${COLOR_YELLOW}Maximum parallel Quarantined test jobs forced via MAX_PARALLEL_QUARANTINED_TEST_JOBS = ${MAX_PARALLEL_QUARANTINED_TEST_JOBS}${COLOR_RESET}"
+            echo
+        else
+            MAX_PARALLEL_QUARANTINED_TEST_JOBS=${CPUS_AVAILABLE_FOR_DOCKER}
+            echo
+            echo "${COLOR_YELLOW}Maximum parallel Quarantined test jobs set to number of CPUs available for Docker = ${MAX_PARALLEL_QUARANTINED_TEST_JOBS}${COLOR_RESET}"
+            echo
+        fi
+
+    fi
+
+    if [[ ${MAX_PARALLEL_TEST_JOBS=} != "" ]]; then
+        echo
+        echo "${COLOR_YELLOW}Maximum parallel test jobs forced via MAX_PARALLEL_TEST_JOBS = ${MAX_PARALLEL_TEST_JOBS}${COLOR_RESET}"
+        echo
+    else
+        MAX_PARALLEL_TEST_JOBS=${CPUS_AVAILABLE_FOR_DOCKER}
+        echo
+        echo "${COLOR_YELLOW}Maximum parallel test jobs set to number of CPUs available for Docker = ${MAX_PARALLEL_TEST_JOBS}${COLOR_RESET}"
+        echo
+    fi
+    export MAX_PARALLEL_TEST_JOBS
+}
+
+function testing::get_test_types_to_run() {
+    if [[ -n "${FORCE_TEST_TYPE=}" ]]; then
+        # Handle case where test type is forced from outside
+        export TEST_TYPES="${FORCE_TEST_TYPE}"
+    fi
+
+    if [[ -z "${TEST_TYPES=}" ]]; then
+        TEST_TYPES="Core Providers API CLI Integration Other WWW"
+        echo
+        echo "Test types not specified. Adding all: ${TEST_TYPES}"
+        echo
+    fi
+
+    if [[ -z "${FORCE_TEST_TYPE=}" ]]; then
+        # Add Postgres/MySQL special test types in case we are running several test types
+        if [[ ${BACKEND} == "postgres" && ${TEST_TYPES} != "Quarantined" ]]; then
+            TEST_TYPES="${TEST_TYPES} Postgres"
+            echo
+            echo "Added Postgres. Tests to run: ${TEST_TYPES}"
+            echo
+        fi
+        if [[ ${BACKEND} == "mysql" && ${TEST_TYPES} != "Quarantined" ]]; then
+            TEST_TYPES="${TEST_TYPES} MySQL"
+            echo
+            echo "Added MySQL. Tests to run: ${TEST_TYPES}"
+            echo
+        fi
+    fi
+    readonly TEST_TYPES
+}
diff --git a/scripts/ci/testing/ci_run_airflow_testing.sh b/scripts/ci/testing/ci_run_airflow_testing.sh
index af147ad..fa8c044 100755
--- a/scripts/ci/testing/ci_run_airflow_testing.sh
+++ b/scripts/ci/testing/ci_run_airflow_testing.sh
@@ -23,128 +23,13 @@ export RUN_TESTS
 SKIPPED_FAILED_JOB="Quarantined"
 export SKIPPED_FAILED_JOB
 
-# shellcheck source=scripts/ci/libraries/_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
-
-if [[ -f ${BUILD_CACHE_DIR}/.skip_tests ]]; then
-    echo
-    echo "Skipping running tests !!!!!"
-    echo
-    exit
-fi
-
-# In case we see too many failures on regular PRs from our users using GitHub Public runners
-# We can uncomment this and come back to sequential test-type execution
-#if [[ ${RUNS_ON} != *"self-hosted"* ]]; then
-#    echo
-#    echo "${COLOR_YELLOW}This is a Github Public runner - for now we are forcing max parallel jobs to 1 for those${COLOR_RESET}"
-#    echo "${COLOR_YELLOW}Until we fix memory usage to allow up to 2 parallel runs on those runners${COLOR_RESET}"
-#    echo
-#    # Forces testing in parallel in case the script is run on self-hosted runners
-#    export MAX_PARALLEL_TEST_JOBS="1"
-#fi
-
 SEMAPHORE_NAME="tests"
+export SEMAPHORE_NAME
 
-function prepare_tests_to_run() {
-    DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/files.yml")
-    if [[ ${MOUNT_SELECTED_LOCAL_SOURCES} == "true" ]]; then
-        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/local.yml")
-    fi
-    if [[ ${MOUNT_ALL_LOCAL_SOURCES} == "true" ]]; then
-        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/local-all-sources.yml")
-    fi
-
-    if [[ ${GITHUB_ACTIONS} == "true" ]]; then
-        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/ga.yml")
-    fi
-
-    if [[ ${FORWARD_CREDENTIALS} == "true" ]]; then
-        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/forward-credentials.yml")
-    fi
-
-    if [[ -n ${INSTALL_AIRFLOW_VERSION=} || -n ${INSTALL_AIRFLOW_REFERENCE} ]]; then
-        DOCKER_COMPOSE_LOCAL+=("-f" "${SCRIPTS_CI_DIR}/docker-compose/remove-sources.yml")
-    fi
-    readonly DOCKER_COMPOSE_LOCAL
-
-    if [[ -n "${FORCE_TEST_TYPE=}" ]]; then
-        # Handle case where test type is forced from outside
-        export TEST_TYPES="${FORCE_TEST_TYPE}"
-    fi
-
-    if [[ -z "${TEST_TYPES=}" ]]; then
-        TEST_TYPES="Core Providers API CLI Integration Other WWW"
-        echo
-        echo "Test types not specified. Adding all: ${TEST_TYPES}"
-        echo
-    fi
-
-    if [[ -z "${FORCE_TEST_TYPE=}" ]]; then
-        # Add Postgres/MySQL special test types in case we are running several test types
-        if [[ ${BACKEND} == "postgres" && ${TEST_TYPES} != "Quarantined" ]]; then
-            TEST_TYPES="${TEST_TYPES} Postgres"
-            echo
-            echo "Added Postgres. Tests to run: ${TEST_TYPES}"
-            echo
-        fi
-        if [[ ${BACKEND} == "mysql" && ${TEST_TYPES} != "Quarantined" ]]; then
-            TEST_TYPES="${TEST_TYPES} MySQL"
-            echo
-            echo "Added MySQL. Tests to run: ${TEST_TYPES}"
-            echo
-        fi
-    fi
-    readonly TEST_TYPES
-}
-
-function kill_all_running_docker_containers() {
-    echo
-    echo "${COLOR_BLUE}Kill all running docker containers${COLOR_RESET}"
-    echo
-    # shellcheck disable=SC2046
-    docker kill $(docker ps -q) || true
-}
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
 
-function system_prune_docker() {
-    echo
-    echo "${COLOR_BLUE}System-prune docker${COLOR_RESET}"
-    echo
-    docker_v system prune --force --volumes
-    echo
-}
 
-function get_maximum_parallel_test_jobs() {
-    if [[ ${MAX_PARALLEL_TEST_JOBS=} != "" ]]; then
-        echo
-        echo "${COLOR_YELLOW}Maximum parallel test jobs forced vi MAX_PARALLEL_TEST_JOBS = ${MAX_PARALLEL_TEST_JOBS}${COLOR_RESET}"
-        echo
-    else
-        MAX_PARALLEL_TEST_JOBS=${CPUS_AVAILABLE_FOR_DOCKER}
-        echo
-        echo "${COLOR_YELLOW}Maximum parallel test jobs set to number of CPUs available for Docker = ${MAX_PARALLEL_TEST_JOBS}${COLOR_RESET}"
-        echo
-    fi
-    export MAX_PARALLEL_TEST_JOBS
-}
-
-# Cleans up runner before test execution.
-#  * Kills all running docker containers
-#  * System prune to clean all the temporary/unnamed images and left-over volumes
-#  * Print information about available space and memory
-#  * Kills stale semaphore locks
-function cleanup_runner() {
-    start_end::group_start "Cleanup runner"
-    kill_all_running_docker_containers
-    system_prune_docker
-    docker_engine_resources::get_available_memory_in_docker
-    docker_engine_resources::get_available_cpus_in_docker
-    docker_engine_resources::get_available_disk_space_in_docker
-    docker_engine_resources::print_overall_stats
-    get_maximum_parallel_test_jobs
-    parallel::kill_stale_semaphore_locks
-    start_end::group_end
-}
 
 # Starts test types in parallel
 # test_types_to_run - list of test types (it's not an array, it is a space-separated list)
@@ -171,9 +56,6 @@ function run_test_types_in_parallel() {
     start_end::group_end
 }
 
-
-export MEMORY_REQUIRED_FOR_INTEGRATION_TEST_PARALLEL_RUN=33000
-
 # Runs all test types in parallel depending on the number of CPUs available
 # We monitor their progress, display it and summarize the results when finished.
 #
@@ -188,7 +70,7 @@ export MEMORY_REQUIRED_FOR_INTEGRATION_TEST_PARALLEL_RUN=33000
 #   * MEMORY_AVAILABLE_FOR_DOCKER - memory that is available in docker (set by cleanup_runners)
 #
 function run_all_test_types_in_parallel() {
-    cleanup_runner
+    parallel::cleanup_runner
 
     start_end::group_start "Determine how to run the tests"
     echo
@@ -196,6 +78,7 @@ function run_all_test_types_in_parallel() {
     echo
 
     local run_integration_tests_separately="false"
+    # shellcheck disable=SC2153
     local test_types_to_run=${TEST_TYPES}
 
     if [[ ${test_types_to_run} == *"Integration"* ]]; then
@@ -222,7 +105,7 @@ function run_all_test_types_in_parallel() {
 
     run_test_types_in_parallel "${@}"
     if [[ ${run_integration_tests_separately} == "true" ]]; then
-        cleanup_runner
+        parallel::cleanup_runner
         test_types_to_run="Integration"
         run_test_types_in_parallel "${@}"
     fi
@@ -231,12 +114,19 @@ function run_all_test_types_in_parallel() {
     parallel::print_job_summary_and_return_status_code
 }
 
+
+testing::skip_tests_if_requested
+
 build_images::prepare_ci_build
 
 build_images::rebuild_ci_image_if_needed_with_group
 
-prepare_tests_to_run
-
 parallel::make_sure_gnu_parallel_is_installed
 
+testing::get_maximum_parallel_test_jobs
+
+testing::get_test_types_to_run
+
+testing::get_docker_compose_local
+
 run_all_test_types_in_parallel "${@}"
diff --git a/scripts/ci/testing/ci_run_quarantined_tests.sh b/scripts/ci/testing/ci_run_quarantined_tests.sh
new file mode 100755
index 0000000..0c1108e
--- /dev/null
+++ b/scripts/ci/testing/ci_run_quarantined_tests.sh
@@ -0,0 +1,87 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# Enable automated tests execution
+RUN_TESTS="true"
+export RUN_TESTS
+
+SKIPPED_FAILED_JOB="Quarantined"
+export SKIPPED_FAILED_JOB
+
+SEMAPHORE_NAME="tests"
+export SEMAPHORE_NAME
+
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+
+BACKEND_TEST_TYPES=(mysql postgres sqlite)
+
+# Starts quarantined tests for each backend in parallel
+# BACKEND_TEST_TYPES - array of backends to run the quarantined tests against
+# ${@} - additional arguments to pass to test execution
+function run_quarantined_backend_tests_in_parallel() {
+    start_end::group_start "Determining how to run the tests"
+    echo
+    echo "${COLOR_YELLOW}Running maximum ${MAX_PARALLEL_QUARANTINED_TEST_JOBS} test types in parallel${COLOR_RESET}"
+    echo
+    start_end::group_end
+    start_end::group_start "Monitoring Quarantined tests : ${BACKEND_TEST_TYPES[*]}"
+    parallel::initialize_monitoring
+    parallel::monitor_progress
+    mkdir -p "${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}"
+    TEST_TYPE="Quarantined"
+    export TEST_TYPE
+    for BACKEND in "${BACKEND_TEST_TYPES[@]}"
+    do
+        export BACKEND
+        mkdir -p "${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${BACKEND}"
+        mkdir -p "${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${BACKEND}"
+        export JOB_LOG="${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${BACKEND}/stdout"
+        export PARALLEL_JOB_STATUS="${PARALLEL_MONITORED_DIR}/${SEMAPHORE_NAME}/${BACKEND}/status"
+        # Each test job will get SIGTERM followed by SIGTERM 200ms later and SIGKILL 200ms later after 25 mins
+        # shellcheck disable=SC2086
+        parallel --ungroup --bg --semaphore --semaphorename "${SEMAPHORE_NAME}" \
+            --jobs "${MAX_PARALLEL_QUARANTINED_TEST_JOBS}" --timeout 1500 \
+            "$( dirname "${BASH_SOURCE[0]}" )/ci_run_single_airflow_test_in_docker.sh" "${@}" >${JOB_LOG} 2>&1
+    done
+    parallel --semaphore --semaphorename "${SEMAPHORE_NAME}" --wait
+    parallel::kill_monitor
+    start_end::group_end
+}
+
+testing::skip_tests_if_requested
+
+build_images::prepare_ci_build
+
+build_images::rebuild_ci_image_if_needed_with_group
+
+parallel::make_sure_gnu_parallel_is_installed
+
+testing::get_maximum_parallel_test_jobs
+
+testing::get_docker_compose_local
+
+run_quarantined_backend_tests_in_parallel "${@}"
+
+set +e
+
+parallel::print_job_summary_and_return_status_code
+
+echo "Those are quarantined tests so failure of those does not fail the whole build!"
+echo "Please look above for the output of failed tests to fix them!"
+echo
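
The new script above drives the per-backend runs through GNU parallel's semaphore mode: each loop iteration starts a background job under a named semaphore, and the final `--wait` call blocks until all of them finish. A minimal, self-contained sketch of that pattern (an `echo`/`sleep` stands in for the real test command):

    # Minimal GNU parallel semaphore sketch - not the actual test invocation.
    SEMAPHORE_NAME="demo"
    for BACKEND in mysql postgres sqlite; do
        # --bg returns immediately; --jobs caps how many jobs hold the semaphore at once.
        parallel --ungroup --bg --semaphore --semaphorename "${SEMAPHORE_NAME}" \
            --jobs 2 --timeout 1500 \
            bash -c "echo 'testing ${BACKEND}'; sleep 2"
    done
    # Block until every job started under this semaphore has finished.
    parallel --semaphore --semaphorename "${SEMAPHORE_NAME}" --wait
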
diff --git a/scripts/ci/testing/ci_run_single_airflow_test_in_docker.sh b/scripts/ci/testing/ci_run_single_airflow_test_in_docker.sh
index 76b710e..0bf415f 100755
--- a/scripts/ci/testing/ci_run_single_airflow_test_in_docker.sh
+++ b/scripts/ci/testing/ci_run_single_airflow_test_in_docker.sh
@@ -90,7 +90,7 @@ function run_airflow_testing_in_docker() {
         echo "Making sure docker-compose is down and remnants removed"
         echo
         docker-compose --log-level INFO -f "${SCRIPTS_CI_DIR}/docker-compose/base.yml" \
-            --project-name "airflow-${TEST_TYPE}" \
+            --project-name "airflow-${TEST_TYPE}-${BACKEND}" \
             down --remove-orphans \
             --volumes --timeout 10
         docker-compose --log-level INFO \
@@ -98,11 +98,11 @@ function run_airflow_testing_in_docker() {
           -f "${SCRIPTS_CI_DIR}/docker-compose/backend-${BACKEND}.yml" \
           "${INTEGRATIONS[@]}" \
           "${DOCKER_COMPOSE_LOCAL[@]}" \
-          --project-name "airflow-${TEST_TYPE}" \
+          --project-name "airflow-${TEST_TYPE}-${BACKEND}" \
              run airflow "${@}"
         exit_code=$?
         docker-compose --log-level INFO -f "${SCRIPTS_CI_DIR}/docker-compose/base.yml" \
-            --project-name "airflow-${TEST_TYPE}" \
+            --project-name "airflow-${TEST_TYPE}-${BACKEND}" \
             down --remove-orphans \
             --volumes --timeout 10
         if [[ ${exit_code} == "254" && ${try_num} != "5" ]]; then
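
The hunk above appends ${BACKEND} to the docker-compose project name. Because the script is now launched once per backend in parallel, each run needs its own compose project; otherwise one run's `down --remove-orphans --volumes` could tear down another run's containers. A small illustration of the naming scheme (echo only, no containers are started):

    # Illustration of per-backend project names - echo only.
    TEST_TYPE="Quarantined"
    for BACKEND in mysql postgres sqlite; do
        echo "docker-compose --project-name airflow-${TEST_TYPE}-${BACKEND} run airflow ..."
    done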

[airflow] 05/16: Better handling of docker command (#15080)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f840d1674d3cc3634764fe2ee911332d9872df3a
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Fri Apr 2 07:31:00 2021 +0200

    Better handling of docker command (#15080)
    
    Not all docker commands are replaced with functions now.
    
    Earlier we replaced all docker commands with a function so that we
    could capture the docker commands used and display them with -v in
    breeze. This has proven to be harmful, as it is unexpected behaviour
    for a docker command.
    
    This change introduces a docker_v command which outputs the command
    when needed.
    
    (cherry picked from commit 535e1a8e692ba28ad8ce9474a66b941af1df4875)
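
A simplified sketch of the wrapper this commit introduces (behaviour only; the real function in scripts/ci/libraries/_verbosity.sh, shown in the diff below, additionally honours PRINT_INFO_FROM_SCRIPTS and redirects output to OUTPUT_LOG):

    # Simplified docker_v sketch - not the exact implementation.
    function docker_v {
        if [[ ${DRY_RUN_DOCKER:-false} != "false" ]]; then
            echo "docker ${*}"      # dry-run: only show what would be executed
            return
        fi
        if [[ ${VERBOSE:-false} == "true" ]]; then
            >&2 echo "docker ${*}"  # verbose: print the command before running it
        fi
        command docker "${@}"       # then call the real docker binary
    }
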
---
 breeze                                             |  2 +-
 .../enter_breeze_provider_package_tests.sh         |  2 +-
 scripts/ci/libraries/_build_images.sh              | 42 +++++++++++-----------
 scripts/ci/libraries/_docker_engine_resources.sh   | 12 +++----
 scripts/ci/libraries/_initialization.sh            |  2 ++
 scripts/ci/libraries/_kind.sh                      |  2 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   | 28 +++++++--------
 scripts/ci/libraries/_runs.sh                      | 10 +++---
 scripts/ci/libraries/_start_end.sh                 |  2 +-
 scripts/ci/libraries/_verbosity.sh                 |  6 ++--
 scripts/ci/libraries/_verify_image.sh              |  8 ++---
 .../ci_install_and_test_provider_packages.sh       |  2 +-
 scripts/ci/static_checks/bats_tests.sh             |  2 +-
 scripts/ci/static_checks/check_license.sh          |  2 +-
 scripts/ci/static_checks/flake8.sh                 |  4 +--
 .../ci/static_checks/in_container_bats_tests.sh    |  4 +--
 scripts/ci/static_checks/lint_dockerfile.sh        |  4 +--
 scripts/ci/static_checks/mypy.sh                   |  2 +-
 scripts/ci/static_checks/pylint.sh                 |  4 +--
 scripts/ci/static_checks/refresh_pylint_todo.sh    |  2 +-
 scripts/ci/testing/ci_run_airflow_testing.sh       |  2 +-
 scripts/ci/tools/ci_clear_tmp.sh                   |  2 +-
 scripts/ci/tools/ci_fix_ownership.sh               |  2 +-
 scripts/ci/tools/ci_free_space_on_ci.sh            |  2 +-
 24 files changed, 75 insertions(+), 75 deletions(-)

diff --git a/breeze b/breeze
index 4df4937..c85a5ac 100755
--- a/breeze
+++ b/breeze
@@ -609,7 +609,7 @@ if [[ \${VERBOSE} == "true" ]]; then
   echo
   echo "Executing script:"
   echo
-  echo "${file} \${@}"
+  echo "${COLOR_CYAN}${file} \${@}${COLOR_RESET}"
   echo
   set -x
 fi
diff --git a/dev/provider_packages/enter_breeze_provider_package_tests.sh b/dev/provider_packages/enter_breeze_provider_package_tests.sh
index 0e9467f..b67b33f 100755
--- a/dev/provider_packages/enter_breeze_provider_package_tests.sh
+++ b/dev/provider_packages/enter_breeze_provider_package_tests.sh
@@ -21,7 +21,7 @@ export MOUNT_SELECTED_LOCAL_SOURCES="false"
 . "$(dirname "${BASH_SOURCE[0]}")/../../scripts/ci/libraries/_script_init.sh"
 
 function enter_breeze_with_mapped_sources() {
-    docker run -it "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run -it "${EXTRA_DOCKER_FLAGS[@]}" \
         -v "${AIRFLOW_SOURCES}/setup.py:/airflow_sources/setup.py:cached" \
         -v "${AIRFLOW_SOURCES}/setup.cfg:/airflow_sources/setup.cfg:cached" \
         -v "${AIRFLOW_SOURCES}/airflow/__init__.py:/airflow_sources/airflow/__init__.py:cached" \
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 7d7d180..771a06d 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -89,7 +89,7 @@ function build_images::add_build_args_for_remote_install() {
 # Retrieves version of airflow stored in the production image (used to display the actual
 # Version we use if it was built from PyPI or GitHub
 function build_images::get_airflow_version_from_production_image() {
-    VERBOSE="false" docker run --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c 'echo "${AIRFLOW_VERSION}"'
+    docker run --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c 'echo "${AIRFLOW_VERSION}"'
 }
 
 # Removes the "Forced answer" (yes/no/quit) given previously, unless you specifically want to remember it.
@@ -252,7 +252,7 @@ function build_images::confirm_non-empty-docker-context-files() {
 # We cannot use docker registry APIs as they are available only with authorisation
 # But this image can be pulled without authentication
 function build_images::build_ci_image_manifest() {
-    docker build \
+    docker_v build \
         --tag="${AIRFLOW_CI_LOCAL_MANIFEST_IMAGE}" \
         -f- . <<EOF
 FROM scratch
@@ -270,8 +270,8 @@ function build_images::get_local_build_cache_hash() {
 
     set +e
     # Remove the container just in case
-    docker rm --force "local-airflow-ci-container" 2>/dev/null >/dev/null
-    if ! docker inspect "${AIRFLOW_CI_IMAGE}" 2>/dev/null >/dev/null; then
+    docker_v rm --force "local-airflow-ci-container" 2>/dev/null >/dev/null
+    if ! docker_v inspect "${AIRFLOW_CI_IMAGE}" 2>/dev/null >/dev/null; then
         verbosity::print_info
         verbosity::print_info "Local airflow CI image not available"
         verbosity::print_info
@@ -281,8 +281,8 @@ function build_images::get_local_build_cache_hash() {
         return
 
     fi
-    docker create --name "local-airflow-ci-container" "${AIRFLOW_CI_IMAGE}" 2>/dev/null
-    docker cp "local-airflow-ci-container:/build-cache-hash" \
+    docker_v create --name "local-airflow-ci-container" "${AIRFLOW_CI_IMAGE}" 2>/dev/null
+    docker_v cp "local-airflow-ci-container:/build-cache-hash" \
         "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}" 2>/dev/null ||
         touch "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}"
     set -e
@@ -305,7 +305,7 @@ function build_images::get_local_build_cache_hash() {
 function build_images::get_remote_image_build_cache_hash() {
     set +e
     # Pull remote manifest image
-    if ! docker pull "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}" 2>/dev/null >/dev/null; then
+    if ! docker_v pull "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}" 2>/dev/null >/dev/null; then
         verbosity::print_info
         verbosity::print_info "Remote docker registry unreachable"
         verbosity::print_info
@@ -317,11 +317,11 @@ function build_images::get_remote_image_build_cache_hash() {
     set -e
     rm -f "${REMOTE_IMAGE_CONTAINER_ID_FILE}"
     # Create container dump out of the manifest image without actually running it
-    docker create --cidfile "${REMOTE_IMAGE_CONTAINER_ID_FILE}" "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
+    docker_v create --cidfile "${REMOTE_IMAGE_CONTAINER_ID_FILE}" "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
     # Extract manifest and store it in local file
-    docker cp "$(cat "${REMOTE_IMAGE_CONTAINER_ID_FILE}"):/build-cache-hash" \
+    docker_v cp "$(cat "${REMOTE_IMAGE_CONTAINER_ID_FILE}"):/build-cache-hash" \
         "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}"
-    docker rm --force "$(cat "${REMOTE_IMAGE_CONTAINER_ID_FILE}")"
+    docker_v rm --force "$(cat "${REMOTE_IMAGE_CONTAINER_ID_FILE}")"
     rm -f "${REMOTE_IMAGE_CONTAINER_ID_FILE}"
     verbosity::print_info
     verbosity::print_info "Remote build cache hash: '$(cat "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}")'"
@@ -490,7 +490,7 @@ function build_image::configure_docker_registry() {
             verbosity::print_info
         fi
         if [[ -n "${token}" ]]; then
-            echo "${token}" | docker login \
+            echo "${token}" | docker_v login \
                 --username "${GITHUB_USERNAME:-apache}" \
                 --password-stdin \
                 "${GITHUB_REGISTRY}"
@@ -745,7 +745,7 @@ Docker building ${AIRFLOW_CI_IMAGE}.
     if [[ -n "${RUNTIME_APT_COMMAND}" ]]; then
         additional_runtime_args+=("--build-arg" "RUNTIME_APT_COMMAND=\"${RUNTIME_APT_COMMAND}\"")
     fi
-    docker build \
+    docker_v build \
         "${EXTRA_DOCKER_CI_BUILD_FLAGS[@]}" \
         --build-arg PYTHON_BASE_IMAGE="${AIRFLOW_PYTHON_BASE_IMAGE}" \
         --build-arg AIRFLOW_VERSION="${AIRFLOW_VERSION}" \
@@ -781,11 +781,11 @@ Docker building ${AIRFLOW_CI_IMAGE}.
     set -u
     if [[ -n "${DEFAULT_CI_IMAGE=}" ]]; then
         echo "Tagging additionally image ${AIRFLOW_CI_IMAGE} with ${DEFAULT_CI_IMAGE}"
-        docker tag "${AIRFLOW_CI_IMAGE}" "${DEFAULT_CI_IMAGE}"
+        docker_v tag "${AIRFLOW_CI_IMAGE}" "${DEFAULT_CI_IMAGE}"
     fi
     if [[ -n "${IMAGE_TAG=}" ]]; then
         echo "Tagging additionally image ${AIRFLOW_CI_IMAGE} with ${IMAGE_TAG}"
-        docker tag "${AIRFLOW_CI_IMAGE}" "${IMAGE_TAG}"
+        docker_v tag "${AIRFLOW_CI_IMAGE}" "${IMAGE_TAG}"
     fi
     if [[ -n ${SPIN_PID=} ]]; then
         kill -HUP "${SPIN_PID}" || true
@@ -898,7 +898,7 @@ function build_images::build_prod_images() {
     if [[ -n "${DEV_APT_COMMAND}" ]]; then
         additional_dev_args+=("--build-arg" "DEV_APT_COMMAND=\"${DEV_APT_COMMAND}\"")
     fi
-    docker build \
+    docker_v build \
         "${EXTRA_DOCKER_PROD_BUILD_FLAGS[@]}" \
         --build-arg PYTHON_BASE_IMAGE="${AIRFLOW_PYTHON_BASE_IMAGE}" \
         --build-arg INSTALL_MYSQL_CLIENT="${INSTALL_MYSQL_CLIENT}" \
@@ -934,7 +934,7 @@ function build_images::build_prod_images() {
     if [[ -n "${RUNTIME_APT_COMMAND}" ]]; then
         additional_runtime_args+=("--build-arg" "RUNTIME_APT_COMMAND=\"${RUNTIME_APT_COMMAND}\"")
     fi
-    docker build \
+    docker_v build \
         "${EXTRA_DOCKER_PROD_BUILD_FLAGS[@]}" \
         --build-arg PYTHON_BASE_IMAGE="${AIRFLOW_PYTHON_BASE_IMAGE}" \
         --build-arg INSTALL_MYSQL_CLIENT="${INSTALL_MYSQL_CLIENT}" \
@@ -970,11 +970,11 @@ function build_images::build_prod_images() {
     set -u
     if [[ -n "${DEFAULT_PROD_IMAGE:=}" ]]; then
         echo "Tagging additionally image ${AIRFLOW_PROD_IMAGE} with ${DEFAULT_PROD_IMAGE}"
-        docker tag "${AIRFLOW_PROD_IMAGE}" "${DEFAULT_PROD_IMAGE}"
+        docker_v tag "${AIRFLOW_PROD_IMAGE}" "${DEFAULT_PROD_IMAGE}"
     fi
     if [[ -n "${IMAGE_TAG=}" ]]; then
         echo "Tagging additionally image ${AIRFLOW_PROD_IMAGE} with ${IMAGE_TAG}"
-        docker tag "${AIRFLOW_PROD_IMAGE}" "${IMAGE_TAG}"
+        docker_v tag "${AIRFLOW_PROD_IMAGE}" "${IMAGE_TAG}"
     fi
 }
 
@@ -1000,7 +1000,7 @@ function build_images::wait_for_image_tag() {
     while true; do
         set +e
         echo "${COLOR_BLUE}Docker pull ${IMAGE_TO_WAIT_FOR} ${COLOR_RESET}" >"${OUTPUT_LOG}"
-        docker pull "${IMAGE_TO_WAIT_FOR}" >>"${OUTPUT_LOG}" 2>&1
+        docker_v pull "${IMAGE_TO_WAIT_FOR}" >>"${OUTPUT_LOG}" 2>&1
         set -e
         local image_hash
         echo "${COLOR_BLUE} Docker images -q ${IMAGE_TO_WAIT_FOR}${COLOR_RESET}" >>"${OUTPUT_LOG}"
@@ -1020,12 +1020,12 @@ function build_images::wait_for_image_tag() {
             echo
             echo "Tagging ${IMAGE_TO_WAIT_FOR} as ${IMAGE_NAME}."
             echo
-            docker tag "${IMAGE_TO_WAIT_FOR}" "${IMAGE_NAME}"
+            docker_v tag "${IMAGE_TO_WAIT_FOR}" "${IMAGE_NAME}"
             for TARGET_TAG in "${@}"; do
                 echo
                 echo "Tagging ${IMAGE_TO_WAIT_FOR} as ${TARGET_TAG}."
                 echo
-                docker tag "${IMAGE_TO_WAIT_FOR}" "${TARGET_TAG}"
+                docker_v tag "${IMAGE_TO_WAIT_FOR}" "${TARGET_TAG}"
             done
             break
         fi
diff --git a/scripts/ci/libraries/_docker_engine_resources.sh b/scripts/ci/libraries/_docker_engine_resources.sh
index b5283b3..f5ed3e6 100644
--- a/scripts/ci/libraries/_docker_engine_resources.sh
+++ b/scripts/ci/libraries/_docker_engine_resources.sh
@@ -22,27 +22,25 @@ function docker_engine_resources::print_overall_stats() {
     echo "Overall resource statistics"
     echo
     docker stats --all --no-stream --no-trunc
-    docker run --rm --entrypoint /bin/bash "${AIRFLOW_CI_IMAGE}" -c "free -h"
-    df --human || true
+    docker run --rm --entrypoint /bin/bash "debian:buster-slim" -c "cat /proc/meminfo"
+    df -h || true
 }
 
 
 function docker_engine_resources::get_available_memory_in_docker() {
-    MEMORY_AVAILABLE_FOR_DOCKER=$(docker run --rm --entrypoint /bin/bash debian:buster-slim -c \
-        'echo $(($(getconf _PHYS_PAGES) * $(getconf PAGE_SIZE) / (1024 * 1024)))')
+    MEMORY_AVAILABLE_FOR_DOCKER=$(docker run --rm  --entrypoint /bin/bash "debian:buster-slim" -c 'echo $(($(getconf _PHYS_PAGES) * $(getconf PAGE_SIZE) / (1024 * 1024)))')
     echo "${COLOR_BLUE}Memory available for Docker${COLOR_RESET}: $(numfmt --to iec $((MEMORY_AVAILABLE_FOR_DOCKER * 1024 * 1024)))"
     export MEMORY_AVAILABLE_FOR_DOCKER
 }
 
 function docker_engine_resources::get_available_cpus_in_docker() {
-    CPUS_AVAILABLE_FOR_DOCKER=$(docker run --rm --entrypoint /bin/bash debian:buster-slim -c \
-        'grep -cE "cpu[0-9]+" </proc/stat')
+    CPUS_AVAILABLE_FOR_DOCKER=$(docker run --rm "debian:buster-slim" grep -cE 'cpu[0-9]+' /proc/stat)
     echo "${COLOR_BLUE}CPUS available for Docker${COLOR_RESET}: ${CPUS_AVAILABLE_FOR_DOCKER}"
     export CPUS_AVAILABLE_FOR_DOCKER
 }
 
 function docker_engine_resources::get_available_disk_space_in_docker() {
-    DISK_SPACE_AVAILABLE_FOR_DOCKER=$(docker run --rm --entrypoint /bin/bash debian:buster-slim -c \
+    DISK_SPACE_AVAILABLE_FOR_DOCKER=$(docker run --rm --entrypoint /bin/bash "debian:buster-slim" -c \
         'df  / | tail -1 | awk '\''{print $4}'\')
     echo "${COLOR_BLUE}Disk space available for Docker${COLOR_RESET}: $(numfmt --to iec $((DISK_SPACE_AVAILABLE_FOR_DOCKER * 1024)))"
     export DISK_SPACE_AVAILABLE_FOR_DOCKER
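
Taken together, the probes above boil down to three one-liners that each start a throwaway debian:buster-slim container and read the limits visible from inside it. A runnable recap (same commands as in the diff, stripped of the surrounding functions):

    # Runnable recap of the resource probes above (requires a working docker daemon).
    mem_mb=$(docker run --rm --entrypoint /bin/bash debian:buster-slim -c \
        'echo $(($(getconf _PHYS_PAGES) * $(getconf PAGE_SIZE) / (1024 * 1024)))')
    cpus=$(docker run --rm debian:buster-slim grep -cE 'cpu[0-9]+' /proc/stat)
    disk_kb=$(docker run --rm --entrypoint /bin/bash debian:buster-slim -c \
        'df / | tail -1 | awk '\''{print $4}'\')
    echo "memory: ${mem_mb} MB, cpus: ${cpus}, free disk on /: ${disk_kb} KB"
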
diff --git a/scripts/ci/libraries/_initialization.sh b/scripts/ci/libraries/_initialization.sh
index 93dabea..cb42693 100644
--- a/scripts/ci/libraries/_initialization.sh
+++ b/scripts/ci/libraries/_initialization.sh
@@ -566,11 +566,13 @@ function initialization::set_output_color_variables() {
     COLOR_RED=$'\e[31m'
     COLOR_RESET=$'\e[0m'
     COLOR_YELLOW=$'\e[33m'
+    COLOR_CYAN=$'\e[36m'
     export COLOR_BLUE
     export COLOR_GREEN
     export COLOR_RED
     export COLOR_RESET
     export COLOR_YELLOW
+    export COLOR_CYAN
 }
 
 # Common environment that is initialized by both Breeze and CI scripts
diff --git a/scripts/ci/libraries/_kind.sh b/scripts/ci/libraries/_kind.sh
index 4fbfee1..a8deaac 100644
--- a/scripts/ci/libraries/_kind.sh
+++ b/scripts/ci/libraries/_kind.sh
@@ -252,7 +252,7 @@ function kind::check_cluster_ready_for_airflow() {
 function kind::build_image_for_kubernetes_tests() {
     start_end::group_start "Build image for kubernetes tests ${AIRFLOW_PROD_IMAGE_KUBERNETES}"
     cd "${AIRFLOW_SOURCES}" || exit 1
-    docker build --tag "${AIRFLOW_PROD_IMAGE_KUBERNETES}" . -f - <<EOF
+    docker_v build --tag "${AIRFLOW_PROD_IMAGE_KUBERNETES}" . -f - <<EOF
 FROM ${AIRFLOW_PROD_IMAGE}
 
 COPY airflow/example_dags/ \${AIRFLOW_HOME}/dags/
diff --git a/scripts/ci/libraries/_push_pull_remove_images.sh b/scripts/ci/libraries/_push_pull_remove_images.sh
index 3624a9a..b0723e4 100644
--- a/scripts/ci/libraries/_push_pull_remove_images.sh
+++ b/scripts/ci/libraries/_push_pull_remove_images.sh
@@ -25,7 +25,7 @@ function push_pull_remove_images::push_image_with_retries() {
         set +e
         echo
         echo "Trying to push the image ${1}. Number of try: ${try_num}"
-        docker push "${1}"
+        docker_v push "${1}"
         local res=$?
         set -e
         if [[ ${res} != "0" ]]; then
@@ -61,7 +61,7 @@ function push_pull_remove_images::pull_image_if_not_present_or_forced() {
         echo
         echo "Pulling the image ${IMAGE_TO_PULL}"
         echo
-        docker pull "${IMAGE_TO_PULL}"
+        docker_v pull "${IMAGE_TO_PULL}"
         EXIT_VALUE="$?"
         if [[ ${EXIT_VALUE} != "0" && ${FAIL_ON_GITHUB_DOCKER_PULL_ERROR} == "true" ]]; then
             echo
@@ -97,7 +97,7 @@ function push_pull_remove_images::pull_image_github_dockerhub() {
     set +e
     if push_pull_remove_images::pull_image_if_not_present_or_forced "${GITHUB_IMAGE}"; then
         # Tag the image to be the DockerHub one
-        docker tag "${GITHUB_IMAGE}" "${DOCKERHUB_IMAGE}"
+        docker_v tag "${GITHUB_IMAGE}" "${DOCKERHUB_IMAGE}"
     else
         push_pull_remove_images::pull_image_if_not_present_or_forced "${DOCKERHUB_IMAGE}"
     fi
@@ -109,9 +109,9 @@ function push_pull_remove_images::rebuild_python_base_image() {
    echo
    echo "Rebuilding ${AIRFLOW_PYTHON_BASE_IMAGE} from latest ${PYTHON_BASE_IMAGE}"
    echo
-   docker pull "${PYTHON_BASE_IMAGE}"
+   docker_v pull "${PYTHON_BASE_IMAGE}"
    echo "FROM ${PYTHON_BASE_IMAGE}" | \
-        docker build \
+        docker_v build \
             --label "org.opencontainers.image.source=https://github.com/${GITHUB_REPOSITORY}" \
             -t "${AIRFLOW_PYTHON_BASE_IMAGE}" -
 }
@@ -144,7 +144,7 @@ function push_pull_remove_images::pull_base_python_image() {
         push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_PYTHON_BASE_IMAGE}" \
             "${GITHUB_REGISTRY_PYTHON_BASE_IMAGE}${PYTHON_TAG_SUFFIX}"
     else
-        docker pull "${AIRFLOW_PYTHON_BASE_IMAGE}"
+        docker_v pull "${AIRFLOW_PYTHON_BASE_IMAGE}"
     fi
 }
 
@@ -194,7 +194,7 @@ function push_pull_remove_images::pull_prod_images_if_needed() {
 function push_pull_remove_images::push_ci_images_to_dockerhub() {
     push_pull_remove_images::push_image_with_retries "${AIRFLOW_PYTHON_BASE_IMAGE}"
     push_pull_remove_images::push_image_with_retries "${AIRFLOW_CI_IMAGE}"
-    docker tag "${AIRFLOW_CI_LOCAL_MANIFEST_IMAGE}" "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
+    docker_v tag "${AIRFLOW_CI_LOCAL_MANIFEST_IMAGE}" "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
     push_pull_remove_images::push_image_with_retries "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
     if [[ -n ${DEFAULT_CI_IMAGE=} ]]; then
         # Only push default image to DockerHub registry if it is defined
@@ -214,7 +214,7 @@ function push_pull_remove_images::push_python_image_to_github() {
     if [[ ${GITHUB_REGISTRY_PUSH_IMAGE_TAG} != "latest" ]]; then
         PYTHON_TAG_SUFFIX="-${GITHUB_REGISTRY_PUSH_IMAGE_TAG}"
     fi
-    docker tag "${AIRFLOW_PYTHON_BASE_IMAGE}" \
+    docker_v tag "${AIRFLOW_PYTHON_BASE_IMAGE}" \
         "${GITHUB_REGISTRY_PYTHON_BASE_IMAGE}${PYTHON_TAG_SUFFIX}"
     push_pull_remove_images::push_image_with_retries \
         "${GITHUB_REGISTRY_PYTHON_BASE_IMAGE}${PYTHON_TAG_SUFFIX}"
@@ -224,12 +224,12 @@ function push_pull_remove_images::push_python_image_to_github() {
 function push_pull_remove_images::push_ci_images_to_github() {
     push_pull_remove_images::push_python_image_to_github
     AIRFLOW_CI_TAGGED_IMAGE="${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${GITHUB_REGISTRY_PUSH_IMAGE_TAG}"
-    docker tag "${AIRFLOW_CI_IMAGE}" "${AIRFLOW_CI_TAGGED_IMAGE}"
+    docker_v tag "${AIRFLOW_CI_IMAGE}" "${AIRFLOW_CI_TAGGED_IMAGE}"
     push_pull_remove_images::push_image_with_retries "${AIRFLOW_CI_TAGGED_IMAGE}"
     if [[ -n ${GITHUB_SHA=} ]]; then
         # Also push image to GitHub registry with commit SHA
         AIRFLOW_CI_SHA_IMAGE="${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${COMMIT_SHA}"
-        docker tag "${AIRFLOW_CI_IMAGE}" "${AIRFLOW_CI_SHA_IMAGE}"
+        docker_v tag "${AIRFLOW_CI_IMAGE}" "${AIRFLOW_CI_SHA_IMAGE}"
         push_pull_remove_images::push_image_with_retries "${AIRFLOW_CI_SHA_IMAGE}"
     fi
 }
@@ -265,17 +265,17 @@ function push_pull_remove_images::push_prod_images_to_dockerhub () {
 function push_pull_remove_images::push_prod_images_to_github () {
     push_pull_remove_images::push_python_image_to_github
     AIRFLOW_PROD_TAGGED_IMAGE="${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${GITHUB_REGISTRY_PUSH_IMAGE_TAG}"
-    docker tag "${AIRFLOW_PROD_IMAGE}" "${AIRFLOW_PROD_TAGGED_IMAGE}"
+    docker_v tag "${AIRFLOW_PROD_IMAGE}" "${AIRFLOW_PROD_TAGGED_IMAGE}"
     push_pull_remove_images::push_image_with_retries "${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${GITHUB_REGISTRY_PUSH_IMAGE_TAG}"
     if [[ -n ${COMMIT_SHA=} ]]; then
         # Also push image to GitHub registry with commit SHA
         AIRFLOW_PROD_SHA_IMAGE="${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${COMMIT_SHA}"
-        docker tag "${AIRFLOW_PROD_IMAGE}" "${AIRFLOW_PROD_SHA_IMAGE}"
+        docker_v tag "${AIRFLOW_PROD_IMAGE}" "${AIRFLOW_PROD_SHA_IMAGE}"
         push_pull_remove_images::push_image_with_retries "${AIRFLOW_PROD_SHA_IMAGE}"
     fi
     # Also push prod build image
     AIRFLOW_PROD_BUILD_TAGGED_IMAGE="${GITHUB_REGISTRY_AIRFLOW_PROD_BUILD_IMAGE}:${GITHUB_REGISTRY_PUSH_IMAGE_TAG}"
-    docker tag "${AIRFLOW_PROD_BUILD_IMAGE}" "${AIRFLOW_PROD_BUILD_TAGGED_IMAGE}"
+    docker_v tag "${AIRFLOW_PROD_BUILD_IMAGE}" "${AIRFLOW_PROD_BUILD_TAGGED_IMAGE}"
     push_pull_remove_images::push_image_with_retries "${AIRFLOW_PROD_BUILD_TAGGED_IMAGE}"
 }
 
@@ -327,7 +327,7 @@ function push_pull_remove_images::check_for_image_in_github_container_registry()
 
     local image_to_wait_for="ghcr.io/${GITHUB_REPOSITORY}-${image_name_in_github_registry}:${image_tag_in_github_registry}"
     echo "GitHub Container Registry: checking for ${image_to_wait_for} via docker manifest inspect!"
-    docker manifest inspect "${image_to_wait_for}"
+    docker_v manifest inspect "${image_to_wait_for}"
     local res=$?
     if [[ ${res} == "0" ]]; then
         echo  "Image: ${image_to_wait_for} found in Container Registry: ${COLOR_GREEN}OK.${COLOR_RESET}"
diff --git a/scripts/ci/libraries/_runs.sh b/scripts/ci/libraries/_runs.sh
index bd80108..45c4c2b 100644
--- a/scripts/ci/libraries/_runs.sh
+++ b/scripts/ci/libraries/_runs.sh
@@ -19,7 +19,7 @@
 # Docker command to build documentation
 function runs::run_docs() {
     start_end::group_start "Run build docs"
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" -t \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" -t \
         -e "GITHUB_ACTIONS=${GITHUB_ACTIONS="false"}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         "${AIRFLOW_CI_IMAGE}" \
@@ -30,7 +30,7 @@ function runs::run_docs() {
 # Docker command to generate constraint files.
 function runs::run_generate_constraints() {
     start_end::group_start "Run generate constraints"
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         "${AIRFLOW_CI_IMAGE}" \
         "--" "/opt/airflow/scripts/in_container/run_generate_constraints.sh"
@@ -40,7 +40,7 @@ function runs::run_generate_constraints() {
 # Docker command to prepare airflow packages
 function runs::run_prepare_airflow_packages() {
     start_end::group_start "Run prepare airflow packages"
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         -t \
         -v "${AIRFLOW_SOURCES}:/opt/airflow" \
@@ -53,7 +53,7 @@ function runs::run_prepare_airflow_packages() {
 # Docker command to prepare provider packages
 function runs::run_prepare_provider_packages() {
     # No group here - groups are added internally
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         -t \
         -v "${AIRFLOW_SOURCES}:/opt/airflow" \
@@ -64,7 +64,7 @@ function runs::run_prepare_provider_packages() {
 # Docker command to generate release notes for provider packages
 function runs::run_prepare_provider_documentation() {
     # No group here - groups are added internally
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         -t \
         -v "${AIRFLOW_SOURCES}:/opt/airflow" \
diff --git a/scripts/ci/libraries/_start_end.sh b/scripts/ci/libraries/_start_end.sh
index b7fa74f..d2d6a61 100644
--- a/scripts/ci/libraries/_start_end.sh
+++ b/scripts/ci/libraries/_start_end.sh
@@ -76,7 +76,7 @@ function start_end::dump_container_logs() {
     echo "${COLOR_BLUE}###########################################################################################${COLOR_RESET}"
     echo "                   Dumping logs from ${container} container"
     echo "${COLOR_BLUE}###########################################################################################${COLOR_RESET}"
-    docker logs "${container}" > "${dump_file}"
+    docker_v logs "${container}" > "${dump_file}"
     echo "                   Container ${container} logs dumped to ${dump_file}"
     echo "${COLOR_BLUE}###########################################################################################${COLOR_RESET}"
     start_end::group_end
diff --git a/scripts/ci/libraries/_verbosity.sh b/scripts/ci/libraries/_verbosity.sh
index 26a077d..dc3ca5a 100644
--- a/scripts/ci/libraries/_verbosity.sh
+++ b/scripts/ci/libraries/_verbosity.sh
@@ -39,10 +39,10 @@ function verbosity::restore_exit_on_error_status() {
 # In case "VERBOSE" is set to "true" (--verbose flag in Breeze) all docker commands run will be
 # printed before execution. In case of DRY_RUN_DOCKER flag set to "true"
 # show the command to execute instead of executing them
-function docker {
+function docker_v {
     if [[ ${DRY_RUN_DOCKER} != "false" ]]; then
         echo
-        echo "${COLOR_YELLOW}docker" "${@}" "${COLOR_RESET}"
+        echo "${COLOR_CYAN}docker" "${@}" "${COLOR_RESET}"
         echo
         return
     fi
@@ -52,7 +52,7 @@ function docker {
         ${VERBOSE_COMMANDS:=} != "true" && \
         # And when generally printing info is disabled
         ${PRINT_INFO_FROM_SCRIPTS} == "true" ]]; then
-        >&2 echo "docker" "${@}"
+        >&2 echo "${COLOR_CYAN}docker ${*} ${COLOR_RESET}"
     fi
     if [[ ${PRINT_INFO_FROM_SCRIPTS} == "false" ]]; then
         ${DOCKER_BINARY_PATH} "${@}" >>"${OUTPUT_LOG}" 2>&1
diff --git a/scripts/ci/libraries/_verify_image.sh b/scripts/ci/libraries/_verify_image.sh
index f9d2bb6..b0060ac 100644
--- a/scripts/ci/libraries/_verify_image.sh
+++ b/scripts/ci/libraries/_verify_image.sh
@@ -16,7 +16,7 @@
 # specific language governing permissions and limitations
 # under the License.
 function verify_image::run_command_in_image() {
-    docker run --rm \
+    docker_v run --rm \
             -e COLUMNS=180 \
             --entrypoint /bin/bash "${DOCKER_IMAGE}" \
             -c "${@}"
@@ -83,7 +83,7 @@ function verify_image::verify_prod_image_has_airflow_and_providers() {
 function verify_image::verify_ci_image_dependencies() {
     start_end::group_start "Checking if Airflow dependencies are non-conflicting in ${DOCKER_IMAGE} image."
     set +e
-    docker run --rm --entrypoint /bin/bash "${DOCKER_IMAGE}" -c 'pip check'
+    docker_v run --rm --entrypoint /bin/bash "${DOCKER_IMAGE}" -c 'pip check'
     local res=$?
     if [[ ${res} != "0" ]]; then
         echo  "${COLOR_RED}ERROR: ^^^ Some dependencies are conflicting. See instructions below on how to deal with it.  ${COLOR_RESET}"
@@ -212,7 +212,7 @@ function verify_image::verify_prod_image_as_root() {
     echo "Checking airflow as root"
     local output
     local res
-    output=$(docker run --rm --user 0 "${DOCKER_IMAGE}" "airflow" "info" 2>&1)
+    output=$(docker_v run --rm --user 0 "${DOCKER_IMAGE}" "airflow" "info" 2>&1)
     res=$?
     if [[ ${res} == "0" ]]; then
         echo "${COLOR_GREEN}OK${COLOR_RESET}"
@@ -229,7 +229,7 @@ function verify_image::verify_prod_image_as_root() {
     tmp_dir="$(mktemp -d)"
     touch "${tmp_dir}/__init__.py"
     echo 'print("Awesome")' >> "${tmp_dir}/awesome.py"
-    output=$(docker run \
+    output=$(docker_v run \
         --rm \
         -e "PYTHONPATH=${tmp_dir}" \
         -v "${tmp_dir}:${tmp_dir}" \
diff --git a/scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh b/scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh
index 5b29c7e..51575eb 100755
--- a/scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh
+++ b/scripts/ci/provider_packages/ci_install_and_test_provider_packages.sh
@@ -29,7 +29,7 @@ fi
 
 function run_test_package_import_all_classes() {
     # Groups are added internally
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         -t \
         -v "${AIRFLOW_SOURCES}/setup.py:/airflow_sources/setup.py:cached" \
diff --git a/scripts/ci/static_checks/bats_tests.sh b/scripts/ci/static_checks/bats_tests.sh
index e54e5b1..eaf9171 100755
--- a/scripts/ci/static_checks/bats_tests.sh
+++ b/scripts/ci/static_checks/bats_tests.sh
@@ -53,7 +53,7 @@ function run_bats_tests() {
     # deduplicate
     FS=" " read -r -a bats_arguments <<< "$(tr ' ' '\n' <<< "${bats_arguments[@]}" | sort -u | tr '\n' ' ' )"
     if [[ ${#@} == "0" ]]; then
-        # Run all tests
+        # Run all tests
         docker run --workdir /airflow -v "$(pwd):/airflow" --rm \
             apache/airflow:bats-2020.09.05-1.2.1 --tap /airflow/tests/bats/
     elif [[ ${#bats_arguments} == "0" ]]; then
diff --git a/scripts/ci/static_checks/check_license.sh b/scripts/ci/static_checks/check_license.sh
index 8698bc9..d3a8be7 100755
--- a/scripts/ci/static_checks/check_license.sh
+++ b/scripts/ci/static_checks/check_license.sh
@@ -31,7 +31,7 @@ function run_check_license() {
 
     echo "Running license checks. This can take a while."
     # We mount ALL airflow files for the licence check. We want to check them all!
-    if ! docker run -v "${AIRFLOW_SOURCES}:/opt/airflow" -t \
+    if ! docker_v run -v "${AIRFLOW_SOURCES}:/opt/airflow" -t \
             --user "$(id -ur):$(id -gr)" \
             --rm --env-file "${AIRFLOW_SOURCES}/scripts/ci/docker-compose/_docker.env" \
             apache/airflow:apache-rat-2020.07.10-0.13 \
diff --git a/scripts/ci/static_checks/flake8.sh b/scripts/ci/static_checks/flake8.sh
index 322ab9e..1c5440c 100755
--- a/scripts/ci/static_checks/flake8.sh
+++ b/scripts/ci/static_checks/flake8.sh
@@ -20,12 +20,12 @@
 
 function run_flake8() {
     if [[ "${#@}" == "0" ]]; then
-        docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+        docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
             --entrypoint "/usr/local/bin/dumb-init"  \
             "${AIRFLOW_CI_IMAGE}" \
             "--" "/opt/airflow/scripts/in_container/run_flake8.sh"
     else
-        docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+        docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
             --entrypoint "/usr/local/bin/dumb-init"  \
             "${AIRFLOW_CI_IMAGE}" \
             "--" "/opt/airflow/scripts/in_container/run_flake8.sh" "${@}"
diff --git a/scripts/ci/static_checks/in_container_bats_tests.sh b/scripts/ci/static_checks/in_container_bats_tests.sh
index a7c0121..fa4eacd 100644
--- a/scripts/ci/static_checks/in_container_bats_tests.sh
+++ b/scripts/ci/static_checks/in_container_bats_tests.sh
@@ -20,13 +20,13 @@
 
 function run_in_container_bats_tests() {
     if [[ "${#@}" == "0" ]]; then
-        docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+        docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/opt/bats/bin/bats"  \
         "-v" "$(pwd):/airflow" \
         "${AIRFLOW_CI_IMAGE}" \
         --tap  "tests/bats/in_container/"
     else
-        docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+        docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/opt/bats/bin/bats"  \
         "-v" "$(pwd):/airflow" \
         "${AIRFLOW_CI_IMAGE}" \
diff --git a/scripts/ci/static_checks/lint_dockerfile.sh b/scripts/ci/static_checks/lint_dockerfile.sh
index 491662d..38327f7 100755
--- a/scripts/ci/static_checks/lint_dockerfile.sh
+++ b/scripts/ci/static_checks/lint_dockerfile.sh
@@ -25,7 +25,7 @@ function run_docker_lint() {
         echo "Running docker lint for all Dockerfiles"
         echo
         # shellcheck disable=SC2046
-        docker run \
+        docker_v run \
             -v "$(pwd):/root" \
             -w "/root" \
             --rm \
@@ -37,7 +37,7 @@ function run_docker_lint() {
         echo
         echo "Running docker lint for $*"
         echo
-        docker run \
+        docker_v run \
             -v "$(pwd):/root" \
             -w "/root" \
             --rm \
diff --git a/scripts/ci/static_checks/mypy.sh b/scripts/ci/static_checks/mypy.sh
index a7257a9..7ebbd63 100755
--- a/scripts/ci/static_checks/mypy.sh
+++ b/scripts/ci/static_checks/mypy.sh
@@ -26,7 +26,7 @@ function run_mypy() {
       files=("$@")
     fi
 
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         --entrypoint "/usr/local/bin/dumb-init"  \
         "-v" "${AIRFLOW_SOURCES}/.mypy_cache:/opt/airflow/.mypy_cache" \
         "${AIRFLOW_CI_IMAGE}" \
diff --git a/scripts/ci/static_checks/pylint.sh b/scripts/ci/static_checks/pylint.sh
index edadd94..c69498e 100755
--- a/scripts/ci/static_checks/pylint.sh
+++ b/scripts/ci/static_checks/pylint.sh
@@ -20,12 +20,12 @@
 
 function run_pylint() {
     if [[ "${#@}" == "0" ]]; then
-       docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+       docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
             --entrypoint "/usr/local/bin/dumb-init"  \
             "${AIRFLOW_CI_IMAGE}" \
             "--" "/opt/airflow/scripts/in_container/run_pylint.sh"
     else
-        docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+        docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
             --entrypoint "/usr/local/bin/dumb-init" \
             "${AIRFLOW_CI_IMAGE}" \
             "--" "/opt/airflow/scripts/in_container/run_pylint.sh" "${@}"
diff --git a/scripts/ci/static_checks/refresh_pylint_todo.sh b/scripts/ci/static_checks/refresh_pylint_todo.sh
index 05dce88..52474b7 100755
--- a/scripts/ci/static_checks/refresh_pylint_todo.sh
+++ b/scripts/ci/static_checks/refresh_pylint_todo.sh
@@ -21,7 +21,7 @@ export FORCE_ANSWER_TO_QUESTIONS="quit"
 . "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
 
 function refresh_pylint_todo() {
-    docker run "${EXTRA_DOCKER_FLAGS[@]}" \
+    docker_v run "${EXTRA_DOCKER_FLAGS[@]}" \
         "${AIRFLOW_CI_IMAGE}" \
         "/opt/airflow/scripts/in_container/refresh_pylint_todo.sh"
 }
diff --git a/scripts/ci/testing/ci_run_airflow_testing.sh b/scripts/ci/testing/ci_run_airflow_testing.sh
index 0867e3c..af147ad 100755
--- a/scripts/ci/testing/ci_run_airflow_testing.sh
+++ b/scripts/ci/testing/ci_run_airflow_testing.sh
@@ -110,7 +110,7 @@ function system_prune_docker() {
     echo
     echo "${COLOR_BLUE}System-prune docker${COLOR_RESET}"
     echo
-    docker system prune --force --volumes
+    docker_v system prune --force --volumes
     echo
 }
 
diff --git a/scripts/ci/tools/ci_clear_tmp.sh b/scripts/ci/tools/ci_clear_tmp.sh
index d367967..bef3fa5 100755
--- a/scripts/ci/tools/ci_clear_tmp.sh
+++ b/scripts/ci/tools/ci_clear_tmp.sh
@@ -27,7 +27,7 @@ sanity_checks::sanitize_mounted_files
 
 read -r -a EXTRA_DOCKER_FLAGS <<<"$(local_mounts::convert_local_mounts_to_docker_params)"
 
-docker run --entrypoint /bin/bash "${EXTRA_DOCKER_FLAGS[@]}" \
+docker_v run --entrypoint /bin/bash "${EXTRA_DOCKER_FLAGS[@]}" \
     --rm \
     --env-file "${AIRFLOW_SOURCES}/scripts/ci/docker-compose/_docker.env" \
     "${AIRFLOW_CI_IMAGE}" \
diff --git a/scripts/ci/tools/ci_fix_ownership.sh b/scripts/ci/tools/ci_fix_ownership.sh
index 2d57d65..56463d2 100755
--- a/scripts/ci/tools/ci_fix_ownership.sh
+++ b/scripts/ci/tools/ci_fix_ownership.sh
@@ -33,7 +33,7 @@ sanity_checks::sanitize_mounted_files
 
 read -r -a EXTRA_DOCKER_FLAGS <<<"$(local_mounts::convert_local_mounts_to_docker_params)"
 
-docker run --entrypoint /bin/bash "${EXTRA_DOCKER_FLAGS[@]}" \
+docker_v run --entrypoint /bin/bash "${EXTRA_DOCKER_FLAGS[@]}" \
     --rm \
     --env-file "${AIRFLOW_SOURCES}/scripts/ci/docker-compose/_docker.env" \
     "${AIRFLOW_CI_IMAGE}" \
diff --git a/scripts/ci/tools/ci_free_space_on_ci.sh b/scripts/ci/tools/ci_free_space_on_ci.sh
index 4747848..a337545 100755
--- a/scripts/ci/tools/ci_free_space_on_ci.sh
+++ b/scripts/ci/tools/ci_free_space_on_ci.sh
@@ -26,7 +26,7 @@ echo "${COLOR_BLUE}Cleaning apt${COLOR_RESET}"
 sudo apt clean
 
 echo "${COLOR_BLUE}Pruning docker${COLOR_RESET}"
-docker system prune --all --force --volumes
+docker_v system prune --all --force --volumes
 
 echo "${COLOR_BLUE}Free disk space  ${COLOR_RESET}"
 df -h

[airflow] 06/16: Mark the test_scheduler_task_start_date as quarantined (#15086)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6e17675013071fe6eb3701464dee12a4b62560ba
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Fri Apr 2 07:31:15 2021 +0200

    Mark the test_scheduler_task_start_date as quarantined (#15086)
    
    Details captured in #15085
    
    (cherry picked from commit 4d1b2e985492894e6064235688cf8f381b1e8858)
---
 tests/jobs/test_scheduler_job.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index 7a9e273..a5fd794 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -2490,6 +2490,7 @@ class TestSchedulerJob(unittest.TestCase):
             session.commit()
             assert [] == self.null_exec.sorted_tasks
 
+    @pytest.mark.quarantined
     def test_scheduler_task_start_date(self):
         """
         Test that the scheduler respects task start dates that are different from DAG start dates
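
For context, `quarantined` is a custom pytest marker, and custom markers have to be registered so pytest does not warn about them. A hypothetical registration sketch (Airflow's actual conftest/ini configuration may differ):

    # Hypothetical conftest.py fragment - illustration only.
    def pytest_configure(config):
        # Register the marker so "@pytest.mark.quarantined" does not trigger
        # PytestUnknownMarkWarning; CI can then select it with `-m quarantined`.
        config.addinivalue_line(
            "markers", "quarantined: flaky tests executed in a separate CI job"
        )

Quarantined tests can then be run in isolation with `pytest -m quarantined` and excluded elsewhere with `pytest -m "not quarantined"`.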

[airflow] 07/16: Fixes failing docs upload on master (#15148)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 30e5584d1d03d5a828a5f1e841b8c85817d4a059
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Fri Apr 2 19:25:58 2021 +0200

    Fixes failing docs upload on master (#15148)
    
    (cherry picked from commit 83d702c345f8f4ce16d32268f4f83ee508fea676)
---
 .github/workflows/ci.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 49fb2e7..d96eeb5 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -500,7 +500,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}"
         if: >
           github.ref == 'refs/heads/master' && github.repository == 'apache/airflow' &&
           github.event_name == 'push'
-        run: aws s3 sync --delete ./files/documentation s3://apache-airflow-docs
+        run: aws s3 sync --delete ./docs/_build s3://apache-airflow-docs
 
   prepare-provider-packages:
     timeout-minutes: 40

[airflow] 10/16: Finish quarantine for test_should_force_kill_process (#15081)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 657e707aac3862af3a452ce39e56ff46cfa3d14c
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Apr 3 12:49:40 2021 +0200

    Finish quarantine for test_should_force_kill_process (#15081)
    
    Changing the test to check the actual PID of the process to kill
    
    (cherry picked from commit de22fc7fae05a4521870869f1035f5e4859e877f)
---
 tests/utils/test_process_utils.py | 12 +++++-------
 1 file changed, 5 insertions(+), 7 deletions(-)

diff --git a/tests/utils/test_process_utils.py b/tests/utils/test_process_utils.py
index 2c14ae4..21d6cdd 100644
--- a/tests/utils/test_process_utils.py
+++ b/tests/utils/test_process_utils.py
@@ -136,23 +136,21 @@ class TestKillChildProcessesByPids(unittest.TestCase):
         num_process = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().count("\n")
         assert before_num_process == num_process
 
-    @pytest.mark.quarantined
     def test_should_force_kill_process(self):
-        before_num_process = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().count("\n")
 
         process = multiprocessing.Process(target=my_sleep_subprocess_with_signals, args=())
         process.start()
         sleep(0)
 
-        num_process = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().count("\n")
-        assert before_num_process + 1 == num_process
+        all_processes = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().splitlines()
+        assert str(process.pid) in map(lambda x: x.strip(), all_processes)
 
         with self.assertLogs(process_utils.log) as cm:
             process_utils.kill_child_processes_by_pids([process.pid], timeout=0)
         assert any("Killing child PID" in line for line in cm.output)
-
-        num_process = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().count("\n")
-        assert before_num_process == num_process
+        sleep(0)
+        all_processes = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().splitlines()
+        assert str(process.pid) not in map(lambda x: x.strip(), all_processes)
 
 
 class TestPatchEnviron(unittest.TestCase):
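
The reworked assertions above look for the child's PID in `ps` output instead of counting processes, which is racy when unrelated processes start or exit. A standalone version of that check (the helper name `pid_is_listed` is an assumption, not part of the actual test file):

    # Standalone sketch of the PID lookup used in the test above.
    import subprocess

    def pid_is_listed(pid: int) -> bool:
        """Return True if `pid` appears in the output of `ps -ax -o pid=`."""
        lines = subprocess.check_output(["ps", "-ax", "-o", "pid="]).decode().splitlines()
        return str(pid) in (line.strip() for line in lines)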

[airflow] 02/16: The --force-pull-images is restored in breeze (#15063)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d7c45f359207488c72ac0567a959d3d77a9e9c1c
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Tue Mar 30 01:02:48 2021 +0200

    The --force-pull-images is restored in breeze (#15063)
    
    It had been accidentally removed during a rebase.
    
    (cherry picked from commit 6415489390c5ec3679f8d6684c88c1dd74414951)
---
 breeze-complete | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/breeze-complete b/breeze-complete
index a3c1e69..a75b267 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -166,7 +166,7 @@ help python: backend: integration:
 kubernetes-mode: kubernetes-version: helm-version: kind-version:
 skip-mounting-local-sources mount-all-local-sources install-airflow-version: install-airflow-reference: db-reset
 verbose assume-yes assume-no assume-quit forward-credentials init-script:
-force-build-images force-pull-base-python-image production-image extras: force-clean-images skip-rebuild-check
+force-build-images force-pull-images force-pull-base-python-image production-image extras: force-clean-images skip-rebuild-check
 build-cache-local build-cache-pulled build-cache-disabled disable-pip-cache
 dockerhub-user: dockerhub-repo: use-github-registry github-registry: github-repository: github-image-id: generate-constraints-mode:
 postgres-version: mysql-version:

[airflow] 08/16: Increase timeout for building the docs (#15157)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 321237a74b2494d05804ce1ff1f3a315ccc65b7a
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Apr 3 10:27:58 2021 +0200

    Increase timeout for building the docs (#15157)
    
    Sometimes when docs are built in parallel, it takes longer
    than 4 minutes to build a big package and the job fails with
    a timeout.
    
    This change increases the individual package build timeout
    from 4 minutes to 8.
    
    (cherry picked from commit 95ae24a953fb5d47452e492cea94768a2c8c3ec5)
---
 docs/exts/docs_build/code_utils.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/exts/docs_build/code_utils.py b/docs/exts/docs_build/code_utils.py
index 07fe8d0..5c88797 100644
--- a/docs/exts/docs_build/code_utils.py
+++ b/docs/exts/docs_build/code_utils.py
@@ -30,7 +30,7 @@ DOCKER_DOCS_DIR = os.path.join(DOCKER_PROJECT_DIR, "docs")
 DOCKER_AIRFLOW_DIR = os.path.join(DOCKER_PROJECT_DIR, "/airflow")
 ALL_PROVIDER_YAMLS = load_package_data()
 AIRFLOW_SITE_DIR = os.environ.get('AIRFLOW_SITE_DIRECTORY')
-PROCESS_TIMEOUT = 4 * 60
+PROCESS_TIMEOUT = 8 * 60  # 480 seconds
 
 TEXT_RED = '\033[31m'
 TEXT_RESET = '\033[0m'
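
PROCESS_TIMEOUT is the per-package ceiling applied to each parallel doc build. A hedged sketch of how such a constant is typically enforced with subprocess (the actual builder in docs/exts/docs_build may invoke sphinx differently; only the constant itself comes from the diff):

    # Hedged usage sketch - not the project's actual build code.
    import subprocess

    PROCESS_TIMEOUT = 8 * 60  # 480 seconds

    def build_one_package(source_dir: str, build_dir: str) -> int:
        """Build one docs package, failing it if the build exceeds the timeout."""
        try:
            completed = subprocess.run(
                ["sphinx-build", "-b", "html", source_dir, build_dir],
                timeout=PROCESS_TIMEOUT,
                check=False,
            )
        except subprocess.TimeoutExpired:
            return 1  # a hung build counts as a failure for this package
        return completed.returncode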