Posted to commits@airflow.apache.org by po...@apache.org on 2021/01/18 02:59:47 UTC

[airflow] branch master updated: Gets rid of all the docker cli tools in Breeze. (#13731)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
     new 58b36b8  Gets rid of all the docker cli tools in Breeze. (#13731)
58b36b8 is described below

commit 58b36b861c309db0b82a3962f8f9a89e8678a15e
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Mon Jan 18 03:59:34 2021 +0100

    Gets rid of all the docker cli tools in Breeze. (#13731)
    
    * Gets rid of all the docker cli tools in Breeze.
    
    The docker CLI tool wrappers caused more trouble than benefit.
    Replaced the last two (azure and aws) with installation into the
    "/files" directory so that they survive Breeze restarts.
    
    Also added a --reinstall flag to the installation command for all
    tools so that it is easier to reinstall them.
    
    * Update .pre-commit-config.yaml
---
 BREEZE.rst                                         |  44 +------
 CI.rst                                             |   3 -
 CONTRIBUTORS_QUICK_START.rst                       |  51 ++++----
 breeze                                             |   2 -
 scripts/ci/docker-compose/_docker.env              |   1 -
 scripts/ci/docker-compose/base.yml                 |   1 -
 scripts/ci/docker-compose/local.yml                |   1 -
 scripts/ci/libraries/_initialization.sh            |   5 -
 scripts/ci/libraries/_local_mounts.sh              |   1 -
 .../bin/{install_java.sh => install_aws.sh}        |  42 +++---
 .../bin/{install_kubectl.sh => install_az.sh}      |  50 +++++--
 scripts/in_container/bin/install_gcloud.sh         |  17 ++-
 scripts/in_container/bin/install_imgcat.sh         |  11 +-
 scripts/in_container/bin/install_java.sh           |  15 ++-
 scripts/in_container/bin/install_kubectl.sh        |  13 +-
 scripts/in_container/bin/install_terraform.sh      |  10 +-
 scripts/in_container/entrypoint_ci.sh              |   6 -
 scripts/in_container/run_cli_tool.sh               | 144 ---------------------
 18 files changed, 140 insertions(+), 277 deletions(-)
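
The pattern introduced here is the same for every install script under scripts/in_container/bin: install into a path under /files (which Breeze mounts from the host, so the tools survive container restarts) and accept an optional --reinstall flag that wipes the previous installation first. A minimal, hedged sketch of that shared convention (the tool name "mytool" and its paths are purely illustrative, not part of the commit):

    #!/usr/bin/env bash
    set -euo pipefail

    INSTALL_DIR="/files/opt/mytool"   # under /files so the install survives Breeze restarts
    BIN_PATH="/files/bin/mytool"

    # Optional --reinstall flag: wipe any previous installation before checking for the tool
    if [[ $# != "0" && ${1} == "--reinstall" ]]; then
        rm -rf "${INSTALL_DIR}"
        rm -f "${BIN_PATH}"
    fi

    hash -r   # forget bash's cached command locations after removing binaries

    # Refuse to install twice unless --reinstall was given
    if command -v mytool; then
        echo 'The "mytool" command found. Installation not needed.'
        echo "Run with --reinstall to reinstall."
        exit 1
    fi

    # ... tool-specific download and installation into ${INSTALL_DIR} and ${BIN_PATH} ...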

diff --git a/BREEZE.rst b/BREEZE.rst
index 3ecf770..561b719 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -446,6 +446,8 @@ is also available in ``$PATH``, so just type ``install_<TAB>`` to get a list of
 
 Currently available scripts:
 
+* ``install_aws.sh`` - installs `the AWS CLI <https://aws.amazon.com/cli/>`__
+* ``install_az.sh`` - installs `the Azure CLI <https://github.com/Azure/azure-cli>`__
 * ``install_gcloud.sh`` - installs `the Google Cloud SDK <https://cloud.google.com/sdk>`__ including
   ``gcloud``, ``gsutil``.
 * ``install_imgcat.sh`` - installs `imgcat - Inline Images Protocol <https://iterm2.com/documentation-images.html>`__
@@ -454,48 +456,6 @@ Currently available scripts:
 * ``install_kubectl.sh`` - installs `the Kubernetes command-line tool, kubectl <https://kubernetes.io/docs/reference/kubectl/kubectl/>`__
 * ``install_terraform.sh`` - installs `the Terraform <https://www.terraform.io/docs/index.html>`__
 
-**Docker wrappers**
-
-For tools that do not provide a simple method of installation in the custom directory, we provided simple
-wrappers for docker. Those CLIs are not installed when you build or pull the image - they will be downloaded
-as docker images the first time you attempt to use them. It is downloaded and executed in your host's
-docker engine so once it is downloaded, it will stay until you remove the downloaded images from your host
-container.
-
-For each of those CLI credentials are taken (automatically) from the credentials you have defined in
-your ``${HOME}`` directory on host.
-
-Those tools also have host Airflow source directory mounted in ``/opt/airflow`` path
-so you can directly transfer files to/from your airflow host sources.
-
-Those are currently installed CLIs (they are available as aliases to the docker commands):
-
-+-----------------------+----------+-------------------------------------------------+-------------------+
-| Cloud Provider        | CLI tool | Docker image                                    | Configuration dir |
-+=======================+==========+=================================================+===================+
-| Amazon Web Services   | aws      | amazon/aws-cli:latest                           | .aws              |
-+-----------------------+----------+-------------------------------------------------+-------------------+
-| Microsoft Azure       | az       | mcr.microsoft.com/azure-cli:latest              | .azure            |
-+-----------------------+----------+-------------------------------------------------+-------------------+
-
-For each of the CLIs we have also an accompanying ``*-update`` alias (for example ``aws-update``) which
-will pull the latest image for the tool. Note that all Google Cloud tools are served by one
-image and they are updated together.
-
-Also - in case you run several different Breeze containers in parallel (from different directories,
-with different versions) - they docker images for CLI Cloud Providers tools are shared so if you update it
-for one Breeze container, they will also get updated for all the other containers.
-
-.. raw:: html
-
-    <div align="center">
-      <a href="https://youtu.be/4MCTXq-oF68?t=1072">
-        <img src="images/breeze/overlayed_breeze_cloud_tools.png" width="640"
-             alt="Airflow Breeze - Cloud tools">
-      </a>
-    </div>
-
-
 Launching Breeze integrations
 -----------------------------
 
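For context, after this change BREEZE.rst documents only the install_*.sh scripts, which the surrounding section says are available on $PATH inside the Breeze container. A hedged usage example (it assumes /files/bin also ends up on the container's PATH, which the scripts' own "command -v" sanity checks rely on):

    # inside the Breeze container shell
    install_aws.sh               # installs the AWS CLI under /files/opt/aws and /files/bin/aws
    install_az.sh --reinstall    # wipes /files/opt/az and /files/bin/az, then installs again
    aws --version                # installed tools persist across Breeze restarts via /files
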
diff --git a/CI.rst b/CI.rst
index 3a534a9..2290f8b 100644
--- a/CI.rst
+++ b/CI.rst
@@ -190,9 +190,6 @@ You can use those variables when you try to reproduce the build locally.
 +-----------------------------------------+-------------+-------------+------------+-------------------------------------------------+
 | ``HOST_HOME``                           |             |             |            | Home directory on the host.                     |
 +-----------------------------------------+-------------+-------------+------------+-------------------------------------------------+
-| ``HOST_AIRFLOW_SOURCES``                |             |             |            | Directory where airflow sources are located     |
-|                                         |             |             |            | on the host.                                    |
-+-----------------------------------------+-------------+-------------+------------+-------------------------------------------------+
 |                                                           Image variables                                                          |
 +-----------------------------------------+-------------+-------------+------------+-------------------------------------------------+
 | ``INSTALL_AIRFLOW_VERSION``             |             |             |            | Installs Airflow version from PyPI when         |
diff --git a/CONTRIBUTORS_QUICK_START.rst b/CONTRIBUTORS_QUICK_START.rst
index 0fd914a..28f6256 100644
--- a/CONTRIBUTORS_QUICK_START.rst
+++ b/CONTRIBUTORS_QUICK_START.rst
@@ -594,19 +594,18 @@ All Tests are inside ./tests directory.
 .. code-block:: bash
 
    root@df8927308887:/opt/airflow# ./scripts/in_container/
-    _in_container_script_init.sh            run_generate_constraints.sh
-    check_environment.sh                    run_init_script.sh
-    entrypoint_ci.sh                        run_mypy.sh
-    entrypoint_exec.sh                      run_prepare_provider_packages.sh
-    prod/                                   run_prepare_provider_readme.sh
-    refresh_pylint_todo.sh                  run_pylint.sh
-    run_ci_tests.sh                         run_system_tests.sh
-    run_clear_tmp.sh                        run_test_package_import_all_classes.sh
-    run_cli_tool.sh                         run_test_package_install.sh
-    run_docs_build.sh                       run_tmux.sh
-    run_extract_tests.sh                    run_tmux_welcome.sh
-    run_fix_ownership.sh                    stop_tmux_airflow.sh
-    run_flake8.sh                           update_quarantined_test_status.py
+      bin/                                        run_flake8.sh*
+      check_environment.sh*                       run_generate_constraints.sh*
+      entrypoint_ci.sh*                           run_init_script.sh*
+      entrypoint_exec.sh*                         run_install_and_test_provider_packages.sh*
+      _in_container_script_init.sh*               run_mypy.sh*
+      prod/                                       run_prepare_provider_packages.sh*
+      refresh_pylint_todo.sh*                     run_prepare_provider_readme.sh*
+      run_ci_tests.sh*                            run_pylint.sh*
+      run_clear_tmp.sh*                           run_system_tests.sh*
+      run_docs_build.sh*                          run_tmux_welcome.sh*
+      run_extract_tests.sh*                       stop_tmux_airflow.sh*
+      run_fix_ownership.sh*                       update_quarantined_test_status.py*
 
    root@df8927308887:/opt/airflow# ./scripts/in_container/run_docs_build.sh
 
@@ -810,19 +809,19 @@ To avoid burden on CI infrastructure and to save time, Pre-commit hooks can be r
 .. code-block:: bash
 
    root@df8927308887:/opt/airflow# ./scripts/in_container/
-    _in_container_script_init.sh            run_generate_constraints.sh
-    check_environment.sh                    run_init_script.sh
-    entrypoint_ci.sh                        run_mypy.sh
-    entrypoint_exec.sh                      run_prepare_provider_packages.sh
-    prod/                                   run_prepare_provider_readme.sh
-    refresh_pylint_todo.sh                  run_pylint.sh
-    run_ci_tests.sh                         run_system_tests.sh
-    run_clear_tmp.sh                        run_test_package_import_all_classes.sh
-    run_cli_tool.sh                         run_test_package_install.sh
-    run_docs_build.sh                       run_tmux.sh
-    run_extract_tests.sh                    run_tmux_welcome.sh
-    run_fix_ownership.sh                    stop_tmux_airflow.sh
-    run_flake8.sh                           update_quarantined_test_status.py
+      bin/                                        run_flake8.sh*
+      check_environment.sh*                       run_generate_constraints.sh*
+      entrypoint_ci.sh*                           run_init_script.sh*
+      entrypoint_exec.sh*                         run_install_and_test_provider_packages.sh*
+      _in_container_script_init.sh*               run_mypy.sh*
+      prod/                                       run_prepare_provider_packages.sh*
+      refresh_pylint_todo.sh*                     run_prepare_provider_readme.sh*
+      run_ci_tests.sh*                            run_pylint.sh*
+      run_clear_tmp.sh*                           run_system_tests.sh*
+      run_docs_build.sh*                          run_tmux_welcome.sh*
+      run_extract_tests.sh*                       stop_tmux_airflow.sh*
+      run_fix_ownership.sh*                       update_quarantined_test_status.py*
+
 
    root@df8927308887:/opt/airflow# ./scripts/in_container/run_docs_build.sh
 
diff --git a/breeze b/breeze
index 042067b..268fc7d 100755
--- a/breeze
+++ b/breeze
@@ -548,7 +548,6 @@ EOF
 #   PYTHON_MAJOR_MINOR_VERSION
 #   DOCKERHUB_USER
 #   DOCKERHUB_REPO
-#   HOST_AIRFLOW_SOURCES
 #   BACKEND
 #   AIRFLOW_VERSION
 #   INSTALL_AIRFLOW_VERSION
@@ -595,7 +594,6 @@ export DOCKERHUB_USER=${DOCKERHUB_USER}
 export DOCKERHUB_REPO=${DOCKERHUB_REPO}
 export HOST_USER_ID=${HOST_USER_ID}
 export HOST_GROUP_ID=${HOST_GROUP_ID}
-export HOST_AIRFLOW_SOURCES="${AIRFLOW_SOURCES}"
 export COMPOSE_FILE="${compose_file}"
 export PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}"
 export BACKEND="${BACKEND}"
diff --git a/scripts/ci/docker-compose/_docker.env b/scripts/ci/docker-compose/_docker.env
index 29fc93f..69f4547 100644
--- a/scripts/ci/docker-compose/_docker.env
+++ b/scripts/ci/docker-compose/_docker.env
@@ -37,7 +37,6 @@ HOST_USER_ID
 HOST_GROUP_ID
 HOST_OS
 HOST_HOME
-HOST_AIRFLOW_SOURCES
 INIT_SCRIPT_FILE
 INSTALL_AIRFLOW_VERSION
 INSTALL_PROVIDERS_FROM_SOURCES
diff --git a/scripts/ci/docker-compose/base.yml b/scripts/ci/docker-compose/base.yml
index 8c9d9a9..eab6425 100644
--- a/scripts/ci/docker-compose/base.yml
+++ b/scripts/ci/docker-compose/base.yml
@@ -24,7 +24,6 @@ services:
       - ADDITIONAL_PATH=~/.local/bin
       - CELERY_BROKER_URLS=amqp://guest:guest@rabbitmq:5672,redis://redis:6379/0
       - KUBECONFIG=/files/.kube/config
-      - HOST_AIRFLOW_SOURCES=${AIRFLOW_SOURCES}
       - HOST_HOME=${HOME}
     env_file:
       - _docker.env
diff --git a/scripts/ci/docker-compose/local.yml b/scripts/ci/docker-compose/local.yml
index cf39f88..641f93f 100644
--- a/scripts/ci/docker-compose/local.yml
+++ b/scripts/ci/docker-compose/local.yml
@@ -56,7 +56,6 @@ services:
       - ../../../tests:/opt/airflow/tests:cached
       - ../../../kubernetes_tests:/opt/airflow/kubernetes_tests:cached
       - ../../../chart:/opt/airflow/chart:cached
-      - ../../../tmp:/tmp:cached
       - ../../../metastore_browser:/opt/airflow/metastore_browser:cached
       # END automatically generated volumes from LOCAL_MOUNTS in _local_mounts.sh
     ports:
diff --git a/scripts/ci/libraries/_initialization.sh b/scripts/ci/libraries/_initialization.sh
index b63b02b..bdf7ed5 100644
--- a/scripts/ci/libraries/_initialization.sh
+++ b/scripts/ci/libraries/_initialization.sh
@@ -288,9 +288,6 @@ function initialization::initialize_host_variables() {
     # Home directory of the host user
     export HOST_HOME="${HOME}"
 
-    # Sources of Airflow on the host.
-    export HOST_AIRFLOW_SOURCES="${HOST_AIRFLOW_SOURCES:=${AIRFLOW_SOURCES}}"
-
     # In case of MacOS we need to use gstat - gnu version of the stats
     export STAT_BIN=stat
     if [[ "${OSTYPE}" == "darwin"* ]]; then
@@ -600,7 +597,6 @@ Host variables:
     HOST_GROUP_ID=${HOST_GROUP_ID}
     HOST_OS=${HOST_OS}
     HOST_HOME=${HOST_HOME}
-    HOST_AIRFLOW_SOURCES=${HOST_AIRFLOW_SOURCES}
 
 Version suffix variables:
 
@@ -729,7 +725,6 @@ function initialization::make_constants_read_only() {
 
     readonly HOST_USER_ID
     readonly HOST_GROUP_ID
-    readonly HOST_AIRFLOW_SOURCES
     readonly HOST_HOME
     readonly HOST_OS
 
diff --git a/scripts/ci/libraries/_local_mounts.sh b/scripts/ci/libraries/_local_mounts.sh
index 7f1a7a8..4925140 100644
--- a/scripts/ci/libraries/_local_mounts.sh
+++ b/scripts/ci/libraries/_local_mounts.sh
@@ -52,7 +52,6 @@ function local_mounts::generate_local_mounts_list {
         "$prefix"tests:/opt/airflow/tests:cached
         "$prefix"kubernetes_tests:/opt/airflow/kubernetes_tests:cached
         "$prefix"chart:/opt/airflow/chart:cached
-        "$prefix"tmp:/tmp:cached
         "$prefix"metastore_browser:/opt/airflow/metastore_browser:cached
     )
 }
diff --git a/scripts/in_container/bin/install_java.sh b/scripts/in_container/bin/install_aws.sh
similarity index 61%
copy from scripts/in_container/bin/install_java.sh
copy to scripts/in_container/bin/install_aws.sh
index 8bdf48b..e1e8ec3 100755
--- a/scripts/in_container/bin/install_java.sh
+++ b/scripts/in_container/bin/install_aws.sh
@@ -19,16 +19,26 @@
 
 set -euo pipefail
 
-if command -v java; then
-    echo 'The "java" command found. Installation not needed.'
+INSTALL_DIR="/files/opt/aws"
+BIN_PATH="/files/bin/aws"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -rf "${INSTALL_DIR}"
+    rm -f "${BIN_PATH}"
+fi
+
+hash -r
+
+if command -v aws; then
+    echo 'The "aws" command found. Installation not needed. Run with --reinstall to reinstall'
+    echo "Run with --reinstall to reinstall."
     exit 1
 fi
 
-DOWNLOAD_URL='https://download.java.net/openjdk/jdk8u41/ri/openjdk-8u41-b04-linux-x64-14_jan_2020.tar.gz'
-INSTALL_DIR="/files/opt/java"
+DOWNLOAD_URL="https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip"
 
 if [[ -e ${INSTALL_DIR} ]]; then
-    echo "The install directory (${INSTALL_DIR}) already exists. This may mean java is already installed."
+    echo "The install directory (${INSTALL_DIR}) already exists. This may mean AWS CLI is already installed."
     echo "Please delete this directory to start the installation."
     exit 1
 fi
@@ -38,23 +48,19 @@ TMP_DIR="$(mktemp -d)"
 trap "rm -rf ${TMP_DIR}" EXIT
 
 mkdir -p "${INSTALL_DIR}"
-
 echo "Downloading from ${DOWNLOAD_URL}"
-curl -# --fail "${DOWNLOAD_URL}" --output "${TMP_DIR}/openjdk.tar.gz"
-
+curl -# --fail "${DOWNLOAD_URL}" --output "${TMP_DIR}/awscliv2.zip"
 echo "Extracting archive"
-tar xzf "${TMP_DIR}/openjdk.tar.gz" -C "${INSTALL_DIR}" --strip-components=1
-
-echo 'Symlinking executables files to /files/bin'
-mkdir -p "/files/bin/"
-while IPS='' read -r line; do
-    BIN_NAME="$(basename "${line}")"
-    ln -s "${line}" "/files/bin/${BIN_NAME}"
-done < <(find "${INSTALL_DIR}/bin/" -type f)
+pushd "${TMP_DIR}" && unzip "${TMP_DIR}/awscliv2.zip" && cd aws && \
+    "./install" \
+    --update \
+    --install-dir ${INSTALL_DIR} \
+    --bin-dir "/files/bin/" && \
+    popd
 
 # Sanity check
-if ! command -v java > /dev/null; then
-    echo 'Installation failed. The command "java" was not found.'
+if ! command -v aws > /dev/null; then
+    echo 'Installation failed. The command "aws" was not found.'
     exit 1
 fi
 
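The new install_aws.sh above delegates to the official AWS CLI v2 bundle installer, pointing its --install-dir and --bin-dir under /files so the result lands on the host-mounted volume. A quick, hedged way to confirm where everything ended up after running it (paths taken from the script; version output will differ):

    command -v aws      # expected to print /files/bin/aws
    ls /files/opt/aws   # the AWS CLI v2 tree created by ./install --install-dir
    aws --version       # extra check; the script itself only verifies "command -v aws"
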
diff --git a/scripts/in_container/bin/install_kubectl.sh b/scripts/in_container/bin/install_az.sh
similarity index 51%
copy from scripts/in_container/bin/install_kubectl.sh
copy to scripts/in_container/bin/install_az.sh
index ae7916d..51ef31a 100755
--- a/scripts/in_container/bin/install_kubectl.sh
+++ b/scripts/in_container/bin/install_az.sh
@@ -19,29 +19,51 @@
 
 set -euo pipefail
 
-if command -v kubectl; then
-    echo 'The "kubectl" command found. Installation not needed.'
+INSTALL_DIR="/files/opt/az"
+BIN_PATH="/files/bin/az"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -rf "${INSTALL_DIR}"
+    rm -f "${BIN_PATH}"
+fi
+
+hash -r
+
+if command -v az; then
+    echo 'The "az" command found. Installation not needed. Run with --reinstall to reinstall'
     exit 1
 fi
 
-KUBECTL_VERSION="$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)"
-DOWNLOAD_URL="https://storage.googleapis.com/kubernetes-release/release/${KUBECTL_VERSION}/bin/linux/amd64/kubectl"
-BIN_PATH="/files/bin/kubectl"
 
-if [[ -e ${BIN_PATH} ]]; then
-    echo "The binary file (${BIN_PATH}) already exists. This may mean kubectl is already installed."
-    echo "Please delete this file to start the installation."
+if [[ -e ${INSTALL_DIR} ]]; then
+    echo "The install directory (${INSTALL_DIR}) already exists. This may mean az CLI is already installed."
+    echo "Run with --reinstall to reinstall."
     exit 1
 fi
 
-mkdir -p "/files/bin/"
-echo "Downloading from ${DOWNLOAD_URL}"
-curl -# --fail "${DOWNLOAD_URL}" --output "${BIN_PATH}"
-chmod +x "${BIN_PATH}"
+virtualenv /files/opt/az
+
+# ignore the source
+# shellcheck source=/dev/null
+source /files/opt/az/bin/activate
+
+pip install azure-cli
+
+cat >/files/opt/az/az <<EOF
+#!/usr/bin/env bash
+
+source /files/opt/az/bin/activate
+
+az "\${@}"
+EOF
+
+chmod a+x /files/opt/az/az
+
+ln -s /files/opt/az/az "${BIN_PATH}"
 
 # Sanity check
-if ! command -v kubectl > /dev/null; then
-    echo 'Installation failed. The command "kubectl" was not found.'
+if ! command -v az > /dev/null; then
+    echo 'Installation failed. The command "az" was not found.'
     exit 1
 fi
 
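install_az.sh takes a different route: azure-cli is pip-installed into a dedicated virtualenv under /files/opt/az, and the generated /files/opt/az/az wrapper (symlinked to /files/bin/az) activates that virtualenv before forwarding its arguments, which presumably keeps azure-cli's sizable dependency tree out of the Airflow environment. A hedged usage sketch:

    az --version                # runs via the wrapper inside the /files/opt/az virtualenv
    az login                    # log in from inside the container; no docker wrapper involved
    install_az.sh --reinstall   # recreates the virtualenv from scratch if it ever breaks
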
diff --git a/scripts/in_container/bin/install_gcloud.sh b/scripts/in_container/bin/install_gcloud.sh
index 755369e..96d5017 100755
--- a/scripts/in_container/bin/install_gcloud.sh
+++ b/scripts/in_container/bin/install_gcloud.sh
@@ -19,18 +19,27 @@
 
 set -euo pipefail
 
+INSTALL_DIR="/files/opt/google-cloud-sdk"
+BIN_PATH="/files/bin/gcloud"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -rf "${INSTALL_DIR}"
+    rm -f "${BIN_PATH}"
+fi
+
+hash -r
+
 if command -v gcloud; then
-    echo 'The "gcloud" command found. Installation not needed.'
+    echo 'The "gcloud" command found. Installation not needed.  Run with --reinstall to reinstall'
     exit 1
 fi
 
 CLOUD_SDK_VERSION=322.0.0
 DOWNLOAD_URL="https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-${CLOUD_SDK_VERSION}-linux-x86_64.tar.gz"
-INSTALL_DIR="/files/opt/google-cloud-sdk"
 
 if [[ -e ${INSTALL_DIR} ]]; then
     echo "The install directory (${INSTALL_DIR}) already exists. This may mean Cloud SDK is already installed."
-    echo "Please delete this directory to start the installation."
+    echo "Run with --reinstall to reinstall."
     exit 1
 fi
 
@@ -55,7 +64,7 @@ echo 'Symlinking executables files to /files/bin'
 mkdir -p "/files/bin/"
 while IPS='' read -r line; do
     BIN_NAME="$(basename "${line}")"
-    ln -s "${line}" "/files/bin/${BIN_NAME}"
+    ln -sf "${line}" "/files/bin/${BIN_NAME}"
 done < <(find "${INSTALL_DIR}/bin/" -type f)
 
 # Sanity check
diff --git a/scripts/in_container/bin/install_imgcat.sh b/scripts/in_container/bin/install_imgcat.sh
index ac0542d..6609055 100755
--- a/scripts/in_container/bin/install_imgcat.sh
+++ b/scripts/in_container/bin/install_imgcat.sh
@@ -19,8 +19,15 @@
 
 set -euo pipefail
 
+BIN_PATH="/files/bin/imgcat"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -f "${BIN_PATH}"
+fi
+
 if command -v imgcat; then
     echo 'The "imgcat" command found. Installation not needed.'
+    echo "Run with --reinstall to reinstall."
     exit 1
 fi
 
@@ -28,8 +35,8 @@ DOWNLOAD_URL="https://iterm2.com/utilities/imgcat"
 
 mkdir -p "/files/bin/"
 echo "Downloading from ${DOWNLOAD_URL}"
-curl -# --fail "${DOWNLOAD_URL}" --output "/files/bin/imgcat"
-chmod +x "/files/bin/imgcat"
+curl -# --fail "${DOWNLOAD_URL}" --output "${BIN_PATH}"
+chmod +x "${BIN_PATH}"
 
 # Sanity check
 if ! command -v imgcat > /dev/null; then
diff --git a/scripts/in_container/bin/install_java.sh b/scripts/in_container/bin/install_java.sh
index 8bdf48b..49c9040 100755
--- a/scripts/in_container/bin/install_java.sh
+++ b/scripts/in_container/bin/install_java.sh
@@ -19,17 +19,26 @@
 
 set -euo pipefail
 
+INSTALL_DIR="/files/opt/java"
+BIN_PATH="/files/bin/java"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -rf "${INSTALL_DIR}"
+    rm -f "${BIN_PATH}"
+fi
+
+hash -r
+
 if command -v java; then
-    echo 'The "java" command found. Installation not needed.'
+    echo 'The "java" command found. Installation not needed. Run with --reinstall to reinstall'
     exit 1
 fi
 
 DOWNLOAD_URL='https://download.java.net/openjdk/jdk8u41/ri/openjdk-8u41-b04-linux-x64-14_jan_2020.tar.gz'
-INSTALL_DIR="/files/opt/java"
 
 if [[ -e ${INSTALL_DIR} ]]; then
     echo "The install directory (${INSTALL_DIR}) already exists. This may mean java is already installed."
-    echo "Please delete this directory to start the installation."
+    echo "Run with --reinstall to reinstall."
     exit 1
 fi
 
diff --git a/scripts/in_container/bin/install_kubectl.sh b/scripts/in_container/bin/install_kubectl.sh
index ae7916d..b2604ae 100755
--- a/scripts/in_container/bin/install_kubectl.sh
+++ b/scripts/in_container/bin/install_kubectl.sh
@@ -19,18 +19,25 @@
 
 set -euo pipefail
 
+BIN_PATH="/files/bin/kubectl"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -f "${BIN_PATH}"
+fi
+
+hash -r
+
 if command -v kubectl; then
-    echo 'The "kubectl" command found. Installation not needed.'
+    echo 'The "kubectl" command found. Installation not needed. Run with --reinstall to reinstall'
     exit 1
 fi
 
 KUBECTL_VERSION="$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)"
 DOWNLOAD_URL="https://storage.googleapis.com/kubernetes-release/release/${KUBECTL_VERSION}/bin/linux/amd64/kubectl"
-BIN_PATH="/files/bin/kubectl"
 
 if [[ -e ${BIN_PATH} ]]; then
     echo "The binary file (${BIN_PATH}) already exists. This may mean kubectl is already installed."
-    echo "Please delete this file to start the installation."
+    echo "Run with --reinstall to reinstall."
     exit 1
 fi
 
diff --git a/scripts/in_container/bin/install_terraform.sh b/scripts/in_container/bin/install_terraform.sh
index 6d86bdf..12edbba 100755
--- a/scripts/in_container/bin/install_terraform.sh
+++ b/scripts/in_container/bin/install_terraform.sh
@@ -19,8 +19,16 @@
 
 set -euo pipefail
 
+BIN_PATH="/files/bin/terraform"
+
+if [[ $# != "0" && ${1} == "--reinstall" ]]; then
+    rm -f "${BIN_PATH}"
+fi
+
+hash -r
+
 if command -v terraform; then
-    echo 'The "terraform" command found. Installation not needed.'
+    echo 'The "terraform" command found. Installation not needed. Run with --reinstall to reinstall'
     exit 1
 fi
 
diff --git a/scripts/in_container/entrypoint_ci.sh b/scripts/in_container/entrypoint_ci.sh
index f3be856..1c740df 100755
--- a/scripts/in_container/entrypoint_ci.sh
+++ b/scripts/in_container/entrypoint_ci.sh
@@ -58,12 +58,6 @@ RUN_TESTS=${RUN_TESTS:="false"}
 CI=${CI:="false"}
 INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION:=""}"
 
-if [[ ${GITHUB_ACTIONS:="false"} == "false" ]]; then
-    # Create links for useful CLI tools
-    # shellcheck source=scripts/in_container/run_cli_tool.sh
-    source <(bash scripts/in_container/run_cli_tool.sh)
-fi
-
 if [[ ${AIRFLOW_VERSION} == *1.10* || ${INSTALL_AIRFLOW_VERSION} == *1.10* ]]; then
     export RUN_AIRFLOW_1_10="true"
 else
diff --git a/scripts/in_container/run_cli_tool.sh b/scripts/in_container/run_cli_tool.sh
deleted file mode 100755
index 1d8e2b7..0000000
--- a/scripts/in_container/run_cli_tool.sh
+++ /dev/null
@@ -1,144 +0,0 @@
-#!/usr/bin/env bash
-
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-set -euo pipefail
-
-if [ -z "${AIRFLOW_CI_IMAGE=}" ]; then
-    echo
-    echo  "${COLOR_RED}ERROR: Missing environment variable AIRFLOW_CI_IMAGE  ${COLOR_RESET}"
-    echo
-    exit 1
-fi
-if [ -z "${HOST_AIRFLOW_SOURCES=}" ]; then
-    echo
-    echo  "${COLOR_RED}ERROR: Missing environment variable HOST_AIRFLOW_SOURCES  ${COLOR_RESET}"
-    echo
-    exit 1
-fi
-if [ -z "${HOST_USER_ID=}" ]; then
-    echo
-    echo  "${COLOR_RED}ERROR: Missing environment variable HOST_USER_ID  ${COLOR_RESET}"
-    echo
-    exit 1
-fi
-if [ -z "${HOST_GROUP_ID=}" ]; then
-    echo
-    echo  "${COLOR_RED}ERROR: Missing environment variable HOST_GROUP_ID   ${COLOR_RESET}"
-    echo
-    exit 1
-fi
-
-SCRIPT_NAME="$( basename "${BASH_SOURCE[0]}")"
-# Drop "-update" suffix, if exists
-TOOL_NAME="$(echo "${SCRIPT_NAME}" | cut -d "-" -f 1)"
-
-SUPPORTED_TOOL_NAMES=("aws" "az")
-
-if [ ! -L "${BASH_SOURCE[0]}" ]
-then
-    SCRIPT_PATH=$(readlink -e "${BASH_SOURCE[0]}")
-    # Direct execution - return installation script
-    echo "# CLI tool wrappers"
-    echo "#"
-    echo "# To install, run the following command:"
-    echo "#     source <(bash ${SCRIPT_PATH@Q})"
-    echo "#"
-    echo ""
-    # Print installation script
-    for NAME in "${SUPPORTED_TOOL_NAMES[@]}"
-    do
-        echo "ln -s ${SCRIPT_PATH@Q} /usr/bin/${NAME}"
-        echo "ln -s ${SCRIPT_PATH@Q} /usr/bin/${NAME}-update"
-        echo "chmod +x /usr/bin/${NAME} /usr/bin/${NAME}-update"
-    done
-    exit 0
-fi
-ENV_TMP_FILE=$(mktemp)
-env > "${ENV_TMP_FILE}"
-cleanup() {
-    rm "${ENV_TMP_FILE}"
-}
-trap cleanup EXIT HUP INT TERM
-
-CONTAINER_ID="$(head -n 1 < /proc/self/cgroup | cut -d ":" -f 3 | cut -d "/" -f 3)"
-
-COMMON_DOCKER_ARGS=(
-    # Share namespaces between all containers.
-    # This way we are even closer to run those tools like if they were installed.
-    # More information: https://docs.docker.com/get-started/overview/#namespaces
-    --ipc "container:${CONTAINER_ID}"
-    --pid "container:${CONTAINER_ID}"
-    --network "container:${CONTAINER_ID}"
-    -v "${HOST_AIRFLOW_SOURCES}/tmp:/tmp"
-    -v "${HOST_AIRFLOW_SOURCES}/files:/files"
-    -v "${HOST_AIRFLOW_SOURCES}:/opt/airflow"
-    --env-file "${ENV_TMP_FILE}"
-    -w "${PWD}"
-)
-
-AWS_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.aws:/root/.aws")
-AZURE_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.azure:/root/.azure")
-
-COMMAND=("${@}")
-
-# Configure selected tool
-case "${TOOL_NAME}" in
-    aws )
-        COMMON_DOCKER_ARGS+=("${AWS_CREDENTIALS_DOCKER_ARGS[@]}")
-        IMAGE_NAME="amazon/aws-cli:latest"
-        ;;
-    az )
-        COMMON_DOCKER_ARGS+=("${AZURE_CREDENTIALS_DOCKER_ARGS[@]}")
-        IMAGE_NAME="mcr.microsoft.com/azure-cli:latest"
-        ;;
-    * )
-        echo
-        echo  "${COLOR_RED}ERROR: Unsupported tool name: ${TOOL_NAME}  ${COLOR_RESET}"
-        echo
-        exit 1
-        ;;
-esac
-
-# Run update, if requested
-if [[ "${SCRIPT_NAME}" == *-update ]]; then
-    docker pull "${IMAGE_NAME}"
-    exit $?
-fi
-
-# Otherwise, run tool
-TOOL_DOCKER_ARGS=(--rm --interactive)
-TOOL_DOCKER_ARGS+=("${COMMON_DOCKER_ARGS[@]}")
-
-if [ -t 0 ] ; then
-    TOOL_DOCKER_ARGS+=(
-        --tty
-    )
-fi
-set +e
-docker run "${TOOL_DOCKER_ARGS[@]}" "${IMAGE_NAME}" "${COMMAND[@]}"
-
-RES=$?
-
-# Set file permissions to the host user
-if [[ "${HOST_OS}" == "Linux" ]]; then
-    docker run --rm "${COMMON_DOCKER_ARGS[@]}" \
-        --entrypoint /opt/airflow/scripts/in_container/run_fix_ownership.sh \
-            "${AIRFLOW_CI_IMAGE}"
-fi
-
-exit ${RES}