Posted to commits@airflow.apache.org by po...@apache.org on 2020/11/29 23:59:47 UTC

[airflow] branch v1-10-test updated (104bd5c -> 07f28ca)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 104bd5c  fixup! fixup! fixup! Update setup.py to get non-conflicting set of dependencies (#12636)
 discard bc1ad2e  Update setup.py to get non-conflicting set of dependencies (#12636)
 discard f1b0cfd  fixup! fixup! Update setup.py to get non-conflicting set of dependencies (#12636)
 discard 15e759e  fixup! Update setup.py to get non-conflicting set of dependencies (#12636)
 discard 2182972  fixup! Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
 discard f71934d  Update setup.py to get non-conflicting set of dependencies (#12636)
 discard 55cef31  fixup! Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
 discard db059d8  Setup.cfg change triggers full build (#12684)
 discard b354b30  Remove "@" references from constraints generattion (#12671)
 discard 2e844cd  Add 1.10.13 to CI, Breeze and Docs (#12652)
 discard 9fe9102  Allows mounting local sources for github run-id images (#12650)
 discard fcf1c01  Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)
 discard d1abe71  Adds possibility of forcing upgrade constraint by setting a label (#12635)
 discard d031621  Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
 discard eecef12  Adds missing licence headers (#12593)
 discard 6ecc7b4  Fixes unneeded docker-context-files added in CI (#12534)
 discard acd5d12  Fix wait-for-migrations command in helm chart (#12522)
 discard 29fd0cf  Fix broken CI.yml (#12454)
 discard ccf7b3e  Cope with '%' in password when waiting for migrations (#12440)
 discard 98b3dd9  The messages about remote image check are only shown with -v (#12402)
 discard 8f3b2a3  Switching to Ubuntu 20.04 as Github Actions runner. (#12404)
 discard 1ae0c5f  Remove CodeQL from PRS. (#12406)
 discard 85075c4  Fix typo in check_environment.sh (#12395)
 discard 680986b  Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
     new 1fc1220  Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
     new 55d54d8  Fix typo in check_environment.sh (#12395)
     new a5d2650  Remove CodeQL from PRS. (#12406)
     new b914c32  Switching to Ubuntu 20.04 as Github Actions runner. (#12404)
     new b5a8ca9  The messages about remote image check are only shown with -v (#12402)
     new e58cfa0  Cope with '%' in password when waiting for migrations (#12440)
     new c6021cb  Fix broken CI.yml (#12454)
     new 9f90eb0  Fix wait-for-migrations command in helm chart (#12522)
     new 08d9e0e  Fixes unneeded docker-context-files added in CI (#12534)
     new 49d052b0 Adds missing licence headers (#12593)
     new cf3baba  Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
     new 113c493  Adds possibility of forcing upgrade constraint by setting a label (#12635)
     new e08f1e2  Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)
     new e7a395d  Allows mounting local sources for github run-id images (#12650)
     new 659779f  Add 1.10.13 to CI, Breeze and Docs (#12652)
     new 36d3109  Remove "@" references from constraints generattion (#12671)
     new 61e0114  Setup.cfg change triggers full build (#12684)
     new 07f28ca  Update setup.py to get non-conflicting set of dependencies (#12636)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (104bd5c)
            \
             N -- N -- N   refs/heads/v1-10-test (07f28ca)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 18 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
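For readers unfamiliar with the situation described above, a minimal reproduction in a throwaway repository (assuming git is installed; commit names mirror the B/O/N diagram) shows how rewinding a branch leaves the old O revision unreachable while the new N revision takes its place:

```shell
# Build a tiny history B -- O1, then rewind to B and commit N1 instead,
# mimicking what a --force push publishes to the shared repository.
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "B"
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "O1"
old=$(git rev-parse HEAD)
git reset -q --hard HEAD~1        # back to the common base B
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "N1"
new=$(git rev-parse HEAD)

# N1 is on the branch; O1 is no longer reachable from it ("discarded")
git merge-base --is-ancestor "$new" HEAD && echo "N1 on branch"
git merge-base --is-ancestor "$old" HEAD || echo "O1 discarded"
```

If no other ref points at the discarded commit, git will eventually garbage-collect it, which is why "discard" revisions are described as gone forever.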


Summary of changes:


[airflow] 17/18: Setup.cfg change triggers full build (#12684)


potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 61e01148ee9ffa97910ecb14d2aa432edc5029ad
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Nov 28 12:39:46 2020 +0100

    Setup.cfg change triggers full build (#12684)
    
    Since we moved part of the setup.py specification to
    setup.cfg, we should trigger full build when only that file
    changes.
    
    (cherry picked from commit e4ab453a37c629e22d3d480511b43570f5237338)
---
 scripts/ci/selective_ci_checks.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scripts/ci/selective_ci_checks.sh b/scripts/ci/selective_ci_checks.sh
index a6c66eb..c87ec41 100755
--- a/scripts/ci/selective_ci_checks.sh
+++ b/scripts/ci/selective_ci_checks.sh
@@ -370,6 +370,7 @@ function run_all_tests_if_environment_files_changed() {
         "^Dockerfile"
         "^scripts"
         "^setup.py"
+        "^setup.cfg"
     )
     show_changed_files
 

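The one-line patch above appends `^setup.cfg` to the list of anchored regexes that `run_all_tests_if_environment_files_changed` matches against the changed files. A rough standalone sketch of that matching (variable names here are illustrative, not the script's own helpers):

```shell
# Hypothetical sketch: if any changed file matches one of these anchored
# regexes, the selective checks fall back to running the full test matrix.
patterns="^Dockerfile ^scripts ^setup.py ^setup.cfg"
changed_files="setup.cfg
airflow/models/dag.py"
match="false"
for p in $patterns; do
    if printf '%s\n' "$changed_files" | grep -qE "$p"; then
        match="true"
        break
    fi
done
echo "$match"   # prints "true": a setup.cfg change now triggers the full build
```

Before this patch, `setup.cfg` matched none of the patterns, so moving dependency metadata out of `setup.py` had silently narrowed what triggered a full build.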

[airflow] 12/18: Adds possibility of forcing upgrade constraint by setting a label (#12635)


potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 113c4933e5b720eb6efedf494c059dd3df5103b8
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Thu Nov 26 11:02:33 2020 +0100

    Adds possibility of forcing upgrade constraint by setting a label (#12635)
    
    You can now set a label on PR that will force upgrading to latest
    dependencies in your PR. If committer sets an
    "upgrade to latest dependencies" label, it will cause the PR
    to upgrade all dependencies to latest versions of dependencies
    matching setup.py + setup.cfg configuration.
    
    (cherry picked from commit 8b9d52f0cc197832188f431a2b6e4eb256f9725b)
---
 .github/workflows/build-images-workflow-run.yml    | 18 ++-----
 .github/workflows/ci.yml                           | 10 ++--
 .github/workflows/codeql-analysis.yml              |  4 +-
 .../workflows/label_when_reviewed_workflow_run.yml |  4 +-
 CONTRIBUTING.rst                                   |  5 ++
 scripts/ci/selective_ci_checks.sh                  | 56 ++++++++++++++++------
 6 files changed, 60 insertions(+), 37 deletions(-)

diff --git a/.github/workflows/build-images-workflow-run.yml b/.github/workflows/build-images-workflow-run.yml
index 9726c5a..c5480c6 100644
--- a/.github/workflows/build-images-workflow-run.yml
+++ b/.github/workflows/build-images-workflow-run.yml
@@ -30,7 +30,6 @@ env:
   SKIP_CHECK_REMOTE_IMAGE: "true"
   DB_RESET: "true"
   VERBOSE: "true"
-  UPGRADE_TO_LATEST_CONSTRAINTS: false
   USE_GITHUB_REGISTRY: "true"
   GITHUB_REPOSITORY: ${{ github.repository }}
   GITHUB_USERNAME: ${{ github.actor }}
@@ -57,7 +56,6 @@ jobs:
       sourceEvent: ${{ steps.source-run-info.outputs.sourceEvent }}
       cacheDirective: ${{ steps.cache-directive.outputs.docker-cache }}
       buildImages: ${{ steps.build-images.outputs.buildImages }}
-      upgradeToLatestConstraints: ${{ steps.upgrade-constraints.outputs.upgradeToLatestConstraints }}
     steps:
       - name: "Get information about the original trigger of the run"
         uses: potiuk/get-workflow-origin@588cc14f9f1cdf1b8be3db816855e96422204fec  # v1_3
@@ -153,15 +151,6 @@ jobs:
           else
               echo "::set-output name=docker-cache::pulled"
           fi
-      - name: "Set upgrade to latest constraints"
-        id: upgrade-constraints
-        run: |
-          if [[ ${{ steps.cancel.outputs.sourceEvent == 'push' ||
-              steps.cancel.outputs.sourceEvent == 'scheduled' }} == 'true' ]]; then
-              echo "::set-output name=upgradeToLatestConstraints::${{ github.sha }}"
-          else
-              echo "::set-output name=upgradeToLatestConstraints::false"
-          fi
       - name: "Cancel all duplicated 'Build Image' runs"
         # We find duplicates of all "Build Image" runs - due to a missing feature
         # in GitHub Actions, we have to use Job names to match Event/Repo/Branch matching
@@ -198,6 +187,7 @@ jobs:
       GITHUB_CONTEXT: ${{ toJson(github) }}
     outputs:
       pythonVersions: ${{ steps.selective-checks.python-versions }}
+      upgradeToLatestConstraints: ${{ steps.selective-checks.outputs.upgrade-to-latest-constraints }}
       allPythonVersions: ${{ steps.selective-checks.outputs.all-python-versions }}
       defaultPythonVersion: ${{ steps.selective-checks.outputs.default-python-version }}
       run-tests: ${{ steps.selective-checks.outputs.run-tests }}
@@ -243,12 +233,12 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ needs.cancel-workflow-runs.outputs.sourceEvent }}
-          INCOMING_COMMIT_SHA: ${{ needs.cancel-workflow-runs.outputs.targetCommitSha }}
+          TARGET_COMMIT_SHA: ${{ needs.cancel-workflow-runs.outputs.targetCommitSha }}
           PR_LABELS: ${{ needs.cancel-workflow-runs.outputs.pullRequestLabels }}
         run: |
           if [[ ${EVENT_NAME} == "pull_request" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
@@ -273,7 +263,7 @@ jobs:
       BACKEND: postgres
       PYTHON_MAJOR_MINOR_VERSION: ${{ matrix.python-version }}
       GITHUB_REGISTRY_PUSH_IMAGE_TAG: ${{ github.event.workflow_run.id }}
-      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.cancel-workflow-runs.outputs.upgradeToLatestConstraints }}
+      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.build-info.outputs.upgradeToLatestConstraints }}
       DOCKER_CACHE: ${{ needs.cancel-workflow-runs.outputs.cacheDirective }}
     steps:
       - name: >
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 5931135..77cbf65 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -35,7 +35,6 @@ env:
   SKIP_CHECK_REMOTE_IMAGE: "true"
   DB_RESET: "true"
   VERBOSE: "true"
-  UPGRADE_TO_LATEST_CONSTRAINTS: ${{ github.event_name == 'push' || github.event_name == 'scheduled' }}
   DOCKER_CACHE: "pulled"
   USE_GITHUB_REGISTRY: "true"
   GITHUB_REPOSITORY: ${{ github.repository }}
@@ -69,6 +68,7 @@ jobs:
       GITHUB_CONTEXT: ${{ toJson(github) }}
     outputs:
       waitForImage: ${{ steps.wait-for-image.outputs.wait-for-image }}
+      upgradeToLatestConstraints: ${{ steps.selective-checks.outputs.upgrade-to-latest-constraints }}
       pythonVersions: ${{ steps.selective-checks.outputs.python-versions }}
       pythonVersionsListAsString: ${{ steps.selective-checks.outputs.python-versions-list-as-string }}
       defaultPythonVersion: ${{ steps.selective-checks.outputs.default-python-version }}
@@ -131,12 +131,12 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ github.event_name }}
-          INCOMING_COMMIT_SHA: ${{ github.sha }}
+          TARGET_COMMIT_SHA: ${{ github.sha }}
           PR_LABELS: "${{ steps.source-run-info.outputs.pullRequestLabels }}"
         run: |
           if [[ ${EVENT_NAME} == "pull_request" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
@@ -150,6 +150,7 @@ jobs:
     if: needs.build-info.outputs.image-build == 'true'
     env:
       BACKEND: sqlite
+      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.build-info.outputs.upgradeToLatestConstraints }}
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v2
@@ -568,7 +569,8 @@ jobs:
     needs: [build-info]
     env:
       BACKEND: sqlite
-      PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}}
+      PYTHON_MAJOR_MINOR_VERSION: ${{ needs.build-info.outputs.defaultPythonVersion }}
+      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.build-info.outputs.upgradeToLatestConstraints }}
     if: needs.build-info.outputs.image-build == 'true'
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index 2bf92b7..9fa7b94 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -40,11 +40,11 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ github.event_name }}
-          INCOMING_COMMIT_SHA: ${{ github.sha }}
+          TARGET_COMMIT_SHA: ${{ github.sha }}
         run: |
           if [[ ${EVENT_NAME} == "pull_request" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
diff --git a/.github/workflows/label_when_reviewed_workflow_run.yml b/.github/workflows/label_when_reviewed_workflow_run.yml
index 6e45038..6ea15b0 100644
--- a/.github/workflows/label_when_reviewed_workflow_run.yml
+++ b/.github/workflows/label_when_reviewed_workflow_run.yml
@@ -75,12 +75,12 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ steps.source-run-info.outputs.sourceEvent }}
-          INCOMING_COMMIT_SHA: ${{ steps.source-run-info.outputs.targetCommitSha }}
+          TARGET_COMMIT_SHA: ${{ steps.source-run-info.outputs.targetCommitSha }}
           PR_LABELS: ${{ steps.source-run-info.outputs.pullRequestLabels }}
         run: |
           if [[ ${EVENT_NAME} == "pull_request_review" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 6d34026..61883e7 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -321,6 +321,11 @@ Step 4: Prepare PR
        the "full tests needed" label is set for your PR. Additional check is set that prevents from
        accidental merging of the request until full matrix of tests succeeds for the PR.
 
+     * when your change has "upgrade to latest dependencies" label set, constraints will be automatically
+       upgraded to latest constraints matching your setup.py. This is useful in case you want to force
+       upgrade to a latest version of dependencies. You can ask committers to set the label for you
+       when you need it in your PR.
+
    More details about the PR workflow be found in `PULL_REQUEST_WORKFLOW.rst <PULL_REQUEST_WORKFLOW.rst>`_.
 
 
diff --git a/scripts/ci/selective_ci_checks.sh b/scripts/ci/selective_ci_checks.sh
index 3c7132d..a6c66eb 100755
--- a/scripts/ci/selective_ci_checks.sh
+++ b/scripts/ci/selective_ci_checks.sh
@@ -34,16 +34,39 @@ if [[ ${PR_LABELS=} == *"full tests needed"* ]]; then
     echo
     echo "Found the right PR labels in '${PR_LABELS=}': 'full tests needed''"
     echo
-    FULL_TESTS_NEEDED="true"
+    FULL_TESTS_NEEDED_LABEL="true"
 else
     echo
     echo "Did not find the right PR labels in '${PR_LABELS=}': 'full tests needed'"
     echo
-    FULL_TESTS_NEEDED="false"
+    FULL_TESTS_NEEDED_LABEL="false"
+fi
+
+if [[ ${PR_LABELS=} == *"upgrade to latest dependencies"* ]]; then
+    echo
+    echo "Found the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies''"
+    echo
+    UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="true"
+else
+    echo
+    echo "Did not find the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies'"
+    echo
+    UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="false"
 fi
 
 function output_all_basic_variables() {
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ "${UPGRADE_TO_LATEST_CONSTRAINTS_LABEL}" == "true" ||
+            ${EVENT_NAME} == 'push' || ${EVENT_NAME} == "scheduled" ]]; then
+        # Trigger upgrading to latest constraints where label is set or when
+        # SHA of the merge commit triggers rebuilding layer in the docker image
+        # Each build that upgrades to latest constraints will get truly latest constraints, not those
+        # Cached in the image this way
+        initialization::ga_output upgrade-to-latest-constraints "${INCOMING_COMMIT_SHA}"
+    else
+        initialization::ga_output upgrade-to-latest-constraints "false"
+    fi
+
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output python-versions \
             "$(initialization::parameters_to_json "${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS[@]}")"
         initialization::ga_output all-python-versions \
@@ -60,7 +83,7 @@ function output_all_basic_variables() {
     fi
     initialization::ga_output default-python-version "${DEFAULT_PYTHON_MAJOR_MINOR_VERSION}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output kubernetes-versions \
             "$(initialization::parameters_to_json "${CURRENT_KUBERNETES_VERSIONS[@]}")"
     else
@@ -73,7 +96,7 @@ function output_all_basic_variables() {
         "$(initialization::parameters_to_json "${CURRENT_KUBERNETES_MODES[@]}")"
     initialization::ga_output default-kubernetes-mode "${KUBERNETES_MODE}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output postgres-versions \
             "$(initialization::parameters_to_json "${CURRENT_POSTGRES_VERSIONS[@]}")"
     else
@@ -82,7 +105,7 @@ function output_all_basic_variables() {
     fi
     initialization::ga_output default-postgres-version "${POSTGRES_VERSION}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output mysql-versions \
             "$(initialization::parameters_to_json "${CURRENT_MYSQL_VERSIONS[@]}")"
     else
@@ -100,7 +123,7 @@ function output_all_basic_variables() {
         "$(initialization::parameters_to_json "${CURRENT_HELM_VERSIONS[@]}")"
     initialization::ga_output default-helm-version "${HELM_VERSION}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output postgres-exclude '[{ "python-version": "3.6" }]'
         initialization::ga_output mysql-exclude '[{ "python-version": "3.7" }]'
         initialization::ga_output sqlite-exclude '[{ "python-version": "3.8" }]'
@@ -114,9 +137,6 @@ function output_all_basic_variables() {
 }
 
 function get_changed_files() {
-    INCOMING_COMMIT_SHA="${1}"
-    readonly INCOMING_COMMIT_SHA
-
     echo
     echo "Incoming commit SHA: ${INCOMING_COMMIT_SHA}"
     echo
@@ -414,14 +434,20 @@ if (($# < 1)); then
     echo
     echo "No Commit SHA - running all tests (likely direct master merge, or scheduled run)!"
     echo
-    # override FULL_TESTS_NEEDED in master/scheduled run
-    FULL_TESTS_NEEDED="true"
-    readonly FULL_TESTS_NEEDED
+    INCOMING_COMMIT_SHA=""
+    readonly INCOMING_COMMIT_SHA
+    # override FULL_TESTS_NEEDED_LABEL in master/scheduled run
+    FULL_TESTS_NEEDED_LABEL="true"
+    readonly FULL_TESTS_NEEDED_LABEL
     output_all_basic_variables
     set_outputs_run_everything_and_exit
+else
+    INCOMING_COMMIT_SHA="${1}"
+    readonly INCOMING_COMMIT_SHA
 fi
 
-readonly FULL_TESTS_NEEDED
+
+readonly FULL_TESTS_NEEDED_LABEL
 output_all_basic_variables
 
 image_build_needed="false"
@@ -429,7 +455,7 @@ docs_build_needed="false"
 tests_needed="false"
 kubernetes_tests_needed="false"
 
-get_changed_files "${1}"
+get_changed_files
 run_all_tests_if_environment_files_changed
 check_if_docs_should_be_generated
 check_if_helm_tests_should_be_run

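The new `upgrade-to-latest-constraints` output introduced in the diff above is either the incoming commit SHA (forcing an upgraded, cache-busted rebuild) or the literal string `false`. A simplified standalone sketch of that decision (values and variable names here are illustrative only):

```shell
# Simplified sketch of the label/event gate added to selective_ci_checks.sh.
PR_LABELS='"upgrade to latest dependencies", "area:dev"'   # illustrative value
EVENT_NAME="pull_request"
TARGET_COMMIT_SHA="deadbeef"                               # illustrative SHA

case "${PR_LABELS}" in
    *"upgrade to latest dependencies"*) label_set="true" ;;
    *) label_set="false" ;;
esac

if [ "${label_set}" = "true" ] || [ "${EVENT_NAME}" = "push" ] \
        || [ "${EVENT_NAME}" = "scheduled" ]; then
    # Emitting the SHA rather than a constant busts the docker layer cache,
    # so the build truly re-resolves constraints instead of reusing cached ones.
    upgrade="${TARGET_COMMIT_SHA}"
else
    upgrade="false"
fi
echo "${upgrade}"   # prints "deadbeef" here, since the label is present
```

This is why the diff passes the SHA through rather than a boolean: two pushes with the label set must not share the constraint-resolution layer in the image cache.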

[airflow] 18/18: Update setup.py to get non-conflicting set of dependencies (#12636)


potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 07f28caf436f465b085d89c8becc3915ec2bc8ed
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Nov 29 19:45:58 2020 +0100

    Update setup.py to get non-conflicting set of dependencies (#12636)
    
    This change upgrades setup.py and setup.cfg to provide non-conflicting
    `pip check` valid set of constraints for CI image.
---
 BREEZE.rst                                       |  3 ++
 CI.rst                                           |  2 +-
 CONTRIBUTING.rst                                 |  2 +-
 breeze                                           |  2 +
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh   |  1 -
 scripts/ci/images/ci_wait_for_all_ci_images.sh   | 36 ++--------------
 scripts/ci/images/ci_wait_for_all_prod_images.sh | 38 ++---------------
 scripts/ci/images/ci_wait_for_ci_image.sh        | 52 ++++++++++++++++++++++++
 scripts/ci/images/ci_wait_for_prod_image.sh      | 52 ++++++++++++++++++++++++
 scripts/ci/libraries/_build_images.sh            | 43 ++++++++++----------
 scripts/ci/libraries/_push_pull_remove_images.sh | 44 +++++++++++++++++---
 scripts/ci/selective_ci_checks.sh                |  6 +--
 setup.py                                         | 42 +++++++++++++------
 13 files changed, 209 insertions(+), 114 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index f91b598..095fe1b 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1355,6 +1355,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
           If you use this flag, automatically --github-registry is enabled.
 
+
           Default: latest.
 
   -v, --verbose
@@ -1508,6 +1509,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
           If you use this flag, automatically --github-registry is enabled.
 
+
           Default: latest.
 
   -v, --verbose
@@ -2276,6 +2278,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
           If you use this flag, automatically --github-registry is enabled.
 
+
           Default: latest.
 
   ****************************************************************************************************
diff --git a/CI.rst b/CI.rst
index f4b5294..fac9f0f 100644
--- a/CI.rst
+++ b/CI.rst
@@ -253,7 +253,7 @@ You can use those variables when you try to reproduce the build locally.
 |                                                        Image build variables                                                       |
 +-----------------------------------------+-------------+-------------+------------+-------------------------------------------------+
 | ``UPGRADE_TO_LATEST_CONSTRAINTS``       |    false    |    false    |    false   | Determines whether the build should             |
-|                                         |             |             |     (x)    | attempt to eagerly upgrade all                  |
+|                                         |             |             |     (x)    | attempt to upgrade all                          |
 |                                         |             |             |            | PIP dependencies to latest ones matching        |
 |                                         |             |             |            | ``setup.py`` limits. This tries to replicate    |
 |                                         |             |             |            | the situation of "fresh" user who just installs |
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 61883e7..0c1c9c1 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -321,7 +321,7 @@ Step 4: Prepare PR
        the "full tests needed" label is set for your PR. Additional check is set that prevents from
        accidental merging of the request until full matrix of tests succeeds for the PR.
 
-     * when your change has "upgrade to latest dependencies" label set, constraints will be automatically
+     * when your change has "upgrade to newer dependencies" label set, constraints will be automatically
        upgraded to latest constraints matching your setup.py. This is useful in case you want to force
        upgrade to a latest version of dependencies. You can ask committers to set the label for you
        when you need it in your PR.
diff --git a/breeze b/breeze
index fe8f038..6f73ad7 100755
--- a/breeze
+++ b/breeze
@@ -1078,6 +1078,7 @@ function breeze::parse_arguments() {
             echo
             echo "Force pulling the image, using github registry and skip mounting local sources."
             echo "This is in order to get the exact same version as used in CI environment for SHA/RUN_ID!."
+            echo "You can specify --skip-mounting-local-sources to not mount local sources. "
             echo
             export FORCE_PULL_IMAGES="true"
             export USE_GITHUB_REGISTRY="true"
@@ -2385,6 +2386,7 @@ function breeze::flag_pull_push_docker_images() {
 
         If you use this flag, automatically --github-registry is enabled.
 
+
         Default: ${_breeze_default_github_image_id:=}.
 
 "
diff --git a/scripts/ci/images/ci_prepare_ci_image_on_ci.sh b/scripts/ci/images/ci_prepare_ci_image_on_ci.sh
index e2637c3..4d2a1ce 100755
--- a/scripts/ci/images/ci_prepare_ci_image_on_ci.sh
+++ b/scripts/ci/images/ci_prepare_ci_image_on_ci.sh
@@ -59,5 +59,4 @@ function build_ci_image_on_ci() {
     export CHECK_IMAGE_FOR_REBUILD="false"
 }
 
-
 build_ci_image_on_ci
diff --git a/scripts/ci/images/ci_wait_for_all_ci_images.sh b/scripts/ci/images/ci_wait_for_all_ci_images.sh
index edb6b29..2451a88 100755
--- a/scripts/ci/images/ci_wait_for_all_ci_images.sh
+++ b/scripts/ci/images/ci_wait_for_all_ci_images.sh
@@ -15,42 +15,12 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-export AIRFLOW_SOURCES="${AIRFLOW_SOURCES:=$( cd "$( dirname "${BASH_SOURCE[0]}" )/../../.." && pwd )}"
 echo
-echo "Airflow sources: ${AIRFLOW_SOURCES}"
+echo "Waiting for all CI images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
 echo
 
-if [[ ${USE_GITHUB_REGISTRY} != "true" ||  ${GITHUB_REGISTRY_WAIT_FOR_IMAGE} != "true" ]]; then
-    echo
-    echo "This script should not be called"
-    echo "It need both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE to true!"
-    echo
-    echo "USE_GITHUB_REGISTRY = ${USE_GITHUB_REGISTRY}"
-    echo "GITHUB_REGISTRY_WAIT_FOR_IMAGE =${GITHUB_REGISTRY_WAIT_FOR_IMAGE}"
-    echo
-    exit 1
-fi
-
-echo
-echo "Waiting for all images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
-echo
-
-echo
-echo "Check if jq is installed"
-echo
-command -v jq >/dev/null || (echo "ERROR! You must have 'jq' tool installed!" && exit 1)
-
-echo
-echo "The jq version $(jq --version)"
-echo
-
-# shellcheck source=scripts/ci/libraries/_all_libs.sh
-source "${AIRFLOW_SOURCES}/scripts/ci/libraries/_all_libs.sh"
-
-initialization::initialize_common_environment
-
 for PYTHON_MAJOR_MINOR_VERSION in ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}
 do
-    export AIRFLOW_CI_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
-    push_pull_remove_images::wait_for_github_registry_image "${AIRFLOW_CI_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+    export PYTHON_MAJOR_MINOR_VERSION
+    "$( dirname "${BASH_SOURCE[0]}" )/ci_wait_for_ci_image.sh"
 done
diff --git a/scripts/ci/images/ci_wait_for_all_prod_images.sh b/scripts/ci/images/ci_wait_for_all_prod_images.sh
index 66196c3..25bfd7c 100755
--- a/scripts/ci/images/ci_wait_for_all_prod_images.sh
+++ b/scripts/ci/images/ci_wait_for_all_prod_images.sh
@@ -15,44 +15,12 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-export AIRFLOW_SOURCES="${AIRFLOW_SOURCES:=$( cd "$( dirname "${BASH_SOURCE[0]}" )/../../.." && pwd )}"
 echo
-echo "Airflow sources: ${AIRFLOW_SOURCES}"
+echo "Waiting for all PROD images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
 echo
 
-if [[ ${USE_GITHUB_REGISTRY} != "true" ||  ${GITHUB_REGISTRY_WAIT_FOR_IMAGE} != "true" ]]; then
-    echo
-    echo "This script should not be called"
-    echo "It need both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE to true!"
-    echo
-    echo "USE_GITHUB_REGISTRY = ${USE_GITHUB_REGISTRY}"
-    echo "GITHUB_REGISTRY_WAIT_FOR_IMAGE =${GITHUB_REGISTRY_WAIT_FOR_IMAGE}"
-    echo
-    exit 1
-fi
-
-echo
-echo "Waiting for all images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
-echo
-
-echo
-echo "Check if jq is installed"
-echo
-command -v jq >/dev/null || (echo "ERROR! You must have 'jq' tool installed!" && exit 1)
-
-echo
-echo "The jq version $(jq --version)"
-echo
-
-# shellcheck source=scripts/ci/libraries/_all_libs.sh
-source "${AIRFLOW_SOURCES}/scripts/ci/libraries/_all_libs.sh"
-
-initialization::initialize_common_environment
-
 for PYTHON_MAJOR_MINOR_VERSION in ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}
 do
-    export AIRFLOW_PROD_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}"
-    export AIRFLOW_PROD_BUILD_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-build"
-    push_pull_remove_images::wait_for_github_registry_image "${AIRFLOW_PROD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-    push_pull_remove_images::wait_for_github_registry_image "${AIRFLOW_PROD_BUILD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+    export PYTHON_MAJOR_MINOR_VERSION
+    "$( dirname "${BASH_SOURCE[0]}" )/ci_wait_for_prod_image.sh"
 done
diff --git a/scripts/ci/images/ci_wait_for_ci_image.sh b/scripts/ci/images/ci_wait_for_ci_image.sh
new file mode 100755
index 0000000..f0b5058
--- /dev/null
+++ b/scripts/ci/images/ci_wait_for_ci_image.sh
@@ -0,0 +1,52 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+
+function verify_ci_image_dependencies {
+    echo
+    echo "Checking if Airflow dependencies are non-conflicting in CI image."
+    echo
+
+    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_CI_IMAGE}" \
+        "${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+    # TODO: remove the "|| true" once we fix pip check for the CI image
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_CI_IMAGE}" -c 'pip check'  || true
+}
+
+push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
+
+push_pull_remove_images::check_if_jq_installed
+
+build_image::login_to_github_registry_if_needed
+
+export AIRFLOW_CI_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
+
+echo
+echo "Waiting for image to appear: ${AIRFLOW_CI_IMAGE_NAME}"
+echo
+
+push_pull_remove_images::wait_for_github_registry_image \
+    "${AIRFLOW_CI_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+echo
+echo "Verifying the ${AIRFLOW_CI_IMAGE_NAME} image after pulling it"
+echo
+
+verify_ci_image_dependencies
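The verification step above pulls the freshly built image and runs `pip check` inside it, tolerating failures for now. A minimal Python sketch of the same pattern follows; the `pip_check` helper and its `docker` parameter are illustrative stand-ins, not part of the scripts:

```python
import subprocess

def pip_check(image, docker="docker"):
    """Run `pip check` inside an image and report whether the installed
    dependencies are consistent. Failures are reported via the return
    value rather than raised, mirroring the tolerated failure above."""
    proc = subprocess.run(
        [docker, "run", "--rm", "--entrypoint", "/bin/bash",
         image, "-c", "pip check"],
    )
    return proc.returncode == 0
```

Once the known conflicts are resolved, callers could treat a `False` result as a hard error instead of logging and moving on.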
diff --git a/scripts/ci/images/ci_wait_for_prod_image.sh b/scripts/ci/images/ci_wait_for_prod_image.sh
new file mode 100755
index 0000000..e53aec1
--- /dev/null
+++ b/scripts/ci/images/ci_wait_for_prod_image.sh
@@ -0,0 +1,52 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+
+function verify_prod_image_dependencies {
+    echo
+    echo "Checking if Airflow dependencies are non-conflicting in PROD image."
+    echo
+
+    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_PROD_IMAGE}" \
+        "${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+    # TODO: remove the "|| true" once we fix pip check for the PROD image
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c 'pip check' || true
+}
+
+push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
+
+push_pull_remove_images::check_if_jq_installed
+
+build_image::login_to_github_registry_if_needed
+
+export AIRFLOW_PROD_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}"
+
+echo
+echo "Waiting for image to appear: ${AIRFLOW_PROD_IMAGE_NAME}"
+echo
+
+push_pull_remove_images::wait_for_github_registry_image \
+    "${AIRFLOW_PROD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+echo
+echo "Verifying the ${AIRFLOW_PROD_IMAGE_NAME} image after pulling it"
+echo
+
+verify_prod_image_dependencies
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 5bd2d06..8de58db 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -346,13 +346,19 @@ function build_images::get_docker_image_names() {
 
     # File that is touched when the CI image is built for the first time locally
     export BUILT_CI_IMAGE_FLAG_FILE="${BUILD_CACHE_DIR}/${BRANCH_NAME}/.built_${PYTHON_MAJOR_MINOR_VERSION}"
+
+    # GitHub Registry names must be lowercase :(
+    github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
+    export GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}"
+    export GITHUB_REGISTRY_AIRFLOW_PROD_BUILD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}-build"
+    export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
+
+    export GITHUB_REGISTRY_AIRFLOW_CI_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_CI_BASE_TAG}"
+    export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
 }
 
-# Prepares all variables needed by the CI build. Depending on the configuration used (python version
-# DockerHub user etc. the variables are set so that other functions can use those variables.
-function build_images::prepare_ci_build() {
-    export AIRFLOW_CI_LOCAL_MANIFEST_IMAGE="local/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
-    export AIRFLOW_CI_REMOTE_MANIFEST_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
+# If GitHub Registry is used, login to the registry using GITHUB_USERNAME and GITHUB_TOKEN
+function build_image::login_to_github_registry_if_needed()  {
     if [[ ${USE_GITHUB_REGISTRY} == "true" ]]; then
         if [[ -n ${GITHUB_TOKEN=} ]]; then
             echo "${GITHUB_TOKEN}" | docker login \
@@ -360,11 +366,15 @@ function build_images::prepare_ci_build() {
                 --password-stdin \
                 "${GITHUB_REGISTRY}"
         fi
-        # GitHub Registry names must be lowercase :(
-        github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
-        export GITHUB_REGISTRY_AIRFLOW_CI_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_CI_BASE_TAG}"
-        export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
     fi
+
+}
+
+# Prepares all variables needed by the CI build. Depending on the configuration used (python version
+# DockerHub user etc. the variables are set so that other functions can use those variables.
+function build_images::prepare_ci_build() {
+    export AIRFLOW_CI_LOCAL_MANIFEST_IMAGE="local/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
+    export AIRFLOW_CI_REMOTE_MANIFEST_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
     export THE_IMAGE_TYPE="CI"
     export IMAGE_DESCRIPTION="Airflow CI"
 
@@ -375,6 +385,7 @@ function build_images::prepare_ci_build() {
     export AIRFLOW_IMAGE="${AIRFLOW_CI_IMAGE}"
     readonly AIRFLOW_IMAGE
 
+    build_image::login_to_github_registry_if_needed
     sanity_checks::go_to_airflow_sources
     permissions::fix_group_permissions
 }
@@ -662,19 +673,7 @@ function build_images::prepare_prod_build() {
     export AIRFLOW_IMAGE="${AIRFLOW_PROD_IMAGE}"
     readonly AIRFLOW_IMAGE
 
-    if [[ ${USE_GITHUB_REGISTRY="false"} == "true" ]]; then
-        if [[ -n ${GITHUB_TOKEN=} ]]; then
-            echo "${GITHUB_TOKEN}" | docker login \
-                --username "${GITHUB_USERNAME}" \
-                --password-stdin \
-                "${GITHUB_REGISTRY}"
-        fi
-        # GitHub Registry names must be lowercase :(
-        github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
-        export GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}"
-        export GITHUB_REGISTRY_AIRFLOW_PROD_BUILD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}-build"
-        export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
-    fi
+    build_image::login_to_github_registry_if_needed
 
     AIRFLOW_BRANCH_FOR_PYPI_PRELOADING="${BRANCH_NAME}"
     sanity_checks::go_to_airflow_sources
diff --git a/scripts/ci/libraries/_push_pull_remove_images.sh b/scripts/ci/libraries/_push_pull_remove_images.sh
index 7c65db1..216e025 100644
--- a/scripts/ci/libraries/_push_pull_remove_images.sh
+++ b/scripts/ci/libraries/_push_pull_remove_images.sh
@@ -264,13 +264,18 @@ function push_pull_remove_images::push_prod_images() {
 
 # waits for an image to be available in the github registry
 function push_pull_remove_images::wait_for_github_registry_image() {
+    local github_repository_lowercase
     github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
-    GITHUB_API_ENDPOINT="https://${GITHUB_REGISTRY}/v2/${github_repository_lowercase}"
-    IMAGE_NAME="${1}"
-    IMAGE_TAG=${2}
-    echo "Waiting for ${IMAGE_NAME}:${IMAGE_TAG} image"
+    local github_api_endpoint
+    github_api_endpoint="https://${GITHUB_REGISTRY}/v2/${github_repository_lowercase}"
+    local image_name_in_github_registry="${1}"
+    local image_tag_in_github_registry=${2}
 
-    GITHUB_API_CALL="${GITHUB_API_ENDPOINT}/${IMAGE_NAME}/manifests/${IMAGE_TAG}"
+    echo
+    echo "Waiting for ${GITHUB_REPOSITORY}/${image_name_in_github_registry}:${image_tag_in_github_registry} image"
+    echo
+
+    GITHUB_API_CALL="${github_api_endpoint}/${image_name_in_github_registry}/manifests/${image_tag_in_github_registry}"
     while true; do
         curl -X GET "${GITHUB_API_CALL}" -u "${GITHUB_USERNAME}:${GITHUB_TOKEN}" 2>/dev/null > "${OUTPUT_LOG}"
         local digest
@@ -282,6 +287,33 @@ function push_pull_remove_images::wait_for_github_registry_image() {
         fi
         sleep 10
     done
-    verbosity::print_info "Found ${IMAGE_NAME}:${IMAGE_TAG} image"
+    verbosity::print_info "Found ${image_name_in_github_registry}:${image_tag_in_github_registry} image"
     verbosity::print_info "Digest: '${digest}'"
 }
+
+function push_pull_remove_images::check_if_github_registry_wait_for_image_enabled() {
+    if [[ ${USE_GITHUB_REGISTRY} != "true" ||  ${GITHUB_REGISTRY_WAIT_FOR_IMAGE} != "true" ]]; then
+        echo
+        echo "This script should not be called."
+        echo "It needs both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE set to true!"
+        echo
+        echo "USE_GITHUB_REGISTRY = ${USE_GITHUB_REGISTRY}"
+        echo "GITHUB_REGISTRY_WAIT_FOR_IMAGE = ${GITHUB_REGISTRY_WAIT_FOR_IMAGE}"
+        echo
+        exit 1
+    else
+        echo
+        echo "Both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE are set to true. Good!"
+    fi
+}
+
+function push_pull_remove_images::check_if_jq_installed() {
+    echo
+    echo "Checking if jq is installed"
+    echo
+    command -v jq >/dev/null || { echo "ERROR! You must have the 'jq' tool installed!"; exit 1; }
+
+    echo
+    echo "Using jq version: $(jq --version)"
+    echo
+}
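The `wait_for_github_registry_image` loop above boils down to one pattern: query the manifest endpoint, try to parse a digest, and retry every 10 seconds until one appears. A hedged Python sketch of that retry shape follows; the `fetch_manifest` callable is an assumed stand-in for the curl-plus-jq pipeline, not a real API:

```python
import time

def wait_for_image(fetch_manifest, interval=10, max_attempts=None):
    # fetch_manifest() should return the image digest string once the
    # image has been pushed, or None/"" while it is still missing.
    attempts = 0
    while True:
        digest = fetch_manifest()
        if digest:
            return digest
        attempts += 1
        if max_attempts is not None and attempts >= max_attempts:
            raise TimeoutError("image never appeared in the registry")
        time.sleep(interval)
```

The optional `max_attempts` bound is an addition the bash loop does not have; the script relies on the CI job timeout instead.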
diff --git a/scripts/ci/selective_ci_checks.sh b/scripts/ci/selective_ci_checks.sh
index c87ec41..8696d56 100755
--- a/scripts/ci/selective_ci_checks.sh
+++ b/scripts/ci/selective_ci_checks.sh
@@ -42,14 +42,14 @@ else
     FULL_TESTS_NEEDED_LABEL="false"
 fi
 
-if [[ ${PR_LABELS=} == *"upgrade to latest dependencies"* ]]; then
+if [[ ${PR_LABELS=} == *"upgrade to newer dependencies"* ]]; then
     echo
-    echo "Found the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies''"
+    echo "Found the right PR labels in '${PR_LABELS=}': 'upgrade to newer dependencies''"
     echo
     UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="true"
 else
     echo
-    echo "Did not find the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies'"
+    echo "Did not find the right PR labels in '${PR_LABELS=}': 'upgrade to newer dependencies'"
     echo
     UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="false"
 fi
diff --git a/setup.py b/setup.py
index f5f2a53..10ac240 100644
--- a/setup.py
+++ b/setup.py
@@ -182,11 +182,13 @@ atlas = [
     'atlasclient>=0.1.2',
 ]
 aws = [
-    'boto3~=1.10',
+    'boto3~=1.10,<1.11',  # required by snowflake
 ]
 azure_blob_storage = [
     'azure-storage>=0.34.0, <0.37.0',
-    'azure-storage-blob<12.0',
+    'azure-storage-blob<12.0.0;python_version<"3.6"',
+    'azure-storage-blob;python_version>="3.6"',
+    'azure-storage-common',
 ]
 azure_container_instances = [
     'azure-mgmt-containerinstance>=1.5.0,<2'
@@ -198,6 +200,7 @@ azure_data_lake = [
     'azure-datalake-store>=0.0.45',
     'azure-mgmt-datalake-store>=0.5.0',
     'azure-mgmt-resource>=2.2.0',
+    'cffi<1.14.0;python_version<"3.0"'
 ]
 azure_secrets = [
     'azure-identity>=1.3.1',
@@ -207,7 +210,8 @@ cassandra = [
     'cassandra-driver>=3.13.0,<3.21.0',
 ]
 celery = [
-    'celery~=4.3',
+    'celery~=4.3;python_version>="3.0"',
+    'celery==4.3.1;python_version<"3.0"',
     'flower>=0.7.3, <1.0',
     'kombu==4.6.3;python_version<"3.0"',
     'tornado>=4.2.0, <6.0',  # Dep of flower. Pin to a version that works on Py3.5.2
@@ -222,7 +226,8 @@ cloudant = [
 crypto = [
     # Cryptography 3.2 for python 2.7 is broken
     # https://github.com/pyca/cryptography/issues/5359#issuecomment-727622403
-    'cryptography>=0.9.3,<3.2; python_version<"3.0"',
+    # Snowflake requires <3.0
+    'cryptography>=0.9.3,<3.0; python_version<"3.0"',
     'cryptography>=0.9.3;python_version>="3.0"',
 ]
 dask = [
@@ -260,7 +265,8 @@ flask_oauth = [
     'requests-oauthlib==1.1.0',
 ]
 gcp = [
-    'PyOpenSSL',
+    'PyOpenSSL<20.0.0;python_version<"3.0"',
+    'PyOpenSSL;python_version>="3.0"',
     'google-api-python-client>=1.6.0, <2.0.0',
     'google-auth>=1.0.0, <2.0.0',
     'google-auth-httplib2>=0.0.1',
@@ -336,7 +342,9 @@ papermill = [
     'papermill[all]>=1.0.0',
     'nteract-scrapbook[all]>=0.2.1',
     'pyarrow<1.0.0',
-    'fsspec<0.8.0;python_version=="3.5"'
+    'fsspec<0.8.0;python_version=="3.5"',
+    'black==20.8b0;python_version>="3.6"'  # we need to limit black version as we have click < 7
+
 ]
 password = [
     'bcrypt>=2.0.0',
@@ -355,7 +363,7 @@ qds = [
     'qds-sdk>=1.10.4',
 ]
 rabbitmq = [
-    'amqp',
+    'amqp<5.0.0',
 ]
 redis = [
     'redis~=3.2',
@@ -378,6 +386,7 @@ sentry = [
 ]
 slack = [
     'slackclient>=1.0.0,<2.0.0',
+    'websocket-client<0.55.0'
 ]
 snowflake = [
     'snowflake-connector-python>=1.5.2',
@@ -421,11 +430,14 @@ devel = [
     'click==6.7',
     'contextdecorator;python_version<"3.4"',
     'coverage',
+    'docutils>=0.14, <0.16',
+    'ecdsa<0.15',  # Required for moto 1.3.14
     'flake8>=3.6.0',
     'flake8-colors',
     'flaky',
     'freezegun',
     'gitpython',
+    'idna<2.9',  # Required for moto 1.3.14
     'importlib-metadata~=2.0; python_version<"3.8"',
     'ipdb',
     'jira',
@@ -436,14 +448,15 @@ devel = [
     'packaging',
     'parameterized',
     'paramiko',
+    'pipdeptree',
     'pre-commit',
+    'pyrsistent<=0.16.0;python_version<"3.0"',
+    'pyrsistent;python_version>="3.0"',
     'pysftp',
     'pytest<6.0.0',  # FIXME: pylint complaining for pytest.mark.* on v6.0
     'pytest-cov',
     'pytest-instafail',
-    'pytest-rerunfailures',
     'pytest-timeouts',
-    'pytest-xdist',
     'pywinrm',
     'qds-sdk>=1.9.6',
     'requests_mock',
@@ -590,6 +603,8 @@ INSTALL_REQUIREMENTS = [
     'colorlog==4.0.2',
     'configparser>=3.5.0, <3.6.0',
     'croniter>=0.3.17, <0.4',
+    'cryptography>=0.9.3,<3.0; python_version<"3.0"',  # required by snowflake
+    'cryptography>=0.9.3;python_version>="3.0"',
     'dill>=0.2.2, <0.4',
     'email-validator',
     'enum34~=1.1.6;python_version<"3.4"',
@@ -606,11 +621,12 @@ INSTALL_REQUIREMENTS = [
     'graphviz>=0.12',
     'gunicorn>=19.5.0, <21.0',
     'importlib-metadata~=2.0; python_version<"3.8"',
+    'importlib_resources~=1.4',
     'iso8601>=0.1.12',
     'jinja2>=2.10.1, <2.12.0',
     'json-merge-patch==0.2',
     'jsonschema~=3.0',
-    'lazy_object_proxy~=1.3',
+    'lazy_object_proxy<1.5.0',  # Required to keep pip-check happy with astroid
     'markdown>=2.5.2, <3.0',
     'marshmallow-sqlalchemy>=0.16.1, <0.24.0;python_version>="3.6"',
     'marshmallow-sqlalchemy>=0.16.1, <0.19.0;python_version<"3.6"',
@@ -624,14 +640,16 @@ INSTALL_REQUIREMENTS = [
     'python-dateutil>=2.3, <3',
     'python-nvd3~=0.15.0',
     'python-slugify>=3.0.0,<5.0',
-    'requests>=2.20.0, <3',
+    'requests>=2.20.0, <2.23.0;python_version<"3.0"',  # Required to keep snowflake happy
+    'requests>=2.20.0, <2.24.0;python_version>="3.0"',  # Required to keep snowflake happy
     'setproctitle>=1.1.8, <2',
     'sqlalchemy~=1.3',
     'sqlalchemy_jsonfield==0.8.0;python_version<"3.5"',
     'sqlalchemy_jsonfield~=0.9;python_version>="3.5"',
     'tabulate>=0.7.5, <0.9',
     'tenacity==4.12.0',
-    'thrift>=0.9.2',
+    'thrift>=0.9.2;python_version>="3.0"',
+    'thrift==0.9.3;python_version<"3.0"',  # required by thrift_sasl on Python 2
     'typing;python_version<"3.5"',
     'typing-extensions>=3.7.4;python_version<"3.8"',
     'tzlocal>=1.4,<2.0.0',


[airflow] 08/18: Fix wait-for-migrations command in helm chart (#12522)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9f90eb0630ed440b764524b7177ad8ddc9d51b1c
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Sat Nov 21 10:00:02 2020 +0000

    Fix wait-for-migrations command in helm chart (#12522)
    
    If the migrations weren't yet applied this would fail with `NameError:
    name 'log' is not defined`. (I guess no one really noticed as the
    container would restart, and try again.)
    
    (cherry picked from commit 3188b130b5f61332e24c340ac6c0569efa4e8056)
---
 chart/templates/_helpers.yaml | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index 98efc9f..530b1d0 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -367,6 +367,7 @@ server_tls_key_file = /etc/pgbouncer/server.key
   - -c
   - |
         import airflow
+        import logging
         import os
         import time
 
@@ -399,7 +400,7 @@ server_tls_key_file = /etc/pgbouncer/server.key
                     raise TimeoutError("There are still unapplied migrations after {} seconds.".format(ticker))
                 ticker += 1
                 time.sleep(1)
-                log.info('Waiting for migrations... %s second(s)', ticker)
+                logging.info('Waiting for migrations... %s second(s)', ticker)
 {{- end }}
 
 {{ define "registry_docker_config" -}}
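The one-line fix above replaces an undefined `log` name with the stdlib `logging` module. A minimal standalone reproduction of the waiting loop follows; the `migrations_applied` callable is a hypothetical stand-in for the chart's Alembic revision check:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)

def wait_for_migrations(migrations_applied, timeout=60, delay=1):
    ticker = 0
    while not migrations_applied():
        if ticker >= timeout:
            raise TimeoutError(
                "There are still unapplied migrations after {} seconds.".format(ticker))
        ticker += 1
        time.sleep(delay)
        # Before the fix, this line raised NameError because `log` was never defined.
        logging.info('Waiting for migrations... %s second(s)', ticker)
```

Since the container exits and restarts on the NameError, the bug only cost a restart cycle rather than blocking deployments outright, which is likely why it went unnoticed.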


[airflow] 10/18: Adds missing licence headers (#12593)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 49d052b0e781c5c03d038e7307c76f8d0ffe84a5
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Wed Nov 25 00:58:01 2020 +0100

    Adds missing licence headers (#12593)
    
    (cherry picked from commit 58e21ed949203a7ac79bf96c72b917796c5f4d21)
---
 .pre-commit-config.yaml                  |  2 +-
 scripts/ci/dockerfiles/bats/Dockerfile   | 17 +++++++++++++++++
 scripts/ci/dockerfiles/stress/Dockerfile | 17 +++++++++++++++++
 3 files changed, 35 insertions(+), 1 deletion(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 2e27e50..4c6b733 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -46,7 +46,7 @@ repos:
           - license-templates/LICENSE.txt
           - --fuzzy-match-generates-todo
         files: >
-          \.properties$|\.cfg$|\.conf$|\.ini$|\.ldif$|\.readthedocs$|\.service$|\.tf$|^Dockerfile.*$
+          \.properties$|\.cfg$|\.conf$|\.ini$|\.ldif$|\.readthedocs$|\.service$|\.tf$|Dockerfile.*$
       - id: insert-license
         name: Add license for all rst files
         exclude: ^\.github/.*$
diff --git a/scripts/ci/dockerfiles/bats/Dockerfile b/scripts/ci/dockerfiles/bats/Dockerfile
index 01db50d..af21f4d 100644
--- a/scripts/ci/dockerfiles/bats/Dockerfile
+++ b/scripts/ci/dockerfiles/bats/Dockerfile
@@ -1,3 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck disable=SC1091
 FROM debian:buster-slim
 
 ARG BATS_VERSION
diff --git a/scripts/ci/dockerfiles/stress/Dockerfile b/scripts/ci/dockerfiles/stress/Dockerfile
index 3041d21..92df101 100644
--- a/scripts/ci/dockerfiles/stress/Dockerfile
+++ b/scripts/ci/dockerfiles/stress/Dockerfile
@@ -1,3 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck disable=SC1091
 ARG ALPINE_VERSION="3.12"
 
 FROM alpine:${ALPINE_VERSION}


[airflow] 14/18: Allows mounting local sources for github run-id images (#12650)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e7a395daa49452ea19c1ce8d3a31b6f86735b281
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Fri Nov 27 12:15:03 2020 +0100

    Allows mounting local sources for github run-id images (#12650)
    
    The images that are built on GitHub can be used to reproduce
    the test errors seen in CI - they should then be run without
    local sources mounted. However, in some cases, for example when
    you are dealing with dependencies, it is useful to be able to
    mount the sources.
    
    This PR makes it possible.
    
    (cherry picked from commit c0843930bf5c587a054586706021a2f5b492ec42)
---
 breeze                               | 1 -
 scripts/in_container/run_ci_tests.sh | 8 ++++----
 2 files changed, 4 insertions(+), 5 deletions(-)

diff --git a/breeze b/breeze
index 0c09046..fe8f038 100755
--- a/breeze
+++ b/breeze
@@ -1085,7 +1085,6 @@ function breeze::parse_arguments() {
             export GITHUB_REGISTRY_PUSH_IMAGE_TAG="${2}"
             export CHECK_IMAGE_FOR_REBUILD="false"
             export SKIP_BUILDING_PROD_IMAGE="true"
-            export MOUNT_LOCAL_SOURCES="false"
             export SKIP_CHECK_REMOTE_IMAGE="true"
             export FAIL_ON_GITHUB_DOCKER_PULL_ERROR="true"
             shift 2
diff --git a/scripts/in_container/run_ci_tests.sh b/scripts/in_container/run_ci_tests.sh
index 8b66a94..7f2be4c 100755
--- a/scripts/in_container/run_ci_tests.sh
+++ b/scripts/in_container/run_ci_tests.sh
@@ -52,22 +52,22 @@ elif [[ "${RES}" != "0" ]]; then
     >&2 echo "*"
     >&2 echo "*     Run all tests:"
     >&2 echo "*"
-    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE}  tests"
+    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE}  tests"
     >&2 echo "*"
     >&2 echo "*     Enter docker shell:"
     >&2 echo "*"
-    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE}  shell"
+    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE}  shell"
     >&2 echo "*"
     if [[ ${GITHUB_REGISTRY_PULL_IMAGE_TAG=} != "" ]]; then
         >&2 echo "*   When you do not have sources:"
         >&2 echo "*"
         >&2 echo "*     Run all tests:"
         >&2 echo "*"
-        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE} tests"
+        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE} tests"
         >&2 echo "*"
         >&2 echo "*     Enter docker shell:"
         >&2 echo "*"
-        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE} shell"
+        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE} shell"
         >&2 echo "*"
     fi
     >&2 echo "*"


[airflow] 09/18: Fixes unneeded docker-context-files added in CI (#12534)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 08d9e0eeac1b6fd2af5db3d9b30b0b40f29cd1e8
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Nov 21 19:21:43 2020 +0100

    Fixes unneeded docker-context-files added in CI (#12534)
    
    We do not need to add docker-context-files in CI before we run the
    first "cache" PIP installation. Adding them earlier might cause the
    cache to always be invalidated when someone has a file added there
    before building and pushing the image.
    
    This PR fixes the problem by adding docker-context-files later
    in the Dockerfile and changing the constraints location
    used in the "cache" step to always use the GitHub constraints in
    this case.
    
    Closes #12509
    
    (cherry picked from commit 37548f09acb91edd041565f52051f58610402cb3)
---
 Dockerfile                            |  3 ++-
 Dockerfile.ci                         | 11 ++++++-----
 IMAGES.rst                            |  9 ++++++++-
 scripts/ci/libraries/_build_images.sh | 35 +++++++++++++++--------------------
 4 files changed, 31 insertions(+), 27 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 8ad3db9..00442bc 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -176,7 +176,8 @@ RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
        fi; \
        pip install --user \
           "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-          --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}" && pip uninstall --yes apache-airflow; \
+          --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+          && pip uninstall --yes apache-airflow; \
     fi
 
 ARG AIRFLOW_SOURCES_FROM="."
diff --git a/Dockerfile.ci b/Dockerfile.ci
index aa426ef..ac51a56 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -262,10 +262,6 @@ ENV AIRFLOW_LOCAL_PIP_WHEELS=${AIRFLOW_LOCAL_PIP_WHEELS}
 ARG INSTALL_AIRFLOW_VIA_PIP="true"
 ENV INSTALL_AIRFLOW_VIA_PIP=${INSTALL_AIRFLOW_VIA_PIP}
 
-# If wheel files are found in /docker-context-files during installation
-# they are also installed additionally to whatever is installed from Airflow.
-COPY docker-context-files /docker-context-files
-
 # In case of CI builds we want to pre-install master version of airflow dependencies so that
 # We do not have to always reinstall it from the scratch.
 # This can be reinstalled from latest master by increasing PIP_DEPENDENCIES_EPOCH_NUMBER.
@@ -273,7 +269,8 @@ COPY docker-context-files /docker-context-files
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
         pip install \
             "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-                --constraint "${AIRFLOW_CONSTRAINTS_URL}" && pip uninstall --yes apache-airflow; \
+                --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+                && pip uninstall --yes apache-airflow; \
     fi
 
 
@@ -322,6 +319,10 @@ RUN if [[ ${INSTALL_AIRFLOW_VIA_PIP} == "true" ]]; then \
         fi; \
     fi
 
+# If wheel files are found in /docker-context-files during installation
+# they are also installed additionally to whatever is installed from Airflow.
+COPY docker-context-files/ /docker-context-files/
+
 RUN if [[ ${AIRFLOW_LOCAL_PIP_WHEELS} != "true" ]]; then \
         if ls /docker-context-files/*.whl 1> /dev/null 2>&1; then \
             pip install --no-deps /docker-context-files/*.whl; \
diff --git a/IMAGES.rst b/IMAGES.rst
index 8c913db..13ce935 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -399,7 +399,14 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | file has to be in docker context so      |
 |                                          |                                          | it's best to place such file in          |
 |                                          |                                          | one of the folders included in           |
-|                                          |                                          | dockerignore                             |
+|                                          |                                          | dockerignore, for example in the         |
+|                                          |                                          | 'docker-context-files'. Note that the    |
+|                                          |                                          | location does not work for the first     |
+|                                          |                                          | stage of installation when the           |
+|                                          |                                          | ``AIRFLOW_PRE_CACHED_PIP_PACKAGES`` is   |
+|                                          |                                          | set to true. Default location from       |
+|                                          |                                          | GitHub is used in this case.             |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_LOCAL_PIP_WHEELS``             | ``false``                                | If set to true, Airflow and it's         |
 |                                          |                                          | dependencies are installed from locally  |
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index fb756ba..5bd2d06 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -97,6 +97,19 @@ function build_images::forget_last_answer() {
     fi
 }
 
+function build_images::confirm_via_terminal() {
+    echo > "${DETECTED_TERMINAL}"
+    echo > "${DETECTED_TERMINAL}"
+    echo "Make sure that you rebased to latest master before rebuilding!" > "${DETECTED_TERMINAL}"
+    echo > "${DETECTED_TERMINAL}"
+    # Make sure to use output of tty rather than stdin/stdout when available - this way confirm
+    # will work also in case of pre-commits (git does not pass stdin/stdout to pre-commit hooks)
+    # shellcheck disable=SC2094
+    "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}" \
+        <"${DETECTED_TERMINAL}" >"${DETECTED_TERMINAL}"
+    RES=$?
+}
+
 # Confirms if the image should be rebuilt and interactively checks it with the user.
 # In case it needs to be rebuilt, it only asks the user if it determines that the rebuild
 # is needed and that the rebuild is not already forced. It asks the user using available terminals
@@ -144,29 +157,11 @@ function build_images::confirm_image_rebuild() {
         "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}"
         RES=$?
     elif [[ ${DETECTED_TERMINAL:=$(tty)} != "not a tty" ]]; then
-        echo > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        echo "Make sure that you rebased to latest master before rebuilding!" > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        # Make sure to use output of tty rather than stdin/stdout when available - this way confirm
-        # will works also in case of pre-commits (git does not pass stdin/stdout to pre-commit hooks)
-        # shellcheck disable=SC2094
-        "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}" \
-            <"${DETECTED_TERMINAL}" >"${DETECTED_TERMINAL}"
-        RES=$?
         export DETECTED_TERMINAL
+        build_images::confirm_via_terminal
     elif [[ -c /dev/tty ]]; then
         export DETECTED_TERMINAL=/dev/tty
-        # Make sure to use /dev/tty first rather than stdin/stdout when available - this way confirm
-        # will works also in case of pre-commits (git does not pass stdin/stdout to pre-commit hooks)
-        echo > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        echo "Make sure that you rebased to latest master before rebuilding!" > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        # shellcheck disable=SC2094
-        "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}" \
-            <"${DETECTED_TERMINAL}" >"${DETECTED_TERMINAL}"
-        RES=$?
+        build_images::confirm_via_terminal
     else
         verbosity::print_info
         verbosity::print_info "No terminal, no stdin - quitting"

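The refactor above collapses two nearly identical branches into one helper: the branches differ only in how the terminal device is discovered (the output of `tty` versus a literal `/dev/tty`). A minimal, parameterized sketch of that selection logic follows; the `tty` output and `/dev/tty` availability are passed in as arguments here for testability, whereas the real script queries them directly.

```shell
#!/usr/bin/env bash
# Sketch of the terminal-selection fallback the confirm helper relies on.
#   tty_out        - what `tty` printed ("not a tty" when stdin is redirected,
#                    as happens under git pre-commit hooks)
#   dev_tty_exists - "yes" when the character device /dev/tty is present
choose_terminal() {
    local tty_out="$1" dev_tty_exists="$2"
    if [[ -n "${tty_out}" && "${tty_out}" != "not a tty" ]]; then
        echo "${tty_out}"       # stdin is a real terminal - use it directly
    elif [[ "${dev_tty_exists}" == "yes" ]]; then
        echo "/dev/tty"         # fall back to the controlling terminal
    else
        return 1                # no terminal - caller must quit non-interactively
    fi
}
```

Once a device is chosen, both the prompts and the `confirm` invocation are redirected to it, which is why the deduplicated helper only needs `DETECTED_TERMINAL` to be set.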

[airflow] 03/18: Remove CodeQL from PRS. (#12406)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a5d2650fec4b816803dbbe981924c9df5eb258f7
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Nov 17 17:37:46 2020 +0100

    Remove CodeQL from PRS. (#12406)
    
    As discussed in https://lists.apache.org/thread.html/r18cc605bbdb6695c1d31e0706f1b033401f6fa6a19cd0584d7be6cc9%40%3Cdev.airflow.apache.org%3E
    removing CodeQL from PRs.
    
    (cherry picked from commit 525f6594d20d7032700cafe87a4f01a8c9ba8d23)
---
 .github/workflows/codeql-analysis.yml | 2 --
 1 file changed, 2 deletions(-)

diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index 4229e05..e0178bf 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -21,8 +21,6 @@ name: "CodeQL"
 on:  # yamllint disable-line rule:truthy
   push:
     branches: [master]
-  pull_request:
-    branches: [master]
   schedule:
     - cron: '0 2 * * *'
 


[airflow] 13/18: Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e08f1e24afb54628f11bf4f13dfa9333793b491d
Author: Ruben Laguna <ru...@gmail.com>
AuthorDate: Thu Nov 26 14:54:23 2020 +0100

    Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)
    
    (cherry picked from commit cf718dbb9ba64006652ccece08e936fe130fa51b)
---
 BREEZE.rst | 4 ++++
 breeze     | 6 ++++++
 2 files changed, 10 insertions(+)

diff --git a/BREEZE.rst b/BREEZE.rst
index ce7dc6a..2c4bffbe 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1884,6 +1884,10 @@ This is the current syntax for  `./breeze <./breeze>`_:
         'breeze static-check mypy -- --files tests/core.py'
         'breeze static-check mypy -- --all-files'
 
+        To check all files that differ between your current branch and master run:
+
+        'breeze static-check all -- --from-ref $(git merge-base master HEAD) --to-ref HEAD'
+
         You can see all the options by adding --help EXTRA_ARG:
 
         'breeze static-check mypy -- --help'
diff --git a/breeze b/breeze
index ff5d7cb..0c09046 100755
--- a/breeze
+++ b/breeze
@@ -244,6 +244,8 @@ function breeze::initialize_virtualenv() {
             echo
             if [[ ${OSTYPE} == "darwin"* ]]; then
                 echo "  brew install sqlite mysql postgresql openssl"
+                echo "  export LDFLAGS=\"-L/usr/local/opt/openssl/lib\""
+                echo "  export CPPFLAGS=\"-I/usr/local/opt/openssl/include\""
             else
                 echo "  sudo apt install build-essentials python3.6-dev python3.7-dev python3.8-dev python-dev openssl \\"
                 echo "              sqlite sqlite-dev default-libmysqlclient-dev libmysqld-dev postgresql"
@@ -1757,6 +1759,10 @@ ${FORMATTED_STATIC_CHECKS}
       '${CMDNAME} static-check mypy -- --files tests/core.py'
       '${CMDNAME} static-check mypy -- --all-files'
 
+      To check all files that differ between your current branch and master run:
+
+      '${CMDNAME} static-check all -- --from-ref \$(git merge-base master HEAD) --to-ref HEAD'
+
       You can see all the options by adding --help EXTRA_ARG:
 
       '${CMDNAME} static-check mypy -- --help'


[airflow] 04/18: Switching to Ubuntu 20.04 as Github Actions runner. (#12404)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b914c32138013932cca8a231a05628eae9244d36
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Nov 17 18:49:27 2020 +0100

    Switching to Ubuntu 20.04 as Github Actions runner. (#12404)
    
    Ubuntu 20.04 will soon become the default runner for GA.
    
    See: https://github.com/actions/virtual-environments/issues/1816
    
    This PR tests if this is working fine.
    
    (cherry picked from commit c38dadb526f7104df7a1a6feda72ce1b65557bd9)
---
 .github/workflows/build-images-workflow-run.yml    | 10 +++----
 .github/workflows/ci.yml                           | 34 +++++++++++-----------
 .github/workflows/codeql-analysis.yml              |  4 +--
 .github/workflows/delete_old_artifacts.yml         |  2 +-
 .github/workflows/label_when_reviewed.yml          |  2 +-
 .../workflows/label_when_reviewed_workflow_run.yml |  2 +-
 .github/workflows/scheduled_quarantined.yml        |  4 +--
 7 files changed, 29 insertions(+), 29 deletions(-)

diff --git a/.github/workflows/build-images-workflow-run.yml b/.github/workflows/build-images-workflow-run.yml
index af71710..9726c5a 100644
--- a/.github/workflows/build-images-workflow-run.yml
+++ b/.github/workflows/build-images-workflow-run.yml
@@ -44,7 +44,7 @@ jobs:
   cancel-workflow-runs:
     timeout-minutes: 10
     name: "Cancel workflow runs"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       sourceHeadRepo: ${{ steps.source-run-info.outputs.sourceHeadRepo }}
       sourceHeadBranch: ${{ steps.source-run-info.outputs.sourceHeadBranch }}
@@ -192,7 +192,7 @@ jobs:
       Source Sha: ${{ needs.cancel-workflow-runs.outputs.sourceHeadSha }}
       Merge commit Sha: ${{ needs.cancel-workflow-runs.outputs.mergeCommitSha }}
       Target commit Sha: ${{ needs.cancel-workflow-runs.outputs.targetCommitSha }}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [cancel-workflow-runs]
     env:
       GITHUB_CONTEXT: ${{ toJson(github) }}
@@ -257,7 +257,7 @@ jobs:
   build-images:
     timeout-minutes: 80
     name: "Build ${{matrix.image-type}} images ${{matrix.python-version}}"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, cancel-workflow-runs]
     strategy:
       matrix:
@@ -383,7 +383,7 @@ jobs:
 
   cancel-on-build-cancel:
     name: "Cancel 'CI Build' jobs on build image cancelling."
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     if: cancelled()
     needs: [build-images]
     steps:
@@ -398,7 +398,7 @@ jobs:
 
   cancel-on-build-failure:
     name: "Cancel 'CI Build' jobs on build image failing."
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     if: failure()
     needs: [build-images]
     steps:
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index dad697f..5aadfd0 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -64,7 +64,7 @@ jobs:
 
   build-info:
     name: "Build info"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     env:
       GITHUB_CONTEXT: ${{ toJson(github) }}
     outputs:
@@ -145,7 +145,7 @@ jobs:
   ci-images:
     timeout-minutes: 120
     name: "Wait for CI images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info]
     if: needs.build-info.outputs.image-build == 'true'
     env:
@@ -179,7 +179,7 @@ jobs:
   static-checks:
     timeout-minutes: 30
     name: "Static checks"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     env:
       MOUNT_LOCAL_SOURCES: "true"
@@ -214,7 +214,7 @@ jobs:
   static-checks-basic-checks-only:
     timeout-minutes: 30
     name: "Static checks: basic checks only"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info]
     env:
       SKIP: "build,mypy,flake8,pylint,bats-in-container-tests"
@@ -250,7 +250,7 @@ jobs:
   docs:
     timeout-minutes: 30
     name: "Build docs"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     if: needs.build-info.outputs.docs-build == 'true'
     steps:
@@ -270,7 +270,7 @@ jobs:
   tests-helm:
     timeout-minutes: 20
     name: "Python unit tests for helm chart"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     env:
       MOUNT_LOCAL_SOURCES: "true"
@@ -318,7 +318,7 @@ jobs:
     name: >
       Postgres${{matrix.postgres-version}},Py${{matrix.python-version}}:
       ${{needs.build-info.outputs.testTypes}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     strategy:
       matrix:
@@ -370,7 +370,7 @@ jobs:
     timeout-minutes: 80
     name: >
       MySQL${{matrix.mysql-version}}, Py${{matrix.python-version}}: ${{needs.build-info.outputs.testTypes}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     strategy:
       matrix:
@@ -421,7 +421,7 @@ jobs:
     timeout-minutes: 60
     name: >
       Sqlite Py${{matrix.python-version}}: ${{needs.build-info.outputs.testTypes}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     strategy:
       matrix:
@@ -469,7 +469,7 @@ jobs:
   tests-quarantined:
     timeout-minutes: 60
     name: "Quarantined tests"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     continue-on-error: true
     needs: [build-info, ci-images]
     strategy:
@@ -541,7 +541,7 @@ jobs:
   upload-coverage:
     timeout-minutes: 5
     name: "Upload coverage"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     continue-on-error: true
     needs:
       - tests-kubernetes
@@ -564,7 +564,7 @@ jobs:
   prod-images:
     timeout-minutes: 120
     name: "Wait for PROD images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info]
     env:
       BACKEND: sqlite
@@ -594,7 +594,7 @@ jobs:
   tests-kubernetes:
     timeout-minutes: 50
     name: K8s ${{matrix.python-version}} ${{matrix.kubernetes-version}} ${{matrix.kubernetes-mode}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, prod-images]
     strategy:
       matrix:
@@ -669,7 +669,7 @@ jobs:
   push-prod-images-to-github-registry:
     timeout-minutes: 10
     name: "Push PROD images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs:
       - build-info
       - static-checks
@@ -705,7 +705,7 @@ jobs:
   push-ci-images-to-github-registry:
     timeout-minutes: 10
     name: "Push CI images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs:
       - build-info
       - static-checks
@@ -741,7 +741,7 @@ jobs:
   constraints:
     timeout-minutes: 10
     name: "Constraints"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     strategy:
       matrix:
         python-version: ${{ fromJson(needs.build-info.outputs.pythonVersions) }}
@@ -774,7 +774,7 @@ jobs:
   constraints-push:
     timeout-minutes: 10
     name: "Constraints push"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs:
       - build-info
       - constraints
diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index e0178bf..2bf92b7 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -27,7 +27,7 @@ on:  # yamllint disable-line rule:truthy
 jobs:
   selective-checks:
     name: Selective checks
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       needs-python-scans: ${{ steps.selective-checks.outputs.needs-python-scans }}
       needs-javascript-scans: ${{ steps.selective-checks.outputs.needs-javascript-scans }}
@@ -52,7 +52,7 @@ jobs:
 
   analyze:
     name: Analyze
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [selective-checks]
     strategy:
       fail-fast: false
diff --git a/.github/workflows/delete_old_artifacts.yml b/.github/workflows/delete_old_artifacts.yml
index 8b35711..98329d5 100644
--- a/.github/workflows/delete_old_artifacts.yml
+++ b/.github/workflows/delete_old_artifacts.yml
@@ -23,7 +23,7 @@ on:  # yamllint disable-line rule:truthy
 
 jobs:
   delete-artifacts:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     steps:
       - uses: kolpav/purge-artifacts-action@04c636a505f26ebc82f8d070b202fb87ff572b10  # v1.0
         with:
diff --git a/.github/workflows/label_when_reviewed.yml b/.github/workflows/label_when_reviewed.yml
index 62d7cc6..5095953 100644
--- a/.github/workflows/label_when_reviewed.yml
+++ b/.github/workflows/label_when_reviewed.yml
@@ -23,7 +23,7 @@ jobs:
 
   label-when-reviewed:
     name: "Label PRs when reviewed"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     steps:
       - name: "Do nothing. Only trigger corresponding workflow_run event"
         run: echo
diff --git a/.github/workflows/label_when_reviewed_workflow_run.yml b/.github/workflows/label_when_reviewed_workflow_run.yml
index f943609..6e45038 100644
--- a/.github/workflows/label_when_reviewed_workflow_run.yml
+++ b/.github/workflows/label_when_reviewed_workflow_run.yml
@@ -25,7 +25,7 @@ jobs:
 
   label-when-reviewed:
     name: "Label PRs when reviewed workflow run"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       labelSet: ${{ steps.label-when-reviewed.outputs.labelSet }}
     steps:
diff --git a/.github/workflows/scheduled_quarantined.yml b/.github/workflows/scheduled_quarantined.yml
index 552edfb..cf29c38 100644
--- a/.github/workflows/scheduled_quarantined.yml
+++ b/.github/workflows/scheduled_quarantined.yml
@@ -48,7 +48,7 @@ jobs:
   trigger-tests:
     timeout-minutes: 5
     name: "Checks if tests should be run"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       run-tests: ${{ steps.trigger-tests.outputs.run-tests }}
     steps:
@@ -60,7 +60,7 @@ jobs:
   tests-quarantined:
     timeout-minutes: 80
     name: "Quarantined tests"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     continue-on-error: true
     needs: [trigger-tests]
     strategy:


[airflow] 16/18: Remove "@" references from constraints generattion (#12671)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 36d31098314b9297539c155cd46290d8dd5f852a
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Nov 28 06:04:45 2020 +0100

    Remove "@" references from constraints generattion (#12671)
    
    Likely fixes: #12665
    
    (cherry picked from commit 3b138d2d60d86ca0a80e9c27afd3421f45df178e)
---
 scripts/in_container/run_generate_constraints.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scripts/in_container/run_generate_constraints.sh b/scripts/in_container/run_generate_constraints.sh
index 999f750..62b2237 100755
--- a/scripts/in_container/run_generate_constraints.sh
+++ b/scripts/in_container/run_generate_constraints.sh
@@ -36,6 +36,7 @@ echo
 
 pip freeze | sort | \
     grep -v "apache_airflow" | \
+    grep -v "@" | \
     grep -v "/opt/airflow" >"${CURRENT_CONSTRAINT_FILE}"
 
 echo
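
The one-line fix works because `pip freeze` reports packages installed from a local path or VCS as direct references of the form ``pkg @ file:///...``, which are not usable as version pins in a constraints file. A small self-contained illustration of the filter pipeline on fabricated input (the package names are made up for the example):

```shell
# Simulated `pip freeze` output: direct-reference lines contain " @ " and
# must not end up in the constraints file; only "pkg==version" pins survive.
printf '%s\n' \
    'apache_airflow @ file:///opt/airflow' \
    'mypkg @ file:///docker-context-files/mypkg.whl' \
    'requests==2.24.0' \
  | sort | \
    grep -v "apache_airflow" | \
    grep -v "@" | \
    grep -v "/opt/airflow"
# prints: requests==2.24.0
```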


[airflow] 01/18: Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 1fc1220d0190a37b8819281878c1a0a0987b7522
Author: Florent Chehab <fc...@meilleursagents.com>
AuthorDate: Tue Nov 17 10:11:53 2020 +0100

    Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
    
    * Enable provisioning of extra secrets and configmaps in helm chart
    
    Added 2 new values:
    *  extraSecrets
    *  extraConfigMaps
    
    Those values enable the provisioning of ConfigMaps
    and secrets directly from the airflow chart.
    
    Those objects could be used for storing airflow variables
    or (secret) connections info for instance
    (the plan is to add support for extraEnv and extraEnvFrom later).
    
    Docs and tests updated accordingly.
    
    * Add support for extra env and envFrom items in helm chart
    
    Added 2 new values:
    *  extraEnv
    *  extraEnvFrom
    
    Those values will be added to the definition of
    airflow containers. They are expected to be strings
    (they can be templated).
    
    Those new values won't be supported by "legacy" kubernetes
    executor configuration (you must use the pod template).
    
    Therefore, the value 'env' is also deprecated as it's kind
    of a duplicate for extraEnv.
    
    Docs and tests updated accordingly.
    
    (cherry picked from commit 56ee2bb3cb6838df0181d753c24c72d0f4938b0a)
---
 chart/README.md                                    |   6 +-
 chart/files/pod-template-file.kubernetes-helm-yaml |  11 +-
 chart/templates/_helpers.yaml                      |  15 ++-
 chart/templates/{ => configmaps}/configmap.yaml    |   0
 chart/templates/configmaps/extra-configmaps.yaml   |  45 ++++++++
 chart/templates/create-user-job.yaml               |   2 +
 chart/templates/flower/flower-deployment.yaml      |   2 +-
 chart/templates/migrate-database-job.yaml          |   2 +
 .../templates/scheduler/scheduler-deployment.yaml  |  10 +-
 chart/templates/secrets/extra-secrets.yaml         |  51 +++++++++
 .../templates/webserver/webserver-deployment.yaml  |   8 +-
 chart/templates/workers/worker-deployment.yaml     |  10 +-
 chart/tests/helm_template_generator.py             |  12 +++
 chart/tests/test_extra_configmaps_secrets.py       | 110 +++++++++++++++++++
 chart/tests/test_extra_env_env_from.py             | 117 +++++++++++++++++++++
 chart/values.schema.json                           |  44 ++++++++
 chart/values.yaml                                  |  51 +++++++++
 17 files changed, 485 insertions(+), 11 deletions(-)

diff --git a/chart/README.md b/chart/README.md
index d56f114..c5106be 100644
--- a/chart/README.md
+++ b/chart/README.md
@@ -158,8 +158,12 @@ The following tables lists the configurable parameters of the Airflow chart and
 | `images.pgbouncerExporter.repository`                 | Docker repository to pull image from. Update this to deploy a custom image                                   | `apache/airflow`                                  |
 | `images.pgbouncerExporter.tag`                        | Docker image tag to pull image from. Update this to deploy a new custom image tag                            | `airflow-pgbouncer-exporter-2020.09.25-0.5.0`     |
 | `images.pgbouncerExporter.pullPolicy`                 | PullPolicy for pgbouncer-exporter image                                                                      | `IfNotPresent`                                    |
-| `env`                                                 | Environment variables key/values to mount into Airflow pods                                                  | `[]`                                              |
+| `env`                                                 | Environment variables key/values to mount into Airflow pods (deprecated, prefer using extraEnv)              | `[]`                                              |
 | `secret`                                              | Secret name/key pairs to mount into Airflow pods                                                             | `[]`                                              |
+| `extraEnv`                                            | Extra env 'items' that will be added to the definition of airflow containers                                 | `~`                                               |
+| `extraEnvFrom`                                        | Extra envFrom 'items' that will be added to the definition of airflow containers                             | `~`                                               |
+| `extraSecrets`                                        | Extra Secrets that will be managed by the chart                                                              | `{}`                                              |
+| `extraConfigMaps`                                     | Extra ConfigMaps that will be managed by the chart                                                           | `{}`                                              |
 | `data.metadataSecretName`                             | Secret name to mount Airflow connection string from                                                          | `~`                                               |
 | `data.resultBackendSecretName`                        | Secret name to mount Celery result backend connection string from                                            | `~`                                               |
 | `data.metadataConection`                              | Field separated connection data (alternative to secret name)                                                 | `{}`                                              |
diff --git a/chart/files/pod-template-file.kubernetes-helm-yaml b/chart/files/pod-template-file.kubernetes-helm-yaml
index 5c4fb92..33ae7b5 100644
--- a/chart/files/pod-template-file.kubernetes-helm-yaml
+++ b/chart/files/pod-template-file.kubernetes-helm-yaml
@@ -27,12 +27,13 @@ spec:
   containers:
     - args: []
       command: []
+      envFrom:
+      {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 6 }}
       env:
-      - name: AIRFLOW__CORE__EXECUTOR
-        value: LocalExecutor
-{{- include "standard_airflow_environment" . | indent 4 }}
-{{- include "custom_airflow_environment" . | indent 4 }}
-      envFrom: []
+        - name: AIRFLOW__CORE__EXECUTOR
+          value: LocalExecutor
+{{- include "standard_airflow_environment" . | indent 6}}
+{{- include "custom_airflow_environment" . | indent 6 }}
       image: {{ template "pod_template_image" . }}
       imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
       name: base
diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index 059d64d..df7b158 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -85,12 +85,25 @@
         name: {{ $config.secretName }}
         key: {{ default "value" $config.secretKey }}
   {{- end }}
-    {{- if or (eq $.Values.executor "KubernetesExecutor") (eq $.Values.executor "CeleryKubernetesExecutor") }}
+  {{- if or (eq $.Values.executor "KubernetesExecutor") (eq $.Values.executor "CeleryKubernetesExecutor") }}
     {{- range $i, $config := .Values.secret }}
   - name: AIRFLOW__KUBERNETES_SECRETS__{{ $config.envName }}
     value: {{ printf "%s=%s" $config.secretName $config.secretKey }}
     {{- end }}
   {{ end }}
+  # Extra env
+  {{- $Global := . }}
+  {{- with .Values.extraEnv }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+
+{{/* User defined Airflow environment from */}}
+{{- define "custom_airflow_environment_from" }}
+  {{- $Global := . }}
+  {{- with .Values.extraEnvFrom }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
 {{- end }}
 
 {{/*  Git ssh key volume */}}
diff --git a/chart/templates/configmap.yaml b/chart/templates/configmaps/configmap.yaml
similarity index 100%
rename from chart/templates/configmap.yaml
rename to chart/templates/configmaps/configmap.yaml
diff --git a/chart/templates/configmaps/extra-configmaps.yaml b/chart/templates/configmaps/extra-configmaps.yaml
new file mode 100644
index 0000000..a186aba
--- /dev/null
+++ b/chart/templates/configmaps/extra-configmaps.yaml
@@ -0,0 +1,45 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+####################################################
+## Extra ConfigMaps provisioned via the chart values
+####################################################
+{{- $Global := . }}
+{{- range $configMapName, $configMapContent := .Values.extraConfigMaps }}
+---
+apiVersion: v1
+kind: ConfigMap
+metadata:
+  name: {{ tpl $configMapName $Global | quote }}
+  labels:
+    release: {{ $Global.Release.Name }}
+    chart: "{{ $Global.Chart.Name }}-{{ $Global.Chart.Version }}"
+    heritage: {{ $Global.Release.Service }}
+  annotations:
+    "helm.sh/hook": "pre-install,pre-upgrade"
+    "helm.sh/hook-delete-policy": "before-hook-creation"
+    "helm.sh/hook-weight": "0"
+{{- with $Global.Values.labels }}
+{{ toYaml . | indent 4 }}
+{{- end }}
+{{- if $configMapContent.data }}
+data:
+  {{- with $configMapContent.data }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+{{- end }}
diff --git a/chart/templates/create-user-job.yaml b/chart/templates/create-user-job.yaml
index 27a0363..4df7dd6 100644
--- a/chart/templates/create-user-job.yaml
+++ b/chart/templates/create-user-job.yaml
@@ -79,6 +79,8 @@ spec:
             - {{ .Values.webserver.defaultUser.lastName }}
             - "-p"
             - {{ .Values.webserver.defaultUser.password }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/flower/flower-deployment.yaml b/chart/templates/flower/flower-deployment.yaml
index c5d1f91..3a33369 100644
--- a/chart/templates/flower/flower-deployment.yaml
+++ b/chart/templates/flower/flower-deployment.yaml
@@ -33,7 +33,7 @@ metadata:
 {{ toYaml . | indent 4 }}
 {{- end }}
   annotations:
-    checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+    checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
 spec:
   replicas: 1
   selector:
diff --git a/chart/templates/migrate-database-job.yaml b/chart/templates/migrate-database-job.yaml
index 37a9b2d..8639648 100644
--- a/chart/templates/migrate-database-job.yaml
+++ b/chart/templates/migrate-database-job.yaml
@@ -62,6 +62,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           # Support running against 1.10.x and 2.0.0dev/master
           args: ["bash", "-c", "airflow upgradedb || airflow db upgrade"]
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/scheduler/scheduler-deployment.yaml b/chart/templates/scheduler/scheduler-deployment.yaml
index 9a928a6..61dcade 100644
--- a/chart/templates/scheduler/scheduler-deployment.yaml
+++ b/chart/templates/scheduler/scheduler-deployment.yaml
@@ -65,7 +65,9 @@ spec:
         checksum/metadata-secret: {{ include (print $.Template.BasePath "/secrets/metadata-connection-secret.yaml") . | sha256sum }}
         checksum/result-backend-secret: {{ include (print $.Template.BasePath "/secrets/result-backend-connection-secret.yaml") . | sha256sum }}
         checksum/pgbouncer-config-secret: {{ include (print $.Template.BasePath "/secrets/pgbouncer-config-secret.yaml") . | sha256sum }}
-        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
+        checksum/extra-configmaps: {{ include (print $.Template.BasePath "/configmaps/extra-configmaps.yaml") . | sha256sum }}
+        checksum/extra-secrets: {{ include (print $.Template.BasePath "/secrets/extra-secrets.yaml") . | sha256sum }}
         {{- if .Values.scheduler.safeToEvict }}
         cluster-autoscaler.kubernetes.io/safe-to-evict: "true"
         {{- end }}
@@ -95,6 +97,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args:
           {{- include "wait-for-migrations-command" . | indent 10 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -104,6 +108,8 @@ spec:
           image: {{ template "airflow_image" . }}
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args: ["scheduler"]
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -184,6 +190,8 @@ spec:
               mountPath: {{ template "airflow_config_path" . }}
               subPath: airflow.cfg
               readOnly: true
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
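The `checksum/*` annotations added above hash the rendered ConfigMap/Secret templates, so any change to their content alters the pod template and triggers a rolling restart. A minimal sketch of what Helm's `include ... | sha256sum` computes (the sample manifest text is illustrative, not a real chart rendering):

```python
import hashlib

# Helm's `include ... | sha256sum` hashes the rendered manifest text; the
# annotation value changes whenever the ConfigMap/Secret content changes,
# which changes the pod template and forces a Deployment rollout.
rendered = "apiVersion: v1\nkind: ConfigMap\ndata:\n  key: value\n"
checksum = hashlib.sha256(rendered.encode("utf-8")).hexdigest()
# Any edit to the manifest produces a different digest:
changed = hashlib.sha256(rendered.replace("value", "other").encode("utf-8")).hexdigest()
print(len(checksum), checksum != changed)  # 64 True
```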
diff --git a/chart/templates/secrets/extra-secrets.yaml b/chart/templates/secrets/extra-secrets.yaml
new file mode 100644
index 0000000..1326aa2
--- /dev/null
+++ b/chart/templates/secrets/extra-secrets.yaml
@@ -0,0 +1,51 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+#################################################
+## Extra Secrets provisioned via the chart values
+#################################################
+{{- $Global := . }}
+{{- range $secretName, $secretContent := .Values.extraSecrets }}
+---
+apiVersion: v1
+kind: Secret
+metadata:
+  name: {{ tpl $secretName $Global | quote }}
+  labels:
+    release: {{ $Global.Release.Name }}
+    chart: "{{ $Global.Chart.Name }}-{{ $Global.Chart.Version }}"
+    heritage: {{ $Global.Release.Service }}
+  annotations:
+    "helm.sh/hook": "pre-install,pre-upgrade"
+    "helm.sh/hook-delete-policy": "before-hook-creation"
+    "helm.sh/hook-weight": "0"
+{{- with $Global.Values.labels }}
+{{ toYaml . | indent 4 }}
+{{- end }}
+{{- if $secretContent.data }}
+data:
+  {{- with $secretContent.data }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+{{- if $secretContent.stringData }}
+stringData:
+  {{- with $secretContent.stringData }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+{{- end }}
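The template above iterates over `.Values.extraSecrets`, passing both the secret name and its `data`/`stringData` payload through `tpl`. A rough stdlib-only sketch of that rendering loop — `render_tpl` here only substitutes `{{ .Release.Name }}` and is a stand-in for Helm's far richer `tpl` function:

```python
def render_tpl(text, release_name):
    # Toy substitute for Helm's tpl: only handles {{ .Release.Name }}.
    return text.replace("{{ .Release.Name }}", release_name)

def render_extra_secrets(extra_secrets, release_name):
    # Mirror the template's range loop: one Secret manifest per entry,
    # emitting data/stringData only when present.
    manifests = []
    for name, content in extra_secrets.items():
        manifest = {
            "apiVersion": "v1",
            "kind": "Secret",
            "metadata": {"name": render_tpl(name, release_name)},
        }
        for key in ("data", "stringData"):
            if content.get(key):
                manifest[key] = render_tpl(content[key], release_name)
        manifests.append(manifest)
    return manifests

values = {
    "{{ .Release.Name }}-airflow-connections": {
        "stringData": "AIRFLOW_CONN_GCP: 'gcp_conn'\n",
    }
}
rendered = render_extra_secrets(values, "my-release")
print(rendered[0]["metadata"]["name"])  # my-release-airflow-connections
```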
diff --git a/chart/templates/webserver/webserver-deployment.yaml b/chart/templates/webserver/webserver-deployment.yaml
index a3c42c0..25b6b63 100644
--- a/chart/templates/webserver/webserver-deployment.yaml
+++ b/chart/templates/webserver/webserver-deployment.yaml
@@ -54,7 +54,9 @@ spec:
       annotations:
         checksum/metadata-secret: {{ include (print $.Template.BasePath "/secrets/metadata-connection-secret.yaml") . | sha256sum }}
         checksum/pgbouncer-config-secret: {{ include (print $.Template.BasePath "/secrets/pgbouncer-config-secret.yaml") . | sha256sum }}
-        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
+        checksum/extra-configmaps: {{ include (print $.Template.BasePath "/configmaps/extra-configmaps.yaml") . | sha256sum }}
+        checksum/extra-secrets: {{ include (print $.Template.BasePath "/secrets/extra-secrets.yaml") . | sha256sum }}
         {{- if .Values.airflowPodAnnotations }}
         {{- toYaml .Values.airflowPodAnnotations | nindent 8 }}
         {{- end }}
@@ -80,6 +82,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args:
           {{- include "wait-for-migrations-command" . | indent 10 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -136,6 +140,8 @@ spec:
             timeoutSeconds: {{ .Values.webserver.readinessProbe.timeoutSeconds | default 30 }}
             failureThreshold: {{ .Values.webserver.readinessProbe.failureThreshold | default 20 }}
             periodSeconds: {{ .Values.webserver.readinessProbe.periodSeconds | default 5 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/workers/worker-deployment.yaml b/chart/templates/workers/worker-deployment.yaml
index 77d5fe2..40fbbe1 100644
--- a/chart/templates/workers/worker-deployment.yaml
+++ b/chart/templates/workers/worker-deployment.yaml
@@ -56,7 +56,9 @@ spec:
         checksum/metadata-secret: {{ include (print $.Template.BasePath "/secrets/metadata-connection-secret.yaml") . | sha256sum }}
         checksum/result-backend-secret: {{ include (print $.Template.BasePath "/secrets/result-backend-connection-secret.yaml") . | sha256sum }}
         checksum/pgbouncer-config-secret: {{ include (print $.Template.BasePath "/secrets/pgbouncer-config-secret.yaml") . | sha256sum }}
-        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
+        checksum/extra-configmaps: {{ include (print $.Template.BasePath "/configmaps/extra-configmaps.yaml") . | sha256sum }}
+        checksum/extra-secrets: {{ include (print $.Template.BasePath "/secrets/extra-secrets.yaml") . | sha256sum }}
         {{- if .Values.workers.safeToEvict }}
         cluster-autoscaler.kubernetes.io/safe-to-evict: "true"
         {{- end }}
@@ -101,6 +103,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args:
           {{- include "wait-for-migrations-command" . | indent 10 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -146,6 +150,8 @@ spec:
             - name: dags
               mountPath: {{ template "airflow_dags_mount_path" . }}
 {{- end }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -195,6 +201,8 @@ spec:
             - name: kerberos-ccache
               mountPath: {{ .Values.kerberos.ccacheMountPath | quote }}
               readOnly: false
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
             - name: KRB5_CONFIG
               value:  {{ .Values.kerberos.configPath | quote }}
diff --git a/chart/tests/helm_template_generator.py b/chart/tests/helm_template_generator.py
index ba870ed..d8e3f49 100644
--- a/chart/tests/helm_template_generator.py
+++ b/chart/tests/helm_template_generator.py
@@ -19,6 +19,7 @@ import subprocess
 import sys
 from functools import lru_cache
 from tempfile import NamedTemporaryFile
+from typing import Any, Dict, Tuple
 
 import jmespath
 import jsonschema
@@ -81,6 +82,17 @@ def render_chart(name="RELEASE-NAME", values=None, show_only=None):
         return k8s_objects
 
 
+def prepare_k8s_lookup_dict(k8s_objects) -> Dict[Tuple[str, str], Dict[str, Any]]:
+    """
+    Helper to create a lookup dict from k8s_objects.
+    The keys of the dict are (kind, name) tuples of the k8s objects.
+    """
+    k8s_obj_by_key = {
+        (k8s_object["kind"], k8s_object["metadata"]["name"]): k8s_object for k8s_object in k8s_objects
+    }
+    return k8s_obj_by_key
+
+
 def render_k8s_object(obj, type_to_render):
     """
     Function that renders dictionaries into k8s objects. For helm chart testing only.
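The new `prepare_k8s_lookup_dict` helper indexes rendered manifests by `(kind, name)` so tests can fetch a specific object in O(1). A self-contained usage sketch (the sample manifests are made up):

```python
def prepare_k8s_lookup_dict(k8s_objects):
    # Index rendered manifests by (kind, metadata.name) for O(1) lookups.
    return {
        (obj["kind"], obj["metadata"]["name"]): obj
        for obj in k8s_objects
    }

objects = [
    {"kind": "Secret", "metadata": {"name": "rel-airflow-connections"}},
    {"kind": "ConfigMap", "metadata": {"name": "rel-airflow-variables"}},
]
by_key = prepare_k8s_lookup_dict(objects)
print(("Secret", "rel-airflow-connections") in by_key)  # True
```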
diff --git a/chart/tests/test_extra_configmaps_secrets.py b/chart/tests/test_extra_configmaps_secrets.py
new file mode 100644
index 0000000..378d80e
--- /dev/null
+++ b/chart/tests/test_extra_configmaps_secrets.py
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import textwrap
+import unittest
+from base64 import b64encode
+
+import yaml
+
+from tests.helm_template_generator import prepare_k8s_lookup_dict, render_chart
+
+RELEASE_NAME = "TEST-EXTRA-CONFIGMAPS-SECRETS"
+
+
+class ExtraConfigMapsSecretsTest(unittest.TestCase):
+    def test_extra_configmaps(self):
+        values_str = textwrap.dedent(
+            """
+            extraConfigMaps:
+              "{{ .Release.Name }}-airflow-variables":
+                data: |
+                  AIRFLOW_VAR_HELLO_MESSAGE: "Hi!"
+                  AIRFLOW_VAR_KUBERNETES_NAMESPACE: "{{ .Release.Namespace }}"
+              "{{ .Release.Name }}-other-variables":
+                data: |
+                  HELLO_WORLD: "Hi again!"
+            """
+        )
+        values = yaml.safe_load(values_str)
+        k8s_objects = render_chart(
+            RELEASE_NAME, values=values, show_only=["templates/configmaps/extra-configmaps.yaml"]
+        )
+        k8s_objects_by_key = prepare_k8s_lookup_dict(k8s_objects)
+
+        all_expected_keys = [
+            ("ConfigMap", f"{RELEASE_NAME}-airflow-variables"),
+            ("ConfigMap", f"{RELEASE_NAME}-other-variables"),
+        ]
+        self.assertEqual(set(k8s_objects_by_key.keys()), set(all_expected_keys))
+
+        all_expected_data = [
+            {"AIRFLOW_VAR_HELLO_MESSAGE": "Hi!", "AIRFLOW_VAR_KUBERNETES_NAMESPACE": "default"},
+            {"HELLO_WORLD": "Hi again!"},
+        ]
+        for expected_key, expected_data in zip(all_expected_keys, all_expected_data):
+            configmap_obj = k8s_objects_by_key[expected_key]
+            self.assertEqual(configmap_obj["data"], expected_data)
+
+    def test_extra_secrets(self):
+        values_str = textwrap.dedent(
+            """
+            extraSecrets:
+              "{{ .Release.Name }}-airflow-connections":
+                data: |
+                  AIRFLOW_CON_AWS: {{ printf "aws_connection_string" | b64enc }}
+                stringData: |
+                  AIRFLOW_CON_GCP: "gcp_connection_string"
+              "{{ .Release.Name }}-other-secrets":
+                data: |
+                  MY_SECRET_1: {{ printf "MY_SECRET_1" | b64enc }}
+                  MY_SECRET_2: {{ printf "MY_SECRET_2" | b64enc }}
+                stringData: |
+                  MY_SECRET_3: "MY_SECRET_3"
+                  MY_SECRET_4: "MY_SECRET_4"
+            """
+        )
+        values = yaml.safe_load(values_str)
+        k8s_objects = render_chart(
+            RELEASE_NAME, values=values, show_only=["templates/secrets/extra-secrets.yaml"]
+        )
+        k8s_objects_by_key = prepare_k8s_lookup_dict(k8s_objects)
+
+        all_expected_keys = [
+            ("Secret", f"{RELEASE_NAME}-airflow-connections"),
+            ("Secret", f"{RELEASE_NAME}-other-secrets"),
+        ]
+        self.assertEqual(set(k8s_objects_by_key.keys()), set(all_expected_keys))
+
+        all_expected_data = [
+            {"AIRFLOW_CON_AWS": b64encode(b"aws_connection_string").decode("utf-8")},
+            {
+                "MY_SECRET_1": b64encode(b"MY_SECRET_1").decode("utf-8"),
+                "MY_SECRET_2": b64encode(b"MY_SECRET_2").decode("utf-8"),
+            },
+        ]
+
+        all_expected_string_data = [
+            {"AIRFLOW_CON_GCP": "gcp_connection_string"},
+            {"MY_SECRET_3": "MY_SECRET_3", "MY_SECRET_4": "MY_SECRET_4"},
+        ]
+        for expected_key, expected_data, expected_string_data in zip(
+            all_expected_keys, all_expected_data, all_expected_string_data
+        ):
+            configmap_obj = k8s_objects_by_key[expected_key]
+            self.assertEqual(configmap_obj["data"], expected_data)
+            self.assertEqual(configmap_obj["stringData"], expected_string_data)
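The `data` assertions above compare against base64 strings because Kubernetes stores Secret `data` values base64-encoded (while `stringData` stays plaintext); on the Helm side, `{{ printf "..." | b64enc }}` produces the same bytes. A quick stdlib check of that round trip:

```python
from base64 import b64decode, b64encode

# Encode the plaintext the way the test's expected values do, then verify
# decoding recovers the original connection string.
encoded = b64encode(b"aws_connection_string").decode("utf-8")
decoded = b64decode(encoded).decode("utf-8")
print(decoded)  # aws_connection_string
```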
diff --git a/chart/tests/test_extra_env_env_from.py b/chart/tests/test_extra_env_env_from.py
new file mode 100644
index 0000000..170fc7a
--- /dev/null
+++ b/chart/tests/test_extra_env_env_from.py
@@ -0,0 +1,117 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import textwrap
+import unittest
+
+import jmespath
+import yaml
+from parameterized import parameterized
+
+from tests.helm_template_generator import prepare_k8s_lookup_dict, render_chart
+
+RELEASE_NAME = "TEST-EXTRA-ENV-ENV-FROM"
+
+# Test Params: k8s object key and paths with expected env / envFrom
+PARAMS = [
+    (
+        ("Job", "{}-create-user".format(RELEASE_NAME)),
+        ("spec.template.spec.containers[0]",),
+    ),
+    (
+        ("Job", "{}-run-airflow-migrations".format(RELEASE_NAME)),
+        ("spec.template.spec.containers[0]",),
+    ),
+    (
+        ("Deployment", "{}-scheduler".format(RELEASE_NAME)),
+        (
+            "spec.template.spec.initContainers[0]",
+            "spec.template.spec.containers[0]",
+        ),
+    ),
+    (
+        ("StatefulSet", "{}-worker".format(RELEASE_NAME)),
+        (
+            "spec.template.spec.initContainers[0]",
+            "spec.template.spec.containers[0]",
+        ),
+    ),
+    (
+        ("Deployment", "{}-webserver".format(RELEASE_NAME)),
+        ("spec.template.spec.initContainers[0]", "spec.template.spec.containers[0]"),
+    ),
+]
+
+
+class ExtraEnvEnvFromTest(unittest.TestCase):
+    @classmethod
+    def setUpClass(cls) -> None:
+        values_str = textwrap.dedent(
+            """
+            executor: "CeleryExecutor"
+            extraEnvFrom: |
+              - secretRef:
+                  name: '{{ .Release.Name }}-airflow-connections'
+              - configMapRef:
+                  name: '{{ .Release.Name }}-airflow-variables'
+            extraEnv: |
+              - name: PLATFORM
+                value: FR
+              - name: TEST
+                valueFrom:
+                  secretKeyRef:
+                    name: '{{ .Release.Name }}-some-secret'
+                    key: connection
+            """
+        )
+        values = yaml.safe_load(values_str)
+        cls.k8s_objects = render_chart(RELEASE_NAME, values=values)  # type: ignore
+        cls.k8s_objects_by_key = prepare_k8s_lookup_dict(cls.k8s_objects)  # type: ignore
+
+    @parameterized.expand(PARAMS)
+    def test_extra_env(self, k8s_obj_key, env_paths):
+        expected_env_as_str = textwrap.dedent(
+            """
+            - name: PLATFORM
+              value: FR
+            - name: TEST
+              valueFrom:
+                secretKeyRef:
+                  key: connection
+                  name: {}-some-secret
+            """.format(RELEASE_NAME)
+        ).lstrip()
+        k8s_object = self.k8s_objects_by_key[k8s_obj_key]
+        for path in env_paths:
+            env = jmespath.search("{}.env".format(path), k8s_object)
+            self.assertIn(expected_env_as_str, yaml.dump(env))
+
+    @parameterized.expand(PARAMS)
+    def test_extra_env_from(self, k8s_obj_key, env_from_paths):
+        expected_env_from_as_str = textwrap.dedent(
+            """
+            - secretRef:
+                name: {}-airflow-connections
+            - configMapRef:
+                name: {}-airflow-variables
+            """.format(RELEASE_NAME, RELEASE_NAME)
+        ).lstrip()
+
+        k8s_object = self.k8s_objects_by_key[k8s_obj_key]
+        for path in env_from_paths:
+            env_from = jmespath.search("{}.envFrom".format(path), k8s_object)
+            self.assertIn(expected_env_from_as_str, yaml.dump(env_from))
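The jmespath expressions in these tests, such as `spec.template.spec.containers[0].env`, are plain dotted paths with an optional index. A hedged stdlib-only equivalent of the lookup the tests perform — real jmespath supports far more syntax than this toy walker:

```python
import re

def search(path, obj):
    # Walk dotted keys, treating a trailing [N] on a segment as a list index.
    for part in path.split("."):
        m = re.fullmatch(r"(\w+)\[(\d+)\]", part)
        if m:
            obj = obj[m.group(1)][int(m.group(2))]
        else:
            obj = obj[part]
    return obj

deployment = {
    "spec": {"template": {"spec": {"containers": [
        {"env": [{"name": "PLATFORM", "value": "FR"}]}
    ]}}}
}
env = search("spec.template.spec.containers[0].env", deployment)
print(env[0]["name"])  # PLATFORM
```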
diff --git a/chart/values.schema.json b/chart/values.schema.json
index 7881c82..f1d8271 100644
--- a/chart/values.schema.json
+++ b/chart/values.schema.json
@@ -343,6 +343,50 @@
             "description": "Secrets for all airflow containers.",
             "type": "array"
         },
+        "extraEnv": {
          "description": "Extra env 'items' that will be added to the definition of Airflow containers; a string is expected (can be templated).",
+          "type": ["null", "string"]
+        },
+        "extraEnvFrom": {
          "description": "Extra envFrom 'items' that will be added to the definition of Airflow containers; a string is expected (can be templated).",
+          "type": ["null", "string"]
+        },
+        "extraSecrets": {
+          "description": "Extra secrets that will be managed by the chart.",
+          "type": "object",
+          "additionalProperties": {
+            "description": "Name of the secret (can be templated).",
+            "type": "object",
+            "minProperties": 1,
+            "additionalProperties": false,
+            "properties": {
+              "data": {
+                "description": "Content **as string** for the 'data' item of the secret (can be templated)",
+                "type": "string"
+              },
+              "stringData": {
+                "description": "Content **as string** for the 'stringData' item of the secret (can be templated)",
+                "type": "string"
+              }
+            }
+          }
+        },
+        "extraConfigMaps": {
+          "description": "Extra configMaps that will be managed by the chart.",
+          "type": "object",
+          "additionalProperties": {
+            "description": "Name of the configMap (can be templated).",
+            "type": "object",
+            "minProperties": 1,
+            "additionalProperties": false,
+            "properties": {
+              "data": {
                "description": "Content **as string** for the 'data' item of the configMap (can be templated)",
+                "type": "string"
+              }
+            }
+          }
+        },
         "data": {
             "description": "Airflow database configuration.",
             "type": "object",
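The schema above restricts each `extraSecrets` entry to an object with only string-valued `data` and/or `stringData` properties (and at least one property). A rough stdlib validator capturing the same constraints, for illustration — the chart itself relies on JSON Schema for this:

```python
def validate_extra_secrets(extra_secrets):
    # Mirror of the JSON Schema fragment: object of objects, each with at
    # least one property, only data/stringData allowed, values plain strings.
    if not isinstance(extra_secrets, dict):
        return False
    for name, content in extra_secrets.items():
        if not isinstance(content, dict) or not content:
            return False
        if set(content) - {"data", "stringData"}:
            return False
        if not all(isinstance(v, str) for v in content.values()):
            return False
    return True

ok = validate_extra_secrets({"rel-secrets": {"stringData": "KEY: value\n"}})
bad = validate_extra_secrets({"rel-secrets": {"labels": {}}})
print(ok, bad)  # True False
```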
diff --git a/chart/values.yaml b/chart/values.yaml
index 0f5b313..091a0c9 100644
--- a/chart/values.yaml
+++ b/chart/values.yaml
@@ -163,6 +163,57 @@ secret: []
 #   secretName: ""
 #   secretKey: ""
 
+# Extra secrets that will be managed by the chart
+# (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values).
+# The format is "key/value" where
+#    * key (can be templated) is the name of the secret that will be created
+#    * value: an object with the standard 'data' or 'stringData' key (or both).
+#          The value associated with those keys must be a string (can be templated)
+extraSecrets: {}
+# eg:
+# extraSecrets:
+#   {{ .Release.Name }}-airflow-connections:
+#     data: |
+#       AIRFLOW_CONN_GCP: 'base64_encoded_gcp_conn_string'
+#       AIRFLOW_CONN_AWS: 'base64_encoded_aws_conn_string'
+#     stringData: |
+#       AIRFLOW_CONN_OTHER: 'other_conn'
+#   {{ .Release.Name }}-other-secret-name-suffix:
+#     data: |
+#        ...
+
+# Extra ConfigMaps that will be managed by the chart
+# (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values).
+# The format is "key/value" where
+#    * key (can be templated) is the name of the configmap that will be created
+#    * value: an object with the standard 'data' key.
+#          The value associated with this key must be a string (can be templated)
+extraConfigMaps: {}
+# eg:
+# extraConfigMaps:
+#   {{ .Release.Name }}-airflow-variables:
+#     data: |
+#       AIRFLOW_VAR_HELLO_MESSAGE: "Hi!"
+#       AIRFLOW_VAR_KUBERNETES_NAMESPACE: "{{ .Release.Namespace }}"
+
+# Extra env 'items' that will be added to the definition of airflow containers
+# A string is expected (can be templated).
+extraEnv: ~
+# eg:
+# extraEnv: |
+#   - name: PLATFORM
+#     value: FR
+
+# Extra envFrom 'items' that will be added to the definition of airflow containers
+# A string is expected (can be templated).
+extraEnvFrom: ~
+# eg:
+# extraEnvFrom: |
+#   - secretRef:
+#       name: '{{ .Release.Name }}-airflow-connections'
+#   - configMapRef:
+#       name: '{{ .Release.Name }}-airflow-variables'
+
 # Airflow database config
 data:
   # If secret names are provided, use those secrets
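As the commented examples note, values under an `extraSecrets` `data` block must already be base64-encoded (Helm's `b64enc` does this inside templated values). A hypothetical stdlib helper, not part of the chart, for preparing such values outside of templates:

```python
from base64 import b64decode, b64encode

def to_secret_data(plain):
    # Hypothetical helper: base64-encode each value so it can be pasted
    # into an extraSecrets 'data' block in values.yaml.
    return {k: b64encode(v.encode("utf-8")).decode("ascii") for k, v in plain.items()}

data = to_secret_data({"AIRFLOW_CONN_GCP": "gcp_conn_string"})
print(b64decode(data["AIRFLOW_CONN_GCP"]).decode("utf-8"))  # gcp_conn_string
```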


[airflow] 15/18: Add 1.10.13 to CI, Breeze and Docs (#12652)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 659779fbf0341be0481752eefb7f52fe45096791
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Nov 27 13:35:28 2020 +0000

    Add 1.10.13 to CI, Breeze and Docs (#12652)
    
    (cherry picked from commit 9a74ee5fff6922543b7a3086969ca578d05c7417)
---
 BREEZE.rst                     |  8 +++----
 IMAGES.rst                     | 18 +++++++--------
 breeze-complete                |  1 +
 docs/installation.rst          | 10 ++++-----
 docs/production-deployment.rst | 50 +++++++++++++++++++++---------------------
 5 files changed, 44 insertions(+), 43 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index 2c4bffbe..f91b598 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1187,8 +1187,8 @@ This is the current syntax for  `./breeze <./breeze>`_:
           If specified, installs Airflow directly from PIP released version. This happens at
           image building time in production image and at container entering time for CI image. One of:
 
-                 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3 1.10.2
-                 wheel
+                 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3
+                 1.10.2 wheel
 
   -t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE
           If specified, installs Airflow directly from reference in GitHub. This happens at
@@ -2098,8 +2098,8 @@ This is the current syntax for  `./breeze <./breeze>`_:
           If specified, installs Airflow directly from PIP released version. This happens at
           image building time in production image and at container entering time for CI image. One of:
 
-                 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3 1.10.2
-                 wheel
+                 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3
+                 1.10.2 wheel
 
   -t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE
           If specified, installs Airflow directly from reference in GitHub. This happens at
diff --git a/IMAGES.rst b/IMAGES.rst
index 13ce935..724d73c 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -39,7 +39,7 @@ The images are named as follows:
 
 where:
 
-* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.12``
+* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.13``
   The ``master`` and ``v1-10-test`` labels are built from branches so they change over time. The ``1.10.*`` and in
  the future ``2.*`` labels are built from git tags and they are "fixed" once built.
 * ``PYTHON_MAJOR_MINOR_VERSION`` - version of python used to build the image. Examples: ``3.5``, ``3.7``
@@ -115,15 +115,15 @@ parameter to Breeze:
 .. code-block:: bash
 
   ./breeze build-image --python 3.7 --additional-extras=presto \
-      --production-image --install-airflow-version=1.10.12
+      --production-image --install-airflow-version=1.10.13
 
 This will build the image using command similar to:
 
 .. code-block:: bash
 
     pip install \
-      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.12 \
-      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.6.txt"
+      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.13 \
+      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt"
 
 You can also build production images from specific Git version via providing ``--install-airflow-reference``
 parameter to Breeze (this time constraints are taken from the ``constraints-master`` branch which is the
@@ -210,8 +210,8 @@ For example:
   apache/airflow:master-python3.6                - production "latest" image from current master
   apache/airflow:master-python3.6-ci             - CI "latest" image from current master
   apache/airflow:v1-10-test-python2.7-ci         - CI "latest" image from current v1-10-test branch
-  apache/airflow:1.10.12-python3.6               - production image for 1.10.12 release
-  apache/airflow:1.10.12-1-python3.6             - production image for 1.10.12 with some patches applied
+  apache/airflow:1.10.13-python3.6               - production image for 1.10.13 release
+  apache/airflow:1.10.13-1-python3.6             - production image for 1.10.13 with some patches applied
 
 
 You can see DockerHub images at `<https://hub.docker.com/repository/docker/apache/airflow>`_
@@ -292,7 +292,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -308,7 +308,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image -f Dockerfile.ci \
-      --production-image  --python 3.7 --install-airflow-version=1.10.12 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 You can build the default production image with the standard ``docker build`` command but they will only build
@@ -326,7 +326,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
diff --git a/breeze-complete b/breeze-complete
index 94854ad..505c6bc 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -49,6 +49,7 @@ _breeze_allowed_test_types="All Core Integration Heisentests Postgres MySQL Helm
 }
 
 _breeze_allowed_install_airflow_versions=$(cat <<-EOF
+1.10.13
 1.10.12
 1.10.11
 1.10.10
diff --git a/docs/installation.rst b/docs/installation.rst
index de1985c..12ce19e 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -31,7 +31,7 @@ if needed. This means that from time to time plain ``pip install apache-airflow`
 produce unusable Airflow installation.
 
 In order to have repeatable installation, however, starting from **Airflow 1.10.10** and updated in
-**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the
+**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
 ``constraints-master`` and ``constraints-1-10`` orphan branches.
 Those "known-to-be-working" constraints are per major/minor python version. You can use them as constraint
 files when installing Airflow from PyPI. Note that you have to specify correct Airflow version
@@ -47,22 +47,22 @@ and python versions in the URL.
       sudo apt-get install build-essential
 
 
-1. Installing just airflow
+1. Installing just Airflow
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.12
+    AIRFLOW_VERSION=1.10.13
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     # For example: 3.6
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.6.txt
+    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
 2. Installing with extras (for example postgres, google)
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.12
+    AIRFLOW_VERSION=1.10.13
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
     pip install "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
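The shell snippets above derive the constraint URL from the Airflow version and the major.minor Python version. The same construction in Python, for illustration (the URL pattern is taken directly from the docs above):

```python
# Mirrors the shell snippet: build the constraints URL from the Airflow
# version and the major.minor Python version.
AIRFLOW_VERSION = "1.10.13"
PYTHON_VERSION = "3.6"  # the docs derive this from `python --version`
CONSTRAINT_URL = (
    "https://raw.githubusercontent.com/apache/airflow/"
    "constraints-{}/constraints-{}.txt".format(AIRFLOW_VERSION, PYTHON_VERSION)
)
print(CONSTRAINT_URL)
# https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt
```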
diff --git a/docs/production-deployment.rst b/docs/production-deployment.rst
index 4bfaabb..3edddb8 100644
--- a/docs/production-deployment.rst
+++ b/docs/production-deployment.rst
@@ -64,7 +64,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.12
+  FROM apache/airflow:1.10.13
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -81,7 +81,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.12
+  FROM apache/airflow:1.10.13
   RUN pip install --no-cache-dir --user my-awesome-pip-dependency-to-add
 
 
@@ -92,7 +92,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.12
+  FROM apache/airflow:1.10.13
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -125,7 +125,7 @@ in the `<#production-image-build-arguments>`_ chapter below.
 
 Here are just a few examples, which should give you a general understanding of what you can customize.
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.10 Pypi package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -134,7 +134,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -150,7 +150,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image  --python 3.7 --install-airflow-version=1.10.12 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 
@@ -166,7 +166,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -225,7 +225,7 @@ Preparing the constraint files and wheel files:
 
   pip download --dest docker-context-files \
     --constraint docker-context-files/constraints-1-10.txt  \
-    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.12
+    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.13
 
 
 Building the image (after copying the files downloaded to the "docker-context-files" directory):
@@ -233,7 +233,7 @@ Building the image (after copying the files downloaded to the "docker-context-fi
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image --python 3.7 --install-airflow-version=1.10.12 \
+      --production-image --python 3.7 --install-airflow-version=1.10.13 \
       --disable-mysql-client-installation --disable-pip-cache --add-local-pip-wheels \
       --constraints-location="/docker-context-files/constraints-1-10.txt"
 
@@ -245,7 +245,7 @@ or
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -392,7 +392,7 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | ``constraints-master`` but can be        |
 |                                          |                                          | ``constraints-1-10`` for 1.10.* versions |
 |                                          |                                          | or it could point to specific version    |
-|                                          |                                          | for example ``constraints-1.10.12``      |
+|                                          |                                          | for example ``constraints-1.10.13``      |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_EXTRAS``                       | (see Dockerfile)                         | Default extras with which airflow is     |
 |                                          |                                          | installed                                |
@@ -503,7 +503,7 @@ production image. There are three types of build:
 | ``AIRFLOW_INSTALL_VERSION``       | Optional - might be used for      |
 |                                   | package installation case to      |
 |                                   | set Airflow version for example   |
-|                                   | "==1.10.12"                       |
+|                                   | "==1.10.13"                       |
 +-----------------------------------+-----------------------------------+
 | ``AIRFLOW_CONSTRAINTS_REFERENCE`` | reference (branch or tag) from    |
 |                                   | GitHub where constraints file     |
@@ -512,7 +512,7 @@ production image. There are three types of build:
 |                                   | ``constraints-1-10`` for 1.10.*   |
 |                                   | constraint or if you want to      |
 |                                   | point to specific version         |
-|                                   | might be ``constraints-1.10.12``  |
+|                                   | might be ``constraints-1.10.13``  |
 +-----------------------------------+-----------------------------------+
 | ``SLUGIFY_USES_TEXT_UNIDECODE``   | In case of installing airflow     |
 |                                   | 1.10.2 or 1.10.1 you need to      |
@@ -546,7 +546,7 @@ of 2.0 currently):
 
   docker build .
 
-This builds the production image in version 3.7 with default extras from 1.10.12 tag and
+This builds the production image in version 3.7 with default extras from 1.10.13 tag and
 constraints taken from the constraints-1-10 branch in GitHub.
 
 .. code-block:: bash
@@ -554,14 +554,14 @@ constraints taken from constraints-1-10-12 branch in GitHub.
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
-    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.12.tar.gz#egg=apache-airflow" \
+    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.13.tar.gz#egg=apache-airflow" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with default extras from 1.10.12 Pypi package and
-constraints taken from 1.10.12 tag in GitHub and pre-installed pip dependencies from the top
+This builds the production image in version 3.7 with default extras from 1.10.13 PyPI package and
+constraints taken from 1.10.13 tag in GitHub and pre-installed pip dependencies from the top
 of v1-10-test branch.
 
 .. code-block:: bash
@@ -570,14 +570,14 @@ of v1-10-test branch.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.12" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.12 Pypi package and
-additional python dependencies and pre-installed pip dependencies from 1.10.12 tagged constraints.
+This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
+additional python dependencies and pre-installed pip dependencies from 1.10.13 tagged constraints.
 
 .. code-block:: bash
 
@@ -585,15 +585,15 @@ additional python dependencies and pre-installed pip dependencies from 1.10.12 t
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.12" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
     --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql,hdfs"
     --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel oauth2client"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.12 Pypi package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -602,7 +602,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \


[airflow] 07/18: Fix broken CI.yml (#12454)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c6021cbffe296a2e561341d9c47b4d388d6f679a
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Wed Nov 18 17:34:00 2020 +0100

    Fix broken CI.yml (#12454)
    
    The PR #12417 broke CI.yaml accidentally. This PR fixes it.
    
    (cherry picked from commit 93b327051605fb9cd9bebf77802090482b246013)
---
 .github/workflows/ci.yml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 5aadfd0..5931135 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -638,6 +638,7 @@ jobs:
         with:
           path: ".build/.kubernetes_venv*"
           key: "venv-${{ env.cache-name }}-${{ github.job }}-${{ hashFiles('setup.py') }}\
+-${{ hashFiles('setup.cfg') }}\
 -${{ needs.build-info.outputs.defaultPythonVersion }}"
       - name: "Cache bin folder with tools for kubernetes testing"
         uses: actions/cache@v2
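
The effect of the one-line fix above, adding ``hashFiles('setup.cfg')`` to the cache key, is that the cached virtualenv is invalidated when either file changes. A rough Python sketch of that behaviour (the ``venv-`` prefix mirrors the workflow, but the hashing helper and the temporary file paths are hypothetical stand-ins, not what GitHub Actions runs):

```python
import hashlib
import tempfile
from pathlib import Path

def cache_key(*paths: Path) -> str:
    # Hash all file contents together, roughly what hashFiles('a', 'b') does.
    digest = hashlib.sha256()
    for path in paths:
        digest.update(path.read_bytes())
    return "venv-" + digest.hexdigest()

workdir = Path(tempfile.mkdtemp())
setup_py = workdir / "setup.py"
setup_cfg = workdir / "setup.cfg"
setup_py.write_text("# dependencies live here\n")
setup_cfg.write_text("[metadata]\n")

key_before = cache_key(setup_py, setup_cfg)
setup_cfg.write_text("[metadata]\nname = apache-airflow\n")  # edit setup.cfg only
key_after = cache_key(setup_py, setup_cfg)
assert key_before != key_after  # the venv cache is rebuilt after the change
```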


[airflow] 02/18: Fix typo in check_environment.sh (#12395)


commit 55d54d8f3b557372d837da46e0150d3771a95394
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Tue Nov 17 12:04:03 2020 +0000

    Fix typo in check_environment.sh (#12395)
    
    `Databsae` -> `Database`
    
    (cherry picked from commit 3e994abc1cbac318f70f9319d364c1ed5a8074f9)
---
 scripts/in_container/check_environment.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/in_container/check_environment.sh b/scripts/in_container/check_environment.sh
index 84b3d48..7052628 100755
--- a/scripts/in_container/check_environment.sh
+++ b/scripts/in_container/check_environment.sh
@@ -98,7 +98,7 @@ function resetdb_if_requested() {
             airflow db reset -y
         fi
         echo
-        echo "Databsae has been reset"
+        echo "Database has been reset"
         echo
     fi
     return $?


[airflow] 06/18: Cope with '%' in password when waiting for migrations (#12440)


commit e58cfa016c0525069598e4084f372077ffb1ea56
Author: highfly22 <hi...@gmail.com>
AuthorDate: Wed Nov 18 21:48:08 2020 +0800

    Cope with '%' in password when waiting for migrations (#12440)
    
    (cherry picked from commit d4c3d32ae5f7c4915d7aac31cb75bb720c246538)
---
 chart/templates/_helpers.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index df7b158..98efc9f 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -380,7 +380,7 @@ server_tls_key_file = /etc/pgbouncer/server.key
         directory = os.path.join(package_dir, 'migrations')
         config = Config(os.path.join(package_dir, 'alembic.ini'))
         config.set_main_option('script_location', directory)
-        config.set_main_option('sqlalchemy.url', settings.SQL_ALCHEMY_CONN)
+        config.set_main_option('sqlalchemy.url', settings.SQL_ALCHEMY_CONN.replace('%', '%%'))
         script_ = ScriptDirectory.from_config(config)
 
         timeout=60
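
The ``.replace('%', '%%')`` above is needed because alembic's ``Config`` is backed by a ConfigParser-style ini file, where a bare ``%`` is interpolation syntax. A small self-contained sketch of the failure and the fix (the connection string is made up):

```python
import configparser

# A password containing '%' (e.g. from URL-encoding) in a SQLAlchemy URL.
conn = "postgresql://user:p%40ssword@host/db"

cp = configparser.ConfigParser()
cp.add_section("alembic")

try:
    # A bare '%' is interpolation syntax to ConfigParser and is rejected.
    cp.set("alembic", "sqlalchemy.url", conn)
    raised = False
except ValueError:
    raised = True
assert raised

# Doubling each '%' escapes it; reading the option back restores the URL.
cp.set("alembic", "sqlalchemy.url", conn.replace("%", "%%"))
assert cp.get("alembic", "sqlalchemy.url") == conn
```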


[airflow] 05/18: The messages about remote image check are only shown with -v (#12402)


commit b5a8ca9f7ac44a4ff4c201d9e2f60fcfb59581f2
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Nov 17 20:32:00 2020 +0100

    The messages about remote image check are only shown with -v (#12402)
    
    The messages might be confusing and should only be shown when
    verbose is turned on.
    
    (cherry picked from commit dc31ca4dc6986397b619bf21ae8628fd03cba58d)
---
 scripts/ci/libraries/_build_images.sh | 24 ++++++++++++------------
 1 file changed, 12 insertions(+), 12 deletions(-)

diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index a8d9521..fb756ba 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -225,9 +225,9 @@ function build_images::get_local_build_cache_hash() {
     # Remove the container just in case
     docker rm --force "local-airflow-ci-container" 2>/dev/null >/dev/null
     if ! docker create --name "local-airflow-ci-container" "${AIRFLOW_CI_IMAGE}" 2>/dev/null; then
-        >&2 echo
-        >&2 echo "Local airflow CI image not available"
-        >&2 echo
+        verbosity::print_info
+        verbosity::print_info "Local airflow CI image not available"
+        verbosity::print_info
         LOCAL_MANIFEST_IMAGE_UNAVAILABLE="true"
         export LOCAL_MANIFEST_IMAGE_UNAVAILABLE
         touch "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}"
@@ -237,9 +237,9 @@ function build_images::get_local_build_cache_hash() {
         "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}" 2> /dev/null \
         || touch "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}"
     set -e
-    echo
-    echo "Local build cache hash: '$(cat "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}")'"
-    echo
+    verbosity::print_info
+    verbosity::print_info "Local build cache hash: '$(cat "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}")'"
+    verbosity::print_info
 }
 
 # Retrieves information about the build cache hash random file from the remote image.
@@ -257,9 +257,9 @@ function build_images::get_remote_image_build_cache_hash() {
     set +e
     # Pull remote manifest image
     if ! docker pull "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}" 2>/dev/null >/dev/null; then
-        >&2 echo
-        >&2 echo "Remote docker registry unreachable"
-        >&2 echo
+        verbosity::print_info
+        verbosity::print_info "Remote docker registry unreachable"
+        verbosity::print_info
         REMOTE_DOCKER_REGISTRY_UNREACHABLE="true"
         export REMOTE_DOCKER_REGISTRY_UNREACHABLE
         touch "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}"
@@ -274,9 +274,9 @@ function build_images::get_remote_image_build_cache_hash() {
         "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}"
     docker rm --force "$(cat "${REMOTE_IMAGE_CONTAINER_ID_FILE}")"
     rm -f "${REMOTE_IMAGE_CONTAINER_ID_FILE}"
-    echo
-    echo "Remote build cache hash: '$(cat "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}")'"
-    echo
+    verbosity::print_info
+    verbosity::print_info "Remote build cache hash: '$(cat "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}")'"
+    verbosity::print_info
 }
 
 # Compares layers from both remote and local image and set FORCE_PULL_IMAGES to true in case


[airflow] 11/18: Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)


commit cf3babaa498179c2374cd2529cc7b82431dd09fe
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Nov 25 07:43:47 2020 +0000

    Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
    
    Previously, even though this was passed during docker build it was
    ignored. This commit fixes it
    
    (cherry picked from commit c457c975b885469f09ef2e4c8d1f5836798bc820)
---
 Dockerfile    | 2 +-
 Dockerfile.ci | 9 ++++-----
 2 files changed, 5 insertions(+), 6 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 00442bc..9b96cfa 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -176,7 +176,7 @@ RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
        fi; \
        pip install --user \
           "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-          --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+          --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}" \
           && pip uninstall --yes apache-airflow; \
     fi
 
diff --git a/Dockerfile.ci b/Dockerfile.ci
index ac51a56..cac73bb 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -245,8 +245,8 @@ ENV AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS}${ADDITIONAL_AIRFLOW_EXTRAS:+,}${ADDITIONAL_
 RUN echo "Installing with extras: ${AIRFLOW_EXTRAS}."
 
 ARG AIRFLOW_CONSTRAINTS_REFERENCE="constraints-master"
-ARG AIRFLOW_CONSTRAINTS_URL="https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt"
-ENV AIRFLOW_CONSTRAINTS_URL=${AIRFLOW_CONSTRAINTS_URL}
+ARG AIRFLOW_CONSTRAINTS_LOCATION="https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt"
+ENV AIRFLOW_CONSTRAINTS_LOCATION=${AIRFLOW_CONSTRAINTS_LOCATION}
 
 # By changing the CI build epoch we can force reinstalling Airflow from the current master
 # It can also be overwritten manually by setting the AIRFLOW_CI_BUILD_EPOCH environment variable.
@@ -269,11 +269,10 @@ ENV INSTALL_AIRFLOW_VIA_PIP=${INSTALL_AIRFLOW_VIA_PIP}
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
         pip install \
             "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-                --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+                --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}" \
                 && pip uninstall --yes apache-airflow; \
     fi
 
-
 # Generate random hex dump file so that we can determine whether it's faster to rebuild the image
 # using current cache (when our dump is the same as the remote one) or better to pull
 # the new image (when it is different)
@@ -341,7 +340,7 @@ COPY scripts/in_container/entrypoint_ci.sh /entrypoint
 RUN chmod a+x /entrypoint
 
 # We can copy everything here. The Context is filtered by dockerignore. This makes sure we are not
-# copying over stuff that is accidentally generated or that we do not need (such as .egginfo)
+# copying over stuff that is accidentally generated or that we do not need (such as egg-info)
 # if you want to add something that is missing and you expect to see it in the image you can
 # add it with ! in .dockerignore next to the airflow, test etc. directories there
 COPY . ${AIRFLOW_SOURCES}/
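
The precedence this commit establishes, where an explicitly passed ``AIRFLOW_CONSTRAINTS_LOCATION`` wins and otherwise the location is derived from ``AIRFLOW_CONSTRAINTS_REFERENCE`` and the Python version, can be sketched as follows (the helper name is illustrative, not part of the Dockerfile):

```python
from typing import Optional

def constraints_location(reference: str, python_version: str,
                         explicit: Optional[str] = None) -> str:
    # An explicitly passed location (e.g. a file baked into the build
    # context) takes precedence over the GitHub-derived default.
    if explicit:
        return explicit
    return ("https://raw.githubusercontent.com/apache/airflow/"
            f"{reference}/constraints-{python_version}.txt")

# Default: derived from the constraints reference, as in the Dockerfile.
print(constraints_location("constraints-1-10", "3.7"))
# Override: the passed location is used verbatim.
print(constraints_location("constraints-1-10", "3.7",
                           "/docker-context-files/constraints-1-10.txt"))
```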