Posted to commits@airflow.apache.org by as...@apache.org on 2020/12/03 10:40:13 UTC

[airflow] branch v1-10-test updated (f53975a -> a4edcf9)

This is an automated email from the ASF dual-hosted git repository.

ash pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


    omit f53975a  fixup! Add metric for scheduling delay between first run task & expected start time (#9544)
    omit 5c663d5  fixup! Improve verification of images with PIP check (#12718)
    omit 6f7ee10  Update setup.py to get non-conflicting set of dependencies (#12636)
    omit 002a956  Update documentation about PIP 20.3 incompatibility
    omit 047a898  Pins PIP to 20.2.4 in our Dockerfiles (#12738)
    omit e2c6591  Add Changelog for 1.10.14
    omit 4da3617  Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
    omit 3596996  Bump Airflow Version to 1.10.14
    omit 23b22e1  [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738)
    omit 28d18c4  [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729)
    omit c336874  [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY
    omit 591dc99  Improve verification of images with PIP check (#12718)
    omit 1e53b25  Clarified information about supported Databases.
    omit 28ffe6e  Add metric for scheduling delay between first run task & expected start time (#9544)
    omit 601c75c  Fix empty asctime field in JSON formatted logs (#10515)
    omit bbae43c  Rename `[scheduler] max_threads` to `[scheduler] parsing_processes` (#12605)
    omit d20f756  Update setup.py to get non-conflicting set of dependencies (#12636)
    omit b96ce83  Setup.cfg change triggers full build (#12684)
    omit 6716489  Remove "@" references from constraints generation (#12671)
    omit 445e332  Add 1.10.13 to CI, Breeze and Docs (#12652)
    omit dd62d0a  Allows mounting local sources for github run-id images (#12650)
    omit 38d5161  Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)
    omit c7ec034  Adds possibility of forcing upgrade constraint by setting a label (#12635)
    omit 8c764ac  Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
    omit 1bf6966  Adds missing licence headers (#12593)
    omit 5137af3  Fixes unneeded docker-context-files added in CI (#12534)
    omit 99d6806  Fix wait-for-migrations command in helm chart (#12522)
    omit 272386f  Fix broken CI.yml (#12454)
    omit c0ee0dc  Cope with '%' in password when waiting for migrations (#12440)
    omit 9c49572  The messages about remote image check are only shown with -v (#12402)
    omit 528d3f2  Switching to Ubuntu 20.04 as Github Actions runner. (#12404)
    omit bfb1053  Remove CodeQL from PRs. (#12406)
    omit 6e17c16  Fix typo in check_environment.sh (#12395)
    omit 235ccdd  Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
    omit b6fe105  Typo Fix: Deprecated config force_log_out_after was not used (#12661)
    omit 2daf685  Fix issue with empty Resources in executor_config (#12633)
     add 41edd25  Add back mistakenly removed scheduler command (#12779)
     new eb6ae74  Fix issue with empty Resources in executor_config (#12633)
     new 7356ae1  Typo Fix: Deprecated config force_log_out_after was not used (#12661)
     new daa725b  Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
     new 3b50f4f  Fix typo in check_environment.sh (#12395)
     new 130ddd9  Remove CodeQL from PRs. (#12406)
     new c12c2f0  Switching to Ubuntu 20.04 as Github Actions runner. (#12404)
     new ff0f26a  The messages about remote image check are only shown with -v (#12402)
     new 011f948  Cope with '%' in password when waiting for migrations (#12440)
     new 7701f51  Fix broken CI.yml (#12454)
     new 951001a  Fix wait-for-migrations command in helm chart (#12522)
     new ff02efe  Fixes unneeded docker-context-files added in CI (#12534)
     new 43784ce  Adds missing licence headers (#12593)
     new 416b125  Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
     new a4a825b  Adds possibility of forcing upgrade constraint by setting a label (#12635)
     new 5bd7613  Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)
     new 2d0b41a  Allows mounting local sources for github run-id images (#12650)
     new 0f18653  Add 1.10.13 to CI, Breeze and Docs (#12652)
     new 71c1e2e  Remove "@" references from constraints generation (#12671)
     new 74e3c56  Setup.cfg change triggers full build (#12684)
     new 3f43846  Update setup.py to get non-conflicting set of dependencies (#12636)
     new 82fd4f9  Rename `[scheduler] max_threads` to `[scheduler] parsing_processes` (#12605)
     new 38a2219  Fix empty asctime field in JSON formatted logs (#10515)
     new 700bb07  Add metric for scheduling delay between first run task & expected start time (#9544)
     new 21e0202  Clarified information about supported Databases.
     new e3406e3  Improve verification of images with PIP check (#12718)
     new 2f3b1c7  [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY
     new a8900fa  [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729)
     new 6b06584  [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738)
     new 06a4606  Bump Airflow Version to 1.10.14
     new 152175c  Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
     new b8b9c0e  Add Changelog for 1.10.14
     new edcd18e  Pins PIP to 20.2.4 in our Dockerfiles (#12738)
     new 8998a55  Update documentation about PIP 20.3 incompatibility
     new a4edcf9  Update setup.py to get non-conflicting set of dependencies (#12636)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (f53975a)
            \
             N -- N -- N   refs/heads/v1-10-test (a4edcf9)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 34 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
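
As an aside, the "omit" revisions above (f53975a and friends) remain
reachable through other references until those go away. A minimal,
illustrative Python sketch (assuming a local clone that still has both
tips fetched; not part of any Airflow tooling) of how to list the
revisions unique to each side of the force-push:

    import subprocess

    OLD_TIP = "f53975a"  # tip of the branch before the force-push
    NEW_TIP = "a4edcf9"  # tip of refs/heads/v1-10-test after the force-push

    def commits_only_in(tip, other):
        # "git log other..tip" lists commits reachable from tip but not from other
        out = subprocess.run(
            ["git", "log", "--oneline", "{}..{}".format(other, tip)],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.splitlines()

    print("Omitted (O) revisions:", len(commits_only_in(OLD_TIP, NEW_TIP)))
    print("New (N) revisions:", len(commits_only_in(NEW_TIP, OLD_TIP)))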


Summary of changes:
 airflow/bin/cli.py | 6 ++++++
 1 file changed, 6 insertions(+)


[airflow] 28/34: [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6b065840323f9a4fc8e372b458d26e419e4fa99b
Author: Xiaodong <xd...@hotmail.com>
AuthorDate: Wed Aug 15 03:08:48 2018 +0800

    [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738)
    
    The Flask SECRET_KEY should be as random as possible.
    
    On the other hand, we cannot generate a random value when
    we launch the webserver (the secret_key would be
    inconsistent across the workers).
    
    We can generate a random one in the configuration file
    airflow.cfg, just like how we deal with FERNET_KEY.
    
    The SECRET_KEY is generated using os.urandom, as
    recommended by the Flask community.
    
    (cherry picked from commit f7602f8266559e55bc602a9639e3e1ab640f30e8)
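
For illustration, a minimal standalone sketch of the generation approach this
commit uses (mirroring the configuration.py change in the diff below):

    from base64 import b64encode
    import os

    # 16 bytes from the OS CSPRNG, base64-encoded so the value can be written
    # into airflow.cfg as a plain string (the same approach used for FERNET_KEY)
    SECRET_KEY = b64encode(os.urandom(16)).decode('utf-8')
    print(SECRET_KEY)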
---
 airflow/config_templates/config.yml          | 5 ++---
 airflow/config_templates/default_airflow.cfg | 5 ++---
 airflow/configuration.py                     | 3 +++
 airflow/www/app.py                           | 7 +------
 airflow/www_rbac/app.py                      | 6 +-----
 5 files changed, 9 insertions(+), 17 deletions(-)

diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index 7f0f714..4040131 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -737,12 +737,11 @@
     - name: secret_key
       description: |
         Secret key used to run your flask app
-        If default value is given ("temporary_key"), a random secret_key will be generated
-        when you launch your webserver for security reason
+        It should be as random as possible
       version_added: ~
       type: string
       example: ~
-      default: "temporary_key"
+      default: "{SECRET_KEY}"
     - name: workers
       description: |
         Number of workers to run the Gunicorn web server
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index 765b1ce..0b70db8 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -362,9 +362,8 @@ worker_refresh_interval = 30
 reload_on_plugin_change = False
 
 # Secret key used to run your flask app
-# If default value is given ("temporary_key"), a random secret_key will be generated
-# when you launch your webserver for security reason
-secret_key = temporary_key
+# It should be as random as possible
+secret_key = {SECRET_KEY}
 
 # Number of workers to run the Gunicorn web server
 workers = 4
diff --git a/airflow/configuration.py b/airflow/configuration.py
index 16081a3..8c33de4 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -22,6 +22,7 @@ from __future__ import division
 from __future__ import print_function
 from __future__ import unicode_literals
 
+from base64 import b64encode
 from builtins import str
 from collections import OrderedDict
 import copy
@@ -706,6 +707,8 @@ if not os.path.isfile(TEST_CONFIG_FILE) or not os.path.isfile(AIRFLOW_CONFIG):
 else:
     FERNET_KEY = ''
 
+SECRET_KEY = b64encode(os.urandom(16)).decode('utf-8')
+
 TEMPLATE_START = (
     '# ----------------------- TEMPLATE BEGINS HERE -----------------------')
 if not os.path.isfile(TEST_CONFIG_FILE):
diff --git a/airflow/www/app.py b/airflow/www/app.py
index 2d463a2..ccf7939 100644
--- a/airflow/www/app.py
+++ b/airflow/www/app.py
@@ -61,16 +61,11 @@ def create_app(config=None, testing=False):
             x_port=conf.getint("webserver", "PROXY_FIX_X_PORT", fallback=1),
             x_prefix=conf.getint("webserver", "PROXY_FIX_X_PREFIX", fallback=1)
         )
-    app.secret_key = conf.get('webserver', 'SECRET_KEY')
     app.config['PERMANENT_SESSION_LIFETIME'] = datetime.timedelta(minutes=settings.get_session_lifetime_config())
     app.config['LOGIN_DISABLED'] = not conf.getboolean(
         'webserver', 'AUTHENTICATE')
 
-    if configuration.conf.get('webserver', 'SECRET_KEY') == "temporary_key":
-        log.info("SECRET_KEY for Flask App is not specified. Using a random one.")
-        app.secret_key = os.urandom(16)
-    else:
-        app.secret_key = configuration.conf.get('webserver', 'SECRET_KEY')
+    app.secret_key = conf.get('webserver', 'SECRET_KEY')
 
     app.config['SESSION_COOKIE_HTTPONLY'] = True
     app.config['SESSION_COOKIE_SECURE'] = conf.getboolean('webserver', 'COOKIE_SECURE')
diff --git a/airflow/www_rbac/app.py b/airflow/www_rbac/app.py
index 2e653a2..d4a4f03 100644
--- a/airflow/www_rbac/app.py
+++ b/airflow/www_rbac/app.py
@@ -61,13 +61,9 @@ def create_app(config=None, session=None, testing=False, app_name="Airflow"):
             x_port=conf.getint("webserver", "PROXY_FIX_X_PORT", fallback=1),
             x_prefix=conf.getint("webserver", "PROXY_FIX_X_PREFIX", fallback=1)
         )
-    app.secret_key = conf.get('webserver', 'SECRET_KEY')
     app.config['PERMANENT_SESSION_LIFETIME'] = timedelta(minutes=settings.get_session_lifetime_config())
 
-    if conf.get('webserver', 'SECRET_KEY') == "temporary_key":
-        app.secret_key = os.urandom(16)
-    else:
-        app.secret_key = conf.get('webserver', 'SECRET_KEY')
+    app.secret_key = conf.get('webserver', 'SECRET_KEY')
 
     app.config.from_pyfile(settings.WEBSERVER_CONFIG, silent=True)
     app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False


[airflow] 27/34: [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a8900fa5f2b8963e9f57ba4ae5520a5d339aeaad
Author: Xiaodong <xd...@hotmail.com>
AuthorDate: Fri Aug 10 18:30:41 2018 +0800

    [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729)
    
    The same issue was fixed for /www previously in
    PR https://github.com/apache/incubator-airflow/pull/3651
    (JIRA ticket 2809)
    
    (cherry picked from commit fe6d00a54f83468e296777d3b83b65a2ae7169ec)
---
 airflow/config_templates/config.yml          | 3 ++-
 airflow/config_templates/default_airflow.cfg | 3 ++-
 airflow/www_rbac/app.py                      | 6 ++++++
 3 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index 87ee928..7f0f714 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -737,7 +737,8 @@
     - name: secret_key
       description: |
         Secret key used to run your flask app
-        It should be as random as possible
+        If default value is given ("temporary_key"), a random secret_key will be generated
+        when you launch your webserver for security reason
       version_added: ~
       type: string
       example: ~
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index 662fd00..765b1ce 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -362,7 +362,8 @@ worker_refresh_interval = 30
 reload_on_plugin_change = False
 
 # Secret key used to run your flask app
-# It should be as random as possible
+# If default value is given ("temporary_key"), a random secret_key will be generated
+# when you launch your webserver for security reason
 secret_key = temporary_key
 
 # Number of workers to run the Gunicorn web server
diff --git a/airflow/www_rbac/app.py b/airflow/www_rbac/app.py
index a2ebf7b..2e653a2 100644
--- a/airflow/www_rbac/app.py
+++ b/airflow/www_rbac/app.py
@@ -19,6 +19,7 @@
 #
 import logging
 import socket
+import os
 from datetime import timedelta
 from typing import Any
 
@@ -63,6 +64,11 @@ def create_app(config=None, session=None, testing=False, app_name="Airflow"):
     app.secret_key = conf.get('webserver', 'SECRET_KEY')
     app.config['PERMANENT_SESSION_LIFETIME'] = timedelta(minutes=settings.get_session_lifetime_config())
 
+    if conf.get('webserver', 'SECRET_KEY') == "temporary_key":
+        app.secret_key = os.urandom(16)
+    else:
+        app.secret_key = conf.get('webserver', 'SECRET_KEY')
+
     app.config.from_pyfile(settings.WEBSERVER_CONFIG, silent=True)
     app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
     app.config['APP_NAME'] = app_name


[airflow] 23/34: Add metric for scheduling delay between first run task & expected start time (#9544)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 700bb07631fa8d03b45bdcb74ab28d354ad0987f
Author: Ace Haidrey <ah...@pandora.com>
AuthorDate: Fri Nov 13 14:03:42 2020 -0800

    Add metric for scheduling delay between first run task & expected start time (#9544)
    
    Co-authored-by: Ace Haidrey <ah...@pinterest.com>
    (cherry picked from commit aac3877ec374e5f376d8f95b50031c10625216a4)
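
In essence, the new helper emits the first task's start time minus the
expected run start time as a timing metric. A minimal standalone sketch of
the computation (plain datetimes stand in for
dag.following_schedule(execution_date) and the task instance start_date;
illustrative only):

    from datetime import datetime, timedelta

    # Hypothetical values: when the run was expected to start, and when the
    # first task instance of the finished, scheduler-triggered run started.
    expected_start = datetime(2020, 11, 13, 0, 0, 0)
    first_task_start = expected_start + timedelta(seconds=42)

    true_delay = (first_task_start - expected_start).total_seconds()
    if true_delay >= 0:
        # emitted as Stats.timing('dagrun.<dag_id>.first_task_scheduling_delay', true_delay)
        print("first_task_scheduling_delay:", true_delay)  # -> 42.0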
---
 airflow/models/dagrun.py    | 36 ++++++++++++++++++++++++++++++
 docs/metrics.rst            | 23 ++++++++++---------
 tests/models/test_dagrun.py | 54 ++++++++++++++++++++++++++++++++++++++++++---
 3 files changed, 99 insertions(+), 14 deletions(-)

diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index 9775c9f..8ba7f4e 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -320,6 +320,7 @@ class DagRun(Base, LoggingMixin):
         else:
             self.set_state(State.RUNNING)
 
+        self._emit_true_scheduling_delay_stats_for_finished_state(finished_tasks)
         self._emit_duration_stats_for_finished_state()
 
         # todo: determine we want to use with_for_update to make sure to lock the run
@@ -356,6 +357,41 @@ class DagRun(Base, LoggingMixin):
                     session=session):
                 return True
 
+    def _emit_true_scheduling_delay_stats_for_finished_state(self, finished_tis):
+        """
+        This is a helper method to emit the true scheduling delay stats, which is defined as
+        the time when the first task in the DAG starts minus the expected DAG run datetime.
+        This method will be used in the update_state method when the state of the DagRun
+        is updated to a completed status (either success or failure). The method will find the first
+        started task within the DAG, calculate the expected DagRun start time (based on
+        dag.execution_date & dag.schedule_interval), and subtract the two values to get the delay.
+        The emitted data may contain outliers (e.g. when the first task was cleared, so
+        the second task's start_date will be used), but we can get rid of the outliers
+        on the stats side through the dashboard tooling built.
+        Note, the stat will only be emitted if the DagRun is a scheduler-triggered one
+        (i.e. external_trigger is False).
+        """
+        try:
+            if self.state == State.RUNNING:
+                return
+            if self.external_trigger:
+                return
+            if not finished_tis:
+                return
+            dag = self.get_dag()
+            ordered_tis_by_start_date = [ti for ti in finished_tis if ti.start_date]
+            ordered_tis_by_start_date.sort(key=lambda ti: ti.start_date, reverse=False)
+            first_start_date = ordered_tis_by_start_date[0].start_date
+            if first_start_date:
+                # dag.following_schedule calculates the expected start datetime for a scheduled dagrun
+                # i.e. a daily flow for execution date 1/1/20 actually runs on 1/2/20 hh:mm:ss,
+                # and ti.start_date will be 1/2/20 hh:mm:ss, so the following schedule is the comparison point
+                true_delay = (first_start_date - dag.following_schedule(self.execution_date)).total_seconds()
+                if true_delay >= 0:
+                    Stats.timing('dagrun.{}.first_task_scheduling_delay'.format(dag.dag_id), true_delay)
+        except Exception as e:
+            self.log.warning('Failed to record first_task_scheduling_delay metric: %s', e)
+
     def _emit_duration_stats_for_finished_state(self):
         if self.state == State.RUNNING:
             return
diff --git a/docs/metrics.rst b/docs/metrics.rst
index afbd7c9..7f7c92d 100644
--- a/docs/metrics.rst
+++ b/docs/metrics.rst
@@ -90,14 +90,15 @@ Name                                                Description
 Timers
 ------
 
-=========================================== =================================================
-Name                                        Description
-=========================================== =================================================
-``dagrun.dependency-check.<dag_id>``        Milliseconds taken to check DAG dependencies
-``dag.<dag_id>.<task_id>.duration``         Milliseconds taken to finish a task
-``dag_processing.last_duration.<dag_file>`` Milliseconds taken to load the given DAG file
-``dagrun.duration.success.<dag_id>``        Milliseconds taken for a DagRun to reach success state
-``dagrun.duration.failed.<dag_id>``         Milliseconds taken for a DagRun to reach failed state
-``dagrun.schedule_delay.<dag_id>``          Milliseconds of delay between the scheduled DagRun
-                                            start date and the actual DagRun start date
-=========================================== =================================================
+================================================= =======================================================================
+Name                                              Description
+================================================= =======================================================================
+``dagrun.dependency-check.<dag_id>``              Milliseconds taken to check DAG dependencies
+``dag.<dag_id>.<task_id>.duration``               Milliseconds taken to finish a task
+``dag_processing.last_duration.<dag_file>``       Milliseconds taken to load the given DAG file
+``dagrun.duration.success.<dag_id>``              Milliseconds taken for a DagRun to reach success state
+``dagrun.duration.failed.<dag_id>``               Milliseconds taken for a DagRun to reach failed state
+``dagrun.schedule_delay.<dag_id>``                Milliseconds of delay between the scheduled DagRun
+                                                  start date and the actual DagRun start date
+``dagrun.<dag_id>.first_task_scheduling_delay``   Seconds elapsed between first task start_date and dagrun expected start
+================================================= =======================================================================
diff --git a/tests/models/test_dagrun.py b/tests/models/test_dagrun.py
index 6dcf49e..1f627c5 100644
--- a/tests/models/test_dagrun.py
+++ b/tests/models/test_dagrun.py
@@ -16,7 +16,6 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-
 import datetime
 import unittest
 
@@ -24,14 +23,16 @@ from parameterized import parameterized
 
 from airflow import settings, models
 from airflow.jobs import BackfillJob
-from airflow.models import DAG, DagRun, clear_task_instances
+from airflow.models import DAG, DagRun, clear_task_instances, DagModel
 from airflow.models import TaskInstance as TI
 from airflow.operators.dummy_operator import DummyOperator
 from airflow.operators.python_operator import ShortCircuitOperator
+from airflow.settings import Stats
 from airflow.utils import timezone
+from airflow.utils.dates import days_ago
 from airflow.utils.state import State
 from airflow.utils.trigger_rule import TriggerRule
-from tests.compat import mock
+from tests.compat import mock, call
 from tests.models import DEFAULT_DATE
 
 
@@ -608,3 +609,50 @@ class DagRunTest(unittest.TestCase):
         dagrun.verify_integrity()
         task = dagrun.get_task_instances()[0]
         assert task.queue == 'queue1'
+
+    @mock.patch.object(Stats, 'timing')
+    def test_no_scheduling_delay_for_nonscheduled_runs(self, stats_mock):
+        """
+        Tests that dag scheduling delay stat is not called if the dagrun is not a scheduled run.
+        This case is manual run. Simple test for sanity check.
+        """
+        dag = DAG(dag_id='test_dagrun_stats', start_date=days_ago(1))
+        dag_task = DummyOperator(task_id='dummy', dag=dag)
+
+        initial_task_states = {
+            dag_task.task_id: State.SUCCESS,
+        }
+
+        dag_run = self.create_dag_run(dag=dag, state=State.RUNNING, task_states=initial_task_states)
+        dag_run.update_state()
+        self.assertNotIn(call('dagrun.{}.first_task_scheduling_delay'.format(dag.dag_id)),
+                         stats_mock.mock_calls)
+
+    @mock.patch.object(Stats, 'timing')
+    def test_emit_scheduling_delay(self, stats_mock):
+        """
+        Tests that dag scheduling delay stat is set properly once running scheduled dag.
+        dag_run.update_state() invokes the _emit_true_scheduling_delay_stats_for_finished_state method.
+        """
+        dag = DAG(dag_id='test_emit_dag_stats', start_date=days_ago(1))
+        dag_task = DummyOperator(task_id='dummy', dag=dag, owner='airflow')
+
+        session = settings.Session()
+        orm_dag = DagModel(dag_id=dag.dag_id, is_active=True)
+        session.add(orm_dag)
+        session.flush()
+        dag_run = dag.create_dagrun(
+            run_id="test",
+            state=State.SUCCESS,
+            execution_date=dag.start_date,
+            start_date=dag.start_date,
+            session=session,
+        )
+        ti = dag_run.get_task_instance(dag_task.task_id)
+        ti.set_state(State.SUCCESS, session)
+        session.commit()
+        session.close()
+        dag_run.update_state()
+        true_delay = (ti.start_date - dag.following_schedule(dag_run.execution_date)).total_seconds()
+        sched_delay_stat_call = call('dagrun.{}.first_task_scheduling_delay'.format(dag.dag_id), true_delay)
+        self.assertIn(sched_delay_stat_call, stats_mock.mock_calls)


[airflow] 06/34: Switching to Ubuntu 20.04 as Github Actions runner. (#12404)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c12c2f0d385bac3608e9113ae12b092e96d92004
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Nov 17 18:49:27 2020 +0100

    Switching to Ubuntu 20.04 as Github Actions runner. (#12404)
    
    Ubuntu 20.04 will soon become the default runner for GitHub Actions.
    
    See: https://github.com/actions/virtual-environments/issues/1816
    
    This PR tests whether everything works fine on it.
    
    (cherry picked from commit c38dadb526f7104df7a1a6feda72ce1b65557bd9)
---
 .github/workflows/build-images-workflow-run.yml    | 10 +++----
 .github/workflows/ci.yml                           | 34 +++++++++++-----------
 .github/workflows/codeql-analysis.yml              |  4 +--
 .github/workflows/delete_old_artifacts.yml         |  2 +-
 .github/workflows/label_when_reviewed.yml          |  2 +-
 .../workflows/label_when_reviewed_workflow_run.yml |  2 +-
 .github/workflows/scheduled_quarantined.yml        |  4 +--
 7 files changed, 29 insertions(+), 29 deletions(-)

diff --git a/.github/workflows/build-images-workflow-run.yml b/.github/workflows/build-images-workflow-run.yml
index af71710..9726c5a 100644
--- a/.github/workflows/build-images-workflow-run.yml
+++ b/.github/workflows/build-images-workflow-run.yml
@@ -44,7 +44,7 @@ jobs:
   cancel-workflow-runs:
     timeout-minutes: 10
     name: "Cancel workflow runs"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       sourceHeadRepo: ${{ steps.source-run-info.outputs.sourceHeadRepo }}
       sourceHeadBranch: ${{ steps.source-run-info.outputs.sourceHeadBranch }}
@@ -192,7 +192,7 @@ jobs:
       Source Sha: ${{ needs.cancel-workflow-runs.outputs.sourceHeadSha }}
       Merge commit Sha: ${{ needs.cancel-workflow-runs.outputs.mergeCommitSha }}
       Target commit Sha: ${{ needs.cancel-workflow-runs.outputs.targetCommitSha }}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [cancel-workflow-runs]
     env:
       GITHUB_CONTEXT: ${{ toJson(github) }}
@@ -257,7 +257,7 @@ jobs:
   build-images:
     timeout-minutes: 80
     name: "Build ${{matrix.image-type}} images ${{matrix.python-version}}"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, cancel-workflow-runs]
     strategy:
       matrix:
@@ -383,7 +383,7 @@ jobs:
 
   cancel-on-build-cancel:
     name: "Cancel 'CI Build' jobs on build image cancelling."
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     if: cancelled()
     needs: [build-images]
     steps:
@@ -398,7 +398,7 @@ jobs:
 
   cancel-on-build-failure:
     name: "Cancel 'CI Build' jobs on build image failing."
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     if: failure()
     needs: [build-images]
     steps:
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index dad697f..5aadfd0 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -64,7 +64,7 @@ jobs:
 
   build-info:
     name: "Build info"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     env:
       GITHUB_CONTEXT: ${{ toJson(github) }}
     outputs:
@@ -145,7 +145,7 @@ jobs:
   ci-images:
     timeout-minutes: 120
     name: "Wait for CI images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info]
     if: needs.build-info.outputs.image-build == 'true'
     env:
@@ -179,7 +179,7 @@ jobs:
   static-checks:
     timeout-minutes: 30
     name: "Static checks"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     env:
       MOUNT_LOCAL_SOURCES: "true"
@@ -214,7 +214,7 @@ jobs:
   static-checks-basic-checks-only:
     timeout-minutes: 30
     name: "Static checks: basic checks only"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info]
     env:
       SKIP: "build,mypy,flake8,pylint,bats-in-container-tests"
@@ -250,7 +250,7 @@ jobs:
   docs:
     timeout-minutes: 30
     name: "Build docs"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     if: needs.build-info.outputs.docs-build == 'true'
     steps:
@@ -270,7 +270,7 @@ jobs:
   tests-helm:
     timeout-minutes: 20
     name: "Python unit tests for helm chart"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     env:
       MOUNT_LOCAL_SOURCES: "true"
@@ -318,7 +318,7 @@ jobs:
     name: >
       Postgres${{matrix.postgres-version}},Py${{matrix.python-version}}:
       ${{needs.build-info.outputs.testTypes}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     strategy:
       matrix:
@@ -370,7 +370,7 @@ jobs:
     timeout-minutes: 80
     name: >
       MySQL${{matrix.mysql-version}}, Py${{matrix.python-version}}: ${{needs.build-info.outputs.testTypes}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     strategy:
       matrix:
@@ -421,7 +421,7 @@ jobs:
     timeout-minutes: 60
     name: >
       Sqlite Py${{matrix.python-version}}: ${{needs.build-info.outputs.testTypes}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
     strategy:
       matrix:
@@ -469,7 +469,7 @@ jobs:
   tests-quarantined:
     timeout-minutes: 60
     name: "Quarantined tests"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     continue-on-error: true
     needs: [build-info, ci-images]
     strategy:
@@ -541,7 +541,7 @@ jobs:
   upload-coverage:
     timeout-minutes: 5
     name: "Upload coverage"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     continue-on-error: true
     needs:
       - tests-kubernetes
@@ -564,7 +564,7 @@ jobs:
   prod-images:
     timeout-minutes: 120
     name: "Wait for PROD images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info]
     env:
       BACKEND: sqlite
@@ -594,7 +594,7 @@ jobs:
   tests-kubernetes:
     timeout-minutes: 50
     name: K8s ${{matrix.python-version}} ${{matrix.kubernetes-version}} ${{matrix.kubernetes-mode}}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [build-info, prod-images]
     strategy:
       matrix:
@@ -669,7 +669,7 @@ jobs:
   push-prod-images-to-github-registry:
     timeout-minutes: 10
     name: "Push PROD images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs:
       - build-info
       - static-checks
@@ -705,7 +705,7 @@ jobs:
   push-ci-images-to-github-registry:
     timeout-minutes: 10
     name: "Push CI images"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs:
       - build-info
       - static-checks
@@ -741,7 +741,7 @@ jobs:
   constraints:
     timeout-minutes: 10
     name: "Constraints"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     strategy:
       matrix:
         python-version: ${{ fromJson(needs.build-info.outputs.pythonVersions) }}
@@ -774,7 +774,7 @@ jobs:
   constraints-push:
     timeout-minutes: 10
     name: "Constraints push"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs:
       - build-info
       - constraints
diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index e0178bf..2bf92b7 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -27,7 +27,7 @@ on:  # yamllint disable-line rule:truthy
 jobs:
   selective-checks:
     name: Selective checks
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       needs-python-scans: ${{ steps.selective-checks.outputs.needs-python-scans }}
       needs-javascript-scans: ${{ steps.selective-checks.outputs.needs-javascript-scans }}
@@ -52,7 +52,7 @@ jobs:
 
   analyze:
     name: Analyze
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     needs: [selective-checks]
     strategy:
       fail-fast: false
diff --git a/.github/workflows/delete_old_artifacts.yml b/.github/workflows/delete_old_artifacts.yml
index 8b35711..98329d5 100644
--- a/.github/workflows/delete_old_artifacts.yml
+++ b/.github/workflows/delete_old_artifacts.yml
@@ -23,7 +23,7 @@ on:  # yamllint disable-line rule:truthy
 
 jobs:
   delete-artifacts:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     steps:
       - uses: kolpav/purge-artifacts-action@04c636a505f26ebc82f8d070b202fb87ff572b10  # v1.0
         with:
diff --git a/.github/workflows/label_when_reviewed.yml b/.github/workflows/label_when_reviewed.yml
index 62d7cc6..5095953 100644
--- a/.github/workflows/label_when_reviewed.yml
+++ b/.github/workflows/label_when_reviewed.yml
@@ -23,7 +23,7 @@ jobs:
 
   label-when-reviewed:
     name: "Label PRs when reviewed"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     steps:
       - name: "Do nothing. Only trigger corresponding workflow_run event"
         run: echo
diff --git a/.github/workflows/label_when_reviewed_workflow_run.yml b/.github/workflows/label_when_reviewed_workflow_run.yml
index f943609..6e45038 100644
--- a/.github/workflows/label_when_reviewed_workflow_run.yml
+++ b/.github/workflows/label_when_reviewed_workflow_run.yml
@@ -25,7 +25,7 @@ jobs:
 
   label-when-reviewed:
     name: "Label PRs when reviewed workflow run"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       labelSet: ${{ steps.label-when-reviewed.outputs.labelSet }}
     steps:
diff --git a/.github/workflows/scheduled_quarantined.yml b/.github/workflows/scheduled_quarantined.yml
index 552edfb..cf29c38 100644
--- a/.github/workflows/scheduled_quarantined.yml
+++ b/.github/workflows/scheduled_quarantined.yml
@@ -48,7 +48,7 @@ jobs:
   trigger-tests:
     timeout-minutes: 5
     name: "Checks if tests should be run"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     outputs:
       run-tests: ${{ steps.trigger-tests.outputs.run-tests }}
     steps:
@@ -60,7 +60,7 @@ jobs:
   tests-quarantined:
     timeout-minutes: 80
     name: "Quarantined tests"
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-20.04
     continue-on-error: true
     needs: [trigger-tests]
     strategy:


[airflow] 20/34: Update setup.py to get non-conflicting set of dependencies (#12636)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3f438461498b2f6c13671fed8f70a6a12a51f418
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Nov 29 19:45:58 2020 +0100

    Update setup.py to get non-conflicting set of dependencies (#12636)
    
    This change upgrades setup.py and setup.cfg to provide a non-conflicting,
    `pip check`-valid set of constraints for the CI image.
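
A quick way to reproduce the check that the image verification relies on
(pip check exits non-zero when installed distributions have unmet or
conflicting requirements; this wrapper is illustrative only):

    import subprocess

    # "pip check" verifies that every installed distribution's declared
    # requirements are satisfied; conflicts are printed one per line.
    proc = subprocess.run(["pip", "check"], capture_output=True, text=True)
    print(proc.stdout.strip() or "No broken requirements found.")
    print("exit code:", proc.returncode)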
---
 BREEZE.rst                                       |  3 ++
 CI.rst                                           |  2 +-
 CONTRIBUTING.rst                                 |  2 +-
 breeze                                           |  2 +
 scripts/ci/images/ci_prepare_ci_image_on_ci.sh   |  1 -
 scripts/ci/images/ci_wait_for_all_ci_images.sh   | 36 ++--------------
 scripts/ci/images/ci_wait_for_all_prod_images.sh | 38 ++---------------
 scripts/ci/images/ci_wait_for_ci_image.sh        | 52 ++++++++++++++++++++++++
 scripts/ci/images/ci_wait_for_prod_image.sh      | 52 ++++++++++++++++++++++++
 scripts/ci/libraries/_build_images.sh            | 43 ++++++++++----------
 scripts/ci/libraries/_push_pull_remove_images.sh | 44 +++++++++++++++++---
 scripts/ci/selective_ci_checks.sh                |  6 +--
 setup.py                                         | 44 ++++++++++++++------
 13 files changed, 210 insertions(+), 115 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index f91b598..095fe1b 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1355,6 +1355,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
           If you use this flag, automatically --github-registry is enabled.
 
+
           Default: latest.
 
   -v, --verbose
@@ -1508,6 +1509,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
           If you use this flag, automatically --github-registry is enabled.
 
+
           Default: latest.
 
   -v, --verbose
@@ -2276,6 +2278,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
           If you use this flag, automatically --github-registry is enabled.
 
+
           Default: latest.
 
   ****************************************************************************************************
diff --git a/CI.rst b/CI.rst
index f4b5294..fac9f0f 100644
--- a/CI.rst
+++ b/CI.rst
@@ -253,7 +253,7 @@ You can use those variables when you try to reproduce the build locally.
 |                                                        Image build variables                                                       |
 +-----------------------------------------+-------------+-------------+------------+-------------------------------------------------+
 | ``UPGRADE_TO_LATEST_CONSTRAINTS``       |    false    |    false    |    false   | Determines whether the build should             |
-|                                         |             |             |     (x)    | attempt to eagerly upgrade all                  |
+|                                         |             |             |     (x)    | attempt to upgrade all                          |
 |                                         |             |             |            | PIP dependencies to latest ones matching        |
 |                                         |             |             |            | ``setup.py`` limits. This tries to replicate    |
 |                                         |             |             |            | the situation of "fresh" user who just installs |
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 61883e7..0c1c9c1 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -321,7 +321,7 @@ Step 4: Prepare PR
        the "full tests needed" label is set for your PR. Additional check is set that prevents from
        accidental merging of the request until full matrix of tests succeeds for the PR.
 
-     * when your change has "upgrade to latest dependencies" label set, constraints will be automatically
+     * when your change has "upgrade to newer dependencies" label set, constraints will be automatically
        upgraded to latest constraints matching your setup.py. This is useful in case you want to force
        upgrade to a latest version of dependencies. You can ask committers to set the label for you
        when you need it in your PR.
diff --git a/breeze b/breeze
index fe8f038..6f73ad7 100755
--- a/breeze
+++ b/breeze
@@ -1078,6 +1078,7 @@ function breeze::parse_arguments() {
             echo
             echo "Force pulling the image, using github registry and skip mounting local sources."
             echo "This is in order to get the exact same version as used in CI environment for SHA/RUN_ID!."
+            echo "You can specify --skip-mounting-local-sources to not mount local sources. "
             echo
             export FORCE_PULL_IMAGES="true"
             export USE_GITHUB_REGISTRY="true"
@@ -2385,6 +2386,7 @@ function breeze::flag_pull_push_docker_images() {
 
         If you use this flag, automatically --github-registry is enabled.
 
+
         Default: ${_breeze_default_github_image_id:=}.
 
 "
diff --git a/scripts/ci/images/ci_prepare_ci_image_on_ci.sh b/scripts/ci/images/ci_prepare_ci_image_on_ci.sh
index e2637c3..4d2a1ce 100755
--- a/scripts/ci/images/ci_prepare_ci_image_on_ci.sh
+++ b/scripts/ci/images/ci_prepare_ci_image_on_ci.sh
@@ -59,5 +59,4 @@ function build_ci_image_on_ci() {
     export CHECK_IMAGE_FOR_REBUILD="false"
 }
 
-
 build_ci_image_on_ci
diff --git a/scripts/ci/images/ci_wait_for_all_ci_images.sh b/scripts/ci/images/ci_wait_for_all_ci_images.sh
index edb6b29..2451a88 100755
--- a/scripts/ci/images/ci_wait_for_all_ci_images.sh
+++ b/scripts/ci/images/ci_wait_for_all_ci_images.sh
@@ -15,42 +15,12 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-export AIRFLOW_SOURCES="${AIRFLOW_SOURCES:=$( cd "$( dirname "${BASH_SOURCE[0]}" )/../../.." && pwd )}"
 echo
-echo "Airflow sources: ${AIRFLOW_SOURCES}"
+echo "Waiting for all CI images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
 echo
 
-if [[ ${USE_GITHUB_REGISTRY} != "true" ||  ${GITHUB_REGISTRY_WAIT_FOR_IMAGE} != "true" ]]; then
-    echo
-    echo "This script should not be called"
-    echo "It need both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE to true!"
-    echo
-    echo "USE_GITHUB_REGISTRY = ${USE_GITHUB_REGISTRY}"
-    echo "GITHUB_REGISTRY_WAIT_FOR_IMAGE =${GITHUB_REGISTRY_WAIT_FOR_IMAGE}"
-    echo
-    exit 1
-fi
-
-echo
-echo "Waiting for all images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
-echo
-
-echo
-echo "Check if jq is installed"
-echo
-command -v jq >/dev/null || (echo "ERROR! You must have 'jq' tool installed!" && exit 1)
-
-echo
-echo "The jq version $(jq --version)"
-echo
-
-# shellcheck source=scripts/ci/libraries/_all_libs.sh
-source "${AIRFLOW_SOURCES}/scripts/ci/libraries/_all_libs.sh"
-
-initialization::initialize_common_environment
-
 for PYTHON_MAJOR_MINOR_VERSION in ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}
 do
-    export AIRFLOW_CI_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
-    push_pull_remove_images::wait_for_github_registry_image "${AIRFLOW_CI_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+    export PYTHON_MAJOR_MINOR_VERSION
+    "$( dirname "${BASH_SOURCE[0]}" )/ci_wait_for_ci_image.sh"
 done
diff --git a/scripts/ci/images/ci_wait_for_all_prod_images.sh b/scripts/ci/images/ci_wait_for_all_prod_images.sh
index 66196c3..25bfd7c 100755
--- a/scripts/ci/images/ci_wait_for_all_prod_images.sh
+++ b/scripts/ci/images/ci_wait_for_all_prod_images.sh
@@ -15,44 +15,12 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-export AIRFLOW_SOURCES="${AIRFLOW_SOURCES:=$( cd "$( dirname "${BASH_SOURCE[0]}" )/../../.." && pwd )}"
 echo
-echo "Airflow sources: ${AIRFLOW_SOURCES}"
+echo "Waiting for all PROD images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
 echo
 
-if [[ ${USE_GITHUB_REGISTRY} != "true" ||  ${GITHUB_REGISTRY_WAIT_FOR_IMAGE} != "true" ]]; then
-    echo
-    echo "This script should not be called"
-    echo "It need both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE to true!"
-    echo
-    echo "USE_GITHUB_REGISTRY = ${USE_GITHUB_REGISTRY}"
-    echo "GITHUB_REGISTRY_WAIT_FOR_IMAGE =${GITHUB_REGISTRY_WAIT_FOR_IMAGE}"
-    echo
-    exit 1
-fi
-
-echo
-echo "Waiting for all images to appear: ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}"
-echo
-
-echo
-echo "Check if jq is installed"
-echo
-command -v jq >/dev/null || (echo "ERROR! You must have 'jq' tool installed!" && exit 1)
-
-echo
-echo "The jq version $(jq --version)"
-echo
-
-# shellcheck source=scripts/ci/libraries/_all_libs.sh
-source "${AIRFLOW_SOURCES}/scripts/ci/libraries/_all_libs.sh"
-
-initialization::initialize_common_environment
-
 for PYTHON_MAJOR_MINOR_VERSION in ${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS_AS_STRING}
 do
-    export AIRFLOW_PROD_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}"
-    export AIRFLOW_PROD_BUILD_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-build"
-    push_pull_remove_images::wait_for_github_registry_image "${AIRFLOW_PROD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-    push_pull_remove_images::wait_for_github_registry_image "${AIRFLOW_PROD_BUILD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+    export PYTHON_MAJOR_MINOR_VERSION
+    "$( dirname "${BASH_SOURCE[0]}" )/ci_wait_for_prod_image.sh"
 done
diff --git a/scripts/ci/images/ci_wait_for_ci_image.sh b/scripts/ci/images/ci_wait_for_ci_image.sh
new file mode 100755
index 0000000..2c0bdf2
--- /dev/null
+++ b/scripts/ci/images/ci_wait_for_ci_image.sh
@@ -0,0 +1,52 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+
+function verify_ci_image_dependencies {
+    echo
+    echo "Checking if Airflow dependencies are non-conflicting in CI image."
+    echo
+
+    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_CI_IMAGE}" \
+        "${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+    # TODO: remove after we have it fully working
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_CI_IMAGE}" -c 'pip check' || true
+}
+
+push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
+
+push_pull_remove_images::check_if_jq_installed
+
+build_image::login_to_github_registry_if_needed
+
+export AIRFLOW_CI_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
+
+echo
+echo "Waiting for image to appear: ${AIRFLOW_CI_IMAGE_NAME}"
+echo
+
+push_pull_remove_images::wait_for_github_registry_image \
+    "${AIRFLOW_CI_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+echo
+echo "Verifying the ${AIRFLOW_CI_IMAGE_NAME} image after pulling it"
+echo
+
+verify_ci_image_dependencies
diff --git a/scripts/ci/images/ci_wait_for_prod_image.sh b/scripts/ci/images/ci_wait_for_prod_image.sh
new file mode 100755
index 0000000..e53aec1
--- /dev/null
+++ b/scripts/ci/images/ci_wait_for_prod_image.sh
@@ -0,0 +1,52 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+
+function verify_prod_image_dependencies {
+    echo
+    echo "Checking if Airflow dependencies are non-conflicting in PROD image."
+    echo
+
+    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_PROD_IMAGE}" \
+        "${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+    # TODO: remove the || true once we have fixed pip check for the prod image
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c 'pip check' || true
+}
+
+push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
+
+push_pull_remove_images::check_if_jq_installed
+
+build_image::login_to_github_registry_if_needed
+
+export AIRFLOW_PROD_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}"
+
+echo
+echo "Waiting for image to appear: ${AIRFLOW_PROD_IMAGE_NAME}"
+echo
+
+push_pull_remove_images::wait_for_github_registry_image \
+    "${AIRFLOW_PROD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+echo
+echo "Verifying the ${AIRFLOW_PROD_IMAGE_NAME} image after pulling it"
+echo
+
+verify_prod_image_dependencies
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 5bd2d06..8de58db 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -346,13 +346,19 @@ function build_images::get_docker_image_names() {
 
     # File that is touched when the CI image is built for the first time locally
     export BUILT_CI_IMAGE_FLAG_FILE="${BUILD_CACHE_DIR}/${BRANCH_NAME}/.built_${PYTHON_MAJOR_MINOR_VERSION}"
+
+    # GitHub Registry names must be lowercase :(
+    github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
+    export GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}"
+    export GITHUB_REGISTRY_AIRFLOW_PROD_BUILD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}-build"
+    export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
+
+    export GITHUB_REGISTRY_AIRFLOW_CI_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_CI_BASE_TAG}"
+    export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
 }
 
-# Prepares all variables needed by the CI build. Depending on the configuration used (python version
-# DockerHub user etc. the variables are set so that other functions can use those variables.
-function build_images::prepare_ci_build() {
-    export AIRFLOW_CI_LOCAL_MANIFEST_IMAGE="local/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
-    export AIRFLOW_CI_REMOTE_MANIFEST_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
+# If GitHub Registry is used, login to the registry using GITHUB_USERNAME and GITHUB_TOKEN
+function build_image::login_to_github_registry_if_needed()  {
     if [[ ${USE_GITHUB_REGISTRY} == "true" ]]; then
         if [[ -n ${GITHUB_TOKEN=} ]]; then
             echo "${GITHUB_TOKEN}" | docker login \
@@ -360,11 +366,15 @@ function build_images::prepare_ci_build() {
                 --password-stdin \
                 "${GITHUB_REGISTRY}"
         fi
-        # GitHub Registry names must be lowercase :(
-        github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
-        export GITHUB_REGISTRY_AIRFLOW_CI_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_CI_BASE_TAG}"
-        export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
     fi
+
+}
+
+# Prepares all variables needed by the CI build. Depending on the configuration used (python version
+# DockerHub user etc. the variables are set so that other functions can use those variables.
+function build_images::prepare_ci_build() {
+    export AIRFLOW_CI_LOCAL_MANIFEST_IMAGE="local/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
+    export AIRFLOW_CI_REMOTE_MANIFEST_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
     export THE_IMAGE_TYPE="CI"
     export IMAGE_DESCRIPTION="Airflow CI"
 
@@ -375,6 +385,7 @@ function build_images::prepare_ci_build() {
     export AIRFLOW_IMAGE="${AIRFLOW_CI_IMAGE}"
     readonly AIRFLOW_IMAGE
 
+    build_image::login_to_github_registry_if_needed
     sanity_checks::go_to_airflow_sources
     permissions::fix_group_permissions
 }
@@ -662,19 +673,7 @@ function build_images::prepare_prod_build() {
     export AIRFLOW_IMAGE="${AIRFLOW_PROD_IMAGE}"
     readonly AIRFLOW_IMAGE
 
-    if [[ ${USE_GITHUB_REGISTRY="false"} == "true" ]]; then
-        if [[ -n ${GITHUB_TOKEN=} ]]; then
-            echo "${GITHUB_TOKEN}" | docker login \
-                --username "${GITHUB_USERNAME}" \
-                --password-stdin \
-                "${GITHUB_REGISTRY}"
-        fi
-        # GitHub Registry names must be lowercase :(
-        github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
-        export GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}"
-        export GITHUB_REGISTRY_AIRFLOW_PROD_BUILD_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/${AIRFLOW_PROD_BASE_TAG}-build"
-        export GITHUB_REGISTRY_PYTHON_BASE_IMAGE="${GITHUB_REGISTRY}/${github_repository_lowercase}/python:${PYTHON_BASE_IMAGE_VERSION}-slim-buster"
-    fi
+    build_image::login_to_github_registry_if_needed
 
     AIRFLOW_BRANCH_FOR_PYPI_PRELOADING="${BRANCH_NAME}"
     sanity_checks::go_to_airflow_sources
diff --git a/scripts/ci/libraries/_push_pull_remove_images.sh b/scripts/ci/libraries/_push_pull_remove_images.sh
index 7c65db1..216e025 100644
--- a/scripts/ci/libraries/_push_pull_remove_images.sh
+++ b/scripts/ci/libraries/_push_pull_remove_images.sh
@@ -264,13 +264,18 @@ function push_pull_remove_images::push_prod_images() {
 
 # waits for an image to be available in the github registry
 function push_pull_remove_images::wait_for_github_registry_image() {
+    local github_repository_lowercase
     github_repository_lowercase="$(echo "${GITHUB_REPOSITORY}" |tr '[:upper:]' '[:lower:]')"
-    GITHUB_API_ENDPOINT="https://${GITHUB_REGISTRY}/v2/${github_repository_lowercase}"
-    IMAGE_NAME="${1}"
-    IMAGE_TAG=${2}
-    echo "Waiting for ${IMAGE_NAME}:${IMAGE_TAG} image"
+    local github_api_endpoint
+    github_api_endpoint="https://${GITHUB_REGISTRY}/v2/${github_repository_lowercase}"
+    local image_name_in_github_registry="${1}"
+    local image_tag_in_github_registry=${2}
 
-    GITHUB_API_CALL="${GITHUB_API_ENDPOINT}/${IMAGE_NAME}/manifests/${IMAGE_TAG}"
+    echo
+    echo "Waiting for ${GITHUB_REPOSITORY}/${image_name_in_github_registry}:${image_tag_in_github_registry} image"
+    echo
+
+    GITHUB_API_CALL="${github_api_endpoint}/${image_name_in_github_registry}/manifests/${image_tag_in_github_registry}"
     while true; do
         curl -X GET "${GITHUB_API_CALL}" -u "${GITHUB_USERNAME}:${GITHUB_TOKEN}" 2>/dev/null > "${OUTPUT_LOG}"
         local digest
@@ -282,6 +287,33 @@ function push_pull_remove_images::wait_for_github_registry_image() {
         fi
         sleep 10
     done
-    verbosity::print_info "Found ${IMAGE_NAME}:${IMAGE_TAG} image"
+    verbosity::print_info "Found ${image_name_in_github_registry}:${image_tag_in_github_registry} image"
     verbosity::print_info "Digest: '${digest}'"
 }
+
+function push_pull_remove_images::check_if_github_registry_wait_for_image_enabled() {
+    if [[ ${USE_GITHUB_REGISTRY} != "true" ||  ${GITHUB_REGISTRY_WAIT_FOR_IMAGE} != "true" ]]; then
+        echo
+        echo "This script should not be called"
+        echo "It need both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE to true!"
+        echo
+        echo "USE_GITHUB_REGISTRY = ${USE_GITHUB_REGISTRY}"
+        echo "GITHUB_REGISTRY_WAIT_FOR_IMAGE =${GITHUB_REGISTRY_WAIT_FOR_IMAGE}"
+        echo
+        exit 1
+    else
+        echo
+        echo "Both USE_GITHUB_REGISTRY and GITHUB_REGISTRY_WAIT_FOR_IMAGE are set to true. Good!"
+    fi
+}
+
+function push_pull_remove_images::check_if_jq_installed() {
+    echo
+    echo "Check if jq is installed"
+    echo
+    command -v jq >/dev/null || (echo "ERROR! You must have 'jq' tool installed!" && exit 1)
+
+    echo
+    echo "The jq version $(jq --version)"
+    echo
+}
diff --git a/scripts/ci/selective_ci_checks.sh b/scripts/ci/selective_ci_checks.sh
index c87ec41..8696d56 100755
--- a/scripts/ci/selective_ci_checks.sh
+++ b/scripts/ci/selective_ci_checks.sh
@@ -42,14 +42,14 @@ else
     FULL_TESTS_NEEDED_LABEL="false"
 fi
 
-if [[ ${PR_LABELS=} == *"upgrade to latest dependencies"* ]]; then
+if [[ ${PR_LABELS=} == *"upgrade to newer dependencies"* ]]; then
     echo
-    echo "Found the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies''"
+    echo "Found the right PR labels in '${PR_LABELS=}': 'upgrade to newer dependencies''"
     echo
     UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="true"
 else
     echo
-    echo "Did not find the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies'"
+    echo "Did not find the right PR labels in '${PR_LABELS=}': 'upgrade to newer dependencies'"
     echo
     UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="false"
 fi
diff --git a/setup.py b/setup.py
index f5f2a53..83e8a98 100644
--- a/setup.py
+++ b/setup.py
@@ -182,11 +182,13 @@ atlas = [
     'atlasclient>=0.1.2',
 ]
 aws = [
-    'boto3~=1.10',
+    'boto3~=1.10,<1.11',  # required by snowflake
 ]
 azure_blob_storage = [
     'azure-storage>=0.34.0, <0.37.0',
-    'azure-storage-blob<12.0',
+    'azure-storage-blob<12.0.0;python_version<"3.6"',
+    'azure-storage-blob;python_version>="3.6"',
+    'azure-storage-common',
 ]
 azure_container_instances = [
     'azure-mgmt-containerinstance>=1.5.0,<2'
@@ -198,6 +200,7 @@ azure_data_lake = [
     'azure-datalake-store>=0.0.45',
     'azure-mgmt-datalake-store>=0.5.0',
     'azure-mgmt-resource>=2.2.0',
+    'cffi<1.14.0;python_version<"3.0"'
 ]
 azure_secrets = [
     'azure-identity>=1.3.1',
@@ -207,7 +210,8 @@ cassandra = [
     'cassandra-driver>=3.13.0,<3.21.0',
 ]
 celery = [
-    'celery~=4.3',
+    'celery~=4.3;python_version>="3.0"',
+    'celery==4.3.1;python_version<"3.0"',
     'flower>=0.7.3, <1.0',
     'kombu==4.6.3;python_version<"3.0"',
     'tornado>=4.2.0, <6.0',  # Dep of flower. Pin to a version that works on Py3.5.2
@@ -222,7 +226,8 @@ cloudant = [
 crypto = [
     # Cryptography 3.2 for python 2.7 is broken
     # https://github.com/pyca/cryptography/issues/5359#issuecomment-727622403
-    'cryptography>=0.9.3,<3.2; python_version<"3.0"',
+    # Snowflake requires <3.0
+    'cryptography>=0.9.3,<3.0; python_version<"3.0"',
     'cryptography>=0.9.3;python_version>="3.0"',
 ]
 dask = [
@@ -260,7 +265,8 @@ flask_oauth = [
     'requests-oauthlib==1.1.0',
 ]
 gcp = [
-    'PyOpenSSL',
+    'PyOpenSSL<20.0.0;python_version<"3.0"',
+    'PyOpenSSL;python_version>="3.0"',
     'google-api-python-client>=1.6.0, <2.0.0',
     'google-auth>=1.0.0, <2.0.0',
     'google-auth-httplib2>=0.0.1',
@@ -306,7 +312,8 @@ jira = [
 kerberos = [
     'pykerberos>=1.1.13',
     'requests_kerberos>=0.10.0',
-    'thrift_sasl>=0.2.0',
+    'thrift_sasl>=0.2.0,<0.4.1;python_version<"3.0"',
+    'thrift_sasl>=0.2.0;python_version>="3.0"',
 ]
 kubernetes = [
     'cryptography>=2.0.0',
@@ -336,7 +343,9 @@ papermill = [
     'papermill[all]>=1.0.0',
     'nteract-scrapbook[all]>=0.2.1',
     'pyarrow<1.0.0',
-    'fsspec<0.8.0;python_version=="3.5"'
+    'fsspec<0.8.0;python_version=="3.5"',
+    'black==20.8b0;python_version>="3.6"'  # we need to limit black version as we have click < 7
+
 ]
 password = [
     'bcrypt>=2.0.0',
@@ -355,7 +364,7 @@ qds = [
     'qds-sdk>=1.10.4',
 ]
 rabbitmq = [
-    'amqp',
+    'amqp<5.0.0',
 ]
 redis = [
     'redis~=3.2',
@@ -378,6 +387,7 @@ sentry = [
 ]
 slack = [
     'slackclient>=1.0.0,<2.0.0',
+    'websocket-client<0.55.0'
 ]
 snowflake = [
     'snowflake-connector-python>=1.5.2',
@@ -421,11 +431,14 @@ devel = [
     'click==6.7',
     'contextdecorator;python_version<"3.4"',
     'coverage',
+    'docutils>=0.14, <0.16',
+    'ecdsa<0.15',  # Required for moto 1.3.14
     'flake8>=3.6.0',
     'flake8-colors',
     'flaky',
     'freezegun',
     'gitpython',
+    'idna<2.9',  # Required for moto 1.3.14
     'importlib-metadata~=2.0; python_version<"3.8"',
     'ipdb',
     'jira',
@@ -436,14 +449,15 @@ devel = [
     'packaging',
     'parameterized',
     'paramiko',
+    'pipdeptree',
     'pre-commit',
+    'pyrsistent<=0.16.0;python_version<"3.0"',
+    'pyrsistent;python_version>="3.0"',
     'pysftp',
     'pytest<6.0.0',  # FIXME: pylint complaining for pytest.mark.* on v6.0
     'pytest-cov',
     'pytest-instafail',
-    'pytest-rerunfailures',
     'pytest-timeouts',
-    'pytest-xdist',
     'pywinrm',
     'qds-sdk>=1.9.6',
     'requests_mock',
@@ -590,6 +604,8 @@ INSTALL_REQUIREMENTS = [
     'colorlog==4.0.2',
     'configparser>=3.5.0, <3.6.0',
     'croniter>=0.3.17, <0.4',
+    'cryptography>=0.9.3,<3.0; python_version<"3.0"',  # required by snowflake
+    'cryptography>=0.9.3;python_version>="3.0"',
     'dill>=0.2.2, <0.4',
     'email-validator',
     'enum34~=1.1.6;python_version<"3.4"',
@@ -606,11 +622,12 @@ INSTALL_REQUIREMENTS = [
     'graphviz>=0.12',
     'gunicorn>=19.5.0, <21.0',
     'importlib-metadata~=2.0; python_version<"3.8"',
+    'importlib_resources~=1.4',
     'iso8601>=0.1.12',
     'jinja2>=2.10.1, <2.12.0',
     'json-merge-patch==0.2',
     'jsonschema~=3.0',
-    'lazy_object_proxy~=1.3',
+    'lazy_object_proxy<1.5.0',  # Required to keep pip-check happy with astroid
     'markdown>=2.5.2, <3.0',
     'marshmallow-sqlalchemy>=0.16.1, <0.24.0;python_version>="3.6"',
     'marshmallow-sqlalchemy>=0.16.1, <0.19.0;python_version<"3.6"',
@@ -624,14 +641,15 @@ INSTALL_REQUIREMENTS = [
     'python-dateutil>=2.3, <3',
     'python-nvd3~=0.15.0',
     'python-slugify>=3.0.0,<5.0',
-    'requests>=2.20.0, <3',
+    'requests>=2.20.0, <2.23.0;python_version<"3.0"',  # Required to keep snowflake happy
+    'requests>=2.20.0, <2.24.0;python_version>="3.0"',  # Required to keep snowflake happy
     'setproctitle>=1.1.8, <2',
     'sqlalchemy~=1.3',
     'sqlalchemy_jsonfield==0.8.0;python_version<"3.5"',
     'sqlalchemy_jsonfield~=0.9;python_version>="3.5"',
     'tabulate>=0.7.5, <0.9',
     'tenacity==4.12.0',
-    'thrift>=0.9.2',
+    'thrift>=0.11.0',
     'typing;python_version<"3.5"',
     'typing-extensions>=3.7.4;python_version<"3.8"',
     'tzlocal>=1.4,<2.0.0',

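For context on the change to _push_pull_remove_images.sh above: the wait loop polls the Docker
Registry HTTP API v2 manifest endpoint until the tag shows up. A standalone, hedged sketch of a
single iteration of that check (the registry host, image and tag values below are illustrative only):

    github_api_endpoint="https://docker.pkg.github.com/v2/apache/airflow"
    image_name="master-python3.6-ci"
    image_tag="latest"
    # A manifest GET returns HTTP 200 once the image has been pushed;
    # -w prints only the status code, -o discards the response body.
    status=$(curl -s -o /dev/null -w "%{http_code}" \
        -u "${GITHUB_USERNAME}:${GITHUB_TOKEN}" \
        "${github_api_endpoint}/${image_name}/manifests/${image_tag}")
    [[ ${status} == "200" ]] && echo "Image ${image_name}:${image_tag} is available"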

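For context on the pattern used throughout the setup.py change above: the ;python_version...
suffixes are PEP 508 environment markers, which pip evaluates against the interpreter that runs
the install, so one requirement list can carry both Python 2 and Python 3 pins. A minimal
illustration, reusing the celery pins from the diff (the interpreter names are just examples):

    # Under Python 3 the marker matches and a celery 4.x release is resolved:
    python3 -m pip install 'celery~=4.3;python_version>="3.0"'
    # Under Python 3 this second pin is skipped by pip ("markers ... don't
    # match your environment"); under Python 2 it installs exactly 4.3.1:
    python3 -m pip install 'celery==4.3.1;python_version<"3.0"'
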
[airflow] 17/34: Add 1.10.13 to CI, Breeze and Docs (#12652)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0f186538dcfd38dd69345e02e1fd48cdc39fce75
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Nov 27 13:35:28 2020 +0000

    Add 1.10.13 to CI, Breeze and Docs (#12652)
    
    (cherry picked from commit 9a74ee5fff6922543b7a3086969ca578d05c7417)
---
 BREEZE.rst                     |  8 +++----
 IMAGES.rst                     | 18 +++++++--------
 breeze-complete                |  1 +
 docs/installation.rst          | 10 ++++-----
 docs/production-deployment.rst | 50 +++++++++++++++++++++---------------------
 5 files changed, 44 insertions(+), 43 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index 2c4bffbe..f91b598 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1187,8 +1187,8 @@ This is the current syntax for  `./breeze <./breeze>`_:
           If specified, installs Airflow directly from PIP released version. This happens at
           image building time in production image and at container entering time for CI image. One of:
 
-                 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3 1.10.2
-                 wheel
+                 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3
+                 1.10.2 wheel
 
   -t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE
           If specified, installs Airflow directly from reference in GitHub. This happens at
@@ -2098,8 +2098,8 @@ This is the current syntax for  `./breeze <./breeze>`_:
           If specified, installs Airflow directly from PIP released version. This happens at
           image building time in production image and at container entering time for CI image. One of:
 
-                 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3 1.10.2
-                 wheel
+                 1.10.13 1.10.12 1.10.11 1.10.10 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3
+                 1.10.2 wheel
 
   -t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE
           If specified, installs Airflow directly from reference in GitHub. This happens at
diff --git a/IMAGES.rst b/IMAGES.rst
index 13ce935..724d73c 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -39,7 +39,7 @@ The images are named as follows:
 
 where:
 
-* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.12``
+* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.13``
   The ``master`` and ``v1-10-test`` labels are built from branches so they change over time. The ``1.10.*`` and in
  the future ``2.*`` labels are built from git tags and they are "fixed" once built.
 * ``PYTHON_MAJOR_MINOR_VERSION`` - version of python used to build the image. Examples: ``3.5``, ``3.7``
@@ -115,15 +115,15 @@ parameter to Breeze:
 .. code-block:: bash
 
   ./breeze build-image --python 3.7 --additional-extras=presto \
-      --production-image --install-airflow-version=1.10.12
+      --production-image --install-airflow-version=1.10.13
 
 This will build the image using command similar to:
 
 .. code-block:: bash
 
     pip install \
-      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.12 \
-      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.6.txt"
+      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.13 \
+      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt"
 
 You can also build production images from specific Git version via providing ``--install-airflow-reference``
 parameter to Breeze (this time constraints are taken from the ``constraints-master`` branch which is the
@@ -210,8 +210,8 @@ For example:
   apache/airflow:master-python3.6                - production "latest" image from current master
   apache/airflow:master-python3.6-ci             - CI "latest" image from current master
   apache/airflow:v1-10-test-python2.7-ci         - CI "latest" image from current v1-10-test branch
-  apache/airflow:1.10.12-python3.6               - production image for 1.10.12 release
-  apache/airflow:1.10.12-1-python3.6             - production image for 1.10.12 with some patches applied
+  apache/airflow:1.10.13-python3.6               - production image for 1.10.13 release
+  apache/airflow:1.10.13-1-python3.6             - production image for 1.10.13 with some patches applied
 
 
 You can see DockerHub images at `<https://hub.docker.com/repository/docker/apache/airflow>`_
@@ -292,7 +292,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -308,7 +308,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image -f Dockerfile.ci \
-      --production-image  --python 3.7 --install-airflow-version=1.10.12 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 You can build the default production image with the standard ``docker build`` command but it will only build
@@ -326,7 +326,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
diff --git a/breeze-complete b/breeze-complete
index 94854ad..505c6bc 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -49,6 +49,7 @@ _breeze_allowed_test_types="All Core Integration Heisentests Postgres MySQL Helm
 }
 
 _breeze_allowed_install_airflow_versions=$(cat <<-EOF
+1.10.13
 1.10.12
 1.10.11
 1.10.10
diff --git a/docs/installation.rst b/docs/installation.rst
index de1985c..12ce19e 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -31,7 +31,7 @@ if needed. This means that from time to time plain ``pip install apache-airflow`
 produce unusable Airflow installation.
 
 In order to have repeatable installation, however, starting from **Airflow 1.10.10** and updated in
-**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the
+**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
 ``constraints-master`` and ``constraints-1-10`` orphan branches.
 Those "known-to-be-working" constraints are per major/minor python version. You can use them as constraint
 files when installing Airflow from PyPI. Note that you have to specify correct Airflow version
@@ -47,22 +47,22 @@ and python versions in the URL.
       sudo apt-get install build-essential
 
 
-1. Installing just airflow
+1. Installing just Airflow
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.12
+    AIRFLOW_VERSION=1.10.13
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     # For example: 3.6
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.6.txt
+    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
 2. Installing with extras (for example postgres, google)
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.12
+    AIRFLOW_VERSION=1.10.13
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
     pip install "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
diff --git a/docs/production-deployment.rst b/docs/production-deployment.rst
index 4bfaabb..3edddb8 100644
--- a/docs/production-deployment.rst
+++ b/docs/production-deployment.rst
@@ -64,7 +64,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.12
+  FROM apache/airflow:1.10.13
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -81,7 +81,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.12
+  FROM apache/airflow:1.10.13
   RUN pip install --no-cache-dir --user my-awesome-pip-dependency-to-add
 
 
@@ -92,7 +92,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.12
+  FROM apache/airflow:1.10.13
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -125,7 +125,7 @@ in the `<#production-image-build-arguments>`_ chapter below.
 
 Here just a few examples are presented which should give you a general understanding of what you can customize.
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.10 Pypi package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -134,7 +134,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -150,7 +150,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image  --python 3.7 --install-airflow-version=1.10.12 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 
@@ -166,7 +166,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -225,7 +225,7 @@ Preparing the constraint files and wheel files:
 
   pip download --dest docker-context-files \
     --constraint docker-context-files/constraints-1-10.txt  \
-    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.12
+    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.13
 
 
 Building the image (after copying the files downloaded to the "docker-context-files" directory):
@@ -233,7 +233,7 @@ Building the image (after copying the files downloaded to the "docker-context-fi
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image --python 3.7 --install-airflow-version=1.10.12 \
+      --production-image --python 3.7 --install-airflow-version=1.10.13 \
       --disable-mysql-client-installation --disable-pip-cache --add-local-pip-wheels \
       --constraints-location="/docker-context-files/constraints-1-10.txt"
 
@@ -245,7 +245,7 @@ or
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -392,7 +392,7 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | ``constraints-master`` but can be        |
 |                                          |                                          | ``constraints-1-10`` for 1.10.* versions |
 |                                          |                                          | or it could point to specific version    |
-|                                          |                                          | for example ``constraints-1.10.12``      |
+|                                          |                                          | for example ``constraints-1.10.13``      |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_EXTRAS``                       | (see Dockerfile)                         | Default extras with which airflow is     |
 |                                          |                                          | installed                                |
@@ -503,7 +503,7 @@ production image. There are three types of build:
 | ``AIRFLOW_INSTALL_VERSION``       | Optional - might be used for      |
 |                                   | package installation case to      |
 |                                   | set Airflow version for example   |
-|                                   | "==1.10.12"                       |
+|                                   | "==1.10.13"                       |
 +-----------------------------------+-----------------------------------+
 | ``AIRFLOW_CONSTRAINTS_REFERENCE`` | reference (branch or tag) from    |
 |                                   | GitHub where constraints file     |
@@ -512,7 +512,7 @@ production image. There are three types of build:
 |                                   | ``constraints-1-10`` for 1.10.*   |
 |                                   | constraint or if you want to      |
 |                                   | point to specific version         |
-|                                   | might be ``constraints-1.10.12``  |
+|                                   | might be ``constraints-1.10.13``  |
 +-----------------------------------+-----------------------------------+
 | ``SLUGIFY_USES_TEXT_UNIDECODE``   | In case of installing airflow     |
 |                                   | 1.10.2 or 1.10.1 you need to      |
@@ -546,7 +546,7 @@ of 2.0 currently):
 
   docker build .
 
-This builds the production image in version 3.7 with default extras from 1.10.12 tag and
+This builds the production image in version 3.7 with default extras from 1.10.13 tag and
 constraints taken from the constraints-1-10 branch in GitHub.
 
 .. code-block:: bash
@@ -554,14 +554,14 @@ constraints taken from constraints-1-10-12 branch in GitHub.
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
-    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.12.tar.gz#egg=apache-airflow" \
+    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.13.tar.gz#egg=apache-airflow" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with default extras from 1.10.12 Pypi package and
-constraints taken from 1.10.12 tag in GitHub and pre-installed pip dependencies from the top
+This builds the production image in version 3.7 with default extras from 1.10.13 PyPI package and
+constraints taken from 1.10.13 tag in GitHub and pre-installed pip dependencies from the top
 of v1-10-test branch.
 
 .. code-block:: bash
@@ -570,14 +570,14 @@ of v1-10-test branch.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.12" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.12 Pypi package and
-additional python dependencies and pre-installed pip dependencies from 1.10.12 tagged constraints.
+This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
+additional python dependencies and pre-installed pip dependencies from 1.10.13 tagged constraints.
 
 .. code-block:: bash
 
@@ -585,15 +585,15 @@ additional python dependencies and pre-installed pip dependencies from 1.10.12 t
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.12" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
     --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql,hdfs"
     --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel oauth2client"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.12 Pypi package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -602,7 +602,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.12" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \


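A quick way to sanity-check a version bump like the one above is to probe the constraints URL
pattern from docs/installation.rst directly; a small sketch, assuming only that curl is available:

    AIRFLOW_VERSION=1.10.13
    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
    # -f makes curl fail on HTTP errors, so a missing constraints tag is caught immediately
    curl -sSfL "${CONSTRAINT_URL}" -o /dev/null \
        && echo "Constraints published for ${AIRFLOW_VERSION} / Python ${PYTHON_VERSION}"
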
[airflow] 04/34: Fix typo in check_environment.sh (#12395)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3b50f4f332e00e6707010fa64a660db0e16f7e47
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Tue Nov 17 12:04:03 2020 +0000

    Fix typo in check_environment.sh (#12395)
    
    `Databsae` -> `Database`
    
    (cherry picked from commit 3e994abc1cbac318f70f9319d364c1ed5a8074f9)
---
 scripts/in_container/check_environment.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/in_container/check_environment.sh b/scripts/in_container/check_environment.sh
index 84b3d48..7052628 100755
--- a/scripts/in_container/check_environment.sh
+++ b/scripts/in_container/check_environment.sh
@@ -98,7 +98,7 @@ function resetdb_if_requested() {
             airflow db reset -y
         fi
         echo
-        echo "Databsae has been reset"
+        echo "Database has been reset"
         echo
     fi
     return $?


[airflow] 33/34: Update documentation about PIP 20.3 incompatibility

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8998a554bcdca3a56cb96905206da9f1752802fd
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Dec 2 17:37:48 2020 +0100

    Update documentation about PIP 20.3 incompatibility
---
 CONTRIBUTING.rst      | 26 ++++++++++++++++++++++++--
 IMAGES.rst            |  8 ++++++++
 INSTALL               | 26 +++++++++++++++++++++++---
 LOCAL_VIRTUALENV.rst  |  8 ++++++++
 README.md             |  9 +++++++++
 docs/installation.rst | 16 ++++++++++++++++
 docs/metrics.rst      |  8 ++++++++
 docs/security.rst     | 23 +++++++++++++++++++++++
 docs/start.rst        |  9 +++++++++
 9 files changed, 128 insertions(+), 5 deletions(-)

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 0c1c9c1..8c4bf35 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -541,6 +541,14 @@ extras can be specified after the usual pip install - for example
 installs all development dependencies. There is also ``devel_ci`` that installs
 all dependencies needed in the CI environment.
 
+.. note::
+   On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 This is the full list of those extras:
 
   .. START EXTRAS HERE
@@ -591,6 +599,14 @@ the other provider package you can install it adding [extra] after the
 ``pip install apache-airflow-backport-providers-google[amazon]`` in case you want to use GCP
 transfer operators from Amazon ECS.
 
+.. note::
+   On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 If you add a new dependency between different providers packages, it will be detected automatically during
 pre-commit phase and pre-commit will fail - and add entry in dependencies.json so that the package extra
 dependencies are properly added when package is installed.
@@ -671,6 +687,14 @@ install in case a direct or transitive dependency is released that breaks the in
 when installing ``apache-airflow``, you might need to provide additional constraints (for
 example ``pip install apache-airflow==1.10.2 Werkzeug<1.0.0``)
 
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 However we now have ``constraints-<PYTHON_MAJOR_MINOR_VERSION>.txt`` files generated
 automatically and committed to orphan ``constraints-master`` and ``constraint-1-10`` branches based on
 the set of all latest working and tested dependency versions. Those
@@ -682,7 +706,6 @@ constraints file when installing Apache Airflow - either from the sources:
   pip install -e . \
     --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1-10/constraints-3.6.txt"
 
-
 or from the pypi package:
 
 .. code-block:: bash
@@ -690,7 +713,6 @@ or from the pypi package:
   pip install apache-airflow \
     --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1-10/constraints-3.6.txt"
 
-
 This works also with extras - for example:
 
 .. code-block:: bash
diff --git a/IMAGES.rst b/IMAGES.rst
index 339969b..6a04428 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -125,6 +125,14 @@ This will build the image using command similar to:
       apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.14 \
       --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt"
 
+.. note::
+   On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 You can also build production images from specific Git version via providing ``--install-airflow-reference``
 parameter to Breeze (this time constraints are taken from the ``constraints-master`` branch which is the
 HEAD of development for constraints):
diff --git a/INSTALL b/INSTALL
index 0e2f582..763ed20 100644
--- a/INSTALL
+++ b/INSTALL
@@ -31,16 +31,36 @@ source PATH_TO_YOUR_VENV/bin/activate
 # [required] building and installing by pip (preferred)
 pip install .
 
-# or directly
+NOTE!
+
+On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+depending on your choice of extras. In order to install Airflow you need to either downgrade
+pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
+# or you can install it directly via setup.py
 python setup.py install
 
+
 # You can also install recommended version of the dependencies by using
 # constraint-python<PYTHON_MAJOR_MINOR_VERSION>.txt files as constraint file. This is needed in case
 # you have problems with installing the current requirements from PyPI.
-# There are different constraint files for different python versions. For example"
+# There are different constraint files for different python versions and you should choose the
+# version of constraints specific to your Python version.
+# For example:
 
 pip install . \
-  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt"
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt"
+
+
+.. note::
+   On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
 
 # You can also install Airflow with extras specified. The list of available extras:
 # START EXTRAS HERE
diff --git a/LOCAL_VIRTUALENV.rst b/LOCAL_VIRTUALENV.rst
index 8a20c02..574366d 100644
--- a/LOCAL_VIRTUALENV.rst
+++ b/LOCAL_VIRTUALENV.rst
@@ -118,6 +118,14 @@ To create and initialize the local virtualenv:
 
     pip install -U -e ".[devel,<OTHER EXTRAS>]" # for example: pip install -U -e ".[devel,gcp,postgres]"
 
+.. note::
+   On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 In case you have problems with installing airflow because some requirements are not installable, you can
 try to install it with the set of working constraints (note that there are different constraint files
 for different python versions):
diff --git a/README.md b/README.md
index b72b175..79cceed 100644
--- a/README.md
+++ b/README.md
@@ -122,6 +122,15 @@ pip install apache-airflow==1.10.14 \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt"
 ```
 
+**NOTE!!!**
+
+On 30th of November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver.
+This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+depending on your choice of extras. In order to install Airflow you need to either downgrade
+pip to version 20.2.4 (`pip install --upgrade pip==20.2.4`) or, if you use pip 20.3, add the
+option `--use-deprecated legacy-resolver` to your pip install command.
+
+
 2. Installing with extras (for example postgres,gcp)
 ```bash
 pip install apache-airflow[postgres,gcp]==1.10.14 \
diff --git a/docs/installation.rst b/docs/installation.rst
index 4a084e1..ed4f4a0 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -58,6 +58,14 @@ and python versions in the URL.
     # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
+
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
 2. Installing with extras (for example postgres, google)
 
 .. code-block:: bash
@@ -68,6 +76,14 @@ and python versions in the URL.
     pip install "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
 
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 You need certain system level requirements in order to install Airflow. Those are requirements that are known
 to be needed for Linux system (Tested on Ubuntu Buster LTS) :
 
diff --git a/docs/metrics.rst b/docs/metrics.rst
index 7f7c92d..82e62b0 100644
--- a/docs/metrics.rst
+++ b/docs/metrics.rst
@@ -31,6 +31,14 @@ First you must install statsd requirement:
 
    pip install 'apache-airflow[statsd]'
 
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Add the following lines to your configuration file e.g. ``airflow.cfg``
 
 .. code-block:: ini
diff --git a/docs/security.rst b/docs/security.rst
index b22dfc0..5fdf23f 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -320,6 +320,13 @@ To use kerberos authentication, you must install Airflow with the ``kerberos`` e
 
    pip install 'apache-airflow[kerberos]'
 
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
 OAuth Authentication
 --------------------
 
@@ -359,6 +366,14 @@ To use GHE authentication, you must install Airflow with the ``github_enterprise
 
    pip install 'apache-airflow[github_enterprise]'
 
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Setting up GHE Authentication
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -414,6 +429,14 @@ To use Google authentication, you must install Airflow with the ``google_auth``
 
    pip install 'apache-airflow[google_auth]'
 
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Setting up Google Authentication
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
diff --git a/docs/start.rst b/docs/start.rst
index bff52ae..f2b4322 100644
--- a/docs/start.rst
+++ b/docs/start.rst
@@ -43,6 +43,15 @@ The installation is quick and straightforward.
 
     # visit localhost:8080 in the browser and enable the example dag in the home page
 
+
+.. note::
+   In November 2020, a new version of PIP (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Upon running these commands, Airflow will create the ``$AIRFLOW_HOME`` folder
 and lay an "airflow.cfg" file with defaults that get you going fast. You can
 inspect the file either in ``$AIRFLOW_HOME/airflow.cfg``, or through the UI in

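The two workarounds repeated in the notes above boil down to the following commands; a minimal
sketch (the Airflow version and constraints URL mirror the README example in the diff, and the
corrected downgrade syntax is used, since a bare "pip upgrade" is not a valid pip invocation):

    # Option 1: downgrade pip to the last release that still uses the old resolver
    pip install --upgrade pip==20.2.4

    # Option 2: stay on pip 20.3 but opt out of the new resolver for this install
    pip install apache-airflow==1.10.14 \
        --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt" \
        --use-deprecated legacy-resolver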

[airflow] 21/34: Rename `[scheduler] max_threads` to `[scheduler] parsing_processes` (#12605)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 82fd4f9d50dfd16adb44f39e0d26220c6bd76ed4
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Nov 25 09:33:19 2020 +0000

    Rename `[scheduler] max_threads` to `[scheduler] parsing_processes` (#12605)
    
    From Airflow 1.10.14, `max_threads` config under `[scheduler]` section has been renamed to `parsing_processes`.
    
    This is to align the name with the actual code where the Scheduler launches the number of processes defined by
    `[scheduler] parsing_processes` to Parse DAG files, calculates next DagRun date for each DAG,
    serialize them and store them in the DB.
    
    (cherry picked from commit 486134426bf2cd54fae1f75d9bd50715b8369ca1)
---
 UPDATING.md                                  | 7 +++++++
 airflow/config_templates/config.yml          | 6 +++---
 airflow/config_templates/default_airflow.cfg | 6 +++---
 airflow/config_templates/default_test.cfg    | 2 +-
 airflow/configuration.py                     | 3 +++
 airflow/jobs/scheduler_job.py                | 2 +-
 airflow/utils/dag_processing.py              | 4 ++--
 docs/faq.rst                                 | 7 +++++--
 scripts/in_container/airflow_ci.cfg          | 2 +-
 tests/utils/test_dag_processing.py           | 2 +-
 10 files changed, 27 insertions(+), 14 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 577b644..4ad226a 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -64,6 +64,13 @@ https://developers.google.com/style/inclusive-documentation
 -->
 ## Airflow 1.10.14
 
+### `[scheduler] max_threads` config has been renamed to `[scheduler] parsing_processes`
+
+From Airflow 1.10.14, `max_threads` config under `[scheduler]` section has been renamed to `parsing_processes`.
+
+This is to align the name with the actual code where the Scheduler launches the number of processes defined by
+`[scheduler] parsing_processes` to parse the DAG files.
+
 ### Airflow CLI changes in line with  2.0
 
 The Airflow CLI has been organized so that related commands are grouped together as subcommands,
diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index e89df22..87ee928 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -1439,10 +1439,10 @@
       type: string
       example: ~
       default: ""
-    - name: max_threads
+    - name: parsing_processes
       description: |
-        The scheduler can run multiple threads in parallel to schedule dags.
-        This defines how many threads will run.
+        The scheduler can run multiple processes in parallel to parse dags.
+        This defines how many processes will run.
       version_added: ~
       type: string
       example: ~
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index ea64d8f..662fd00 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -695,9 +695,9 @@ statsd_prefix = airflow
 # start with the elements of the list (e.g: scheduler,executor,dagrun)
 statsd_allow_list =
 
-# The scheduler can run multiple threads in parallel to schedule dags.
-# This defines how many threads will run.
-max_threads = 2
+# The scheduler can run multiple processes in parallel to parse dags.
+# This defines how many processes will run.
+parsing_processes = 2
 authenticate = False
 
 # Turn off scheduler use of cron intervals by setting this to False.
diff --git a/airflow/config_templates/default_test.cfg b/airflow/config_templates/default_test.cfg
index 3ac2225..30a82a4 100644
--- a/airflow/config_templates/default_test.cfg
+++ b/airflow/config_templates/default_test.cfg
@@ -113,7 +113,7 @@ job_heartbeat_sec = 1
 scheduler_heartbeat_sec = 5
 scheduler_health_check_threshold = 30
 authenticate = true
-max_threads = 2
+parsing_processes = 2
 catchup_by_default = True
 scheduler_zombie_task_threshold = 300
 dag_dir_list_interval = 0
diff --git a/airflow/configuration.py b/airflow/configuration.py
index 290843f..16081a3 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -183,6 +183,9 @@ class AirflowConfigParser(ConfigParser):
             'json_format': 'elasticsearch_json_format',
             'json_fields': 'elasticsearch_json_fields'
 
+        },
+        'scheduler': {
+            'parsing_processes': 'max_threads'
         }
     }
 
diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index 9149699..b72e2b1 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -391,7 +391,7 @@ class SchedulerJob(BaseJob):
         self.do_pickle = do_pickle
         super(SchedulerJob, self).__init__(*args, **kwargs)
 
-        self.max_threads = conf.getint('scheduler', 'max_threads')
+        self.max_threads = conf.getint('scheduler', 'parsing_processes')
 
         if log:
             self._log = log
diff --git a/airflow/utils/dag_processing.py b/airflow/utils/dag_processing.py
index 4a4b240..881a8ce 100644
--- a/airflow/utils/dag_processing.py
+++ b/airflow/utils/dag_processing.py
@@ -769,10 +769,10 @@ class DagFileProcessorManager(LoggingMixin):
         self._dag_ids = dag_ids
         self._async_mode = async_mode
 
-        self._parallelism = conf.getint('scheduler', 'max_threads')
+        self._parallelism = conf.getint('scheduler', 'parsing_processes')
         if 'sqlite' in conf.get('core', 'sql_alchemy_conn') and self._parallelism > 1:
             self.log.warning(
-                "Because we cannot use more than 1 thread (max_threads = {}) "
+                "Because we cannot use more than 1 thread (parsing_processes = {}) "
                 "when using sqlite. So we set parallelism to 1.".format(self._parallelism)
             )
             self._parallelism = 1
diff --git a/docs/faq.rst b/docs/faq.rst
index 80849e0..c041e0a 100644
--- a/docs/faq.rst
+++ b/docs/faq.rst
@@ -205,8 +205,11 @@ This means ``explicit_defaults_for_timestamp`` is disabled in your mysql server
 How to reduce airflow dag scheduling latency in production?
 -----------------------------------------------------------
 
-- ``max_threads``: Scheduler will spawn multiple threads in parallel to schedule dags. This is controlled by ``max_threads`` with default value of 2. User should increase this value to a larger value (e.g numbers of cpus where scheduler runs - 1) in production.
-- ``scheduler_heartbeat_sec``: User should consider to increase ``scheduler_heartbeat_sec`` config to a higher value (e.g 60 secs) which controls how frequent the airflow scheduler gets the heartbeat and updates the job's entry in database.
+- ``parsing_processes``: The scheduler will spawn multiple processes in parallel to parse dags.
+  This is controlled by ``parsing_processes`` with a default value of 2.
+  You should increase this value to a larger one (e.g. the number of cpus where the scheduler runs + 1) in production.
+- ``scheduler_heartbeat_sec``: Consider increasing the ``scheduler_heartbeat_sec`` config to a higher value (e.g. 60 secs), which controls how frequently the airflow scheduler gets the heartbeat
+  and updates the job's entry in the database.
 
 Why next_ds or prev_ds might not contain expected values?
 ---------------------------------------------------------
diff --git a/scripts/in_container/airflow_ci.cfg b/scripts/in_container/airflow_ci.cfg
index b097752..4933af0 100644
--- a/scripts/in_container/airflow_ci.cfg
+++ b/scripts/in_container/airflow_ci.cfg
@@ -52,4 +52,4 @@ _test_only_string = this is a test
 job_heartbeat_sec = 1
 scheduler_heartbeat_sec = 5
 authenticate = true
-max_threads = 2
+parsing_processes = 2
diff --git a/tests/utils/test_dag_processing.py b/tests/utils/test_dag_processing.py
index 3726a8d..b2bdf35 100644
--- a/tests/utils/test_dag_processing.py
+++ b/tests/utils/test_dag_processing.py
@@ -267,7 +267,7 @@ class TestDagFileProcessorManager(unittest.TestCase):
         file processors until the next zombie detection logic is invoked.
         """
         test_dag_path = os.path.join(TEST_DAG_FOLDER, 'test_example_bash_operator.py')
-        with conf_vars({('scheduler', 'max_threads'): '1',
+        with conf_vars({('scheduler', 'parsing_processes'): '1',
                         ('core', 'load_examples'): 'False'}):
             dagbag = DagBag(test_dag_path)
             with create_session() as session:

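Because of the ``deprecated_options`` entry added to airflow/configuration.py above, configs
written for older versions keep working after the rename; a hedged sketch of the expected
behaviour (the cfg fragment and the value 4 are illustrative):

    # airflow.cfg written before the rename:
    #   [scheduler]
    #   max_threads = 4
    # Reading the new key falls back to the deprecated one and emits a deprecation warning:
    python -c "from airflow.configuration import conf; print(conf.getint('scheduler', 'parsing_processes'))"
    # expected output: 4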

[airflow] 11/34: Fixes unneeded docker-context-files added in CI (#12534)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ff02efe1fc573dcf8b123bb3312c2e55d33981de
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Nov 21 19:21:43 2020 +0100

    Fixes unneeded docker-context-files added in CI (#12534)
    
    We do not need to add docker-context-files in CI before we run the
    first "cache" PIP installation. Adding it might cause the cache to
    always be invalidated if someone has a file added there before
    building and pushing the image.
    
    This PR fixes the problem by adding docker-context files later
    in the Dockerfile and changing the constraints location
    used in the "cache" step to always use the github constraints in
    this case.
    
    Closes #12509
    
    (cherry picked from commit 37548f09acb91edd041565f52051f58610402cb3)
---
 Dockerfile                            |  3 ++-
 Dockerfile.ci                         | 11 ++++++-----
 IMAGES.rst                            |  9 ++++++++-
 scripts/ci/libraries/_build_images.sh | 35 +++++++++++++++--------------------
 4 files changed, 31 insertions(+), 27 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 8ad3db9..00442bc 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -176,7 +176,8 @@ RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
        fi; \
        pip install --user \
           "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-          --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}" && pip uninstall --yes apache-airflow; \
+          --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+          && pip uninstall --yes apache-airflow; \
     fi
 
 ARG AIRFLOW_SOURCES_FROM="."
diff --git a/Dockerfile.ci b/Dockerfile.ci
index aa426ef..ac51a56 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -262,10 +262,6 @@ ENV AIRFLOW_LOCAL_PIP_WHEELS=${AIRFLOW_LOCAL_PIP_WHEELS}
 ARG INSTALL_AIRFLOW_VIA_PIP="true"
 ENV INSTALL_AIRFLOW_VIA_PIP=${INSTALL_AIRFLOW_VIA_PIP}
 
-# If wheel files are found in /docker-context-files during installation
-# they are also installed additionally to whatever is installed from Airflow.
-COPY docker-context-files /docker-context-files
-
 # In case of CI builds we want to pre-install master version of airflow dependencies so that
 # We do not have to always reinstall it from scratch.
 # This can be reinstalled from latest master by increasing PIP_DEPENDENCIES_EPOCH_NUMBER.
@@ -273,7 +269,8 @@ COPY docker-context-files /docker-context-files
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
         pip install \
             "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-                --constraint "${AIRFLOW_CONSTRAINTS_URL}" && pip uninstall --yes apache-airflow; \
+                --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+                && pip uninstall --yes apache-airflow; \
     fi
 
 
@@ -322,6 +319,10 @@ RUN if [[ ${INSTALL_AIRFLOW_VIA_PIP} == "true" ]]; then \
         fi; \
     fi
 
+# If wheel files are found in /docker-context-files during installation
+# they are also installed additionally to whatever is installed from Airflow.
+COPY docker-context-files/ /docker-context-files/
+
 RUN if [[ ${AIRFLOW_LOCAL_PIP_WHEELS} != "true" ]]; then \
         if ls /docker-context-files/*.whl 1> /dev/null 2>&1; then \
             pip install --no-deps /docker-context-files/*.whl; \
diff --git a/IMAGES.rst b/IMAGES.rst
index 8c913db..13ce935 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -399,7 +399,14 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | file has to be in docker context so      |
 |                                          |                                          | it's best to place such file in          |
 |                                          |                                          | one of the folders included in           |
-|                                          |                                          | dockerignore                             |
+|                                          |                                          | dockerignore, for example in the         |
+|                                          |                                          | 'docker-context-files'. Note that the    |
+|                                          |                                          | location does not work for the first     |
+|                                          |                                          | stage of installation when the           |
+|                                          |                                          | ``AIRFLOW_PRE_CACHED_PIP_PACKAGES`` is   |
+|                                          |                                          | set to true. Default location from       |
+|                                          |                                          | GitHub is used in this case.             |
 +------------------------------------------+------------------------------------------+------------------------------------------+
| ``AIRFLOW_LOCAL_PIP_WHEELS``             | ``false``                                | If set to true, Airflow and its          |
 |                                          |                                          | dependencies are installed from locally  |
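
For illustration, a hypothetical way to use a local constraints file as the table above suggests (the file name and build-arg value are examples only; the build arg is the one this change wires through):

    cp constraints-1-10.txt docker-context-files/
    docker build . \
        --build-arg AIRFLOW_CONSTRAINTS_LOCATION="/docker-context-files/constraints-1-10.txt"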
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index fb756ba..5bd2d06 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -97,6 +97,19 @@ function build_images::forget_last_answer() {
     fi
 }
 
+function build_images::confirm_via_terminal() {
+    echo > "${DETECTED_TERMINAL}"
+    echo > "${DETECTED_TERMINAL}"
+    echo "Make sure that you rebased to latest master before rebuilding!" > "${DETECTED_TERMINAL}"
+    echo > "${DETECTED_TERMINAL}"
+    # Make sure to use the output of tty rather than stdin/stdout when available - this way confirm
+    # will work also in case of pre-commits (git does not pass stdin/stdout to pre-commit hooks)
+    # shellcheck disable=SC2094
+    "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}" \
+        <"${DETECTED_TERMINAL}" >"${DETECTED_TERMINAL}"
+    RES=$?
+}
+
 # Confirms if the image should be rebuilt and interactively checks it with the user.
 # In case it needs to be rebuilt, it only asks the user if it determines that the rebuild
 # is needed and that the rebuild is not already forced. It asks the user using available terminals
@@ -144,29 +157,11 @@ function build_images::confirm_image_rebuild() {
         "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}"
         RES=$?
     elif [[ ${DETECTED_TERMINAL:=$(tty)} != "not a tty" ]]; then
-        echo > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        echo "Make sure that you rebased to latest master before rebuilding!" > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        # Make sure to use output of tty rather than stdin/stdout when available - this way confirm
-        # will works also in case of pre-commits (git does not pass stdin/stdout to pre-commit hooks)
-        # shellcheck disable=SC2094
-        "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}" \
-            <"${DETECTED_TERMINAL}" >"${DETECTED_TERMINAL}"
-        RES=$?
         export DETECTED_TERMINAL
+        build_images::confirm_via_terminal
     elif [[ -c /dev/tty ]]; then
         export DETECTED_TERMINAL=/dev/tty
-        # Make sure to use /dev/tty first rather than stdin/stdout when available - this way confirm
-        # will works also in case of pre-commits (git does not pass stdin/stdout to pre-commit hooks)
-        echo > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        echo "Make sure that you rebased to latest master before rebuilding!" > "${DETECTED_TERMINAL}"
-        echo > "${DETECTED_TERMINAL}"
-        # shellcheck disable=SC2094
-        "${AIRFLOW_SOURCES}/confirm" "${ACTION} image ${THE_IMAGE_TYPE}-python${PYTHON_MAJOR_MINOR_VERSION}" \
-            <"${DETECTED_TERMINAL}" >"${DETECTED_TERMINAL}"
-        RES=$?
+        build_images::confirm_via_terminal
     else
         verbosity::print_info
         verbosity::print_info "No terminal, no stdin - quitting"


[airflow] 30/34: Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 152175c97a79b920047d1f4705ffed9f95c0e8e0
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Tue Nov 3 15:28:51 2020 +0000

    Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
    
    closes: https://github.com/apache/airflow/issues/11146
    (cherry picked from commit 980c7252c0f28c251e9f87d736cd88d6027f3da3)
---
 airflow/bin/cli.py    |  81 +++++++++++++++++++++++++++++++++
 tests/cli/test_cli.py | 122 ++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 203 insertions(+)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index a155cff..4f23038 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1464,6 +1464,74 @@ Happy Airflowing!
     print(output_string)
 
 
+@cli_utils.action_logging
+def cleanup_pods(args):
+    """Clean up k8s pods in evicted/failed/succeeded states"""
+    from kubernetes.client.rest import ApiException
+
+    from airflow.kubernetes.kube_client import get_kube_client
+
+    namespace = args.namespace
+
+    # https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/
+    # All Containers in the Pod have terminated in success, and will not be restarted.
+    pod_succeeded = 'succeeded'
+
+    # All Containers in the Pod have terminated, and at least one Container has terminated in failure.
+    # That is, the Container either exited with non-zero status or was terminated by the system.
+    pod_failed = 'failed'
+
+    # https://kubernetes.io/docs/tasks/administer-cluster/out-of-resource/
+    pod_reason_evicted = 'evicted'
+    # If pod is failed and restartPolicy is:
+    # * Always: Restart Container; Pod phase stays Running.
+    # * OnFailure: Restart Container; Pod phase stays Running.
+    # * Never: Pod phase becomes Failed.
+    pod_restart_policy_never = 'never'
+
+    print('Loading Kubernetes configuration')
+    kube_client = get_kube_client()
+    print('Listing pods in namespace {}'.format(namespace))
+    continue_token = None
+    while True:  # pylint: disable=too-many-nested-blocks
+        pod_list = kube_client.list_namespaced_pod(namespace=namespace, limit=500, _continue=continue_token)
+        for pod in pod_list.items:
+            pod_name = pod.metadata.name
+            print('Inspecting pod {}'.format(pod_name))
+            pod_phase = pod.status.phase.lower()
+            pod_reason = pod.status.reason.lower() if pod.status.reason else ''
+            pod_restart_policy = pod.spec.restart_policy.lower()
+
+            if (
+                pod_phase == pod_succeeded
+                or (pod_phase == pod_failed and pod_restart_policy == pod_restart_policy_never)
+                or (pod_reason == pod_reason_evicted)
+            ):
+                print('Deleting pod "{}" phase "{}" and reason "{}", restart policy "{}"'.format(
+                    pod_name, pod_phase, pod_reason, pod_restart_policy)
+                )
+                try:
+                    _delete_pod(pod.metadata.name, namespace)
+                except ApiException as e:
+                    print("can't remove POD: {}".format(e), file=sys.stderr)
+                continue
+            print('No action taken on pod {}'.format(pod_name))
+        continue_token = pod_list.metadata._continue  # pylint: disable=protected-access
+        if not continue_token:
+            break
+
+
+def _delete_pod(name, namespace):
+    """Helper Function for cleanup_pods"""
+    from kubernetes import client
+
+    core_v1 = client.CoreV1Api()
+    delete_options = client.V1DeleteOptions()
+    print('Deleting POD "{}" from "{}" namespace'.format(name, namespace))
+    api_response = core_v1.delete_namespaced_pod(name=name, namespace=namespace, body=delete_options)
+    print(api_response)
+
+
 @cli_utils.deprecated_action(new_name='celery worker')
 @cli_utils.action_logging
 def worker(args):
@@ -2705,6 +2773,13 @@ ARG_SKIP_SERVE_LOGS = Arg(
     action="store_true",
 )
 
+# kubernetes cleanup-pods
+ARG_NAMESPACE = Arg(
+    ("--namespace",),
+    default='default',
+    help="Kubernetes Namespace",
+)
+
 ALTERNATIVE_CONN_SPECS_ARGS = [
     ARG_CONN_TYPE,
     ARG_CONN_HOST,
@@ -3154,6 +3229,12 @@ CONFIG_COMMANDS = (
 
 KUBERNETES_COMMANDS = (
     ActionCommand(
+        name='cleanup-pods',
+        help="Clean up Kubernetes pods in evicted/failed/succeeded states",
+        func=cleanup_pods,
+        args=(ARG_NAMESPACE, ),
+    ),
+    ActionCommand(
         name='generate-dag-yaml',
         help="Generate YAML files for all tasks in DAG. Useful for debugging tasks without "
         "launching into a cluster",
diff --git a/tests/cli/test_cli.py b/tests/cli/test_cli.py
index 048f802..bb39869 100644
--- a/tests/cli/test_cli.py
+++ b/tests/cli/test_cli.py
@@ -23,6 +23,8 @@ import io
 import logging
 import os
 
+import kubernetes
+
 from airflow.configuration import conf
 from parameterized import parameterized
 from six import StringIO, PY2
@@ -1026,3 +1028,123 @@ class TestCLIGetNumReadyWorkersRunning(unittest.TestCase):
 
         with mock.patch('psutil.Process', return_value=self.process):
             self.assertEqual(self.monitor._get_num_ready_workers_running(), 0)
+
+
+class TestCleanUpPodsCommand(unittest.TestCase):
+    @classmethod
+    def setUpClass(cls):
+        cls.parser = cli.get_parser()
+
+    @mock.patch('kubernetes.client.CoreV1Api.delete_namespaced_pod')
+    def test_delete_pod(self, delete_namespaced_pod):
+        cli._delete_pod('dummy', 'awesome-namespace')
+        delete_namespaced_pod.assert_called_with(body=mock.ANY, name='dummy', namespace='awesome-namespace')
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('airflow.kubernetes.kube_client.config.load_incluster_config')
+    def test_running_pods_are_not_cleaned(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy'
+        pod1.status.phase = 'Running'
+        pod1.status.reason = None
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_not_called()
+        load_incluster_config.assert_called_once()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('airflow.kubernetes.kube_client.config.load_incluster_config')
+    def test_cleanup_succeeded_pods(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy'
+        pod1.status.phase = 'Succeeded'
+        pod1.status.reason = None
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_called_with('dummy', 'awesome-namespace')
+        load_incluster_config.assert_called_once()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_no_cleanup_failed_pods_wo_restart_policy_never(
+        self, load_incluster_config, list_namespaced_pod, delete_pod
+    ):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy2'
+        pod1.status.phase = 'Failed'
+        pod1.status.reason = None
+        pod1.spec.restart_policy = 'Always'
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_not_called()
+        load_incluster_config.assert_called_once()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_cleanup_failed_pods_w_restart_policy_never(
+        self, load_incluster_config, list_namespaced_pod, delete_pod
+    ):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy3'
+        pod1.status.phase = 'Failed'
+        pod1.status.reason = None
+        pod1.spec.restart_policy = 'Never'
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_called_with('dummy3', 'awesome-namespace')
+        load_incluster_config.assert_called_once()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_cleanup_evicted_pods(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy4'
+        pod1.status.phase = 'Failed'
+        pod1.status.reason = 'Evicted'
+        pod1.spec.restart_policy = 'Never'
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_called_with('dummy4', 'awesome-namespace')
+        load_incluster_config.assert_called_once()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_cleanup_api_exception_continue(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        delete_pod.side_effect = kubernetes.client.rest.ApiException(status=0)
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy'
+        pod1.status.phase = 'Succeeded'
+        pod1.status.reason = None
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        load_incluster_config.assert_called_once()
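
Once this lands, the command can be exercised as below; --namespace defaults to 'default' per ARG_NAMESPACE above (the namespace value here is an example):

    airflow kubernetes cleanup-pods --namespace airflow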


[airflow] 09/34: Fix broken CI.yml (#12454)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7701f514ce8e0a4504ba58abd11b4d6eae1081ee
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Wed Nov 18 17:34:00 2020 +0100

    Fix broken CI.yml (#12454)
    
    The PR #12417 broke CI.yaml accidentally. This PR fixes it.
    
    (cherry picked from commit 93b327051605fb9cd9bebf77802090482b246013)
---
 .github/workflows/ci.yml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 5aadfd0..5931135 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -638,6 +638,7 @@ jobs:
         with:
           path: ".build/.kubernetes_venv*"
           key: "venv-${{ env.cache-name }}-${{ github.job }}-${{ hashFiles('setup.py') }}\
+-${{ hashFiles('setup.cfg') }}\
 -${{ needs.build-info.outputs.defaultPythonVersion }}"
       - name: "Cache bin folder with tools for kubernetes testing"
         uses: actions/cache@v2


[airflow] 18/34: Remove "@" references from constraints generation (#12671)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 71c1e2e852b3c87dd0a6cae75017419bc56cee57
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Nov 28 06:04:45 2020 +0100

    Remove "@" references from constraints generattion (#12671)
    
    Likely fixes: #12665
    
    (cherry picked from commit 3b138d2d60d86ca0a80e9c27afd3421f45df178e)
---
 scripts/in_container/run_generate_constraints.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scripts/in_container/run_generate_constraints.sh b/scripts/in_container/run_generate_constraints.sh
index 999f750..62b2237 100755
--- a/scripts/in_container/run_generate_constraints.sh
+++ b/scripts/in_container/run_generate_constraints.sh
@@ -36,6 +36,7 @@ echo
 
 pip freeze | sort | \
     grep -v "apache_airflow" | \
+    grep -v "@" | \
     grep -v "/opt/airflow" >"${CURRENT_CONSTRAINT_FILE}"
 
 echo
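
For context, a hypothetical 'pip freeze' fragment showing the direct-reference lines the new filter drops (the package name and path below are made up):

    pip freeze | sort
    #   example-pkg @ file:///opt/wheels/example_pkg-1.0-py3-none-any.whl
    #   requests==2.24.0
    # Only pinned lines such as 'requests==2.24.0' survive 'grep -v "@"',
    # keeping the generated constraints file limited to proper version pins.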


[airflow] 07/34: The messages about remote image check are only shown with -v (#12402)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ff0f26ac9fe311f7f050b941064a3a58eee88fcb
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Nov 17 20:32:00 2020 +0100

    The messages about remote image check are only shown with -v (#12402)
    
    The messages might be confusing and should only be shown when
    verbose is turned on.
    
    (cherry picked from commit dc31ca4dc6986397b619bf21ae8628fd03cba58d)
---
 scripts/ci/libraries/_build_images.sh | 24 ++++++++++++------------
 1 file changed, 12 insertions(+), 12 deletions(-)

diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index a8d9521..fb756ba 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -225,9 +225,9 @@ function build_images::get_local_build_cache_hash() {
     # Remove the container just in case
     docker rm --force "local-airflow-ci-container" 2>/dev/null >/dev/null
     if ! docker create --name "local-airflow-ci-container" "${AIRFLOW_CI_IMAGE}" 2>/dev/null; then
-        >&2 echo
-        >&2 echo "Local airflow CI image not available"
-        >&2 echo
+        verbosity::print_info
+        verbosity::print_info "Local airflow CI image not available"
+        verbosity::print_info
         LOCAL_MANIFEST_IMAGE_UNAVAILABLE="true"
         export LOCAL_MANIFEST_IMAGE_UNAVAILABLE
         touch "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}"
@@ -237,9 +237,9 @@ function build_images::get_local_build_cache_hash() {
         "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}" 2> /dev/null \
         || touch "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}"
     set -e
-    echo
-    echo "Local build cache hash: '$(cat "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}")'"
-    echo
+    verbosity::print_info
+    verbosity::print_info "Local build cache hash: '$(cat "${LOCAL_IMAGE_BUILD_CACHE_HASH_FILE}")'"
+    verbosity::print_info
 }
 
 # Retrieves information about the build cache hash random file from the remote image.
@@ -257,9 +257,9 @@ function build_images::get_remote_image_build_cache_hash() {
     set +e
     # Pull remote manifest image
     if ! docker pull "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}" 2>/dev/null >/dev/null; then
-        >&2 echo
-        >&2 echo "Remote docker registry unreachable"
-        >&2 echo
+        verbosity::print_info
+        verbosity::print_info "Remote docker registry unreachable"
+        verbosity::print_info
         REMOTE_DOCKER_REGISTRY_UNREACHABLE="true"
         export REMOTE_DOCKER_REGISTRY_UNREACHABLE
         touch "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}"
@@ -274,9 +274,9 @@ function build_images::get_remote_image_build_cache_hash() {
         "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}"
     docker rm --force "$(cat "${REMOTE_IMAGE_CONTAINER_ID_FILE}")"
     rm -f "${REMOTE_IMAGE_CONTAINER_ID_FILE}"
-    echo
-    echo "Remote build cache hash: '$(cat "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}")'"
-    echo
+    verbosity::print_info
+    verbosity::print_info "Remote build cache hash: '$(cat "${REMOTE_IMAGE_BUILD_CACHE_HASH_FILE}")'"
+    verbosity::print_info
 }
 
 # Compares layers from both remote and local image and set FORCE_PULL_IMAGES to true in case
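
The helper used above lives elsewhere in scripts/ci/libraries; a rough sketch of the gating behaviour it is assumed to provide (the body here is a guess for illustration, not the real implementation):

    function verbosity::print_info() {
        # Emit the message only when verbose mode (-v) is enabled.
        if [[ ${VERBOSE:="false"} == "true" ]]; then
            echo "${@}"
        fi
    }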


[airflow] 08/34: Cope with '%' in password when waiting for migrations (#12440)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 011f9484667b8bb80ec710b6c385ead4b1030358
Author: highfly22 <hi...@gmail.com>
AuthorDate: Wed Nov 18 21:48:08 2020 +0800

    Cope with '%' in password when waiting for migrations (#12440)
    
    (cherry picked from commit d4c3d32ae5f7c4915d7aac31cb75bb720c246538)
---
 chart/templates/_helpers.yaml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index df7b158..98efc9f 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -380,7 +380,7 @@ server_tls_key_file = /etc/pgbouncer/server.key
         directory = os.path.join(package_dir, 'migrations')
         config = Config(os.path.join(package_dir, 'alembic.ini'))
         config.set_main_option('script_location', directory)
-        config.set_main_option('sqlalchemy.url', settings.SQL_ALCHEMY_CONN)
+        config.set_main_option('sqlalchemy.url', settings.SQL_ALCHEMY_CONN.replace('%', '%%'))
         script_ = ScriptDirectory.from_config(config)
 
         timeout=60
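
A minimal sketch of why the doubling is needed: alembic's Config is backed by Python's configparser, where a bare '%' starts an interpolation sequence, so a literal '%' in the connection string must be written as '%%' (the connection string below is made up):

    # Unescaped '%' is rejected by configparser's interpolation parser:
    python3 -c "
    from configparser import ConfigParser
    p = ConfigParser(); p.add_section('alembic')
    p.set('alembic', 'sqlalchemy.url', 'postgresql://user:pa%55word@host/db')
    "  # -> ValueError: invalid interpolation syntax

    # Doubling the '%' (what the fix above does) stores it safely, and it
    # reads back as a single literal '%':
    python3 -c "
    from configparser import ConfigParser
    p = ConfigParser(); p.add_section('alembic')
    p.set('alembic', 'sqlalchemy.url', 'postgresql://user:pa%%55word@host/db')
    print(p.get('alembic', 'sqlalchemy.url'))
    "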


[airflow] 25/34: Improve verification of images with PIP check (#12718)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e3406e312ce7b512f349c62db22339b795f17337
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Dec 1 09:51:24 2020 +0100

    Improve verification of images with PIP check (#12718)
    
    Verification of images with PIP is done in separate jobs and
    they provide useful information to committers and contributors
    when the pip check fails.
    
    (cherry picked from commit ebc8fcf199d3304d8c55f6c3908108701c05f9ad)
---
 .github/workflows/ci.yml                           | 47 +++++++++++
 scripts/ci/images/ci_prepare_prod_image_on_ci.sh   |  1 -
 ..._wait_for_ci_image.sh => ci_verify_ci_image.sh} | 49 +++++------
 scripts/ci/images/ci_verify_prod_image.sh          | 94 ++++++++++++++++++++++
 scripts/ci/images/ci_wait_for_ci_image.sh          | 18 -----
 scripts/ci/images/ci_wait_for_prod_image.sh        | 18 -----
 scripts/ci/libraries/_build_images.sh              | 61 ++++++++++++++
 7 files changed, 228 insertions(+), 60 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 77cbf65..9655955 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -177,6 +177,28 @@ jobs:
         run: ./scripts/ci/images/ci_wait_for_all_ci_images.sh
         if: needs.build-info.outputs.waitForImage == 'true'
 
+  verify-ci-images:
+    timeout-minutes: 20
+    name: "Verify CI Image Py${{matrix.python-version}}"
+    runs-on: ubuntu-20.04
+    needs: [build-info, ci-images]
+    strategy:
+      matrix:
+        python-version: ${{ fromJson(needs.build-info.outputs.pythonVersions) }}
+    env:
+      BACKEND: sqlite
+    if: needs.build-info.outputs.image-build == 'true'
+    steps:
+      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+        uses: actions/checkout@v2
+        if: needs.build-info.outputs.waitForImage == 'true'
+      - name: "Free space"
+        run: ./scripts/ci/tools/ci_free_space_on_ci.sh
+        if: needs.build-info.outputs.waitForImage == 'true'
+      - name: "Verify CI image Py${{matrix.python-version}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
+        run: ./scripts/ci/images/ci_verify_ci_image.sh
+        if: needs.build-info.outputs.waitForImage == 'true'
+
   static-checks:
     timeout-minutes: 30
     name: "Static checks"
@@ -593,6 +615,27 @@ jobs:
         run: ./scripts/ci/images/ci_wait_for_all_prod_images.sh
         if: needs.build-info.outputs.waitForImage == 'true'
 
+  verify-prod-images:
+    timeout-minutes: 20
+    name: "Verify Prod Image Py${{matrix.python-version}}"
+    runs-on: ubuntu-20.04
+    needs: [build-info, prod-images]
+    strategy:
+      matrix:
+        python-version: ${{ fromJson(needs.build-info.outputs.pythonVersions) }}
+    env:
+      BACKEND: sqlite
+    steps:
+      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+        uses: actions/checkout@v2
+        if: needs.build-info.outputs.waitForImage == 'true'
+      - name: "Free space"
+        run: ./scripts/ci/tools/ci_free_space_on_ci.sh
+        if: needs.build-info.outputs.waitForImage == 'true'
+      - name: "Verify PROD image Py${{matrix.python-version}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
+        run: ./scripts/ci/images/ci_verify_prod_image.sh
+        if: needs.build-info.outputs.waitForImage == 'true'
+
   tests-kubernetes:
     timeout-minutes: 50
     name: K8s ${{matrix.python-version}} ${{matrix.kubernetes-version}} ${{matrix.kubernetes-mode}}
@@ -681,6 +724,7 @@ jobs:
       - tests-mysql
       - tests-kubernetes
       - prod-images
+      - verify-prod-images
       - docs
     if: >
       (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test') &&
@@ -717,6 +761,7 @@ jobs:
       - tests-mysql
       - tests-kubernetes
       - ci-images
+      - verify-ci-images
       - docs
     if: >
       (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test' ) &&
@@ -781,6 +826,8 @@ jobs:
     needs:
       - build-info
       - constraints
+      - verify-ci-images
+      - verify-prod-images
       - static-checks
       - tests-sqlite
       - tests-mysql
diff --git a/scripts/ci/images/ci_prepare_prod_image_on_ci.sh b/scripts/ci/images/ci_prepare_prod_image_on_ci.sh
index f8cdcd2..d38182b 100755
--- a/scripts/ci/images/ci_prepare_prod_image_on_ci.sh
+++ b/scripts/ci/images/ci_prepare_prod_image_on_ci.sh
@@ -47,5 +47,4 @@ function build_prod_images_on_ci() {
     unset FORCE_BUILD
 }
 
-
 build_prod_images_on_ci
diff --git a/scripts/ci/images/ci_wait_for_ci_image.sh b/scripts/ci/images/ci_verify_ci_image.sh
similarity index 57%
copy from scripts/ci/images/ci_wait_for_ci_image.sh
copy to scripts/ci/images/ci_verify_ci_image.sh
index 2c0bdf2..e1f2b98 100755
--- a/scripts/ci/images/ci_wait_for_ci_image.sh
+++ b/scripts/ci/images/ci_verify_ci_image.sh
@@ -16,37 +16,40 @@
 # specific language governing permissions and limitations
 # under the License.
 # shellcheck source=scripts/ci/libraries/_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+. "$(dirname "${BASH_SOURCE[0]}")/../libraries/_script_init.sh"
+
+function verify_ci_image_dependencies() {
 
-function verify_ci_image_dependencies {
     echo
-    echo "Checking if Airflow dependencies are non-conflicting in CI image."
+    echo "Checking if Airflow dependencies are non-conflicting in ${AIRFLOW_CI_IMAGE} image."
     echo
 
-    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_CI_IMAGE}" \
-        "${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-
-    # TODO: remove after we have it fully working
-    docker run --rm --entrypoint /bin/bash "${AIRFLOW_CI_IMAGE}" -c 'pip check' || true
+    set +e
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_CI_IMAGE}" -c 'pip check'
+    local res=$?
+    if [[ ${res} != "0" ]]; then
+        echo -e " \e[31mERROR: ^^^ Some dependencies are conflicting. See instructions below on how to deal with it.\e[0m"
+        echo
+        build_images::inform_about_pip_check ""
+    else
+        echo
+        echo -e " \e[32mOK. The ${AIRFLOW_CI_IMAGE} image dependencies are consistent.\e[0m"
+        echo
+    fi
+    set -e
+    exit ${res}
 }
 
-push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
-
-push_pull_remove_images::check_if_jq_installed
-
-build_image::login_to_github_registry_if_needed
-
-export AIRFLOW_CI_IMAGE_NAME="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
+function pull_ci_image() {
+    local image_name_with_tag="${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+    echo
+    echo "Pulling the ${image_name_with_tag} image."
+    echo
 
-echo
-echo "Waiting for image to appear: ${AIRFLOW_CI_IMAGE_NAME}"
-echo
+    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_CI_IMAGE}" "${image_name_with_tag}"
+}
 
-push_pull_remove_images::wait_for_github_registry_image \
-    "${AIRFLOW_CI_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
 
-echo
-echo "Verifying the ${AIRFLOW_CI_IMAGE_NAME} image after pulling it"
-echo
+build_images::prepare_ci_build
 
 verify_ci_image_dependencies
diff --git a/scripts/ci/images/ci_verify_prod_image.sh b/scripts/ci/images/ci_verify_prod_image.sh
new file mode 100755
index 0000000..9718a48
--- /dev/null
+++ b/scripts/ci/images/ci_verify_prod_image.sh
@@ -0,0 +1,94 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck source=scripts/ci/libraries/_script_init.sh
+. "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
+
+function verify_prod_image_has_airflow {
+    echo
+    echo "Airflow folders installed in the image:"
+    echo
+
+    AIRFLOW_INSTALLATION_LOCATION="/home/airflow/.local"
+
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c \
+        'find '"${AIRFLOW_INSTALLATION_LOCATION}"'/lib/python*/site-packages/airflow/ -type d'
+
+    EXPECTED_MIN_AIRFLOW_DIRS_COUNT="60"
+    readonly EXPECTED_MIN_AIRFLOW_DIRS_COUNT
+
+    COUNT_AIRFLOW_DIRS=$(docker run --rm --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c \
+         'find '"${AIRFLOW_INSTALLATION_LOCATION}"'/lib/python*/site-packages/airflow/ -type d | grep -c -v "/airflow/providers"')
+
+    echo
+    echo "Number of airflow dirs: ${COUNT_AIRFLOW_DIRS}"
+    echo
+
+    if [[ "${COUNT_AIRFLOW_DIRS}" -lt "${EXPECTED_MIN_AIRFLOW_DIRS_COUNT}" ]]; then
+        >&2 echo
+        >&2 echo Number of airflow folders installed is less than ${EXPECTED_MIN_AIRFLOW_DIRS_COUNT}
+        >&2 echo This is unexpected. Please investigate, looking at the output above!
+        >&2 echo
+        exit 1
+    else
+        echo
+        echo -e " \e[32mOK. Airflow is installed.\e[0m"
+        echo
+    fi
+}
+
+
+function verify_prod_image_dependencies {
+
+    echo
+    echo "Checking if Airflow dependencies are non-conflicting in ${AIRFLOW_PROD_IMAGE} image."
+    echo
+
+    set +e
+    docker run --rm --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c 'pip check'
+    local res=$?
+    if [[ ${res} != "0" ]]; then
+        echo -e " \e[31mERROR: ^^^ Some dependencies are conflicting. See instructions below on how to deal with it.\e[0m"
+        echo
+        build_images::inform_about_pip_check "--production "
+        # TODO(potiuk) - enable the commented-out exit once https://github.com/apache/airflow/pull/12188 is merged
+        # exit ${res}
+    else
+        echo
+        echo " \e[32mOK. The ${AIRFLOW_PROD_IMAGE} image dependencies are consistent.\e[0m"
+        echo
+    fi
+    set -e
+
+}
+
+function pull_prod_image() {
+    local image_name_with_tag="${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
+
+    echo
+    echo "Pulling the ${image_name_with_tag} image."
+    echo
+
+    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_PROD_IMAGE}" "${image_name_with_tag}"
+}
+
+build_images::prepare_prod_build
+
+
+verify_prod_image_has_airflow
+
+verify_prod_image_dependencies
diff --git a/scripts/ci/images/ci_wait_for_ci_image.sh b/scripts/ci/images/ci_wait_for_ci_image.sh
index 2c0bdf2..0c3ea08 100755
--- a/scripts/ci/images/ci_wait_for_ci_image.sh
+++ b/scripts/ci/images/ci_wait_for_ci_image.sh
@@ -18,18 +18,6 @@
 # shellcheck source=scripts/ci/libraries/_script_init.sh
 . "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
 
-function verify_ci_image_dependencies {
-    echo
-    echo "Checking if Airflow dependencies are non-conflicting in CI image."
-    echo
-
-    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_CI_IMAGE}" \
-        "${GITHUB_REGISTRY_AIRFLOW_CI_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-
-    # TODO: remove after we have it fully working
-    docker run --rm --entrypoint /bin/bash "${AIRFLOW_CI_IMAGE}" -c 'pip check' || true
-}
-
 push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
 
 push_pull_remove_images::check_if_jq_installed
@@ -44,9 +32,3 @@ echo
 
 push_pull_remove_images::wait_for_github_registry_image \
     "${AIRFLOW_CI_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-
-echo
-echo "Verifying the ${AIRFLOW_CI_IMAGE_NAME} image after pulling it"
-echo
-
-verify_ci_image_dependencies
diff --git a/scripts/ci/images/ci_wait_for_prod_image.sh b/scripts/ci/images/ci_wait_for_prod_image.sh
index e53aec1..1c7cef5 100755
--- a/scripts/ci/images/ci_wait_for_prod_image.sh
+++ b/scripts/ci/images/ci_wait_for_prod_image.sh
@@ -18,18 +18,6 @@
 # shellcheck source=scripts/ci/libraries/_script_init.sh
 . "$( dirname "${BASH_SOURCE[0]}" )/../libraries/_script_init.sh"
 
-function verify_prod_image_dependencies {
-    echo
-    echo "Checking if Airflow dependencies are non-conflicting in PROD image."
-    echo
-
-    push_pull_remove_images::pull_image_github_dockerhub "${AIRFLOW_PROD_IMAGE}" \
-        "${GITHUB_REGISTRY_AIRFLOW_PROD_IMAGE}:${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-
-    # TODO: remove the | true after we fixed pip check for prod image
-    docker run --rm --entrypoint /bin/bash "${AIRFLOW_PROD_IMAGE}" -c 'pip check' || true
-}
-
 push_pull_remove_images::check_if_github_registry_wait_for_image_enabled
 
 push_pull_remove_images::check_if_jq_installed
@@ -44,9 +32,3 @@ echo
 
 push_pull_remove_images::wait_for_github_registry_image \
     "${AIRFLOW_PROD_IMAGE_NAME}" "${GITHUB_REGISTRY_PULL_IMAGE_TAG}"
-
-echo
-echo "Verifying the ${AIRFLOW_PROD_IMAGE_NAME} image after pulling it"
-echo
-
-verify_prod_image_dependencies
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 8de58db..30e7a85 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -849,3 +849,64 @@ function build_images::determine_docker_cache_strategy() {
     verbosity::print_info "Using ${DOCKER_CACHE} cache strategy for the build."
     verbosity::print_info
 }
+
+
+# Useful information for people who stumble upon a pip check failure
+function build_images::inform_about_pip_check() {
+        >&2 echo """
+
+The image did not pass 'pip check' verification. This means that there are some conflicting dependencies
+in the image. Usually it means that some setup.py or setup.cfg limits need to be adjusted to fix it.
+
+Usually it happens when one of the dependencies gets upgraded and it has more strict requirements
+than the other dependencies and they are conflicting.
+
+In case you did not update setup.py or any of your dependencies, this error might happen when
+someone accidentally merges conflicting dependencies in master. This
+should not happen, as we run 'pip check' before we upgrade the constrained
+dependencies, but we could miss some edge cases (thank you for your patience). Please let the
+committer know, and apologies for the trouble. You do not have to do anything in this case. You
+might be asked to rebase to the latest master after the problem is fixed.
+
+In case you actually updated setup.py, there are some steps you can take to address that:
+
+* first of all, ask the committer to set the 'upgrade to newer dependencies' and 'full tests needed'
+  labels for your PR. This will put your PR in a mode where all the dependencies are upgraded to the
+  latest matching versions and the checks will run for all python versions
+
+* run locally the image that is failing with Breeze - this will make it easy to manually try to update
+  the setup.py and test the consequences of changing constraints. You can do it by checking out your PR
+  and running this command:
+
+    ./breeze ${1}--github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} --python ${PYTHON_MAJOR_MINOR_VERSION}
+
+* your setup.py and setup.cfg will be mounted to the container and you will be able to iterate with
+  different setup.py versions.
+
+* run 'pipdeptree' to figure out where the dependency conflict comes from. Useful commands that can help you
+  to find out dependencies you have are:
+     * 'pipdeptree | less' (you can then search through the dependencies with vim-like shortcuts)
+     * 'pipdeptree > /files/pipdeptree.txt' - this will produce a pipdeptree.txt file in your source
+       'files' directory and you can open it in editor of your choice,
+     * 'pipdeptree | grep YOUR_DEPENDENCY' - to see all the requirements your dependency has as specified
+       by other packages
+
+* figure out which dependency limits should be upgraded. First try to upgrade them in setup.py extras
+  and run pip to upgrade your dependencies accordingly:
+
+     pip install '.[all]' --upgrade --upgrade-strategy eager
+
+* run pip check to figure out if the dependencies have been fixed (it should let you know which
+  dependencies are conflicting or - hurray! - that there are no conflicts):
+
+     pip check
+
+* in some rare cases, pip will not limit the requirement when you specify it in extras; you might
+  need to add such a requirement to the 'install_requires' section of setup.cfg in order to have pip
+  take it into account. This will happen if a higher version of your dependency is already installed
+  via the 'install_requires' section. In such a case update 'setup.cfg' and run pip install/pip check
+  from the previous steps
+
+* iterate until all such dependency conflicts are fixed.
+
+"""
+}
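
The same verification can be reproduced by hand against any pulled image (the tag below is an example; see IMAGES.rst for the naming scheme):

    docker run --rm --entrypoint /bin/bash apache/airflow:master-python3.6-ci -c 'pip check'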


[airflow] 19/34: Setup.cfg change triggers full build (#12684)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 74e3c56380efff308dd1d6b92ba43e87dca261ed
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Nov 28 12:39:46 2020 +0100

    Setup.cfg change triggers full build (#12684)
    
    Since we moved part of the setup.py specification to
    setup.cfg, we should trigger full build when only that file
    changes.
    
    (cherry picked from commit e4ab453a37c629e22d3d480511b43570f5237338)
---
 scripts/ci/selective_ci_checks.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scripts/ci/selective_ci_checks.sh b/scripts/ci/selective_ci_checks.sh
index a6c66eb..c87ec41 100755
--- a/scripts/ci/selective_ci_checks.sh
+++ b/scripts/ci/selective_ci_checks.sh
@@ -370,6 +370,7 @@ function run_all_tests_if_environment_files_changed() {
         "^Dockerfile"
         "^scripts"
         "^setup.py"
+        "^setup.cfg"
     )
     show_changed_files
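
A rough sketch of how such patterns are typically applied to the changed files (the variable names and diff range below are illustrative, not the real script):

    patterns=("^Dockerfile" "^scripts" "^setup.py" "^setup.cfg")
    changed_files="$(git diff --name-only HEAD~1)"
    for pattern in "${patterns[@]}"; do
        if grep -q -E "${pattern}" <<<"${changed_files}"; then
            echo "Pattern ${pattern} matched - running the full test suite"
        fi
    done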
 


[airflow] 29/34: Bump Airflow Version to 1.10.14

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 06a4606b6f478edbe983045037a629074d823fb1
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Dec 2 15:28:28 2020 +0000

    Bump Airflow Version to 1.10.14
---
 IMAGES.rst                     | 18 +++++++--------
 README.md                      | 12 +++++-----
 airflow/version.py             |  2 +-
 docs/installation.rst          |  8 +++----
 docs/production-deployment.rst | 50 +++++++++++++++++++++---------------------
 5 files changed, 45 insertions(+), 45 deletions(-)

diff --git a/IMAGES.rst b/IMAGES.rst
index 724d73c..339969b 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -39,7 +39,7 @@ The images are named as follows:
 
 where:
 
-* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.13``
+* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.14``
   The ``master`` and ``v1-10-test`` labels are built from branches so they change over time. The ``1.10.*`` and in
   the future ``2.*`` labels are build from git tags and they are "fixed" once built.
 * ``PYTHON_MAJOR_MINOR_VERSION`` - version of python used to build the image. Examples: ``3.5``, ``3.7``
@@ -115,15 +115,15 @@ parameter to Breeze:
 .. code-block:: bash
 
   ./breeze build-image --python 3.7 --additional-extras=presto \
-      --production-image --install-airflow-version=1.10.13
+      --production-image --install-airflow-version=1.10.14
 
 This will build the image using command similar to:
 
 .. code-block:: bash
 
     pip install \
-      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.13 \
-      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt"
+      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.14 \
+      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt"
 
 You can also build production images from specific Git version via providing ``--install-airflow-reference``
 parameter to Breeze (this time constraints are taken from the ``constraints-master`` branch which is the
@@ -210,8 +210,8 @@ For example:
   apache/airflow:master-python3.6                - production "latest" image from current master
   apache/airflow:master-python3.6-ci             - CI "latest" image from current master
   apache/airflow:v1-10-test-python2.7-ci         - CI "latest" image from current v1-10-test branch
-  apache/airflow:1.10.13-python3.6               - production image for 1.10.13 release
-  apache/airflow:1.10.13-1-python3.6             - production image for 1.10.13 with some patches applied
+  apache/airflow:1.10.14-python3.6               - production image for 1.10.14 release
+  apache/airflow:1.10.14-1-python3.6             - production image for 1.10.14 with some patches applied
 
 
 You can see DockerHub images at `<https://hub.docker.com/repository/docker/apache/airflow>`_
@@ -292,7 +292,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -308,7 +308,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image -f Dockerfile.ci \
-      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.14 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 You can build the default production image with standard ``docker build`` command but they will only build
@@ -326,7 +326,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
diff --git a/README.md b/README.md
index 5a5edd6..b72b175 100644
--- a/README.md
+++ b/README.md
@@ -76,7 +76,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-|              | Master version (2.0.0dev) | Stable version (1.10.13) |
+|              | Master version (2.0.0dev) | Stable version (1.10.14) |
 | ------------ | ------------------------- | ------------------------ |
 | Python       | 3.6, 3.7, 3.8             | 2.7, 3.5, 3.6, 3.7, 3.8  |
 | PostgreSQL   | 9.6, 10, 11, 12, 13       | 9.6, 10, 11, 12, 13      |
@@ -109,7 +109,7 @@ if needed. This means that from time to time plain `pip install apache-airflow`
 produce unusable Airflow installation.
 
 In order to have repeatable installation, however, introduced in **Airflow 1.10.10** and updated in
-**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
+**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the
 orphan `constraints-master` and `constraints-1-10` branches. We keep those "known-to-be-working"
 constraints files separately per major/minor python version.
 You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify
@@ -118,14 +118,14 @@ correct Airflow tag/version/branch and python versions in the URL.
 1. Installing just Airflow:
 
 ```bash
-pip install apache-airflow==1.10.13 \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"
+pip install apache-airflow==1.10.14 \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt"
 ```
 
 2. Installing with extras (for example postgres,gcp)
 ```bash
-pip install apache-airflow[postgres,gcp]==1.10.13 \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"
+pip install apache-airflow[postgres,gcp]==1.10.14 \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt"
 ```
 
 For information on installing backport providers check https://airflow.readthedocs.io/en/latest/backport-providers.html.
diff --git a/airflow/version.py b/airflow/version.py
index 115c560..b3b5b30 100644
--- a/airflow/version.py
+++ b/airflow/version.py
@@ -18,4 +18,4 @@
 # under the License.
 #
 
-version = '1.10.13'
+version = '1.10.14'
diff --git a/docs/installation.rst b/docs/installation.rst
index 12ce19e..4a084e1 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -31,7 +31,7 @@ if needed. This means that from time to time plain ``pip install apache-airflow`
 produce unusable Airflow installation.
 
 In order to have repeatable installation, however, starting from **Airflow 1.10.10** and updated in
-**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
+**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the
 ``constraints-master`` and ``constraints-1-10`` orphan branches.
 Those "known-to-be-working" constraints are per major/minor python version. You can use them as constraint
 files when installing Airflow from PyPI. Note that you have to specify correct Airflow version
@@ -51,18 +51,18 @@ and python versions in the URL.
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.13
+    AIRFLOW_VERSION=1.10.14
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     # For example: 3.6
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt
+    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
 2. Installing with extras (for example postgres, google)
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.13
+    AIRFLOW_VERSION=1.10.14
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
     pip install "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
diff --git a/docs/production-deployment.rst b/docs/production-deployment.rst
index 3edddb8..ac6c76d 100644
--- a/docs/production-deployment.rst
+++ b/docs/production-deployment.rst
@@ -64,7 +64,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.13
+  FROM apache/airflow:1.10.14
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -81,7 +81,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.13
+  FROM apache/airflow:1.10.14
   RUN pip install --no-cache-dir --user my-awesome-pip-dependency-to-add
 
 
@@ -92,7 +92,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.13
+  FROM apache/airflow:1.10.14
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -125,7 +125,7 @@ in the `<#production-image-build-arguments>`_ chapter below.
 
 Here just a few examples are presented which should give you general understanding of what you can customize.
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.14 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -134,7 +134,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -150,7 +150,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.14 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 
@@ -166,7 +166,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -225,7 +225,7 @@ Preparing the constraint files and wheel files:
 
   pip download --dest docker-context-files \
     --constraint docker-context-files/constraints-1-10.txt  \
-    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.13
+    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.14
 
 
 Building the image (after copying the files downloaded to the "docker-context-files" directory:
@@ -233,7 +233,7 @@ Building the image (after copying the files downloaded to the "docker-context-fi
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image --python 3.7 --install-airflow-version=1.10.13 \
+      --production-image --python 3.7 --install-airflow-version=1.10.14 \
       --disable-mysql-client-installation --disable-pip-cache --add-local-pip-wheels \
       --constraints-location="/docker-context-files/constraints-1-10.txt"
 
@@ -245,7 +245,7 @@ or
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -392,7 +392,7 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | ``constraints-master`` but can be        |
 |                                          |                                          | ``constraints-1-10`` for 1.10.* versions |
 |                                          |                                          | or it could point to specific version    |
-|                                          |                                          | for example ``constraints-1.10.13``      |
+|                                          |                                          | for example ``constraints-1.10.14``      |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_EXTRAS``                       | (see Dockerfile)                         | Default extras with which airflow is     |
 |                                          |                                          | installed                                |
@@ -503,7 +503,7 @@ production image. There are three types of build:
 | ``AIRFLOW_INSTALL_VERSION``       | Optional - might be used for      |
 |                                   | package installation case to      |
 |                                   | set Airflow version for example   |
-|                                   | "==1.10.13"                       |
+|                                   | "==1.10.14"                       |
 +-----------------------------------+-----------------------------------+
 | ``AIRFLOW_CONSTRAINTS_REFERENCE`` | reference (branch or tag) from    |
 |                                   | GitHub where constraints file     |
@@ -512,7 +512,7 @@ production image. There are three types of build:
 |                                   | ``constraints-1-10`` for 1.10.*   |
 |                                   | constraint or if you want to      |
 |                                   | point to specific version         |
-|                                   | might be ``constraints-1.10.13``  |
+|                                   | might be ``constraints-1.10.14``  |
 +-----------------------------------+-----------------------------------+
 | ``SLUGIFY_USES_TEXT_UNIDECODE``   | In case of of installing airflow  |
 |                                   | 1.10.2 or 1.10.1 you need to      |
@@ -546,7 +546,7 @@ of 2.0 currently):
 
   docker build .
 
-This builds the production image in version 3.7 with default extras from 1.10.13 tag and
+This builds the production image in version 3.7 with default extras from 1.10.14 tag and
 constraints taken from constraints-1-10-12 branch in GitHub.
 
 .. code-block:: bash
@@ -554,14 +554,14 @@ constraints taken from constraints-1-10-12 branch in GitHub.
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
-    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.13.tar.gz#egg=apache-airflow" \
+    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.14.tar.gz#egg=apache-airflow" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with default extras from 1.10.13 PyPI package and
-constraints taken from 1.10.13 tag in GitHub and pre-installed pip dependencies from the top
+This builds the production image in version 3.7 with default extras from 1.10.14 PyPI package and
+constraints taken from 1.10.14 tag in GitHub and pre-installed pip dependencies from the top
 of v1-10-test branch.
 
 .. code-block:: bash
@@ -570,14 +570,14 @@ of v1-10-test branch.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.14" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
-additional python dependencies and pre-installed pip dependencies from 1.10.13 tagged constraints.
+This builds the production image in version 3.7 with additional airflow extras from 1.10.14 PyPI package and
+additional python dependencies and pre-installed pip dependencies from 1.10.14 tagged constraints.
 
 .. code-block:: bash
 
@@ -585,15 +585,15 @@ additional python dependencies and pre-installed pip dependencies from 1.10.13 t
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.14" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
     --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql,hdfs"
     --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel oauth2client"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.14 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -602,7 +602,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \


[airflow] 13/34: Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 416b125a2c5d61723fd0132dd61db621bfaa5f76
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Nov 25 07:43:47 2020 +0000

    Use AIRFLOW_CONSTRAINTS_LOCATION when passed during docker build (#12604)
    
    Previously, even though this was passed during docker build, it was
    ignored. This commit fixes that.
    
    (cherry picked from commit c457c975b885469f09ef2e4c8d1f5836798bc820)
---
 Dockerfile    | 2 +-
 Dockerfile.ci | 9 ++++-----
 2 files changed, 5 insertions(+), 6 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 00442bc..9b96cfa 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -176,7 +176,7 @@ RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
        fi; \
        pip install --user \
           "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-          --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+          --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}" \
           && pip uninstall --yes apache-airflow; \
     fi
 
diff --git a/Dockerfile.ci b/Dockerfile.ci
index ac51a56..cac73bb 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -245,8 +245,8 @@ ENV AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS}${ADDITIONAL_AIRFLOW_EXTRAS:+,}${ADDITIONAL_
 RUN echo "Installing with extras: ${AIRFLOW_EXTRAS}."
 
 ARG AIRFLOW_CONSTRAINTS_REFERENCE="constraints-master"
-ARG AIRFLOW_CONSTRAINTS_URL="https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt"
-ENV AIRFLOW_CONSTRAINTS_URL=${AIRFLOW_CONSTRAINTS_URL}
+ARG AIRFLOW_CONSTRAINTS_LOCATION="https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt"
+ENV AIRFLOW_CONSTRAINTS_LOCATION=${AIRFLOW_CONSTRAINTS_LOCATION}
 
 # By changing the CI build epoch we can force reinstalling Airflow from the current master
 # It can also be overwritten manually by setting the AIRFLOW_CI_BUILD_EPOCH environment variable.
@@ -269,11 +269,10 @@ ENV INSTALL_AIRFLOW_VIA_PIP=${INSTALL_AIRFLOW_VIA_PIP}
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
         pip install \
             "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-                --constraint "https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_CONSTRAINTS_REFERENCE}/constraints-${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+                --constraint "${AIRFLOW_CONSTRAINTS_LOCATION}" \
                 && pip uninstall --yes apache-airflow; \
     fi
 
-
 # Generate random hex dump file so that we can determine whether it's faster to rebuild the image
 # using current cache (when our dump is the same as the remote one) or better to pull
 # the new image (when it is different)
@@ -341,7 +340,7 @@ COPY scripts/in_container/entrypoint_ci.sh /entrypoint
 RUN chmod a+x /entrypoint
 
 # We can copy everything here. The Context is filtered by dockerignore. This makes sure we are not
-# copying over stuff that is accidentally generated or that we do not need (such as .egginfo)
+# copying over stuff that is accidentally generated or that we do not need (such as egg-info)
 # if you want to add something that is missing and you expect to see it in the image you can
 # add it with ! in .dockerignore next to the airflow, test etc. directories there
 COPY . ${AIRFLOW_SOURCES}/
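
With this fix, the constraints file used for the pre-cached pip installation can be overridden
at build time. A minimal sketch (the URL shown merely spells out the default value of the build
arg for Python 3.7 on the 1.10 branch; any reachable constraints file would work the same way):

    docker build . \
        --build-arg AIRFLOW_CONSTRAINTS_LOCATION="https://raw.githubusercontent.com/apache/airflow/constraints-1-10/constraints-3.7.txt"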


[airflow] 15/34: Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5bd7613d412d4e18e86f9bd5832813960d8884aa
Author: Ruben Laguna <ru...@gmail.com>
AuthorDate: Thu Nov 26 14:54:23 2020 +0100

    Improved breeze messages for initialize-local-virtualenv and static-check --help (#12640)
    
    (cherry picked from commit cf718dbb9ba64006652ccece08e936fe130fa51b)
---
 BREEZE.rst | 4 ++++
 breeze     | 6 ++++++
 2 files changed, 10 insertions(+)

diff --git a/BREEZE.rst b/BREEZE.rst
index ce7dc6a..2c4bffbe 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1884,6 +1884,10 @@ This is the current syntax for  `./breeze <./breeze>`_:
         'breeze static-check mypy -- --files tests/core.py'
         'breeze static-check mypy -- --all-files'
 
+        To check all files that differ between your current branch and master run:
+
+        'breeze static-check all -- --from-ref $(git merge-base master HEAD) --to-ref HEAD'
+
         You can see all the options by adding --help EXTRA_ARG:
 
         'breeze static-check mypy -- --help'
diff --git a/breeze b/breeze
index ff5d7cb..0c09046 100755
--- a/breeze
+++ b/breeze
@@ -244,6 +244,8 @@ function breeze::initialize_virtualenv() {
             echo
             if [[ ${OSTYPE} == "darwin"* ]]; then
                 echo "  brew install sqlite mysql postgresql openssl"
+                echo "  export LDFLAGS=\"-L/usr/local/opt/openssl/lib\""
+                echo "  export CPPFLAGS=\"-I/usr/local/opt/openssl/include\""
             else
                 echo "  sudo apt install build-essentials python3.6-dev python3.7-dev python3.8-dev python-dev openssl \\"
                 echo "              sqlite sqlite-dev default-libmysqlclient-dev libmysqld-dev postgresql"
@@ -1757,6 +1759,10 @@ ${FORMATTED_STATIC_CHECKS}
       '${CMDNAME} static-check mypy -- --files tests/core.py'
       '${CMDNAME} static-check mypy -- --all-files'
 
+      To check all files that differ between your current branch and master run:
+
+      '${CMDNAME} static-check all -- --from-ref \$(git merge-base master HEAD) --to-ref HEAD'
+
       You can see all the options by adding --help EXTRA_ARG:
 
       '${CMDNAME} static-check mypy -- --help'
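
Taken together, the new macOS hints amount to the following sequence before initializing the
local virtualenv. A sketch, assuming Homebrew installs openssl under /usr/local/opt/openssl:

    brew install sqlite mysql postgresql openssl
    export LDFLAGS="-L/usr/local/opt/openssl/lib"
    export CPPFLAGS="-I/usr/local/opt/openssl/include"
    ./breeze initialize-local-virtualenv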


[airflow] 16/34: Allows mounting local sources for github run-id images (#12650)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2d0b41ab893c88fb824da5ba95709c95ebb3cb5a
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Fri Nov 27 12:15:03 2020 +0100

    Allows mounting local sources for github run-id images (#12650)
    
    The images that are built on GitHub can be used to reproduce
    test errors from CI - they should then be run without mounting
    local sources. However, in some cases (when you are dealing
    with dependencies, for example) it is useful to be able to
    mount the sources.
    
    This PR makes it possible.
    
    (cherry picked from commit c0843930bf5c587a054586706021a2f5b492ec42)
---
 breeze                               | 1 -
 scripts/in_container/run_ci_tests.sh | 8 ++++----
 2 files changed, 4 insertions(+), 5 deletions(-)

diff --git a/breeze b/breeze
index 0c09046..fe8f038 100755
--- a/breeze
+++ b/breeze
@@ -1085,7 +1085,6 @@ function breeze::parse_arguments() {
             export GITHUB_REGISTRY_PUSH_IMAGE_TAG="${2}"
             export CHECK_IMAGE_FOR_REBUILD="false"
             export SKIP_BUILDING_PROD_IMAGE="true"
-            export MOUNT_LOCAL_SOURCES="false"
             export SKIP_CHECK_REMOTE_IMAGE="true"
             export FAIL_ON_GITHUB_DOCKER_PULL_ERROR="true"
             shift 2
diff --git a/scripts/in_container/run_ci_tests.sh b/scripts/in_container/run_ci_tests.sh
index 8b66a94..7f2be4c 100755
--- a/scripts/in_container/run_ci_tests.sh
+++ b/scripts/in_container/run_ci_tests.sh
@@ -52,22 +52,22 @@ elif [[ "${RES}" != "0" ]]; then
     >&2 echo "*"
     >&2 echo "*     Run all tests:"
     >&2 echo "*"
-    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE}  tests"
+    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE}  tests"
     >&2 echo "*"
     >&2 echo "*     Enter docker shell:"
     >&2 echo "*"
-    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE}  shell"
+    >&2 echo "*       ./breeze --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE}  shell"
     >&2 echo "*"
     if [[ ${GITHUB_REGISTRY_PULL_IMAGE_TAG=} != "" ]]; then
         >&2 echo "*   When you do not have sources:"
         >&2 echo "*"
         >&2 echo "*     Run all tests:"
         >&2 echo "*"
-        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE} tests"
+        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE} tests"
         >&2 echo "*"
         >&2 echo "*     Enter docker shell:"
         >&2 echo "*"
-        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --test-type ${TEST_TYPE} shell"
+        >&2 echo "*      ./breeze --github-image-id ${GITHUB_REGISTRY_PULL_IMAGE_TAG} --backend ${BACKEND} ${EXTRA_ARGS}--python ${PYTHON_MAJOR_MINOR_VERSION} --db-reset --skip-mounting-local-sources --test-type ${TEST_TYPE} shell"
         >&2 echo "*"
     fi
     >&2 echo "*"


[airflow] 14/34: Adds possibility of forcing upgrade constraint by setting a label (#12635)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a4a825bd0164a15a73e5d837fb647070b185dade
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Thu Nov 26 11:02:33 2020 +0100

    Adds possibility of forcing upgrade constraint by setting a label (#12635)
    
    You can now set a label on a PR that will force upgrading to the
    latest dependencies in that PR. If a committer sets the
    "upgrade to latest dependencies" label, it will cause the PR
    to upgrade all dependencies to the latest versions matching
    the setup.py + setup.cfg configuration.
    
    (cherry picked from commit 8b9d52f0cc197832188f431a2b6e4eb256f9725b)
---
 .github/workflows/build-images-workflow-run.yml    | 18 ++-----
 .github/workflows/ci.yml                           | 10 ++--
 .github/workflows/codeql-analysis.yml              |  4 +-
 .../workflows/label_when_reviewed_workflow_run.yml |  4 +-
 CONTRIBUTING.rst                                   |  5 ++
 scripts/ci/selective_ci_checks.sh                  | 56 ++++++++++++++++------
 6 files changed, 60 insertions(+), 37 deletions(-)

diff --git a/.github/workflows/build-images-workflow-run.yml b/.github/workflows/build-images-workflow-run.yml
index 9726c5a..c5480c6 100644
--- a/.github/workflows/build-images-workflow-run.yml
+++ b/.github/workflows/build-images-workflow-run.yml
@@ -30,7 +30,6 @@ env:
   SKIP_CHECK_REMOTE_IMAGE: "true"
   DB_RESET: "true"
   VERBOSE: "true"
-  UPGRADE_TO_LATEST_CONSTRAINTS: false
   USE_GITHUB_REGISTRY: "true"
   GITHUB_REPOSITORY: ${{ github.repository }}
   GITHUB_USERNAME: ${{ github.actor }}
@@ -57,7 +56,6 @@ jobs:
       sourceEvent: ${{ steps.source-run-info.outputs.sourceEvent }}
       cacheDirective: ${{ steps.cache-directive.outputs.docker-cache }}
       buildImages: ${{ steps.build-images.outputs.buildImages }}
-      upgradeToLatestConstraints: ${{ steps.upgrade-constraints.outputs.upgradeToLatestConstraints }}
     steps:
       - name: "Get information about the original trigger of the run"
         uses: potiuk/get-workflow-origin@588cc14f9f1cdf1b8be3db816855e96422204fec  # v1_3
@@ -153,15 +151,6 @@ jobs:
           else
               echo "::set-output name=docker-cache::pulled"
           fi
-      - name: "Set upgrade to latest constraints"
-        id: upgrade-constraints
-        run: |
-          if [[ ${{ steps.cancel.outputs.sourceEvent == 'push' ||
-              steps.cancel.outputs.sourceEvent == 'scheduled' }} == 'true' ]]; then
-              echo "::set-output name=upgradeToLatestConstraints::${{ github.sha }}"
-          else
-              echo "::set-output name=upgradeToLatestConstraints::false"
-          fi
       - name: "Cancel all duplicated 'Build Image' runs"
         # We find duplicates of all "Build Image" runs - due to a missing feature
         # in GitHub Actions, we have to use Job names to match Event/Repo/Branch matching
@@ -198,6 +187,7 @@ jobs:
       GITHUB_CONTEXT: ${{ toJson(github) }}
     outputs:
       pythonVersions: ${{ steps.selective-checks.python-versions }}
+      upgradeToLatestConstraints: ${{ steps.selective-checks.outputs.upgrade-to-latest-constraints }}
       allPythonVersions: ${{ steps.selective-checks.outputs.all-python-versions }}
       defaultPythonVersion: ${{ steps.selective-checks.outputs.default-python-version }}
       run-tests: ${{ steps.selective-checks.outputs.run-tests }}
@@ -243,12 +233,12 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ needs.cancel-workflow-runs.outputs.sourceEvent }}
-          INCOMING_COMMIT_SHA: ${{ needs.cancel-workflow-runs.outputs.targetCommitSha }}
+          TARGET_COMMIT_SHA: ${{ needs.cancel-workflow-runs.outputs.targetCommitSha }}
           PR_LABELS: ${{ needs.cancel-workflow-runs.outputs.pullRequestLabels }}
         run: |
           if [[ ${EVENT_NAME} == "pull_request" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
@@ -273,7 +263,7 @@ jobs:
       BACKEND: postgres
       PYTHON_MAJOR_MINOR_VERSION: ${{ matrix.python-version }}
       GITHUB_REGISTRY_PUSH_IMAGE_TAG: ${{ github.event.workflow_run.id }}
-      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.cancel-workflow-runs.outputs.upgradeToLatestConstraints }}
+      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.build-info.outputs.upgradeToLatestConstraints }}
       DOCKER_CACHE: ${{ needs.cancel-workflow-runs.outputs.cacheDirective }}
     steps:
       - name: >
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 5931135..77cbf65 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -35,7 +35,6 @@ env:
   SKIP_CHECK_REMOTE_IMAGE: "true"
   DB_RESET: "true"
   VERBOSE: "true"
-  UPGRADE_TO_LATEST_CONSTRAINTS: ${{ github.event_name == 'push' || github.event_name == 'scheduled' }}
   DOCKER_CACHE: "pulled"
   USE_GITHUB_REGISTRY: "true"
   GITHUB_REPOSITORY: ${{ github.repository }}
@@ -69,6 +68,7 @@ jobs:
       GITHUB_CONTEXT: ${{ toJson(github) }}
     outputs:
       waitForImage: ${{ steps.wait-for-image.outputs.wait-for-image }}
+      upgradeToLatestConstraints: ${{ steps.selective-checks.outputs.upgrade-to-latest-constraints }}
       pythonVersions: ${{ steps.selective-checks.outputs.python-versions }}
       pythonVersionsListAsString: ${{ steps.selective-checks.outputs.python-versions-list-as-string }}
       defaultPythonVersion: ${{ steps.selective-checks.outputs.default-python-version }}
@@ -131,12 +131,12 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ github.event_name }}
-          INCOMING_COMMIT_SHA: ${{ github.sha }}
+          TARGET_COMMIT_SHA: ${{ github.sha }}
           PR_LABELS: "${{ steps.source-run-info.outputs.pullRequestLabels }}"
         run: |
           if [[ ${EVENT_NAME} == "pull_request" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
@@ -150,6 +150,7 @@ jobs:
     if: needs.build-info.outputs.image-build == 'true'
     env:
       BACKEND: sqlite
+      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.build-info.outputs.upgradeToLatestConstraints }}
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v2
@@ -568,7 +569,8 @@ jobs:
     needs: [build-info]
     env:
       BACKEND: sqlite
-      PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}}
+      PYTHON_MAJOR_MINOR_VERSION: ${{ needs.build-info.outputs.defaultPythonVersion }}
+      UPGRADE_TO_LATEST_CONSTRAINTS: ${{ needs.build-info.outputs.upgradeToLatestConstraints }}
     if: needs.build-info.outputs.image-build == 'true'
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index 2bf92b7..9fa7b94 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -40,11 +40,11 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ github.event_name }}
-          INCOMING_COMMIT_SHA: ${{ github.sha }}
+          TARGET_COMMIT_SHA: ${{ github.sha }}
         run: |
           if [[ ${EVENT_NAME} == "pull_request" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
diff --git a/.github/workflows/label_when_reviewed_workflow_run.yml b/.github/workflows/label_when_reviewed_workflow_run.yml
index 6e45038..6ea15b0 100644
--- a/.github/workflows/label_when_reviewed_workflow_run.yml
+++ b/.github/workflows/label_when_reviewed_workflow_run.yml
@@ -75,12 +75,12 @@ jobs:
         id: selective-checks
         env:
           EVENT_NAME: ${{ steps.source-run-info.outputs.sourceEvent }}
-          INCOMING_COMMIT_SHA: ${{ steps.source-run-info.outputs.targetCommitSha }}
+          TARGET_COMMIT_SHA: ${{ steps.source-run-info.outputs.targetCommitSha }}
           PR_LABELS: ${{ steps.source-run-info.outputs.pullRequestLabels }}
         run: |
           if [[ ${EVENT_NAME} == "pull_request_review" ]]; then
             # Run selective checks
-            ./scripts/ci/selective_ci_checks.sh "${INCOMING_COMMIT_SHA}"
+            ./scripts/ci/selective_ci_checks.sh "${TARGET_COMMIT_SHA}"
           else
             # Run all checks
             ./scripts/ci/selective_ci_checks.sh
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 6d34026..61883e7 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -321,6 +321,11 @@ Step 4: Prepare PR
        the "full tests needed" label is set for your PR. Additional check is set that prevents from
        accidental merging of the request until full matrix of tests succeeds for the PR.
 
+     * when your change has the "upgrade to latest dependencies" label set, constraints will be
+       automatically upgraded to the latest constraints matching your setup.py. This is useful
+       when you want to force an upgrade to the latest versions of dependencies. You can ask
+       committers to set the label for you when you need it in your PR.
+
    More details about the PR workflow be found in `PULL_REQUEST_WORKFLOW.rst <PULL_REQUEST_WORKFLOW.rst>`_.
 
 
diff --git a/scripts/ci/selective_ci_checks.sh b/scripts/ci/selective_ci_checks.sh
index 3c7132d..a6c66eb 100755
--- a/scripts/ci/selective_ci_checks.sh
+++ b/scripts/ci/selective_ci_checks.sh
@@ -34,16 +34,39 @@ if [[ ${PR_LABELS=} == *"full tests needed"* ]]; then
     echo
     echo "Found the right PR labels in '${PR_LABELS=}': 'full tests needed''"
     echo
-    FULL_TESTS_NEEDED="true"
+    FULL_TESTS_NEEDED_LABEL="true"
 else
     echo
     echo "Did not find the right PR labels in '${PR_LABELS=}': 'full tests needed'"
     echo
-    FULL_TESTS_NEEDED="false"
+    FULL_TESTS_NEEDED_LABEL="false"
+fi
+
+if [[ ${PR_LABELS=} == *"upgrade to latest dependencies"* ]]; then
+    echo
+    echo "Found the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies''"
+    echo
+    UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="true"
+else
+    echo
+    echo "Did not find the right PR labels in '${PR_LABELS=}': 'upgrade to latest dependencies'"
+    echo
+    UPGRADE_TO_LATEST_CONSTRAINTS_LABEL="false"
 fi
 
 function output_all_basic_variables() {
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ "${UPGRADE_TO_LATEST_CONSTRAINTS_LABEL}" == "true" ||
+            ${EVENT_NAME} == 'push' || ${EVENT_NAME} == "scheduled" ]]; then
+        # Trigger upgrading to latest constraints when the label is set or when
+        # the event is a push or scheduled run. The SHA of the merge commit is used
+        # as the value, which invalidates the docker image layer and ensures each
+        # such build gets truly latest constraints, not the ones cached in the image
+        initialization::ga_output upgrade-to-latest-constraints "${INCOMING_COMMIT_SHA}"
+    else
+        initialization::ga_output upgrade-to-latest-constraints "false"
+    fi
+
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output python-versions \
             "$(initialization::parameters_to_json "${CURRENT_PYTHON_MAJOR_MINOR_VERSIONS[@]}")"
         initialization::ga_output all-python-versions \
@@ -60,7 +83,7 @@ function output_all_basic_variables() {
     fi
     initialization::ga_output default-python-version "${DEFAULT_PYTHON_MAJOR_MINOR_VERSION}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output kubernetes-versions \
             "$(initialization::parameters_to_json "${CURRENT_KUBERNETES_VERSIONS[@]}")"
     else
@@ -73,7 +96,7 @@ function output_all_basic_variables() {
         "$(initialization::parameters_to_json "${CURRENT_KUBERNETES_MODES[@]}")"
     initialization::ga_output default-kubernetes-mode "${KUBERNETES_MODE}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output postgres-versions \
             "$(initialization::parameters_to_json "${CURRENT_POSTGRES_VERSIONS[@]}")"
     else
@@ -82,7 +105,7 @@ function output_all_basic_variables() {
     fi
     initialization::ga_output default-postgres-version "${POSTGRES_VERSION}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output mysql-versions \
             "$(initialization::parameters_to_json "${CURRENT_MYSQL_VERSIONS[@]}")"
     else
@@ -100,7 +123,7 @@ function output_all_basic_variables() {
         "$(initialization::parameters_to_json "${CURRENT_HELM_VERSIONS[@]}")"
     initialization::ga_output default-helm-version "${HELM_VERSION}"
 
-    if [[ ${FULL_TESTS_NEEDED} == "true" ]]; then
+    if [[ ${FULL_TESTS_NEEDED_LABEL} == "true" ]]; then
         initialization::ga_output postgres-exclude '[{ "python-version": "3.6" }]'
         initialization::ga_output mysql-exclude '[{ "python-version": "3.7" }]'
         initialization::ga_output sqlite-exclude '[{ "python-version": "3.8" }]'
@@ -114,9 +137,6 @@ function output_all_basic_variables() {
 }
 
 function get_changed_files() {
-    INCOMING_COMMIT_SHA="${1}"
-    readonly INCOMING_COMMIT_SHA
-
     echo
     echo "Incoming commit SHA: ${INCOMING_COMMIT_SHA}"
     echo
@@ -414,14 +434,20 @@ if (($# < 1)); then
     echo
     echo "No Commit SHA - running all tests (likely direct master merge, or scheduled run)!"
     echo
-    # override FULL_TESTS_NEEDED in master/scheduled run
-    FULL_TESTS_NEEDED="true"
-    readonly FULL_TESTS_NEEDED
+    INCOMING_COMMIT_SHA=""
+    readonly INCOMING_COMMIT_SHA
+    # override FULL_TESTS_NEEDED_LABEL in master/scheduled run
+    FULL_TESTS_NEEDED_LABEL="true"
+    readonly FULL_TESTS_NEEDED_LABEL
     output_all_basic_variables
     set_outputs_run_everything_and_exit
+else
+    INCOMING_COMMIT_SHA="${1}"
+    readonly INCOMING_COMMIT_SHA
 fi
 
-readonly FULL_TESTS_NEEDED
+
+readonly FULL_TESTS_NEEDED_LABEL
 output_all_basic_variables
 
 image_build_needed="false"
@@ -429,7 +455,7 @@ docs_build_needed="false"
 tests_needed="false"
 kubernetes_tests_needed="false"
 
-get_changed_files "${1}"
+get_changed_files
 run_all_tests_if_environment_files_changed
 check_if_docs_should_be_generated
 check_if_helm_tests_should_be_run


[airflow] 12/34: Adds missing licence headers (#12593)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 43784ce0f46060f7809dde13f5dd247bc4d91c41
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Wed Nov 25 00:58:01 2020 +0100

    Adds missing licence headers (#12593)
    
    (cherry picked from commit 58e21ed949203a7ac79bf96c72b917796c5f4d21)
---
 .pre-commit-config.yaml                  |  2 +-
 scripts/ci/dockerfiles/bats/Dockerfile   | 17 +++++++++++++++++
 scripts/ci/dockerfiles/stress/Dockerfile | 17 +++++++++++++++++
 3 files changed, 35 insertions(+), 1 deletion(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 2e27e50..4c6b733 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -46,7 +46,7 @@ repos:
           - license-templates/LICENSE.txt
           - --fuzzy-match-generates-todo
         files: >
-          \.properties$|\.cfg$|\.conf$|\.ini$|\.ldif$|\.readthedocs$|\.service$|\.tf$|^Dockerfile.*$
+          \.properties$|\.cfg$|\.conf$|\.ini$|\.ldif$|\.readthedocs$|\.service$|\.tf$|Dockerfile.*$
       - id: insert-license
         name: Add license for all rst files
         exclude: ^\.github/.*$
diff --git a/scripts/ci/dockerfiles/bats/Dockerfile b/scripts/ci/dockerfiles/bats/Dockerfile
index 01db50d..af21f4d 100644
--- a/scripts/ci/dockerfiles/bats/Dockerfile
+++ b/scripts/ci/dockerfiles/bats/Dockerfile
@@ -1,3 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck disable=SC1091
 FROM debian:buster-slim
 
 ARG BATS_VERSION
diff --git a/scripts/ci/dockerfiles/stress/Dockerfile b/scripts/ci/dockerfiles/stress/Dockerfile
index 3041d21..92df101 100644
--- a/scripts/ci/dockerfiles/stress/Dockerfile
+++ b/scripts/ci/dockerfiles/stress/Dockerfile
@@ -1,3 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+# shellcheck disable=SC1091
 ARG ALPINE_VERSION="3.12"
 
 FROM alpine:${ALPINE_VERSION}


[airflow] 03/34: Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit daa725b9f359c23a374a808ad53bbd7ecefce0d9
Author: Florent Chehab <fc...@meilleursagents.com>
AuthorDate: Tue Nov 17 10:11:53 2020 +0100

    Support creation of configmaps & secrets and extra env & envFrom configuration in Helm Chart (#12164)
    
    * Enable provisioning of extra secrets and configmaps in helm chart
    
    Added 2 new values:
    *  extraSecrets
    *  extraConfigMaps
    
    Those values enable the provisioning of ConfigMaps
    and Secrets directly from the airflow chart.

    Those objects can be used for storing airflow variables
    or (secret) connection info, for instance
    (the plan is to add support for extraEnv and extraEnvFrom later).
    
    Docs and tests updated accordingly.
    
    * Add support for extra env and envFrom items in helm chart
    
    Added 2 new values:
    *  extraEnv
    *  extraEnvFrom
    
    Those values will be added to the definition of
    airflow containers. They are expected to be strings
    (they can be templated).

    Those new values won't be supported by the "legacy" kubernetes
    executor configuration (you must use the pod template).

    Therefore, the value 'env' is also deprecated, as it's
    essentially a duplicate of extraEnv.
    
    Docs and tests updated accordingly.
    
    (cherry picked from commit 56ee2bb3cb6838df0181d753c24c72d0f4938b0a)
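
Pulling the new values together, a minimal values-file sketch modelled on the chart tests
below (the release-relative names, variables and connection string are illustrative only):

    # extra-values.yaml - the keys and all of these values are templated strings
    extraConfigMaps:
      "{{ .Release.Name }}-airflow-variables":
        data: |
          AIRFLOW_VAR_HELLO_MESSAGE: "Hi!"
    extraSecrets:
      "{{ .Release.Name }}-airflow-connections":
        stringData: |
          AIRFLOW_CONN_GCP: "gcp_connection_string"
    extraEnv: |
      - name: AIRFLOW__CORE__LOAD_EXAMPLES
        value: 'True'
    extraEnvFrom: |
      - configMapRef:
          name: "{{ .Release.Name }}-airflow-variables"
      - secretRef:
          name: "{{ .Release.Name }}-airflow-connections"

rendered, for example, with helm 3: helm template my-release chart/ --values extra-values.yaml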
---
 chart/README.md                                    |   6 +-
 chart/files/pod-template-file.kubernetes-helm-yaml |  11 +-
 chart/templates/_helpers.yaml                      |  15 ++-
 chart/templates/{ => configmaps}/configmap.yaml    |   0
 chart/templates/configmaps/extra-configmaps.yaml   |  45 ++++++++
 chart/templates/create-user-job.yaml               |   2 +
 chart/templates/flower/flower-deployment.yaml      |   2 +-
 chart/templates/migrate-database-job.yaml          |   2 +
 .../templates/scheduler/scheduler-deployment.yaml  |  10 +-
 chart/templates/secrets/extra-secrets.yaml         |  51 +++++++++
 .../templates/webserver/webserver-deployment.yaml  |   8 +-
 chart/templates/workers/worker-deployment.yaml     |  10 +-
 chart/tests/helm_template_generator.py             |  12 +++
 chart/tests/test_extra_configmaps_secrets.py       | 110 +++++++++++++++++++
 chart/tests/test_extra_env_env_from.py             | 117 +++++++++++++++++++++
 chart/values.schema.json                           |  44 ++++++++
 chart/values.yaml                                  |  51 +++++++++
 17 files changed, 485 insertions(+), 11 deletions(-)

diff --git a/chart/README.md b/chart/README.md
index d56f114..c5106be 100644
--- a/chart/README.md
+++ b/chart/README.md
@@ -158,8 +158,12 @@ The following tables lists the configurable parameters of the Airflow chart and
 | `images.pgbouncerExporter.repository`                 | Docker repository to pull image from. Update this to deploy a custom image                                   | `apache/airflow`                                  |
 | `images.pgbouncerExporter.tag`                        | Docker image tag to pull image from. Update this to deploy a new custom image tag                            | `airflow-pgbouncer-exporter-2020.09.25-0.5.0`     |
 | `images.pgbouncerExporter.pullPolicy`                 | PullPolicy for pgbouncer-exporter image                                                                      | `IfNotPresent`                                    |
-| `env`                                                 | Environment variables key/values to mount into Airflow pods                                                  | `[]`                                              |
+| `env`                                                 | Environment variables key/values to mount into Airflow pods (deprecated, prefer using extraEnv)              | `[]`                                              |
 | `secret`                                              | Secret name/key pairs to mount into Airflow pods                                                             | `[]`                                              |
+| `extraEnv`                                            | Extra env 'items' that will be added to the definition of airflow containers                                 | `~`                                               |
+| `extraEnvFrom`                                        | Extra envFrom 'items' that will be added to the definition of airflow containers                             | `~`                                               |
+| `extraSecrets`                                        | Extra Secrets that will be managed by the chart                                                              | `{}`                                              |
+| `extraConfigMaps`                                     | Extra ConfigMaps that will be managed by the chart                                                           | `{}`                                              |
 | `data.metadataSecretName`                             | Secret name to mount Airflow connection string from                                                          | `~`                                               |
 | `data.resultBackendSecretName`                        | Secret name to mount Celery result backend connection string from                                            | `~`                                               |
 | `data.metadataConection`                              | Field separated connection data (alternative to secret name)                                                 | `{}`                                              |
diff --git a/chart/files/pod-template-file.kubernetes-helm-yaml b/chart/files/pod-template-file.kubernetes-helm-yaml
index 5c4fb92..33ae7b5 100644
--- a/chart/files/pod-template-file.kubernetes-helm-yaml
+++ b/chart/files/pod-template-file.kubernetes-helm-yaml
@@ -27,12 +27,13 @@ spec:
   containers:
     - args: []
       command: []
+      envFrom:
+      {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 6 }}
       env:
-      - name: AIRFLOW__CORE__EXECUTOR
-        value: LocalExecutor
-{{- include "standard_airflow_environment" . | indent 4 }}
-{{- include "custom_airflow_environment" . | indent 4 }}
-      envFrom: []
+        - name: AIRFLOW__CORE__EXECUTOR
+          value: LocalExecutor
+{{- include "standard_airflow_environment" . | indent 6}}
+{{- include "custom_airflow_environment" . | indent 6 }}
       image: {{ template "pod_template_image" . }}
       imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
       name: base
diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index 059d64d..df7b158 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -85,12 +85,25 @@
         name: {{ $config.secretName }}
         key: {{ default "value" $config.secretKey }}
   {{- end }}
-    {{- if or (eq $.Values.executor "KubernetesExecutor") (eq $.Values.executor "CeleryKubernetesExecutor") }}
+  {{- if or (eq $.Values.executor "KubernetesExecutor") (eq $.Values.executor "CeleryKubernetesExecutor") }}
     {{- range $i, $config := .Values.secret }}
   - name: AIRFLOW__KUBERNETES_SECRETS__{{ $config.envName }}
     value: {{ printf "%s=%s" $config.secretName $config.secretKey }}
     {{- end }}
   {{ end }}
+  # Extra env
+  {{- $Global := . }}
+  {{- with .Values.extraEnv }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+
+{{/* User defined Airflow environment from */}}
+{{- define "custom_airflow_environment_from" }}
+  {{- $Global := . }}
+  {{- with .Values.extraEnvFrom }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
 {{- end }}
 
 {{/*  Git ssh key volume */}}
diff --git a/chart/templates/configmap.yaml b/chart/templates/configmaps/configmap.yaml
similarity index 100%
rename from chart/templates/configmap.yaml
rename to chart/templates/configmaps/configmap.yaml
diff --git a/chart/templates/configmaps/extra-configmaps.yaml b/chart/templates/configmaps/extra-configmaps.yaml
new file mode 100644
index 0000000..a186aba
--- /dev/null
+++ b/chart/templates/configmaps/extra-configmaps.yaml
@@ -0,0 +1,45 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+####################################################
+## Extra ConfigMaps provisioned via the chart values
+####################################################
+{{- $Global := . }}
+{{- range $configMapName, $configMapContent := .Values.extraConfigMaps }}
+---
+apiVersion: v1
+kind: ConfigMap
+metadata:
+  name: {{ tpl $configMapName $Global | quote }}
+  labels:
+    release: {{ $Global.Release.Name }}
+    chart: "{{ $Global.Chart.Name }}-{{ $Global.Chart.Version }}"
+    heritage: {{ $Global.Release.Service }}
+  annotations:
+    "helm.sh/hook": "pre-install,pre-upgrade"
+    "helm.sh/hook-delete-policy": "before-hook-creation"
+    "helm.sh/hook-weight": "0"
+{{- with $Global.Values.labels }}
+{{ toYaml . | indent 4 }}
+{{- end }}
+{{- if $configMapContent.data }}
+data:
+  {{- with $configMapContent.data }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+{{- end }}
diff --git a/chart/templates/create-user-job.yaml b/chart/templates/create-user-job.yaml
index 27a0363..4df7dd6 100644
--- a/chart/templates/create-user-job.yaml
+++ b/chart/templates/create-user-job.yaml
@@ -79,6 +79,8 @@ spec:
             - {{ .Values.webserver.defaultUser.lastName }}
             - "-p"
             - {{ .Values.webserver.defaultUser.password }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/flower/flower-deployment.yaml b/chart/templates/flower/flower-deployment.yaml
index c5d1f91..3a33369 100644
--- a/chart/templates/flower/flower-deployment.yaml
+++ b/chart/templates/flower/flower-deployment.yaml
@@ -33,7 +33,7 @@ metadata:
 {{ toYaml . | indent 4 }}
 {{- end }}
   annotations:
-    checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+    checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
 spec:
   replicas: 1
   selector:
diff --git a/chart/templates/migrate-database-job.yaml b/chart/templates/migrate-database-job.yaml
index 37a9b2d..8639648 100644
--- a/chart/templates/migrate-database-job.yaml
+++ b/chart/templates/migrate-database-job.yaml
@@ -62,6 +62,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           # Support running against 1.10.x and 2.0.0dev/master
           args: ["bash", "-c", "airflow upgradedb || airflow db upgrade"]
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/scheduler/scheduler-deployment.yaml b/chart/templates/scheduler/scheduler-deployment.yaml
index 9a928a6..61dcade 100644
--- a/chart/templates/scheduler/scheduler-deployment.yaml
+++ b/chart/templates/scheduler/scheduler-deployment.yaml
@@ -65,7 +65,9 @@ spec:
         checksum/metadata-secret: {{ include (print $.Template.BasePath "/secrets/metadata-connection-secret.yaml") . | sha256sum }}
         checksum/result-backend-secret: {{ include (print $.Template.BasePath "/secrets/result-backend-connection-secret.yaml") . | sha256sum }}
         checksum/pgbouncer-config-secret: {{ include (print $.Template.BasePath "/secrets/pgbouncer-config-secret.yaml") . | sha256sum }}
-        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
+        checksum/extra-configmaps: {{ include (print $.Template.BasePath "/configmaps/extra-configmaps.yaml") . | sha256sum }}
+        checksum/extra-secrets: {{ include (print $.Template.BasePath "/secrets/extra-secrets.yaml") . | sha256sum }}
         {{- if .Values.scheduler.safeToEvict }}
         cluster-autoscaler.kubernetes.io/safe-to-evict: "true"
         {{- end }}
@@ -95,6 +97,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args:
           {{- include "wait-for-migrations-command" . | indent 10 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -104,6 +108,8 @@ spec:
           image: {{ template "airflow_image" . }}
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args: ["scheduler"]
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -184,6 +190,8 @@ spec:
               mountPath: {{ template "airflow_config_path" . }}
               subPath: airflow.cfg
               readOnly: true
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/secrets/extra-secrets.yaml b/chart/templates/secrets/extra-secrets.yaml
new file mode 100644
index 0000000..1326aa2
--- /dev/null
+++ b/chart/templates/secrets/extra-secrets.yaml
@@ -0,0 +1,51 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+#################################################
+## Extra Secrets provisioned via the chart values
+#################################################
+{{- $Global := . }}
+{{- range $secretName, $secretContent := .Values.extraSecrets }}
+---
+apiVersion: v1
+kind: Secret
+metadata:
+  name: {{ tpl $secretName $Global | quote }}
+  labels:
+    release: {{ $Global.Release.Name }}
+    chart: "{{ $Global.Chart.Name }}-{{ $Global.Chart.Version }}"
+    heritage: {{ $Global.Release.Service }}
+  annotations:
+    "helm.sh/hook": "pre-install,pre-upgrade"
+    "helm.sh/hook-delete-policy": "before-hook-creation"
+    "helm.sh/hook-weight": "0"
+{{- with $Global.Values.labels }}
+{{ toYaml . | indent 4 }}
+{{- end }}
+{{- if $secretContent.data }}
+data:
+  {{- with $secretContent.data }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+{{- if $secretContent.stringData }}
+stringData:
+  {{- with $secretContent.stringData }}
+  {{- tpl . $Global | nindent 2 }}
+  {{- end }}
+{{- end }}
+{{- end }}
diff --git a/chart/templates/webserver/webserver-deployment.yaml b/chart/templates/webserver/webserver-deployment.yaml
index a3c42c0..25b6b63 100644
--- a/chart/templates/webserver/webserver-deployment.yaml
+++ b/chart/templates/webserver/webserver-deployment.yaml
@@ -54,7 +54,9 @@ spec:
       annotations:
         checksum/metadata-secret: {{ include (print $.Template.BasePath "/secrets/metadata-connection-secret.yaml") . | sha256sum }}
         checksum/pgbouncer-config-secret: {{ include (print $.Template.BasePath "/secrets/pgbouncer-config-secret.yaml") . | sha256sum }}
-        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
+        checksum/extra-configmaps: {{ include (print $.Template.BasePath "/configmaps/extra-configmaps.yaml") . | sha256sum }}
+        checksum/extra-secrets: {{ include (print $.Template.BasePath "/secrets/extra-secrets.yaml") . | sha256sum }}
         {{- if .Values.airflowPodAnnotations }}
         {{- toYaml .Values.airflowPodAnnotations | nindent 8 }}
         {{- end }}
@@ -80,6 +82,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args:
           {{- include "wait-for-migrations-command" . | indent 10 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -136,6 +140,8 @@ spec:
             timeoutSeconds: {{ .Values.webserver.readinessProbe.timeoutSeconds | default 30 }}
             failureThreshold: {{ .Values.webserver.readinessProbe.failureThreshold | default 20 }}
             periodSeconds: {{ .Values.webserver.readinessProbe.periodSeconds | default 5 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
diff --git a/chart/templates/workers/worker-deployment.yaml b/chart/templates/workers/worker-deployment.yaml
index 77d5fe2..40fbbe1 100644
--- a/chart/templates/workers/worker-deployment.yaml
+++ b/chart/templates/workers/worker-deployment.yaml
@@ -56,7 +56,9 @@ spec:
         checksum/metadata-secret: {{ include (print $.Template.BasePath "/secrets/metadata-connection-secret.yaml") . | sha256sum }}
         checksum/result-backend-secret: {{ include (print $.Template.BasePath "/secrets/result-backend-connection-secret.yaml") . | sha256sum }}
         checksum/pgbouncer-config-secret: {{ include (print $.Template.BasePath "/secrets/pgbouncer-config-secret.yaml") . | sha256sum }}
-        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
+        checksum/airflow-config: {{ include (print $.Template.BasePath "/configmaps/configmap.yaml") . | sha256sum }}
+        checksum/extra-configmaps: {{ include (print $.Template.BasePath "/configmaps/extra-configmaps.yaml") . | sha256sum }}
+        checksum/extra-secrets: {{ include (print $.Template.BasePath "/secrets/extra-secrets.yaml") . | sha256sum }}
         {{- if .Values.workers.safeToEvict }}
         cluster-autoscaler.kubernetes.io/safe-to-evict: "true"
         {{- end }}
@@ -101,6 +103,8 @@ spec:
           imagePullPolicy: {{ .Values.images.airflow.pullPolicy }}
           args:
           {{- include "wait-for-migrations-command" . | indent 10 }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -146,6 +150,8 @@ spec:
             - name: dags
               mountPath: {{ template "airflow_dags_mount_path" . }}
 {{- end }}
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
           {{- include "custom_airflow_environment" . | indent 10 }}
           {{- include "standard_airflow_environment" . | indent 10 }}
@@ -195,6 +201,8 @@ spec:
             - name: kerberos-ccache
               mountPath: {{ .Values.kerberos.ccacheMountPath | quote }}
               readOnly: false
+          envFrom:
+          {{- include "custom_airflow_environment_from" . | default "\n  []" | indent 10 }}
           env:
             - name: KRB5_CONFIG
               value:  {{ .Values.kerberos.configPath | quote }}
diff --git a/chart/tests/helm_template_generator.py b/chart/tests/helm_template_generator.py
index ba870ed..d8e3f49 100644
--- a/chart/tests/helm_template_generator.py
+++ b/chart/tests/helm_template_generator.py
@@ -19,6 +19,7 @@ import subprocess
 import sys
 from functools import lru_cache
 from tempfile import NamedTemporaryFile
+from typing import Any, Dict, Tuple
 
 import jmespath
 import jsonschema
@@ -81,6 +82,17 @@ def render_chart(name="RELEASE-NAME", values=None, show_only=None):
         return k8s_objects
 
 
+def prepare_k8s_lookup_dict(k8s_objects) -> Dict[Tuple[str, str], Dict[str, Any]]:
+    """
+    Helper to create a lookup dict from k8s_objects.
+    The keys of the dict are the k8s object's kind and name
+    """
+    k8s_obj_by_key = {
+        (k8s_object["kind"], k8s_object["metadata"]["name"]): k8s_object for k8s_object in k8s_objects
+    }
+    return k8s_obj_by_key
+
+
 def render_k8s_object(obj, type_to_render):
     """
     Function that renders dictionaries into k8s objects. For helm chart testing only.
diff --git a/chart/tests/test_extra_configmaps_secrets.py b/chart/tests/test_extra_configmaps_secrets.py
new file mode 100644
index 0000000..378d80e
--- /dev/null
+++ b/chart/tests/test_extra_configmaps_secrets.py
@@ -0,0 +1,110 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import textwrap
+import unittest
+from base64 import b64encode
+
+import yaml
+
+from tests.helm_template_generator import prepare_k8s_lookup_dict, render_chart
+
+RELEASE_NAME = "TEST-EXTRA-CONFIGMAPS-SECRETS"
+
+
+class ExtraConfigMapsSecretsTest(unittest.TestCase):
+    def test_extra_configmaps(self):
+        values_str = textwrap.dedent(
+            """
+            extraConfigMaps:
+              "{{ .Release.Name }}-airflow-variables":
+                data: |
+                  AIRFLOW_VAR_HELLO_MESSAGE: "Hi!"
+                  AIRFLOW_VAR_KUBERNETES_NAMESPACE: "{{ .Release.Namespace }}"
+              "{{ .Release.Name }}-other-variables":
+                data: |
+                  HELLO_WORLD: "Hi again!"
+            """
+        )
+        values = yaml.safe_load(values_str)
+        k8s_objects = render_chart(
+            RELEASE_NAME, values=values, show_only=["templates/configmaps/extra-configmaps.yaml"]
+        )
+        k8s_objects_by_key = prepare_k8s_lookup_dict(k8s_objects)
+
+        all_expected_keys = [
+            ("ConfigMap", f"{RELEASE_NAME}-airflow-variables"),
+            ("ConfigMap", f"{RELEASE_NAME}-other-variables"),
+        ]
+        self.assertEqual(set(k8s_objects_by_key.keys()), set(all_expected_keys))
+
+        all_expected_data = [
+            {"AIRFLOW_VAR_HELLO_MESSAGE": "Hi!", "AIRFLOW_VAR_KUBERNETES_NAMESPACE": "default"},
+            {"HELLO_WORLD": "Hi again!"},
+        ]
+        for expected_key, expected_data in zip(all_expected_keys, all_expected_data):
+            configmap_obj = k8s_objects_by_key[expected_key]
+            self.assertEqual(configmap_obj["data"], expected_data)
+
+    def test_extra_secrets(self):
+        values_str = textwrap.dedent(
+            """
+            extraSecrets:
+              "{{ .Release.Name }}-airflow-connections":
+                data: |
+                  AIRFLOW_CON_AWS: {{ printf "aws_connection_string" | b64enc }}
+                stringData: |
+                  AIRFLOW_CON_GCP: "gcp_connection_string"
+              "{{ .Release.Name }}-other-secrets":
+                data: |
+                  MY_SECRET_1: {{ printf "MY_SECRET_1" | b64enc }}
+                  MY_SECRET_2: {{ printf "MY_SECRET_2" | b64enc }}
+                stringData: |
+                  MY_SECRET_3: "MY_SECRET_3"
+                  MY_SECRET_4: "MY_SECRET_4"
+            """
+        )
+        values = yaml.safe_load(values_str)
+        k8s_objects = render_chart(
+            RELEASE_NAME, values=values, show_only=["templates/secrets/extra-secrets.yaml"]
+        )
+        k8s_objects_by_key = prepare_k8s_lookup_dict(k8s_objects)
+
+        all_expected_keys = [
+            ("Secret", f"{RELEASE_NAME}-airflow-connections"),
+            ("Secret", f"{RELEASE_NAME}-other-secrets"),
+        ]
+        self.assertEqual(set(k8s_objects_by_key.keys()), set(all_expected_keys))
+
+        all_expected_data = [
+            {"AIRFLOW_CON_AWS": b64encode(b"aws_connection_string").decode("utf-8")},
+            {
+                "MY_SECRET_1": b64encode(b"MY_SECRET_1").decode("utf-8"),
+                "MY_SECRET_2": b64encode(b"MY_SECRET_2").decode("utf-8"),
+            },
+        ]
+
+        all_expected_string_data = [
+            {"AIRFLOW_CON_GCP": "gcp_connection_string"},
+            {"MY_SECRET_3": "MY_SECRET_3", "MY_SECRET_4": "MY_SECRET_4"},
+        ]
+        for expected_key, expected_data, expected_string_data in zip(
+            all_expected_keys, all_expected_data, all_expected_string_data
+        ):
+            configmap_obj = k8s_objects_by_key[expected_key]
+            self.assertEqual(configmap_obj["data"], expected_data)
+            self.assertEqual(configmap_obj["stringData"], expected_string_data)
diff --git a/chart/tests/test_extra_env_env_from.py b/chart/tests/test_extra_env_env_from.py
new file mode 100644
index 0000000..170fc7a
--- /dev/null
+++ b/chart/tests/test_extra_env_env_from.py
@@ -0,0 +1,117 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import textwrap
+import unittest
+
+import jmespath
+import yaml
+from parameterized import parameterized
+
+from tests.helm_template_generator import prepare_k8s_lookup_dict, render_chart
+
+RELEASE_NAME = "TEST-EXTRA-ENV-ENV-FROM"
+
+# Test Params: k8s object key and paths with expected env / envFrom
+PARAMS = [
+    (
+        ("Job", "{}-create-user".format(RELEASE_NAME)),
+        ("spec.template.spec.containers[0]",),
+    ),
+    (
+        ("Job", "{}-run-airflow-migrations".format(RELEASE_NAME)),
+        ("spec.template.spec.containers[0]",),
+    ),
+    (
+        ("Deployment", "{}-scheduler".format(RELEASE_NAME)),
+        (
+            "spec.template.spec.initContainers[0]",
+            "spec.template.spec.containers[0]",
+        ),
+    ),
+    (
+        ("StatefulSet", "{}-worker".format(RELEASE_NAME)),
+        (
+            "spec.template.spec.initContainers[0]",
+            "spec.template.spec.containers[0]",
+        ),
+    ),
+    (
+        ("Deployment", "{}-webserver".format(RELEASE_NAME)),
+        ("spec.template.spec.initContainers[0]", "spec.template.spec.containers[0]"),
+    ),
+]
+
+
+class ExtraEnvEnvFromTest(unittest.TestCase):
+    @classmethod
+    def setUpClass(cls) -> None:
+        values_str = textwrap.dedent(
+            """
+            executor: "CeleryExecutor"
+            extraEnvFrom: |
+              - secretRef:
+                  name: '{{ .Release.Name }}-airflow-connections'
+              - configMapRef:
+                  name: '{{ .Release.Name }}-airflow-variables'
+            extraEnv: |
+              - name: PLATFORM
+                value: FR
+              - name: TEST
+                valueFrom:
+                  secretKeyRef:
+                    name: '{{ .Release.Name }}-some-secret'
+                    key: connection
+            """
+        )
+        values = yaml.safe_load(values_str)
+        cls.k8s_objects = render_chart(RELEASE_NAME, values=values)  # type: ignore
+        cls.k8s_objects_by_key = prepare_k8s_lookup_dict(cls.k8s_objects)  # type: ignore
+
+    @parameterized.expand(PARAMS)
+    def test_extra_env(self, k8s_obj_key, env_paths):
+        expected_env_as_str = textwrap.dedent(
+            """
+            - name: PLATFORM
+              value: FR
+            - name: TEST
+              valueFrom:
+                secretKeyRef:
+                  key: connection
+                  name: {}-some-secret
+            """.format(RELEASE_NAME)
+        ).lstrip()
+        k8s_object = self.k8s_objects_by_key[k8s_obj_key]
+        for path in env_paths:
+            env = jmespath.search("{}.env".format(path), k8s_object)
+            self.assertIn(expected_env_as_str, yaml.dump(env))
+
+    @parameterized.expand(PARAMS)
+    def test_extra_env_from(self, k8s_obj_key, env_from_paths):
+        expected_env_from_as_str = textwrap.dedent(
+            """
+            - secretRef:
+                name: {}-airflow-connections
+            - configMapRef:
+                name: {}-airflow-variables
+            """.format(RELEASE_NAME, RELEASE_NAME)
+        ).lstrip()
+
+        k8s_object = self.k8s_objects_by_key[k8s_obj_key]
+        for path in env_from_paths:
+            env_from = jmespath.search("{}.envFrom".format(path), k8s_object)
+            self.assertIn(expected_env_from_as_str, yaml.dump(env_from))
diff --git a/chart/values.schema.json b/chart/values.schema.json
index 7881c82..f1d8271 100644
--- a/chart/values.schema.json
+++ b/chart/values.schema.json
@@ -343,6 +343,50 @@
             "description": "Secrets for all airflow containers.",
             "type": "array"
         },
+        "extraEnv": {
+          "description": "Extra env 'items' that will be added to the definition of airflow containers ; a string is expected (can be templated).",
+          "type": ["null", "string"]
+        },
+        "extraEnvFrom": {
+          "description": "Extra envFrom 'items' that will be added to the definition of airflow containers ; a string is expected (can be templated).",
+          "type": ["null", "string"]
+        },
+        "extraSecrets": {
+          "description": "Extra secrets that will be managed by the chart.",
+          "type": "object",
+          "additionalProperties": {
+            "description": "Name of the secret (can be templated).",
+            "type": "object",
+            "minProperties": 1,
+            "additionalProperties": false,
+            "properties": {
+              "data": {
+                "description": "Content **as string** for the 'data' item of the secret (can be templated)",
+                "type": "string"
+              },
+              "stringData": {
+                "description": "Content **as string** for the 'stringData' item of the secret (can be templated)",
+                "type": "string"
+              }
+            }
+          }
+        },
+        "extraConfigMaps": {
+          "description": "Extra configMaps that will be managed by the chart.",
+          "type": "object",
+          "additionalProperties": {
+            "description": "Name of the configMap (can be templated).",
+            "type": "object",
+            "minProperties": 1,
+            "additionalProperties": false,
+            "properties": {
+              "data": {
+                "description": "Content **as string** for the 'data' item of the secret (can be templated)",
+                "type": "string"
+              }
+            }
+          }
+        },
         "data": {
             "description": "Airflow database configuration.",
             "type": "object",
diff --git a/chart/values.yaml b/chart/values.yaml
index 0f5b313..091a0c9 100644
--- a/chart/values.yaml
+++ b/chart/values.yaml
@@ -163,6 +163,57 @@ secret: []
 #   secretName: ""
 #   secretKey: ""
 
+# Extra secrets that will be managed by the chart
+# (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values).
+# The format is "key/value" where
+#    * key (can be templated) is the name of the secret that will be created
+#    * value: an object with the standard 'data' or 'stringData' key (or both).
+#          The value associated with those keys must be a string (can be templated)
+extraSecrets: {}
+# eg:
+# extraSecrets:
+#   {{ .Release.Name }}-airflow-connections:
+#     data: |
+#       AIRFLOW_CONN_GCP: 'base64_encoded_gcp_conn_string'
+#       AIRFLOW_CONN_AWS: 'base64_encoded_aws_conn_string'
+#     stringData: |
+#       AIRFLOW_CONN_OTHER: 'other_conn'
+#   {{ .Release.Name }}-other-secret-name-suffix:
+#     data: |
+#        ...
+
+# Extra ConfigMaps that will be managed by the chart
+# (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values).
+# The format is "key/value" where
+#    * key (can be templated) is the name of the configmap that will be created
+#    * value: an object with the standard 'data' key.
+#          The value associated with this key must be a string (can be templated)
+extraConfigMaps: {}
+# eg:
+# extraConfigMaps:
+#   {{ .Release.Name }}-airflow-variables:
+#     data: |
+#       AIRFLOW_VAR_HELLO_MESSAGE: "Hi!"
+#       AIRFLOW_VAR_KUBERNETES_NAMESPACE: "{{ .Release.Namespace }}"
+
+# Extra env 'items' that will be added to the definition of airflow containers;
+# a string is expected (can be templated).
+extraEnv: ~
+# eg:
+# extraEnv: |
+#   - name: PLATFORM
+#     value: FR
+
+# Extra envFrom 'items' that will be added to the definition of airflow containers.
+# A string is expected (can be templated).
+extraEnvFrom: ~
+# eg:
+# extraEnvFrom: |
+#   - secretRef:
+#       name: '{{ .Release.Name }}-airflow-connections'
+#   - configMapRef:
+#       name: '{{ .Release.Name }}-airflow-variables'
+
 # Airflow database config
 data:
   # If secret names are provided, use those secrets


[airflow] 31/34: Add Changelog for 1.10.14

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b8b9c0e2315ca0927f40d4546a88fedf65c454fe
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Dec 2 15:25:02 2020 +0000

    Add Changelog for 1.10.14
---
 CHANGELOG.txt | 30 ++++++++++++++++++++++++++++++
 1 file changed, 30 insertions(+)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index b818fef..0f2c39b 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,3 +1,33 @@
+Airflow 1.10.14, 2020-12-05
+----------------------------
+
+Bug Fixes
+"""""""""
+
+- BugFix: Tasks with ``depends_on_past`` or ``task_concurrency`` are stuck (#12663)
+- Fix issue with empty Resources in executor_config (#12633)
+- Fix: Deprecated config ``force_log_out_after`` was not used (#12661)
+- Fix empty asctime field in JSON formatted logs (#10515)
+- [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY (#3651)
+- [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729)
+- [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738)
+
+
+Improvements
+""""""""""""
+
+- Update setup.py to get non-conflicting set of dependencies (#12636)
+- Rename ``[scheduler] max_threads`` to ``[scheduler] parsing_processes`` (#12605)
+- Add metric for scheduling delay between first run task & expected start time (#9544)
+- Add new-style 2.0 command names for Airflow 1.10.x (#12725)
+- Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
+
+Doc only changes
+""""""""""""""""
+
+- Clarified information about supported Databases
+
+
 Airflow 1.10.13, 2020-11-24
 ----------------------------
 


[airflow] 34/34: Update setup.py to get non-conflicting set of dependencies (#12636)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a4edcf9d1f24d05ecd7b43e66c79e29ef0b73329
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Nov 29 19:45:58 2020 +0100

    Update setup.py to get non-conflicting set of dependencies (#12636)
    
    This change upgrades setup.py and setup.cfg to provide a non-conflicting,
    `pip check`-valid set of constraints for the CI image.
    
    (cherry picked from commit 5370f3feb095e02704b0852fe630efdd118cb8f5)
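
As a rough illustration of the property this commit targets (a sketch,
not part of the commit): `pip check` exits non-zero when the installed
distributions have conflicting requirements, so a CI image built from the
updated setup.py should pass it cleanly.

    import subprocess
    import sys

    # Run `pip check` against the current environment; exit code 0 means
    # the installed set of dependencies is mutually consistent.
    result = subprocess.run(
        [sys.executable, "-m", "pip", "check"],
        capture_output=True,
        text=True,
    )
    print(result.stdout or result.stderr)
    print("consistent" if result.returncode == 0 else "conflicts found")
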


[airflow] 10/34: Fix wait-for-migrations command in helm chart (#12522)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 951001ad42e279815bb59c4652c55aab9110f278
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Sat Nov 21 10:00:02 2020 +0000

    Fix wait-for-migrations command in helm chart (#12522)
    
    If the migrations weren't yet applied, this would fail with `NameError:
    name 'log' is not defined`. (I guess no one really noticed, as the
    container would restart and try again.)
    
    (cherry picked from commit 3188b130b5f61332e24c340ac6c0569efa4e8056)
---
 chart/templates/_helpers.yaml | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/chart/templates/_helpers.yaml b/chart/templates/_helpers.yaml
index 98efc9f..530b1d0 100644
--- a/chart/templates/_helpers.yaml
+++ b/chart/templates/_helpers.yaml
@@ -367,6 +367,7 @@ server_tls_key_file = /etc/pgbouncer/server.key
   - -c
   - |
         import airflow
+        import logging
         import os
         import time
 
@@ -399,7 +400,7 @@ server_tls_key_file = /etc/pgbouncer/server.key
                     raise TimeoutError("There are still unapplied migrations after {} seconds.".format(ticker))
                 ticker += 1
                 time.sleep(1)
-                log.info('Waiting for migrations... %s second(s)', ticker)
+                logging.info('Waiting for migrations... %s second(s)', ticker)
 {{- end }}
 
 {{ define "registry_docker_config" -}}


[airflow] 24/34: Clarified information about supported Databases.

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 21e0202deaee34c202db998ecc054794d3ea3485
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Tue Dec 1 09:59:21 2020 +0100

    Clarified information about supported Databases.
    
    Cherry-picked the new description from master, following
    complaints that the documentation suggests we support other
    databases.
    
    See issue #9717
---
 docs/howto/initialize-database.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/howto/initialize-database.rst b/docs/howto/initialize-database.rst
index e0429d9..16d2cbb 100644
--- a/docs/howto/initialize-database.rst
+++ b/docs/howto/initialize-database.rst
@@ -21,9 +21,9 @@ Initializing a Database Backend
 If you want to take a real test drive of Airflow, you should consider
 setting up a real database backend and switching to the LocalExecutor.
 
-As Airflow was built to interact with its metadata using the great SqlAlchemy
-library, you should be able to use any database backend supported as a
-SqlAlchemy backend. We recommend using **MySQL** or **Postgres**.
+Airflow was built to interact with its metadata using SqlAlchemy
+with **MySQL**, **Postgres** and **SQLite** as supported backends
+(SQLite is used primarily for development purposes).
 
 .. note:: We rely on more strict ANSI SQL settings for MySQL in order to have
    sane defaults. Make sure to have specified ``explicit_defaults_for_timestamp=1``


[airflow] 01/34: Fix issue with empty Resources in executor_config (#12633)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit eb6ae74e78f39ef0957bde759d713bf78199c6ce
Author: Daniel Imberman <da...@gmail.com>
AuthorDate: Fri Nov 27 05:52:49 2020 -0800

    Fix issue with empty Resources in executor_config (#12633)
    
    Fixes an issue where, if a user specifies a request but not a limit
    in resources for the executor_config, the backwards-compatibility
    layer cannot parse it.
    
    (cherry picked from commit 84eecf94bab1a8c66b5161f03c6631448fb4850e)
---
 airflow/contrib/kubernetes/pod.py   |   4 +-
 tests/kubernetes/models/test_pod.py | 240 ++++++++++++++++++++++--------------
 2 files changed, 152 insertions(+), 92 deletions(-)

diff --git a/airflow/contrib/kubernetes/pod.py b/airflow/contrib/kubernetes/pod.py
index d1f30a8..7e38147 100644
--- a/airflow/contrib/kubernetes/pod.py
+++ b/airflow/contrib/kubernetes/pod.py
@@ -250,8 +250,8 @@ def _extract_ports(ports):
 
 def _extract_resources(resources):
     if isinstance(resources, k8s.V1ResourceRequirements):
-        requests = resources.requests
-        limits = resources.limits
+        requests = resources.requests or {}
+        limits = resources.limits or {}
         return Resources(
             request_memory=requests.get('memory', None),
             request_cpu=requests.get('cpu', None),
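
The `or {}` guard matters because the kubernetes client leaves unset
fields as None. A stand-alone sketch of the failure mode (assuming the
`kubernetes` client package; reconstructed from the diff above, not taken
from the commit):

    import kubernetes.client.models as k8s

    # A request without a limit: `limits` stays None on the client object.
    resources = k8s.V1ResourceRequirements(requests={"memory": "1G"})

    try:
        resources.limits.get("memory", None)  # old code path
    except AttributeError as err:
        print("pre-fix:", err)  # 'NoneType' object has no attribute 'get'

    limits = resources.limits or {}           # fixed code path
    print("post-fix:", limits.get("memory", None))  # None, no crash
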
diff --git a/tests/kubernetes/models/test_pod.py b/tests/kubernetes/models/test_pod.py
index f8df28a..6939597 100644
--- a/tests/kubernetes/models/test_pod.py
+++ b/tests/kubernetes/models/test_pod.py
@@ -19,63 +19,88 @@ from tests.compat import mock
 from kubernetes.client import ApiClient
 import kubernetes.client.models as k8s
 from airflow.kubernetes.pod import Port
+from airflow.kubernetes.pod import Resources
+from airflow.contrib.kubernetes.pod import _extract_resources
 from airflow.kubernetes.pod_generator import PodGenerator
 from airflow.kubernetes.k8s_model import append_to_pod
 
 
 class TestPod(unittest.TestCase):
+    def test_extract_resources(self):
+        res = _extract_resources(k8s.V1ResourceRequirements())
+        self.assertEqual(
+            res.to_k8s_client_obj().to_dict(), Resources().to_k8s_client_obj().to_dict()
+        )
+        res = _extract_resources(k8s.V1ResourceRequirements(limits={"memory": "1G"}))
+        self.assertEqual(
+            res.to_k8s_client_obj().to_dict(),
+            Resources(limit_memory="1G").to_k8s_client_obj().to_dict(),
+        )
+        res = _extract_resources(k8s.V1ResourceRequirements(requests={"memory": "1G"}))
+        self.assertEqual(
+            res.to_k8s_client_obj().to_dict(),
+            Resources(request_memory="1G").to_k8s_client_obj().to_dict(),
+        )
+        res = _extract_resources(
+            k8s.V1ResourceRequirements(
+                limits={"memory": "1G"}, requests={"memory": "1G"}
+            )
+        )
+        self.assertEqual(
+            res.to_k8s_client_obj().to_dict(),
+            Resources(limit_memory="1G", request_memory="1G")
+            .to_k8s_client_obj()
+            .to_dict(),
+        )
 
     def test_port_to_k8s_client_obj(self):
-        port = Port('http', 80)
+        port = Port("http", 80)
         self.assertEqual(
             port.to_k8s_client_obj(),
-            k8s.V1ContainerPort(
-                name='http',
-                container_port=80
-            )
+            k8s.V1ContainerPort(name="http", container_port=80),
         )
 
-    @mock.patch('uuid.uuid4')
+    @mock.patch("uuid.uuid4")
     def test_port_attach_to_pod(self, mock_uuid):
         import uuid
-        static_uuid = uuid.UUID('cf4a56d2-8101-4217-b027-2af6216feb48')
+
+        static_uuid = uuid.UUID("cf4a56d2-8101-4217-b027-2af6216feb48")
         mock_uuid.return_value = static_uuid
-        pod = PodGenerator(image='airflow-worker:latest', name='base').gen_pod()
-        ports = [
-            Port('https', 443),
-            Port('http', 80)
-        ]
+        pod = PodGenerator(image="airflow-worker:latest", name="base").gen_pod()
+        ports = [Port("https", 443), Port("http", 80)]
         k8s_client = ApiClient()
         result = append_to_pod(pod, ports)
         result = k8s_client.sanitize_for_serialization(result)
-        self.assertEqual({
-            'apiVersion': 'v1',
-            'kind': 'Pod',
-            'metadata': {'name': 'base-' + static_uuid.hex},
-            'spec': {
-                'containers': [{
-                    'args': [],
-                    'command': [],
-                    'env': [],
-                    'envFrom': [],
-                    'image': 'airflow-worker:latest',
-                    'name': 'base',
-                    'ports': [{
-                        'name': 'https',
-                        'containerPort': 443
-                    }, {
-                        'name': 'http',
-                        'containerPort': 80
-                    }],
-                    'volumeMounts': [],
-                }],
-                'hostNetwork': False,
-                'imagePullSecrets': [],
-                'volumes': []
-            }
-        }, result)
+        self.assertEqual(
+            {
+                "apiVersion": "v1",
+                "kind": "Pod",
+                "metadata": {"name": "base-" + static_uuid.hex},
+                "spec": {
+                    "containers": [
+                        {
+                            "args": [],
+                            "command": [],
+                            "env": [],
+                            "envFrom": [],
+                            "image": "airflow-worker:latest",
+                            "name": "base",
+                            "ports": [
+                                {"name": "https", "containerPort": 443},
+                                {"name": "http", "containerPort": 80},
+                            ],
+                            "volumeMounts": [],
+                        }
+                    ],
+                    "hostNetwork": False,
+                    "imagePullSecrets": [],
+                    "volumes": [],
+                },
+            },
+            result,
+        )
 
-    @mock.patch('uuid.uuid4')
+    @mock.patch("uuid.uuid4")
     def test_to_v1_pod(self, mock_uuid):
         from airflow.contrib.kubernetes.pod import Pod as DeprecatedPod
         from airflow.kubernetes.volume import Volume
@@ -83,7 +108,8 @@ class TestPod(unittest.TestCase):
         from airflow.kubernetes.secret import Secret
         from airflow.kubernetes.pod import Resources
         import uuid
-        static_uuid = uuid.UUID('cf4a56d2-8101-4217-b027-2af6216feb48')
+
+        static_uuid = uuid.UUID("cf4a56d2-8101-4217-b027-2af6216feb48")
         mock_uuid.return_value = static_uuid
 
         pod = DeprecatedPod(
@@ -94,24 +120,26 @@ class TestPod(unittest.TestCase):
             envs={"test_key": "test_value"},
             cmds=["airflow"],
             resources=Resources(
-                request_memory="1G",
-                request_cpu="100Mi",
-                limit_gpu="100G"
+                request_memory="1G", request_cpu="100Mi", limit_gpu="100G"
             ),
             init_containers=k8s.V1Container(
                 name="test-container",
-                volume_mounts=k8s.V1VolumeMount(mount_path="/foo/bar", name="init-volume-secret")
+                volume_mounts=k8s.V1VolumeMount(
+                    mount_path="/foo/bar", name="init-volume-secret"
+                ),
             ),
             volumes=[
                 Volume(name="foo", configs={}),
-                {"name": "bar", 'secret': {'secretName': 'volume-secret'}}
+                {"name": "bar", "secret": {"secretName": "volume-secret"}},
             ],
             secrets=[
                 Secret("volume", None, "init-volume-secret"),
-                Secret('env', "AIRFLOW_SECRET", 'secret_name', "airflow_config"),
-                Secret("volume", "/opt/airflow", "volume-secret", "secret-key")
+                Secret("env", "AIRFLOW_SECRET", "secret_name", "airflow_config"),
+                Secret("volume", "/opt/airflow", "volume-secret", "secret-key"),
+            ],
+            volume_mounts=[
+                VolumeMount(name="foo", mount_path="/mnt", sub_path="/", read_only=True)
             ],
-            volume_mounts=[VolumeMount(name="foo", mount_path="/mnt", sub_path="/", read_only=True)]
         )
 
         k8s_client = ApiClient()
@@ -119,47 +147,79 @@ class TestPod(unittest.TestCase):
         result = pod.to_v1_kubernetes_pod()
         result = k8s_client.sanitize_for_serialization(result)
 
-        expected = \
-            {'metadata': {'annotations': {},
-                          'labels': {},
-                          'name': 'bar',
-                          'namespace': 'baz'},
-             'spec': {'affinity': {},
-                      'containers': [{'args': [],
-                                      'command': ['airflow'],
-                                      'env': [{'name': 'test_key', 'value': 'test_value'},
-                                              {'name': 'AIRFLOW_SECRET',
-                                               'valueFrom': {'secretKeyRef': {'key': 'airflow_config',
-                                                                              'name': 'secret_name'}}}],
-                                      'envFrom': [],
-                                      'image': 'foo',
-                                      'imagePullPolicy': 'Never',
-                                      'name': 'base',
-                                      'resources': {'limits': {'nvidia.com/gpu': '100G'},
-                                                    'requests': {'cpu': '100Mi',
-                                                                 'memory': '1G'}},
-                                      'volumeMounts': [{'mountPath': '/mnt',
-                                                        'name': 'foo',
-                                                        'readOnly': True,
-                                                        'subPath': '/'},
-                                                       {'mountPath': '/opt/airflow',
-                                                        'name': 'secretvol' + str(static_uuid),
-                                                        'readOnly': True}]}],
-                      'hostNetwork': False,
-                      'imagePullSecrets': [],
-                      'initContainers': {'name': 'test-container',
-                                         'volumeMounts': {'mountPath': '/foo/bar',
-                                                          'name': 'init-volume-secret'}},
-                      'nodeSelector': {},
-                      'securityContext': {},
-                      'tolerations': [],
-                      'volumes': [{'name': 'foo'},
-                                  {'name': 'bar',
-                                   'secret': {'secretName': 'volume-secret'}},
-                                  {'name': 'secretvol' + str(static_uuid),
-                                   'secret': {'secretName': 'init-volume-secret'}},
-                                  {'name': 'secretvol' + str(static_uuid),
-                                   'secret': {'secretName': 'volume-secret'}}
-                                  ]}}
+        expected = {
+            "metadata": {
+                "annotations": {},
+                "labels": {},
+                "name": "bar",
+                "namespace": "baz",
+            },
+            "spec": {
+                "affinity": {},
+                "containers": [
+                    {
+                        "args": [],
+                        "command": ["airflow"],
+                        "env": [
+                            {"name": "test_key", "value": "test_value"},
+                            {
+                                "name": "AIRFLOW_SECRET",
+                                "valueFrom": {
+                                    "secretKeyRef": {
+                                        "key": "airflow_config",
+                                        "name": "secret_name",
+                                    }
+                                },
+                            },
+                        ],
+                        "envFrom": [],
+                        "image": "foo",
+                        "imagePullPolicy": "Never",
+                        "name": "base",
+                        "resources": {
+                            "limits": {"nvidia.com/gpu": "100G"},
+                            "requests": {"cpu": "100Mi", "memory": "1G"},
+                        },
+                        "volumeMounts": [
+                            {
+                                "mountPath": "/mnt",
+                                "name": "foo",
+                                "readOnly": True,
+                                "subPath": "/",
+                            },
+                            {
+                                "mountPath": "/opt/airflow",
+                                "name": "secretvol" + str(static_uuid),
+                                "readOnly": True,
+                            },
+                        ],
+                    }
+                ],
+                "hostNetwork": False,
+                "imagePullSecrets": [],
+                "initContainers": {
+                    "name": "test-container",
+                    "volumeMounts": {
+                        "mountPath": "/foo/bar",
+                        "name": "init-volume-secret",
+                    },
+                },
+                "nodeSelector": {},
+                "securityContext": {},
+                "tolerations": [],
+                "volumes": [
+                    {"name": "foo"},
+                    {"name": "bar", "secret": {"secretName": "volume-secret"}},
+                    {
+                        "name": "secretvol" + str(static_uuid),
+                        "secret": {"secretName": "init-volume-secret"},
+                    },
+                    {
+                        "name": "secretvol" + str(static_uuid),
+                        "secret": {"secretName": "volume-secret"},
+                    },
+                ],
+            },
+        }
         self.maxDiff = None
         self.assertEqual(expected, result)


[airflow] 32/34: Pins PIP to 20.2.4 in our Dockerfiles (#12738)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit edcd18e0f54a7a3efcfd55cfba282dfa8e7cc31f
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Dec 1 17:39:55 2020 +0100

    Pins PIP to 20.2.4 in our Dockerfiles (#12738)
    
    Until we make sure that the new resolver in PIP 20.3 works,
    we should pin PIP to 20.2.4.
    
    This is hopefully a temporary measure.
    
    Part of #12737
    
    (cherry picked from commit 0451d84ea2409c7b091640f52c25ac9a0bb2505f)
---
 Dockerfile    | 12 ++++++++++++
 Dockerfile.ci |  5 +++++
 2 files changed, 17 insertions(+)

diff --git a/Dockerfile b/Dockerfile
index 9b96cfa..35f50b3 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -47,6 +47,8 @@ ARG CASS_DRIVER_BUILD_CONCURRENCY="8"
 ARG PYTHON_BASE_IMAGE="python:3.6-slim-buster"
 ARG PYTHON_MAJOR_MINOR_VERSION="3.6"
 
+ARG PIP_VERSION=20.2.4
+
 ##############################################################################################
 # This is the build image where we build all dependencies
 ##############################################################################################
@@ -59,6 +61,9 @@ ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE}
 ARG PYTHON_MAJOR_MINOR_VERSION
 ENV PYTHON_MAJOR_MINOR_VERSION=${PYTHON_MAJOR_MINOR_VERSION}
 
+ARG PIP_VERSION
+ENV PIP_VERSION=${PIP_VERSION}
+
 # Make sure noninteractive debian install is used and language variables set
 ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \
     LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8
@@ -168,6 +173,8 @@ RUN if [[ -f /docker-context-files/.pypirc ]]; then \
         cp /docker-context-files/.pypirc /root/.pypirc; \
     fi
 
+RUN pip install --upgrade "pip==${PIP_VERSION}"
+
 # In case of Production build image segment we want to pre-install master version of airflow
 # dependencies from GitHub so that we do not have to always reinstall it from scratch.
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
@@ -295,6 +302,9 @@ ENV AIRFLOW_VERSION=${AIRFLOW_VERSION}
 ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \
     LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8
 
+ARG PIP_VERSION
+ENV PIP_VERSION=${PIP_VERSION}
+
 # Install curl and gnupg2 - needed for many other installation steps
 RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -395,6 +405,8 @@ COPY --chown=airflow:root scripts/in_container/prod/entrypoint_prod.sh /entrypoi
 COPY --chown=airflow:root scripts/in_container/prod/clean-logs.sh /clean-logs
 RUN chmod a+x /entrypoint /clean-logs
 
+RUN pip install --upgrade "pip==${PIP_VERSION}"
+
 # Make /etc/passwd root-group-writeable so that user can be dynamically added by OpenShift
 # See https://github.com/apache/airflow/issues/9248
 RUN chmod g=u /etc/passwd
diff --git a/Dockerfile.ci b/Dockerfile.ci
index cac73bb..c71fae6 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -29,6 +29,9 @@ ENV AIRFLOW_VERSION=$AIRFLOW_VERSION
 ARG PYTHON_MAJOR_MINOR_VERSION="3.6"
 ENV PYTHON_MAJOR_MINOR_VERSION=${PYTHON_MAJOR_MINOR_VERSION}
 
+ARG PIP_VERSION=20.2.4
+ENV PIP_VERSION=${PIP_VERSION}
+
 # Print versions
 RUN echo "Base image: ${PYTHON_BASE_IMAGE}"
 RUN echo "Airflow version: ${AIRFLOW_VERSION}"
@@ -262,6 +265,8 @@ ENV AIRFLOW_LOCAL_PIP_WHEELS=${AIRFLOW_LOCAL_PIP_WHEELS}
 ARG INSTALL_AIRFLOW_VIA_PIP="true"
 ENV INSTALL_AIRFLOW_VIA_PIP=${INSTALL_AIRFLOW_VIA_PIP}
 
+RUN pip install --upgrade "pip==${PIP_VERSION}"
+
 # In case of CI builds we want to pre-install master version of airflow dependencies so that
 # we do not have to always reinstall it from scratch.
 # This can be reinstalled from latest master by increasing PIP_DEPENDENCIES_EPOCH_NUMBER.


[airflow] 26/34: [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2f3b1c780472afd4c8a93633e6633feb7083792e
Author: XD-DENG <xd...@hotmail.com>
AuthorDate: Sun Jul 29 11:57:46 2018 +0200

    [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY
    
    It's recommended by the Flask community to use a random
    SECRET_KEY for security reasons.

    However, in Airflow there is a default value for
    secret_key and most users will neglect to change
    it.

    This may cause a security concern.
    
    Closes #3651 from XD-DENG/patch-2
    
    (cherry picked from commit dfa7b26ddaca80ee8fd9915ee9f6eac50fac77f6)
---
 airflow/www/app.py | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/airflow/www/app.py b/airflow/www/app.py
index 58e82b9..2d463a2 100644
--- a/airflow/www/app.py
+++ b/airflow/www/app.py
@@ -19,6 +19,7 @@
 #
 import datetime
 import logging
+import os
 from typing import Any
 
 import flask
@@ -49,6 +50,7 @@ log = logging.getLogger(__name__)
 
 
 def create_app(config=None, testing=False):
+
     app = Flask(__name__)
     if conf.getboolean('webserver', 'ENABLE_PROXY_FIX'):
         app.wsgi_app = ProxyFix(
@@ -64,6 +66,12 @@ def create_app(config=None, testing=False):
     app.config['LOGIN_DISABLED'] = not conf.getboolean(
         'webserver', 'AUTHENTICATE')
 
+    if configuration.conf.get('webserver', 'SECRET_KEY') == "temporary_key":
+        log.info("SECRET_KEY for Flask App is not specified. Using a random one.")
+        app.secret_key = os.urandom(16)
+    else:
+        app.secret_key = configuration.conf.get('webserver', 'SECRET_KEY')
+
     app.config['SESSION_COOKIE_HTTPONLY'] = True
     app.config['SESSION_COOKIE_SECURE'] = conf.getboolean('webserver', 'COOKIE_SECURE')
     app.config['SESSION_COOKIE_SAMESITE'] = conf.get('webserver', 'COOKIE_SAMESITE')
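
The idea of the patch in a self-contained sketch (the config lookup below
is a hypothetical stand-in for Airflow's conf.get, not the real API):

    import os

    def get_configured_secret_key():
        # Hypothetical stand-in: pretend the user kept the shipped default.
        return "temporary_key"

    secret_key = get_configured_secret_key()
    if secret_key == "temporary_key":
        # As in the patch: never run with a publicly-known key; sessions
        # signed with it could be forged by anyone who reads the default.
        secret_key = os.urandom(16)
    print(repr(secret_key))
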


[airflow] 05/34: Remove CodeQL from PRS. (#12406)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 130ddd9a8ae35bcfbc4565dfa06be888f06f7791
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Nov 17 17:37:46 2020 +0100

    Remove CodeQL from PRS. (#12406)
    
    As discussed in https://lists.apache.org/thread.html/r18cc605bbdb6695c1d31e0706f1b033401f6fa6a19cd0584d7be6cc9%40%3Cdev.airflow.apache.org%3E
    removing CodeQL from PRs.
    
    (cherry picked from commit 525f6594d20d7032700cafe87a4f01a8c9ba8d23)
---
 .github/workflows/codeql-analysis.yml | 2 --
 1 file changed, 2 deletions(-)

diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml
index 4229e05..e0178bf 100644
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -21,8 +21,6 @@ name: "CodeQL"
 on:  # yamllint disable-line rule:truthy
   push:
     branches: [master]
-  pull_request:
-    branches: [master]
   schedule:
     - cron: '0 2 * * *'
 


[airflow] 22/34: Fix empty asctime field in JSON formatted logs (#10515)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 38a2219746875adaadffcf8f5e65c0eefc84527e
Author: Robert Grizzell <ro...@grizzell.me>
AuthorDate: Wed Sep 16 11:50:27 2020 -0500

    Fix empty asctime field in JSON formatted logs (#10515)
    
    (cherry picked from commit 2aec99c22847594040d28e587ab5e2473eff8c94)
---
 airflow/utils/log/json_formatter.py    | 3 +++
 tests/utils/log/test_json_formatter.py | 9 +++++++++
 2 files changed, 12 insertions(+)

diff --git a/airflow/utils/log/json_formatter.py b/airflow/utils/log/json_formatter.py
index 1d90bc3..3cf4530 100644
--- a/airflow/utils/log/json_formatter.py
+++ b/airflow/utils/log/json_formatter.py
@@ -53,6 +53,9 @@ class JSONFormatter(logging.Formatter):
         self.json_fields = json_fields
         self.extras = extras
 
+    def usesTime(self):
+        return self.json_fields.count('asctime') > 0
+
     def format(self, record):
         super(JSONFormatter, self).format(record)
         record_dict = {label: getattr(record, label, None)
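
Why overriding usesTime() fixes the empty field: logging.Formatter.format()
only populates record.asctime when usesTime() returns true, and the default
implementation inspects the printf-style format string, which JSONFormatter
does not use. A stand-alone reconstruction (not the exact Airflow class):

    import json
    import logging

    class MiniJSONFormatter(logging.Formatter):
        def __init__(self, json_fields):
            super().__init__()
            self.json_fields = json_fields

        def usesTime(self):
            # Mirror the fix: report "uses time" iff asctime was requested.
            return self.json_fields.count('asctime') > 0

        def format(self, record):
            super().format(record)  # sets record.asctime when usesTime()
            return json.dumps(
                {f: getattr(record, f, None) for f in self.json_fields}
            )

    record = logging.LogRecord(
        "test", logging.INFO, __file__, 1, "hello", None, None
    )
    print(MiniJSONFormatter(["asctime", "message"]).format(record))
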
diff --git a/tests/utils/log/test_json_formatter.py b/tests/utils/log/test_json_formatter.py
index 5c305c9..1608d81 100644
--- a/tests/utils/log/test_json_formatter.py
+++ b/tests/utils/log/test_json_formatter.py
@@ -75,6 +75,15 @@ class TestJSONFormatter(unittest.TestCase):
         merged = merge_dicts(dict1, dict2)
         self.assertDictEqual(merged, {'a': 1, 'r': {'b': 0, 'c': 3}})
 
+    def test_uses_time(self):
+        """
+        Test usesTime method from JSONFormatter
+        """
+        json_fmt_asctime = JSONFormatter(json_fields=["asctime", "label"])
+        json_fmt_no_asctime = JSONFormatter(json_fields=["label"])
+        self.assertTrue(json_fmt_asctime.usesTime())
+        self.assertFalse(json_fmt_no_asctime.usesTime())
+
     def test_format(self):
         """
         Test format method from JSONFormatter


[airflow] 02/34: Typo Fix: Deprecated config force_log_out_after was not used (#12661)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7356ae1c28084edefc700e914b79c31fcb9a5306
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Nov 27 17:36:10 2020 +0000

    Typo Fix: Deprecated config force_log_out_after was not used (#12661)
    
    `force_logout_after` should be `force_log_out_after` in the code
    section https://github.com/apache/airflow/blob/master/airflow/settings.py#L372-L381.
    
    `force_log_out_after` is the name actually used and documented in
    https://github.com/apache/airflow/blob/c5700a56bb3b9a5b872bda0fe0d3de82b0128bdf/UPDATING.md#unify-user-session-lifetime-configuration.
    
    (cherry picked from commit 531e00660af0cc7729792ef08559edd84c6c46ab)
---
 airflow/settings.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/settings.py b/airflow/settings.py
index 2aedb72..c708d90 100644
--- a/airflow/settings.py
+++ b/airflow/settings.py
@@ -386,7 +386,7 @@ def get_session_lifetime_config():
     session_lifetime_minutes = conf.get('webserver', 'session_lifetime_minutes', fallback=None)
     session_lifetime_days = conf.get('webserver', 'session_lifetime_days', fallback=None)
     uses_deprecated_lifetime_configs = session_lifetime_days or conf.get(
-        'webserver', 'force_logout_after', fallback=None
+        'webserver', 'force_log_out_after', fallback=None
     )
 
     minutes_per_day = 24 * 60
@@ -395,7 +395,7 @@ def get_session_lifetime_config():
         warnings.warn(
             '`session_lifetime_days` option from `[webserver]` section has been '
             'renamed to `session_lifetime_minutes`. The new option allows to configure '
-            'session lifetime in minutes. The `force_logout_after` option has been removed '
+            'session lifetime in minutes. The `force_log_out_after` option has been removed '
             'from `[webserver]` section. Please update your configuration.',
             category=DeprecationWarning,
         )