Posted to commits@airflow.apache.org by ka...@apache.org on 2020/12/03 15:57:05 UTC

[airflow] branch v1-10-test updated (dd00196 -> a30ee0e)

This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


    omit dd00196  Don't let webserver run with dangerous config (#12747)
    omit b13a9ee  Update documentation about PIP 20.3 incompatibility
    omit 77f3ecd  Pins PIP to 20.2.4 in our Dockerfiles (#12738)
    omit 9a0c638  Add Changelog for 1.10.14
    omit c00a98e  Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
    omit 06a4606  Bump Airflow Version to 1.10.14
     new 2b8b8a8  Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
     new 9312a28  Pins PIP to 20.2.4 in our Dockerfiles (#12738)
     new d59853a  Update documentation about PIP 20.3 incompatibility
     new f46ed7c  Don't let webserver run with dangerous config (#12747)
     new fc6d0a8  Bump Airflow Version to 1.10.14
     new a30ee0e  Add Changelog for 1.10.14

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (dd00196)
            \
             N -- N -- N   refs/heads/v1-10-test (a30ee0e)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 6 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 CHANGELOG.txt | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)


[airflow] 04/06: Don't let webserver run with dangerous config (#12747)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f46ed7c85d2392d3ac4d40e98a53cd9c1b5210f5
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Wed Dec 2 10:55:22 2020 +0000

    Don't let webserver run with dangerous config (#12747)
    
    (cherry picked from commit dab783fcdcd6e18ee4d46c6daad0d43a0b075ada)
---
 airflow/bin/cli.py | 11 +++++++++++
 1 file changed, 11 insertions(+)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index 4f23038..ac1b9a4 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1138,6 +1138,17 @@ def webserver(args):
     py2_deprecation_waring()
     print(settings.HEADER)
 
+    # Check for old/insecure config, and fail safe (i.e. don't launch) if the config is wildly insecure.
+    if conf.get('webserver', 'secret_key') == 'temporary_key':
+        print(
+            "ERROR: The `secret_key` setting under the webserver config has an insecure "
+            "value - Airflow has failed safe and refuses to start. Please change this value to a new, "
+            "per-environment, randomly generated string, for example using this command `openssl rand "
+            "-hex 30`",
+            file=sys.stderr,
+        )
+        sys.exit(1)
+
     access_logfile = args.access_logfile or conf.get('webserver', 'access_logfile')
     error_logfile = args.error_logfile or conf.get('webserver', 'error_logfile')
     num_workers = args.workers or conf.get('webserver', 'workers')

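As context for the check above: the error message points at `openssl rand -hex 30`.
A minimal Python sketch of an equivalent generator, assuming Python 3.6+ and only
the standard library (this is an illustration, not code from the commit):

    # Produce a random 60-character hex string, the same shape of value as
    # `openssl rand -hex 30`, suitable for the [webserver] secret_key setting.
    import secrets

    def generate_webserver_secret_key(num_bytes=30):
        """Return a randomly generated hex key for [webserver] secret_key."""
        return secrets.token_hex(num_bytes)

    if __name__ == "__main__":
        # Paste the output into airflow.cfg under [webserver] secret_key,
        # or export it as AIRFLOW__WEBSERVER__SECRET_KEY.
        print(generate_webserver_secret_key())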

[airflow] 02/06: Pins PIP to 20.2.4 in our Dockerfiles (#12738)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9312a28117b398c67f9367cc140c17362ffc65c9
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Dec 1 17:39:55 2020 +0100

    Pins PIP to 20.2.4 in our Dockerfiles (#12738)
    
    Until we make sure that the new resolver in PIP 20.3 works,
    we should pin PIP to 20.2.4.
    
    This is hopefully a temporary measure.
    
    Part of #12737
    
    (cherry picked from commit 0451d84ea2409c7b091640f52c25ac9a0bb2505f)
---
 Dockerfile    | 12 ++++++++++++
 Dockerfile.ci |  5 +++++
 2 files changed, 17 insertions(+)

diff --git a/Dockerfile b/Dockerfile
index 9b96cfa..35f50b3 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -47,6 +47,8 @@ ARG CASS_DRIVER_BUILD_CONCURRENCY="8"
 ARG PYTHON_BASE_IMAGE="python:3.6-slim-buster"
 ARG PYTHON_MAJOR_MINOR_VERSION="3.6"
 
+ARG PIP_VERSION=20.2.4
+
 ##############################################################################################
 # This is the build image where we build all dependencies
 ##############################################################################################
@@ -59,6 +61,9 @@ ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE}
 ARG PYTHON_MAJOR_MINOR_VERSION
 ENV PYTHON_MAJOR_MINOR_VERSION=${PYTHON_MAJOR_MINOR_VERSION}
 
+ARG PIP_VERSION
+ENV PIP_VERSION=${PIP_VERSION}
+
 # Make sure noninteractive debian install is used and language variables set
 ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \
     LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8
@@ -168,6 +173,8 @@ RUN if [[ -f /docker-context-files/.pypirc ]]; then \
         cp /docker-context-files/.pypirc /root/.pypirc; \
     fi
 
+RUN pip install --upgrade "pip==${PIP_VERSION}"
+
 # In case of Production build image segment we want to pre-install master version of airflow
 # dependencies from GitHub so that we do not have to always reinstall it from the scratch.
 RUN if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \
@@ -295,6 +302,9 @@ ENV AIRFLOW_VERSION=${AIRFLOW_VERSION}
 ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \
     LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8
 
+ARG PIP_VERSION
+ENV PIP_VERSION=${PIP_VERSION}
+
 # Install curl and gnupg2 - needed for many other installation steps
 RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -395,6 +405,8 @@ COPY --chown=airflow:root scripts/in_container/prod/entrypoint_prod.sh /entrypoi
 COPY --chown=airflow:root scripts/in_container/prod/clean-logs.sh /clean-logs
 RUN chmod a+x /entrypoint /clean-logs
 
+RUN pip install --upgrade "pip==${PIP_VERSION}"
+
 # Make /etc/passwd root-group-writeable so that user can be dynamically added by OpenShift
 # See https://github.com/apache/airflow/issues/9248
 RUN chmod g=u /etc/passwd
diff --git a/Dockerfile.ci b/Dockerfile.ci
index cac73bb..c71fae6 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -29,6 +29,9 @@ ENV AIRFLOW_VERSION=$AIRFLOW_VERSION
 ARG PYTHON_MAJOR_MINOR_VERSION="3.6"
 ENV PYTHON_MAJOR_MINOR_VERSION=${PYTHON_MAJOR_MINOR_VERSION}
 
+ARG PIP_VERSION=20.2.4
+ENV PIP_VERSION=${PIP_VERSION}
+
 # Print versions
 RUN echo "Base image: ${PYTHON_BASE_IMAGE}"
 RUN echo "Airflow version: ${AIRFLOW_VERSION}"
@@ -262,6 +265,8 @@ ENV AIRFLOW_LOCAL_PIP_WHEELS=${AIRFLOW_LOCAL_PIP_WHEELS}
 ARG INSTALL_AIRFLOW_VIA_PIP="true"
 ENV INSTALL_AIRFLOW_VIA_PIP=${INSTALL_AIRFLOW_VIA_PIP}
 
+RUN pip install --upgrade "pip==${PIP_VERSION}"
+
 # In case of CI builds we want to pre-install master version of airflow dependencies so that
 # We do not have to always reinstall it from the scratch.
 # This can be reinstalled from latest master by increasing PIP_DEPENDENCIES_EPOCH_NUMBER.

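A small sanity-check sketch (not part of the commit) that a CI step could run inside
the built image to confirm the pin took effect; it assumes pip is importable in the
image's Python interpreter:

    # Fail fast if the interpreter's pip does not match the version pinned in
    # the Dockerfiles above. PINNED_PIP_VERSION mirrors the ARG default.
    import pip

    PINNED_PIP_VERSION = "20.2.4"

    def check_pip_pinned(expected=PINNED_PIP_VERSION):
        actual = pip.__version__
        if actual != expected:
            raise RuntimeError(
                "pip is {} but the images pin {}".format(actual, expected)
            )

    if __name__ == "__main__":
        check_pip_pinned()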

[airflow] 03/06: Update documentation about PIP 20.3 incompatibility

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d59853a89c283526b009a734aa6561eaaaf3d05e
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Dec 2 17:37:48 2020 +0100

    Update documentation about PIP 20.3 incompatibility
---
 CONTRIBUTING.rst      | 26 ++++++++++++++++++++++++--
 IMAGES.rst            |  8 ++++++++
 INSTALL               | 26 +++++++++++++++++++++++---
 LOCAL_VIRTUALENV.rst  |  8 ++++++++
 README.md             |  9 +++++++++
 docs/installation.rst | 16 ++++++++++++++++
 docs/metrics.rst      |  8 ++++++++
 docs/security.rst     | 23 +++++++++++++++++++++++
 docs/start.rst        |  9 +++++++++
 9 files changed, 128 insertions(+), 5 deletions(-)

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 0c1c9c1..8c4bf35 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -541,6 +541,14 @@ extras can be specified after the usual pip install - for example
 installs all development dependencies. There is also ``devel_ci`` that installs
 all dependencies needed in the CI environment.
 
+.. note::
+   On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 This is the full list of those extras:
 
   .. START EXTRAS HERE
@@ -591,6 +599,14 @@ the other provider package you can install it adding [extra] after the
 ``pip install apache-airflow-backport-providers-google[amazon]`` in case you want to use GCP
 transfer operators from Amazon ECS.
 
+.. note::
+   On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 If you add a new dependency between different providers packages, it will be detected automatically during
 pre-commit phase and pre-commit will fail - and add entry in dependencies.json so that the package extra
 dependencies are properly added when package is installed.
@@ -671,6 +687,14 @@ install in case a direct or transitive dependency is released that breaks the in
 when installing ``apache-airflow``, you might need to provide additional constraints (for
 example ``pip install apache-airflow==1.10.2 Werkzeug<1.0.0``)
 
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 However we now have ``constraints-<PYTHON_MAJOR_MINOR_VERSION>.txt`` files generated
 automatically and committed to orphan ``constraints-master`` and ``constraint-1-10`` branches based on
 the set of all latest working and tested dependency versions. Those
@@ -682,7 +706,6 @@ constraints file when installing Apache Airflow - either from the sources:
   pip install -e . \
     --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1-10/constraints-3.6.txt"
 
-
 or from the pypi package:
 
 .. code-block:: bash
@@ -690,7 +713,6 @@ or from the pypi package:
   pip install apache-airflow \
     --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1-10/constraints-3.6.txt"
 
-
 This works also with extras - for example:
 
 .. code-block:: bash
diff --git a/IMAGES.rst b/IMAGES.rst
index 724d73c..b486cba 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -125,6 +125,14 @@ This will build the image using command similar to:
       apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.13 \
       --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt"
 
+.. note::
+   On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 You can also build production images from specific Git version via providing ``--install-airflow-reference``
 parameter to Breeze (this time constraints are taken from the ``constraints-master`` branch which is the
 HEAD of development for constraints):
diff --git a/INSTALL b/INSTALL
index 0e2f582..763ed20 100644
--- a/INSTALL
+++ b/INSTALL
@@ -31,16 +31,36 @@ source PATH_TO_YOUR_VENV/bin/activate
 # [required] building and installing by pip (preferred)
 pip install .
 
-# or directly
+NOTE!
+
+On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+depending on your choice of extras. In order to install Airflow you need to either downgrade
+pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
+# or you can install it directly via setup.py
 python setup.py install
 
+
 # You can also install recommended version of the dependencies by using
 # constraint-python<PYTHON_MAJOR_MINOR_VERSION>.txt files as constraint file. This is needed in case
 # you have problems with installing the current requirements from PyPI.
-# There are different constraint files for different python versions. For example"
+# There are different constraint files for different python versions and you should choose the
+# constraints file specific to your python version.
+# For example:
 
 pip install . \
-  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt"
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt"
+
+
+.. note::
+   On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
 
 # You can also install Airflow with extras specified. The list of available extras:
 # START EXTRAS HERE
diff --git a/LOCAL_VIRTUALENV.rst b/LOCAL_VIRTUALENV.rst
index 8a20c02..574366d 100644
--- a/LOCAL_VIRTUALENV.rst
+++ b/LOCAL_VIRTUALENV.rst
@@ -118,6 +118,14 @@ To create and initialize the local virtualenv:
 
     pip install -U -e ".[devel,<OTHER EXTRAS>]" # for example: pip install -U -e ".[devel,gcp,postgres]"
 
+.. note::
+   On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+   This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+   depending on your choice of extras. In order to install Airflow you need to either downgrade
+   pip to version 20.2.4 (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the
+   option ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 In case you have problems with installing airflow because of some requirements are not installable, you can
 try to install it with the set of working constraints (note that there are different constraint files
 for different python versions:
diff --git a/README.md b/README.md
index 5a5edd6..75ccd24 100644
--- a/README.md
+++ b/README.md
@@ -122,6 +122,15 @@ pip install apache-airflow==1.10.13 \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"
 ```
 
+**NOTE!!!**
+
+On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
+This resolver does not yet work with Apache Airflow and might lead to errors during installation,
+depending on your choice of extras. In order to install Airflow you need to either downgrade
+pip to version 20.2.4 (`pip install --upgrade pip==20.2.4`) or, if you use pip 20.3, add the
+option `--use-deprecated legacy-resolver` to your pip install command.
+
+
 2. Installing with extras (for example postgres,gcp)
 ```bash
 pip install apache-airflow[postgres,gcp]==1.10.13 \
diff --git a/docs/installation.rst b/docs/installation.rst
index 12ce19e..fa5fbc9 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -58,6 +58,14 @@ and python versions in the URL.
     # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
+
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
 2. Installing with extras (for example postgres, google)
 
 .. code-block:: bash
@@ -68,6 +76,14 @@ and python versions in the URL.
     pip install "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
 
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 You need certain system level requirements in order to install Airflow. Those are requirements that are known
 to be needed for Linux system (Tested on Ubuntu Buster LTS) :
 
diff --git a/docs/metrics.rst b/docs/metrics.rst
index 7f7c92d..82e62b0 100644
--- a/docs/metrics.rst
+++ b/docs/metrics.rst
@@ -31,6 +31,14 @@ First you must install statsd requirement:
 
    pip install 'apache-airflow[statsd]'
 
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Add the following lines to your configuration file e.g. ``airflow.cfg``
 
 .. code-block:: ini
diff --git a/docs/security.rst b/docs/security.rst
index b22dfc0..5fdf23f 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -320,6 +320,13 @@ To use kerberos authentication, you must install Airflow with the ``kerberos`` e
 
    pip install 'apache-airflow[kerberos]'
 
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
 OAuth Authentication
 --------------------
 
@@ -359,6 +366,14 @@ To use GHE authentication, you must install Airflow with the ``github_enterprise
 
    pip install 'apache-airflow[github_enterprise]'
 
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Setting up GHE Authentication
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -414,6 +429,14 @@ To use Google authentication, you must install Airflow with the ``google_auth``
 
    pip install 'apache-airflow[google_auth]'
 
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Setting up Google Authentication
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
diff --git a/docs/start.rst b/docs/start.rst
index bff52ae..f2b4322 100644
--- a/docs/start.rst
+++ b/docs/start.rst
@@ -43,6 +43,15 @@ The installation is quick and straightforward.
 
     # visit localhost:8080 in the browser and enable the example dag in the home page
 
+
+.. note::
+   In November 2020, a new version of pip (20.3) was released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might lead to errors during installation, depending on your
+   choice of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   (``pip install --upgrade pip==20.2.4``) or, if you use pip 20.3, add the option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+
 Upon running these commands, Airflow will create the ``$AIRFLOW_HOME`` folder
 and lay an "airflow.cfg" file with defaults that get you going fast. You can
 inspect the file either in ``$AIRFLOW_HOME/airflow.cfg``, or through the UI in

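The note repeated throughout these docs encodes one decision: pip 20.2.4 and earlier
need nothing special, while pip 20.3 needs the legacy resolver flag. A hypothetical
Python sketch of that decision (an illustration, not shipped in Airflow):

    # Return the extra arguments the note recommends: nothing for pip < 20.3,
    # and the legacy-resolver opt-out for pip 20.3 and later.
    import pip

    def extra_pip_install_args():
        major, minor = (int(part) for part in pip.__version__.split(".")[:2])
        if (major, minor) >= (20, 3):
            return ["--use-deprecated", "legacy-resolver"]
        return []

    # Example: args = ["install", "apache-airflow"] + extra_pip_install_args()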

[airflow] 06/06: Add Changelog for 1.10.14

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a30ee0eabaa959a7c709fa5c1b2d6ebb34ff5ae2
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Dec 2 15:25:02 2020 +0000

    Add Changelog for 1.10.14
---
 CHANGELOG.txt | 30 ++++++++++++++++++++++++++++++
 1 file changed, 30 insertions(+)

diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index b818fef..a735c16 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -1,3 +1,33 @@
+Airflow 1.10.14, 2020-12-06
+----------------------------
+
+Bug Fixes
+"""""""""
+
+- BugFix: Tasks with ``depends_on_past`` or ``task_concurrency`` are stuck (#12663)
+- Fix issue with empty Resources in executor_config (#12633)
+- Fix: Deprecated config ``force_log_out_after`` was not used (#12661)
+- Fix empty asctime field in JSON formatted logs (#10515)
+- [AIRFLOW-2809] Fix security issue regarding Flask SECRET_KEY (#3651)
+- [AIRFLOW-2884] Fix Flask SECRET_KEY security issue in www_rbac (#3729)
+- [AIRFLOW-2886] Generate random Flask SECRET_KEY in default config (#3738)
+
+Improvements
+""""""""""""
+
+- Update setup.py to get non-conflicting set of dependencies (#12636)
+- Rename ``[scheduler] max_threads`` to ``[scheduler] parsing_processes`` (#12605)
+- Add metric for scheduling delay between first run task & expected start time (#9544)
+- Add new-style 2.0 command names for Airflow 1.10.x (#12725)
+- Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
+- Don't let webserver run with dangerous config (#12747)
+
+Doc only changes
+""""""""""""""""
+
+- Clarified information about supported Databases
+
+
 Airflow 1.10.13, 2020-11-24
 ----------------------------
 


[airflow] 05/06: Bump Airflow Version to 1.10.14

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit fc6d0a82ab5ccc423ebadb06fa45d3c693158ab0
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Dec 2 15:28:28 2020 +0000

    Bump Airflow Version to 1.10.14
---
 IMAGES.rst                     | 18 +++++++--------
 README.md                      | 12 +++++-----
 airflow/version.py             |  2 +-
 docs/installation.rst          |  8 +++----
 docs/production-deployment.rst | 50 +++++++++++++++++++++---------------------
 5 files changed, 45 insertions(+), 45 deletions(-)

diff --git a/IMAGES.rst b/IMAGES.rst
index b486cba..6a04428 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -39,7 +39,7 @@ The images are named as follows:
 
 where:
 
-* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.13``
+* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``master``, ``v1-10-test``, ``1.10.14``
   The ``master`` and ``v1-10-test`` labels are built from branches so they change over time. The ``1.10.*`` and in
   the future ``2.*`` labels are build from git tags and they are "fixed" once built.
 * ``PYTHON_MAJOR_MINOR_VERSION`` - version of python used to build the image. Examples: ``3.5``, ``3.7``
@@ -115,15 +115,15 @@ parameter to Breeze:
 .. code-block:: bash
 
   ./breeze build-image --python 3.7 --additional-extras=presto \
-      --production-image --install-airflow-version=1.10.13
+      --production-image --install-airflow-version=1.10.14
 
 This will build the image using command similar to:
 
 .. code-block:: bash
 
     pip install \
-      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.13 \
-      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt"
+      apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv,presto]==1.10.14 \
+      --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt"
 
 .. note::
    On 30th of November 2020, a new version of pip (20.3) was released with a new, 2020 resolver.
@@ -218,8 +218,8 @@ For example:
   apache/airflow:master-python3.6                - production "latest" image from current master
   apache/airflow:master-python3.6-ci             - CI "latest" image from current master
   apache/airflow:v1-10-test-python2.7-ci         - CI "latest" image from current v1-10-test branch
-  apache/airflow:1.10.13-python3.6               - production image for 1.10.13 release
-  apache/airflow:1.10.13-1-python3.6             - production image for 1.10.13 with some patches applied
+  apache/airflow:1.10.14-python3.6               - production image for 1.10.14 release
+  apache/airflow:1.10.14-1-python3.6             - production image for 1.10.14 with some patches applied
 
 
 You can see DockerHub images at `<https://hub.docker.com/repository/docker/apache/airflow>`_
@@ -300,7 +300,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -316,7 +316,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image -f Dockerfile.ci \
-      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.14 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 You can build the default production image with standard ``docker build`` command but they will only build
@@ -334,7 +334,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
diff --git a/README.md b/README.md
index 75ccd24..79cceed 100644
--- a/README.md
+++ b/README.md
@@ -76,7 +76,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-|              | Master version (2.0.0dev) | Stable version (1.10.13) |
+|              | Master version (2.0.0dev) | Stable version (1.10.14) |
 | ------------ | ------------------------- | ------------------------ |
 | Python       | 3.6, 3.7, 3.8             | 2.7, 3.5, 3.6, 3.7, 3.8  |
 | PostgreSQL   | 9.6, 10, 11, 12, 13       | 9.6, 10, 11, 12, 13      |
@@ -109,7 +109,7 @@ if needed. This means that from time to time plain `pip install apache-airflow`
 produce unusable Airflow installation.
 
 In order to have repeatable installation, however, introduced in **Airflow 1.10.10** and updated in
-**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
+**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the
 orphan `constraints-master` and `constraints-1-10` branches. We keep those "known-to-be-working"
 constraints files separately per major/minor python version.
 You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify
@@ -118,8 +118,8 @@ correct Airflow tag/version/branch and python versions in the URL.
 1. Installing just Airflow:
 
 ```bash
-pip install apache-airflow==1.10.13 \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"
+pip install apache-airflow==1.10.14 \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt"
 ```
 
 **NOTE!!!**
@@ -133,8 +133,8 @@ pip to version 20.2.4 (`pip install --upgrade pip==20.2.4`) or, if you use pip 20.3, add the
 
 2. Installing with extras (for example postgres,gcp)
 ```bash
-pip install apache-airflow[postgres,gcp]==1.10.13 \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.7.txt"
+pip install apache-airflow[postgres,gcp]==1.10.14 \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt"
 ```
 
 For information on installing backport providers check https://airflow.readthedocs.io/en/latest/backport-providers.html.
diff --git a/airflow/version.py b/airflow/version.py
index 115c560..b3b5b30 100644
--- a/airflow/version.py
+++ b/airflow/version.py
@@ -18,4 +18,4 @@
 # under the License.
 #
 
-version = '1.10.13'
+version = '1.10.14'
diff --git a/docs/installation.rst b/docs/installation.rst
index fa5fbc9..ed4f4a0 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -31,7 +31,7 @@ if needed. This means that from time to time plain ``pip install apache-airflow`
 produce unusable Airflow installation.
 
 In order to have repeatable installation, however, starting from **Airflow 1.10.10** and updated in
-**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
+**Airflow 1.10.12** we also keep a set of "known-to-be-working" constraint files in the
 ``constraints-master`` and ``constraints-1-10`` orphan branches.
 Those "known-to-be-working" constraints are per major/minor python version. You can use them as constraint
 files when installing Airflow from PyPI. Note that you have to specify correct Airflow version
@@ -51,11 +51,11 @@ and python versions in the URL.
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.13
+    AIRFLOW_VERSION=1.10.14
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     # For example: 3.6
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.13/constraints-3.6.txt
+    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt
     pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
 
 
@@ -70,7 +70,7 @@ and python versions in the URL.
 
 .. code-block:: bash
 
-    AIRFLOW_VERSION=1.10.13
+    AIRFLOW_VERSION=1.10.14
     PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
     CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
     pip install "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
diff --git a/docs/production-deployment.rst b/docs/production-deployment.rst
index 3edddb8..ac6c76d 100644
--- a/docs/production-deployment.rst
+++ b/docs/production-deployment.rst
@@ -64,7 +64,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.13
+  FROM apache/airflow:1.10.14
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -81,7 +81,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.13
+  FROM apache/airflow:1.10.14
   RUN pip install --no-cache-dir --user my-awesome-pip-dependency-to-add
 
 
@@ -92,7 +92,7 @@ You should be aware, about a few things:
 
 .. code-block:: dockerfile
 
-  FROM: apache/airflow:1.10.13
+  FROM apache/airflow:1.10.14
   USER root
   RUN apt-get update \
     && apt-get install -y --no-install-recommends \
@@ -125,7 +125,7 @@ in the `<#production-image-build-arguments>`_ chapter below.
 
 Here just a few examples are presented which should give you general understanding of what you can customize.
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.14 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -134,7 +134,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -150,7 +150,7 @@ the same image can be built using ``breeze`` (it supports auto-completion of the
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image  --python 3.7 --install-airflow-version=1.10.13 \
+      --production-image  --python 3.7 --install-airflow-version=1.10.14 \
       --additional-extras=jdbc --additional-python-deps="pandas" \
       --additional-dev-apt-deps="gcc g++" --additional-runtime-apt-deps="default-jre-headless"
 
@@ -166,7 +166,7 @@ based on example in `this comment <https://github.com/apache/airflow/issues/8605
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -225,7 +225,7 @@ Preparing the constraint files and wheel files:
 
   pip download --dest docker-context-files \
     --constraint docker-context-files/constraints-1-10.txt  \
-    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.13
+    apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.14
 
 
 Building the image (after copying the files downloaded to the "docker-context-files" directory:
@@ -233,7 +233,7 @@ Building the image (after copying the files downloaded to the "docker-context-fi
 .. code-block:: bash
 
   ./breeze build-image \
-      --production-image --python 3.7 --install-airflow-version=1.10.13 \
+      --production-image --python 3.7 --install-airflow-version=1.10.14 \
       --disable-mysql-client-installation --disable-pip-cache --add-local-pip-wheels \
       --constraints-location="/docker-context-files/constraints-1-10.txt"
 
@@ -245,7 +245,7 @@ or
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
@@ -392,7 +392,7 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | ``constraints-master`` but can be        |
 |                                          |                                          | ``constraints-1-10`` for 1.10.* versions |
 |                                          |                                          | or it could point to specific version    |
-|                                          |                                          | for example ``constraints-1.10.13``      |
+|                                          |                                          | for example ``constraints-1.10.14``      |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_EXTRAS``                       | (see Dockerfile)                         | Default extras with which airflow is     |
 |                                          |                                          | installed                                |
@@ -503,7 +503,7 @@ production image. There are three types of build:
 | ``AIRFLOW_INSTALL_VERSION``       | Optional - might be used for      |
 |                                   | package installation case to      |
 |                                   | set Airflow version for example   |
-|                                   | "==1.10.13"                       |
+|                                   | "==1.10.14"                       |
 +-----------------------------------+-----------------------------------+
 | ``AIRFLOW_CONSTRAINTS_REFERENCE`` | reference (branch or tag) from    |
 |                                   | GitHub where constraints file     |
@@ -512,7 +512,7 @@ production image. There are three types of build:
 |                                   | ``constraints-1-10`` for 1.10.*   |
 |                                   | constraint or if you want to      |
 |                                   | point to specific version         |
-|                                   | might be ``constraints-1.10.13``  |
+|                                   | might be ``constraints-1.10.14``  |
 +-----------------------------------+-----------------------------------+
 | ``SLUGIFY_USES_TEXT_UNIDECODE``   | In case of of installing airflow  |
 |                                   | 1.10.2 or 1.10.1 you need to      |
@@ -546,7 +546,7 @@ of 2.0 currently):
 
   docker build .
 
-This builds the production image in version 3.7 with default extras from 1.10.13 tag and
+This builds the production image in version 3.7 with default extras from 1.10.14 tag and
 constraints taken from constraints-1-10-12 branch in GitHub.
 
 .. code-block:: bash
@@ -554,14 +554,14 @@ constraints taken from constraints-1-10-12 branch in GitHub.
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
-    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.13.tar.gz#egg=apache-airflow" \
+    --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.14.tar.gz#egg=apache-airflow" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with default extras from 1.10.13 PyPI package and
-constraints taken from 1.10.13 tag in GitHub and pre-installed pip dependencies from the top
+This builds the production image in version 3.7 with default extras from 1.10.14 PyPI package and
+constraints taken from 1.10.14 tag in GitHub and pre-installed pip dependencies from the top
 of v1-10-test branch.
 
 .. code-block:: bash
@@ -570,14 +570,14 @@ of v1-10-test branch.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.14" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
-additional python dependencies and pre-installed pip dependencies from 1.10.13 tagged constraints.
+This builds the production image in version 3.7 with additional airflow extras from 1.10.14 PyPI package and
+additional python dependencies and pre-installed pip dependencies from 1.10.14 tagged constraints.
 
 .. code-block:: bash
 
@@ -585,15 +585,15 @@ additional python dependencies and pre-installed pip dependencies from 1.10.13 t
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_BRANCH="v1-10-test" \
-    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.13" \
+    --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1.10.14" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
     --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql,hdfs"
     --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel oauth2client"
 
-This builds the production image in version 3.7 with additional airflow extras from 1.10.13 PyPI package and
+This builds the production image in version 3.7 with additional airflow extras from 1.10.14 PyPI package and
 additional apt dev and runtime dependencies.
 
 .. code-block:: bash
@@ -602,7 +602,7 @@ additional apt dev and runtime dependencies.
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
-    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.13" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.14" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-1-10" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \

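The installation snippets above derive the constraints URL from the Airflow version
and the interpreter's python version. The same derivation as a Python sketch, for
illustration only (the constraint_url helper is hypothetical, not part of the commit):

    # Build the constraints URL used in docs/installation.rst, for example
    # https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.6.txt
    import sys

    AIRFLOW_VERSION = "1.10.14"

    def constraint_url(airflow_version=AIRFLOW_VERSION):
        python_version = "{}.{}".format(*sys.version_info[:2])
        return (
            "https://raw.githubusercontent.com/apache/airflow/"
            "constraints-{}/constraints-{}.txt".format(airflow_version, python_version)
        )

    print(constraint_url())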

[airflow] 01/06: Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2b8b8a85e04f4ee632601be4524faaaf5a04ce6d
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Tue Nov 3 15:28:51 2020 +0000

    Add Kubernetes cleanup-pods CLI command for Helm Chart (#11802)
    
    closes: https://github.com/apache/airflow/issues/11146
    (cherry picked from commit 980c7252c0f28c251e9f87d736cd88d6027f3da3)
---
 airflow/bin/cli.py    |  81 +++++++++++++++++++++++++++++++++
 tests/cli/test_cli.py | 122 ++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 203 insertions(+)

diff --git a/airflow/bin/cli.py b/airflow/bin/cli.py
index a155cff..4f23038 100644
--- a/airflow/bin/cli.py
+++ b/airflow/bin/cli.py
@@ -1464,6 +1464,74 @@ Happy Airflowing!
     print(output_string)
 
 
+@cli_utils.action_logging
+def cleanup_pods(args):
+    """Clean up k8s pods in evicted/failed/succeeded states"""
+    from kubernetes.client.rest import ApiException
+
+    from airflow.kubernetes.kube_client import get_kube_client
+
+    namespace = args.namespace
+
+    # https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle/
+    # All Containers in the Pod have terminated in success, and will not be restarted.
+    pod_succeeded = 'succeeded'
+
+    # All Containers in the Pod have terminated, and at least one Container has terminated in failure.
+    # That is, the Container either exited with non-zero status or was terminated by the system.
+    pod_failed = 'failed'
+
+    # https://kubernetes.io/docs/tasks/administer-cluster/out-of-resource/
+    pod_reason_evicted = 'evicted'
+    # If pod is failed and restartPolicy is:
+    # * Always: Restart Container; Pod phase stays Running.
+    # * OnFailure: Restart Container; Pod phase stays Running.
+    # * Never: Pod phase becomes Failed.
+    pod_restart_policy_never = 'never'
+
+    print('Loading Kubernetes configuration')
+    kube_client = get_kube_client()
+    print('Listing pods in namespace {}'.format(namespace))
+    continue_token = None
+    while True:  # pylint: disable=too-many-nested-blocks
+        pod_list = kube_client.list_namespaced_pod(namespace=namespace, limit=500, _continue=continue_token)
+        for pod in pod_list.items:
+            pod_name = pod.metadata.name
+            print('Inspecting pod {}'.format(pod_name))
+            pod_phase = pod.status.phase.lower()
+            pod_reason = pod.status.reason.lower() if pod.status.reason else ''
+            pod_restart_policy = pod.spec.restart_policy.lower()
+
+            if (
+                pod_phase == pod_succeeded
+                or (pod_phase == pod_failed and pod_restart_policy == pod_restart_policy_never)
+                or (pod_reason == pod_reason_evicted)
+            ):
+                print('Deleting pod "{}" phase "{}" and reason "{}", restart policy "{}"'.format(
+                    pod_name, pod_phase, pod_reason, pod_restart_policy)
+                )
+                try:
+                    _delete_pod(pod.metadata.name, namespace)
+                except ApiException as e:
+                    print("can't remove POD: {}".format(e), file=sys.stderr)
+                continue
+            print('No action taken on pod {}'.format(pod_name))
+        continue_token = pod_list.metadata._continue  # pylint: disable=protected-access
+        if not continue_token:
+            break
+
+
+def _delete_pod(name, namespace):
+    """Helper Function for cleanup_pods"""
+    from kubernetes import client
+
+    core_v1 = client.CoreV1Api()
+    delete_options = client.V1DeleteOptions()
+    print('Deleting POD "{}" from "{}" namespace'.format(name, namespace))
+    api_response = core_v1.delete_namespaced_pod(name=name, namespace=namespace, body=delete_options)
+    print(api_response)
+
+
 @cli_utils.deprecated_action(new_name='celery worker')
 @cli_utils.action_logging
 def worker(args):
@@ -2705,6 +2773,13 @@ ARG_SKIP_SERVE_LOGS = Arg(
     action="store_true",
 )
 
+# kubernetes cleanup-pods
+ARG_NAMESPACE = Arg(
+    ("--namespace",),
+    default='default',
+    help="Kubernetes Namespace",
+)
+
 ALTERNATIVE_CONN_SPECS_ARGS = [
     ARG_CONN_TYPE,
     ARG_CONN_HOST,
@@ -3154,6 +3229,12 @@ CONFIG_COMMANDS = (
 
 KUBERNETES_COMMANDS = (
     ActionCommand(
+        name='cleanup-pods',
+        help="Clean up Kubernetes pods in evicted/failed/succeeded states",
+        func=cleanup_pods,
+        args=(ARG_NAMESPACE, ),
+    ),
+    ActionCommand(
         name='generate-dag-yaml',
         help="Generate YAML files for all tasks in DAG. Useful for debugging tasks without "
         "launching into a cluster",
diff --git a/tests/cli/test_cli.py b/tests/cli/test_cli.py
index 048f802..07d31ac 100644
--- a/tests/cli/test_cli.py
+++ b/tests/cli/test_cli.py
@@ -23,6 +23,8 @@ import io
 import logging
 import os
 
+import kubernetes
+
 from airflow.configuration import conf
 from parameterized import parameterized
 from six import StringIO, PY2
@@ -1026,3 +1028,123 @@ class TestCLIGetNumReadyWorkersRunning(unittest.TestCase):
 
         with mock.patch('psutil.Process', return_value=self.process):
             self.assertEqual(self.monitor._get_num_ready_workers_running(), 0)
+
+
+class TestCleanUpPodsCommand(unittest.TestCase):
+    @classmethod
+    def setUpClass(cls):
+        cls.parser = cli.get_parser()
+
+    @mock.patch('kubernetes.client.CoreV1Api.delete_namespaced_pod')
+    def test_delete_pod(self, delete_namespaced_pod):
+        cli._delete_pod('dummy', 'awesome-namespace')
+        delete_namespaced_pod.assert_called_with(body=mock.ANY, name='dummy', namespace='awesome-namespace')
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('airflow.kubernetes.kube_client.config.load_incluster_config')
+    def test_running_pods_are_not_cleaned(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy'
+        pod1.status.phase = 'Running'
+        pod1.status.reason = None
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_not_called()
+        load_incluster_config.assert_called_once_with()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('airflow.kubernetes.kube_client.config.load_incluster_config')
+    def test_cleanup_succeeded_pods(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy'
+        pod1.status.phase = 'Succeeded'
+        pod1.status.reason = None
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_called_with('dummy', 'awesome-namespace')
+        load_incluster_config.assert_called_once_with()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_no_cleanup_failed_pods_wo_restart_policy_never(
+        self, load_incluster_config, list_namespaced_pod, delete_pod
+    ):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy2'
+        pod1.status.phase = 'Failed'
+        pod1.status.reason = None
+        pod1.spec.restart_policy = 'Always'
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_not_called()
+        load_incluster_config.assert_called_once_with()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_cleanup_failed_pods_w_restart_policy_never(
+        self, load_incluster_config, list_namespaced_pod, delete_pod
+    ):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy3'
+        pod1.status.phase = 'Failed'
+        pod1.status.reason = None
+        pod1.spec.restart_policy = 'Never'
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_called_with('dummy3', 'awesome-namespace')
+        load_incluster_config.assert_called_once_with()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_cleanup_evicted_pods(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy4'
+        pod1.status.phase = 'Failed'
+        pod1.status.reason = 'Evicted'
+        pod1.spec.restart_policy = 'Never'
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        delete_pod.assert_called_with('dummy4', 'awesome-namespace')
+        load_incluster_config.assert_called_once_with()
+
+    @mock.patch('airflow.bin.cli._delete_pod')
+    @mock.patch('kubernetes.client.CoreV1Api.list_namespaced_pod')
+    @mock.patch('kubernetes.config.load_incluster_config')
+    def test_cleanup_api_exception_continue(self, load_incluster_config, list_namespaced_pod, delete_pod):
+        delete_pod.side_effect = kubernetes.client.rest.ApiException(status=0)
+        pod1 = MagicMock()
+        pod1.metadata.name = 'dummy'
+        pod1.status.phase = 'Succeeded'
+        pod1.status.reason = None
+        pods = list_namespaced_pod()
+        pods.metadata._continue = None
+        pods.items = [pod1]
+        cli.cleanup_pods(
+            self.parser.parse_args(['kubernetes', 'cleanup-pods', '--namespace', 'awesome-namespace'])
+        )
+        load_incluster_config.assert_called_once_with()
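
The deletion rule these tests exercise can be read directly off cleanup_pods: a pod
is removed when it succeeded, when it failed under restartPolicy Never, or when it
was evicted. A standalone restatement of that predicate, as a sketch for reasoning
about the cases (the should_delete_pod name is hypothetical, not from the commit):

    # Mirrors the condition in cleanup_pods above; useful for checking the
    # phase/reason/restart-policy combinations covered by the tests.
    def should_delete_pod(phase, reason, restart_policy):
        phase = phase.lower()
        reason = (reason or "").lower()
        restart_policy = restart_policy.lower()
        return (
            phase == "succeeded"
            or (phase == "failed" and restart_policy == "never")
            or reason == "evicted"
        )

    assert should_delete_pod("Succeeded", None, "Always")
    assert not should_delete_pod("Failed", None, "Always")
    assert should_delete_pod("Failed", "Evicted", "Always")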