Posted to commits@airflow.apache.org by ep...@apache.org on 2022/08/19 12:45:55 UTC

[airflow] branch v2-3-test updated (184d29ceea -> e4ac0f1c26)

This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-3-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


 discard 184d29ceea Help pip resolver make better decision on Pyarrow (#25791)
 discard a51ed184a1 Remove certifi limitations from eager upgrade limits (#23995)
    omit beaca717c5 Add release notes
    omit 0625247993 Update version to 2.3.4
     new 151eebf112 Remove certifi limitations from eager upgrade limits (#23995)
     new d6e1e53652 Help pip resolver make better decision on Pyarrow (#25791)
     new 65aca96638 Update version to 2.3.4
     new e4ac0f1c26 Add release notes

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (184d29ceea)
            \
             N -- N -- N   refs/heads/v2-3-test (e4ac0f1c26)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
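The "discard"/"new" pattern above can be reproduced in a throwaway repository: rewrite history, then force-push. A minimal sketch, assuming `git` is on PATH (the repo, messages, and helper below are invented for illustration, not the actual ASF workflow):

```python
# Reproduce a history rewrite that would require a --force push.
import subprocess
import tempfile

def git(*args, cwd):
    """Run a git command in the given repo and return its stdout."""
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
git("config", "user.email", "ci@example.com", cwd=repo)
git("config", "user.name", "ci", cwd=repo)
git("commit", "--allow-empty", "-qm", "base (B)", cwd=repo)
git("commit", "--allow-empty", "-qm", "old work (O)", cwd=repo)
# Rewrite history: drop O and commit N in its place.
git("reset", "-q", "--hard", "HEAD~1", cwd=repo)
git("commit", "--allow-empty", "-qm", "new work (N)", cwd=repo)
# Pushing this branch now needs --force, which is what produces the
# "discard"/"new" lines in a notification like the one above.
print(git("log", "--format=%s", cwd=repo))
```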


Summary of changes:


[airflow] 04/04: Add release notes


ephraimanierobi pushed a commit to branch v2-3-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e4ac0f1c2604202d141ac8848fb4f6852ea2fd8d
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Tue Aug 16 11:56:06 2022 +0100

    Add release notes
---
 RELEASE_NOTES.rst                   | 93 +++++++++++++++++++++++++++++++++++++
 newsfragments/23574.feature.rst     |  1 -
 newsfragments/24755.improvement.rst |  1 -
 newsfragments/24811.significant.rst | 22 ---------
 newsfragments/25147.bugfix.rst      |  1 -
 5 files changed, 93 insertions(+), 25 deletions(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 546a1e1c79..7818e538dc 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -22,6 +22,99 @@
 .. towncrier release notes start
 
 
+Airflow 2.3.4 (2022-08-22)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Added new config ``[logging]log_formatter_class`` to fix timezone display for logs on UI (#24811)
+"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
+
+If you are using a custom Formatter subclass in your ``[logging]logging_config_class``, please inherit from ``airflow.utils.log.timezone_aware.TimezoneAware`` instead of ``logging.Formatter``.
+For example, in your ``custom_config.py``:
+
+.. code-block:: python
+
+    from airflow.utils.log.timezone_aware import TimezoneAware
+
+    # before
+    class YourCustomFormatter(logging.Formatter):
+        ...
+
+
+    # after
+    class YourCustomFormatter(TimezoneAware):
+        ...
+
+
+    AIRFLOW_FORMATTER = LOGGING_CONFIG["formatters"]["airflow"]
+    AIRFLOW_FORMATTER["class"] = "somewhere.your.custom_config.YourCustomFormatter"
+    # or use the TimezoneAware class directly if you don't have a custom Formatter.
+    AIRFLOW_FORMATTER["class"] = "airflow.utils.log.timezone_aware.TimezoneAware"
+
+Bug Fixes
+^^^^^^^^^
+
+- Fix mapped sensor with reschedule mode (#25594)
+- Cache the custom secrets backend so the same instance gets re-used (#25556)
+- Add right padding (#25554)
+- Fix reducing mapped length of a mapped task at runtime after a clear (#25531)
+- Fix ``airflow db reset`` when dangling tables exist (#25441)
+- Change ``disable_verify_ssl`` behaviour (#25023)
+- Set default task group in dag.add_task method (#25000)
+- Added exception catching to send default email if template file raises any exception (#24943)
+- Removed interfering force of index. (#25404)
+- Remove useless logging line (#25347)
+- Don't mistakenly take a lock on DagRun via ``ti.refresh_from_db`` (#25312)
+- Adding mysql index hint to use index on ``task_instance.state`` in critical section query (#25673)
+- Configurable umask to all daemonized processes. (#25664)
+- Fix the errors raised when None is passed to template filters (#25593)
+- Allow wildcarded CORS origins (#25553)
+- Fix "This Session's transaction has been rolled back" (#25532)
+- Fix Serialization error in ``TaskCallbackRequest`` (#25471)
+- fix - resolve bash by absolute path (#25331)
+- Add ``__repr__`` to ParamsDict class (#25305)
+- Only load distribution of a name once (#25296)
+- convert ``TimeSensorAsync`` ``target_time`` to utc on call time (#25221)
+- call ``updateNodeLabels`` after ``expandGroup`` (#25217)
+- Stop SLA callbacks gazumping other callbacks and DOS'ing the ``DagProcessorManager`` queue (#25147)
+- Fix ``invalidateQueries`` call (#25097)
+- ``airflow/www/package.json``: Add name, version fields. (#25065)
+- No grid auto-refresh for backfill dag runs (#25042)
+- Fix tag link on dag detail page (#24918)
+- Fix zombie task handling with multiple schedulers (#24906)
+- Bind log server on worker to IPv6 address (#24755) (#24846)
+- Add ``%z`` for ``%(asctime)s`` to fix timezone for logs on UI (#24811)
+- ``TriggerDagRunOperator.operator_extra_links`` is attr (#24676)
+- Send DAG timeout callbacks to processor outside of ``prohibit_commit`` (#24366)
+- Don't rely on current ORM structure for db clean command (#23574)
+- Clear next method when clearing TIs (#23929)
+- Rotate session id during login (#25771)
+
+Doc only changes
+^^^^^^^^^^^^^^^^
+
+- Update set-up-database.rst (#24983)
+- Fix syntax in mysql setup documentation (#24893) (#24939)
+- Note how DAG policy works with default_args (#24804)
+- Update PythonVirtualenvOperator Howto (#24782)
+- Doc: Add hyperlinks to Github PRs for Release Notes (#24532)
+
+Misc/Internal
+^^^^^^^^^^^^^
+
+- Bump cattrs version (#25689)
+- Include missing mention of ``external_executor_id`` in ``sql_engine_collation_for_ids`` docs (#25197)
+- Refactor ``DR.task_instance_scheduling_decisions`` (#24774)
+- Sort operator extra links (#24992)
+- Extends ``resolve_xcom_backend`` function level documentation (#24965)
+- Upgrade FAB to 4.1.3 (#24884)
+- Limit Flask to <2.3 in the wake of 2.2 breaking our tests (#25511)
+- Limit astroid version to < 2.12 (#24982)
+- Move javascript compilation to host (#25169)
+- Bump typing-extensions and mypy for ParamSpec (#25088)
+
+
 Airflow 2.3.3 (2022-07-09)
 --------------------------
 
diff --git a/newsfragments/23574.feature.rst b/newsfragments/23574.feature.rst
deleted file mode 100644
index 805b7b18bd..0000000000
--- a/newsfragments/23574.feature.rst
+++ /dev/null
@@ -1 +0,0 @@
-Command ``airflow db clean`` now archives data before purging.
diff --git a/newsfragments/24755.improvement.rst b/newsfragments/24755.improvement.rst
deleted file mode 100644
index 1a75c283f4..0000000000
--- a/newsfragments/24755.improvement.rst
+++ /dev/null
@@ -1 +0,0 @@
-Log server on worker binds IPv6 interface.
diff --git a/newsfragments/24811.significant.rst b/newsfragments/24811.significant.rst
deleted file mode 100644
index cb7208843c..0000000000
--- a/newsfragments/24811.significant.rst
+++ /dev/null
@@ -1,22 +0,0 @@
-Added new config ``[logging]log_formatter_class`` to fix timezone display for logs on UI
-
-If you are using a custom Formatter subclass in your ``[logging]logging_config_class``, please inherit from ``airflow.utils.log.timezone_aware.TimezoneAware`` instead of ``logging.Formatter``.
-For example, in your ``custom_config.py``:
-
-.. code-block:: python
-   from airflow.utils.log.timezone_aware import TimezoneAware
-
-   # before
-   class YourCustomFormatter(logging.Formatter):
-       ...
-
-
-   # after
-   class YourCustomFormatter(TimezoneAware):
-       ...
-
-
-   AIRFLOW_FORMATTER = LOGGING_CONFIG["formatters"]["airflow"]
-   AIRFLOW_FORMATTER["class"] = "somewhere.your.custom_config.YourCustomFormatter"
-   # or use TimezoneAware class directly. If you don't have custom Formatter.
-   AIRFLOW_FORMATTER["class"] = "airflow.utils.log.timezone_aware.TimezoneAware"
diff --git a/newsfragments/25147.bugfix.rst b/newsfragments/25147.bugfix.rst
deleted file mode 100644
index 2d4523604c..0000000000
--- a/newsfragments/25147.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-``DagProcessorManager`` callback queue changed to queue SLAs at the back (stops DAG processing stalling due to SLAs)
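The newsfragment above describes reordering the ``DagProcessorManager`` callback queue so SLA callbacks can no longer starve ("gazump") DAG-processing work. An illustrative sketch of such a policy, assuming nothing about the actual manager internals (the names below are invented):

```python
from collections import deque

# Illustrative queueing policy: SLA callbacks wait at the back of the
# queue, while ordinary DAG-processing callbacks go to the front.
queue = deque()

def enqueue(callback, is_sla):
    if is_sla:
        queue.append(callback)       # SLAs go to the back after the fix
    else:
        queue.appendleft(callback)   # processing callbacks stay urgent

enqueue("sla_miss_cb", True)
enqueue("dag_parse_cb", False)
assert list(queue) == ["dag_parse_cb", "sla_miss_cb"]
```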


[airflow] 02/04: Help pip resolver make better decision on Pyarrow (#25791)


ephraimanierobi pushed a commit to branch v2-3-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d6e1e53652608e13f75dd1d99a4f9c8695fb666e
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Thu Aug 18 17:13:21 2022 +0200

    Help pip resolver make better decision on Pyarrow (#25791)
    
    (cherry picked from commit c0973cd61e353886fe223e2a335c74afb9fac06c)
---
 Dockerfile    | 4 +++-
 Dockerfile.ci | 4 +++-
 2 files changed, 6 insertions(+), 2 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index 273f8de939..0dfade43b2 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1218,7 +1218,9 @@ ARG ADDITIONAL_PYTHON_DEPS=""
 # Those are additional constraints that are needed for some extras but we do not want to
 # Force them on the main Airflow package.
 # * dill<0.3.3 required by apache-beam
-ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3"
+# * pyarrow>=6.0.0 is added because for Python 3.10 the pip resolver decides to downgrade pyarrow to 5
+#   even though 6.0.0 works with Python 3.10 and the other dependencies; the limit helps the resolver make better decisions
+ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 pyarrow>=6.0.0"
 
 ENV ADDITIONAL_PYTHON_DEPS=${ADDITIONAL_PYTHON_DEPS} \
     INSTALL_PACKAGES_FROM_CONTEXT=${INSTALL_PACKAGES_FROM_CONTEXT} \
diff --git a/Dockerfile.ci b/Dockerfile.ci
index 1c059854ad..adbbedfee8 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -1175,7 +1175,9 @@ RUN echo "Airflow version: ${AIRFLOW_VERSION}"
 # Those are additional constraints that are needed for some extras but we do not want to
 # force them on the main Airflow package. Those limitations are:
 # * dill<0.3.3 required by apache-beam
-ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3"
+# * pyarrow>=6.0.0 is added because for Python 3.10 the pip resolver decides to downgrade pyarrow to 5
+#   even though 6.0.0 works with Python 3.10 and the other dependencies; the limit helps the resolver make better decisions
+ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 pyarrow>=6.0.0"
 ARG UPGRADE_TO_NEWER_DEPENDENCIES="false"
 ENV EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS=${EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS} \
     UPGRADE_TO_NEWER_DEPENDENCIES=${UPGRADE_TO_NEWER_DEPENDENCIES}
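The ``EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS`` build arg above is consumed during the image's eager-upgrade step. A rough, side-effect-free sketch of the command it feeds (the exact invocation in the build scripts is an assumption here; the command is composed but never executed):

```python
# Hypothetical composition of the eager-upgrade install command; the real
# Dockerfile routes this value through several build args and scripts.
eager_upgrade_additional_requirements = "dill<0.3.3 pyarrow>=6.0.0"

cmd = ["pip", "install", "--upgrade", "--upgrade-strategy", "eager",
       "apache-airflow", *eager_upgrade_additional_requirements.split()]

# Composed, not executed, so the sketch stays side-effect free:
print(" ".join(cmd))
```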


[airflow] 03/04: Update version to 2.3.4


ephraimanierobi pushed a commit to branch v2-3-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 65aca966381b80f5212ed027b68a8b37ddf6f864
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Tue Aug 16 11:54:50 2022 +0100

    Update version to 2.3.4
---
 README.md                                              | 14 +++++++-------
 airflow/utils/db.py                                    |  1 +
 .../apache-airflow/installation/supported-versions.rst |  2 +-
 docs/docker-stack/README.md                            | 10 +++++-----
 .../customizing/pypi-extras-and-deps.sh                |  2 +-
 .../customizing/pypi-selected-version.sh               |  2 +-
 .../extending/add-apt-packages/Dockerfile              |  2 +-
 .../extending/add-build-essential-extend/Dockerfile    |  2 +-
 .../docker-examples/extending/add-providers/Dockerfile |  2 +-
 .../extending/add-pypi-packages/Dockerfile             |  2 +-
 .../extending/custom-providers/Dockerfile              |  2 +-
 .../extending/embedding-dags/Dockerfile                |  2 +-
 .../extending/writable-directory/Dockerfile            |  2 +-
 docs/docker-stack/entrypoint.rst                       | 18 +++++++++---------
 scripts/ci/pre_commit/pre_commit_supported_versions.py |  2 +-
 setup.py                                               |  2 +-
 16 files changed, 34 insertions(+), 33 deletions(-)

diff --git a/README.md b/README.md
index eaaaa69d6a..5c772b2f10 100644
--- a/README.md
+++ b/README.md
@@ -85,7 +85,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-|                     | Main version (dev)           | Stable version (2.3.3)       |
+|                     | Main version (dev)           | Stable version (2.3.4)       |
 |---------------------|------------------------------|------------------------------|
 | Python              | 3.7, 3.8, 3.9, 3.10          | 3.7, 3.8, 3.9, 3.10          |
 | Platform            | AMD64/ARM64(\*)              | AMD64/ARM64(\*)              |
@@ -160,15 +160,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.3.3' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.3/constraints-3.7.txt"
+pip install 'apache-airflow==2.3.4' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.4/constraints-3.7.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.3.3' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.3/constraints-3.7.txt"
+pip install 'apache-airflow[postgres,google]==2.3.4' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.4/constraints-3.7.txt"
 ```
 
 For information on installing provider packages, check
@@ -273,7 +273,7 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State     | First Release   | Limited Support   | EOL/Terminated   |
 |-----------|-----------------------|-----------|-----------------|-------------------|------------------|
-| 2         | 2.3.3                 | Supported | Dec 17, 2020    | TBD               | TBD              |
+| 2         | 2.3.4                 | Supported | Dec 17, 2020    | TBD               | TBD              |
 | 1.10      | 1.10.15               | EOL       | Aug 27, 2018    | Dec 17, 2020      | June 17, 2021    |
 | 1.9       | 1.9.0                 | EOL       | Jan 03, 2018    | Aug 27, 2018      | Aug 27, 2018     |
 | 1.8       | 1.8.2                 | EOL       | Mar 19, 2017    | Jan 03, 2018      | Jan 03, 2018     |
@@ -303,7 +303,7 @@ They are based on the official release schedule of Python and Kubernetes, nicely
 2. The "oldest" supported version of Python/Kubernetes is the default one until we decide to switch to
    later version. "Default" is only meaningful in terms of "smoke tests" in CI PRs, which are run using this
    default version and the default reference image available. Currently `apache/airflow:latest`
-   and `apache/airflow:2.3.3` images are Python 3.7 images. This means that default reference image will
+   and `apache/airflow:2.3.4` images are Python 3.7 images. This means that default reference image will
    become the default at the time when we start preparing for dropping 3.7 support which is few months
    before the end of life for Python 3.7.
 
diff --git a/airflow/utils/db.py b/airflow/utils/db.py
index 192a2343c4..25f861423b 100644
--- a/airflow/utils/db.py
+++ b/airflow/utils/db.py
@@ -94,6 +94,7 @@ REVISION_HEADS_MAP = {
     "2.3.1": "1de7bc13c950",
     "2.3.2": "3c94c427fdf6",
     "2.3.3": "f5fcbda3e651",
+    "2.3.4": "f5fcbda3e651",
 }
 
 
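The one-line change to ``REVISION_HEADS_MAP`` above records that 2.3.4 reuses the 2.3.3 alembic head, i.e. the patch release ships no new database migrations. A minimal sketch (entries copied from the diff; the lookup helper is hypothetical):

```python
# Structure assumed to mirror airflow.utils.db.REVISION_HEADS_MAP;
# only a few entries are reproduced here.
REVISION_HEADS_MAP = {
    "2.3.2": "3c94c427fdf6",
    "2.3.3": "f5fcbda3e651",
    "2.3.4": "f5fcbda3e651",  # same head: 2.3.4 adds no new migrations
}

def revision_for(version):
    """Return the alembic revision head a given Airflow version expects."""
    return REVISION_HEADS_MAP[version]

# 2.3.3 and 2.3.4 share a head, so moving between them needs no migration.
assert revision_for("2.3.4") == revision_for("2.3.3")
```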
diff --git a/docs/apache-airflow/installation/supported-versions.rst b/docs/apache-airflow/installation/supported-versions.rst
index 3867ed0305..91241b6084 100644
--- a/docs/apache-airflow/installation/supported-versions.rst
+++ b/docs/apache-airflow/installation/supported-versions.rst
@@ -29,7 +29,7 @@ Apache Airflow version life cycle:
 =========  =====================  =========  ===============  =================  ================
 Version    Current Patch/Minor    State      First Release    Limited Support    EOL/Terminated
 =========  =====================  =========  ===============  =================  ================
-2          2.3.3                  Supported  Dec 17, 2020     TBD                TBD
+2          2.3.4                  Supported  Dec 17, 2020     TBD                TBD
 1.10       1.10.15                EOL        Aug 27, 2018     Dec 17, 2020       June 17, 2021
 1.9        1.9.0                  EOL        Jan 03, 2018     Aug 27, 2018       Aug 27, 2018
 1.8        1.8.2                  EOL        Mar 19, 2017     Jan 03, 2018       Jan 03, 2018
diff --git a/docs/docker-stack/README.md b/docs/docker-stack/README.md
index c474d888a6..c5b68ca8c4 100644
--- a/docs/docker-stack/README.md
+++ b/docs/docker-stack/README.md
@@ -31,12 +31,12 @@ Every time a new version of Airflow is released, the images are prepared in the
 [apache/airflow DockerHub](https://hub.docker.com/r/apache/airflow)
 for all the supported Python versions.
 
-You can find the following images there (Assuming Airflow version `2.3.3`):
+You can find the following images there (Assuming Airflow version `2.3.4`):
 
 * `apache/airflow:latest` - the latest released Airflow image with default Python version (3.7 currently)
 * `apache/airflow:latest-pythonX.Y` - the latest released Airflow image with specific Python version
-* `apache/airflow:2.3.3` - the versioned Airflow image with default Python version (3.7 currently)
-* `apache/airflow:2.3.3-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:2.3.4` - the versioned Airflow image with default Python version (3.7 currently)
+* `apache/airflow:2.3.4-pythonX.Y` - the versioned Airflow image with specific Python version
 
 Those are "reference" regular images. They contain the most common set of extras, dependencies and providers that are
 often used by the users and they are good to "try-things-out" when you want to just take Airflow for a spin,
@@ -47,8 +47,8 @@ via [Building the image](https://airflow.apache.org/docs/docker-stack/build.html
 
 * `apache/airflow:slim-latest`              - the latest released Airflow image with default Python version (3.7 currently)
 * `apache/airflow:slim-latest-pythonX.Y`    - the latest released Airflow image with specific Python version
-* `apache/airflow:slim-2.3.3`           - the versioned Airflow image with default Python version (3.7 currently)
-* `apache/airflow:slim-2.3.3-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:slim-2.3.4`           - the versioned Airflow image with default Python version (3.7 currently)
+* `apache/airflow:slim-2.3.4-pythonX.Y` - the versioned Airflow image with specific Python version
 
 The Apache Airflow image provided as convenience package is optimized for size, and
 it provides just a bare minimal set of the extras and dependencies installed and in most cases
diff --git a/docs/docker-stack/docker-examples/customizing/pypi-extras-and-deps.sh b/docs/docker-stack/docker-examples/customizing/pypi-extras-and-deps.sh
index 5bf3893a2e..eb70e9923a 100755
--- a/docs/docker-stack/docker-examples/customizing/pypi-extras-and-deps.sh
+++ b/docs/docker-stack/docker-examples/customizing/pypi-extras-and-deps.sh
@@ -26,7 +26,7 @@ pushd "${TEMP_DOCKER_DIR}"
 cp "${AIRFLOW_SOURCES}/Dockerfile" "${TEMP_DOCKER_DIR}"
 
 # [START build]
-export AIRFLOW_VERSION=2.3.3
+export AIRFLOW_VERSION=2.3.4
 export DEBIAN_VERSION="bullseye"
 export DOCKER_BUILDKIT=1
 
diff --git a/docs/docker-stack/docker-examples/customizing/pypi-selected-version.sh b/docs/docker-stack/docker-examples/customizing/pypi-selected-version.sh
index f53bae8c4f..0bb943599e 100755
--- a/docs/docker-stack/docker-examples/customizing/pypi-selected-version.sh
+++ b/docs/docker-stack/docker-examples/customizing/pypi-selected-version.sh
@@ -26,7 +26,7 @@ pushd "${TEMP_DOCKER_DIR}"
 cp "${AIRFLOW_SOURCES}/Dockerfile" "${TEMP_DOCKER_DIR}"
 
 # [START build]
-export AIRFLOW_VERSION=2.3.3
+export AIRFLOW_VERSION=2.3.4
 export DOCKER_BUILDKIT=1
 
 docker build . \
diff --git a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
index 6a1ba125d1..a4f6bbd626 100644
--- a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
index d316e18be8..c8adefc6df 100644
--- a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
index ae88a437a6..e5d2275274 100644
--- a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
index 06a9df3427..fd3b91b0cb 100644
--- a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 RUN pip install --no-cache-dir lxml
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
index 0395ee6dbe..188825ebf4 100644
--- a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 RUN pip install --no-cache-dir apache-airflow-providers-docker==2.5.1
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
index 9c22233f85..6ae531caa5 100644
--- a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 
 COPY --chown=airflow:root test_dag.py /opt/airflow/dags
 
diff --git a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
index f5225819ca..eb5ad8243e 100644
--- a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.3.3
+FROM apache/airflow:2.3.4
 RUN umask 0002; \
     mkdir -p ~/writeable-directory
 # [END Dockerfile]
diff --git a/docs/docker-stack/entrypoint.rst b/docs/docker-stack/entrypoint.rst
index f20c143815..094ffd8f18 100644
--- a/docs/docker-stack/entrypoint.rst
+++ b/docs/docker-stack/entrypoint.rst
@@ -132,7 +132,7 @@ if you specify extra arguments. For example:
 
 .. code-block:: bash
 
-  docker run -it apache/airflow:2.3.3-python3.6 bash -c "ls -la"
+  docker run -it apache/airflow:2.3.4-python3.6 bash -c "ls -la"
   total 16
   drwxr-xr-x 4 airflow root 4096 Jun  5 18:12 .
   drwxr-xr-x 1 root    root 4096 Jun  5 18:12 ..
@@ -144,7 +144,7 @@ you pass extra parameters. For example:
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:2.3.3-python3.6 python -c "print('test')"
+  > docker run -it apache/airflow:2.3.4-python3.6 python -c "print('test')"
   test
 
 If the first argument equals "airflow" - the rest of the arguments are treated as an airflow command
 to execute. Example:
 
 .. code-block:: bash
 
-   docker run -it apache/airflow:2.3.3-python3.6 airflow webserver
+   docker run -it apache/airflow:2.3.4-python3.6 airflow webserver
 
 If there are any other arguments - they are simply passed to the "airflow" command
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:2.3.3-python3.6 help
+  > docker run -it apache/airflow:2.3.4-python3.6 help
     usage: airflow [-h] GROUP_OR_COMMAND ...
 
     positional arguments:
@@ -206,7 +206,7 @@ propagation (See the next chapter).
 
 .. code-block:: Dockerfile
 
-    FROM airflow:2.3.3
+    FROM airflow:2.3.4
     COPY my_entrypoint.sh /
     ENTRYPOINT ["/usr/bin/dumb-init", "--", "/my_entrypoint.sh"]
 
@@ -250,7 +250,7 @@ Similarly to custom entrypoint, it can be added to the image by extending it.
 
 .. code-block:: Dockerfile
 
-    FROM airflow:2.3.3
+    FROM airflow:2.3.4
     COPY my_after_entrypoint_script.sh /
 
 Build your image and then you can run this script by running the command:
@@ -363,7 +363,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
     --env "_AIRFLOW_DB_UPGRADE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD=admin" \
-      apache/airflow:2.3.3-python3.8 webserver
+      apache/airflow:2.3.4-python3.8 webserver
 
 
 .. code-block:: bash
@@ -372,7 +372,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
     --env "_AIRFLOW_DB_UPGRADE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-      apache/airflow:2.3.3-python3.8 webserver
+      apache/airflow:2.3.4-python3.8 webserver
 
 The commands above perform initialization of the SQLite database, create admin user with admin password
 and Admin role. They also forward local port ``8080`` to the webserver port and finally start the webserver.
@@ -412,6 +412,6 @@ Example:
     --env "_AIRFLOW_DB_UPGRADE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-      apache/airflow:2.3.3-python3.8 webserver
+      apache/airflow:2.3.4-python3.8 webserver
 
 This method is only available starting from Docker image of Airflow 2.1.1 and above.
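The argument-handling rules quoted in the entrypoint.rst diff above can be sketched as a small dispatch function. This is illustrative only; the real entrypoint is a bash script with more cases, and the helper name is invented:

```python
import shutil

def dispatch(args):
    """Return the command the container would run for the given arguments."""
    if not args:
        return ["airflow"]              # no args: fall back to airflow itself
    if args[0] == "airflow":
        return args                     # already an explicit airflow command
    if shutil.which(args[0]):
        return args                     # first arg is an executable: run verbatim
    return ["airflow", *args]           # anything else is passed to "airflow"

# e.g. `docker run ... help` becomes `airflow help`
assert dispatch(["help"]) == ["airflow", "help"]
```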
diff --git a/scripts/ci/pre_commit/pre_commit_supported_versions.py b/scripts/ci/pre_commit/pre_commit_supported_versions.py
index d21c62194e..0bf08d7cf8 100755
--- a/scripts/ci/pre_commit/pre_commit_supported_versions.py
+++ b/scripts/ci/pre_commit/pre_commit_supported_versions.py
@@ -25,7 +25,7 @@ AIRFLOW_SOURCES = Path(__file__).resolve().parent.parent.parent.parent
 HEADERS = ("Version", "Current Patch/Minor", "State", "First Release", "Limited Support", "EOL/Terminated")
 
 SUPPORTED_VERSIONS = (
-    ("2", "2.3.3", "Supported", "Dec 17, 2020", "TBD", "TBD"),
+    ("2", "2.3.4", "Supported", "Dec 17, 2020", "TBD", "TBD"),
     ("1.10", "1.10.15", "EOL", "Aug 27, 2018", "Dec 17, 2020", "June 17, 2021"),
     ("1.9", "1.9.0", "EOL", "Jan 03, 2018", "Aug 27, 2018", "Aug 27, 2018"),
     ("1.8", "1.8.2", "EOL", "Mar 19, 2017", "Jan 03, 2018", "Jan 03, 2018"),
diff --git a/setup.py b/setup.py
index 4b90d36a85..13a7e1668f 100644
--- a/setup.py
+++ b/setup.py
@@ -45,7 +45,7 @@ PY39 = sys.version_info >= (3, 9)
 
 logger = logging.getLogger(__name__)
 
-version = '2.3.3'
+version = '2.3.4'
 
 AIRFLOW_SOURCES_ROOT = Path(__file__).parent.resolve()
 my_dir = dirname(__file__)


[airflow] 01/04: Remove certifi limitations from eager upgrade limits (#23995)


ephraimanierobi pushed a commit to branch v2-3-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 151eebf112aae5022ae9fbc3c19edf319687ba0c
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Wed Jun 1 19:37:16 2022 +0200

    Remove certifi limitations from eager upgrade limits (#23995)
    
    The certifi limitation was introduced to keep snowflake happy while
    performing eager upgrade, because it added limits on certifi. However,
    it seems this is no longer a limitation in the latest versions of the
    snowflake python connector, so we can safely remove it from here.
    
    The only remaining limit is dill, but this one still holds.
    
    (cherry picked from commit e41b5a012427b5e7eab49de702b83dba4fc2fa13)
---
 Dockerfile.ci | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/Dockerfile.ci b/Dockerfile.ci
index bcbecd54e3..1c059854ad 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -1174,9 +1174,8 @@ RUN echo "Airflow version: ${AIRFLOW_VERSION}"
 
 # Those are additional constraints that are needed for some extras but we do not want to
 # force them on the main Airflow package. Those limitations are:
-# * certifi<2021.0.0: required by snowflake provider
 # * dill<0.3.3 required by apache-beam
-ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3 certifi<2021.0.0"
+ARG EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS="dill<0.3.3"
 ARG UPGRADE_TO_NEWER_DEPENDENCIES="false"
 ENV EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS=${EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS} \
     UPGRADE_TO_NEWER_DEPENDENCIES=${UPGRADE_TO_NEWER_DEPENDENCIES}