Posted to commits@airflow.apache.org by ep...@apache.org on 2023/10/05 10:45:47 UTC

[airflow] branch v2-7-test updated (c2173d05a8 -> d10f43c5b0)

This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


    omit c2173d05a8 Parse 'docker context ls --format=json' correctly (#34711)
    omit 2a4251d415 Avoid WSL2 ones when finding a context for Breeze (#34538)
    omit ea5c070d43 Support rootless mode for docker. (#34537)
    omit 4ba5e9e985 Fix broken breeze by fixing package version (#34701)
    omit 5c42275019 Update RELEASE_NOTES.rst
    omit 80a9833d2c Update version to 2.7.2
    omit 4461f60dff Add information about drop support MsSQL as DB Backend in the future (#34375)
    omit e2db96b7ea Update min-sqlalchemy version to account for latest features used (#34293)
    omit 959d5cd591 Import AUTH_REMOTE_USER from FAB in WSGI middleware example (#34721)
    omit 3c01559c82 Fixed rows count in the migration script (#34348)
    omit 53499f41f9 Fix SesssionExemptMixin spelling (#34696)
    omit 9b3c26a77f Fix foreign key warning re ab_user.id (#34656)
    omit 89ed34e8a1 Fix: make dry run optional for patch task instance  (#34568)
    omit 78dbe3bf9c Correct docs for multi-value select (#34690)
    omit a95050812a Document how to use the system's timezone database (#34667)
    omit 2dbc7a9c4e Fix non deterministic datetime deserialization (#34492)
    omit bbba4b4aef Restrict `astroid` version < 3 (#34658)
    omit e605d1d9bb Fail dag test if defer without triggerer (#34619)
    omit b4b5e28724 fix connections exported output (#34640)
    omit 295d858659 Don't run isort when creating new alembic migrations (#34636)
    omit 9c1da80412 Use iterative loop to look for mapped parent (#34622)
    omit afd178bdf5 Fix some whitespace (#34632)
    omit 610dcf3c33 Clarify what landing time means in doc (#34608)
    omit f15d02fa6a fix(cli): remove "to backfill" from --task-regex help message (#34598)
    omit 2e513f931c Fix is_parent_mapped value by checking if any of the parent tg is mapped (#34587)
    omit 621616f464 Avoid top-level airflow import to avoid circular dependency (#34586)
    omit c13c45cf18 Fix: Add 3.11 as supported Python version (#34575)
    omit 4a92a24568 Fix ODBC Connection page formatting (#34572)
    omit 383c1063e9 Fix screenshot in dynamic task mapping docs (#34566)
    omit 33b804d289 Restore EXISTING_ROLES from security.py (#34523)
    omit 81fd828f22 Add more exemptions to lengthy metric list (#34531)
    omit 76d56815cd using seconds for failed scenarios too (#34532)
    omit a7fbe5bec1 Change two whitespaces to one (#34519)
    omit d0f94edcaa Fix class reference in Public Interface documentation (#34454)
    omit ff92c1debb Clarify var.value.get  and var.json.get usage (#34411)
    omit 1983522d1a Deprecate numeric type python version in PythonVirtualEnvOperator (#34359)
    omit b9393a8b04 Schedule default value description (#34291)
    omit c71bcdf703 Docs for triggered_dataset_event (#34410)
    omit 864787dc37 Add LocalKubernetesExecutor in the config.yml's executor description (#34414)
    omit ee72e76d65 Fix dag warning endpoint permissions (#34355)
    omit d76001123f Update cluster-policies.rst (#34174)
    omit 27791a9ac5 Refactor os.path.splitext to Path.* (#34352)
    omit 3724360520 Fix spelling errors in readme and license files (#34383)
    omit 23654a4997 docs: correct typo in best-practices.rst (#34361)
    omit 9518ce635c Check that dag_ids passed in request are consistent (#34366)
     new 6da499c380 docs: correct typo in best-practices.rst (#34361)
     new c410f0e211 Fix spelling errors in readme and license files (#34383)
     new d9e36e1efe Refactor os.path.splitext to Path.* (#34352)
     new eefaeecd2e Update cluster-policies.rst (#34174)
     new 5a475bf56a Fix dag warning endpoint permissions (#34355)
     new e45e663d58 Add LocalKubernetesExecutor in the config.yml's executor description (#34414)
     new f3b44c6813 Docs for triggered_dataset_event (#34410)
     new 4fcdeaac63 Schedule default value description (#34291)
     new 701ac6de95 Deprecate numeric type python version in PythonVirtualEnvOperator (#34359)
     new dd6e614e94 Clarify var.value.get  and var.json.get usage (#34411)
     new e7944e010f Fix class reference in Public Interface documentation (#34454)
     new 1480d58704 Change two whitespaces to one (#34519)
     new bc2bef8a79 using seconds for failed scenarios too (#34532)
     new ec7007e7d7 Add more exemptions to lengthy metric list (#34531)
     new c42012f60d Restore EXISTING_ROLES from security.py (#34523)
     new e1d41ca98e Fix screenshot in dynamic task mapping docs (#34566)
     new b92ec745ed Fix ODBC Connection page formatting (#34572)
     new 4dceaed81b Fix: Add 3.11 as supported Python version (#34575)
     new b2bad04ba5 Avoid top-level airflow import to avoid circular dependency (#34586)
     new ba1b1da078 Fix is_parent_mapped value by checking if any of the parent tg is mapped (#34587)
     new a0b4ef16d2 fix(cli): remove "to backfill" from --task-regex help message (#34598)
     new bb5fbab13d Clarify what landing time means in doc (#34608)
     new a99cd81b53 Fix some whitespace (#34632)
     new f860f9d1a0 Use iterative loop to look for mapped parent (#34622)
     new 79a87dff32 Don't run isort when creating new alembic migrations (#34636)
     new af5c86e3df fix connections exported output (#34640)
     new 0c9c86a3b7 Fail dag test if defer without triggerer (#34619)
     new 262e231c16 Restrict `astroid` version < 3 (#34658)
     new 9ecdff0447 Fix non deterministic datetime deserialization (#34492)
     new 48e974db2f Document how to use the system's timezone database (#34667)
     new 8a50232e7f Correct docs for multi-value select (#34690)
     new c32570f2b4 Fix: make dry run optional for patch task instance  (#34568)
     new 033ffa91be Fix foreign key warning re ab_user.id (#34656)
     new 7fcb9317d2 Fix SesssionExemptMixin spelling (#34696)
     new 54f23231be Fixed rows count in the migration script (#34348)
     new 14e01708c3 Import AUTH_REMOTE_USER from FAB in WSGI middleware example (#34721)
     new 6d62f3febe Update min-sqlalchemy version to account for latest features used (#34293)
     new 6e22dbabb3 Add information about drop support MsSQL as DB Backend in the future (#34375)
     new 12f2fd5ced Update version to 2.7.2
     new 3f89ab4063 Update RELEASE_NOTES.rst
     new c86ebeb897 Fix broken breeze by fixing package version (#34701)
     new d2436e0046 Support rootless mode for docker. (#34537)
     new df43cbb6c5 Avoid WSL2 ones when finding a context for Breeze (#34538)
     new d10f43c5b0 Parse 'docker context ls --format=json' correctly (#34711)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (c2173d05a8)
            \
             N -- N -- N   refs/heads/v2-7-test (d10f43c5b0)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 44 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
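
For example, a minimal sketch (not part of the original notice; it assumes
a local clone of apache/airflow in which the old objects are still present,
as the paragraph above states for "omit"ted revisions) that checks whether
the two heads named above still exist as commit objects:

    # Minimal sketch: verify that an "omit"ted revision is retained in the
    # local object database while the branch now points at the new head.
    import subprocess

    def commit_exists(sha: str) -> bool:
        # `git cat-file -e <sha>^{commit}` exits with 0 if the object exists.
        result = subprocess.run(
            ["git", "cat-file", "-e", f"{sha}^{{commit}}"],
            capture_output=True,
        )
        return result.returncode == 0

    print(commit_exists("c2173d05a8"))  # omitted old head - not gone
    print(commit_exists("d10f43c5b0"))  # new head of v2-7-test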


Summary of changes:
 airflow/www/auth.py    | 37 ++++----------------
 tests/www/test_auth.py | 93 --------------------------------------------------
 2 files changed, 7 insertions(+), 123 deletions(-)
 delete mode 100644 tests/www/test_auth.py


[airflow] 08/44: Schedule default value description (#34291)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4fcdeaac63136c1815c8289a7f9aeed5033c87b6
Author: Rafael Carrasco <ra...@gmail.com>
AuthorDate: Sat Sep 16 12:08:55 2023 -0500

    Schedule default value description (#34291)
    
    * added clarification around schedule parameter
    
    * rewriting explanation of default value
    
    * Update airflow/models/dag.py
    
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    
    * Update dag.py
    
    * Fix trailing whitespace
    
    ---------
    
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    (cherry picked from commit 5ab7517258716445ac583c41656fdb17f87f57a8)
---
 airflow/models/dag.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index d8c8628770..ca3cce4cc4 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -305,7 +305,8 @@ class DAG(LoggingMixin):
     :param description: The description for the DAG to e.g. be shown on the webserver
     :param schedule: Defines the rules according to which DAG runs are scheduled. Can
         accept cron string, timedelta object, Timetable, or list of Dataset objects.
-        See also :doc:`/howto/timetable`.
+        If this is not provided, the DAG will be set to the default
+        schedule ``timedelta(days=1)``. See also :doc:`/howto/timetable`.
     :param start_date: The timestamp from which the scheduler will
         attempt to backfill
     :param end_date: A date beyond which your DAG won't run, leave to None
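
Read as a usage example, the docstring change above means that a DAG
constructed without an explicit ``schedule`` argument runs daily. A minimal
sketch (the dag_id is hypothetical, not taken from the commit):

    # Minimal sketch: no schedule argument is passed, so per the docstring
    # above the DAG falls back to the default schedule timedelta(days=1),
    # i.e. one run per day.
    from datetime import datetime

    from airflow.models.dag import DAG

    with DAG(
        dag_id="example_default_schedule",  # hypothetical name
        start_date=datetime(2023, 1, 1),
    ):
        pass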


[airflow] 30/44: Document how to use the system's timezone database (#34667)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 48e974db2f8ba29e49f350116d7c0ef2584c29b8
Author: Bolke de Bruin <bo...@xs4all.nl>
AuthorDate: Thu Sep 28 16:21:30 2023 +0200

    Document how to use the system's timezone database (#34667)
    
    The latest release of pendulum (2.1.2) contains an outdated
    timezone database. It is better to rely on the local
    system's database.
    
    (cherry picked from commit d0f246398ff871bfa177f91912980d8a0f0f1c50)
---
 docs/apache-airflow/authoring-and-scheduling/timezone.rst | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/apache-airflow/authoring-and-scheduling/timezone.rst b/docs/apache-airflow/authoring-and-scheduling/timezone.rst
index 86ea468534..6475a8abfc 100644
--- a/docs/apache-airflow/authoring-and-scheduling/timezone.rst
+++ b/docs/apache-airflow/authoring-and-scheduling/timezone.rst
@@ -40,6 +40,10 @@ The time zone is set in ``airflow.cfg``. By default it is set to UTC, but you ch
 an arbitrary IANA time zone, e.g. ``Europe/Amsterdam``. It is dependent on ``pendulum``, which is more accurate than ``pytz``.
 Pendulum is installed when you install Airflow.
 
+.. note::
+     Pendulum relies by default on its own timezone database, which is not updated as frequently as the IANA database.
+     You can make Pendulum rely on the system's database by setting the ``PYTZDATA_TZDATADIR`` environment variable
+     to your system's database, e.g. ``/usr/share/zoneinfo``.
 
 Web UI
 ------
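
A minimal sketch of the workaround the note describes (assuming pendulum's
bundled pytzdata reads ``PYTZDATA_TZDATADIR`` when timezone data is first
loaded, and that ``/usr/share/zoneinfo`` exists on your system):

    # Minimal sketch: point pendulum at the system timezone database.
    # The variable must be set before pendulum loads its timezone data,
    # so set it before the import.
    import os

    os.environ["PYTZDATA_TZDATADIR"] = "/usr/share/zoneinfo"

    import pendulum

    print(pendulum.timezone("Europe/Amsterdam"))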


[airflow] 37/44: Update min-sqlalchemy version to account for latest features used (#34293)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6d62f3febeaf213ffc11b1a47dcbeb7292ed711e
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Tue Sep 12 11:49:56 2023 +0200

    Update min-sqlalchemy version to account for latest features used (#34293)
    
    Some of the recent SQLAlchemy changes do not work with our minimum
    supported SQLAlchemy version - for example, the `where` syntax does
    not allow more than one clause, and we are already passing more
    than one in _do_delete_old_records (added in #33527). That syntax
    was only added in SQLAlchemy 1.4.28, while our minimum version was
    1.4.27.
    
    This change bumps the minimum SQLAlchemy version to 1.4.28, and it also
    adds a special test job, run only on Postgres, that downgrades
    SQLAlchemy to the minimum supported version (retrieved from
    setup.cfg). This way we will be able to detect such incompatible
    changes at PR time. The job is driven by a new `--downgrade-sqlalchemy`
    flag on the test command that works similarly to the earlier
    `--upgrade-boto`.

    We also enable the `--upgrade-boto` and `--downgrade-sqlalchemy` flags
    for the `breeze shell` command, so that both can easily be tested
    there.
    
    (cherry picked from commit efbead9fe7462b3634b6d9c842bd9a7ac78a0207)
---
 .github/workflows/ci.yml                           | 49 ++++++++++++++++++++++
 Dockerfile.ci                                      | 23 ++++++----
 .../airflow_breeze/commands/developer_commands.py  |  8 ++++
 .../commands/developer_commands_config.py          |  7 ++++
 .../airflow_breeze/commands/testing_commands.py    | 12 +++---
 .../commands/testing_commands_config.py            |  1 +
 .../src/airflow_breeze/params/shell_params.py      |  1 +
 .../src/airflow_breeze/utils/common_options.py     | 12 ++++++
 .../airflow_breeze/utils/docker_command_utils.py   |  2 +
 images/breeze/output-commands-hash.txt             |  6 +--
 images/breeze/output_shell.svg                     | 34 +++++++++++----
 images/breeze/output_testing_tests.svg             | 26 +++++++-----
 scripts/ci/docker-compose/_docker.env              |  1 +
 scripts/ci/docker-compose/base.yml                 |  1 +
 scripts/ci/docker-compose/devcontainer.env         |  1 +
 scripts/docker/entrypoint_ci.sh                    | 23 ++++++----
 setup.cfg                                          |  2 +-
 17 files changed, 165 insertions(+), 44 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 6ca7543136..39ed3ac08d 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -1021,6 +1021,55 @@ jobs:
         uses: ./.github/actions/post_tests_failure
         if: failure()
 
+  tests-postgres-min-sqlalchemy:
+    timeout-minutes: 130
+    name: >
+      MinSQLAlchemy${{needs.build-info.outputs.default-postgres-version}},
+      Py${{needs.build-info.outputs.default-python-version}}:
+      ${{needs.build-info.outputs.parallel-test-types-list-as-string}}
+    runs-on: "${{needs.build-info.outputs.runs-on}}"
+    needs: [build-info, wait-for-ci-images]
+    env:
+      RUNS_ON: "${{needs.build-info.outputs.runs-on}}"
+      PARALLEL_TEST_TYPES: "${{needs.build-info.outputs.parallel-test-types-list-as-string}}"
+      PR_LABELS: "${{needs.build-info.outputs.pull-request-labels}}"
+      FULL_TESTS_NEEDED: "${{needs.build-info.outputs.full-tests-needed}}"
+      DEBUG_RESOURCES: "${{needs.build-info.outputs.debug-resources}}"
+      BACKEND: "postgres"
+      PYTHON_MAJOR_MINOR_VERSION: "${{needs.build-info.outputs.default-python-version}}"
+      PYTHON_VERSION: "${{needs.build-info.outputs.default-python-version}}"
+      POSTGRES_VERSION: "${{needs.build-info.outputs.default-postgres-version}}"
+      BACKEND_VERSION: "${{needs.build-info.outputs.default-postgres-version}}"
+      DOWNGRADE_SQLALCHEMY: "true"
+      JOB_ID: >
+        postgres-min-sqlalchemy-${{needs.build-info.outputs.default-python-version}}-
+        ${{needs.build-info.outputs.default-postgres-version}}
+      COVERAGE: "${{needs.build-info.outputs.run-coverage}}"
+    if: needs.build-info.outputs.run-tests == 'true'
+    steps:
+      - name: Cleanup repo
+        shell: bash
+        run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*"
+      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+        uses: actions/checkout@v3
+        with:
+          persist-credentials: false
+      - name: >
+          Prepare breeze & CI image: ${{needs.build-info.outputs.default-python-version}}:${{env.IMAGE_TAG}}
+        uses: ./.github/actions/prepare_breeze_and_image
+      - name: >
+          Tests: ${{needs.build-info.outputs.default-python-version}}:
+          ${{needs.build-info.outputs.parallel-test-types-list-as-string}}
+        run: breeze testing tests --run-in-parallel
+      - name: >
+          Post Tests success: ${{needs.build-info.outputs.default-python-version}}:Boto"
+        uses: ./.github/actions/post_tests_success
+        if: success()
+      - name: >
+          Post Tests failure: ${{needs.build-info.outputs.default-python-version}}:Boto"
+        uses: ./.github/actions/post_tests_failure
+        if: failure()
+
   tests-postgres-in-progress-features-disabled:
     timeout-minutes: 130
     name: >
diff --git a/Dockerfile.ci b/Dockerfile.ci
index 9092465c82..a0f47d86d3 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -937,6 +937,22 @@ if [[ ${SKIP_ENVIRONMENT_INITIALIZATION=} != "true" ]]; then
 fi
 
 rm -f "${AIRFLOW_SOURCES}/pytest.ini"
+if [[ ${UPGRADE_BOTO=} == "true" ]]; then
+    echo
+    echo "${COLOR_BLUE}Upgrading boto3, botocore to latest version to run Amazon tests with them${COLOR_RESET}"
+    echo
+    pip uninstall --root-user-action ignore aiobotocore -y || true
+    pip install --root-user-action ignore --upgrade boto3 botocore
+    pip check
+fi
+if [[ ${DOWNGRADE_SQLALCHEMY=} == "true" ]]; then
+    min_sqlalchemy_version=$(grep "sqlalchemy>=" setup.cfg | sed "s/.*>=\([0-9\.]*\).*/\1/")
+    echo
+    echo "${COLOR_BLUE}Downgrading sqlalchemy to minimum supported version: ${min_sqlalchemy_version}${COLOR_RESET}"
+    echo
+    pip install --root-user-action ignore "sqlalchemy==${min_sqlalchemy_version}"
+    pip check
+fi
 
 set +u
 if [[ "${RUN_TESTS}" != "true" ]]; then
@@ -1166,13 +1182,6 @@ else
         exit 1
     fi
 fi
-if [[ ${UPGRADE_BOTO=} == "true" ]]; then
-    echo
-    echo "${COLOR_BLUE}Upgrading boto3, botocore to latest version to run Amazon tests with them${COLOR_RESET}"
-    echo
-    pip uninstall aiobotocore -y || true
-    pip install --upgrade boto3 botocore
-fi
 readonly SELECTED_TESTS CLI_TESTS API_TESTS PROVIDERS_TESTS CORE_TESTS WWW_TESTS \
     ALL_TESTS ALL_PRESELECTED_TESTS
 
diff --git a/dev/breeze/src/airflow_breeze/commands/developer_commands.py b/dev/breeze/src/airflow_breeze/commands/developer_commands.py
index 92b3f7baa0..5c569ba811 100644
--- a/dev/breeze/src/airflow_breeze/commands/developer_commands.py
+++ b/dev/breeze/src/airflow_breeze/commands/developer_commands.py
@@ -51,6 +51,7 @@ from airflow_breeze.utils.common_options import (
     option_celery_broker,
     option_celery_flower,
     option_db_reset,
+    option_downgrade_sqlalchemy,
     option_dry_run,
     option_executor,
     option_force_build,
@@ -70,6 +71,7 @@ from airflow_breeze.utils.common_options import (
     option_platform_single,
     option_postgres_version,
     option_python,
+    option_upgrade_boto,
     option_use_airflow_version,
     option_use_packages_from_dist,
     option_verbose,
@@ -162,6 +164,8 @@ class TimerThread(threading.Thread):
 @option_image_tag_for_running
 @option_max_time
 @option_include_mypy_volume
+@option_upgrade_boto
+@option_downgrade_sqlalchemy
 @option_verbose
 @option_dry_run
 @option_github_repository
@@ -197,6 +201,8 @@ def shell(
     celery_broker: str,
     celery_flower: bool,
     extra_args: tuple,
+    upgrade_boto: bool,
+    downgrade_sqlalchemy: bool,
 ):
     """Enter breeze environment. this is the default command use when no other is selected."""
     if get_verbose() or get_dry_run():
@@ -234,6 +240,8 @@ def shell(
         executor=executor,
         celery_broker=celery_broker,
         celery_flower=celery_flower,
+        upgrade_boto=upgrade_boto,
+        downgrade_sqlalchemy=downgrade_sqlalchemy,
     )
     sys.exit(result.returncode)
 
diff --git a/dev/breeze/src/airflow_breeze/commands/developer_commands_config.py b/dev/breeze/src/airflow_breeze/commands/developer_commands_config.py
index 81c99e793f..9ef4f8ca96 100644
--- a/dev/breeze/src/airflow_breeze/commands/developer_commands_config.py
+++ b/dev/breeze/src/airflow_breeze/commands/developer_commands_config.py
@@ -99,6 +99,13 @@ DEVELOPER_PARAMETERS: dict[str, list[dict[str, str | list[str]]]] = {
                 "--package-format",
             ],
         },
+        {
+            "name": "Upgrading/downgrading selected packages",
+            "options": [
+                "--upgrade-boto",
+                "--downgrade-sqlalchemy",
+            ],
+        },
     ],
     "breeze compile-www-assets": [
         {
diff --git a/dev/breeze/src/airflow_breeze/commands/testing_commands.py b/dev/breeze/src/airflow_breeze/commands/testing_commands.py
index d892727ee4..7acd6cffd6 100644
--- a/dev/breeze/src/airflow_breeze/commands/testing_commands.py
+++ b/dev/breeze/src/airflow_breeze/commands/testing_commands.py
@@ -38,6 +38,7 @@ from airflow_breeze.utils.common_options import (
     option_backend,
     option_db_reset,
     option_debug_resources,
+    option_downgrade_sqlalchemy,
     option_dry_run,
     option_github_repository,
     option_image_name,
@@ -52,6 +53,7 @@ from airflow_breeze.utils.common_options import (
     option_python,
     option_run_in_parallel,
     option_skip_cleanup,
+    option_upgrade_boto,
     option_use_airflow_version,
     option_verbose,
 )
@@ -367,12 +369,8 @@ def run_tests_in_parallel(
     show_default=True,
     envvar="PARALLEL_TEST_TYPES",
 )
-@click.option(
-    "--upgrade-boto",
-    help="Remove aiobotocore and upgrade botocore and boto to the latest version.",
-    is_flag=True,
-    envvar="UPGRADE_BOTO",
-)
+@option_upgrade_boto
+@option_downgrade_sqlalchemy
 @click.option(
     "--collect-only",
     help="Collect tests only, do not run them.",
@@ -416,6 +414,7 @@ def command_for_tests(
     mount_sources: str,
     extra_pytest_args: tuple,
     upgrade_boto: bool,
+    downgrade_sqlalchemy: bool,
     collect_only: bool,
     remove_arm_packages: bool,
     github_repository: str,
@@ -436,6 +435,7 @@ def command_for_tests(
         forward_ports=False,
         test_type=test_type,
         upgrade_boto=upgrade_boto,
+        downgrade_sqlalchemy=downgrade_sqlalchemy,
         collect_only=collect_only,
         remove_arm_packages=remove_arm_packages,
         github_repository=github_repository,
diff --git a/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py b/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py
index 908843c224..84d0d2b6ec 100644
--- a/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py
+++ b/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py
@@ -56,6 +56,7 @@ TESTING_PARAMETERS: dict[str, list[dict[str, str | list[str]]]] = {
                 "--use-airflow-version",
                 "--mount-sources",
                 "--upgrade-boto",
+                "--downgrade-sqlalchemy",
                 "--remove-arm-packages",
                 "--skip-docker-compose-down",
             ],
diff --git a/dev/breeze/src/airflow_breeze/params/shell_params.py b/dev/breeze/src/airflow_breeze/params/shell_params.py
index 99bc8c4d3c..b2240a42c9 100644
--- a/dev/breeze/src/airflow_breeze/params/shell_params.py
+++ b/dev/breeze/src/airflow_breeze/params/shell_params.py
@@ -120,6 +120,7 @@ class ShellParams:
     dry_run: bool = False
     verbose: bool = False
     upgrade_boto: bool = False
+    downgrade_sqlalchemy: bool = False
     executor: str = START_AIRFLOW_DEFAULT_ALLOWED_EXECUTORS
     celery_broker: str = DEFAULT_CELERY_BROKER
     celery_flower: bool = False
diff --git a/dev/breeze/src/airflow_breeze/utils/common_options.py b/dev/breeze/src/airflow_breeze/utils/common_options.py
index f9a82064b8..5e8e50da61 100644
--- a/dev/breeze/src/airflow_breeze/utils/common_options.py
+++ b/dev/breeze/src/airflow_breeze/utils/common_options.py
@@ -601,3 +601,15 @@ option_eager_upgrade_additional_requirements = click.option(
     help="Optional additional requirements to upgrade eagerly to avoid backtracking "
     "(see `breeze ci find-backtracking-candidates`).",
 )
+option_upgrade_boto = click.option(
+    "--upgrade-boto",
+    help="Remove aiobotocore and upgrade botocore and boto to the latest version.",
+    is_flag=True,
+    envvar="UPGRADE_BOTO",
+)
+option_downgrade_sqlalchemy = click.option(
+    "--downgrade-sqlalchemy",
+    help="Downgrade SQLAlchemy to minimum supported version.",
+    is_flag=True,
+    envvar="DOWNGRADE_SQLALCHEMY",
+)
diff --git a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
index df6047c322..b0a5697b18 100644
--- a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
+++ b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
@@ -600,6 +600,7 @@ def update_expected_environment_variables(env: dict[str, str]) -> None:
     set_value_to_default_if_not_set(env, "TEST_TYPE", "")
     set_value_to_default_if_not_set(env, "TEST_TIMEOUT", "60")
     set_value_to_default_if_not_set(env, "UPGRADE_BOTO", "false")
+    set_value_to_default_if_not_set(env, "DOWNGRADE_SQLALCHEMY", "false")
     set_value_to_default_if_not_set(env, "UPGRADE_TO_NEWER_DEPENDENCIES", "false")
     set_value_to_default_if_not_set(env, "USE_PACKAGES_FROM_DIST", "false")
     set_value_to_default_if_not_set(env, "VERBOSE", "false")
@@ -647,6 +648,7 @@ DERIVE_ENV_VARIABLES_FROM_ATTRIBUTES = {
     "SQLITE_URL": "sqlite_url",
     "START_AIRFLOW": "start_airflow",
     "UPGRADE_BOTO": "upgrade_boto",
+    "DOWNGRADE_SQLALCHEMY": "downgrade_sqlalchemy",
     "USE_AIRFLOW_VERSION": "use_airflow_version",
     "USE_PACKAGES_FROM_DIST": "use_packages_from_dist",
     "VERSION_SUFFIX_FOR_PYPI": "version_suffix_for_pypi",
diff --git a/images/breeze/output-commands-hash.txt b/images/breeze/output-commands-hash.txt
index b15e25524a..09b6278bf9 100644
--- a/images/breeze/output-commands-hash.txt
+++ b/images/breeze/output-commands-hash.txt
@@ -62,11 +62,11 @@ setup:regenerate-command-images:d5e29ec6acb1a6af7d83772c2962f89d
 setup:self-upgrade:4af905a147fcd6670a0e33d3d369a94b
 setup:version:be116d90a21c2afe01087f7609774e1e
 setup:8de3ed2645928e8f9f89c58f0ccd2a60
-shell:30a9271ff59a0fe756dda402cc165683
+shell:1e901a677a6df6ba1d64d3fe79b42587
 start-airflow:ff0f63e20b9ff454e5d3c7b9ba9080d7
 static-checks:d319b1c7972a6623bc8ee1b174cacb48
 testing:docker-compose-tests:0c810047fc66a0cfe91119e2d08b3507
 testing:helm-tests:8e491da2e01ebd815322c37562059d77
 testing:integration-tests:486e4d91449ecdb7630ef2a470d705a3
-testing:tests:3c202e65824e405269e78f58936980e0
-testing:68a089bc30e0e60f834f205df1e9f086
+testing:tests:b3b921fd5a7d3435a0ad34e90b75cb2f
+testing:13325e047fc32d9e40b51bfd15212a91
diff --git a/images/breeze/output_shell.svg b/images/breeze/output_shell.svg
index a8dd2ecd2e..7c8b47ed11 100644
--- a/images/breeze/output_shell.svg
+++ b/images/breeze/output_shell.svg
@@ -1,4 +1,4 @@
-<svg class="rich-terminal" viewBox="0 0 1482 1611.6" xmlns="http://www.w3.org/2000/svg">
+<svg class="rich-terminal" viewBox="0 0 1482 1709.1999999999998" xmlns="http://www.w3.org/2000/svg">
     <!-- Generated with Rich https://www.textualize.io -->
     <style>
 
@@ -43,7 +43,7 @@
 
     <defs>
     <clipPath id="breeze-shell-clip-terminal">
-      <rect x="0" y="0" width="1463.0" height="1560.6" />
+      <rect x="0" y="0" width="1463.0" height="1658.1999999999998" />
     </clipPath>
     <clipPath id="breeze-shell-line-0">
     <rect x="0" y="1.5" width="1464" height="24.65"/>
@@ -234,9 +234,21 @@
 <clipPath id="breeze-shell-line-62">
     <rect x="0" y="1514.3" width="1464" height="24.65"/>
             </clipPath>
+<clipPath id="breeze-shell-line-63">
+    <rect x="0" y="1538.7" width="1464" height="24.65"/>
+            </clipPath>
+<clipPath id="breeze-shell-line-64">
+    <rect x="0" y="1563.1" width="1464" height="24.65"/>
+            </clipPath>
+<clipPath id="breeze-shell-line-65">
+    <rect x="0" y="1587.5" width="1464" height="24.65"/>
+            </clipPath>
+<clipPath id="breeze-shell-line-66">
+    <rect x="0" y="1611.9" width="1464" height="24.65"/>
+            </clipPath>
     </defs>
 
-    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1609.6" rx="8"/><text class="breeze-shell-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;shell</text>
+    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1707.2" rx="8"/><text class="breeze-shell-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;shell</text>
             <g transform="translate(26,22)">
             <circle cx="0" cy="0" r="7" fill="#ff5f57"/>
             <circle cx="22" cy="0" r="7" fill="#febc2e"/>
@@ -304,12 +316,16 @@
 </text><text class="breeze-shell-r5" x="0" y="1362" textLength="12.2" clip-path="url(#breeze-shell-line-55)">│</text><text class="breeze-shell-r4" x="24.4" y="1362" textLength="12.2" clip-path="url(#breeze-shell-line-55)">-</text><text class="breeze-shell-r4" x="36.6" y="1362" textLength="97.6" clip-path="url(#breeze-shell-line-55)">-package</text><text class="breeze-shell-r4" x="134.2" y="1362" textLength="85.4" clip-path="url(#breeze-shell-line-55)">-format</text><text class="breeze-sh [...]
 </text><text class="breeze-shell-r5" x="0" y="1386.4" textLength="12.2" clip-path="url(#breeze-shell-line-56)">│</text><text class="breeze-shell-r5" x="451.4" y="1386.4" textLength="658.8" clip-path="url(#breeze-shell-line-56)">[default:&#160;wheel]&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text cl [...]
 </text><text class="breeze-shell-r5" x="0" y="1410.8" textLength="1464" clip-path="url(#breeze-shell-line-57)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-shell-r2" x="1464" y="1410.8" textLength="12.2" clip-path="url(#breeze-shell-line-57)">
-</text><text class="breeze-shell-r5" x="0" y="1435.2" textLength="24.4" clip-path="url(#breeze-shell-line-58)">╭─</text><text class="breeze-shell-r5" x="24.4" y="1435.2" textLength="195.2" clip-path="url(#breeze-shell-line-58)">&#160;Common&#160;options&#160;</text><text class="breeze-shell-r5" x="219.6" y="1435.2" textLength="1220" clip-path="url(#breeze-shell-line-58)">────────────────────────────────────────────────────────────────────────────────────────────────────</text><text class [...]
-</text><text class="breeze-shell-r5" x="0" y="1459.6" textLength="12.2" clip-path="url(#breeze-shell-line-59)">│</text><text class="breeze-shell-r4" x="24.4" y="1459.6" textLength="12.2" clip-path="url(#breeze-shell-line-59)">-</text><text class="breeze-shell-r4" x="36.6" y="1459.6" textLength="97.6" clip-path="url(#breeze-shell-line-59)">-verbose</text><text class="breeze-shell-r6" x="158.6" y="1459.6" textLength="24.4" clip-path="url(#breeze-shell-line-59)">-v</text><text class="breeze [...]
-</text><text class="breeze-shell-r5" x="0" y="1484" textLength="12.2" clip-path="url(#breeze-shell-line-60)">│</text><text class="breeze-shell-r4" x="24.4" y="1484" textLength="12.2" clip-path="url(#breeze-shell-line-60)">-</text><text class="breeze-shell-r4" x="36.6" y="1484" textLength="48.8" clip-path="url(#breeze-shell-line-60)">-dry</text><text class="breeze-shell-r4" x="85.4" y="1484" textLength="48.8" clip-path="url(#breeze-shell-line-60)">-run</text><text class="breeze-shell-r6"  [...]
-</text><text class="breeze-shell-r5" x="0" y="1508.4" textLength="12.2" clip-path="url(#breeze-shell-line-61)">│</text><text class="breeze-shell-r4" x="24.4" y="1508.4" textLength="12.2" clip-path="url(#breeze-shell-line-61)">-</text><text class="breeze-shell-r4" x="36.6" y="1508.4" textLength="85.4" clip-path="url(#breeze-shell-line-61)">-answer</text><text class="breeze-shell-r6" x="158.6" y="1508.4" textLength="24.4" clip-path="url(#breeze-shell-line-61)">-a</text><text class="breeze- [...]
-</text><text class="breeze-shell-r5" x="0" y="1532.8" textLength="12.2" clip-path="url(#breeze-shell-line-62)">│</text><text class="breeze-shell-r4" x="24.4" y="1532.8" textLength="12.2" clip-path="url(#breeze-shell-line-62)">-</text><text class="breeze-shell-r4" x="36.6" y="1532.8" textLength="61" clip-path="url(#breeze-shell-line-62)">-help</text><text class="breeze-shell-r6" x="158.6" y="1532.8" textLength="24.4" clip-path="url(#breeze-shell-line-62)">-h</text><text class="breeze-shel [...]
-</text><text class="breeze-shell-r5" x="0" y="1557.2" textLength="1464" clip-path="url(#breeze-shell-line-63)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-shell-r2" x="1464" y="1557.2" textLength="12.2" clip-path="url(#breeze-shell-line-63)">
+</text><text class="breeze-shell-r5" x="0" y="1435.2" textLength="24.4" clip-path="url(#breeze-shell-line-58)">╭─</text><text class="breeze-shell-r5" x="24.4" y="1435.2" textLength="500.2" clip-path="url(#breeze-shell-line-58)">&#160;Upgrading/downgrading&#160;selected&#160;packages&#160;</text><text class="breeze-shell-r5" x="524.6" y="1435.2" textLength="915" clip-path="url(#breeze-shell-line-58)">───────────────────────────────────────────────────────────────────────────</text><text c [...]
+</text><text class="breeze-shell-r5" x="0" y="1459.6" textLength="12.2" clip-path="url(#breeze-shell-line-59)">│</text><text class="breeze-shell-r4" x="24.4" y="1459.6" textLength="12.2" clip-path="url(#breeze-shell-line-59)">-</text><text class="breeze-shell-r4" x="36.6" y="1459.6" textLength="97.6" clip-path="url(#breeze-shell-line-59)">-upgrade</text><text class="breeze-shell-r4" x="134.2" y="1459.6" textLength="61" clip-path="url(#breeze-shell-line-59)">-boto</text><text class="breez [...]
+</text><text class="breeze-shell-r5" x="0" y="1484" textLength="12.2" clip-path="url(#breeze-shell-line-60)">│</text><text class="breeze-shell-r4" x="24.4" y="1484" textLength="12.2" clip-path="url(#breeze-shell-line-60)">-</text><text class="breeze-shell-r4" x="36.6" y="1484" textLength="122" clip-path="url(#breeze-shell-line-60)">-downgrade</text><text class="breeze-shell-r4" x="158.6" y="1484" textLength="134.2" clip-path="url(#breeze-shell-line-60)">-sqlalchemy</text><text class="bre [...]
+</text><text class="breeze-shell-r5" x="0" y="1508.4" textLength="1464" clip-path="url(#breeze-shell-line-61)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-shell-r2" x="1464" y="1508.4" textLength="12.2" clip-path="url(#breeze-shell-line-61)">
+</text><text class="breeze-shell-r5" x="0" y="1532.8" textLength="24.4" clip-path="url(#breeze-shell-line-62)">╭─</text><text class="breeze-shell-r5" x="24.4" y="1532.8" textLength="195.2" clip-path="url(#breeze-shell-line-62)">&#160;Common&#160;options&#160;</text><text class="breeze-shell-r5" x="219.6" y="1532.8" textLength="1220" clip-path="url(#breeze-shell-line-62)">────────────────────────────────────────────────────────────────────────────────────────────────────</text><text class [...]
+</text><text class="breeze-shell-r5" x="0" y="1557.2" textLength="12.2" clip-path="url(#breeze-shell-line-63)">│</text><text class="breeze-shell-r4" x="24.4" y="1557.2" textLength="12.2" clip-path="url(#breeze-shell-line-63)">-</text><text class="breeze-shell-r4" x="36.6" y="1557.2" textLength="97.6" clip-path="url(#breeze-shell-line-63)">-verbose</text><text class="breeze-shell-r6" x="158.6" y="1557.2" textLength="24.4" clip-path="url(#breeze-shell-line-63)">-v</text><text class="breeze [...]
+</text><text class="breeze-shell-r5" x="0" y="1581.6" textLength="12.2" clip-path="url(#breeze-shell-line-64)">│</text><text class="breeze-shell-r4" x="24.4" y="1581.6" textLength="12.2" clip-path="url(#breeze-shell-line-64)">-</text><text class="breeze-shell-r4" x="36.6" y="1581.6" textLength="48.8" clip-path="url(#breeze-shell-line-64)">-dry</text><text class="breeze-shell-r4" x="85.4" y="1581.6" textLength="48.8" clip-path="url(#breeze-shell-line-64)">-run</text><text class="breeze-sh [...]
+</text><text class="breeze-shell-r5" x="0" y="1606" textLength="12.2" clip-path="url(#breeze-shell-line-65)">│</text><text class="breeze-shell-r4" x="24.4" y="1606" textLength="12.2" clip-path="url(#breeze-shell-line-65)">-</text><text class="breeze-shell-r4" x="36.6" y="1606" textLength="85.4" clip-path="url(#breeze-shell-line-65)">-answer</text><text class="breeze-shell-r6" x="158.6" y="1606" textLength="24.4" clip-path="url(#breeze-shell-line-65)">-a</text><text class="breeze-shell-r2 [...]
+</text><text class="breeze-shell-r5" x="0" y="1630.4" textLength="12.2" clip-path="url(#breeze-shell-line-66)">│</text><text class="breeze-shell-r4" x="24.4" y="1630.4" textLength="12.2" clip-path="url(#breeze-shell-line-66)">-</text><text class="breeze-shell-r4" x="36.6" y="1630.4" textLength="61" clip-path="url(#breeze-shell-line-66)">-help</text><text class="breeze-shell-r6" x="158.6" y="1630.4" textLength="24.4" clip-path="url(#breeze-shell-line-66)">-h</text><text class="breeze-shel [...]
+</text><text class="breeze-shell-r5" x="0" y="1654.8" textLength="1464" clip-path="url(#breeze-shell-line-67)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-shell-r2" x="1464" y="1654.8" textLength="12.2" clip-path="url(#breeze-shell-line-67)">
 </text>
     </g>
     </g>
diff --git a/images/breeze/output_testing_tests.svg b/images/breeze/output_testing_tests.svg
index ec3da499fa..9cf14a8662 100644
--- a/images/breeze/output_testing_tests.svg
+++ b/images/breeze/output_testing_tests.svg
@@ -1,4 +1,4 @@
-<svg class="rich-terminal" viewBox="0 0 1482 1489.6" xmlns="http://www.w3.org/2000/svg">
+<svg class="rich-terminal" viewBox="0 0 1482 1514.0" xmlns="http://www.w3.org/2000/svg">
     <!-- Generated with Rich https://www.textualize.io -->
     <style>
 
@@ -43,7 +43,7 @@
 
     <defs>
     <clipPath id="breeze-testing-tests-clip-terminal">
-      <rect x="0" y="0" width="1463.0" height="1438.6" />
+      <rect x="0" y="0" width="1463.0" height="1463.0" />
     </clipPath>
     <clipPath id="breeze-testing-tests-line-0">
     <rect x="0" y="1.5" width="1464" height="24.65"/>
@@ -219,9 +219,12 @@
 <clipPath id="breeze-testing-tests-line-57">
     <rect x="0" y="1392.3" width="1464" height="24.65"/>
             </clipPath>
+<clipPath id="breeze-testing-tests-line-58">
+    <rect x="0" y="1416.7" width="1464" height="24.65"/>
+            </clipPath>
     </defs>
 
-    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1487.6" rx="8"/><text class="breeze-testing-tests-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;testing&#160;tests</text>
+    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1512" rx="8"/><text class="breeze-testing-tests-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;testing&#160;tests</text>
             <g transform="translate(26,22)">
             <circle cx="0" cy="0" r="7" fill="#ff5f57"/>
             <circle cx="22" cy="0" r="7" fill="#febc2e"/>
@@ -282,14 +285,15 @@
 </text><text class="breeze-testing-tests-r5" x="0" y="1191.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-48)">│</text><text class="breeze-testing-tests-r7" x="414.8" y="1191.2" textLength="1024.8" clip-path="url(#breeze-testing-tests-line-48)">(selected&#160;|&#160;all&#160;|&#160;skip&#160;|&#160;remove)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;& [...]
 </text><text class="breeze-testing-tests-r5" x="0" y="1215.6" textLength="12.2" clip-path="url(#breeze-testing-tests-line-49)">│</text><text class="breeze-testing-tests-r5" x="414.8" y="1215.6" textLength="1024.8" clip-path="url(#breeze-testing-tests-line-49)">[default:&#160;selected]&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#1 [...]
 </text><text class="breeze-testing-tests-r5" x="0" y="1240" textLength="12.2" clip-path="url(#breeze-testing-tests-line-50)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1240" textLength="12.2" clip-path="url(#breeze-testing-tests-line-50)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1240" textLength="97.6" clip-path="url(#breeze-testing-tests-line-50)">-upgrade</text><text class="breeze-testing-tests-r4" x="134.2" y="1240" textLength="61" clip-path="url(#breez [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1264.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-51)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1264.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-51)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1264.4" textLength="85.4" clip-path="url(#breeze-testing-tests-line-51)">-remove</text><text class="breeze-testing-tests-r4" x="122" y="1264.4" textLength="158.6" clip-path="ur [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1288.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-52)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1288.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-52)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1288.8" textLength="61" clip-path="url(#breeze-testing-tests-line-52)">-skip</text><text class="breeze-testing-tests-r4" x="97.6" y="1288.8" textLength="244" clip-path="url(#br [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1313.2" textLength="1464" clip-path="url(#breeze-testing-tests-line-53)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-testing-tests-r2" x="1464" y="1313.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-53)">
-</text><text class="breeze-testing-tests-r5" x="0" y="1337.6" textLength="24.4" clip-path="url(#breeze-testing-tests-line-54)">╭─</text><text class="breeze-testing-tests-r5" x="24.4" y="1337.6" textLength="195.2" clip-path="url(#breeze-testing-tests-line-54)">&#160;Common&#160;options&#160;</text><text class="breeze-testing-tests-r5" x="219.6" y="1337.6" textLength="1220" clip-path="url(#breeze-testing-tests-line-54)">────────────────────────────────────────────────────────────────────── [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1362" textLength="12.2" clip-path="url(#breeze-testing-tests-line-55)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1362" textLength="12.2" clip-path="url(#breeze-testing-tests-line-55)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1362" textLength="97.6" clip-path="url(#breeze-testing-tests-line-55)">-verbose</text><text class="breeze-testing-tests-r6" x="158.6" y="1362" textLength="24.4" clip-path="url(#bre [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1386.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-56)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1386.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-56)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1386.4" textLength="48.8" clip-path="url(#breeze-testing-tests-line-56)">-dry</text><text class="breeze-testing-tests-r4" x="85.4" y="1386.4" textLength="48.8" clip-path="url(# [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1410.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-57)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1410.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-57)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1410.8" textLength="61" clip-path="url(#breeze-testing-tests-line-57)">-help</text><text class="breeze-testing-tests-r6" x="158.6" y="1410.8" textLength="24.4" clip-path="url(# [...]
-</text><text class="breeze-testing-tests-r5" x="0" y="1435.2" textLength="1464" clip-path="url(#breeze-testing-tests-line-58)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-testing-tests-r2" x="1464" y="1435.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-58)">
+</text><text class="breeze-testing-tests-r5" x="0" y="1264.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-51)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1264.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-51)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1264.4" textLength="122" clip-path="url(#breeze-testing-tests-line-51)">-downgrade</text><text class="breeze-testing-tests-r4" x="158.6" y="1264.4" textLength="134.2" clip-path [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1288.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-52)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1288.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-52)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1288.8" textLength="85.4" clip-path="url(#breeze-testing-tests-line-52)">-remove</text><text class="breeze-testing-tests-r4" x="122" y="1288.8" textLength="158.6" clip-path="ur [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1313.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-53)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1313.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-53)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1313.2" textLength="61" clip-path="url(#breeze-testing-tests-line-53)">-skip</text><text class="breeze-testing-tests-r4" x="97.6" y="1313.2" textLength="244" clip-path="url(#br [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1337.6" textLength="1464" clip-path="url(#breeze-testing-tests-line-54)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-testing-tests-r2" x="1464" y="1337.6" textLength="12.2" clip-path="url(#breeze-testing-tests-line-54)">
+</text><text class="breeze-testing-tests-r5" x="0" y="1362" textLength="24.4" clip-path="url(#breeze-testing-tests-line-55)">╭─</text><text class="breeze-testing-tests-r5" x="24.4" y="1362" textLength="195.2" clip-path="url(#breeze-testing-tests-line-55)">&#160;Common&#160;options&#160;</text><text class="breeze-testing-tests-r5" x="219.6" y="1362" textLength="1220" clip-path="url(#breeze-testing-tests-line-55)">──────────────────────────────────────────────────────────────────────────── [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1386.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-56)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1386.4" textLength="12.2" clip-path="url(#breeze-testing-tests-line-56)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1386.4" textLength="97.6" clip-path="url(#breeze-testing-tests-line-56)">-verbose</text><text class="breeze-testing-tests-r6" x="158.6" y="1386.4" textLength="24.4" clip-path=" [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1410.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-57)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1410.8" textLength="12.2" clip-path="url(#breeze-testing-tests-line-57)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1410.8" textLength="48.8" clip-path="url(#breeze-testing-tests-line-57)">-dry</text><text class="breeze-testing-tests-r4" x="85.4" y="1410.8" textLength="48.8" clip-path="url(# [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1435.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-58)">│</text><text class="breeze-testing-tests-r4" x="24.4" y="1435.2" textLength="12.2" clip-path="url(#breeze-testing-tests-line-58)">-</text><text class="breeze-testing-tests-r4" x="36.6" y="1435.2" textLength="61" clip-path="url(#breeze-testing-tests-line-58)">-help</text><text class="breeze-testing-tests-r6" x="158.6" y="1435.2" textLength="24.4" clip-path="url(# [...]
+</text><text class="breeze-testing-tests-r5" x="0" y="1459.6" textLength="1464" clip-path="url(#breeze-testing-tests-line-59)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-testing-tests-r2" x="1464" y="1459.6" textLength="12.2" clip-path="url(#breeze-testing-tests-line-59)">
 </text>
     </g>
     </g>
diff --git a/scripts/ci/docker-compose/_docker.env b/scripts/ci/docker-compose/_docker.env
index 2cd4bc0f7d..aa21804e9b 100644
--- a/scripts/ci/docker-compose/_docker.env
+++ b/scripts/ci/docker-compose/_docker.env
@@ -75,6 +75,7 @@ TEST_TIMEOUT
 TEST_TYPE
 UPGRADE_BOTO
 UPGRADE_TO_NEWER_DEPENDENCIES
+DOWNGRADE_SQLALCHEMY
 VERBOSE
 VERBOSE_COMMANDS
 VERSION_SUFFIX_FOR_PYPI
diff --git a/scripts/ci/docker-compose/base.yml b/scripts/ci/docker-compose/base.yml
index 160e8feec0..3b7417eab5 100644
--- a/scripts/ci/docker-compose/base.yml
+++ b/scripts/ci/docker-compose/base.yml
@@ -85,6 +85,7 @@ services:
       - TEST_TYPE=${TEST_TYPE}
       - TEST_TIMEOUT=${TEST_TIMEOUT}
       - UPGRADE_BOTO=${UPGRADE_BOTO}
+      - DOWNGRADE_SQLALCHEMY=${DOWNGRADE_SQLALCHEMY}
       - UPGRADE_TO_NEWER_DEPENDENCIES=${UPGRADE_TO_NEWER_DEPENDENCIES}
       - VERBOSE=${VERBOSE}
       - VERBOSE_COMMANDS=${VERBOSE_COMMANDS}
diff --git a/scripts/ci/docker-compose/devcontainer.env b/scripts/ci/docker-compose/devcontainer.env
index d66767fa02..f71a0e1e39 100644
--- a/scripts/ci/docker-compose/devcontainer.env
+++ b/scripts/ci/docker-compose/devcontainer.env
@@ -69,6 +69,7 @@ START_AIRFLOW="false"
 SUSPENDED_PROVIDERS_FOLDERS=""
 TEST_TYPE=
 UPGRADE_BOTO="false"
+DOWNGRADE_SQLALCHEMY="false"
 UPGRADE_TO_NEWER_DEPENDENCIES="false"
 VERBOSE="false"
 VERBOSE_COMMANDS="false"
diff --git a/scripts/docker/entrypoint_ci.sh b/scripts/docker/entrypoint_ci.sh
index 73b8a20deb..87e573d114 100755
--- a/scripts/docker/entrypoint_ci.sh
+++ b/scripts/docker/entrypoint_ci.sh
@@ -326,6 +326,22 @@ fi
 # Remove pytest.ini from the current directory if it exists. It has been removed from the source tree
 # but may still be present in the local directory if the user has old breeze image
 rm -f "${AIRFLOW_SOURCES}/pytest.ini"
+if [[ ${UPGRADE_BOTO=} == "true" ]]; then
+    echo
+    echo "${COLOR_BLUE}Upgrading boto3, botocore to latest version to run Amazon tests with them${COLOR_RESET}"
+    echo
+    pip uninstall --root-user-action ignore aiobotocore -y || true
+    pip install --root-user-action ignore --upgrade boto3 botocore
+    pip check
+fi
+if [[ ${DOWNGRADE_SQLALCHEMY=} == "true" ]]; then
+    min_sqlalchemy_version=$(grep "sqlalchemy>=" setup.cfg | sed "s/.*>=\([0-9\.]*\).*/\1/")
+    echo
+    echo "${COLOR_BLUE}Downgrading sqlalchemy to minimum supported version: ${min_sqlalchemy_version}${COLOR_RESET}"
+    echo
+    pip install --root-user-action ignore "sqlalchemy==${min_sqlalchemy_version}"
+    pip check
+fi
 
 set +u
 # If we do not want to run tests, we simply drop into bash
@@ -558,13 +574,6 @@ else
         exit 1
     fi
 fi
-if [[ ${UPGRADE_BOTO=} == "true" ]]; then
-    echo
-    echo "${COLOR_BLUE}Upgrading boto3, botocore to latest version to run Amazon tests with them${COLOR_RESET}"
-    echo
-    pip uninstall aiobotocore -y || true
-    pip install --upgrade boto3 botocore
-fi
 readonly SELECTED_TESTS CLI_TESTS API_TESTS PROVIDERS_TESTS CORE_TESTS WWW_TESTS \
     ALL_TESTS ALL_PRESELECTED_TESTS
 
diff --git a/setup.cfg b/setup.cfg
index 367924d82d..26c8cb2e03 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -142,7 +142,7 @@ install_requires =
     # See https://sqlalche.me/e/b8d9 for details of deprecated features
     # you can set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings.
     # The issue tracking it is https://github.com/apache/airflow/issues/28723
-    sqlalchemy>=1.4.24,<2.0
+    sqlalchemy>=1.4.28,<2.0
     sqlalchemy_jsonfield>=1.0
     tabulate>=0.7.5
     tenacity>=6.2.0,!=8.2.0
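
The incompatibility described in the commit message can be illustrated with
a short sketch (the table is hypothetical, for illustration only; it is not
the actual _do_delete_old_records code):

    # Minimal sketch: passing several criteria to a single .where() call on
    # a DELETE statement. Per the commit message above, this form works on
    # SQLAlchemy 1.4.28 but not on the previous minimum, 1.4.27.
    from datetime import datetime

    from sqlalchemy import Column, DateTime, Integer, MetaData, Table, delete

    metadata = MetaData()
    records = Table(
        "records",  # hypothetical table
        metadata,
        Column("id", Integer, primary_key=True),
        Column("created_at", DateTime),
    )

    stmt = delete(records).where(
        records.c.id > 0,
        records.c.created_at < datetime(2023, 1, 1),
    )
    print(stmt)  # renders as DELETE FROM records WHERE ... AND ...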


[airflow] 34/44: Fix SesssionExemptMixin spelling (#34696)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7fcb9317d266e92a288838c59e45436c5d9c03eb
Author: David Kalamarides <da...@amaforge.com>
AuthorDate: Mon Oct 2 12:42:03 2023 -0700

    Fix SesssionExemptMixin spelling (#34696)
    
    Co-authored-by: David Kalamarides <da...@capitalone.com>
    (cherry picked from commit 63945c71241e7b1b278068e1786e610facd569e0)
---
 airflow/www/session.py | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/airflow/www/session.py b/airflow/www/session.py
index 4e23d212fe..763b909ae0 100644
--- a/airflow/www/session.py
+++ b/airflow/www/session.py
@@ -21,7 +21,7 @@ from flask.sessions import SecureCookieSessionInterface
 from flask_session.sessions import SqlAlchemySessionInterface
 
 
-class SesssionExemptMixin:
+class SessionExemptMixin:
     """Exempt certain blueprints/paths from autogenerated sessions."""
 
     def save_session(self, *args, **kwargs):
@@ -33,9 +33,9 @@ class SesssionExemptMixin:
         return super().save_session(*args, **kwargs)
 
 
-class AirflowDatabaseSessionInterface(SesssionExemptMixin, SqlAlchemySessionInterface):
+class AirflowDatabaseSessionInterface(SessionExemptMixin, SqlAlchemySessionInterface):
     """Session interface that exempts some routes and stores session data in the database."""
 
 
-class AirflowSecureCookieSessionInterface(SesssionExemptMixin, SecureCookieSessionInterface):
+class AirflowSecureCookieSessionInterface(SessionExemptMixin, SecureCookieSessionInterface):
     """Session interface that exempts some routes and stores session data in a signed cookie."""


[airflow] 15/44: Restore EXISTING_ROLES from security.py (#34523)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c42012f60d2579ab2c3ec29c0f3249ba72789319
Author: Pankaj Singh <98...@users.noreply.github.com>
AuthorDate: Fri Sep 22 21:13:14 2023 +0530

    Restore EXISTING_ROLES from security.py (#34523)
    
    (cherry picked from commit 5d2cea4515cd656e85c39e04d148fcb2d6ba516a)
---
 airflow/www/security.py | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/airflow/www/security.py b/airflow/www/security.py
index fc9845b9f7..28d938d052 100644
--- a/airflow/www/security.py
+++ b/airflow/www/security.py
@@ -64,6 +64,14 @@ else:
     # Fetch the security manager override from the auth manager
     SecurityManagerOverride = get_auth_manager().get_security_manager_override_class()
 
+EXISTING_ROLES = {
+    "Admin",
+    "Viewer",
+    "User",
+    "Op",
+    "Public",
+}
+
 
 class AirflowSecurityManager(SecurityManagerOverride, SecurityManager, LoggingMixin):
     """Custom security manager, which introduces a permission model adapted to Airflow."""


[airflow] 39/44: Update version to 2.7.2

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 12f2fd5ced949fc27817d9380c69ccbc2d66772d
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Thu Oct 5 08:28:37 2023 +0100

    Update version to 2.7.2
---
 README.md                                                  | 12 ++++++------
 airflow/__init__.py                                        |  2 +-
 airflow/api_connexion/openapi/v1.yaml                      |  2 +-
 docs/apache-airflow/installation/supported-versions.rst    |  2 +-
 docs/docker-stack/README.md                                | 10 +++++-----
 .../extending/add-airflow-configuration/Dockerfile         |  2 +-
 .../docker-examples/extending/add-apt-packages/Dockerfile  |  2 +-
 .../extending/add-build-essential-extend/Dockerfile        |  2 +-
 .../docker-examples/extending/add-providers/Dockerfile     |  2 +-
 .../docker-examples/extending/add-pypi-packages/Dockerfile |  2 +-
 .../extending/add-requirement-packages/Dockerfile          |  2 +-
 .../docker-examples/extending/custom-providers/Dockerfile  |  2 +-
 .../docker-examples/extending/embedding-dags/Dockerfile    |  2 +-
 .../extending/writable-directory/Dockerfile                |  2 +-
 docs/docker-stack/entrypoint.rst                           | 14 +++++++-------
 generated/PYPI_README.md                                   | 10 +++++-----
 scripts/ci/pre_commit/pre_commit_supported_versions.py     |  2 +-
 17 files changed, 36 insertions(+), 36 deletions(-)

diff --git a/README.md b/README.md
index ec01fdcccf..cd8c597a44 100644
--- a/README.md
+++ b/README.md
@@ -89,7 +89,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-|             | Main version (dev)           | Stable version (2.7.1) |
+|             | Main version (dev)           | Stable version (2.7.2) |
 |-------------|------------------------------|------------------------|
 | Python      | 3.8, 3.9, 3.10, 3.11         | 3.8, 3.9, 3.10, 3.11   |
 | Platform    | AMD64/ARM64(\*)              | AMD64/ARM64(\*)        |
@@ -172,15 +172,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.7.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.1/constraints-3.8.txt"
+pip install 'apache-airflow==2.7.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.2/constraints-3.8.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.7.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.1/constraints-3.8.txt"
+pip install 'apache-airflow[postgres,google]==2.7.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.2/constraints-3.8.txt"
 ```
 
 For information on installing provider packages, check
@@ -292,7 +292,7 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State     | First Release   | Limited Support   | EOL/Terminated   |
 |-----------|-----------------------|-----------|-----------------|-------------------|------------------|
-| 2         | 2.7.1                 | Supported | Dec 17, 2020    | TBD               | TBD              |
+| 2         | 2.7.2                 | Supported | Dec 17, 2020    | TBD               | TBD              |
 | 1.10      | 1.10.15               | EOL       | Aug 27, 2018    | Dec 17, 2020      | June 17, 2021    |
 | 1.9       | 1.9.0                 | EOL       | Jan 03, 2018    | Aug 27, 2018      | Aug 27, 2018     |
 | 1.8       | 1.8.2                 | EOL       | Mar 19, 2017    | Jan 03, 2018      | Jan 03, 2018     |
diff --git a/airflow/__init__.py b/airflow/__init__.py
index cd5ec8b7ac..182b71dc2f 100644
--- a/airflow/__init__.py
+++ b/airflow/__init__.py
@@ -26,7 +26,7 @@ isort:skip_file
 """
 from __future__ import annotations
 
-__version__ = "2.7.1"
+__version__ = "2.7.2"
 
 # flake8: noqa: F401
 
diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 58d5bef9f0..b94fe05b59 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -231,7 +231,7 @@ info:
     This means that the server encountered an unexpected condition that prevented it from
     fulfilling the request.
 
-  version: '2.7.1'
+  version: '2.7.2'
   license:
     name: Apache 2.0
     url: http://www.apache.org/licenses/LICENSE-2.0.html
diff --git a/docs/apache-airflow/installation/supported-versions.rst b/docs/apache-airflow/installation/supported-versions.rst
index 0482986e37..236d46ecb1 100644
--- a/docs/apache-airflow/installation/supported-versions.rst
+++ b/docs/apache-airflow/installation/supported-versions.rst
@@ -29,7 +29,7 @@ Apache Airflow™ version life cycle:
 =========  =====================  =========  ===============  =================  ================
 Version    Current Patch/Minor    State      First Release    Limited Support    EOL/Terminated
 =========  =====================  =========  ===============  =================  ================
-2          2.7.1                  Supported  Dec 17, 2020     TBD                TBD
+2          2.7.2                  Supported  Dec 17, 2020     TBD                TBD
 1.10       1.10.15                EOL        Aug 27, 2018     Dec 17, 2020       June 17, 2021
 1.9        1.9.0                  EOL        Jan 03, 2018     Aug 27, 2018       Aug 27, 2018
 1.8        1.8.2                  EOL        Mar 19, 2017     Jan 03, 2018       Jan 03, 2018
diff --git a/docs/docker-stack/README.md b/docs/docker-stack/README.md
index 3e1115d92a..10129d80fe 100644
--- a/docs/docker-stack/README.md
+++ b/docs/docker-stack/README.md
@@ -31,12 +31,12 @@ Every time a new version of Airflow is released, the images are prepared in the
 [apache/airflow DockerHub](https://hub.docker.com/r/apache/airflow)
 for all the supported Python versions.
 
-You can find the following images there (Assuming Airflow version `2.7.1`):
+You can find the following images there (Assuming Airflow version `2.7.2`):
 
 * `apache/airflow:latest` - the latest released Airflow image with default Python version (3.8 currently)
 * `apache/airflow:latest-pythonX.Y` - the latest released Airflow image with specific Python version
-* `apache/airflow:2.7.1` - the versioned Airflow image with default Python version (3.8 currently)
-* `apache/airflow:2.7.1-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:2.7.2` - the versioned Airflow image with default Python version (3.8 currently)
+* `apache/airflow:2.7.2-pythonX.Y` - the versioned Airflow image with specific Python version
 
 Those are "reference" regular images. They contain the most common set of extras, dependencies and providers that are
 often used by the users and they are good to "try-things-out" when you want to just take Airflow for a spin,
@@ -47,8 +47,8 @@ via [Building the image](https://airflow.apache.org/docs/docker-stack/build.html
 
 * `apache/airflow:slim-latest`              - the latest released Airflow image with default Python version (3.8 currently)
 * `apache/airflow:slim-latest-pythonX.Y`    - the latest released Airflow image with specific Python version
-* `apache/airflow:slim-2.7.1`           - the versioned Airflow image with default Python version (3.8 currently)
-* `apache/airflow:slim-2.7.1-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:slim-2.7.2`           - the versioned Airflow image with default Python version (3.8 currently)
+* `apache/airflow:slim-2.7.2-pythonX.Y` - the versioned Airflow image with specific Python version
 
 The Apache Airflow image provided as convenience package is optimized for size, and
 it provides just a bare minimal set of the extras and dependencies installed and in most cases
diff --git a/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile b/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile
index f37f16b961..6888723306 100644
--- a/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 ENV AIRFLOW__CORE__LOAD_EXAMPLES=True
 ENV AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=my_conn_string
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
index 6a03476436..31c4d91d54 100644
--- a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
index fc7f3a74e3..2baf12487f 100644
--- a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
index 94870a3019..34490290d9 100644
--- a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
index 7cca44228e..baf4c1f30a 100644
--- a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" lxml
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile
index 525296b91e..1704f64173 100644
--- a/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 COPY requirements.txt /
 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" -r /requirements.txt
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
index 0b7f2eca7e..73c62e6a9a 100644
--- a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 RUN pip install "apache-airflow==${AIRFLOW_VERSION}" --no-cache-dir apache-airflow-providers-docker==2.5.1
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
index bce131ee73..b82de6e70d 100644
--- a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 
 COPY --chown=airflow:root test_dag.py /opt/airflow/dags
 
diff --git a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
index 5721fa4ba4..516628daab 100644
--- a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.7.1
+FROM apache/airflow:2.7.2
 RUN umask 0002; \
     mkdir -p ~/writeable-directory
 # [END Dockerfile]
diff --git a/docs/docker-stack/entrypoint.rst b/docs/docker-stack/entrypoint.rst
index 3ee6119f10..25fb2ec626 100644
--- a/docs/docker-stack/entrypoint.rst
+++ b/docs/docker-stack/entrypoint.rst
@@ -132,7 +132,7 @@ if you specify extra arguments. For example:
 
 .. code-block:: bash
 
-  docker run -it apache/airflow:2.7.1-python3.8 bash -c "ls -la"
+  docker run -it apache/airflow:2.7.2-python3.8 bash -c "ls -la"
   total 16
   drwxr-xr-x 4 airflow root 4096 Jun  5 18:12 .
   drwxr-xr-x 1 root    root 4096 Jun  5 18:12 ..
@@ -144,7 +144,7 @@ you pass extra parameters. For example:
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:2.7.1-python3.8 python -c "print('test')"
+  > docker run -it apache/airflow:2.7.2-python3.8 python -c "print('test')"
   test
 
If the first argument equals "airflow", the rest of the arguments are treated as an airflow command
@@ -152,13 +152,13 @@ to execute. Example:
 
 .. code-block:: bash
 
-   docker run -it apache/airflow:2.7.1-python3.8 airflow webserver
+   docker run -it apache/airflow:2.7.2-python3.8 airflow webserver
 
If there are any other arguments, they are simply passed to the "airflow" command
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:2.7.1-python3.8 help
+  > docker run -it apache/airflow:2.7.2-python3.8 help
     usage: airflow [-h] GROUP_OR_COMMAND ...
 
     positional arguments:
@@ -363,7 +363,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
     --env "_AIRFLOW_DB_MIGRATE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD=admin" \
-      apache/airflow:2.7.1-python3.8 webserver
+      apache/airflow:2.7.2-python3.8 webserver
 
 
 .. code-block:: bash
@@ -372,7 +372,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
     --env "_AIRFLOW_DB_MIGRATE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-      apache/airflow:2.7.1-python3.8 webserver
+      apache/airflow:2.7.2-python3.8 webserver
 
 The commands above perform initialization of the SQLite database, create admin user with admin password
 and Admin role. They also forward local port ``8080`` to the webserver port and finally start the webserver.
@@ -412,6 +412,6 @@ Example:
     --env "_AIRFLOW_DB_MIGRATE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-      apache/airflow:2.7.1-python3.8 webserver
+      apache/airflow:2.7.2-python3.8 webserver
 
 This method is only available starting from Docker image of Airflow 2.1.1 and above.
diff --git a/generated/PYPI_README.md b/generated/PYPI_README.md
index ab1e94f5ad..f0ecd7cb5f 100644
--- a/generated/PYPI_README.md
+++ b/generated/PYPI_README.md
@@ -47,7 +47,7 @@ Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The
 
 Apache Airflow is tested with:
 
-|             | Main version (dev)           | Stable version (2.7.1) |
+|             | Main version (dev)           | Stable version (2.7.2) |
 |-------------|------------------------------|------------------------|
 | Python      | 3.8, 3.9, 3.10, 3.11         | 3.8, 3.9, 3.10, 3.11   |
 | Platform    | AMD64/ARM64(\*)              | AMD64/ARM64(\*)        |
@@ -126,15 +126,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.7.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.1/constraints-3.8.txt"
+pip install 'apache-airflow==2.7.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.2/constraints-3.8.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.7.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.1/constraints-3.8.txt"
+pip install 'apache-airflow[postgres,google]==2.7.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.2/constraints-3.8.txt"
 ```
 
 For information on installing provider packages, check
diff --git a/scripts/ci/pre_commit/pre_commit_supported_versions.py b/scripts/ci/pre_commit/pre_commit_supported_versions.py
index c1b964cc56..0f526e6442 100755
--- a/scripts/ci/pre_commit/pre_commit_supported_versions.py
+++ b/scripts/ci/pre_commit/pre_commit_supported_versions.py
@@ -27,7 +27,7 @@ AIRFLOW_SOURCES = Path(__file__).resolve().parent.parent.parent.parent
 HEADERS = ("Version", "Current Patch/Minor", "State", "First Release", "Limited Support", "EOL/Terminated")
 
 SUPPORTED_VERSIONS = (
-    ("2", "2.7.1", "Supported", "Dec 17, 2020", "TBD", "TBD"),
+    ("2", "2.7.2", "Supported", "Dec 17, 2020", "TBD", "TBD"),
     ("1.10", "1.10.15", "EOL", "Aug 27, 2018", "Dec 17, 2020", "June 17, 2021"),
     ("1.9", "1.9.0", "EOL", "Jan 03, 2018", "Aug 27, 2018", "Aug 27, 2018"),
     ("1.8", "1.8.2", "EOL", "Mar 19, 2017", "Jan 03, 2018", "Jan 03, 2018"),


[airflow] 13/44: using seconds for failed scenarios too (#34532)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bc2bef8a7996497d5000c76d4909e882e5a02611
Author: Shubham Raj <48...@users.noreply.github.com>
AuthorDate: Fri Sep 22 00:23:34 2023 +0530

    using seconds for failed scenarios too (#34532)
    
    Co-authored-by: Shubham <sh...@cloudera.com>
    (cherry picked from commit 117e40490865f04aed38a18724fc88a8cf94aacc)
---
 .../administration-and-deployment/logging-monitoring/metrics.rst        | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/administration-and-deployment/logging-monitoring/metrics.rst b/docs/apache-airflow/administration-and-deployment/logging-monitoring/metrics.rst
index f8eee430f4..ec4558baf8 100644
--- a/docs/apache-airflow/administration-and-deployment/logging-monitoring/metrics.rst
+++ b/docs/apache-airflow/administration-and-deployment/logging-monitoring/metrics.rst
@@ -231,7 +231,7 @@ Name                                                Description
 ``dag.<dag_id>.<task_id>.queued_duration``          Seconds a task spends in the Queued state, before being Running
 ``dag_processing.last_duration.<dag_file>``         Seconds taken to load the given DAG file
 ``dagrun.duration.success.<dag_id>``                Seconds taken for a DagRun to reach success state
-``dagrun.duration.failed.<dag_id>``                 Milliseconds taken for a DagRun to reach failed state
+``dagrun.duration.failed.<dag_id>``                 Seconds taken for a DagRun to reach failed state
 ``dagrun.schedule_delay.<dag_id>``                  Seconds of delay between the scheduled DagRun
                                                     start date and the actual DagRun start date
 ``scheduler.critical_section_duration``             Milliseconds spent in the critical section of scheduler loop --
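
Both DagRun duration metrics are emitted as timers in seconds after this fix. A minimal sketch of the emitting side, assuming airflow.stats.Stats as used elsewhere in the codebase; the dag_id and duration values are illustrative:

    # Sketch: both success and failed dagrun durations are reported in seconds.
    from airflow.stats import Stats

    dag_id = "example_dag"
    duration_in_seconds = 42.0  # illustrative value
    Stats.timing(f"dagrun.duration.success.{dag_id}", duration_in_seconds)
    Stats.timing(f"dagrun.duration.failed.{dag_id}", duration_in_seconds)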


[airflow] 33/44: Fix foreign key warning re ab_user.id (#34656)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 033ffa91be1b827286d3eeb7dd1866c490f5b405
Author: Daniel Standish <15...@users.noreply.github.com>
AuthorDate: Sun Oct 1 05:33:17 2023 -0700

    Fix foreign key warning re ab_user.id (#34656)
    
    Introduced in #34120.
    
    (cherry picked from commit 1fdc2311250fbae47749822b192a99066600f8ad)
---
 airflow/models/dagrun.py       | 2 +-
 airflow/models/taskinstance.py | 4 +---
 2 files changed, 2 insertions(+), 4 deletions(-)

diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index 7308758333..81e93938df 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -1412,8 +1412,8 @@ class DagRunNote(Base):
 
     user_id = Column(
         Integer,
+        ForeignKey("ab_user.id", name="dag_run_note_user_fkey"),
         nullable=True,
-        foreign_key=ForeignKey("ab_user.id", name="dag_run_note_user_fkey"),
     )
     dag_run_id = Column(Integer, primary_key=True, nullable=False)
     content = Column(String(1000).with_variant(Text(1000), "mysql"))
diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index 4919514c13..ec28fe0f6a 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -3018,9 +3018,7 @@ class TaskInstanceNote(Base):
 
     __tablename__ = "task_instance_note"
 
-    user_id = Column(
-        Integer, nullable=True, foreign_key=ForeignKey("ab_user.id", name="task_instance_note_user_fkey")
-    )
+    user_id = Column(Integer, ForeignKey("ab_user.id", name="task_instance_note_user_fkey"), nullable=True)
     task_id = Column(StringID(), primary_key=True, nullable=False)
     dag_id = Column(StringID(), primary_key=True, nullable=False)
     run_id = Column(StringID(), primary_key=True, nullable=False)
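
The underlying issue: SQLAlchemy's Column takes schema items such as ForeignKey positionally; there is no foreign_key= keyword parameter, so the old form never created the constraint. A self-contained sketch of the correct pattern:

    # Correct: pass ForeignKey positionally to Column (sketch with its own metadata).
    from sqlalchemy import Column, ForeignKey, Integer, MetaData, Table

    metadata = MetaData()
    users = Table("ab_user", metadata, Column("id", Integer, primary_key=True))
    notes = Table(
        "dag_run_note",
        metadata,
        Column("dag_run_id", Integer, primary_key=True),
        Column("user_id", Integer, ForeignKey("ab_user.id", name="dag_run_note_user_fkey"), nullable=True),
    )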


[airflow] 40/44: Update RELEASE_NOTES.rst

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3f89ab406388e5209f2259185ab5f32f8ae00c79
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Thu Oct 5 10:11:18 2023 +0100

    Update RELEASE_NOTES.rst
---
 RELEASE_NOTES.rst              | 113 +++++++++++++++++++++++++++++++++++++++++
 newsfragments/34348.bugfix.rst |   1 -
 2 files changed, 113 insertions(+), 1 deletion(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 7f4ff593b4..0da7230814 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -21,6 +21,119 @@
 
 .. towncrier release notes start
 
+Airflow 2.7.2 (2023-10-09)
+--------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+No significant changes
+
+
+Bug Fixes
+"""""""""
+- Import ``AUTH_REMOTE_USER`` from FAB in WSGI middleware example (#34721)
+- Fixed rows count in the migration script (#34348)
+- Make dry run optional for patch task instance (#34568)
+- Fix non deterministic datetime deserialization (#34492)
+- Use iterative loop to look for mapped parent (#34622)
+- Fix is_parent_mapped value by checking if any of the parent ``taskgroup`` is mapped (#34587)
+- Avoid top-level airflow import to avoid circular dependency (#34586)
+- Add more exemptions to lengthy metric list (#34531)
+- Fix dag warning endpoint permissions (#34355)
+- Fix task instance access issue in the batch endpoint (#34315)
+- Correcting wrong time showing in grid view (#34179)
+- Fix www ``cluster_activity`` view not loading due to ``standaloneDagProcessor`` templating (#34274)
+- Set ``loglevel=DEBUG`` in 'Not syncing ``DAG-level`` permissions' (#34268)
+- Make param validation consistent for DAG validation and triggering (#34248)
+- Ensure details panel is shown when any tab is selected (#34136)
+- Fix issues related to ``access_control={}`` (#34114)
+- Fix not found ``ab_user`` table in the CLI session (#34120)
+- Fix FAB-related logging format interpolation (#34139)
+- Fix query bug in ``next_run_datasets_summary`` endpoint (#34143)
+- Fix for TaskGroup toggles for duplicated labels (#34072)
+- Fix the required permissions to clear a TI from the UI (#34123)
+- Reuse ``_run_task_session`` in mapped ``render_template_fields`` (#33309)
+- Fix scheduler logic to plan new dag runs by ignoring manual runs (#34027)
+- Add missing audit logs for Flask actions add, edit and delete (#34090)
+- Hide Irrelevant Dag Processor from Cluster Activity Page (#33611)
+- Remove infinite animation for pinwheel, spin for 1.5s (#34020)
+- Restore rendering of provider configuration with ``version_added`` (#34011)
+
+Doc Only Changes
+""""""""""""""""
+- Add information about dropping support for MsSQL as a DB backend in the future (#34375)
+- Document how to use the system's timezone database (#34667)
+- Clarify what landing time means in doc (#34608)
+- Fix screenshot in dynamic task mapping docs (#34566)
+- Fix class reference in Public Interface documentation (#34454)
+- Clarify var.value.get and var.json.get usage (#34411)
+- Schedule default value description (#34291)
+- Docs for triggered_dataset_event (#34410)
+- Add DagRun events (#34328)
+- Provide tabular overview about trigger form param types (#34285)
+- Add link to Amazon Provider Configuration in Core documentation (#34305)
+- Add "security infrastructure" paragraph to security model (#34301)
+- Change links to SQLAlchemy 1.4 (#34288)
+- Add SBOM entry in security documentation (#34261)
+- Added more example code for XCom push and pull (#34016)
+- Add state utils to Public Airflow Interface (#34059)
+- Replace markdown style link with rst style link (#33990)
+- Fix broken link to the "UPDATING.md" file (#33583)
+
+Misc/Internal
+"""""""""""""
+- Update min-sqlalchemy version to account for latest features used (#34293)
+- Fix SesssionExemptMixin spelling (#34696)
+- Restrict ``astroid`` version < 3 (#34658)
+- Fail dag test if defer without triggerer (#34619)
+- Fix connections exported output (#34640)
+- Don't run isort when creating new alembic migrations (#34636)
+- Deprecate numeric type python version in PythonVirtualEnvOperator (#34359)
+- Refactor ``os.path.splitext`` to ``Path.*`` (#34352, #33669)
+- Check that dag_ids passed in request are consistent (#34366)
+- Replace = by is for type comparison (#33983)
+- Refactor integer division (#34180)
+- Refactor: Simplify comparisons (#34181)
+- Refactor: Simplify string generation (#34118)
+- Replace unnecessary dict comprehension with dict() in core (#33858)
+- Change "not all" to "any" for ease of readability (#34259)
+- Replace assert by if...raise in code (#34250, #34249)
+- Move default timezone to except block (#34245)
+- Combine similar if logic in core (#33988)
+- Refactor: Consolidate import and usage of random (#34108)
+- Consolidate importing of os.path.* (#34060)
+- Replace sequence concatenation by unpacking in Airflow core (#33934)
+- Refactor unneeded 'continue' jumps around the repo (#33849, #33845, #33846, #33848, #33839, #33844, #33836, #33842)
+- Remove [project] section from ``pyproject.toml`` (#34014)
+- Move the try outside the loop when this is possible in Airflow core (#33975)
+- Replace loop by any when looking for a positive value in core (#33985)
+- Do not create lists we don't need (#33519)
+- Remove useless string join from core (#33969)
+- Add TCH001 and TCH002 rules to pre-commit to detect and move type checking modules (#33865)
+- Add cancel_trigger_ids to to_cancel dequeue in batch (#33944)
+- Avoid creating unnecessary list when parsing stats datadog tags (#33943)
+- Replace dict.items by dict.values when key is not used in core (#33940)
+- Replace lambdas with comprehensions (#33745)
+- Improve modules import in Airflow core by moving some of them into a type-checking block (#33755)
+- Refactor: remove unused state - SHUTDOWN (#33746, #34063, #33893)
+- Improve modules import (#33808, #33812, #33811, #33810, #33805, #33804, #33803, #33801, #33799, #33800, #33797, #33798)
+- Refactor: Use in-place .sort() (#33743)
+- Use literal dict instead of calling dict() in Airflow core (#33762)
+- Remove unnecessary map and rewrite it using list in Airflow core (#33764)
+- Replace lambda by a def method in Airflow core (#33758)
+- Replace type func by ``isinstance`` in fab_security manager (#33760)
+- Replace single quotes by double quotes in all Airflow modules (#33766)
+- Merge multiple ``isinstance`` calls for the same object in a single call (#33767)
+- Use a single ``with`` statement with multiple contexts instead of nested ``with`` statements in core (#33769)
+- Refactor: Use f-strings (#33734, #33455)
+- Refactor: Use random.choices (#33631)
+- Use ``str.splitlines()`` to split lines (#33592)
+- Refactor: Remove useless str() calls (#33629)
+- Refactor: Improve detection of duplicates and list sorting (#33675)
+- Simplify conditions on ``len()`` (#33454)
+
+
 Airflow 2.7.1 (2023-09-07)
 --------------------------
 
diff --git a/newsfragments/34348.bugfix.rst b/newsfragments/34348.bugfix.rst
deleted file mode 100644
index c9f27e42f2..0000000000
--- a/newsfragments/34348.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fixed ``AttributeError: 'Select' object has no attribute 'count'`` during the ``airflow db migrate`` command


[airflow] 27/44: Fail dag test if defer without triggerer (#34619)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0c9c86a3b7f1fd8d2974eaf26a8621bc0fe15374
Author: Daniel Standish <15...@users.noreply.github.com>
AuthorDate: Wed Sep 27 09:25:46 2023 -0700

    Fail dag test if defer without triggerer (#34619)
    
    If a user runs dag.test and a task defers while no triggerer is running, we should fail fast so the user does not sit there waiting forever.
    
    ---------
    
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    (cherry picked from commit e81bb487796780705f6df984fbfed04f555943d7)
---
 airflow/models/dag.py                  | 41 ++++++++++++++++++++++++++++++----
 tests/cli/commands/test_dag_command.py | 37 ++++++++++++++++++++++++++++++
 2 files changed, 74 insertions(+), 4 deletions(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index ca3cce4cc4..e3f575f650 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -96,7 +96,13 @@ from airflow.models.dagcode import DagCode
 from airflow.models.dagpickle import DagPickle
 from airflow.models.dagrun import RUN_ID_REGEX, DagRun
 from airflow.models.param import DagParam, ParamsDict
-from airflow.models.taskinstance import Context, TaskInstance, TaskInstanceKey, clear_task_instances
+from airflow.models.taskinstance import (
+    Context,
+    TaskInstance,
+    TaskInstanceKey,
+    TaskReturnCode,
+    clear_task_instances,
+)
 from airflow.secrets.local_filesystem import LocalFilesystemBackend
 from airflow.security import permissions
 from airflow.stats import Stats
@@ -276,6 +282,14 @@ def get_dataset_triggered_next_run_info(
     }
 
 
+class _StopDagTest(Exception):
+    """
+    Raise when DAG.test should stop immediately.
+
+    :meta private:
+    """
+
+
 @functools.total_ordering
 class DAG(LoggingMixin):
     """
@@ -2758,7 +2772,17 @@ class DAG(LoggingMixin):
                 try:
                     add_logger_if_needed(ti)
                     ti.task = tasks[ti.task_id]
-                    _run_task(ti, session=session)
+                    ret = _run_task(ti, session=session)
+                    if ret is TaskReturnCode.DEFERRED:
+                        if not _triggerer_is_healthy():
+                            raise _StopDagTest(
+                                "Task has deferred but triggerer component is not running. "
+                                "You can start the triggerer by running `airflow triggerer` in a terminal."
+                            )
+                except _StopDagTest:
+                    # Let this exception bubble out and not be swallowed by the
+                    # except block below.
+                    raise
                 except Exception:
                     self.log.exception("Task failed; ti=%s", ti)
         if conn_file_path or variable_file_path:
@@ -3886,7 +3910,14 @@ class DagContext:
             return None
 
 
-def _run_task(ti: TaskInstance, session):
+def _triggerer_is_healthy():
+    from airflow.jobs.triggerer_job_runner import TriggererJobRunner
+
+    job = TriggererJobRunner.most_recent_job()
+    return job and job.is_alive()
+
+
+def _run_task(ti: TaskInstance, session) -> TaskReturnCode | None:
     """
     Run a single task instance, and push result to Xcom for downstream tasks.
 
@@ -3896,18 +3927,20 @@ def _run_task(ti: TaskInstance, session):
     Args:
         ti: TaskInstance to run
     """
+    ret = None
     log.info("*****************************************************")
     if ti.map_index > 0:
         log.info("Running task %s index %d", ti.task_id, ti.map_index)
     else:
         log.info("Running task %s", ti.task_id)
     try:
-        ti._run_raw_task(session=session)
+        ret = ti._run_raw_task(session=session)
         session.flush()
         log.info("%s ran successfully!", ti.task_id)
     except AirflowSkipException:
         log.info("Task Skipped, continuing")
     log.info("*****************************************************")
+    return ret
 
 
 def _get_or_create_dagrun(
diff --git a/tests/cli/commands/test_dag_command.py b/tests/cli/commands/test_dag_command.py
index 1bb3541b7c..2387eebcb4 100644
--- a/tests/cli/commands/test_dag_command.py
+++ b/tests/cli/commands/test_dag_command.py
@@ -34,9 +34,13 @@ from airflow import settings
 from airflow.api_connexion.schemas.dag_schema import DAGSchema
 from airflow.cli import cli_parser
 from airflow.cli.commands import dag_command
+from airflow.decorators import task
 from airflow.exceptions import AirflowException
 from airflow.models import DagBag, DagModel, DagRun
+from airflow.models.baseoperator import BaseOperator
+from airflow.models.dag import _StopDagTest
 from airflow.models.serialized_dag import SerializedDagModel
+from airflow.triggers.temporal import TimeDeltaTrigger
 from airflow.utils import timezone
 from airflow.utils.session import create_session
 from airflow.utils.types import DagRunType
@@ -816,3 +820,36 @@ class TestCliDags:
         )
         dag_command.dag_test(cli_args)
         assert "data_interval" in mock__get_or_create_dagrun.call_args.kwargs
+
+    def test_dag_test_no_triggerer(self, dag_maker):
+        with dag_maker() as dag:
+
+            @task
+            def one():
+                return 1
+
+            @task
+            def two(val):
+                return val + 1
+
+            class MyOp(BaseOperator):
+                template_fields = ("tfield",)
+
+                def __init__(self, tfield, **kwargs):
+                    self.tfield = tfield
+                    super().__init__(**kwargs)
+
+                def execute(self, context, event=None):
+                    if event is None:
+                        print("I AM DEFERRING")
+                        self.defer(trigger=TimeDeltaTrigger(timedelta(seconds=20)), method_name="execute")
+                        return
+                    print("RESUMING")
+                    return self.tfield + 1
+
+            task_one = one()
+            task_two = two(task_one)
+            op = MyOp(task_id="abc", tfield=str(task_two))
+            task_two >> op
+        with pytest.raises(_StopDagTest, match="Task has deferred but triggerer component is not running"):
+            dag.test()
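
A hedged usage sketch of the same health check from the caller's side, mirroring the _triggerer_is_healthy helper above; it assumes an Airflow 2.7 environment with a `dag` object in scope that contains deferrable tasks:

    # Usage sketch (not part of the commit): check for a live triggerer first.
    from airflow.jobs.triggerer_job_runner import TriggererJobRunner

    job = TriggererJobRunner.most_recent_job()
    if job and job.is_alive():
        dag.test()
    else:
        print("Start a triggerer first: run `airflow triggerer` in another terminal.")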


[airflow] 25/44: Don't run isort when creating new alembic migrations (#34636)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 79a87dff3252071950993070572539c757cc885c
Author: Jed Cunningham <66...@users.noreply.github.com>
AuthorDate: Wed Sep 27 00:46:37 2023 -0600

    Don't run isort when creating new alembic migrations (#34636)
    
    (cherry picked from commit db89a33b60f46975850e3f696a7e05e61839befc)
---
 airflow/alembic.ini | 6 ------
 1 file changed, 6 deletions(-)

diff --git a/airflow/alembic.ini b/airflow/alembic.ini
index e16f51979f..988fa234ee 100644
--- a/airflow/alembic.ini
+++ b/airflow/alembic.ini
@@ -83,9 +83,3 @@ formatter = generic
 [formatter_generic]
 format = %(levelname)-5.5s [%(name)s] %(message)s
 datefmt = %H:%M:%S
-
-[post_write_hooks]
-hooks=isort
-
-isort.type=console_scripts
-isort.entrypoint=isort
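
Projects that still want a post-write formatter can configure one through the same alembic.ini hook mechanism that the removed block used. A sketch using black as the console-scripts entry point (an example choice, not something this commit adds):

    [post_write_hooks]
    hooks = black
    black.type = console_scripts
    black.entrypoint = black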


[airflow] 03/44: Refactor os.path.splitext to Path.* (#34352)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d9e36e1efed9aedc8c6f18063b2efba2d246f46f
Author: Miroslav Šedivý <67...@users.noreply.github.com>
AuthorDate: Fri Sep 15 05:57:05 2023 +0000

    Refactor os.path.splitext to Path.* (#34352)
    
    (cherry picked from commit 4869575b2c538b54cbc9368791a924f7cd5f7ce8)
---
 airflow/cli/commands/internal_api_command.py |  7 +++++--
 airflow/cli/commands/webserver_command.py    |  7 +++++--
 airflow/dag_processing/manager.py            |  7 ++-----
 airflow/models/dagbag.py                     | 21 +++++++++------------
 docs/exts/docs_build/dev_index_generator.py  | 22 +++++++---------------
 5 files changed, 28 insertions(+), 36 deletions(-)

diff --git a/airflow/cli/commands/internal_api_command.py b/airflow/cli/commands/internal_api_command.py
index 7b2cf798da..73ed2e2501 100644
--- a/airflow/cli/commands/internal_api_command.py
+++ b/airflow/cli/commands/internal_api_command.py
@@ -24,6 +24,7 @@ import subprocess
 import sys
 import textwrap
 from contextlib import suppress
+from pathlib import Path
 from tempfile import gettempdir
 from time import sleep
 
@@ -170,13 +171,15 @@ def internal_api(args):
 
             handle = setup_logging(log_file)
 
-            base, ext = os.path.splitext(pid_file)
+            pid_path = Path(pid_file)
+            pidlock_path = pid_path.with_name(f"{pid_path.stem}-monitor{pid_path.suffix}")
+
             with open(stdout, "a") as stdout, open(stderr, "a") as stderr:
                 stdout.truncate(0)
                 stderr.truncate(0)
 
                 ctx = daemon.DaemonContext(
-                    pidfile=TimeoutPIDLockFile(f"{base}-monitor{ext}", -1),
+                    pidfile=TimeoutPIDLockFile(pidlock_path, -1),
                     files_preserve=[handle],
                     stdout=stdout,
                     stderr=stderr,
diff --git a/airflow/cli/commands/webserver_command.py b/airflow/cli/commands/webserver_command.py
index d4ca7dfa6a..a9258a1ae2 100644
--- a/airflow/cli/commands/webserver_command.py
+++ b/airflow/cli/commands/webserver_command.py
@@ -26,6 +26,7 @@ import textwrap
 import time
 import types
 from contextlib import suppress
+from pathlib import Path
 from time import sleep
 from typing import NoReturn
 
@@ -474,13 +475,15 @@ def webserver(args):
 
             handle = setup_logging(log_file)
 
-            base, ext = os.path.splitext(pid_file)
+            pid_path = Path(pid_file)
+            pidlock_path = pid_path.with_name(f"{pid_path.stem}-monitor{pid_path.suffix}")
+
             with open(stdout, "a") as stdout, open(stderr, "a") as stderr:
                 stdout.truncate(0)
                 stderr.truncate(0)
 
                 ctx = daemon.DaemonContext(
-                    pidfile=TimeoutPIDLockFile(f"{base}-monitor{ext}", -1),
+                    pidfile=TimeoutPIDLockFile(pidlock_path, -1),
                     files_preserve=[handle],
                     stdout=stdout,
                     stderr=stderr,
diff --git a/airflow/dag_processing/manager.py b/airflow/dag_processing/manager.py
index 9b76bb9374..ab93a21026 100644
--- a/airflow/dag_processing/manager.py
+++ b/airflow/dag_processing/manager.py
@@ -850,9 +850,7 @@ class DagFileProcessorManager(LoggingMixin):
             last_runtime = self.get_last_runtime(file_path)
             num_dags = self.get_last_dag_count(file_path)
             num_errors = self.get_last_error_count(file_path)
-            file_name = os.path.basename(file_path)
-            file_name = os.path.splitext(file_name)[0].replace(os.sep, ".")
-
+            file_name = Path(file_path).stem
             processor_pid = self.get_pid(file_path)
             processor_start_time = self.get_start_time(file_path)
             runtime = (now - processor_start_time) if processor_start_time else None
@@ -1042,8 +1040,7 @@ class DagFileProcessorManager(LoggingMixin):
             run_count=self.get_run_count(processor.file_path) + 1,
         )
         self._file_stats[processor.file_path] = stat
-
-        file_name = os.path.splitext(os.path.basename(processor.file_path))[0].replace(os.sep, ".")
+        file_name = Path(processor.file_path).stem
         Stats.timing(f"dag_processing.last_duration.{file_name}", last_duration)
         Stats.timing("dag_processing.last_duration", last_duration, tags={"file_name": file_name})
 
diff --git a/airflow/models/dagbag.py b/airflow/models/dagbag.py
index 7a926f2736..3e2ea3dd26 100644
--- a/airflow/models/dagbag.py
+++ b/airflow/models/dagbag.py
@@ -28,6 +28,7 @@ import traceback
 import warnings
 import zipfile
 from datetime import datetime, timedelta
+from pathlib import Path
 from typing import TYPE_CHECKING, NamedTuple
 
 from sqlalchemy.exc import OperationalError
@@ -55,8 +56,6 @@ from airflow.utils.timeout import timeout
 from airflow.utils.types import NOTSET
 
 if TYPE_CHECKING:
-    import pathlib
-
     from sqlalchemy.orm import Session
 
     from airflow.models.dag import DAG
@@ -95,7 +94,7 @@ class DagBag(LoggingMixin):
 
     def __init__(
         self,
-        dag_folder: str | pathlib.Path | None = None,
+        dag_folder: str | Path | None = None,
         include_examples: bool | ArgNotSet = NOTSET,
         safe_mode: bool | ArgNotSet = NOTSET,
         read_dags_from_db: bool = False,
@@ -327,8 +326,8 @@ class DagBag(LoggingMixin):
             return []
 
         self.log.debug("Importing %s", filepath)
-        org_mod_name, _ = os.path.splitext(os.path.split(filepath)[-1])
         path_hash = hashlib.sha1(filepath.encode("utf-8")).hexdigest()
+        org_mod_name = Path(filepath).stem
         mod_name = f"unusual_prefix_{path_hash}_{org_mod_name}"
 
         if mod_name in sys.modules:
@@ -380,15 +379,12 @@ class DagBag(LoggingMixin):
         mods = []
         with zipfile.ZipFile(filepath) as current_zip_file:
             for zip_info in current_zip_file.infolist():
-                head, _ = os.path.split(zip_info.filename)
-                mod_name, ext = os.path.splitext(zip_info.filename)
-                if ext not in [".py", ".pyc"]:
-                    continue
-                if head:
+                zip_path = Path(zip_info.filename)
+                if zip_path.suffix not in [".py", ".pyc"] or len(zip_path.parts) > 1:
                     continue
 
-                if mod_name == "__init__":
-                    self.log.warning("Found __init__.%s at root of %s", ext, filepath)
+                if zip_path.stem == "__init__":
+                    self.log.warning("Found %s at root of %s", zip_path.name, filepath)
 
                 self.log.debug("Reading %s from %s", zip_info.filename, filepath)
 
@@ -402,6 +398,7 @@ class DagBag(LoggingMixin):
                         )
                     continue
 
+                mod_name = zip_path.stem
                 if mod_name in sys.modules:
                     del sys.modules[mod_name]
 
@@ -518,7 +515,7 @@ class DagBag(LoggingMixin):
 
     def collect_dags(
         self,
-        dag_folder: str | pathlib.Path | None = None,
+        dag_folder: str | Path | None = None,
         only_if_updated: bool = True,
         include_examples: bool = conf.getboolean("core", "LOAD_EXAMPLES"),
         safe_mode: bool = conf.getboolean("core", "DAG_DISCOVERY_SAFE_MODE"),
diff --git a/docs/exts/docs_build/dev_index_generator.py b/docs/exts/docs_build/dev_index_generator.py
index 055384e764..0b9e9072ab 100644
--- a/docs/exts/docs_build/dev_index_generator.py
+++ b/docs/exts/docs_build/dev_index_generator.py
@@ -19,7 +19,7 @@ from __future__ import annotations
 import argparse
 import os
 import sys
-from glob import glob
+from pathlib import Path
 
 import jinja2
 
@@ -45,24 +45,16 @@ def _render_template(template_name, **kwargs):
 
 
 def _render_content():
-    provider_packages = [
-        os.path.basename(os.path.dirname(p)) for p in glob(f"{BUILD_DIR}/docs/apache-airflow-providers-*/")
-    ]
     providers = []
-    for package_name in provider_packages:
+    provider_yamls = {p["package-name"]: p for p in ALL_PROVIDER_YAMLS}
+    for path in sorted(Path(BUILD_DIR).glob("docs/apache-airflow-providers-*/")):
+        package_name = path.name
         try:
-            current_provider = next(
-                provider_yaml
-                for provider_yaml in ALL_PROVIDER_YAMLS
-                if provider_yaml["package-name"] == package_name
-            )
-            providers.append(current_provider)
-        except StopIteration:
+            providers.append(provider_yamls[package_name])
+        except KeyError:
             print(f"WARNING! Could not find provider.yaml file for package: {package_name}")
 
-    content = _render_template(
-        "dev_index_template.html.jinja2", providers=sorted(providers, key=lambda k: k["package-name"])
-    )
+    content = _render_template("dev_index_template.html.jinja2", providers=providers)
     return content
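
The refactor leans on a few pathlib equivalences. A small self-contained sketch of the mappings used in this commit (paths are illustrative):

    # pathlib equivalents of the os.path idioms replaced above.
    import os
    from pathlib import Path

    p = "/opt/airflow/dags/my_dag.py"

    # os.path.splitext(os.path.basename(p))[0]  ->  Path(p).stem
    assert Path(p).stem == os.path.splitext(os.path.basename(p))[0] == "my_dag"

    # f"{base}-monitor{ext}" built from os.path.splitext(p)
    base, ext = os.path.splitext(p)
    assert Path(p).with_name(f"{Path(p).stem}-monitor{Path(p).suffix}") == Path(f"{base}-monitor{ext}")

    # suffix / parts replace splitext + split for zip archive entries
    zp = Path("pkg/mod.py")
    assert zp.suffix == ".py" and len(zp.parts) == 2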
 
 


[airflow] 02/44: Fix spelling errors in readme and license files (#34383)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c410f0e211ac7415d71ee54668be6dcf5347ca05
Author: yanfangli <63...@users.noreply.github.com>
AuthorDate: Fri Sep 15 13:35:37 2023 +0800

    Fix spelling errors in readme and license files (#34383)
    
    (cherry picked from commit decac54722fd3fc20633541190ccb97edcc5daa9)
---
 airflow/_vendor/README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/airflow/_vendor/README.md b/airflow/_vendor/README.md
index e708f1e507..e76d8beea3 100644
--- a/airflow/_vendor/README.md
+++ b/airflow/_vendor/README.md
@@ -10,7 +10,7 @@ the `_vendor` package.
 All Vendored libraries must follow these rules:
 
 1. Vendored libraries must be pure Python--no compiling (so that we do not have to release multi-platform airflow packages on PyPI).
-2. Source code for the libary is included in this directory.
+2. Source code for the library is included in this directory.
 3. License must be included in this repo and in the [LICENSE](../../LICENSE) file and in the
    [licenses](../../licenses) folder.
 4. Requirements of the library become requirements of airflow core.
@@ -19,7 +19,7 @@ All Vendored libraries must follow these rules:
 7. Apply the fixes necessary to use the vendored library as separate commits - each package separately,
    so that they can be cherry-picked later if we upgrade the vendored package. Changes to airflow code to
    use the vendored packages should be applied as separate commits/PRs.
-8. The `_vendor` packages should be excluded from any refactorings, static checks and automated fixes.
+8. The `_vendor` packages should be excluded from any refactoring, static checks and automated fixes.
 
 ## Adding and upgrading a vendored package
 
@@ -28,7 +28,7 @@ Way to vendor a library or update a version:
 1. Update ``vendor.txt`` with the library, version, and SHA256 (`pypi` provides hashes as of recently)
 2. Remove all old files and directories of the old version.
3. Replace them with new files (only replace the relevant python packages; move LICENSE):
-   * move licence files to [licenses](../../licenses) folder
+   * move license files to [licenses](../../licenses) folder
    * remove README and any other supporting files (they can be found in PyPI)
    * make sure to add requirements from setup.py to airflow's setup.py with appropriate comment stating
      why the requirements are added and when they should be removed


[airflow] 09/44: Deprecate numeric type python version in PythonVirtualEnvOperator (#34359)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 701ac6de95c207b0b744f6685ff4eaf07c8a0ee6
Author: starone <53...@users.noreply.github.com>
AuthorDate: Sun Sep 17 02:12:19 2023 +0900

    Deprecate numeric type python version in PythonVirtualEnvOperator (#34359)
    
    * Remove float type python version in PythonVirtualEnvOperator
    
    * Remove int type python version in PythonVirtualEnvOperator
    
    * change deprecated to removed
    
    * change removal to deprecation
    
    * fix typo
    
    * fix line too long
    
    * change condition statement
    
    * Use 'is not'
    
    ---------
    
    Co-authored-by: kyeonghoon Kim <ky...@bagelcode.com>
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    (cherry picked from commit b23d3f964b2699d4c7f579e22d50fabc9049d1b6)
---
 airflow/operators/python.py    | 9 ++++++++-
 tests/operators/test_python.py | 2 +-
 2 files changed, 9 insertions(+), 2 deletions(-)

diff --git a/airflow/operators/python.py b/airflow/operators/python.py
index 50cad387ad..1f0baec609 100644
--- a/airflow/operators/python.py
+++ b/airflow/operators/python.py
@@ -531,7 +531,7 @@ class PythonVirtualenvOperator(_BasePythonVirtualenvOperator):
         *,
         python_callable: Callable,
         requirements: None | Iterable[str] | str = None,
-        python_version: str | int | float | None = None,
+        python_version: str | None = None,
         use_dill: bool = False,
         system_site_packages: bool = True,
         pip_install_options: list[str] | None = None,
@@ -554,6 +554,13 @@ class PythonVirtualenvOperator(_BasePythonVirtualenvOperator):
                 "major versions for PythonVirtualenvOperator. Please use string_args."
                 f"Sys version: {sys.version_info}. Venv version: {python_version}"
             )
+        if python_version is not None and not isinstance(python_version, str):
+            warnings.warn(
+                "Passing non-string types (e.g. int or float) as python_version "
+                "is deprecated. Please use string value instead.",
+                RemovedInAirflow3Warning,
+                stacklevel=2,
+            )
         if not is_venv_installed():
             raise AirflowException("PythonVirtualenvOperator requires virtualenv, please install it.")
         if not requirements:
diff --git a/tests/operators/test_python.py b/tests/operators/test_python.py
index 28c70537ae..6a9a4058ce 100644
--- a/tests/operators/test_python.py
+++ b/tests/operators/test_python.py
@@ -963,7 +963,7 @@ class TestPythonVirtualenvOperator(BaseTestPythonVirtualenvOperator):
                 return
             raise Exception
 
-        self.run_as_task(f, python_version=3, use_dill=False, requirements=["dill"])
+        self.run_as_task(f, python_version="3", use_dill=False, requirements=["dill"])
 
     def test_without_dill(self):
         def f(a):
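
After this change, callers should pass the version as a string. A minimal usage sketch; the task wiring and requirements are illustrative:

    # Usage sketch: python_version as a string, per the deprecation above.
    from airflow.operators.python import PythonVirtualenvOperator

    def callable_in_venv():
        import sys
        print(sys.version)

    venv_task = PythonVirtualenvOperator(
        task_id="venv_task",
        python_callable=callable_in_venv,
        python_version="3.11",  # previously often passed as 3.11 (float) or 3 (int)
        requirements=["dill"],
    )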


[airflow] 24/44: Use iterative loop to look for mapped parent (#34622)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f860f9d1a0867e3a8cb8b7c3b8fa1935e4522bb8
Author: Tzu-ping Chung <ur...@gmail.com>
AuthorDate: Wed Sep 27 08:43:25 2023 +0800

    Use iterative loop to look for mapped parent (#34622)
    
    (cherry picked from commit d9ba152c15dd50baa1fef41a63424225ba8ddd47)
---
 airflow/www/views.py | 9 ++-------
 1 file changed, 2 insertions(+), 7 deletions(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index ba275416d2..b55af8ff40 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -121,7 +121,7 @@ from airflow.utils.net import get_hostname
 from airflow.utils.session import NEW_SESSION, create_session, provide_session
 from airflow.utils.state import DagRunState, State, TaskInstanceState
 from airflow.utils.strings import to_boolean
-from airflow.utils.task_group import MappedTaskGroup, TaskGroup, task_group_to_dict
+from airflow.utils.task_group import TaskGroup, task_group_to_dict
 from airflow.utils.timezone import td_format, utcnow
 from airflow.version import version
 from airflow.www import auth, utils as wwwutils
@@ -425,14 +425,9 @@ def dag_to_grid(dag: DagModel, dag_runs: Sequence[DagRun], session: Session):
                 **setup_teardown_type,
             }
 
-        def check_group_is_mapped(tg: TaskGroup | None) -> bool:
-            if tg is None:
-                return False
-            return isinstance(tg, MappedTaskGroup) or check_group_is_mapped(tg.parent_group)
-
         # Task Group
         task_group = item
-        group_is_mapped = check_group_is_mapped(task_group)
+        group_is_mapped = next(task_group.iter_mapped_task_groups(), None) is not None
 
         children = [
             task_group_to_grid(child, grouped_tis, is_parent_mapped=group_is_mapped)
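
The removed helper walked parent groups recursively; iter_mapped_task_groups performs the same walk iteratively. A generic sketch of the idiom, assuming a task_group object in scope whose parent_group attribute mirrors the TaskGroup API:

    # Generic sketch of the iterative parent walk that replaces the recursion.
    from airflow.utils.task_group import MappedTaskGroup

    def iter_mapped_parents(tg):
        while tg is not None:  # climb parent_group links instead of recursing
            if isinstance(tg, MappedTaskGroup):
                yield tg
            tg = tg.parent_group

    group_is_mapped = next(iter_mapped_parents(task_group), None) is not None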


[airflow] 11/44: Fix class reference in Public Interface documentation (#34454)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e7944e010f5f9c42d2763771f209ced19f39d7d5
Author: Andrey Anshin <An...@taragol.is>
AuthorDate: Wed Sep 20 00:17:00 2023 +0400

    Fix class reference in Public Interface documentation (#34454)
    
    (cherry picked from commit 26f6a51c47804d28779fa6277b48497c113fe721)
---
 docs/apache-airflow/public-airflow-interface.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/apache-airflow/public-airflow-interface.rst b/docs/apache-airflow/public-airflow-interface.rst
index 248d46472f..534cc0d0e6 100644
--- a/docs/apache-airflow/public-airflow-interface.rst
+++ b/docs/apache-airflow/public-airflow-interface.rst
@@ -59,8 +59,8 @@ DAGs
 
 The DAG is Airflow's core entity that represents a recurring workflow. You can create a DAG by
 instantiating the :class:`~airflow.models.dag.DAG` class in your DAG file. You can also instantiate
-them via :class::`~airflow.models.dagbag.DagBag` class that reads DAGs from a file or a folder. DAGs
-can also have parameters specified via :class::`~airflow.models.param.Param` class.
+them via :class:`~airflow.models.dagbag.DagBag` class that reads DAGs from a file or a folder. DAGs
+can also have parameters specified via :class:`~airflow.models.param.Param` class.
 
 Airflow has a set of example DAGs that you can use to learn how to write DAGs
 


[airflow] 21/44: fix(cli): remove "to backfill" from --task-regex help message (#34598)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a0b4ef16d21d4cb6fc238027a1b114851731f9cf
Author: Wei Lee <we...@gmail.com>
AuthorDate: Tue Sep 26 04:20:41 2023 +0800

    fix(cli): remove "to backfill" from --task-regex help message (#34598)
    
    This arg is used by both "airflow tasks clear" and "airflow tasks backfill",
    and it does not make sense for "airflow tasks clear" to have the description "to backfill".
    
    (cherry picked from commit c019cf18dd1be3b20baf7503326a53002c236b45)
---
 airflow/cli/cli_config.py | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git a/airflow/cli/cli_config.py b/airflow/cli/cli_config.py
index 6a7056d78e..6c133d8a71 100644
--- a/airflow/cli/cli_config.py
+++ b/airflow/cli/cli_config.py
@@ -156,9 +156,7 @@ ARG_EXECUTION_DATE_OR_RUN_ID_OPTIONAL = Arg(
     nargs="?",
     help="The execution_date of the DAG or run_id of the DAGRun (optional)",
 )
-ARG_TASK_REGEX = Arg(
-    ("-t", "--task-regex"), help="The regex to filter specific task_ids to backfill (optional)"
-)
+ARG_TASK_REGEX = Arg(("-t", "--task-regex"), help="The regex to filter specific task_ids (optional)")
 ARG_SUBDIR = Arg(
     ("-S", "--subdir"),
     help=(


[airflow] 19/44: Avoid top-level airflow import to avoid circular dependency (#34586)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b2bad04ba5f29819b26cccaffddb6e92fad2a570
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Mon Sep 25 12:52:50 2023 +0200

    Avoid top-level airflow import to avoid circular dependency (#34586)
    
    (cherry picked from commit d1b7bca6a36c146119fb5746019116c4d1e15275)
---
 airflow/decorators/base.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/decorators/base.py b/airflow/decorators/base.py
index 70608be3d1..7199f3da55 100644
--- a/airflow/decorators/base.py
+++ b/airflow/decorators/base.py
@@ -41,7 +41,7 @@ import attr
 import re2
 import typing_extensions
 
-from airflow import Dataset
+from airflow.datasets import Dataset
 from airflow.exceptions import AirflowException
 from airflow.models.abstractoperator import DEFAULT_RETRIES, DEFAULT_RETRY_DELAY
 from airflow.models.baseoperator import (
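
The pattern generalizes: import from the defining submodule rather than the package root, so importing this module does not re-enter the package's __init__.py while it is still initializing. A toy illustration (package and module names invented):

    # pkg/__init__.py
    #     from pkg.decorators import task     # runs pkg/decorators.py at package init
    # pkg/decorators.py (broken)
    #     from pkg import Dataset             # pkg is mid-initialization: ImportError
    # pkg/decorators.py (fixed)
    #     from pkg.datasets import Dataset    # import the defining module directly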


[airflow] 14/44: Add more exemptions to lengthy metric list (#34531)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ec7007e7d78477474b29f776c2beef025d1cad68
Author: Saurabh Kumar <55...@users.noreply.github.com>
AuthorDate: Thu Sep 21 19:31:37 2023 -0400

    Add more exemptions to lengthy metric list (#34531)
    
    Co-authored-by: Saurabh Kumar <ma...@sa1.me>
    (cherry picked from commit fa6ca5d5316a9bd759a702e1688a69b19e4e63bc)
---
 airflow/metrics/validators.py  | 3 +++
 tests/core/test_otel_logger.py | 4 ++--
 2 files changed, 5 insertions(+), 2 deletions(-)

diff --git a/airflow/metrics/validators.py b/airflow/metrics/validators.py
index 7f0bbac218..8bd6dd4476 100644
--- a/airflow/metrics/validators.py
+++ b/airflow/metrics/validators.py
@@ -70,9 +70,12 @@ BACK_COMPAT_METRIC_NAME_PATTERNS: set[str] = {
     r"^pool\.open_slots\.(?P<pool_name>.*)$",
     r"^pool\.queued_slots\.(?P<pool_name>.*)$",
     r"^pool\.running_slots\.(?P<pool_name>.*)$",
+    r"^pool\.deferred_slots\.(?P<pool_name>.*)$",
     r"^pool\.starving_tasks\.(?P<pool_name>.*)$",
     r"^dagrun\.dependency-check\.(?P<dag_id>.*)$",
     r"^dag\.(?P<dag_id>.*)\.(?P<task_id>.*)\.duration$",
+    r"^dag\.(?P<dag_id>.*)\.(?P<task_id>.*)\.queued_duration$",
+    r"^dag\.(?P<dag_id>.*)\.(?P<task_id>.*)\.scheduled_duration$",
     r"^dag_processing\.last_duration\.(?P<dag_file>.*)$",
     r"^dagrun\.duration\.success\.(?P<dag_id>.*)$",
     r"^dagrun\.duration\.failed\.(?P<dag_id>.*)$",
diff --git a/tests/core/test_otel_logger.py b/tests/core/test_otel_logger.py
index 1f04edd1bf..ba19e3c9a2 100644
--- a/tests/core/test_otel_logger.py
+++ b/tests/core/test_otel_logger.py
@@ -66,9 +66,9 @@ class TestOtelMetrics:
         assert not _is_up_down_counter("this_is_not_a_udc")
 
     def test_exemption_list_has_not_grown(self):
-        assert len(BACK_COMPAT_METRIC_NAMES) <= 23, (
+        assert len(BACK_COMPAT_METRIC_NAMES) <= 26, (
             "This test exists solely to ensure that nobody is adding names to the exemption list. "
-            "There are 23 names which are potentially too long for OTel and that number should "
+            "There are 26 names which are potentially too long for OTel and that number should "
             "only ever go down as these names are deprecated.  If this test is failing, please "
             "adjust your new stat's name; do not add as exemption without a very good reason."
         )
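
A standalone check that one of the newly exempted names matches its back-compat pattern; a sketch using the stdlib re module, assuming the pattern behaves the same there as in Airflow's validator:

    import re

    pattern = re.compile(r"^pool\.deferred_slots\.(?P<pool_name>.*)$")
    match = pattern.match("pool.deferred_slots.default_pool")
    assert match is not None
    assert match.group("pool_name") == "default_pool"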


[airflow] 05/44: Fix dag warning endpoint permissions (#34355)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5a475bf56aac9fdebedb7d94c16f64ca3453515d
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Fri Sep 15 19:08:49 2023 +0200

    Fix dag warning endpoint permissions (#34355)
    
    * Fix dag warning endpoint permissions
    
    * update the query to have an accurate result for total entries and pagination
    
    * add unit tests
    
    * Update test_dag_warning_endpoint.py
    
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    
    ---------
    
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    (cherry picked from commit 3570bbfbea69e2965f91b9964ce28bc268c68129)
---
 .../endpoints/dag_warning_endpoint.py              |  8 ++++++++
 .../endpoints/test_dag_warning_endpoint.py         | 23 +++++++++++++++++++++-
 2 files changed, 30 insertions(+), 1 deletion(-)

diff --git a/airflow/api_connexion/endpoints/dag_warning_endpoint.py b/airflow/api_connexion/endpoints/dag_warning_endpoint.py
index 203bb9d6a5..367b0ae104 100644
--- a/airflow/api_connexion/endpoints/dag_warning_endpoint.py
+++ b/airflow/api_connexion/endpoints/dag_warning_endpoint.py
@@ -18,9 +18,11 @@ from __future__ import annotations
 
 from typing import TYPE_CHECKING
 
+from flask import g
 from sqlalchemy import select
 
 from airflow.api_connexion import security
+from airflow.api_connexion.exceptions import PermissionDenied
 from airflow.api_connexion.parameters import apply_sorting, check_limit, format_parameters
 from airflow.api_connexion.schemas.dag_warning_schema import (
     DagWarningCollection,
@@ -28,6 +30,7 @@ from airflow.api_connexion.schemas.dag_warning_schema import (
 )
 from airflow.models.dagwarning import DagWarning as DagWarningModel
 from airflow.security import permissions
+from airflow.utils.airflow_flask_app import get_airflow_app
 from airflow.utils.db import get_query_count
 from airflow.utils.session import NEW_SESSION, provide_session
 
@@ -57,7 +60,12 @@ def get_dag_warnings(
     allowed_filter_attrs = ["dag_id", "warning_type", "message", "timestamp"]
     query = select(DagWarningModel)
     if dag_id:
+        if not get_airflow_app().appbuilder.sm.can_read_dag(dag_id, g.user):
+            raise PermissionDenied(detail=f"User not allowed to access this DAG: {dag_id}")
         query = query.where(DagWarningModel.dag_id == dag_id)
+    else:
+        readable_dags = get_airflow_app().appbuilder.sm.get_accessible_dag_ids(g.user)
+        query = query.where(DagWarningModel.dag_id.in_(readable_dags))
     if warning_type:
         query = query.where(DagWarningModel.warning_type == warning_type)
     total_entries = get_query_count(query, session=session)
diff --git a/tests/api_connexion/endpoints/test_dag_warning_endpoint.py b/tests/api_connexion/endpoints/test_dag_warning_endpoint.py
index 621b043667..041a61634e 100644
--- a/tests/api_connexion/endpoints/test_dag_warning_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_warning_endpoint.py
@@ -35,14 +35,27 @@ def configured_app(minimal_app_for_api):
         app,  # type:ignore
         username="test",
         role_name="Test",
-        permissions=[(permissions.ACTION_CAN_READ, permissions.RESOURCE_DAG_WARNING)],  # type: ignore
+        permissions=[
+            (permissions.ACTION_CAN_READ, permissions.RESOURCE_DAG_WARNING),
+            (permissions.ACTION_CAN_READ, permissions.RESOURCE_DAG),
+        ],  # type: ignore
     )
     create_user(app, username="test_no_permissions", role_name="TestNoPermissions")  # type: ignore
+    create_user(
+        app,  # type:ignore
+        username="test_with_dag2_read",
+        role_name="TestWithDag2Read",
+        permissions=[
+            (permissions.ACTION_CAN_READ, permissions.RESOURCE_DAG_WARNING),
+            (permissions.ACTION_CAN_READ, f"{permissions.RESOURCE_DAG_PREFIX}dag2"),
+        ],  # type: ignore
+    )
 
     yield minimal_app_for_api
 
     delete_user(app, username="test")  # type: ignore
     delete_user(app, username="test_no_permissions")  # type: ignore
+    delete_user(app, username="test_with_dag2_read")  # type: ignore
 
 
 class TestBaseDagWarning:
@@ -147,3 +160,11 @@ class TestGetDagWarningEndpoint(TestBaseDagWarning):
             "/api/v1/dagWarnings", environ_overrides={"REMOTE_USER": "test_no_permissions"}
         )
         assert response.status_code == 403
+
+    def test_should_raise_403_forbidden_when_user_has_no_dag_read_permission(self):
+        response = self.client.get(
+            "/api/v1/dagWarnings",
+            environ_overrides={"REMOTE_USER": "test_with_dag2_read"},
+            query_string={"dag_id": "dag1"},
+        )
+        assert response.status_code == 403
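
The fix boils down to scoping the query to DAG ids the user may read. A minimal sketch of that pattern with plain data in place of SQLAlchemy (values invented; readable_dags stands in for sm.get_accessible_dag_ids(g.user)):

    warnings = [{"dag_id": "dag1"}, {"dag_id": "dag2"}]
    readable_dags = {"dag2"}
    visible = [w for w in warnings if w["dag_id"] in readable_dags]
    assert visible == [{"dag_id": "dag2"}]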


[airflow] 10/44: Clarify var.value.get and var.json.get usage (#34411)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit dd6e614e947d27024a6e65e7a425d45be73465a8
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Sat Sep 16 20:24:20 2023 +0200

    Clarify var.value.get  and var.json.get usage (#34411)
    
    (cherry picked from commit 03db0f6b785a4983c09d6eec7433cf28f7759610)
---
 docs/apache-airflow/templates-ref.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/templates-ref.rst b/docs/apache-airflow/templates-ref.rst
index 3cb2f4a93e..763d3550fd 100644
--- a/docs/apache-airflow/templates-ref.rst
+++ b/docs/apache-airflow/templates-ref.rst
@@ -120,7 +120,7 @@ You can access them as either plain-text or JSON. If you use JSON, you are
 also able to walk nested structures, such as dictionaries like:
 ``{{ var.json.my_dict_var.key1 }}``.
 
-It is also possible to fetch a variable by string if needed with
+It is also possible to fetch a variable by string if needed (for example, if your variable key contains dots) with
 ``{{ var.value.get('my.var', 'fallback') }}`` or
 ``{{ var.json.get('my.dict.var', {'key1': 'val1'}) }}``. Defaults can be
 supplied in case the variable does not exist.
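
The fallback form can be rendered standalone with plain Jinja; a sketch assuming jinja2 is installed, with a hypothetical store standing in for Airflow's variable accessor:

    from types import SimpleNamespace

    from jinja2 import Template

    class Store:
        def get(self, key, default=None):
            return {"my.var": "hello"}.get(key, default)

    var = SimpleNamespace(value=Store())
    print(Template("{{ var.value.get('my.var', 'fallback') }}").render(var=var))
    # -> hello
    print(Template("{{ var.value.get('missing', 'fallback') }}").render(var=var))
    # -> fallback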


[airflow] 23/44: Fix some whitespace (#34632)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a99cd81b5399f36c0fed9daad3508fd1e8a39c7b
Author: D. Ferruzzi <fe...@amazon.com>
AuthorDate: Tue Sep 26 15:21:54 2023 -0700

    Fix some whitespace (#34632)
    
    (cherry picked from commit 97de019995185cba1e7e63ea525d099ff5c94ea7)
---
 airflow/configuration.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/configuration.py b/airflow/configuration.py
index 68e9a41f22..963ddb5cc9 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -1855,8 +1855,8 @@ class AirflowConfigParser(ConfigParser):
                     raise AirflowConfigException(
                         f"The provider {provider} is attempting to contribute "
                         f"configuration section {provider_section} that "
-                        f"has already been added before. The source of it: {section_source}."
-                        "This is forbidden. A provider can only add new sections. It"
+                        f"has already been added before. The source of it: {section_source}. "
+                        "This is forbidden. A provider can only add new sections. It "
                         "cannot contribute options to existing sections or override other "
                         "provider's configuration.",
                         UserWarning,


[airflow] 01/44: docs: correct typo in best-practices.rst (#34361)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6da499c38051cf88cb0f8988a09aee141688ea77
Author: Jérôme Gurhem <88...@users.noreply.github.com>
AuthorDate: Thu Sep 14 17:32:12 2023 +0200

    docs: correct typo in best-practices.rst (#34361)
    
    (cherry picked from commit f93b046a38baa8eb4e4b513ef515635bc6c20fe8)
---
 docs/apache-airflow/best-practices.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/best-practices.rst b/docs/apache-airflow/best-practices.rst
index a150da936c..50aa44a57d 100644
--- a/docs/apache-airflow/best-practices.rst
+++ b/docs/apache-airflow/best-practices.rst
@@ -23,7 +23,7 @@ Best Practices
 Creating a new DAG is a three-step process:
 
 - writing Python code to create a DAG object,
-- testing if the code meets our expectations,
+- testing if the code meets your expectations,
 - configuring environment dependencies to run your DAG
 
 This tutorial will introduce you to the best practices for these three steps.


[airflow] 44/44: Parse 'docker context ls --format=json' correctly (#34711)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d10f43c5b02d1b9fe867ef230ccc592592dbd4b8
Author: Tzu-ping Chung <ur...@gmail.com>
AuthorDate: Tue Oct 3 17:42:50 2023 +0800

    Parse 'docker context ls --format=json' correctly (#34711)
    
    (cherry picked from commit 19284981f88e45dca4c4003837e3cead1723caf1)
---
 .../src/airflow_breeze/utils/docker_command_utils.py  |  7 ++-----
 dev/breeze/tests/test_docker_command_utils.py         | 19 ++++++++-----------
 2 files changed, 10 insertions(+), 16 deletions(-)

diff --git a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
index 6aae84fb5f..b6bcea335a 100644
--- a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
+++ b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
@@ -832,11 +832,8 @@ def autodetect_docker_context():
     if result.returncode != 0:
         get_console().print("[warning]Could not detect docker builder. Using default.[/]")
         return "default"
-    context_json = json.loads(result.stdout)
-    if isinstance(context_json, dict):
-        # In case there is one context it is returned as dict not array of dicts ¯\_(ツ)_/¯
-        context_json = [context_json]
-    known_contexts = {info["Name"]: info for info in context_json}
+    context_dicts = (json.loads(line) for line in result.stdout.splitlines() if line.strip())
+    known_contexts = {info["Name"]: info for info in context_dicts}
     if not known_contexts:
         get_console().print("[warning]Could not detect docker builder. Using default.[/]")
         return "default"
diff --git a/dev/breeze/tests/test_docker_command_utils.py b/dev/breeze/tests/test_docker_command_utils.py
index b125fb2bd7..8cd7924445 100644
--- a/dev/breeze/tests/test_docker_command_utils.py
+++ b/dev/breeze/tests/test_docker_command_utils.py
@@ -191,39 +191,36 @@ def test_check_docker_compose_version_ok(mock_get_console, mock_run_command):
     )
 
 
-def _fake_ctx(name: str) -> dict[str, str]:
-    return {
-        "Name": name,
-        "DockerEndpoint": f"unix://{name}",
-    }
+def _fake_ctx_output(*names: str) -> str:
+    return "\n".join(json.dumps({"Name": name, "DockerEndpoint": f"unix://{name}"}) for name in names)
 
 
 @pytest.mark.parametrize(
     "context_output, selected_context, console_output",
     [
         (
-            json.dumps([_fake_ctx("default")]),
+            _fake_ctx_output("default"),
             "default",
             "[info]Using default as context",
         ),
-        ("[]", "default", "[warning]Could not detect docker builder"),
+        ("\n", "default", "[warning]Could not detect docker builder"),
         (
-            json.dumps([_fake_ctx("a"), _fake_ctx("b")]),
+            _fake_ctx_output("a", "b"),
             "a",
             "[warning]Could not use any of the preferred docker contexts",
         ),
         (
-            json.dumps([_fake_ctx("a"), _fake_ctx("desktop-linux")]),
+            _fake_ctx_output("a", "desktop-linux"),
             "desktop-linux",
             "[info]Using desktop-linux as context",
         ),
         (
-            json.dumps([_fake_ctx("a"), _fake_ctx("default")]),
+            _fake_ctx_output("a", "default"),
             "default",
             "[info]Using default as context",
         ),
         (
-            json.dumps([_fake_ctx("a"), _fake_ctx("default"), _fake_ctx("desktop-linux")]),
+            _fake_ctx_output("a", "default", "desktop-linux"),
             "desktop-linux",
             "[info]Using desktop-linux as context",
         ),
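
The underlying change: newer Docker CLIs emit one JSON object per line rather than a single JSON array, so the fix parses line-delimited output. A minimal sketch with invented sample output:

    import json

    stdout = (
        '{"Name": "default", "DockerEndpoint": "unix:///var/run/docker.sock"}\n'
        '{"Name": "desktop-linux", "DockerEndpoint": "unix://desktop"}\n'
    )
    context_dicts = (json.loads(line) for line in stdout.splitlines() if line.strip())
    known_contexts = {info["Name"]: info for info in context_dicts}
    assert set(known_contexts) == {"default", "desktop-linux"}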


[airflow] 41/44: Fix broken breeze by fixing package version (#34701)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c86ebeb897c48338fc0f9410bce7393eebb1050e
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Mon Oct 2 11:21:55 2023 +0200

    Fix broken breeze by fixing package version (#34701)
    
    (cherry picked from commit 6618c5f90d037d57e9f3bf1e90cd0712426d6caa)
---
 Dockerfile.ci                | 2 +-
 dev/breeze/README.md         | 2 +-
 dev/breeze/setup.cfg         | 2 +-
 scripts/ci/install_breeze.sh | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/Dockerfile.ci b/Dockerfile.ci
index a0f47d86d3..aba7e6913b 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -1242,7 +1242,7 @@ ARG PYTHON_BASE_IMAGE
 ARG AIRFLOW_IMAGE_REPOSITORY="https://github.com/apache/airflow"
 
 # By increasing this number we can do force build of all dependencies
-ARG DEPENDENCIES_EPOCH_NUMBER="9"
+ARG DEPENDENCIES_EPOCH_NUMBER="10"
 
 # Make sure noninteractive debian install is used and language variables set
 ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE} \
diff --git a/dev/breeze/README.md b/dev/breeze/README.md
index 7751502524..12278aaf9a 100644
--- a/dev/breeze/README.md
+++ b/dev/breeze/README.md
@@ -52,6 +52,6 @@ PLEASE DO NOT MODIFY THE HASH BELOW! IT IS AUTOMATICALLY UPDATED BY PRE-COMMIT.
 
 ---------------------------------------------------------------------------------------------------------
 
-Package config hash: 9d095d522c9f6fcf0c5834fcdc050bc98231d17fad07ec054c4e437580129d547b693b66b61442757f81fc1a505483da5267cc973dbf86babba7cd2c11697708
+Package config hash: 782a39916ea95eedd0cd81f76c9dbf3bbb5cbdc5c03271621a8dd3805324ee6868fbead2b95ac653d9efea0225db85de46b17c6f0e3b07923c7d18de666d236e
 
 ---------------------------------------------------------------------------------------------------------
diff --git a/dev/breeze/setup.cfg b/dev/breeze/setup.cfg
index 735962b801..95b8df62e3 100644
--- a/dev/breeze/setup.cfg
+++ b/dev/breeze/setup.cfg
@@ -56,7 +56,7 @@ install_requires =
     filelock
     inputimeout
     jinja2
-    packaging
+    packaging==23.1
     pendulum
     pre-commit
     psutil
diff --git a/scripts/ci/install_breeze.sh b/scripts/ci/install_breeze.sh
index e73f6c28b6..7a0e7a927b 100755
--- a/scripts/ci/install_breeze.sh
+++ b/scripts/ci/install_breeze.sh
@@ -19,6 +19,6 @@ set -euxo pipefail
 
 cd "$( dirname "${BASH_SOURCE[0]}" )/../../"
 
-python -m pip install pipx
+python -m pip install pipx packaging==23.1
 python -m pipx install --editable ./dev/breeze/ --force
 echo '/home/runner/.local/bin' >> "${GITHUB_PATH}"


[airflow] 29/44: Fix non deterministic datetime deserialization (#34492)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9ecdff04471508345e6e23a5f99e5b4c8c97aee6
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Thu Sep 28 09:31:06 2023 +0200

    Fix non deterministic datetime deserialization (#34492)
    
    tzname() does not return full timezone names, and the shorthand notations it returns are not deterministic. This change makes the serialization deterministic and adds logic to handle previously serialized shorthand US timezones and CEST.
    
    ---------
    
    Co-authored-by: bolkedebruin <bo...@users.noreply.github.com>
    (cherry picked from commit a3c06c02e31cc77b2c19554892b72ed91b8387de)
---
 airflow/serialization/serializers/datetime.py      | 32 +++++++++--
 .../serialization/serializers/test_serializers.py  | 62 +++++++++++++++++++++-
 2 files changed, 88 insertions(+), 6 deletions(-)

diff --git a/airflow/serialization/serializers/datetime.py b/airflow/serialization/serializers/datetime.py
index bdb9a6cb6c..49f0899a59 100644
--- a/airflow/serialization/serializers/datetime.py
+++ b/airflow/serialization/serializers/datetime.py
@@ -19,6 +19,10 @@ from __future__ import annotations
 
 from typing import TYPE_CHECKING
 
+from airflow.serialization.serializers.timezone import (
+    deserialize as deserialize_timezone,
+    serialize as serialize_timezone,
+)
 from airflow.utils.module_loading import qualname
 from airflow.utils.timezone import convert_to_utc, is_naive
 
@@ -27,7 +31,7 @@ if TYPE_CHECKING:
 
     from airflow.serialization.serde import U
 
-__version__ = 1
+__version__ = 2
 
 serializers = ["datetime.date", "datetime.datetime", "datetime.timedelta", "pendulum.datetime.DateTime"]
 deserializers = serializers
@@ -44,7 +48,7 @@ def serialize(o: object) -> tuple[U, str, int, bool]:
         if is_naive(o):
             o = convert_to_utc(o)
 
-        tz = o.tzname()
+        tz = serialize_timezone(o.tzinfo)
 
         return {TIMESTAMP: o.timestamp(), TIMEZONE: tz}, qn, __version__, True
 
@@ -61,13 +65,31 @@ def deserialize(classname: str, version: int, data: dict | str) -> datetime.date
     import datetime
 
     from pendulum import DateTime
-    from pendulum.tz import timezone
+    from pendulum.tz import fixed_timezone, timezone
+
+    tz: datetime.tzinfo | None = None
+    if isinstance(data, dict) and TIMEZONE in data:
+        if version == 1:
+            # try to deserialize unsupported timezones
+            timezone_mapping = {
+                "EDT": fixed_timezone(-4 * 3600),
+                "CDT": fixed_timezone(-5 * 3600),
+                "MDT": fixed_timezone(-6 * 3600),
+                "PDT": fixed_timezone(-7 * 3600),
+                "CEST": timezone("CET"),
+            }
+            if data[TIMEZONE] in timezone_mapping:
+                tz = timezone_mapping[data[TIMEZONE]]
+            else:
+                tz = timezone(data[TIMEZONE])
+        else:
+            tz = deserialize_timezone(data[TIMEZONE][1], data[TIMEZONE][2], data[TIMEZONE][0])
 
     if classname == qualname(datetime.datetime) and isinstance(data, dict):
-        return datetime.datetime.fromtimestamp(float(data[TIMESTAMP]), tz=timezone(data[TIMEZONE]))
+        return datetime.datetime.fromtimestamp(float(data[TIMESTAMP]), tz=tz)
 
     if classname == qualname(DateTime) and isinstance(data, dict):
-        return DateTime.fromtimestamp(float(data[TIMESTAMP]), tz=timezone(data[TIMEZONE]))
+        return DateTime.fromtimestamp(float(data[TIMESTAMP]), tz=tz)
 
     if classname == qualname(datetime.timedelta) and isinstance(data, (str, float)):
         return datetime.timedelta(seconds=float(data))
diff --git a/tests/serialization/serializers/test_serializers.py b/tests/serialization/serializers/test_serializers.py
index e9805d4d77..1d19760417 100644
--- a/tests/serialization/serializers/test_serializers.py
+++ b/tests/serialization/serializers/test_serializers.py
@@ -32,7 +32,6 @@ from airflow.serialization.serde import DATA, deserialize, serialize
 class TestSerializers:
     def test_datetime(self):
         i = datetime.datetime(2022, 7, 10, 22, 10, 43, microsecond=0, tzinfo=pendulum.tz.UTC)
-
         s = serialize(i)
         d = deserialize(s)
         assert i.timestamp() == d.timestamp()
@@ -52,6 +51,67 @@ class TestSerializers:
         d = deserialize(s)
         assert i == d
 
+        i = datetime.datetime(
+            2022, 7, 10, 22, 10, 43, microsecond=0, tzinfo=pendulum.timezone("America/New_York")
+        )
+        s = serialize(i)
+        d = deserialize(s)
+        assert i.timestamp() == d.timestamp()
+
+        i = DateTime(2022, 7, 10, tzinfo=pendulum.timezone("America/New_York"))
+        s = serialize(i)
+        d = deserialize(s)
+        assert i.timestamp() == d.timestamp()
+
+    def test_deserialize_datetime_v1(self):
+
+        s = {
+            "__classname__": "pendulum.datetime.DateTime",
+            "__version__": 1,
+            "__data__": {"timestamp": 1657505443.0, "tz": "UTC"},
+        }
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "UTC"
+
+        s["__data__"]["tz"] = "Europe/Paris"
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "Europe/Paris"
+
+        s["__data__"]["tz"] = "America/New_York"
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "America/New_York"
+
+        s["__data__"]["tz"] = "EDT"
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "-04:00"
+        # assert that it's serializable with the new format
+        assert deserialize(serialize(d)) == d
+
+        s["__data__"]["tz"] = "CDT"
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "-05:00"
+        # assert that it's serializable with the new format
+        assert deserialize(serialize(d)) == d
+
+        s["__data__"]["tz"] = "MDT"
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "-06:00"
+        # assert that it's serializable with the new format
+        assert deserialize(serialize(d)) == d
+
+        s["__data__"]["tz"] = "PDT"
+        d = deserialize(s)
+        assert d.timestamp() == 1657505443.0
+        assert d.tzinfo.name == "-07:00"
+        # assert that it's serializable with the new format
+        assert deserialize(serialize(d)) == d
+
     @pytest.mark.parametrize(
         "expr, expected",
         [("1", "1"), ("52e4", "520000"), ("2e0", "2"), ("12e-2", "0.12"), ("12.34", "12.34")],


[airflow] 20/44: Fix is_parent_mapped value by checking if any of the parent tg is mapped (#34587)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ba1b1da078da3e3037e95fe0a393eb58d5d7c8ba
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Mon Sep 25 18:26:32 2023 +0200

    Fix is_parent_mapped value by checking if any of the parent tg is mapped (#34587)
    
    (cherry picked from commit 97916ba45ccf73185a5fbf50270a493369da0344)
---
 airflow/www/views.py | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index 390b2396fb..ba275416d2 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -425,9 +425,14 @@ def dag_to_grid(dag: DagModel, dag_runs: Sequence[DagRun], session: Session):
                 **setup_teardown_type,
             }
 
+        def check_group_is_mapped(tg: TaskGroup | None) -> bool:
+            if tg is None:
+                return False
+            return isinstance(tg, MappedTaskGroup) or check_group_is_mapped(tg.parent_group)
+
         # Task Group
         task_group = item
-        group_is_mapped = isinstance(task_group, MappedTaskGroup)
+        group_is_mapped = check_group_is_mapped(task_group)
 
         children = [
             task_group_to_grid(child, grouped_tis, is_parent_mapped=group_is_mapped)
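
A toy, self-contained version of the parent-group walk (written iteratively here, where the diff recurses; the classes are simplified stand-ins for Airflow's TaskGroup and MappedTaskGroup):

    class TaskGroup:
        def __init__(self, parent_group=None):
            self.parent_group = parent_group

    class MappedTaskGroup(TaskGroup):
        pass

    def check_group_is_mapped(tg):
        # Mapped-ness is inherited from any ancestor group.
        while tg is not None:
            if isinstance(tg, MappedTaskGroup):
                return True
            tg = tg.parent_group
        return False

    root = MappedTaskGroup()
    child = TaskGroup(parent_group=root)
    assert check_group_is_mapped(child)
    assert not check_group_is_mapped(TaskGroup())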


[airflow] 22/44: Clarify what landing time means in doc (#34608)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bb5fbab13dd353f8ec196e0f9668f84a6542cd5b
Author: Daniel Standish <15...@users.noreply.github.com>
AuthorDate: Mon Sep 25 14:52:02 2023 -0700

    Clarify what landing time means in doc (#34608)
    
    (cherry picked from commit f99e65b4f3df71baf2fdf738643bbda1263e15a0)
---
 docs/apache-airflow/ui.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/apache-airflow/ui.rst b/docs/apache-airflow/ui.rst
index 87f20d7a88..7afe26cf3d 100644
--- a/docs/apache-airflow/ui.rst
+++ b/docs/apache-airflow/ui.rst
@@ -166,9 +166,9 @@ DAG over many runs.
 
 Landing Times
 .............
-Airflow landing times are calculated from the task's scheduled time to
-the time the task finishes, either with success or another state (see
-:ref:`concepts:task-instances`).
+
+The landing time for a task instance is the delta between the dag run's data interval end
+(typically this means when the dag "should" run) and the task instance completion time.
 
 ------------
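
Under the clarified definition, a landing time is a plain delta. A sketch with invented timestamps:

    from datetime import datetime, timedelta, timezone

    data_interval_end = datetime(2023, 9, 25, 0, 0, tzinfo=timezone.utc)
    ti_completed_at = datetime(2023, 9, 25, 0, 7, tzinfo=timezone.utc)
    landing_time = ti_completed_at - data_interval_end
    assert landing_time == timedelta(minutes=7)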
 


[airflow] 18/44: Fix: Add 3.11 as supported Python version (#34575)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4dceaed81b5c7339eb9ebe7afc5ae4cdb33b69df
Author: Jan Frederik Léger <ja...@gmail.com>
AuthorDate: Sun Sep 24 21:26:57 2023 +0200

    Fix: Add 3.11 as supported Python version (#34575)
    
    * Fix: Add 3.11 as supported Python version
    
    On the Quick Start page
    Fixes #34574
    
    * Update docs/apache-airflow/start.rst
    
    ---------
    
    Co-authored-by: Hussein Awala <hu...@awala.fr>
    (cherry picked from commit 9b96f76ac820b3dc020286b685a236da842e407c)
---
 docs/apache-airflow/start.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/apache-airflow/start.rst b/docs/apache-airflow/start.rst
index 5ed97ae4d7..ee95f1391d 100644
--- a/docs/apache-airflow/start.rst
+++ b/docs/apache-airflow/start.rst
@@ -24,8 +24,7 @@ This quick start guide will help you bootstrap an Airflow standalone instance on
 
 .. note::
 
-   Successful installation requires a Python 3 environment. Starting with Airflow 2.3.0, Airflow is tested with Python 3.8, 3.9, 3.10.
-   Note that Python 3.11 is not yet supported.
+   Successful installation requires a Python 3 environment. Starting with Airflow 2.7.0, Airflow supports Python 3.8, 3.9, 3.10 and 3.11.
 
    Only ``pip`` installation is currently officially supported.
 
@@ -61,7 +60,8 @@ constraint files to enable reproducible installation, so using ``pip`` and const
 
       AIRFLOW_VERSION=|version|
 
-      # Extract the version of Python you have installed. If you're currently using Python 3.11 you may want to set this manually as noted above, Python 3.11 is not yet supported.
+      # Extract the version of Python you have installed. If you're currently using a Python version that is not supported by Airflow, you may want to set this manually.
+      # See above for supported versions.
       PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
 
       CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"


[airflow] 06/44: Add LocalKubernetesExecutor in the config.yml's executor description (#34414)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e45e663d58817181f78bd72bbe65a6b130b28c8d
Author: Xiaodong DENG <xd...@apache.org>
AuthorDate: Sat Sep 16 10:06:15 2023 -0700

    Add LocalKubernetesExecutor in the config.yml's executor description (#34414)
    
    (cherry picked from commit e64269e63c2f772740d02eea01ea60f83f426fdc)
---
 airflow/config_templates/config.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index 4ff95db84f..caaec5dc1c 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -64,7 +64,7 @@ core:
       description: |
         The executor class that airflow should use. Choices include
         ``SequentialExecutor``, ``LocalExecutor``, ``CeleryExecutor``, ``DaskExecutor``,
-        ``KubernetesExecutor``, ``CeleryKubernetesExecutor`` or the
+        ``KubernetesExecutor``, ``CeleryKubernetesExecutor``, ``LocalKubernetesExecutor`` or the
         full import path to the class when using a custom executor.
       version_added: ~
       type: string


[airflow] 04/44: Update cluster-policies.rst (#34174)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit eefaeecd2e93e9541895da7a2ef1a1ffc67a9a96
Author: Cody Rich <83...@users.noreply.github.com>
AuthorDate: Fri Sep 15 02:49:04 2023 -0400

    Update cluster-policies.rst (#34174)
    
    Removed experimental tag from Pluggy interface. Resolves #34147
    
    (cherry picked from commit 88623acae867c2a9d34f5030809102379080641a)
---
 docs/apache-airflow/administration-and-deployment/cluster-policies.rst | 2 --
 1 file changed, 2 deletions(-)

diff --git a/docs/apache-airflow/administration-and-deployment/cluster-policies.rst b/docs/apache-airflow/administration-and-deployment/cluster-policies.rst
index c664d0b503..de99113231 100644
--- a/docs/apache-airflow/administration-and-deployment/cluster-policies.rst
+++ b/docs/apache-airflow/administration-and-deployment/cluster-policies.rst
@@ -67,8 +67,6 @@ There are two ways to configure cluster policies:
 
    .. versionadded:: 2.6
 
-   .. note:: |experimental|
-
    This method is more advanced and for people who are already comfortable with python packaging.
 
    First create your policy function in a module:


[airflow] 38/44: Add information about drop support MsSQL as DB Backend in the future (#34375)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6e22dbabb3cbacfbc94e6551ede276846e2c9ae3
Author: Andrey Anshin <An...@taragol.is>
AuthorDate: Thu Sep 14 20:58:06 2023 +0400

    Add information about drop support MsSQL as DB Backend in the future (#34375)
    
    (cherry picked from commit a122b5709680e2d208a76bc5d40bd039b29dbd4e)
---
 README.md                                     | 20 +++++++++++---------
 docs/apache-airflow/howto/set-up-database.rst | 12 ++++++++++--
 generated/PYPI_README.md                      | 20 +++++++++++---------
 3 files changed, 32 insertions(+), 20 deletions(-)

diff --git a/README.md b/README.md
index e8c2adc346..ec01fdcccf 100644
--- a/README.md
+++ b/README.md
@@ -89,18 +89,20 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-|             | Main version (dev)     | Stable version (2.7.1) |
-|-------------|------------------------|------------------------|
-| Python      | 3.8, 3.9, 3.10, 3.11   | 3.8, 3.9, 3.10, 3.11   |
-| Platform    | AMD64/ARM64(\*)        | AMD64/ARM64(\*)        |
-| Kubernetes  | 1.24, 1.25, 1.26, 1.27 | 1.24, 1.25, 1.26, 1.27 |
-| PostgreSQL  | 11, 12, 13, 14, 15     | 11, 12, 13, 14, 15     |
-| MySQL       | 5.7, 8.0, 8.1          | 5.7, 8.0, 8.1          |
-| SQLite      | 3.15.0+                | 3.15.0+                |
-| MSSQL       | 2017(\*), 2019(\*)     | 2017(\*), 2019(\*)     |
+|             | Main version (dev)           | Stable version (2.7.1) |
+|-------------|------------------------------|------------------------|
+| Python      | 3.8, 3.9, 3.10, 3.11         | 3.8, 3.9, 3.10, 3.11   |
+| Platform    | AMD64/ARM64(\*)              | AMD64/ARM64(\*)        |
+| Kubernetes  | 1.24, 1.25, 1.26, 1.27, 1.28 | 1.24, 1.25, 1.26, 1.27 |
+| PostgreSQL  | 11, 12, 13, 14, 15           | 11, 12, 13, 14, 15     |
+| MySQL       | 5.7, 8.0, 8.1                | 5.7, 8.0               |
+| SQLite      | 3.15.0+                      | 3.15.0+                |
+| MSSQL       | 2017(\*\*), 2019(\*\*)       | 2017(\*), 2019(\*)     |
 
 \* Experimental
 
+\*\* **Discontinued soon**, not recommended for new installations
+
 **Note**: MySQL 5.x versions are unable to or have limitations with
 running multiple schedulers -- please see the [Scheduler docs](https://airflow.apache.org/docs/apache-airflow/stable/scheduler.html).
 MariaDB is not tested/recommended.
diff --git a/docs/apache-airflow/howto/set-up-database.rst b/docs/apache-airflow/howto/set-up-database.rst
index d9cf82b756..c5cab0a2e5 100644
--- a/docs/apache-airflow/howto/set-up-database.rst
+++ b/docs/apache-airflow/howto/set-up-database.rst
@@ -27,14 +27,14 @@ The document below describes the database engine configurations, the necessary c
 Choosing database backend
 -------------------------
 
-If you want to take a real test drive of Airflow, you should consider setting up a database backend to **PostgreSQL**, **MySQL**, or **MSSQL**.
+If you want to take a real test drive of Airflow, you should consider setting up a database backend to **PostgreSQL** or **MySQL**.
 By default, Airflow uses **SQLite**, which is intended for development purposes only.
 
 Airflow supports the following database engine versions, so make sure which version you have. Old versions may not support all SQL statements.
 
 * PostgreSQL: 11, 12, 13, 14, 15
 * MySQL: 5.7, 8
-* MSSQL (Experimental): 2017, 2019
+* MSSQL (Experimental, **Discontinued soon**): 2017, 2019
 * SQLite: 3.15.0+
 
 If you plan on running more than one scheduler, you have to meet additional requirements.
@@ -323,6 +323,14 @@ In addition, you also should pay particular attention to MySQL's encoding. Altho
 Setting up a MsSQL Database
 ---------------------------
 
+.. warning::
+
+    After `discussion <https://lists.apache.org/thread/r06j306hldg03g2my1pd4nyjxg78b3h4>`__
+    and a `voting process <https://lists.apache.org/thread/pgcgmhf6560k8jbsmz8nlyoxosvltph2>`__,
+    the Airflow PMC and Committers have reached a resolution to no longer maintain MsSQL as a supported Database Backend.
+
+    For new Airflow installations, it is advised against using MsSQL as the database backend.
+
 You need to create a database and a database user that Airflow will use to access this database.
 In the example below, a database ``airflow_db`` and user  with username ``airflow_user`` with password ``airflow_pass`` will be created.
 Note, that in case of MsSQL, Airflow uses ``READ COMMITTED`` transaction isolation and it must have
diff --git a/generated/PYPI_README.md b/generated/PYPI_README.md
index 9bb8767a19..ab1e94f5ad 100644
--- a/generated/PYPI_README.md
+++ b/generated/PYPI_README.md
@@ -47,18 +47,20 @@ Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The
 
 Apache Airflow is tested with:
 
-|             | Main version (dev)     | Stable version (2.7.1) |
-|-------------|------------------------|------------------------|
-| Python      | 3.8, 3.9, 3.10, 3.11   | 3.8, 3.9, 3.10, 3.11   |
-| Platform    | AMD64/ARM64(\*)        | AMD64/ARM64(\*)        |
-| Kubernetes  | 1.24, 1.25, 1.26, 1.27 | 1.24, 1.25, 1.26, 1.27 |
-| PostgreSQL  | 11, 12, 13, 14, 15     | 11, 12, 13, 14, 15     |
-| MySQL       | 5.7, 8.0, 8.1          | 5.7, 8.0, 8.1          |
-| SQLite      | 3.15.0+                | 3.15.0+                |
-| MSSQL       | 2017(\*), 2019(\*)     | 2017(\*), 2019(\*)     |
+|             | Main version (dev)           | Stable version (2.7.1) |
+|-------------|------------------------------|------------------------|
+| Python      | 3.8, 3.9, 3.10, 3.11         | 3.8, 3.9, 3.10, 3.11   |
+| Platform    | AMD64/ARM64(\*)              | AMD64/ARM64(\*)        |
+| Kubernetes  | 1.24, 1.25, 1.26, 1.27, 1.28 | 1.24, 1.25, 1.26, 1.27 |
+| PostgreSQL  | 11, 12, 13, 14, 15           | 11, 12, 13, 14, 15     |
+| MySQL       | 5.7, 8.0, 8.1                | 5.7, 8.0               |
+| SQLite      | 3.15.0+                      | 3.15.0+                |
+| MSSQL       | 2017(\*\*), 2019(\*\*)       | 2017(\*), 2019(\*)     |
 
 \* Experimental
 
+\*\* **Discontinued soon**, not recommended for new installations
+
 **Note**: MySQL 5.x versions are unable to or have limitations with
 running multiple schedulers -- please see the [Scheduler docs](https://airflow.apache.org/docs/apache-airflow/stable/scheduler.html).
 MariaDB is not tested/recommended.


[airflow] 42/44: Support rootless mode for docker. (#34537)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d2436e004685333cd395bc32a59cd13a2639838f
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Fri Sep 22 05:12:11 2023 -0400

    Support rootless mode for docker. (#34537)
    
    When docker runs in rootless mode, the host UID is automatically mapped to
    the root user and the host user id is mapped to 100999 (unknown), so
    changing ownership of created files in rootless mode is problematic: it
    makes the generated files inaccessible.
    
    (cherry picked from commit 0631af86525ad98e90cdc0bf120df7192ea2e912)
---
 dev/breeze/src/airflow_breeze/utils/docker_command_utils.py | 13 +++++++++++++
 scripts/ci/docker-compose/_docker.env                       |  1 +
 scripts/ci/docker-compose/base.yml                          |  1 +
 scripts/ci/docker-compose/devcontainer.env                  |  1 +
 scripts/in_container/_in_container_utils.sh                 |  4 ++++
 5 files changed, 20 insertions(+)

diff --git a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
index b0a5697b18..b7b8041cb5 100644
--- a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
+++ b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
@@ -146,6 +146,16 @@ def get_extra_docker_flags(mount_sources: str, include_mypy_volume: bool = False
     return extra_docker_flags
 
 
+def is_docker_rootless():
+    response = run_command(
+        ["docker", "info", "-f", "{{println .SecurityOptions}}"], capture_output=True, check=True, text=True
+    )
+    if "rootless" in response.stdout.strip():
+        get_console().print("[info]Docker is running in rootless mode.[/]\n")
+        return True
+    return False
+
+
 def check_docker_resources(airflow_image_name: str) -> RunCommandResult:
     """
     Check if we have enough resources to run docker. This is done via running script embedded in our image.
@@ -575,6 +585,7 @@ def update_expected_environment_variables(env: dict[str, str]) -> None:
     set_value_to_default_if_not_set(env, "COLLECT_ONLY", "false")
     set_value_to_default_if_not_set(env, "DB_RESET", "false")
     set_value_to_default_if_not_set(env, "DEFAULT_BRANCH", AIRFLOW_BRANCH)
+    set_value_to_default_if_not_set(env, "DOCKER_IS_ROOTLESS", "false")
     set_value_to_default_if_not_set(env, "ENABLED_SYSTEMS", "")
     set_value_to_default_if_not_set(env, "ENABLE_TEST_COVERAGE", "false")
     set_value_to_default_if_not_set(env, "HELM_TEST_PACKAGE", "")
@@ -710,6 +721,8 @@ def prepare_broker_url(params, env_variables):
 def perform_environment_checks():
     check_docker_is_running()
     check_docker_version()
+    if is_docker_rootless():
+        os.environ["DOCKER_IS_ROOTLESS"] = "true"
     check_docker_compose_version()
 
 
diff --git a/scripts/ci/docker-compose/_docker.env b/scripts/ci/docker-compose/_docker.env
index aa21804e9b..f0efac7768 100644
--- a/scripts/ci/docker-compose/_docker.env
+++ b/scripts/ci/docker-compose/_docker.env
@@ -37,6 +37,7 @@ DB_RESET
 DEFAULT_BRANCH
 DEFAULT_CONSTRAINTS_BRANCH
 DEV_MODE
+DOCKER_IS_ROOTLESS
 ENABLED_SYSTEMS
 ENABLE_TEST_COVERAGE
 GITHUB_ACTIONS
diff --git a/scripts/ci/docker-compose/base.yml b/scripts/ci/docker-compose/base.yml
index 3b7417eab5..f3b4742a31 100644
--- a/scripts/ci/docker-compose/base.yml
+++ b/scripts/ci/docker-compose/base.yml
@@ -48,6 +48,7 @@ services:
       - DEFAULT_BRANCH=${DEFAULT_BRANCH}
       - DEFAULT_CONSTRAINTS_BRANCH=${DEFAULT_CONSTRAINTS_BRANCH}
       - DEV_MODE=${DEV_MODE}
+      - DOCKER_IS_ROOTLESS=${DOCKER_IS_ROOTLESS}
       - ENABLED_SYSTEMS=${ENABLED_SYSTEMS}
       - ENABLE_TEST_COVERAGE=${ENABLE_TEST_COVERAGE}
       - GITHUB_ACTIONS=${GITHUB_ACTIONS}
diff --git a/scripts/ci/docker-compose/devcontainer.env b/scripts/ci/docker-compose/devcontainer.env
index f71a0e1e39..a297d7579c 100644
--- a/scripts/ci/docker-compose/devcontainer.env
+++ b/scripts/ci/docker-compose/devcontainer.env
@@ -35,6 +35,7 @@ DB_RESET="false"
 DEFAULT_BRANCH="main"
 DEFAULT_CONSTRAINTS_BRANCH="constraints-main"
 DEV_MODE="true"
+DOCKER_IS_ROOTLESS="false"
 ENABLED_SYSTEMS=
 ENABLE_TEST_COVERAGE="false"
 GITHUB_ACTIONS="false"
diff --git a/scripts/in_container/_in_container_utils.sh b/scripts/in_container/_in_container_utils.sh
index 2ed267dd54..c962856827 100644
--- a/scripts/in_container/_in_container_utils.sh
+++ b/scripts/in_container/_in_container_utils.sh
@@ -66,6 +66,10 @@ function in_container_script_start() {
 #
 function in_container_fix_ownership() {
     if [[ ${HOST_OS:=} == "linux" ]]; then
+        if [[ ${DOCKER_IS_ROOTLESS=} == "true" ]]; then
+             echo "${COLOR_YELLOW}Skip fixing ownership of generated files: Docker is rootless${COLOR_RESET}"
+             return
+        fi
         DIRECTORIES_TO_FIX=(
             "/dist"
             "/files"


[airflow] 32/44: Fix: make dry run optional for patch task instance (#34568)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c32570f2b473e388a3a8fdc34df92911d06dcd7d
Author: Tyler Calder <ca...@protonmail.com>
AuthorDate: Sat Sep 30 12:46:56 2023 -0600

    Fix: make dry run optional for patch task instance  (#34568)
    
    * fix: Make dry_run optional per docs
    
    This fixes an issue where dry_run is not actually an optional parameter
    in the patch task_instance api.
    
    * chore: remove formatting changes
    
    * fix: Make changes for api docs
    
    This updates the docs and the code so that they are in alignment while
    also being consistent with all other endpoints. All other Endpoints have
    dry run set to be True by default.
    
    * fix: Update static ts file for api change
    
    * fix: Remove dump_default
    
    (cherry picked from commit a4357ca25cc3d014e50968bac7858f533e6421e4)
---
 airflow/api_connexion/openapi/v1.yaml               |  2 +-
 .../api_connexion/schemas/task_instance_schema.py   |  2 +-
 airflow/www/static/js/types/api-generated.ts        |  2 +-
 .../endpoints/test_task_instance_endpoint.py        | 21 +++++++++++++++++++++
 4 files changed, 24 insertions(+), 3 deletions(-)

diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 341b1d78c3..58d5bef9f0 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -4307,7 +4307,7 @@ components:
             If set, don't actually run this operation. The response will contain the task instance
             planned to be affected, but won't be modified in any way.
           type: boolean
-          default: false
+          default: true
 
         new_state:
           description: Expected new state.
diff --git a/airflow/api_connexion/schemas/task_instance_schema.py b/airflow/api_connexion/schemas/task_instance_schema.py
index 1d5fd29665..02dc1fb3f6 100644
--- a/airflow/api_connexion/schemas/task_instance_schema.py
+++ b/airflow/api_connexion/schemas/task_instance_schema.py
@@ -180,7 +180,7 @@ class SetTaskInstanceStateFormSchema(Schema):
 class SetSingleTaskInstanceStateFormSchema(Schema):
     """Schema for handling the request of updating state of a single task instance."""
 
-    dry_run = fields.Boolean(dump_default=True)
+    dry_run = fields.Boolean(load_default=True)
     new_state = TaskInstanceStateField(
         required=True,
         validate=validate.OneOf(
diff --git a/airflow/www/static/js/types/api-generated.ts b/airflow/www/static/js/types/api-generated.ts
index 8dead88fc7..9477c87db3 100644
--- a/airflow/www/static/js/types/api-generated.ts
+++ b/airflow/www/static/js/types/api-generated.ts
@@ -1908,7 +1908,7 @@ export interface components {
        * @description If set, don't actually run this operation. The response will contain the task instance
        * planned to be affected, but won't be modified in any way.
        *
-       * @default false
+       * @default true
        */
       dry_run?: boolean;
       /**
diff --git a/tests/api_connexion/endpoints/test_task_instance_endpoint.py b/tests/api_connexion/endpoints/test_task_instance_endpoint.py
index f09b55cf41..5056f7736d 100644
--- a/tests/api_connexion/endpoints/test_task_instance_endpoint.py
+++ b/tests/api_connexion/endpoints/test_task_instance_endpoint.py
@@ -1772,6 +1772,27 @@ class TestPatchTaskInstance(TestTaskInstanceEndpoint):
         assert response2.status_code == 200
         assert response2.json["state"] == NEW_STATE
 
+    def test_should_update_task_instance_state_default_dry_run_to_true(self, session):
+        self.create_task_instances(session)
+
+        NEW_STATE = "running"
+
+        self.client.patch(
+            self.ENDPOINT_URL,
+            environ_overrides={"REMOTE_USER": "test"},
+            json={
+                "new_state": NEW_STATE,
+            },
+        )
+
+        response2 = self.client.get(
+            self.ENDPOINT_URL,
+            environ_overrides={"REMOTE_USER": "test"},
+            json={},
+        )
+        assert response2.status_code == 200
+        assert response2.json["state"] == NEW_STATE
+
     def test_should_update_mapped_task_instance_state(self, session):
         NEW_STATE = "failed"
         map_index = 1
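
The one-word schema change matters because marshmallow applies load_default when deserializing input, while dump_default only affects serialized output. A minimal sketch assuming marshmallow 3:

    from marshmallow import Schema, fields

    class PatchSchema(Schema):
        dry_run = fields.Boolean(load_default=True)

    # A PATCH body that omits dry_run now defaults to a dry run.
    assert PatchSchema().load({}) == {"dry_run": True}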


[airflow] 17/44: Fix ODBC Connection page formatting (#34572)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b92ec745edf0fab370d0edf4499d2898917e2625
Author: Andrey Anshin <An...@taragol.is>
AuthorDate: Sat Sep 23 08:05:17 2023 +0400

    Fix ODBC Connection page formatting (#34572)
    
    (cherry picked from commit af03051962abf222cc8f05b243451b9675d7ee00)
---
 docs/apache-airflow-providers-odbc/connections/odbc.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow-providers-odbc/connections/odbc.rst b/docs/apache-airflow-providers-odbc/connections/odbc.rst
index 11d382eeb6..3c53870f8e 100644
--- a/docs/apache-airflow-providers-odbc/connections/odbc.rst
+++ b/docs/apache-airflow-providers-odbc/connections/odbc.rst
@@ -73,7 +73,7 @@ Extra (optional)
           this config from env vars, use ``AIRFLOW__PROVIDERS_ODBC__ALLOW_DRIVER_IN_EXTRA=true``.
 
     .. note::
-        If setting ``allow_driver_extra``to True, this allows users to set the driver via the Airflow Connection's
+        If setting ``allow_driver_extra`` to True, this allows users to set the driver via the Airflow Connection's
         ``extra`` field.  By default this is not allowed.  If enabling this functionality, you should make sure
         that you trust the users who can edit connections in the UI to not use it maliciously.
 


[airflow] 26/44: fix connections exported output (#34640)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit af5c86e3df6a8809c0e19e53afcd6441a9b95e5d
Author: Jayden Chiu <57...@users.noreply.github.com>
AuthorDate: Wed Sep 27 05:15:03 2023 -0400

    fix connections exported output (#34640)
    
    (cherry picked from commit a5f5e2fc7f7b7f461458645c8826f015c1fa8d78)
---
 airflow/cli/commands/connection_command.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/cli/commands/connection_command.py b/airflow/cli/commands/connection_command.py
index c00e09087d..2ff7c50ae2 100644
--- a/airflow/cli/commands/connection_command.py
+++ b/airflow/cli/commands/connection_command.py
@@ -203,9 +203,9 @@ def connections_export(args):
         f.write(msg)
 
     if file_is_stdout:
-        print("\nConnections successfully exported.", file=sys.stderr)
+        print(f"\n{len(connections)} connections successfully exported.", file=sys.stderr)
     else:
-        print(f"Connections successfully exported to {args.file.name}.")
+        print(f"{len(connections)} connections successfully exported to {args.file.name}.")
 
 
 alternative_conn_specs = ["conn_type", "conn_host", "conn_login", "conn_password", "conn_schema", "conn_port"]


[airflow] 12/44: Change two whitespaces to one (#34519)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 1480d587047d7946eeff80d6a2c74708f6976a42
Author: Yiannis Hadjicharalambous <ha...@gmail.com>
AuthorDate: Thu Sep 21 16:24:39 2023 +0100

    Change two whitespaces to one (#34519)
    
    (cherry picked from commit a1bd8719581f2ef1fb25aeaa89e3520e8bc81172)
---
 airflow/www/static/js/dag/grid/index.tsx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/www/static/js/dag/grid/index.tsx b/airflow/www/static/js/dag/grid/index.tsx
index a67305994a..638f4268c0 100644
--- a/airflow/www/static/js/dag/grid/index.tsx
+++ b/airflow/www/static/js/dag/grid/index.tsx
@@ -153,7 +153,7 @@ const Grid = ({
           zIndex={2}
           top={-8}
           onClick={onPanelToggle}
-          title={`${isPanelOpen ? "Hide " : "Show "} Details Panel`}
+          title={`${isPanelOpen ? "Hide" : "Show"} Details Panel`}
           aria-label={isPanelOpen ? "Show Details" : "Hide Details"}
           icon={<MdDoubleArrow />}
           transform={isPanelOpen ? undefined : "rotateZ(180deg)"}


[airflow] 35/44: Fixed rows count in the migration script (#34348)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 54f23231be9a518ff6071868c19034b49e18f1b2
Author: Aleksandr Artemenkov <al...@users.noreply.github.com>
AuthorDate: Tue Oct 3 10:33:08 2023 +0300

    Fixed rows count in the migration script (#34348)
    
    * Fixed row count for SQLAlchemy 1.4+
    
    * Updated newsfragments
    
    * Fixed typo
    
    * Added newline
    
    * Added test for `check_bad_references`
    
    (cherry picked from commit f349fda125c2251ac4129c2c28fbf6f7dbb69294)
---
 airflow/utils/db.py            |  2 +-
 newsfragments/34348.bugfix.rst |  1 +
 tests/utils/test_db.py         | 86 +++++++++++++++++++++++++++++++++++++++++-
 3 files changed, 86 insertions(+), 3 deletions(-)

diff --git a/airflow/utils/db.py b/airflow/utils/db.py
index 34cd1c1bb4..80dd788688 100644
--- a/airflow/utils/db.py
+++ b/airflow/utils/db.py
@@ -1446,7 +1446,7 @@ def check_bad_references(session: Session) -> Iterable[str]:
 
         dangling_table_name = _format_airflow_moved_table_name(source_table.name, change_version, "dangling")
         if dangling_table_name in existing_table_names:
-            invalid_row_count = bad_rows_query.count()
+            invalid_row_count = get_query_count(bad_rows_query, session=session)
             if invalid_row_count:
                 yield _format_dangling_error(
                     source_table=source_table.name,
diff --git a/newsfragments/34348.bugfix.rst b/newsfragments/34348.bugfix.rst
new file mode 100644
index 0000000000..c9f27e42f2
--- /dev/null
+++ b/newsfragments/34348.bugfix.rst
@@ -0,0 +1 @@
+Fixed ``AttributeError: 'Select' object has no attribute 'count'`` during the ``airflow db migrate`` command
diff --git a/tests/utils/test_db.py b/tests/utils/test_db.py
index 915e4b5238..aa4ad1d89d 100644
--- a/tests/utils/test_db.py
+++ b/tests/utils/test_db.py
@@ -31,13 +31,15 @@ from alembic.config import Config
 from alembic.migration import MigrationContext
 from alembic.runtime.environment import EnvironmentContext
 from alembic.script import ScriptDirectory
-from sqlalchemy import MetaData
+from sqlalchemy import MetaData, Table
+from sqlalchemy.sql import Select
 
 from airflow.exceptions import AirflowException
 from airflow.models import Base as airflow_base
 from airflow.settings import engine
 from airflow.utils.db import (
     _get_alembic_config,
+    check_bad_references,
     check_migrations,
     compare_server_default,
     compare_type,
@@ -49,6 +51,7 @@ from airflow.utils.db import (
     resetdb,
     upgradedb,
 )
+from airflow.utils.session import NEW_SESSION
 
 
 class TestDb:
@@ -57,7 +60,7 @@ class TestDb:
 
         airflow.models.import_all_models()
         all_meta_data = MetaData()
-        for (table_name, table) in airflow_base.metadata.tables.items():
+        for table_name, table in airflow_base.metadata.tables.items():
             all_meta_data._add_table(table_name, table.schema, table)
 
         # create diff between database schema and SQLAlchemy model
@@ -251,3 +254,82 @@ class TestDb:
         import airflow
 
         assert config.config_file_name == os.path.join(os.path.dirname(airflow.__file__), "alembic.ini")
+
+    @mock.patch("airflow.utils.db._move_dangling_data_to_new_table")
+    @mock.patch("airflow.utils.db.get_query_count")
+    @mock.patch("airflow.utils.db._dangling_against_task_instance")
+    @mock.patch("airflow.utils.db._dangling_against_dag_run")
+    @mock.patch("airflow.utils.db.reflect_tables")
+    @mock.patch("airflow.utils.db.inspect")
+    def test_check_bad_references(
+        self,
+        mock_inspect: MagicMock,
+        mock_reflect_tables: MagicMock,
+        mock_dangling_against_dag_run: MagicMock,
+        mock_dangling_against_task_instance: MagicMock,
+        mock_get_query_count: MagicMock,
+        mock_move_dangling_data_to_new_table: MagicMock,
+    ):
+        from airflow.models.dagrun import DagRun
+        from airflow.models.renderedtifields import RenderedTaskInstanceFields
+        from airflow.models.taskfail import TaskFail
+        from airflow.models.taskinstance import TaskInstance
+        from airflow.models.taskreschedule import TaskReschedule
+        from airflow.models.xcom import XCom
+
+        mock_session = MagicMock(spec=NEW_SESSION)
+        mock_bind = MagicMock()
+        mock_session.get_bind.return_value = mock_bind
+        task_instance_table = MagicMock(spec=Table)
+        task_instance_table.name = TaskInstance.__tablename__
+        dag_run_table = MagicMock(spec=Table)
+        task_fail_table = MagicMock(spec=Table)
+        task_fail_table.name = TaskFail.__tablename__
+
+        mock_reflect_tables.return_value = MagicMock(
+            tables={
+                DagRun.__tablename__: dag_run_table,
+                TaskInstance.__tablename__: task_instance_table,
+                TaskFail.__tablename__: task_fail_table,
+            }
+        )
+
+        # Simulate that there is a moved `task_instance` table from the
+        # previous run, but no moved `task_fail` table
+        dangling_task_instance_table_name = f"_airflow_moved__2_2__dangling__{task_instance_table.name}"
+        dangling_task_fail_table_name = f"_airflow_moved__2_3__dangling__{task_fail_table.name}"
+        mock_get_table_names = MagicMock(
+            return_value=[
+                TaskInstance.__tablename__,
+                DagRun.__tablename__,
+                TaskFail.__tablename__,
+                dangling_task_instance_table_name,
+            ]
+        )
+        mock_inspect.return_value = MagicMock(
+            get_table_names=mock_get_table_names,
+        )
+        mock_select = MagicMock(spec=Select)
+        mock_dangling_against_dag_run.return_value = mock_select
+        mock_dangling_against_task_instance.return_value = mock_select
+        mock_get_query_count.return_value = 1
+
+        # Should return a single error related to the dangling `task_instance` table
+        errs = list(check_bad_references(session=mock_session))
+        assert len(errs) == 1
+        assert dangling_task_instance_table_name in errs[0]
+
+        mock_reflect_tables.assert_called_once_with(
+            [TaskInstance, TaskReschedule, RenderedTaskInstanceFields, TaskFail, XCom, DagRun, TaskInstance],
+            mock_session,
+        )
+        mock_inspect.assert_called_once_with(mock_bind)
+        mock_get_table_names.assert_called_once()
+        mock_dangling_against_dag_run.assert_called_once_with(
+            mock_session, task_instance_table, dag_run=dag_run_table
+        )
+        mock_get_query_count.assert_called_once_with(mock_select, session=mock_session)
+        mock_move_dangling_data_to_new_table.assert_called_once_with(
+            mock_session, task_fail_table, mock_select, dangling_task_fail_table_name
+        )
+        mock_session.rollback.assert_called_once()
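
The root cause: on SQLAlchemy 1.4+ these queries are built with select(), and a Select statement has no .count() method (that was part of the legacy Query API), hence the AttributeError named in the newsfragment. get_query_count wraps the statement instead of calling a method on it. A minimal sketch of the underlying pattern, assuming SQLAlchemy 1.4+ (an illustration of the idea, not Airflow's exact helper):

    from sqlalchemy import func, select
    from sqlalchemy.orm import Session


    def count_rows(stmt, session: Session) -> int:
        # Wrap the original SELECT in a subquery and count its rows server-side.
        count_stmt = select(func.count()).select_from(stmt.subquery())
        return session.scalar(count_stmt)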


[airflow] 43/44: Avoid WSL2 ones when finding a context for Breeze (#34538)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit df43cbb6c57d40d887cbd2c303f68952a729a5e5
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Fri Sep 22 05:13:50 2023 -0400

    Avoid WSL2 ones when finding a context for Breeze (#34538)
    
    * Avoid WSL2 ones when finding a context for Breeze
    
    * fixup! Avoid WSL2 ones when finding a context for Breeze
    
    ---------
    
    Co-authored-by: Tzu-ping Chung <ur...@gmail.com>
    (cherry picked from commit 2e764fb0978fde33f59918416bd2732294c4bf23)
---
 .../airflow_breeze/utils/docker_command_utils.py   | 38 ++++++++++++++--------
 dev/breeze/tests/test_docker_command_utils.py      | 33 ++++++++++++++++---
 2 files changed, 53 insertions(+), 18 deletions(-)

diff --git a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
index b7b8041cb5..6aae84fb5f 100644
--- a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
+++ b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
@@ -18,6 +18,7 @@
 from __future__ import annotations
 
 import copy
+import json
 import os
 import random
 import re
@@ -822,23 +823,34 @@ def autodetect_docker_context():
 
     :return: name of the docker context to use
     """
-    output = run_command(["docker", "context", "ls", "-q"], capture_output=True, check=False, text=True)
-    if output.returncode != 0:
+    result = run_command(
+        ["docker", "context", "ls", "--format=json"],
+        capture_output=True,
+        check=False,
+        text=True,
+    )
+    if result.returncode != 0:
         get_console().print("[warning]Could not detect docker builder. Using default.[/]")
         return "default"
-    context_list = output.stdout.splitlines()
-    if not context_list:
+    context_json = json.loads(result.stdout)
+    if isinstance(context_json, dict):
+        # In case there is one context it is returned as dict not array of dicts ¯\_(ツ)_/¯
+        context_json = [context_json]
+    known_contexts = {info["Name"]: info for info in context_json}
+    if not known_contexts:
         get_console().print("[warning]Could not detect docker builder. Using default.[/]")
         return "default"
-    elif len(context_list) == 1:
-        get_console().print(f"[info]Using {context_list[0]} as context.[/]")
-        return context_list[0]
-    else:
-        for preferred_context in PREFERRED_CONTEXTS:
-            if preferred_context in context_list:
-                get_console().print(f"[info]Using {preferred_context} as context.[/]")
-                return preferred_context
-    fallback_context = context_list[0]
+    for preferred_context_name in PREFERRED_CONTEXTS:
+        try:
+            context = known_contexts[preferred_context_name]
+        except KeyError:
+            continue
+        # On Windows, some contexts are used for WSL2. We don't want to use those.
+        if context["DockerEndpoint"] == "npipe:////./pipe/dockerDesktopLinuxEngine":
+            continue
+        get_console().print(f"[info]Using {preferred_context_name} as context.[/]")
+        return preferred_context_name
+    fallback_context = next(iter(known_contexts))
     get_console().print(
         f"[warning]Could not use any of the preferred docker contexts {PREFERRED_CONTEXTS}.\n"
         f"Using {fallback_context} as context.[/]"
diff --git a/dev/breeze/tests/test_docker_command_utils.py b/dev/breeze/tests/test_docker_command_utils.py
index 83081ad770..b125fb2bd7 100644
--- a/dev/breeze/tests/test_docker_command_utils.py
+++ b/dev/breeze/tests/test_docker_command_utils.py
@@ -191,19 +191,42 @@ def test_check_docker_compose_version_ok(mock_get_console, mock_run_command):
     )
 
 
+def _fake_ctx(name: str) -> dict[str, str]:
+    return {
+        "Name": name,
+        "DockerEndpoint": f"unix://{name}",
+    }
+
+
 @pytest.mark.parametrize(
     "context_output, selected_context, console_output",
     [
         (
+            json.dumps([_fake_ctx("default")]),
             "default",
+            "[info]Using default as context",
+        ),
+        ("[]", "default", "[warning]Could not detect docker builder"),
+        (
+            json.dumps([_fake_ctx("a"), _fake_ctx("b")]),
+            "a",
+            "[warning]Could not use any of the preferred docker contexts",
+        ),
+        (
+            json.dumps([_fake_ctx("a"), _fake_ctx("desktop-linux")]),
+            "desktop-linux",
+            "[info]Using desktop-linux as context",
+        ),
+        (
+            json.dumps([_fake_ctx("a"), _fake_ctx("default")]),
             "default",
             "[info]Using default as context",
         ),
-        ("", "default", "[warning]Could not detect docker builder"),
-        ("a\nb", "a", "[warning]Could not use any of the preferred docker contexts"),
-        ("a\ndesktop-linux", "desktop-linux", "[info]Using desktop-linux as context"),
-        ("a\ndefault", "default", "[info]Using default as context"),
-        ("a\ndefault\ndesktop-linux", "desktop-linux", "[info]Using desktop-linux as context"),
+        (
+            json.dumps([_fake_ctx("a"), _fake_ctx("default"), _fake_ctx("desktop-linux")]),
+            "desktop-linux",
+            "[info]Using desktop-linux as context",
+        ),
     ],
 )
 def test_autodetect_docker_context(context_output: str, selected_context: str, console_output: str):
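
Condensed, the new detection flow is: run "docker context ls --format=json", normalize the output to a list of context objects (a single context can come back as a bare JSON object rather than a one-element array), index the contexts by name, then walk the preferred names while skipping any context whose endpoint is the Docker Desktop WSL2 named pipe. A standalone sketch of that flow (the preferred-context order shown here is illustrative):

    import json
    import subprocess

    PREFERRED_CONTEXTS = ["orbstack", "default", "desktop-linux"]  # illustrative order
    WSL2_ENDPOINT = "npipe:////./pipe/dockerDesktopLinuxEngine"


    def autodetect_docker_context() -> str:
        result = subprocess.run(
            ["docker", "context", "ls", "--format=json"],
            capture_output=True,
            text=True,
            check=False,
        )
        if result.returncode != 0:
            return "default"
        contexts = json.loads(result.stdout)
        if isinstance(contexts, dict):
            contexts = [contexts]  # a single context arrives as a bare object
        known = {ctx["Name"]: ctx for ctx in contexts}
        if not known:
            return "default"
        for name in PREFERRED_CONTEXTS:
            ctx = known.get(name)
            if ctx is None or ctx["DockerEndpoint"] == WSL2_ENDPOINT:
                continue  # not present, or a WSL2 pipe we must avoid
            return name
        return next(iter(known))  # fall back to the first context reported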


[airflow] 31/44: Correct docs for multi-value select (#34690)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8a50232e7fae27bd86d7e4071a272cfa7ff5ef44
Author: James Ko <ja...@gmail.com>
AuthorDate: Fri Sep 29 20:06:12 2023 -0400

    Correct docs for multi-value select (#34690)
    
    (cherry picked from commit 8e935861f6bd0529543756d3b750de218aef5462)
---
 docs/apache-airflow/core-concepts/params.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/apache-airflow/core-concepts/params.rst b/docs/apache-airflow/core-concepts/params.rst
index 512057e6d8..e5bf04ae62 100644
--- a/docs/apache-airflow/core-concepts/params.rst
+++ b/docs/apache-airflow/core-concepts/params.rst
@@ -255,11 +255,11 @@ The following features are supported in the Trigger UI Form:
           - | Generates a HTML multi line text field,
             | every line edited will be made into a
             | string array as value.
-          - * | If you add the attribute ``example``
+          - * | If you add the attribute ``examples``
               | with a list, a multi-value select option
               | will be generated instead of a free text field.
             * | ``values_display={"a": "Alpha", "b": "Beta"}``:
-              | For multi-value selects ``example`` you can add
+              | For multi-value selects ``examples`` you can add
               | the attribute ``values_display`` with a dict and
               | map data values to display labels.
             * | If you add the attribute ``items``, a JSON entry
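
For reference, here is how the corrected attribute names would be used on a param (a hedged illustration based on the docs above; the values and labels are made up):

    from airflow.models.param import Param

    params = {
        "letters": Param(
            [],
            type="array",
            # "examples" (not "example") turns the free-text field into a
            # multi-value select in the Trigger UI Form.
            examples=["a", "b", "c"],
            # Optional: map stored values to human-readable display labels.
            values_display={"a": "Alpha", "b": "Beta", "c": "Gamma"},
        ),
    }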


[airflow] 28/44: Restrict `astroid` version < 3 (#34658)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 262e231c167da9b43e0ebb55f38bdd5b7d5330a0
Author: Andrey Anshin <An...@taragol.is>
AuthorDate: Thu Sep 28 04:55:06 2023 +0400

    Restrict `astroid` version < 3 (#34658)
    
    (cherry picked from commit 19450e03f534f63399bf5db2df7690fdd47b09c8)
---
 setup.py | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 3bd81e8391..bec06d628b 100644
--- a/setup.py
+++ b/setup.py
@@ -299,7 +299,9 @@ deprecated_api = [
     "requests>=2.26.0",
 ]
 doc = [
-    "astroid>=2.12.3",
+    # sphinx-autoapi fails with astroid 3.0, see: https://github.com/readthedocs/sphinx-autoapi/issues/407
+    # This was fixed in sphinx-autoapi 3.0; however, that version requires sphinx>=6.1 and we are stuck on 5.x
+    "astroid>=2.12.3, <3.0",
     "checksumdir",
     # click 8.1.4 and 8.1.5 generate mypy errors due to typing issue in the upstream package:
     # https://github.com/pallets/click/issues/2558
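
The specifier "astroid>=2.12.3, <3.0" is a standard PEP 440 range: it accepts any astroid 2.x release from 2.12.3 onwards while excluding the 3.0 line, which the pinned sphinx-autoapi cannot handle. Per the comment, the upper bound can be dropped once sphinx is upgraded to >=6.1 together with sphinx-autoapi 3.0.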


[airflow] 36/44: Import AUTH_REMOTE_USER from FAB in WSGI middleware example (#34721)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 14e01708c32cc23fdf3a963314444ad83f1cfb57
Author: Hussein Awala <hu...@awala.fr>
AuthorDate: Wed Oct 4 11:36:38 2023 +0200

    Import AUTH_REMOTE_USER from FAB in WSGI middleware example (#34721)
    
    (cherry picked from commit feaa5087e6a6b89d9d3ac7eaf9872d5b626bf1ce)
---
 docs/apache-airflow/security/webserver.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/security/webserver.rst b/docs/apache-airflow/security/webserver.rst
index d990afa4d4..bf7991ac8a 100644
--- a/docs/apache-airflow/security/webserver.rst
+++ b/docs/apache-airflow/security/webserver.rst
@@ -113,7 +113,7 @@ and leverage the REMOTE_USER method:
     from typing import Any, Callable
 
     from flask import current_app
-    from airflow.www.fab_security.manager import AUTH_REMOTE_USER
+    from flask_appbuilder.const import AUTH_REMOTE_USER
 
 
     class CustomMiddleware:
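
For context, here is how the corrected import lines up with the rest of the REMOTE_USER example on that docs page, as a minimal webserver_config.py sketch (the hard-coded username is purely illustrative; real middleware would derive it from whatever the fronting proxy provides). Importing the constant straight from flask_appbuilder.const avoids reaching into Airflow's internal fab_security module path:

    from flask import current_app
    from flask_appbuilder.const import AUTH_REMOTE_USER

    AUTH_TYPE = AUTH_REMOTE_USER


    class CustomMiddleware:
        def __init__(self, wsgi_app):
            self.wsgi_app = wsgi_app

        def __call__(self, environ, start_response):
            # Populate REMOTE_USER so the AUTH_REMOTE_USER method can
            # authenticate the request.
            environ["REMOTE_USER"] = "some-user"
            return self.wsgi_app(environ, start_response)


    current_app.wsgi_app = CustomMiddleware(current_app.wsgi_app)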


[airflow] 16/44: Fix screenshot in dynamic task mapping docs (#34566)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e1d41ca98e70d5e16e87834dcab2efca659dea43
Author: Jed Cunningham <66...@users.noreply.github.com>
AuthorDate: Fri Sep 22 18:06:32 2023 -0600

    Fix screenshot in dynamic task mapping docs (#34566)
    
    (cherry picked from commit d2be036ace49f8e5dd98780c75a37106544bf4d6)
---
 docs/apache-airflow/img/mapping-simple-graph.png | Bin 40118 -> 13312 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/apache-airflow/img/mapping-simple-graph.png b/docs/apache-airflow/img/mapping-simple-graph.png
index 36ec7f9bc1..4bd16876c4 100644
Binary files a/docs/apache-airflow/img/mapping-simple-graph.png and b/docs/apache-airflow/img/mapping-simple-graph.png differ


[airflow] 07/44: Docs for triggered_dataset_event (#34410)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f3b44c6813a00f43b307e8d55739419905dca1f7
Author: fritz-astronomer <80...@users.noreply.github.com>
AuthorDate: Sat Sep 16 13:07:05 2023 -0400

    Docs for triggered_dataset_event (#34410)
    
    * add templates reference for triggering_dataset_events and a note to check the templates page on the datasets page
    
    * add working example, correct type of triggering_dataset_events
    
    * explain | first | first
    
    (cherry picked from commit 21610c1d6763e61108fae35585536a5c409d5cbc)
---
 .../authoring-and-scheduling/datasets.rst          | 39 ++++++++++++++++++++++
 docs/apache-airflow/templates-ref.rst              |  4 +++
 docs/spelling_wordlist.txt                         |  2 ++
 3 files changed, 45 insertions(+)

diff --git a/docs/apache-airflow/authoring-and-scheduling/datasets.rst b/docs/apache-airflow/authoring-and-scheduling/datasets.rst
index 605635a4e1..60268bedff 100644
--- a/docs/apache-airflow/authoring-and-scheduling/datasets.rst
+++ b/docs/apache-airflow/authoring-and-scheduling/datasets.rst
@@ -197,3 +197,42 @@ Notes on schedules
 The ``schedule`` parameter to your DAG can take either a list of datasets to consume or a timetable-based option. The two cannot currently be mixed.
 
 When using datasets, in this first release (v2.4) waiting for all datasets in the list to be updated is the only option when multiple datasets are consumed by a DAG. A later release may introduce more fine-grained options allowing for greater flexibility.
+
+Fetching information from a Triggering Dataset Event
+----------------------------------------------------
+
+A triggered DAG can fetch information from the Dataset that triggered it using the ``triggering_dataset_events`` template or parameter.
+See more at :ref:`templates-ref`.
+
+Example:
+
+.. code-block:: python
+
+    example_snowflake_dataset = Dataset("snowflake://my_db.my_schema.my_table")
+
+    with DAG(dag_id="load_snowflake_data", schedule="@hourly", ...):
+        SQLExecuteQueryOperator(
+            task_id="load", conn_id="snowflake_default", outlets=[example_snowflake_dataset], ...
+        )
+
+    with DAG(dag_id="query_snowflake_data", schedule=[example_snowflake_dataset], ...):
+        SQLExecuteQueryOperator(
+            task_id="query",
+            conn_id="snowflake_default",
+            sql="""
+              SELECT *
+              FROM my_db.my_schema.my_table
+              WHERE "updated_at" >= '{{ (triggering_dataset_events.values() | first | first).source_dag_run.data_interval_start }}'
+              AND "updated_at" < '{{ (triggering_dataset_events.values() | first | first).source_dag_run.data_interval_end }}';
+            """,
+        )
+
+        @task
+        def print_triggering_dataset_events(triggering_dataset_events=None):
+            for dataset, dataset_list in triggering_dataset_events.items():
+                print(dataset, dataset_list)
+                print(dataset_list[0].source_dag_run.run_id)
+
+        print_triggering_dataset_events()
+
+Note that this example is using `(.values() | first | first) <https://jinja.palletsprojects.com/en/3.1.x/templates/#jinja-filters.first>`_ to fetch the first of one Dataset given to the DAG, and the first of one DatasetEvent for that Dataset. An implementation may be quite complex if you have multiple Datasets, potentially with multiple DatasetEvents.
diff --git a/docs/apache-airflow/templates-ref.rst b/docs/apache-airflow/templates-ref.rst
index bfb3cad9eb..3cb2f4a93e 100644
--- a/docs/apache-airflow/templates-ref.rst
+++ b/docs/apache-airflow/templates-ref.rst
@@ -74,6 +74,10 @@ Variable                                    Type                  Description
 ``{{ expanded_ti_count }}``                 int | ``None``        | Number of task instances that a mapped task was expanded into. If
                                                                   | the current task is not mapped, this should be ``None``.
                                                                   | Added in version 2.5.
+``{{ triggering_dataset_events }}``         dict[str,             | If in a Dataset Scheduled DAG, a map of Dataset URI to a list of triggering :class:`~airflow.models.dataset.DatasetEvent`
+                                            list[DatasetEvent]]   | (there may be more than one, if there are multiple Datasets with different frequencies).
+                                                                  | Read more here :doc:`Datasets <authoring-and-scheduling/datasets>`.
+                                                                  | Added in version 2.4.
 =========================================== ===================== ===================================================================
 
 .. note::
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index fbca728f8e..f8f613dfc3 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -375,6 +375,8 @@ Dataproc
 dataproc
 Dataset
 dataset
+DatasetEvent
+DatasetEvents
 datasetId
 Datasets
 datasets