Posted to commits@airflow.apache.org by ep...@apache.org on 2023/03/08 13:17:55 UTC

[airflow] branch v2-5-test updated (caca35c546 -> 4e2af12f99)

This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


    from caca35c546 Migrate remaining core sensors tests to `pytest` (#28204)
     new 154ad9a02b Migrate amazon provider sensor tests from `unittests` to `pytest` (#28139)
     new 2ecbfd0118 Replace freezegun with time-machine (#28193)
     new d503914b60 Add deferrable mode to CloudBuildCreateBuildOperator (#27783)
     new 7787fdb10f Fix discoverability of tests for ARM in Breeze (#28432)
     new 3784f2e25b add hostname argument to DockerOperator (#27822)
     new 393bba236e Update codespell and fix typos (#28568)
     new 6f25448868 Improve "other" test category selection (#28630)
     new 56b0b76b3a Variables set in variables.env are automatically exported (#28633)
     new 43b8ea65a2 Rerun flaky PinotDB integration test (#28562)
     new 587b144569 Update black version automatically in pre-commit configuration (#28578)
     new 14fdfc07ac Switch to ruff for faster static checks (#28893)
     new 4e2af12f99 Make static checks generated file  more stable accross the board (#29080)

The 12 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .codespellignorelines                              |   2 +
 .flake8                                            |   8 -
 .github/boring-cyborg.yml                          |   1 -
 .github/workflows/ci.yml                           |   9 +-
 .pre-commit-config.yaml                            |  90 +++----
 .rat-excludes                                      |   1 -
 Dockerfile.ci                                      |   4 +-
 RELEASE_NOTES.rst                                  |   2 +-
 STATIC_CODE_CHECKS.rst                             |  14 +-
 airflow/cli/commands/connection_command.py         |   2 +-
 airflow/compat/functools.pyi                       |   1 +
 airflow/decorators/__init__.pyi                    |  13 +-
 airflow/example_dags/example_sensor_decorator.py   |   1 +
 airflow/example_dags/tutorial_taskflow_api.py      |   1 +
 airflow/hooks/dbapi.py                             |   6 +-
 airflow/migrations/db_types.pyi                    |   1 +
 airflow/providers/amazon/aws/hooks/emr.py          |  10 +-
 airflow/providers/amazon/aws/operators/sns.py      |   2 +-
 .../amazon/aws/transfers/dynamodb_to_s3.py         |   4 +-
 .../providers/cncf/kubernetes/utils/__init__.py    |   2 +
 airflow/providers/docker/operators/docker.py       |   4 +
 .../providers/google/cloud/hooks/cloud_build.py    |  98 +++++++-
 .../google/cloud/operators/cloud_build.py          |  90 +++++--
 .../providers/google/cloud/operators/dataproc.py   |   2 +-
 .../google/cloud/operators/kubernetes_engine.py    |   2 +-
 .../providers/google/cloud/triggers/cloud_build.py | 125 ++++++++++
 airflow/providers/microsoft/azure/hooks/wasb.py    |   2 +-
 airflow/providers/odbc/hooks/odbc.py               |   2 +-
 airflow/providers/sftp/hooks/sftp.py               |   2 +-
 airflow/utils/code_utils.py                        |   2 +-
 airflow/utils/context.pyi                          |   5 +-
 airflow/utils/log/action_logger.py                 |   1 +
 airflow/utils/process_utils.py                     |   4 +-
 .../0002-implement-standalone-python-command.md    |   2 +-
 .../src/airflow_breeze/commands/main_command.py    |   4 +-
 dev/breeze/src/airflow_breeze/global_constants.py  |   9 +-
 dev/breeze/src/airflow_breeze/pre_commit_ids.py    |   9 +-
 .../pre_commit_ids_TEMPLATE.py.jinja2              |   1 +
 .../src/airflow_breeze/utils/selective_checks.py   |  10 +-
 dev/breeze/tests/test_selective_checks.py          |  25 +-
 dev/deprecations/generate_deprecated_dicts.py      | 217 -----------------
 dev/provider_packages/prepare_provider_packages.py |  18 +-
 dev/stats/get_important_pr_candidates.py           |   2 +-
 .../operators/cloud/cloud_build.rst                |  50 ++++
 docs/apache-airflow/extra-packages-ref.rst         |   2 +-
 docs/apache-airflow/img/airflow_erd.sha256         |   2 +-
 docs/build_docs.py                                 |   6 +-
 docs/exts/provider_init_hack.py                    |   4 +-
 docs/spelling_wordlist.txt                         |   3 +
 images/breeze/output-commands-hash.txt             |   2 +-
 images/breeze/output_static-checks.svg             |  20 +-
 images/breeze/output_stop.svg                      |  24 +-
 provider_packages/.flake8                          |   1 -
 pyproject.toml                                     | 108 ++++++++-
 ...elm_lint.py => common_precommit_black_utils.py} |  32 +--
 scripts/ci/pre_commit/common_precommit_utils.py    |   3 +-
 .../pre_commit_check_pre_commit_hooks.py           |  72 ++----
 .../ci/pre_commit/pre_commit_compile_www_assets.py |   3 +-
 scripts/ci/pre_commit/pre_commit_flake8.py         |  72 ------
 scripts/ci/pre_commit/pre_commit_insert_extras.py  |   4 +-
 .../ci/pre_commit/pre_commit_local_yml_mounts.py   |  18 +-
 scripts/ci/pre_commit/pre_commit_mypy.py           |  13 +-
 .../pre_commit/pre_commit_update_black_version.py  |  28 +--
 scripts/docker/entrypoint_ci.sh                    |   4 +-
 scripts/in_container/configure_environment.sh      |   2 +
 scripts/in_container/run_extract_tests.sh          |  25 --
 scripts/in_container/run_flake8.sh                 |  20 --
 scripts/in_container/test_pytest_collection.py     |  66 +++++
 setup.py                                           |  12 +-
 tests/api/client/test_local_client.py              |   4 +-
 tests/api_connexion/endpoints/test_dag_endpoint.py |  15 --
 .../endpoints/test_dag_run_endpoint.py             |   4 +-
 tests/api_connexion/schemas/test_dataset_schema.py |   4 +-
 tests/conftest.py                                  |  20 +-
 tests/core/test_sentry.py                          |   4 +-
 tests/dag_processing/test_manager.py               |   6 +-
 tests/executors/test_celery_executor.py            |   6 +-
 .../providers/apache/pinot/hooks/test_pinot.py     |   2 +
 tests/jobs/test_scheduler_job.py                   |   4 +-
 tests/models/test_dag.py                           |   6 +-
 tests/models/test_dagbag.py                        |  14 +-
 tests/models/test_taskinstance.py                  |  18 +-
 tests/models/test_timestamp.py                     |   6 +-
 tests/operators/test_datetime.py                   |  16 +-
 tests/operators/test_generic_transfer.py           |   6 +-
 tests/operators/test_latest_only_operator.py       |   4 +-
 tests/operators/test_weekday.py                    |  10 +-
 tests/providers/amazon/aws/hooks/test_eks.py       |  10 +-
 tests/providers/amazon/aws/sensors/test_athena.py  |  11 +-
 tests/providers/amazon/aws/sensors/test_batch.py   |  49 ++--
 .../amazon/aws/sensors/test_cloud_formation.py     |   8 +-
 .../providers/amazon/aws/sensors/test_dms_task.py  |   5 +-
 tests/providers/amazon/aws/sensors/test_eks.py     |  12 +-
 .../providers/amazon/aws/sensors/test_emr_base.py  |   4 +-
 .../amazon/aws/sensors/test_emr_containers.py      |   5 +-
 .../amazon/aws/sensors/test_emr_job_flow.py        |  10 +-
 .../providers/amazon/aws/sensors/test_emr_step.py  |  10 +-
 tests/providers/amazon/aws/sensors/test_glacier.py |   7 +-
 tests/providers/amazon/aws/sensors/test_glue.py    |   9 +-
 .../amazon/aws/sensors/test_glue_crawler.py        |  15 +-
 .../amazon/aws/sensors/test_quicksight.py          |  51 ++--
 tests/providers/amazon/aws/sensors/test_s3_key.py  |  16 +-
 .../amazon/aws/sensors/test_s3_keys_unchanged.py   |  62 +++--
 .../amazon/aws/sensors/test_sagemaker_base.py      |   4 +-
 .../amazon/aws/sensors/test_sagemaker_endpoint.py  |   3 +-
 .../amazon/aws/sensors/test_sagemaker_training.py  |   3 +-
 .../amazon/aws/sensors/test_sagemaker_transform.py |   3 +-
 .../amazon/aws/sensors/test_sagemaker_tuning.py    |   3 +-
 tests/providers/amazon/aws/sensors/test_sqs.py     |   5 +-
 .../amazon/aws/sensors/test_step_function.py       |  10 +-
 .../amazon/aws/utils/test_eks_get_token.py         |   4 +-
 .../apache/hive/transfers/test_mssql_to_hive.py    |  12 +-
 .../apache/hive/transfers/test_mysql_to_hive.py    |   8 +-
 tests/providers/docker/operators/test_docker.py    |   7 +
 .../elasticsearch/log/test_es_task_handler.py      |  15 +-
 .../google/cloud/hooks/test_cloud_build.py         |  42 +++-
 .../google/cloud/operators/test_cloud_build.py     | 265 +++++++++++++++++----
 .../test_cloud_storage_transfer_service.py         |   4 +-
 .../cloud/transfers/test_bigquery_to_mssql.py      |   9 +-
 .../google/cloud/transfers/test_mssql_to_gcs.py    |   8 +-
 .../google/cloud/transfers/test_mysql_to_gcs.py    |  11 +-
 .../google/cloud/triggers/test_cloud_build.py      | 238 ++++++++++++++++++
 .../providers/google/leveldb/hooks/test_leveldb.py |   7 +-
 .../google/leveldb/operators/test_leveldb.py       |  11 +-
 .../providers/google/suite/hooks/test_calendar.py  |   3 +-
 tests/providers/microsoft/azure/hooks/test_asb.py  |   8 +-
 .../microsoft/azure/operators/test_asb.py          |   6 +-
 .../providers/microsoft/mssql/hooks/test_mssql.py  |   6 +-
 .../microsoft/mssql/operators/test_mssql.py        |  10 +-
 tests/providers/mysql/hooks/test_mysql.py          |  11 +-
 .../mysql/transfers/test_vertica_to_mysql.py       |   8 +-
 tests/providers/ssh/hooks/test_ssh.py              |   4 +-
 tests/sensors/test_base.py                         | 107 ++++-----
 tests/sensors/test_time_sensor.py                  |  11 +-
 .../cncf/kubernetes/example_spark_kubernetes.py    |   3 +-
 .../cloud/bigquery/example_bigquery_to_mssql.py    |   8 +-
 .../google/cloud/bigtable/example_bigtable.py      |   6 +-
 .../cloud/cloud_build/example_cloud_build.py       |  18 +-
 ...cloud_build.py => example_cloud_build_async.py} |  44 ++--
 .../cloud_build/example_cloud_build_trigger.py     |  12 +-
 .../google/cloud/gcs/example_mssql_to_gcs.py       |   9 +-
 .../google/cloud/gcs/example_mysql_to_gcs.py       |   9 +-
 .../providers/google/leveldb/example_leveldb.py    |  10 +-
 .../microsoft/azure/example_azure_service_bus.py   |  30 ++-
 .../providers/microsoft/mssql/example_mssql.py     |  10 +-
 tests/test_utils/get_all_tests.py                  |   4 +-
 tests/ti_deps/deps/test_not_in_retry_period_dep.py |   6 +-
 tests/ti_deps/deps/test_runnable_exec_date_dep.py  |   6 +-
 tests/timetables/test_interval_timetable.py        |   8 +-
 tests/timetables/test_trigger_timetable.py         |   6 +-
 tests/utils/log/test_file_processor_handler.py     |   8 +-
 tests/utils/test_serve_logs.py                     |  14 +-
 tests/www/test_security.py                         |  10 +-
 tests/www/views/test_views_grid.py                 |  23 +-
 tests/www/views/test_views_tasks.py                |  17 +-
 155 files changed, 1787 insertions(+), 1159 deletions(-)
 delete mode 100644 .flake8
 create mode 100644 airflow/providers/google/cloud/triggers/cloud_build.py
 delete mode 100644 dev/deprecations/generate_deprecated_dicts.py
 delete mode 120000 provider_packages/.flake8
 copy scripts/ci/pre_commit/{pre_commit_helm_lint.py => common_precommit_black_utils.py} (50%)
 mode change 100755 => 100644
 delete mode 100755 scripts/ci/pre_commit/pre_commit_flake8.py
 copy tests/charts/test_chart_quality.py => scripts/ci/pre_commit/pre_commit_update_black_version.py (57%)
 mode change 100644 => 100755
 delete mode 100755 scripts/in_container/run_extract_tests.sh
 delete mode 100755 scripts/in_container/run_flake8.sh
 create mode 100755 scripts/in_container/test_pytest_collection.py
 create mode 100644 tests/providers/google/cloud/triggers/test_cloud_build.py
 copy tests/system/providers/google/cloud/cloud_build/{example_cloud_build.py => example_cloud_build_async.py} (83%)


[airflow] 01/12: Migrate amazon provider sensor tests from `unittests` to `pytest` (#28139)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 154ad9a02b126c2e09185fb2b1fcd2071699e9c1
Author: Adrian Castro <56...@users.noreply.github.com>
AuthorDate: Tue Dec 6 18:28:41 2022 +0100

    Migrate amazon provider sensor tests from `unittests` to `pytest` (#28139)
    
    (cherry picked from commit b726d8eeb84871fafea3395764815b4ddc0c3216)
---
 tests/providers/amazon/aws/sensors/test_athena.py  | 11 +++--
 tests/providers/amazon/aws/sensors/test_batch.py   | 49 +++++++++------------
 .../amazon/aws/sensors/test_cloud_formation.py     |  8 +---
 .../providers/amazon/aws/sensors/test_dms_task.py  |  5 +--
 tests/providers/amazon/aws/sensors/test_eks.py     | 12 ++---
 .../providers/amazon/aws/sensors/test_emr_base.py  |  4 +-
 .../amazon/aws/sensors/test_emr_containers.py      |  5 +--
 .../amazon/aws/sensors/test_emr_job_flow.py        | 10 ++---
 .../providers/amazon/aws/sensors/test_emr_step.py  | 10 ++---
 tests/providers/amazon/aws/sensors/test_glacier.py |  7 ++-
 tests/providers/amazon/aws/sensors/test_glue.py    |  9 +---
 .../amazon/aws/sensors/test_glue_crawler.py        | 15 +++----
 .../amazon/aws/sensors/test_quicksight.py          | 51 ++++++++++------------
 tests/providers/amazon/aws/sensors/test_s3_key.py  | 16 +++----
 .../amazon/aws/sensors/test_s3_keys_unchanged.py   | 38 ++++++++++------
 .../amazon/aws/sensors/test_sagemaker_base.py      |  4 +-
 .../amazon/aws/sensors/test_sagemaker_endpoint.py  |  3 +-
 .../amazon/aws/sensors/test_sagemaker_training.py  |  3 +-
 .../amazon/aws/sensors/test_sagemaker_transform.py |  3 +-
 .../amazon/aws/sensors/test_sagemaker_tuning.py    |  3 +-
 tests/providers/amazon/aws/sensors/test_sqs.py     |  5 +--
 .../amazon/aws/sensors/test_step_function.py       | 10 ++---
 22 files changed, 123 insertions(+), 158 deletions(-)
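
The diffs below apply the same migration pattern file by file: the test classes stop
subclassing `unittest.TestCase`, `setUp` becomes pytest's `setup_method`,
`self.assertTrue`/`self.assertRaises` become plain `assert` statements and
`pytest.raises(..., match=...)`, and `parameterized.expand` is replaced by
`pytest.mark.parametrize`. As a quick orientation, here is a minimal, self-contained
sketch of that pattern; `SampleSensor` and its statuses are illustrative placeholders,
not classes from the Airflow codebase.

    # Sketch of the unittest -> pytest migration pattern used in this commit.
    # SampleSensor is a hypothetical stand-in, not a real Airflow sensor.
    from __future__ import annotations

    from unittest import mock

    import pytest


    class SampleSensor:
        """Toy sensor: poke() is True once the job has succeeded, raises on failure."""

        def get_status(self) -> str:
            return "RUNNING"

        def poke(self, context: dict) -> bool:
            status = self.get_status()
            if status == "FAILED":
                raise ValueError(f"Sensor failed. Job status: {status}")
            return status == "SUCCEEDED"


    class TestSampleSensor:          # plain class, no unittest.TestCase base
        def setup_method(self):      # replaces unittest's setUp()
            self.sensor = SampleSensor()

        @mock.patch.object(SampleSensor, "get_status", return_value="SUCCEEDED")
        def test_poke_success(self, mock_get_status):
            assert self.sensor.poke({}) is True  # plain assert instead of self.assertTrue

        def test_poke_failed(self):
            with mock.patch.object(SampleSensor, "get_status", return_value="FAILED"):
                # pytest.raises(..., match=...) replaces assertRaises + assertEqual(str(e))
                with pytest.raises(ValueError, match="Job status: FAILED"):
                    self.sensor.poke({})

        # pytest.mark.parametrize replaces parameterized.expand; with a stacked
        # mock.patch the mock argument precedes the parametrized one, as in the
        # test_batch.py change below.
        @pytest.mark.parametrize("status", ["SUBMITTED", "PENDING", "RUNNING"])
        @mock.patch.object(SampleSensor, "get_status")
        def test_poke_intermediate(self, mock_get_status, status):
            mock_get_status.return_value = status
            assert self.sensor.poke({}) is False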

diff --git a/tests/providers/amazon/aws/sensors/test_athena.py b/tests/providers/amazon/aws/sensors/test_athena.py
index c6019296dd..a9809be1d0 100644
--- a/tests/providers/amazon/aws/sensors/test_athena.py
+++ b/tests/providers/amazon/aws/sensors/test_athena.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -27,8 +26,8 @@ from airflow.providers.amazon.aws.hooks.athena import AthenaHook
 from airflow.providers.amazon.aws.sensors.athena import AthenaSensor
 
 
-class TestAthenaSensor(unittest.TestCase):
-    def setUp(self):
+class TestAthenaSensor:
+    def setup_method(self):
         self.sensor = AthenaSensor(
             task_id="test_athena_sensor",
             query_execution_id="abc",
@@ -39,15 +38,15 @@ class TestAthenaSensor(unittest.TestCase):
 
     @mock.patch.object(AthenaHook, "poll_query_status", side_effect=("SUCCEEDED",))
     def test_poke_success(self, mock_poll_query_status):
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
 
     @mock.patch.object(AthenaHook, "poll_query_status", side_effect=("RUNNING",))
     def test_poke_running(self, mock_poll_query_status):
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
 
     @mock.patch.object(AthenaHook, "poll_query_status", side_effect=("QUEUED",))
     def test_poke_queued(self, mock_poll_query_status):
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
 
     @mock.patch.object(AthenaHook, "poll_query_status", side_effect=("FAILED",))
     def test_poke_failed(self, mock_poll_query_status):
diff --git a/tests/providers/amazon/aws/sensors/test_batch.py b/tests/providers/amazon/aws/sensors/test_batch.py
index e7e20d65a2..d7905d563f 100644
--- a/tests/providers/amazon/aws/sensors/test_batch.py
+++ b/tests/providers/amazon/aws/sensors/test_batch.py
@@ -16,11 +16,9 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
-from parameterized import parameterized
 
 from airflow.exceptions import AirflowException
 from airflow.providers.amazon.aws.hooks.batch_client import BatchClientHook
@@ -34,8 +32,8 @@ TASK_ID = "batch_job_sensor"
 JOB_ID = "8222a1c2-b246-4e19-b1b8-0039bb4407c0"
 
 
-class TestBatchSensor(unittest.TestCase):
-    def setUp(self):
+class TestBatchSensor:
+    def setup_method(self):
         self.batch_sensor = BatchSensor(
             task_id="batch_job_sensor",
             job_id=JOB_ID,
@@ -44,45 +42,38 @@ class TestBatchSensor(unittest.TestCase):
     @mock.patch.object(BatchClientHook, "get_job_description")
     def test_poke_on_success_state(self, mock_get_job_description):
         mock_get_job_description.return_value = {"status": "SUCCEEDED"}
-        self.assertTrue(self.batch_sensor.poke({}))
+        assert self.batch_sensor.poke({})
         mock_get_job_description.assert_called_once_with(JOB_ID)
 
     @mock.patch.object(BatchClientHook, "get_job_description")
     def test_poke_on_failure_state(self, mock_get_job_description):
         mock_get_job_description.return_value = {"status": "FAILED"}
-        with self.assertRaises(AirflowException) as e:
+        with pytest.raises(AirflowException, match="Batch sensor failed. AWS Batch job status: FAILED"):
             self.batch_sensor.poke({})
 
-        self.assertEqual("Batch sensor failed. AWS Batch job status: FAILED", str(e.exception))
         mock_get_job_description.assert_called_once_with(JOB_ID)
 
     @mock.patch.object(BatchClientHook, "get_job_description")
     def test_poke_on_invalid_state(self, mock_get_job_description):
         mock_get_job_description.return_value = {"status": "INVALID"}
-        with self.assertRaises(AirflowException) as e:
+        with pytest.raises(
+            AirflowException, match="Batch sensor failed. Unknown AWS Batch job status: INVALID"
+        ):
             self.batch_sensor.poke({})
 
-        self.assertEqual("Batch sensor failed. Unknown AWS Batch job status: INVALID", str(e.exception))
         mock_get_job_description.assert_called_once_with(JOB_ID)
 
-    @parameterized.expand(
-        [
-            ("SUBMITTED",),
-            ("PENDING",),
-            ("RUNNABLE",),
-            ("STARTING",),
-            ("RUNNING",),
-        ]
-    )
+    @pytest.mark.parametrize("job_status", ["SUBMITTED", "PENDING", "RUNNABLE", "STARTING", "RUNNING"])
     @mock.patch.object(BatchClientHook, "get_job_description")
-    def test_poke_on_intermediate_state(self, job_status, mock_get_job_description):
+    def test_poke_on_intermediate_state(self, mock_get_job_description, job_status):
+        print(job_status)
         mock_get_job_description.return_value = {"status": job_status}
-        self.assertFalse(self.batch_sensor.poke({}))
+        assert self.batch_sensor.poke({}) is False
         mock_get_job_description.assert_called_once_with(JOB_ID)
 
 
-class TestBatchComputeEnvironmentSensor(unittest.TestCase):
-    def setUp(self):
+class TestBatchComputeEnvironmentSensor:
+    def setup_method(self):
         self.environment_name = "environment_name"
         self.sensor = BatchComputeEnvironmentSensor(
             task_id="test_batch_compute_environment_sensor",
@@ -104,7 +95,7 @@ class TestBatchComputeEnvironmentSensor(unittest.TestCase):
         mock_batch_client.describe_compute_environments.return_value = {
             "computeEnvironments": [{"status": "VALID"}]
         }
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
         mock_batch_client.describe_compute_environments.assert_called_once_with(
             computeEnvironments=[self.environment_name],
         )
@@ -118,7 +109,7 @@ class TestBatchComputeEnvironmentSensor(unittest.TestCase):
                 }
             ]
         }
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
         mock_batch_client.describe_compute_environments.assert_called_once_with(
             computeEnvironments=[self.environment_name],
         )
@@ -140,8 +131,8 @@ class TestBatchComputeEnvironmentSensor(unittest.TestCase):
         assert "AWS Batch compute environment failed" in str(ctx.value)
 
 
-class TestBatchJobQueueSensor(unittest.TestCase):
-    def setUp(self):
+class TestBatchJobQueueSensor:
+    def setup_method(self):
         self.job_queue = "job_queue"
         self.sensor = BatchJobQueueSensor(
             task_id="test_batch_job_queue_sensor",
@@ -162,7 +153,7 @@ class TestBatchJobQueueSensor(unittest.TestCase):
     def test_poke_no_queue_with_treat_non_existing_as_deleted(self, mock_batch_client):
         self.sensor.treat_non_existing_as_deleted = True
         mock_batch_client.describe_job_queues.return_value = {"jobQueues": []}
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
         mock_batch_client.describe_job_queues.assert_called_once_with(
             jobQueues=[self.job_queue],
         )
@@ -170,7 +161,7 @@ class TestBatchJobQueueSensor(unittest.TestCase):
     @mock.patch.object(BatchClientHook, "client")
     def test_poke_valid(self, mock_batch_client):
         mock_batch_client.describe_job_queues.return_value = {"jobQueues": [{"status": "VALID"}]}
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
         mock_batch_client.describe_job_queues.assert_called_once_with(
             jobQueues=[self.job_queue],
         )
@@ -184,7 +175,7 @@ class TestBatchJobQueueSensor(unittest.TestCase):
                 }
             ]
         }
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
         mock_batch_client.describe_job_queues.assert_called_once_with(
             jobQueues=[self.job_queue],
         )
diff --git a/tests/providers/amazon/aws/sensors/test_cloud_formation.py b/tests/providers/amazon/aws/sensors/test_cloud_formation.py
index 63a54c0bb2..14610df267 100644
--- a/tests/providers/amazon/aws/sensors/test_cloud_formation.py
+++ b/tests/providers/amazon/aws/sensors/test_cloud_formation.py
@@ -63,12 +63,10 @@ class TestCloudFormationCreateStackSensor:
             self.cloudformation_client_mock.describe_stacks.return_value = {
                 "Stacks": [{"StackStatus": "bar"}]
             }
-            with pytest.raises(ValueError) as ctx:
+            with pytest.raises(ValueError, match="Stack foo in bad state: bar"):
                 op = CloudFormationCreateStackSensor(task_id="task", stack_name="foo")
                 op.poke({})
 
-            assert "Stack foo in bad state: bar" == str(ctx.value)
-
 
 class TestCloudFormationDeleteStackSensor:
     task_id = "test_cloudformation_cluster_delete_sensor"
@@ -105,12 +103,10 @@ class TestCloudFormationDeleteStackSensor:
             self.cloudformation_client_mock.describe_stacks.return_value = {
                 "Stacks": [{"StackStatus": "bar"}]
             }
-            with pytest.raises(ValueError) as ctx:
+            with pytest.raises(ValueError, match="Stack foo in bad state: bar"):
                 op = CloudFormationDeleteStackSensor(task_id="task", stack_name="foo")
                 op.poke({})
 
-            assert "Stack foo in bad state: bar" == str(ctx.value)
-
     @mock_cloudformation
     def test_poke_stack_does_not_exist(self):
         op = CloudFormationDeleteStackSensor(task_id="task", stack_name="foo")
diff --git a/tests/providers/amazon/aws/sensors/test_dms_task.py b/tests/providers/amazon/aws/sensors/test_dms_task.py
index e5770593fd..810510c80b 100644
--- a/tests/providers/amazon/aws/sensors/test_dms_task.py
+++ b/tests/providers/amazon/aws/sensors/test_dms_task.py
@@ -16,7 +16,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -26,8 +25,8 @@ from airflow.providers.amazon.aws.hooks.dms import DmsHook
 from airflow.providers.amazon.aws.sensors.dms import DmsTaskCompletedSensor
 
 
-class TestDmsTaskCompletedSensor(unittest.TestCase):
-    def setUp(self):
+class TestDmsTaskCompletedSensor:
+    def setup_method(self):
         self.sensor = DmsTaskCompletedSensor(
             task_id="test_dms_sensor",
             aws_conn_id="aws_default",
diff --git a/tests/providers/amazon/aws/sensors/test_eks.py b/tests/providers/amazon/aws/sensors/test_eks.py
index 1e3e93791e..fa5457f889 100644
--- a/tests/providers/amazon/aws/sensors/test_eks.py
+++ b/tests/providers/amazon/aws/sensors/test_eks.py
@@ -63,7 +63,7 @@ class TestEksClusterStateSensor:
 
     @mock.patch.object(EksHook, "get_cluster_state", return_value=ClusterStates.ACTIVE)
     def test_poke_reached_target_state(self, mock_get_cluster_state, setUp):
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
         mock_get_cluster_state.assert_called_once_with(clusterName=CLUSTER_NAME)
 
     @mock.patch("airflow.providers.amazon.aws.hooks.eks.EksHook.get_cluster_state")
@@ -71,7 +71,7 @@ class TestEksClusterStateSensor:
     def test_poke_reached_pending_state(self, mock_get_cluster_state, setUp, pending_state):
         mock_get_cluster_state.return_value = pending_state
 
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
         mock_get_cluster_state.assert_called_once_with(clusterName=CLUSTER_NAME)
 
     @mock.patch("airflow.providers.amazon.aws.hooks.eks.EksHook.get_cluster_state")
@@ -104,7 +104,7 @@ class TestEksFargateProfileStateSensor:
 
     @mock.patch.object(EksHook, "get_fargate_profile_state", return_value=FargateProfileStates.ACTIVE)
     def test_poke_reached_target_state(self, mock_get_fargate_profile_state, setUp):
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
         mock_get_fargate_profile_state.assert_called_once_with(
             clusterName=CLUSTER_NAME, fargateProfileName=FARGATE_PROFILE_NAME
         )
@@ -114,7 +114,7 @@ class TestEksFargateProfileStateSensor:
     def test_poke_reached_pending_state(self, mock_get_fargate_profile_state, setUp, pending_state):
         mock_get_fargate_profile_state.return_value = pending_state
 
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
         mock_get_fargate_profile_state.assert_called_once_with(
             clusterName=CLUSTER_NAME, fargateProfileName=FARGATE_PROFILE_NAME
         )
@@ -153,7 +153,7 @@ class TestEksNodegroupStateSensor:
 
     @mock.patch.object(EksHook, "get_nodegroup_state", return_value=NodegroupStates.ACTIVE)
     def test_poke_reached_target_state(self, mock_get_nodegroup_state, setUp):
-        assert self.sensor.poke({})
+        assert self.sensor.poke({}) is True
         mock_get_nodegroup_state.assert_called_once_with(
             clusterName=CLUSTER_NAME, nodegroupName=NODEGROUP_NAME
         )
@@ -163,7 +163,7 @@ class TestEksNodegroupStateSensor:
     def test_poke_reached_pending_state(self, mock_get_nodegroup_state, setUp, pending_state):
         mock_get_nodegroup_state.return_value = pending_state
 
-        assert not self.sensor.poke({})
+        assert self.sensor.poke({}) is False
         mock_get_nodegroup_state.assert_called_once_with(
             clusterName=CLUSTER_NAME, nodegroupName=NODEGROUP_NAME
         )
diff --git a/tests/providers/amazon/aws/sensors/test_emr_base.py b/tests/providers/amazon/aws/sensors/test_emr_base.py
index 87650674b7..b0dfd66233 100644
--- a/tests/providers/amazon/aws/sensors/test_emr_base.py
+++ b/tests/providers/amazon/aws/sensors/test_emr_base.py
@@ -17,8 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
-
 import pytest
 
 from airflow.exceptions import AirflowException
@@ -60,7 +58,7 @@ class EmrBaseSensorSubclass(EmrBaseSensor):
         return None
 
 
-class TestEmrBaseSensor(unittest.TestCase):
+class TestEmrBaseSensor:
     def test_poke_returns_true_when_state_is_in_target_states(self):
         operator = EmrBaseSensorSubclass(
             task_id="test_task",
diff --git a/tests/providers/amazon/aws/sensors/test_emr_containers.py b/tests/providers/amazon/aws/sensors/test_emr_containers.py
index b12a69c789..38d7688f66 100644
--- a/tests/providers/amazon/aws/sensors/test_emr_containers.py
+++ b/tests/providers/amazon/aws/sensors/test_emr_containers.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -27,8 +26,8 @@ from airflow.providers.amazon.aws.hooks.emr import EmrContainerHook
 from airflow.providers.amazon.aws.sensors.emr import EmrContainerSensor
 
 
-class TestEmrContainerSensor(unittest.TestCase):
-    def setUp(self):
+class TestEmrContainerSensor:
+    def setup_method(self):
         self.sensor = EmrContainerSensor(
             task_id="test_emrcontainer_sensor",
             virtual_cluster_id="vzwemreks",
diff --git a/tests/providers/amazon/aws/sensors/test_emr_job_flow.py b/tests/providers/amazon/aws/sensors/test_emr_job_flow.py
index 79baee716c..87a80d6a01 100644
--- a/tests/providers/amazon/aws/sensors/test_emr_job_flow.py
+++ b/tests/providers/amazon/aws/sensors/test_emr_job_flow.py
@@ -18,7 +18,7 @@
 from __future__ import annotations
 
 import datetime
-import unittest
+from unittest import mock
 from unittest.mock import MagicMock, patch
 
 import pytest
@@ -188,8 +188,8 @@ DESCRIBE_CLUSTER_TERMINATED_WITH_ERRORS_RETURN = {
 }
 
 
-class TestEmrJobFlowSensor(unittest.TestCase):
-    def setUp(self):
+class TestEmrJobFlowSensor:
+    def setup_method(self):
         # Mock out the emr_client (moto has incorrect response)
         self.mock_emr_client = MagicMock()
 
@@ -216,7 +216,7 @@ class TestEmrJobFlowSensor(unittest.TestCase):
             assert self.mock_emr_client.describe_cluster.call_count == 3
 
             # make sure it was called with the job_flow_id
-            calls = [unittest.mock.call(ClusterId="j-8989898989")]
+            calls = [mock.call(ClusterId="j-8989898989")]
             self.mock_emr_client.describe_cluster.assert_has_calls(calls)
 
     def test_execute_calls_with_the_job_flow_id_until_it_reaches_failed_state_with_exception(self):
@@ -262,5 +262,5 @@ class TestEmrJobFlowSensor(unittest.TestCase):
             assert self.mock_emr_client.describe_cluster.call_count == 3
 
             # make sure it was called with the job_flow_id
-            calls = [unittest.mock.call(ClusterId="j-8989898989")]
+            calls = [mock.call(ClusterId="j-8989898989")]
             self.mock_emr_client.describe_cluster.assert_has_calls(calls)
diff --git a/tests/providers/amazon/aws/sensors/test_emr_step.py b/tests/providers/amazon/aws/sensors/test_emr_step.py
index 1fb5aab378..d053bda97c 100644
--- a/tests/providers/amazon/aws/sensors/test_emr_step.py
+++ b/tests/providers/amazon/aws/sensors/test_emr_step.py
@@ -17,8 +17,8 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from datetime import datetime
+from unittest import mock
 from unittest.mock import MagicMock, patch
 
 import pytest
@@ -142,8 +142,8 @@ DESCRIBE_JOB_STEP_COMPLETED_RETURN = {
 }
 
 
-class TestEmrStepSensor(unittest.TestCase):
-    def setUp(self):
+class TestEmrStepSensor:
+    def setup_method(self):
         self.emr_client_mock = MagicMock()
         self.sensor = EmrStepSensor(
             task_id="test_task",
@@ -170,8 +170,8 @@ class TestEmrStepSensor(unittest.TestCase):
 
             assert self.emr_client_mock.describe_step.call_count == 2
             calls = [
-                unittest.mock.call(ClusterId="j-8989898989", StepId="s-VK57YR1Z9Z5N"),
-                unittest.mock.call(ClusterId="j-8989898989", StepId="s-VK57YR1Z9Z5N"),
+                mock.call(ClusterId="j-8989898989", StepId="s-VK57YR1Z9Z5N"),
+                mock.call(ClusterId="j-8989898989", StepId="s-VK57YR1Z9Z5N"),
             ]
             self.emr_client_mock.describe_step.assert_has_calls(calls)
 
diff --git a/tests/providers/amazon/aws/sensors/test_glacier.py b/tests/providers/amazon/aws/sensors/test_glacier.py
index adac1b358a..20c4156e1b 100644
--- a/tests/providers/amazon/aws/sensors/test_glacier.py
+++ b/tests/providers/amazon/aws/sensors/test_glacier.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -29,8 +28,8 @@ SUCCEEDED = "Succeeded"
 IN_PROGRESS = "InProgress"
 
 
-class TestAmazonGlacierSensor(unittest.TestCase):
-    def setUp(self):
+class TestAmazonGlacierSensor:
+    def setup_method(self):
         self.op = GlacierJobOperationSensor(
             task_id="test_athena_sensor",
             aws_conn_id="aws_default",
@@ -63,7 +62,7 @@ class TestAmazonGlacierSensor(unittest.TestCase):
         assert "Sensor failed" in str(ctx.value)
 
 
-class TestSensorJobDescription(unittest.TestCase):
+class TestSensorJobDescription:
     def test_job_status_success(self):
         assert JobStatus.SUCCEEDED.value == SUCCEEDED
 
diff --git a/tests/providers/amazon/aws/sensors/test_glue.py b/tests/providers/amazon/aws/sensors/test_glue.py
index 1b1239c2cf..c8d593eed4 100644
--- a/tests/providers/amazon/aws/sensors/test_glue.py
+++ b/tests/providers/amazon/aws/sensors/test_glue.py
@@ -16,7 +16,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 from unittest.mock import ANY
 
@@ -28,8 +27,8 @@ from airflow.providers.amazon.aws.hooks.glue import GlueJobHook
 from airflow.providers.amazon.aws.sensors.glue import GlueJobSensor
 
 
-class TestGlueJobSensor(unittest.TestCase):
-    def setUp(self):
+class TestGlueJobSensor:
+    def setup_method(self):
         conf.load_test_config()
 
     @mock.patch.object(GlueJobHook, "print_job_logs")
@@ -142,7 +141,3 @@ class TestGlueJobSensor(unittest.TestCase):
                 filter_pattern="?ERROR ?Exception",
                 next_token=ANY,
             )
-
-
-if __name__ == "__main__":
-    unittest.main()
diff --git a/tests/providers/amazon/aws/sensors/test_glue_crawler.py b/tests/providers/amazon/aws/sensors/test_glue_crawler.py
index 17a2953f54..6a6ee5ae89 100644
--- a/tests/providers/amazon/aws/sensors/test_glue_crawler.py
+++ b/tests/providers/amazon/aws/sensors/test_glue_crawler.py
@@ -16,15 +16,14 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 from airflow.providers.amazon.aws.hooks.glue_crawler import GlueCrawlerHook
 from airflow.providers.amazon.aws.sensors.glue_crawler import GlueCrawlerSensor
 
 
-class TestGlueCrawlerSensor(unittest.TestCase):
-    def setUp(self):
+class TestGlueCrawlerSensor:
+    def setup_method(self):
         self.sensor = GlueCrawlerSensor(
             task_id="test_glue_crawler_sensor",
             crawler_name="aws_test_glue_crawler",
@@ -36,21 +35,17 @@ class TestGlueCrawlerSensor(unittest.TestCase):
     @mock.patch.object(GlueCrawlerHook, "get_crawler")
     def test_poke_success(self, mock_get_crawler):
         mock_get_crawler.return_value["LastCrawl"]["Status"] = "SUCCEEDED"
-        self.assertFalse(self.sensor.poke({}))
+        assert self.sensor.poke({}) is False
         mock_get_crawler.assert_called_once_with("aws_test_glue_crawler")
 
     @mock.patch.object(GlueCrawlerHook, "get_crawler")
     def test_poke_failed(self, mock_get_crawler):
         mock_get_crawler.return_value["LastCrawl"]["Status"] = "FAILED"
-        self.assertFalse(self.sensor.poke({}))
+        assert self.sensor.poke({}) is False
         mock_get_crawler.assert_called_once_with("aws_test_glue_crawler")
 
     @mock.patch.object(GlueCrawlerHook, "get_crawler")
     def test_poke_cancelled(self, mock_get_crawler):
         mock_get_crawler.return_value["LastCrawl"]["Status"] = "CANCELLED"
-        self.assertFalse(self.sensor.poke({}))
+        assert self.sensor.poke({}) is False
         mock_get_crawler.assert_called_once_with("aws_test_glue_crawler")
-
-
-if __name__ == "__main__":
-    unittest.main()
diff --git a/tests/providers/amazon/aws/sensors/test_quicksight.py b/tests/providers/amazon/aws/sensors/test_quicksight.py
index 3dbf5f7778..562a986c88 100644
--- a/tests/providers/amazon/aws/sensors/test_quicksight.py
+++ b/tests/providers/amazon/aws/sensors/test_quicksight.py
@@ -17,21 +17,22 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
+import pytest
+from moto import mock_sts
+from moto.core import DEFAULT_ACCOUNT_ID
+
 from airflow.exceptions import AirflowException
 from airflow.providers.amazon.aws.hooks.quicksight import QuickSightHook
-from airflow.providers.amazon.aws.hooks.sts import StsHook
 from airflow.providers.amazon.aws.sensors.quicksight import QuickSightSensor
 
-AWS_ACCOUNT_ID = "123456789012"
 DATA_SET_ID = "DemoDataSet"
 INGESTION_ID = "DemoDataSet_Ingestion"
 
 
-class TestQuickSightSensor(unittest.TestCase):
-    def setUp(self):
+class TestQuickSightSensor:
+    def setup_method(self):
         self.sensor = QuickSightSensor(
             task_id="test_quicksight_sensor",
             aws_conn_id="aws_default",
@@ -39,40 +40,32 @@ class TestQuickSightSensor(unittest.TestCase):
             ingestion_id="DemoDataSet_Ingestion",
         )
 
+    @mock_sts
     @mock.patch.object(QuickSightHook, "get_status")
-    @mock.patch.object(StsHook, "get_conn")
-    @mock.patch.object(StsHook, "get_account_number")
-    def test_poke_success(self, mock_get_account_number, sts_conn, mock_get_status):
-        mock_get_account_number.return_value = AWS_ACCOUNT_ID
+    def test_poke_success(self, mock_get_status):
         mock_get_status.return_value = "COMPLETED"
-        self.assertTrue(self.sensor.poke({}))
-        mock_get_status.assert_called_once_with(AWS_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
+        assert self.sensor.poke({}) is True
+        mock_get_status.assert_called_once_with(DEFAULT_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
 
+    @mock_sts
     @mock.patch.object(QuickSightHook, "get_status")
-    @mock.patch.object(StsHook, "get_conn")
-    @mock.patch.object(StsHook, "get_account_number")
-    def test_poke_cancelled(self, mock_get_account_number, sts_conn, mock_get_status):
-        mock_get_account_number.return_value = AWS_ACCOUNT_ID
+    def test_poke_cancelled(self, mock_get_status):
         mock_get_status.return_value = "CANCELLED"
-        with self.assertRaises(AirflowException):
+        with pytest.raises(AirflowException):
             self.sensor.poke({})
-        mock_get_status.assert_called_once_with(AWS_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
+        mock_get_status.assert_called_once_with(DEFAULT_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
 
+    @mock_sts
     @mock.patch.object(QuickSightHook, "get_status")
-    @mock.patch.object(StsHook, "get_conn")
-    @mock.patch.object(StsHook, "get_account_number")
-    def test_poke_failed(self, mock_get_account_number, sts_conn, mock_get_status):
-        mock_get_account_number.return_value = AWS_ACCOUNT_ID
+    def test_poke_failed(self, mock_get_status):
         mock_get_status.return_value = "FAILED"
-        with self.assertRaises(AirflowException):
+        with pytest.raises(AirflowException):
             self.sensor.poke({})
-        mock_get_status.assert_called_once_with(AWS_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
+        mock_get_status.assert_called_once_with(DEFAULT_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
 
+    @mock_sts
     @mock.patch.object(QuickSightHook, "get_status")
-    @mock.patch.object(StsHook, "get_conn")
-    @mock.patch.object(StsHook, "get_account_number")
-    def test_poke_initialized(self, mock_get_account_number, sts_conn, mock_get_status):
-        mock_get_account_number.return_value = AWS_ACCOUNT_ID
+    def test_poke_initialized(self, mock_get_status):
         mock_get_status.return_value = "INITIALIZED"
-        self.assertFalse(self.sensor.poke({}))
-        mock_get_status.assert_called_once_with(AWS_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
+        assert self.sensor.poke({}) is False
+        mock_get_status.assert_called_once_with(DEFAULT_ACCOUNT_ID, DATA_SET_ID, INGESTION_ID)
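
The quicksight change above drops the explicit StsHook mocks in favour of moto's
mock_sts, so the sensor's account-number lookup goes through a moto-mocked STS call
and comes back as moto's canned account id (imported here as DEFAULT_ACCOUNT_ID).
A standalone sketch of that idea, assuming moto and boto3 are installed and written
outside any Airflow code:

    # Hypothetical illustration: with mock_sts active, boto3's STS client answers
    # get_caller_identity() from moto instead of AWS, so no per-test StsHook
    # patching is needed.
    import boto3
    from moto import mock_sts
    from moto.core import DEFAULT_ACCOUNT_ID


    @mock_sts
    def caller_account() -> str:
        sts = boto3.client("sts", region_name="us-east-1")
        return sts.get_caller_identity()["Account"]


    assert caller_account() == DEFAULT_ACCOUNT_ID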
diff --git a/tests/providers/amazon/aws/sensors/test_s3_key.py b/tests/providers/amazon/aws/sensors/test_s3_key.py
index cd3c64da46..8d560e2c82 100644
--- a/tests/providers/amazon/aws/sensors/test_s3_key.py
+++ b/tests/providers/amazon/aws/sensors/test_s3_key.py
@@ -17,11 +17,9 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
-from parameterized import parameterized
 
 from airflow.exceptions import AirflowException
 from airflow.models import DAG, DagRun, TaskInstance
@@ -30,7 +28,7 @@ from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
 from airflow.utils import timezone
 
 
-class TestS3KeySensor(unittest.TestCase):
+class TestS3KeySensor:
     def test_bucket_name_none_and_bucket_key_as_relative_path(self):
         """
         Test if exception is raised when bucket_name is None
@@ -81,14 +79,16 @@ class TestS3KeySensor(unittest.TestCase):
         with pytest.raises(TypeError):
             op.poke(None)
 
-    @parameterized.expand(
+    @pytest.mark.parametrize(
+        "key, bucket, parsed_key, parsed_bucket",
         [
-            ["s3://bucket/key", None, "key", "bucket"],
-            ["key", "bucket", "key", "bucket"],
-        ]
+            pytest.param("s3://bucket/key", None, "key", "bucket", id="key as s3url"),
+            pytest.param("key", "bucket", "key", "bucket", id="separate bucket and key"),
+        ],
     )
     @mock.patch("airflow.providers.amazon.aws.sensors.s3.S3Hook.head_object")
-    def test_parse_bucket_key(self, key, bucket, parsed_key, parsed_bucket, mock_head_object):
+    def test_parse_bucket_key(self, mock_head_object, key, bucket, parsed_key, parsed_bucket):
+        print(key, bucket, parsed_key, parsed_bucket)
         mock_head_object.return_value = None
 
         op = S3KeySensor(
diff --git a/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py b/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py
index 0fec724621..251f8d6258 100644
--- a/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py
+++ b/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py
@@ -18,11 +18,10 @@
 from __future__ import annotations
 
 from datetime import datetime
-from unittest import TestCase, mock
+from unittest import mock
 
 import pytest
 from freezegun import freeze_time
-from parameterized import parameterized
 
 from airflow.models.dag import DAG, AirflowException
 from airflow.providers.amazon.aws.sensors.s3 import S3KeysUnchangedSensor
@@ -31,8 +30,8 @@ TEST_DAG_ID = "unit_tests_aws_sensor"
 DEFAULT_DATE = datetime(2015, 1, 1)
 
 
-class TestS3KeysUnchangedSensor(TestCase):
-    def setUp(self):
+class TestS3KeysUnchangedSensor:
+    def setup_method(self):
         self.dag = DAG(f"{TEST_DAG_ID}test_schedule_dag_once", start_date=DEFAULT_DATE, schedule="@once")
 
         self.sensor = S3KeysUnchangedSensor(
@@ -76,17 +75,28 @@ class TestS3KeysUnchangedSensor(TestCase):
         with pytest.raises(AirflowException):
             self.sensor.is_keys_unchanged({"a"})
 
-    @parameterized.expand(
+    @pytest.mark.parametrize(
+        "current_objects, expected_returns, inactivity_periods",
         [
-            # Test: resetting inactivity period after key change
-            (({"a"}, {"a", "b"}, {"a", "b", "c"}), (False, False, False), (0, 0, 0)),
-            # ..and in case an item was deleted with option `allow_delete=True`
-            (({"a", "b"}, {"a"}, {"a", "c"}), (False, False, False), (0, 0, 0)),
-            # Test: passes after inactivity period was exceeded
-            (({"a"}, {"a"}, {"a"}), (False, False, True), (0, 10, 20)),
-            # ..and do not pass if empty key is given
-            ((set(), set(), set()), (False, False, False), (0, 10, 20)),
-        ]
+            pytest.param(
+                ({"a"}, {"a", "b"}, {"a", "b", "c"}),
+                (False, False, False),
+                (0, 0, 0),
+                id="resetting inactivity period after key change",
+            ),
+            pytest.param(
+                ({"a", "b"}, {"a"}, {"a", "c"}),
+                (False, False, False),
+                (0, 0, 0),
+                id="item was deleted with option `allow_delete=True`",
+            ),
+            pytest.param(
+                ({"a"}, {"a"}, {"a"}), (False, False, True), (0, 10, 20), id="inactivity period was exceeded"
+            ),
+            pytest.param(
+                (set(), set(), set()), (False, False, False), (0, 10, 20), id="not pass if empty key is given"
+            ),
+        ],
     )
     @freeze_time(DEFAULT_DATE, auto_tick_seconds=10)
     def test_key_changes(self, current_objects, expected_returns, inactivity_periods):
diff --git a/tests/providers/amazon/aws/sensors/test_sagemaker_base.py b/tests/providers/amazon/aws/sensors/test_sagemaker_base.py
index 6eaa9c18d9..7cfc5c29c4 100644
--- a/tests/providers/amazon/aws/sensors/test_sagemaker_base.py
+++ b/tests/providers/amazon/aws/sensors/test_sagemaker_base.py
@@ -17,15 +17,13 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
-
 import pytest
 
 from airflow.exceptions import AirflowException
 from airflow.providers.amazon.aws.sensors.sagemaker import SageMakerBaseSensor
 
 
-class TestSagemakerBaseSensor(unittest.TestCase):
+class TestSagemakerBaseSensor:
     def test_execute(self):
         class SageMakerBaseSensorSubclass(SageMakerBaseSensor):
             def non_terminal_states(self):
diff --git a/tests/providers/amazon/aws/sensors/test_sagemaker_endpoint.py b/tests/providers/amazon/aws/sensors/test_sagemaker_endpoint.py
index f71183be3a..6f5158c042 100644
--- a/tests/providers/amazon/aws/sensors/test_sagemaker_endpoint.py
+++ b/tests/providers/amazon/aws/sensors/test_sagemaker_endpoint.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -55,7 +54,7 @@ DESCRIBE_ENDPOINT_UPDATING_RESPONSE = {
 }
 
 
-class TestSageMakerEndpointSensor(unittest.TestCase):
+class TestSageMakerEndpointSensor:
     @mock.patch.object(SageMakerHook, "get_conn")
     @mock.patch.object(SageMakerHook, "describe_endpoint")
     def test_sensor_with_failure(self, mock_describe, mock_get_conn):
diff --git a/tests/providers/amazon/aws/sensors/test_sagemaker_training.py b/tests/providers/amazon/aws/sensors/test_sagemaker_training.py
index 3a13384f25..0fc8bb5f52 100644
--- a/tests/providers/amazon/aws/sensors/test_sagemaker_training.py
+++ b/tests/providers/amazon/aws/sensors/test_sagemaker_training.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from datetime import datetime
 from unittest import mock
 
@@ -48,7 +47,7 @@ DESCRIBE_TRAINING_STOPPING_RESPONSE = dict(DESCRIBE_TRAINING_COMPLETED_RESPONSE)
 DESCRIBE_TRAINING_STOPPING_RESPONSE.update({"TrainingJobStatus": "Stopping"})
 
 
-class TestSageMakerTrainingSensor(unittest.TestCase):
+class TestSageMakerTrainingSensor:
     @mock.patch.object(SageMakerHook, "get_conn")
     @mock.patch.object(SageMakerHook, "__init__")
     @mock.patch.object(SageMakerHook, "describe_training_job")
diff --git a/tests/providers/amazon/aws/sensors/test_sagemaker_transform.py b/tests/providers/amazon/aws/sensors/test_sagemaker_transform.py
index c6777165b2..3b4d939e8f 100644
--- a/tests/providers/amazon/aws/sensors/test_sagemaker_transform.py
+++ b/tests/providers/amazon/aws/sensors/test_sagemaker_transform.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -53,7 +52,7 @@ DESCRIBE_TRANSFORM_STOPPING_RESPONSE = {
 }
 
 
-class TestSageMakerTransformSensor(unittest.TestCase):
+class TestSageMakerTransformSensor:
     @mock.patch.object(SageMakerHook, "get_conn")
     @mock.patch.object(SageMakerHook, "describe_transform_job")
     def test_sensor_with_failure(self, mock_describe_job, mock_client):
diff --git a/tests/providers/amazon/aws/sensors/test_sagemaker_tuning.py b/tests/providers/amazon/aws/sensors/test_sagemaker_tuning.py
index d7ff9153e4..b89f1a85d5 100644
--- a/tests/providers/amazon/aws/sensors/test_sagemaker_tuning.py
+++ b/tests/providers/amazon/aws/sensors/test_sagemaker_tuning.py
@@ -17,7 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 
 import pytest
@@ -56,7 +55,7 @@ DESCRIBE_TUNING_STOPPING_RESPONSE = {
 }
 
 
-class TestSageMakerTuningSensor(unittest.TestCase):
+class TestSageMakerTuningSensor:
     @mock.patch.object(SageMakerHook, "get_conn")
     @mock.patch.object(SageMakerHook, "describe_tuning_job")
     def test_sensor_with_failure(self, mock_describe_job, mock_client):
diff --git a/tests/providers/amazon/aws/sensors/test_sqs.py b/tests/providers/amazon/aws/sensors/test_sqs.py
index 7a46cbdbb5..73a3ce8278 100644
--- a/tests/providers/amazon/aws/sensors/test_sqs.py
+++ b/tests/providers/amazon/aws/sensors/test_sqs.py
@@ -18,7 +18,6 @@
 from __future__ import annotations
 
 import json
-import unittest
 from unittest import mock
 
 import pytest
@@ -36,8 +35,8 @@ QUEUE_NAME = "test-queue"
 QUEUE_URL = f"https://{QUEUE_NAME}"
 
 
-class TestSqsSensor(unittest.TestCase):
-    def setUp(self):
+class TestSqsSensor:
+    def setup_method(self):
         args = {"owner": "airflow", "start_date": DEFAULT_DATE}
 
         self.dag = DAG("test_dag_id", default_args=args)
diff --git a/tests/providers/amazon/aws/sensors/test_step_function.py b/tests/providers/amazon/aws/sensors/test_step_function.py
index c3cf62d6a3..d0452dd0c2 100644
--- a/tests/providers/amazon/aws/sensors/test_step_function.py
+++ b/tests/providers/amazon/aws/sensors/test_step_function.py
@@ -17,12 +17,10 @@
 # under the License.
 from __future__ import annotations
 
-import unittest
 from unittest import mock
 from unittest.mock import MagicMock
 
 import pytest
-from parameterized import parameterized
 
 from airflow.exceptions import AirflowException
 from airflow.providers.amazon.aws.sensors.step_function import StepFunctionExecutionSensor
@@ -36,8 +34,8 @@ AWS_CONN_ID = "aws_non_default"
 REGION_NAME = "us-west-2"
 
 
-class TestStepFunctionExecutionSensor(unittest.TestCase):
-    def setUp(self):
+class TestStepFunctionExecutionSensor:
+    def setup_method(self):
         self.mock_context = MagicMock()
 
     def test_init(self):
@@ -50,9 +48,9 @@ class TestStepFunctionExecutionSensor(unittest.TestCase):
         assert AWS_CONN_ID == sensor.aws_conn_id
         assert REGION_NAME == sensor.region_name
 
-    @parameterized.expand([("FAILED",), ("TIMED_OUT",), ("ABORTED",)])
+    @pytest.mark.parametrize("mock_status", ["FAILED", "TIMED_OUT", "ABORTED"])
     @mock.patch("airflow.providers.amazon.aws.sensors.step_function.StepFunctionHook")
-    def test_exceptions(self, mock_status, mock_hook):
+    def test_exceptions(self, mock_hook, mock_status):
         hook_response = {"status": mock_status}
 
         hook_instance = mock_hook.return_value


[airflow] 06/12: Update codespell and fix typos (#28568)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 393bba236e100b74b88254c486a71aded27c1eec
Author: KarshVashi <41...@users.noreply.github.com>
AuthorDate: Sat Dec 24 03:24:48 2022 +0000

    Update codespell and fix typos (#28568)
    
    Co-authored-by: Kaxil Naik <ka...@gmail.com>
    (cherry picked from commit c0a7bf243461bf5e546367094e46eaab41e3831e)
---
 .codespellignorelines                                          | 2 ++
 .pre-commit-config.yaml                                        | 2 +-
 RELEASE_NOTES.rst                                              | 2 +-
 airflow/providers/sftp/hooks/sftp.py                           | 2 +-
 airflow/utils/code_utils.py                                    | 2 +-
 dev/breeze/doc/adr/0002-implement-standalone-python-command.md | 2 +-
 dev/stats/get_important_pr_candidates.py                       | 2 +-
 docs/apache-airflow/extra-packages-ref.rst                     | 2 +-
 docs/spelling_wordlist.txt                                     | 2 ++
 tests/providers/docker/operators/test_docker.py                | 8 ++++----
 10 files changed, 15 insertions(+), 11 deletions(-)

diff --git a/.codespellignorelines b/.codespellignorelines
index d641f0aaaa..4b0179fdc9 100644
--- a/.codespellignorelines
+++ b/.codespellignorelines
@@ -1,3 +1,5 @@
             f"DELETE {source_table} FROM { ', '.join(_from_name(tbl) for tbl in stmt.froms) }"
         for frm in source_query.selectable.froms:
     roles = relationship("Role", secondary=assoc_user_role, backref="user", lazy="selectin")
+    The platform supports **C**reate, **R**ead, **U**pdate, and **D**elete operations on most resources.
+<pre><code>Code block\ndoes not\nrespect\nnewlines\n</code></pre>
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index dae2207a9a..e188a72a5d 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -283,7 +283,7 @@ repos:
          - --line-length
          - '99999'
   - repo: https://github.com/codespell-project/codespell
-    rev: v2.1.0
+    rev: v2.2.2
     hooks:
       - id: codespell
         name: Run codespell to check for common misspellings in files
diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index ff777a75ed..20936bb620 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -8069,7 +8069,7 @@ Improvement
 - [AIRFLOW-3862] Check types with mypy. (#4685)
 - [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA parameter to specify schema for metadata (#4199)
 - [AIRFLOW-1814] Temple PythonOperator {op_args,op_kwargs} fields (#4691)
-- [AIRFLOW-3730] Standarization use of logs mechanisms (#4556)
+- [AIRFLOW-3730] Standardization use of logs mechanisms (#4556)
 - [AIRFLOW-3770] Validation of documentation on CI] (#4593)
 - [AIRFLOW-3866] Run docker-compose pull silently in CI (#4688)
 - [AIRFLOW-3685] Move licence header check (#4497)
diff --git a/airflow/providers/sftp/hooks/sftp.py b/airflow/providers/sftp/hooks/sftp.py
index 66ae50665b..7ba9a8162b 100644
--- a/airflow/providers/sftp/hooks/sftp.py
+++ b/airflow/providers/sftp/hooks/sftp.py
@@ -305,7 +305,7 @@ class SFTPHook(SSHHook):
     ) -> None:
         """
         Recursively descend, depth first, the directory tree rooted at
-        path, calling discreet callback functions for each regular file,
+        path, calling discrete callback functions for each regular file,
         directory and unknown file type.
 
         :param str path:
diff --git a/airflow/utils/code_utils.py b/airflow/utils/code_utils.py
index b5723ce7ef..7783fec0a1 100644
--- a/airflow/utils/code_utils.py
+++ b/airflow/utils/code_utils.py
@@ -56,7 +56,7 @@ def prepare_code_snippet(file_path: str, line_no: int, context_lines_count: int
     """
     Prepare code snippet with line numbers and  a specific line marked.
 
-    :param file_path: File nam
+    :param file_path: File name
     :param line_no: Line number
     :param context_lines_count: The number of lines that will be cut before and after.
     :return: str
diff --git a/dev/breeze/doc/adr/0002-implement-standalone-python-command.md b/dev/breeze/doc/adr/0002-implement-standalone-python-command.md
index b8b56d7588..8aab0db417 100644
--- a/dev/breeze/doc/adr/0002-implement-standalone-python-command.md
+++ b/dev/breeze/doc/adr/0002-implement-standalone-python-command.md
@@ -127,7 +127,7 @@ The main decision is:
 **Vast majority of both Breeze and our CI scripts should be Python-based**
 
 There are likely a number of scripts that will remain in Bash, but they should contain no sophisticated
-logic, they should not haave common code in form of libraries and only used to execute simple tasks inside
+logic, they should not have common code in form of libraries and only used to execute simple tasks inside
 Docker containers. No Bash should ever be used in the host environment.
 
 There are a few properties of Breeze/CI scripts that should be maintained though
diff --git a/dev/stats/get_important_pr_candidates.py b/dev/stats/get_important_pr_candidates.py
index 5e22966b35..28a8081e01 100755
--- a/dev/stats/get_important_pr_candidates.py
+++ b/dev/stats/get_important_pr_candidates.py
@@ -354,7 +354,7 @@ DEFAULT_TOP_PRS = 10
 )
 @click.option("--top-number", type=int, default=DEFAULT_TOP_PRS, help="The number of PRs to select")
 @click.option("--save", type=click.File("wb"), help="Save PR data to a pickle file")
-@click.option("--load", type=click.File("rb"), help="Load PR data from a file and recalcuate scores")
+@click.option("--load", type=click.File("rb"), help="Load PR data from a file and recalculate scores")
 @click.option("--verbose", is_flag="True", help="Print scoring details")
 def main(
     github_token: str,
diff --git a/docs/apache-airflow/extra-packages-ref.rst b/docs/apache-airflow/extra-packages-ref.rst
index e9f4bfb399..6315d31703 100644
--- a/docs/apache-airflow/extra-packages-ref.rst
+++ b/docs/apache-airflow/extra-packages-ref.rst
@@ -298,7 +298,7 @@ These are extras that provide support for integration with external systems via
 Bundle extras
 -------------
 
-These are extras that install one ore more extras as a bundle. Note that these extras should only be used for "development" version
+These are extras that install one or more extras as a bundle. Note that these extras should only be used for "development" version
 of Airflow - i.e. when Airflow is installed from sources. Because of the way how bundle extras are constructed they might not
 work when airflow is installed from 'PyPI`.
 
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 3c3b855133..8bd3aa4b75 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -391,6 +391,7 @@ Decrypt
 decrypt
 decrypted
 Decrypts
+dedented
 deduplicate
 deduplication
 deepcopy
@@ -406,6 +407,7 @@ DependencyMixin
 deploymentUrl
 Deprecations
 deps
+deques
 deregister
 desc
 deserialization
diff --git a/tests/providers/docker/operators/test_docker.py b/tests/providers/docker/operators/test_docker.py
index 0430e0ae2c..02734eede8 100644
--- a/tests/providers/docker/operators/test_docker.py
+++ b/tests/providers/docker/operators/test_docker.py
@@ -101,7 +101,7 @@ class TestDockerOperator:
             host_tmp_dir="/host/airflow",
             container_name="test_container",
             tty=True,
-            hostname="test.contrainer.host",
+            hostname="test.container.host",
             device_requests=[DeviceRequest(count=-1, capabilities=[["gpu"]])],
             log_opts_max_file="5",
             log_opts_max_size="10m",
@@ -128,7 +128,7 @@ class TestDockerOperator:
             entrypoint=["sh", "-c"],
             working_dir="/container/path",
             tty=True,
-            hostname="test.contrainer.host",
+            hostname="test.container.host",
         )
         self.client_mock.create_host_config.assert_called_once_with(
             mounts=[
@@ -185,7 +185,7 @@ class TestDockerOperator:
             shm_size=1000,
             host_tmp_dir="/host/airflow",
             container_name="test_container",
-            hostname="test.contrainer.host",
+            hostname="test.container.host",
             tty=True,
         )
         operator.execute(None)
@@ -204,7 +204,7 @@ class TestDockerOperator:
             entrypoint=["sh", "-c"],
             working_dir="/container/path",
             tty=True,
-            hostname="test.contrainer.host",
+            hostname="test.container.host",
         )
         self.client_mock.create_host_config.assert_called_once_with(
             mounts=[
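
For readers who have not used the `hostname` argument exercised in the test data above: a minimal,
illustrative sketch of passing it to DockerOperator. The task id, image and command values are
placeholders, not taken from any commit, and in a real DAG the operator would be declared inside a
DAG context.

    from airflow.providers.docker.operators.docker import DockerOperator

    # Illustrative only: run a container whose hostname is pinned via the
    # `hostname` argument that DockerOperator now accepts.
    print_hostname = DockerOperator(
        task_id="print_hostname",    # placeholder task id
        image="python:3.9-slim",     # any pullable image works
        command="python -c 'import socket; print(socket.gethostname())'",
        hostname="test.container.host",  # same value as the test data above
    )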


[airflow] 09/12: Rerun flaky PinotDB integration test (#28562)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 43b8ea65a2b3868c849bab856f85f46e9b8f7a67
Author: Andrey Anshin <An...@taragol.is>
AuthorDate: Tue Dec 27 11:21:15 2022 +0400

    Rerun flaky PinotDB integration test (#28562)
    
    (cherry picked from commit fff9fd3a53d239625692e141a996e98db5b8d88f)
---
 setup.py                                                     | 1 -
 tests/integration/providers/apache/pinot/hooks/test_pinot.py | 2 ++
 tests/providers/ssh/hooks/test_ssh.py                        | 4 ++--
 3 files changed, 4 insertions(+), 3 deletions(-)

diff --git a/setup.py b/setup.py
index a132eb861c..622d630c05 100644
--- a/setup.py
+++ b/setup.py
@@ -369,7 +369,6 @@ devel_only = [
     "flake8>=3.9.0",
     "flake8-colors",
     "flake8-implicit-str-concat",
-    "flaky",
     "gitpython",
     "ipdb",
     # make sure that we are using stable sorting order from 5.* version (some changes were introduced
diff --git a/tests/integration/providers/apache/pinot/hooks/test_pinot.py b/tests/integration/providers/apache/pinot/hooks/test_pinot.py
index d99e8efdf4..432521f0ae 100644
--- a/tests/integration/providers/apache/pinot/hooks/test_pinot.py
+++ b/tests/integration/providers/apache/pinot/hooks/test_pinot.py
@@ -26,6 +26,8 @@ from airflow.providers.apache.pinot.hooks.pinot import PinotDbApiHook
 
 @pytest.mark.integration("pinot")
 class TestPinotDbApiHookIntegration:
+    # This test occasionally fails in CI. Re-run it once if it fails after a timeout.
+    @pytest.mark.flaky(reruns=1, reruns_delay=30)
     @mock.patch.dict("os.environ", AIRFLOW_CONN_PINOT_BROKER_DEFAULT="pinot://pinot:8000/")
     def test_should_return_records(self):
         hook = PinotDbApiHook()
diff --git a/tests/providers/ssh/hooks/test_ssh.py b/tests/providers/ssh/hooks/test_ssh.py
index 6448d88efe..7789ee754c 100644
--- a/tests/providers/ssh/hooks/test_ssh.py
+++ b/tests/providers/ssh/hooks/test_ssh.py
@@ -890,7 +890,7 @@ class TestSSHHook:
                 session.delete(conn)
                 session.commit()
 
-    @pytest.mark.flaky(max_runs=5, min_passes=1)
+    @pytest.mark.flaky(reruns=5)
     def test_exec_ssh_client_command(self):
         hook = SSHHook(
             ssh_conn_id="ssh_default",
@@ -907,7 +907,7 @@ class TestSSHHook:
             )
             assert ret == (0, b"airflow\n", b"")
 
-    @pytest.mark.flaky(max_runs=5, min_passes=1)
+    @pytest.mark.flaky(reruns=5)
     def test_command_timeout(self):
         hook = SSHHook(
             ssh_conn_id="ssh_default",
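
The two hunks above also switch the SSH hook tests from the old flaky plugin's `max_runs`/`min_passes`
arguments to the `reruns` style arguments provided by the pytest-rerunfailures plugin, the same style as
the new Pinot marker. A minimal, illustrative sketch of that marker on a hypothetical test (assumes
pytest-rerunfailures is installed):

    import pytest

    # Retry the test up to 2 extra times, waiting 30 seconds between attempts,
    # instead of failing the run on the first timeout.
    @pytest.mark.flaky(reruns=2, reruns_delay=30)
    def test_occasionally_times_out():
        assert True  # placeholder body; a real test would exercise the flaky integration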


[airflow] 03/12: Add deferrable mode to CloudBuildCreateBuildOperator (#27783)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d503914b60a16622733a443540b71975edab1faf
Author: VladaZakharova <80...@users.noreply.github.com>
AuthorDate: Sat Dec 3 11:47:03 2022 +0100

    Add deferrable mode to CloudBuildCreateBuildOperator (#27783)
    
    (cherry picked from commit c931d888936a958ae40b69077d35215227bf1dff)
---
 .../providers/google/cloud/hooks/cloud_build.py    |  98 +++++++-
 .../google/cloud/operators/cloud_build.py          |  90 +++++--
 .../providers/google/cloud/triggers/cloud_build.py | 125 ++++++++++
 .../operators/cloud/cloud_build.rst                |  50 ++++
 .../google/cloud/hooks/test_cloud_build.py         |  42 +++-
 .../google/cloud/operators/test_cloud_build.py     | 265 +++++++++++++++++----
 .../google/cloud/triggers/test_cloud_build.py      | 240 +++++++++++++++++++
 .../cloud/cloud_build/example_cloud_build.py       |  18 +-
 ...cloud_build.py => example_cloud_build_async.py} |  44 ++--
 .../cloud_build/example_cloud_build_trigger.py     |  12 +-
 10 files changed, 890 insertions(+), 94 deletions(-)

diff --git a/airflow/providers/google/cloud/hooks/cloud_build.py b/airflow/providers/google/cloud/hooks/cloud_build.py
index 6ba6fd06e9..0702968ccf 100644
--- a/airflow/providers/google/cloud/hooks/cloud_build.py
+++ b/airflow/providers/google/cloud/hooks/cloud_build.py
@@ -20,10 +20,11 @@ from __future__ import annotations
 
 from typing import Sequence
 
+from google.api_core.exceptions import AlreadyExists
 from google.api_core.gapic_v1.method import DEFAULT, _MethodDefault
 from google.api_core.operation import Operation
 from google.api_core.retry import Retry
-from google.cloud.devtools.cloudbuild import CloudBuildClient
+from google.cloud.devtools.cloudbuild_v1 import CloudBuildAsyncClient, CloudBuildClient, GetBuildRequest
 from google.cloud.devtools.cloudbuild_v1.types import Build, BuildTrigger, RepoSource
 
 from airflow.exceptions import AirflowException
@@ -77,6 +78,14 @@ class CloudBuildHook(GoogleBaseHook):
         except Exception:
             raise AirflowException("Could not retrieve Build ID from Operation.")
 
+    def wait_for_operation(self, operation: Operation, timeout: float | None = None):
+        """Waits for long-lasting operation to complete."""
+        try:
+            return operation.result(timeout=timeout)
+        except Exception:
+            error = operation.exception(timeout=timeout)
+            raise AirflowException(error)
+
     def get_conn(self) -> CloudBuildClient:
         """
         Retrieves the connection to Google Cloud Build.
@@ -123,6 +132,41 @@ class CloudBuildHook(GoogleBaseHook):
 
         return build
 
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_build_without_waiting_for_result(
+        self,
+        build: dict | Build,
+        project_id: str = PROVIDE_PROJECT_ID,
+        retry: Retry | _MethodDefault = DEFAULT,
+        timeout: float | None = None,
+        metadata: Sequence[tuple[str, str]] = (),
+    ) -> tuple[Operation, str]:
+        """
+        Starts a build with the specified configuration without waiting for it to finish.
+
+        :param build: The build resource to create. If a dict is provided, it must be of the same form
+            as the protobuf message `google.cloud.devtools.cloudbuild_v1.types.Build`
+        :param project_id: Optional, the ID of the Google Cloud project in which to start the build.
+            If set to None or missing, the default project_id from the GCP connection is used.
+        :param retry: Optional, a retry object used to retry requests. If `None` is specified, requests
+            will not be retried.
+        :param timeout: Optional, the amount of time, in seconds, to wait for the request to complete.
+            Note that if `retry` is specified, the timeout applies to each individual attempt.
+        :param metadata: Optional, additional metadata that is provided to the method.
+        """
+        client = self.get_conn()
+
+        self.log.info("Start creating build...")
+
+        operation = client.create_build(
+            request={"project_id": project_id, "build": build},
+            retry=retry,
+            timeout=timeout,
+            metadata=metadata,
+        )
+        id_ = self._get_build_id_from_operation(operation)
+        return operation, id_
+
     @GoogleBaseHook.fallback_to_default_project_id
     def create_build(
         self,
@@ -150,7 +194,7 @@ class CloudBuildHook(GoogleBaseHook):
         """
         client = self.get_conn()
 
-        self.log.info("Start creating build.")
+        self.log.info("Start creating build...")
 
         operation = client.create_build(
             request={"project_id": project_id, "build": build},
@@ -195,14 +239,17 @@ class CloudBuildHook(GoogleBaseHook):
         """
         client = self.get_conn()
 
-        self.log.info("Start creating build trigger.")
+        self.log.info("Start creating build trigger...")
 
-        trigger = client.create_build_trigger(
-            request={"project_id": project_id, "trigger": trigger},
-            retry=retry,
-            timeout=timeout,
-            metadata=metadata,
-        )
+        try:
+            trigger = client.create_build_trigger(
+                request={"project_id": project_id, "trigger": trigger},
+                retry=retry,
+                timeout=timeout,
+                metadata=metadata,
+            )
+        except AlreadyExists:
+            raise AirflowException("Cloud Build Trigger with such parameters already exists.")
 
         self.log.info("Build trigger has been created.")
 
@@ -492,7 +539,6 @@ class CloudBuildHook(GoogleBaseHook):
         client = self.get_conn()
 
         self.log.info("Start running build trigger: %s.", trigger_id)
-
         operation = client.run_build_trigger(
             request={"project_id": project_id, "trigger_id": trigger_id, "source": source},
             retry=retry,
@@ -504,7 +550,6 @@ class CloudBuildHook(GoogleBaseHook):
 
         if not wait:
             return self.get_build(id_=id_, project_id=project_id)
-
         operation.result()
 
         self.log.info("Build trigger has been run: %s.", trigger_id)
@@ -550,3 +595,34 @@ class CloudBuildHook(GoogleBaseHook):
         self.log.info("Build trigger has been updated: %s.", trigger_id)
 
         return trigger
+
+
+class CloudBuildAsyncHook(GoogleBaseHook):
+    """Asynchronous Hook for the Google Cloud Build Service."""
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    async def get_cloud_build(
+        self,
+        id_: str,
+        project_id: str = PROVIDE_PROJECT_ID,
+        retry: Retry | _MethodDefault = DEFAULT,
+        timeout: float | None = None,
+        metadata: Sequence[tuple[str, str]] = (),
+    ) -> Build:
+        """Retrieves a Cloud Build with a specified id."""
+        if not id_:
+            raise AirflowException("Google Cloud Build id is required.")
+
+        client = CloudBuildAsyncClient()
+
+        request = GetBuildRequest(
+            project_id=project_id,
+            id=id_,
+        )
+        build_instance = await client.get_build(
+            request=request,
+            retry=retry,
+            timeout=timeout,
+            metadata=metadata,
+        )
+        return build_instance
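
Taken together, the hunks above split build creation from waiting: `create_build_without_waiting_for_result`
returns the long-running operation plus the build id, and `wait_for_operation` blocks on it when a caller
still wants synchronous behaviour. A hedged sketch of composing the two with a valid Google Cloud
connection (the connection id, project id and build body below are illustrative, not from the commit):

    from airflow.providers.google.cloud.hooks.cloud_build import CloudBuildHook

    hook = CloudBuildHook(gcp_conn_id="google_cloud_default")
    build_body = {
        "source": {"repo_source": {"repo_name": "test_repo", "branch_name": "main"}},
        "steps": [{"name": "ubuntu", "args": ["echo", "hello"]}],
    }

    # Start the build and get back the operation handle and the build id ...
    operation, build_id = hook.create_build_without_waiting_for_result(
        build=build_body,
        project_id="my-gcp-project",  # placeholder project
    )
    # ... then block until it finishes (raises AirflowException on failure).
    result = hook.wait_for_operation(operation=operation, timeout=600)
    print(build_id, result.status)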
diff --git a/airflow/providers/google/cloud/operators/cloud_build.py b/airflow/providers/google/cloud/operators/cloud_build.py
index c33fa36c64..33d8f40c88 100644
--- a/airflow/providers/google/cloud/operators/cloud_build.py
+++ b/airflow/providers/google/cloud/operators/cloud_build.py
@@ -37,6 +37,8 @@ from airflow.providers.google.cloud.links.cloud_build import (
     CloudBuildTriggerDetailsLink,
     CloudBuildTriggersListLink,
 )
+from airflow.providers.google.cloud.triggers.cloud_build import CloudBuildCreateBuildTrigger
+from airflow.providers.google.common.consts import GOOGLE_DEFAULT_DEFERRABLE_METHOD_NAME
 from airflow.utils import yaml
 
 if TYPE_CHECKING:
@@ -147,7 +149,13 @@ class CloudBuildCreateBuildOperator(BaseOperator):
         If set as a sequence, the identities from the list must grant
         Service Account Token Creator IAM role to the directly preceding identity, with first
         account from the list granting this role to the originating account (templated).
-
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :param retry: Designation of what errors, if any, should be retried.
+    :param timeout: The timeout for this request.
+    :param metadata: Strings which should be sent along with the request as metadata.
+    :param deferrable: Run the operator in deferrable mode
     """
 
     template_fields: Sequence[str] = ("project_id", "build", "gcp_conn_id", "impersonation_chain")
@@ -164,9 +172,15 @@ class CloudBuildCreateBuildOperator(BaseOperator):
         metadata: Sequence[tuple[str, str]] = (),
         gcp_conn_id: str = "google_cloud_default",
         impersonation_chain: str | Sequence[str] | None = None,
+        delegate_to: str | None = None,
+        poll_interval: float = 4.0,
+        deferrable: bool = False,
         **kwargs,
     ) -> None:
         super().__init__(**kwargs)
+        self.build = build
+        # Not template fields to keep original value
+        self.build_raw = build
         self.project_id = project_id
         self.wait = wait
         self.retry = retry
@@ -174,9 +188,9 @@ class CloudBuildCreateBuildOperator(BaseOperator):
         self.metadata = metadata
         self.gcp_conn_id = gcp_conn_id
         self.impersonation_chain = impersonation_chain
-        self.build = build
-        # Not template fields to keep original value
-        self.build_raw = build
+        self.delegate_to = delegate_to
+        self.poll_interval = poll_interval
+        self.deferrable = deferrable
 
     def prepare_template(self) -> None:
         # if no file is specified, skip
@@ -189,29 +203,69 @@ class CloudBuildCreateBuildOperator(BaseOperator):
                 self.build = json.loads(file.read())
 
     def execute(self, context: Context):
-        hook = CloudBuildHook(gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain)
-
+        hook = CloudBuildHook(
+            gcp_conn_id=self.gcp_conn_id,
+            impersonation_chain=self.impersonation_chain,
+            delegate_to=self.delegate_to,
+        )
         build = BuildProcessor(build=self.build).process_body()
 
-        result = hook.create_build(
+        self.cloud_build_operation, self.id_ = hook.create_build_without_waiting_for_result(
             build=build,
             project_id=self.project_id,
-            wait=self.wait,
             retry=self.retry,
             timeout=self.timeout,
             metadata=self.metadata,
         )
-
-        self.xcom_push(context, key="id", value=result.id)
-        project_id = self.project_id or hook.project_id
-        if project_id:
-            CloudBuildLink.persist(
-                context=context,
-                task_instance=self,
-                project_id=project_id,
-                build_id=result.id,
+        self.xcom_push(context, key="id", value=self.id_)
+        if not self.wait:
+            return Build.to_dict(hook.get_build(id_=self.id_, project_id=self.project_id))
+
+        if self.deferrable:
+            self.defer(
+                trigger=CloudBuildCreateBuildTrigger(
+                    id_=self.id_,
+                    project_id=self.project_id,
+                    gcp_conn_id=self.gcp_conn_id,
+                    impersonation_chain=self.impersonation_chain,
+                    delegate_to=self.delegate_to,
+                    poll_interval=self.poll_interval,
+                ),
+                method_name=GOOGLE_DEFAULT_DEFERRABLE_METHOD_NAME,
             )
-        return Build.to_dict(result)
+        else:
+            cloud_build_instance_result = hook.wait_for_operation(
+                timeout=self.timeout, operation=self.cloud_build_operation
+            )
+            project_id = self.project_id or hook.project_id
+            if project_id:
+                CloudBuildLink.persist(
+                    context=context,
+                    task_instance=self,
+                    project_id=project_id,
+                    build_id=cloud_build_instance_result.id,
+                )
+            return Build.to_dict(cloud_build_instance_result)
+
+    def execute_complete(self, context: Context, event: dict):
+        if event["status"] == "success":
+            hook = CloudBuildHook(
+                gcp_conn_id=self.gcp_conn_id,
+                impersonation_chain=self.impersonation_chain,
+                delegate_to=self.delegate_to,
+            )
+            self.log.info("Cloud Build completed with response %s ", event["message"])
+            project_id = self.project_id or hook.project_id
+            if project_id:
+                CloudBuildLink.persist(
+                    context=context,
+                    task_instance=self,
+                    project_id=project_id,
+                    build_id=event["id_"],
+                )
+            return event["instance"]
+        else:
+            raise AirflowException(f"Unexpected error in the operation: {event['message']}")
 
 
 class CloudBuildCreateBuildTriggerOperator(BaseOperator):
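
A hedged sketch of what the new deferrable path looks like from a DAG author's point of view
(dag id, schedule, project id and build body are placeholders, not taken from the commit):

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.cloud_build import CloudBuildCreateBuildOperator

    with DAG(
        dag_id="example_cloud_build_deferrable",  # placeholder dag id
        start_date=datetime(2023, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        create_build = CloudBuildCreateBuildOperator(
            task_id="create_build",
            project_id="my-gcp-project",  # placeholder project
            build={"steps": [{"name": "ubuntu", "args": ["echo", "hello"]}]},
            deferrable=True,     # free the worker slot; the triggerer polls the build
            poll_interval=10.0,  # seconds between status checks in the trigger
        )

While deferred, the task hands control to CloudBuildCreateBuildTrigger (added below) and resumes in
`execute_complete` once the trigger yields a success or error event.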
diff --git a/airflow/providers/google/cloud/triggers/cloud_build.py b/airflow/providers/google/cloud/triggers/cloud_build.py
new file mode 100644
index 0000000000..130fc857f8
--- /dev/null
+++ b/airflow/providers/google/cloud/triggers/cloud_build.py
@@ -0,0 +1,125 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import asyncio
+from typing import Any, AsyncIterator, Sequence
+
+from google.cloud.devtools.cloudbuild_v1.types import Build
+
+from airflow.providers.google.cloud.hooks.cloud_build import CloudBuildAsyncHook
+from airflow.triggers.base import BaseTrigger, TriggerEvent
+
+
+class CloudBuildCreateBuildTrigger(BaseTrigger):
+    """
+    CloudBuildCreateBuildTrigger runs on the trigger worker to perform the create Build operation
+
+    :param id_: The ID of the build.
+    :param project_id: Google Cloud Project where the job is running
+    :param gcp_conn_id: Optional, the connection ID used to connect to Google Cloud Platform.
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :param poll_interval: polling period in seconds to check for the status
+    """
+
+    def __init__(
+        self,
+        id_: str,
+        project_id: str | None,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        delegate_to: str | None = None,
+        poll_interval: float = 4.0,
+    ):
+        super().__init__()
+        self.id_ = id_
+        self.project_id = project_id
+        self.gcp_conn_id = gcp_conn_id
+        self.impersonation_chain = impersonation_chain
+        self.delegate_to = delegate_to
+        self.poll_interval = poll_interval
+
+    def serialize(self) -> tuple[str, dict[str, Any]]:
+        """Serializes CloudBuildCreateBuildTrigger arguments and classpath."""
+        return (
+            "airflow.providers.google.cloud.triggers.cloud_build.CloudBuildCreateBuildTrigger",
+            {
+                "id_": self.id_,
+                "project_id": self.project_id,
+                "gcp_conn_id": self.gcp_conn_id,
+                "impersonation_chain": self.impersonation_chain,
+                "delegate_to": self.delegate_to,
+                "poll_interval": self.poll_interval,
+            },
+        )
+
+    async def run(self) -> AsyncIterator["TriggerEvent"]:  # type: ignore[override]
+        """Gets current build execution status and yields a TriggerEvent"""
+        hook = self._get_async_hook()
+        while True:
+            try:
+                # Poll for job execution status
+                cloud_build_instance = await hook.get_cloud_build(
+                    id_=self.id_,
+                    project_id=self.project_id,
+                )
+                if cloud_build_instance._pb.status in (Build.Status.SUCCESS,):
+                    yield TriggerEvent(
+                        {
+                            "instance": Build.to_dict(cloud_build_instance),
+                            "id_": self.id_,
+                            "status": "success",
+                            "message": "Build completed",
+                        }
+                    )
+                elif cloud_build_instance._pb.status in (
+                    Build.Status.WORKING,
+                    Build.Status.PENDING,
+                    Build.Status.QUEUED,
+                ):
+                    self.log.info("Build is still running...")
+                    self.log.info("Sleeping for %s seconds.", self.poll_interval)
+                    await asyncio.sleep(self.poll_interval)
+                elif cloud_build_instance._pb.status in (
+                    Build.Status.FAILURE,
+                    Build.Status.INTERNAL_ERROR,
+                    Build.Status.TIMEOUT,
+                    Build.Status.CANCELLED,
+                    Build.Status.EXPIRED,
+                ):
+                    yield TriggerEvent({"status": "error", "message": cloud_build_instance.status_detail})
+                else:
+                    yield TriggerEvent(
+                        {"status": "error", "message": "Unidentified status of Cloud Build instance"}
+                    )
+
+            except Exception as e:
+                self.log.exception("Exception occurred while checking for Cloud Build completion")
+                yield TriggerEvent({"status": "error", "message": str(e)})
+
+    def _get_async_hook(self) -> CloudBuildAsyncHook:
+        return CloudBuildAsyncHook(gcp_conn_id=self.gcp_conn_id)
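
The trigger above is an async generator: the triggerer process iterates `run()` and forwards the first
TriggerEvent back to the deferred task. A hedged sketch of driving it by hand, for example from a one-off
script with valid Google Cloud credentials (the ids below are placeholders):

    import asyncio

    from airflow.providers.google.cloud.triggers.cloud_build import CloudBuildCreateBuildTrigger

    trigger = CloudBuildCreateBuildTrigger(
        id_="0000-placeholder-build-id",
        project_id="my-gcp-project",
        poll_interval=10.0,
    )

    async def first_event():
        # Pull a single event, the way the triggerer would.
        async for event in trigger.run():
            return event.payload  # {"status": ..., "message": ..., plus "id_"/"instance" on success}

    print(asyncio.run(first_event()))
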
diff --git a/docs/apache-airflow-providers-google/operators/cloud/cloud_build.rst b/docs/apache-airflow-providers-google/operators/cloud/cloud_build.rst
index 0e39a1399a..454e378c2f 100644
--- a/docs/apache-airflow-providers-google/operators/cloud/cloud_build.rst
+++ b/docs/apache-airflow-providers-google/operators/cloud/cloud_build.rst
@@ -102,6 +102,14 @@ Trigger a build is performed with the
     :start-after: [START howto_operator_create_build_from_storage]
     :end-before: [END howto_operator_create_build_from_storage]
 
+You can use deferrable mode for this action in order to run the operator asynchronously:
+
+.. exampleinclude:: /../../tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
+    :language: python
+    :dedent: 0
+    :start-after: [START howto_operator_create_build_from_storage_async]
+    :end-before: [END howto_operator_create_build_from_storage_async]
+
 You can use :ref:`Jinja templating <concepts:jinja-templating>` with
 :template-fields:`airflow.providers.google.cloud.operators.cloud_build.CloudBuildCreateBuildOperator`
 parameters which allows you to dynamically determine values. The result is saved to :ref:`XCom <concepts:xcom>`, which allows it
@@ -122,6 +130,48 @@ you can pass wait=False as example shown below.
     :start-after: [START howto_operator_create_build_without_wait]
     :end-before: [END howto_operator_create_build_without_wait]
 
+You can use deferrable mode for this action in order to run the operator asynchronously:
+
+.. exampleinclude:: /../../tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
+    :language: python
+    :dedent: 0
+    :start-after: [START howto_operator_create_build_without_wait_async]
+    :end-before: [END howto_operator_create_build_without_wait_async]
+
+In order to start a build on Cloud Build, you can use a build configuration file. A build config file defines the
+fields needed for Cloud Build to perform your tasks. You can write the build config file using YAML or JSON syntax.
+
+.. exampleinclude:: ../../../../tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
+    :language: python
+    :dedent: 4
+    :start-after: [START howto_operator_gcp_create_build_from_yaml_body]
+    :end-before: [END howto_operator_gcp_create_build_from_yaml_body]
+
+You can use deferrable mode for this action in order to run the operator asynchronously:
+
+.. exampleinclude:: /../../tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
+    :language: python
+    :dedent: 0
+    :start-after: [START howto_operator_gcp_create_build_from_yaml_body_async]
+    :end-before: [END howto_operator_gcp_create_build_from_yaml_body_async]
+
+In addition, a Cloud Build can refer to source stored in `Google Cloud Source Repositories <https://cloud.google.com/source-repositories/docs/>`__.
+Once the build has started, it will build the code from the source repository.
+
+.. exampleinclude:: ../../../../tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
+    :language: python
+    :dedent: 0
+    :start-after: [START howto_operator_create_build_from_repo]
+    :end-before: [END howto_operator_create_build_from_repo]
+
+You can use deferrable mode for this action in order to run the operator asynchronously:
+
+.. exampleinclude:: /../../tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
+    :language: python
+    :dedent: 0
+    :start-after: [START howto_operator_create_build_from_repo_async]
+    :end-before: [END howto_operator_create_build_from_repo_async]
+
 .. _howto/operator:CloudBuildCreateBuildTriggerOperator:
 
 CloudBuildCreateBuildTriggerOperator
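
As the added documentation notes, `build` can also point at a YAML or JSON config file instead of an
inline dict; the operator's `prepare_template()` then loads the file contents into the build body. A
hedged sketch (the file path and project id are placeholders):

    from airflow.providers.google.cloud.operators.cloud_build import CloudBuildCreateBuildOperator

    create_build_from_file = CloudBuildCreateBuildOperator(
        task_id="create_build_from_file",
        project_id="my-gcp-project",     # placeholder project
        build="/files/cloudbuild.yaml",  # placeholder path to a config file with a `steps:` section
        deferrable=True,                 # wait for the result on the triggerer, not the worker
    )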
diff --git a/tests/providers/google/cloud/hooks/test_cloud_build.py b/tests/providers/google/cloud/hooks/test_cloud_build.py
index ec0e83e5e9..78af318414 100644
--- a/tests/providers/google/cloud/hooks/test_cloud_build.py
+++ b/tests/providers/google/cloud/hooks/test_cloud_build.py
@@ -21,18 +21,29 @@ functions in CloudBuildHook
 """
 from __future__ import annotations
 
+import sys
 import unittest
+from concurrent.futures import Future
 from unittest.mock import MagicMock, patch
 
+import pytest
 from google.api_core.gapic_v1.method import DEFAULT
+from google.cloud.devtools.cloudbuild_v1 import CloudBuildAsyncClient, GetBuildRequest
 
-from airflow.providers.google.cloud.hooks.cloud_build import CloudBuildHook
+from airflow import AirflowException
+from airflow.providers.google.cloud.hooks.cloud_build import CloudBuildAsyncHook, CloudBuildHook
 from airflow.providers.google.common.consts import CLIENT_INFO
 from tests.providers.google.cloud.utils.base_gcp_mock import mock_base_gcp_hook_no_default_project_id
 
+if sys.version_info < (3, 8):
+    from asynctest import mock
+else:
+    from unittest import mock
+
 PROJECT_ID = "cloud-build-project"
 LOCATION = "test-location"
 PARENT = f"projects/{PROJECT_ID}/locations/{LOCATION}"
+CLOUD_BUILD_PATH = "airflow.providers.google.cloud.hooks.cloud_build.{}"
 BUILD_ID = "test-build-id-9832662"
 REPO_SOURCE = {"repo_source": {"repo_name": "test_repo", "branch_name": "main"}}
 BUILD = {
@@ -298,3 +309,32 @@ class TestCloudBuildHook(unittest.TestCase):
             timeout=None,
             metadata=(),
         )
+
+
+class TestAsyncHook:
+    @pytest.fixture
+    def hook(self):
+        return CloudBuildAsyncHook(
+            gcp_conn_id="google_cloud_default",
+        )
+
+    @pytest.mark.asyncio
+    @mock.patch.object(CloudBuildAsyncClient, "__init__", lambda self: None)
+    @mock.patch(CLOUD_BUILD_PATH.format("CloudBuildAsyncClient.get_build"))
+    async def test_async_cloud_build_service_client_creation_should_execute_successfully(
+        self, mocked_get_build, hook
+    ):
+        mocked_get_build.return_value = Future()
+        await hook.get_cloud_build(project_id=PROJECT_ID, id_=BUILD_ID)
+        request = GetBuildRequest(
+            dict(
+                project_id=PROJECT_ID,
+                id=BUILD_ID,
+            )
+        )
+        mocked_get_build.assert_called_once_with(request=request, retry=DEFAULT, timeout=None, metadata=())
+
+    @pytest.mark.asyncio
+    async def test_async_get_cloud_build_without_build_id_should_throw_exception(self, hook):
+        with pytest.raises(AirflowException, match=r"Google Cloud Build id is required."):
+            await hook.get_cloud_build(project_id=PROJECT_ID, id_=None)
diff --git a/tests/providers/google/cloud/operators/test_cloud_build.py b/tests/providers/google/cloud/operators/test_cloud_build.py
index 1334679a78..e80dcd1fb0 100644
--- a/tests/providers/google/cloud/operators/test_cloud_build.py
+++ b/tests/providers/google/cloud/operators/test_cloud_build.py
@@ -30,7 +30,10 @@ from google.api_core.gapic_v1.method import DEFAULT
 from google.cloud.devtools.cloudbuild_v1.types import Build, BuildTrigger, RepoSource, StorageSource
 from parameterized import parameterized
 
-from airflow.exceptions import AirflowException
+from airflow.exceptions import AirflowException, TaskDeferred
+from airflow.models import DAG
+from airflow.models.dagrun import DagRun
+from airflow.models.taskinstance import TaskInstance
 from airflow.providers.google.cloud.operators.cloud_build import (
     BuildProcessor,
     CloudBuildCancelBuildOperator,
@@ -45,12 +48,16 @@ from airflow.providers.google.cloud.operators.cloud_build import (
     CloudBuildRunBuildTriggerOperator,
     CloudBuildUpdateBuildTriggerOperator,
 )
+from airflow.providers.google.cloud.triggers.cloud_build import CloudBuildCreateBuildTrigger
+from airflow.utils.timezone import datetime
+from airflow.utils.types import DagRunType
 
 # pylint: disable=R0904, C0111
 
 
 GCP_CONN_ID = "google_cloud_default"
 PROJECT_ID = "cloud-build-project"
+CLOUD_BUILD_HOOK_PATH = "airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook"
 BUILD_ID = "test-build-id-9832661"
 REPO_SOURCE = {"repo_source": {"repo_name": "test_repo", "branch_name": "main"}}
 BUILD = {
@@ -65,35 +72,71 @@ BUILD_TRIGGER = {
 }
 OPERATION = {"metadata": {"build": {"id": BUILD_ID}}}
 TRIGGER_ID = "32488e7f-09d6-4fe9-a5fb-4ca1419a6e7a"
+TEST_BUILD_INSTANCE = dict(
+    id="test-build-id-9832662",
+    status=3,
+    steps=[
+        {
+            "name": "ubuntu",
+            "env": [],
+            "args": [],
+            "dir_": "",
+            "id": "",
+            "wait_for": [],
+            "entrypoint": "",
+            "secret_env": [],
+            "volumes": [],
+            "status": 0,
+            "script": "",
+        }
+    ],
+    name="",
+    project_id="",
+    status_detail="",
+    images=[],
+    logs_bucket="",
+    build_trigger_id="",
+    log_url="",
+    substitutions={},
+    tags=[],
+    secrets=[],
+    timing={},
+    service_account="",
+    warnings=[],
+)
 
 
 class TestCloudBuildOperator(TestCase):
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_cancel_build(self, mock_hook):
         mock_hook.return_value.cancel_build.return_value = Build()
+
         operator = CloudBuildCancelBuildOperator(id_=TRIGGER_ID, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.cancel_build.assert_called_once_with(
             id_=TRIGGER_ID, project_id=None, retry=DEFAULT, timeout=None, metadata=()
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_create_build(self, mock_hook):
-        mock_hook.return_value.create_build.return_value = Build()
+        mock_hook.return_value.create_build_without_waiting_for_result.return_value = (BUILD, BUILD_ID)
+        mock_hook.return_value.wait_for_operation.return_value = Build()
+
         operator = CloudBuildCreateBuildOperator(build=BUILD, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
-        mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
+        operator.execute(context=mock.MagicMock())
+
+        mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None, delegate_to=None)
         build = Build(BUILD)
-        mock_hook.return_value.create_build.assert_called_once_with(
-            build=build, project_id=None, wait=True, retry=DEFAULT, timeout=None, metadata=()
+        mock_hook.return_value.create_build_without_waiting_for_result.assert_called_once_with(
+            build=build, project_id=None, retry=DEFAULT, timeout=None, metadata=()
         )
+        mock_hook.return_value.wait_for_operation.assert_called_once_with(timeout=None, operation=BUILD)
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_create_build_with_missing_build(self, mock_hook):
-        mock_hook.return_value.create_build.return_value = Build()
+        mock_hook.return_value.create_build_without_waiting_for_result.return_value = Build()
         with pytest.raises(AirflowException, match="missing keyword argument 'build'"):
             CloudBuildCreateBuildOperator(task_id="id")
 
@@ -125,56 +168,61 @@ class TestCloudBuildOperator(TestCase):
             expected_body = {"steps": [{"name": "ubuntu", "args": ["echo", "Hello {{ params.name }}!"]}]}
             assert expected_body == operator.build
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_create_build_trigger(self, mock_hook):
         mock_hook.return_value.create_build_trigger.return_value = BuildTrigger()
+
         operator = CloudBuildCreateBuildTriggerOperator(trigger=BUILD_TRIGGER, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.create_build_trigger.assert_called_once_with(
             trigger=BUILD_TRIGGER, project_id=None, retry=DEFAULT, timeout=None, metadata=()
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_delete_build_trigger(self, mock_hook):
         mock_hook.return_value.delete_build_trigger.return_value = None
+
         operator = CloudBuildDeleteBuildTriggerOperator(trigger_id=TRIGGER_ID, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.delete_build_trigger.assert_called_once_with(
             trigger_id=TRIGGER_ID, project_id=None, retry=DEFAULT, timeout=None, metadata=()
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_get_build(self, mock_hook):
         mock_hook.return_value.get_build.return_value = Build()
+
         operator = CloudBuildGetBuildOperator(id_=BUILD_ID, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.get_build.assert_called_once_with(
             id_=BUILD_ID, project_id=None, retry=DEFAULT, timeout=None, metadata=()
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_get_build_trigger(self, mock_hook):
         mock_hook.return_value.get_build_trigger.return_value = BuildTrigger()
+
         operator = CloudBuildGetBuildTriggerOperator(trigger_id=TRIGGER_ID, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.get_build_trigger.assert_called_once_with(
             trigger_id=TRIGGER_ID, project_id=None, retry=DEFAULT, timeout=None, metadata=()
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_list_build_triggers(self, mock_hook):
         mock_hook.return_value.list_build_triggers.return_value = mock.MagicMock()
+
         operator = CloudBuildListBuildTriggersOperator(task_id="id", location="global")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.list_build_triggers.assert_called_once_with(
             project_id=None,
@@ -186,12 +234,13 @@ class TestCloudBuildOperator(TestCase):
             metadata=(),
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_list_builds(self, mock_hook):
         mock_hook.return_value.list_builds.return_value = mock.MagicMock()
+
         operator = CloudBuildListBuildsOperator(task_id="id", location="global")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.list_builds.assert_called_once_with(
             project_id=None,
@@ -203,23 +252,25 @@ class TestCloudBuildOperator(TestCase):
             metadata=(),
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_retry_build(self, mock_hook):
         mock_hook.return_value.retry_build.return_value = Build()
+
         operator = CloudBuildRetryBuildOperator(id_=BUILD_ID, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.retry_build.assert_called_once_with(
             id_=BUILD_ID, project_id=None, wait=True, retry=DEFAULT, timeout=None, metadata=()
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_run_build_trigger(self, mock_hook):
         mock_hook.return_value.run_build_trigger.return_value = Build()
+
         operator = CloudBuildRunBuildTriggerOperator(trigger_id=TRIGGER_ID, source=REPO_SOURCE, task_id="id")
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.run_build_trigger.assert_called_once_with(
             trigger_id=TRIGGER_ID,
@@ -231,14 +282,15 @@ class TestCloudBuildOperator(TestCase):
             metadata=(),
         )
 
-    @mock.patch("airflow.providers.google.cloud.operators.cloud_build.CloudBuildHook")
+    @mock.patch(CLOUD_BUILD_HOOK_PATH)
     def test_update_build_trigger(self, mock_hook):
         mock_hook.return_value.update_build_trigger.return_value = BuildTrigger()
+
         operator = CloudBuildUpdateBuildTriggerOperator(
             trigger_id=TRIGGER_ID, trigger=BUILD_TRIGGER, task_id="id"
         )
-        context = mock.MagicMock()
-        operator.execute(context=context)
+        operator.execute(context=mock.MagicMock())
+
         mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None)
         mock_hook.return_value.update_build_trigger.assert_called_once_with(
             trigger_id=TRIGGER_ID,
@@ -324,3 +376,136 @@ class TestBuildProcessor(TestCase):
 
         BuildProcessor(build=body).process_body()
         assert body == expected_body
+
+
+@mock.patch(CLOUD_BUILD_HOOK_PATH)
+def test_async_create_build_fires_correct_trigger_should_execute_successfully(mock_hook):
+    mock_hook.return_value.create_build_without_waiting_for_result.return_value = (BUILD, BUILD_ID)
+
+    operator = CloudBuildCreateBuildOperator(
+        build=BUILD,
+        task_id="id",
+        deferrable=True,
+    )
+
+    with pytest.raises(TaskDeferred) as exc:
+        operator.execute(create_context(operator))
+
+    assert isinstance(
+        exc.value.trigger, CloudBuildCreateBuildTrigger
+    ), "Trigger is not a CloudBuildCreateBuildTrigger"
+
+
+@mock.patch(CLOUD_BUILD_HOOK_PATH)
+def test_async_create_build_without_wait_should_execute_successfully(mock_hook):
+    mock_hook.return_value.create_build_without_waiting_for_result.return_value = (BUILD, BUILD_ID)
+    mock_hook.return_value.get_build.return_value = Build()
+
+    operator = CloudBuildCreateBuildOperator(
+        build=BUILD,
+        task_id="id",
+        wait=False,
+        deferrable=True,
+    )
+    operator.execute(context=mock.MagicMock())
+
+    mock_hook.assert_called_once_with(gcp_conn_id=GCP_CONN_ID, impersonation_chain=None, delegate_to=None)
+    build = Build(BUILD)
+    mock_hook.return_value.create_build_without_waiting_for_result.assert_called_once_with(
+        build=build, project_id=None, retry=DEFAULT, timeout=None, metadata=()
+    )
+    mock_hook.return_value.get_build.assert_called_once_with(id_=BUILD_ID, project_id=None)
+
+
+@mock.patch(CLOUD_BUILD_HOOK_PATH)
+def test_async_create_build_correct_logging_should_execute_successfully(mock_hook):
+    mock_hook.return_value.create_build_without_waiting_for_result.return_value = (BUILD, BUILD_ID)
+    mock_hook.return_value.get_build.return_value = Build()
+
+    operator = CloudBuildCreateBuildOperator(
+        build=BUILD,
+        task_id="id",
+        deferrable=True,
+    )
+    with mock.patch.object(operator.log, "info") as mock_log_info:
+        operator.execute_complete(
+            context=create_context(operator),
+            event={
+                "instance": TEST_BUILD_INSTANCE,
+                "status": "success",
+                "message": "Build completed",
+                "id_": BUILD_ID,
+            },
+        )
+    mock_log_info.assert_called_with("Cloud Build completed with response %s ", "Build completed")
+
+
+def test_async_create_build_error_event_should_throw_exception():
+    operator = CloudBuildCreateBuildOperator(
+        build=BUILD,
+        task_id="id",
+        deferrable=True,
+    )
+    with pytest.raises(AirflowException):
+        operator.execute_complete(context=None, event={"status": "error", "message": "test failure message"})
+
+
+@mock.patch(CLOUD_BUILD_HOOK_PATH)
+def test_async_create_build_with_missing_build_should_throw_exception(mock_hook):
+    mock_hook.return_value.create_build.return_value = Build()
+    with pytest.raises(AirflowException, match="missing keyword argument 'build'"):
+        CloudBuildCreateBuildOperator(task_id="id")
+
+
+@parameterized.expand(
+    [
+        (
+            ".json",
+            json.dumps({"steps": [{"name": "ubuntu", "args": ["echo", "Hello {{ params.name }}!"]}]}),
+        ),
+        (
+            ".yaml",
+            """
+            steps:
+            - name: 'ubuntu'
+              args: ['echo', 'Hello {{ params.name }}!']
+            """,
+        ),
+    ]
+)
+def test_async_load_templated_should_execute_successfully(file_type, file_content):
+    with tempfile.NamedTemporaryFile(suffix=file_type, mode="w+") as f:
+        f.writelines(file_content)
+        f.flush()
+
+        operator = CloudBuildCreateBuildOperator(
+            build=f.name,
+            task_id="task-id",
+            params={"name": "airflow"},
+            deferrable=True,
+        )
+        operator.prepare_template()
+        expected_body = {"steps": [{"name": "ubuntu", "args": ["echo", "Hello {{ params.name }}!"]}]}
+        assert expected_body == operator.build
+
+
+def create_context(task):
+    dag = DAG(dag_id="dag")
+    logical_date = datetime(2022, 1, 1, 0, 0, 0)
+    dag_run = DagRun(
+        dag_id=dag.dag_id,
+        execution_date=logical_date,
+        run_id=DagRun.generate_run_id(DagRunType.MANUAL, logical_date),
+    )
+    task_instance = TaskInstance(task=task)
+    task_instance.dag_run = dag_run
+    task_instance.dag_id = dag.dag_id
+    task_instance.xcom_push = mock.Mock()
+    return {
+        "dag": dag,
+        "run_id": dag_run.run_id,
+        "task": task,
+        "ti": task_instance,
+        "task_instance": task_instance,
+        "logical_date": logical_date,
+    }
diff --git a/tests/providers/google/cloud/triggers/test_cloud_build.py b/tests/providers/google/cloud/triggers/test_cloud_build.py
new file mode 100644
index 0000000000..62203ddacb
--- /dev/null
+++ b/tests/providers/google/cloud/triggers/test_cloud_build.py
@@ -0,0 +1,240 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import asyncio
+import logging
+import sys
+
+import pytest
+from google.cloud.devtools.cloudbuild_v1 import CloudBuildAsyncClient
+from google.cloud.devtools.cloudbuild_v1.types import Build, BuildStep
+
+from airflow.providers.google.cloud.hooks.cloud_build import CloudBuildAsyncHook
+from airflow.providers.google.cloud.triggers.cloud_build import CloudBuildCreateBuildTrigger
+from airflow.triggers.base import TriggerEvent
+
+if sys.version_info < (3, 8):
+    from asynctest import mock
+else:
+    from unittest import mock
+
+CLOUD_BUILD_PATH = "airflow.providers.google.cloud.hooks.cloud_build.{}"
+TEST_PROJECT_ID = "cloud-build-project"
+TEST_BUILD_ID = "test-build-id-9832662"
+REPO_SOURCE = {"repo_source": {"repo_name": "test_repo", "branch_name": "main"}}
+TEST_BUILD = {
+    "source": REPO_SOURCE,
+    "steps": [{"name": "gcr.io/cloud-builders/gcloud", "entrypoint": "/bin/sh", "args": ["-c", "ls"]}],
+    "status": "SUCCESS",
+}
+TEST_BUILD_WORKING = {
+    "source": REPO_SOURCE,
+    "steps": [{"name": "gcr.io/cloud-builders/gcloud", "entrypoint": "/bin/sh", "args": ["-c", "ls"]}],
+    "status": "WORKING",
+}
+
+TEST_CONN_ID = "google_cloud_default"
+TEST_POLL_INTERVAL = 4.0
+TEST_BUILD_INSTANCE = dict(
+    id="test-build-id-9832662",
+    status=3,
+    steps=[
+        {
+            "name": "ubuntu",
+            "env": [],
+            "args": [],
+            "dir_": "",
+            "id": "",
+            "wait_for": [],
+            "entrypoint": "",
+            "secret_env": [],
+            "volumes": [],
+            "status": 0,
+            "script": "",
+        }
+    ],
+    name="",
+    project_id="",
+    status_detail="",
+    images=[],
+    logs_bucket="",
+    build_trigger_id="",
+    log_url="",
+    substitutions={},
+    tags=[],
+    secrets=[],
+    timing={},
+    service_account="",
+    warnings=[],
+)
+
+pytest.hook = CloudBuildAsyncHook(gcp_conn_id="google_cloud_default")
+
+
+@pytest.fixture
+def hook():
+    return CloudBuildAsyncHook(
+        gcp_conn_id="google_cloud_default",
+    )
+
+
+def test_async_create_build_trigger_serialization_should_execute_successfully():
+    """
+    Asserts that the CloudBuildCreateBuildTrigger correctly serializes its arguments
+    and classpath.
+    """
+    trigger = CloudBuildCreateBuildTrigger(
+        id_=TEST_BUILD_ID,
+        project_id=TEST_PROJECT_ID,
+        gcp_conn_id=TEST_CONN_ID,
+        impersonation_chain=None,
+        delegate_to=None,
+        poll_interval=TEST_POLL_INTERVAL,
+    )
+    classpath, kwargs = trigger.serialize()
+    assert classpath == "airflow.providers.google.cloud.triggers.cloud_build.CloudBuildCreateBuildTrigger"
+    assert kwargs == {
+        "id_": TEST_BUILD_ID,
+        "project_id": TEST_PROJECT_ID,
+        "gcp_conn_id": TEST_CONN_ID,
+        "impersonation_chain": None,
+        "delegate_to": None,
+        "poll_interval": TEST_POLL_INTERVAL,
+    }
+
+
+@pytest.mark.asyncio
+@mock.patch.object(CloudBuildAsyncClient, "__init__", lambda self: None)
+@mock.patch(CLOUD_BUILD_PATH.format("CloudBuildAsyncClient.get_build"))
+async def test_async_create_build_trigger_triggers_on_success_should_execute_successfully(
+    mock_get_build, hook
+):
+    """
+    Tests the CloudBuildCreateBuildTrigger only fires once the job execution reaches a successful state.
+    """
+    mock_get_build.return_value = Build(
+        id=TEST_BUILD_ID, status=Build.Status.SUCCESS, steps=[BuildStep(name="ubuntu")]
+    )
+
+    trigger = CloudBuildCreateBuildTrigger(
+        id_=TEST_BUILD_ID,
+        project_id=TEST_PROJECT_ID,
+        gcp_conn_id=TEST_CONN_ID,
+        impersonation_chain=None,
+        delegate_to=None,
+        poll_interval=TEST_POLL_INTERVAL,
+    )
+
+    generator = trigger.run()
+    actual = await generator.asend(None)
+    assert (
+        TriggerEvent(
+            {
+                "instance": TEST_BUILD_INSTANCE,
+                "id_": TEST_BUILD_ID,
+                "status": "success",
+                "message": "Build completed",
+            }
+        )
+        == actual
+    )
+
+
+@pytest.mark.asyncio
+@mock.patch.object(CloudBuildAsyncClient, "__init__", lambda self: None)
+@mock.patch(CLOUD_BUILD_PATH.format("CloudBuildAsyncClient.get_build"))
+async def test_async_create_build_trigger_triggers_on_running_should_execute_successfully(
+    mock_get_build, hook, caplog
+):
+    """
+    Test that CloudBuildCreateBuildTrigger does not fire while a build is still running.
+    """
+    mock_get_build.return_value = Build(
+        id=TEST_BUILD_ID, status=Build.Status.WORKING, steps=[BuildStep(name="ubuntu")]
+    )
+    caplog.set_level(logging.INFO)
+
+    trigger = CloudBuildCreateBuildTrigger(
+        id_=TEST_BUILD_ID,
+        project_id=TEST_PROJECT_ID,
+        gcp_conn_id=TEST_CONN_ID,
+        impersonation_chain=None,
+        delegate_to=None,
+        poll_interval=TEST_POLL_INTERVAL,
+    )
+    task = asyncio.create_task(trigger.run().__anext__())
+    await asyncio.sleep(0.5)
+
+    # TriggerEvent was not returned
+    assert task.done() is False
+
+    assert "Build is still running..." in caplog.text
+    assert f"Sleeping for {TEST_POLL_INTERVAL} seconds." in caplog.text
+
+    # Prevents error when task is destroyed while in "pending" state
+    asyncio.get_event_loop().stop()
+
+
+@pytest.mark.asyncio
+@mock.patch.object(CloudBuildAsyncClient, "__init__", lambda self: None)
+@mock.patch(CLOUD_BUILD_PATH.format("CloudBuildAsyncClient.get_build"))
+async def test_async_create_build_trigger_triggers_on_error_should_execute_successfully(
+    mock_get_build, hook, caplog
+):
+    """
+    Test that CloudBuildCreateBuildTrigger fires the correct event in case of an error.
+    """
+    mock_get_build.return_value = Build(
+        id=TEST_BUILD_ID, status=Build.Status.FAILURE, steps=[BuildStep(name="ubuntu")], status_detail="error"
+    )
+    caplog.set_level(logging.INFO)
+
+    trigger = CloudBuildCreateBuildTrigger(
+        id_=TEST_BUILD_ID,
+        project_id=TEST_PROJECT_ID,
+        gcp_conn_id=TEST_CONN_ID,
+        impersonation_chain=None,
+        delegate_to=None,
+        poll_interval=TEST_POLL_INTERVAL,
+    )
+
+    generator = trigger.run()
+    actual = await generator.asend(None)
+    assert TriggerEvent({"status": "error", "message": "error"}) == actual
+
+
+@pytest.mark.asyncio
+@mock.patch(CLOUD_BUILD_PATH.format("CloudBuildAsyncHook.get_cloud_build"))
+async def test_async_create_build_trigger_triggers_on_excp_should_execute_successfully(mock_build_inst):
+    """
+    Test that CloudBuildCreateBuildTrigger fires the correct event when fetching the build raises an exception.
+    """
+    mock_build_inst.side_effect = Exception("Test exception")
+
+    trigger = CloudBuildCreateBuildTrigger(
+        id_=TEST_BUILD_ID,
+        project_id=TEST_PROJECT_ID,
+        gcp_conn_id=TEST_CONN_ID,
+        impersonation_chain=None,
+        delegate_to=None,
+        poll_interval=TEST_POLL_INTERVAL,
+    )
+
+    generator = trigger.run()
+    actual = await generator.asend(None)
+    assert TriggerEvent({"status": "error", "message": "Test exception"}) == actual
diff --git a/tests/system/providers/google/cloud/cloud_build/example_cloud_build.py b/tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
index bb770e46a7..0a420b1635 100644
--- a/tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
+++ b/tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
@@ -57,7 +57,7 @@ DAG_ID = "example_gcp_cloud_build"
 
 BUCKET_NAME_SRC = f"bucket-src-{DAG_ID}-{ENV_ID}"
 
-GCP_SOURCE_ARCHIVE_URL = os.environ.get("GCP_CLOUD_BUILD_ARCHIVE_URL", f"gs://{BUCKET_NAME_SRC}/file.tar.gz")
+GCP_SOURCE_ARCHIVE_URL = f"gs://{BUCKET_NAME_SRC}/file.tar.gz"
 GCP_SOURCE_REPOSITORY_NAME = "test-cloud-build-repo"
 
 GCP_SOURCE_ARCHIVE_URL_PARTS = urlsplit(GCP_SOURCE_ARCHIVE_URL)
@@ -100,7 +100,9 @@ with models.DAG(
 
     # [START howto_operator_create_build_from_storage]
     create_build_from_storage = CloudBuildCreateBuildOperator(
-        task_id="create_build_from_storage", project_id=PROJECT_ID, build=create_build_from_storage_body
+        task_id="create_build_from_storage",
+        project_id=PROJECT_ID,
+        build=create_build_from_storage_body,
     )
     # [END howto_operator_create_build_from_storage]
 
@@ -113,7 +115,9 @@ with models.DAG(
 
     # [START howto_operator_create_build_from_repo]
     create_build_from_repo = CloudBuildCreateBuildOperator(
-        task_id="create_build_from_repo", project_id=PROJECT_ID, build=create_build_from_repo_body
+        task_id="create_build_from_repo",
+        project_id=PROJECT_ID,
+        build=create_build_from_repo_body,
     )
     # [END howto_operator_create_build_from_repo]
 
@@ -126,7 +130,9 @@ with models.DAG(
 
     # [START howto_operator_list_builds]
     list_builds = CloudBuildListBuildsOperator(
-        task_id="list_builds", project_id=PROJECT_ID, location="global"
+        task_id="list_builds",
+        project_id=PROJECT_ID,
+        location="global",
     )
     # [END howto_operator_list_builds]
 
@@ -173,7 +179,9 @@ with models.DAG(
     # [END howto_operator_gcp_create_build_from_yaml_body]
 
     delete_bucket_src = GCSDeleteBucketOperator(
-        task_id="delete_bucket_src", bucket_name=BUCKET_NAME_SRC, trigger_rule=TriggerRule.ALL_DONE
+        task_id="delete_bucket_src",
+        bucket_name=BUCKET_NAME_SRC,
+        trigger_rule=TriggerRule.ALL_DONE,
     )
 
     chain(
diff --git a/tests/system/providers/google/cloud/cloud_build/example_cloud_build.py b/tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
similarity index 83%
copy from tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
copy to tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
index bb770e46a7..c652bd4aef 100644
--- a/tests/system/providers/google/cloud/cloud_build/example_cloud_build.py
+++ b/tests/system/providers/google/cloud/cloud_build/example_cloud_build_async.py
@@ -33,7 +33,7 @@ from pathlib import Path
 from typing import Any, cast
 
 import yaml
-from future.backports.urllib.parse import urlsplit
+from future.backports.urllib.parse import urlparse
 
 from airflow import models
 from airflow.models.baseoperator import chain
@@ -53,14 +53,14 @@ from airflow.utils.trigger_rule import TriggerRule
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
 PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT")
 
-DAG_ID = "example_gcp_cloud_build"
+DAG_ID = "example_gcp_cloud_build_async"
 
 BUCKET_NAME_SRC = f"bucket-src-{DAG_ID}-{ENV_ID}"
 
-GCP_SOURCE_ARCHIVE_URL = os.environ.get("GCP_CLOUD_BUILD_ARCHIVE_URL", f"gs://{BUCKET_NAME_SRC}/file.tar.gz")
+GCP_SOURCE_ARCHIVE_URL = f"gs://{BUCKET_NAME_SRC}/file.tar.gz"
 GCP_SOURCE_REPOSITORY_NAME = "test-cloud-build-repo"
 
-GCP_SOURCE_ARCHIVE_URL_PARTS = urlsplit(GCP_SOURCE_ARCHIVE_URL)
+GCP_SOURCE_ARCHIVE_URL_PARTS = urlparse(GCP_SOURCE_ARCHIVE_URL)
 GCP_SOURCE_BUCKET_NAME = GCP_SOURCE_ARCHIVE_URL_PARTS.netloc
 
 CURRENT_FOLDER = Path(__file__).parent
@@ -98,11 +98,14 @@ with models.DAG(
         bucket=BUCKET_NAME_SRC,
     )
 
-    # [START howto_operator_create_build_from_storage]
+    # [START howto_operator_create_build_from_storage_async]
     create_build_from_storage = CloudBuildCreateBuildOperator(
-        task_id="create_build_from_storage", project_id=PROJECT_ID, build=create_build_from_storage_body
+        task_id="create_build_from_storage",
+        project_id=PROJECT_ID,
+        build=create_build_from_storage_body,
+        deferrable=True,
     )
-    # [END howto_operator_create_build_from_storage]
+    # [END howto_operator_create_build_from_storage_async]
 
     # [START howto_operator_create_build_from_storage_result]
     create_build_from_storage_result = BashOperator(
@@ -111,11 +114,14 @@ with models.DAG(
     )
     # [END howto_operator_create_build_from_storage_result]
 
-    # [START howto_operator_create_build_from_repo]
+    # [START howto_operator_create_build_from_repo_async]
     create_build_from_repo = CloudBuildCreateBuildOperator(
-        task_id="create_build_from_repo", project_id=PROJECT_ID, build=create_build_from_repo_body
+        task_id="create_build_from_repo",
+        project_id=PROJECT_ID,
+        build=create_build_from_repo_body,
+        deferrable=True,
     )
-    # [END howto_operator_create_build_from_repo]
+    # [END howto_operator_create_build_from_repo_async]
 
     # [START howto_operator_create_build_from_repo_result]
     create_build_from_repo_result = BashOperator(
@@ -126,18 +132,21 @@ with models.DAG(
 
     # [START howto_operator_list_builds]
     list_builds = CloudBuildListBuildsOperator(
-        task_id="list_builds", project_id=PROJECT_ID, location="global"
+        task_id="list_builds",
+        project_id=PROJECT_ID,
+        location="global",
     )
     # [END howto_operator_list_builds]
 
-    # [START howto_operator_create_build_without_wait]
+    # [START howto_operator_create_build_without_wait_async]
     create_build_without_wait = CloudBuildCreateBuildOperator(
         task_id="create_build_without_wait",
         project_id=PROJECT_ID,
         build=create_build_from_repo_body,
         wait=False,
+        deferrable=True,
     )
-    # [END howto_operator_create_build_without_wait]
+    # [END howto_operator_create_build_without_wait_async]
 
     # [START howto_operator_cancel_build]
     cancel_build = CloudBuildCancelBuildOperator(
@@ -163,17 +172,20 @@ with models.DAG(
     )
     # [END howto_operator_get_build]
 
-    # [START howto_operator_gcp_create_build_from_yaml_body]
+    # [START howto_operator_gcp_create_build_from_yaml_body_async]
     create_build_from_file = CloudBuildCreateBuildOperator(
         task_id="create_build_from_file",
         project_id=PROJECT_ID,
         build=yaml.safe_load((Path(CURRENT_FOLDER) / "resources" / "example_cloud_build.yaml").read_text()),
         params={"name": "Airflow"},
+        deferrable=True,
     )
-    # [END howto_operator_gcp_create_build_from_yaml_body]
+    # [END howto_operator_gcp_create_build_from_yaml_body_async]
 
     delete_bucket_src = GCSDeleteBucketOperator(
-        task_id="delete_bucket_src", bucket_name=BUCKET_NAME_SRC, trigger_rule=TriggerRule.ALL_DONE
+        task_id="delete_bucket_src",
+        bucket_name=BUCKET_NAME_SRC,
+        trigger_rule=TriggerRule.ALL_DONE,
     )
 
     chain(
diff --git a/tests/system/providers/google/cloud/cloud_build/example_cloud_build_trigger.py b/tests/system/providers/google/cloud/cloud_build/example_cloud_build_trigger.py
index 91e48415d6..5b909dd966 100644
--- a/tests/system/providers/google/cloud/cloud_build/example_cloud_build_trigger.py
+++ b/tests/system/providers/google/cloud/cloud_build/example_cloud_build_trigger.py
@@ -37,9 +37,11 @@ from airflow.providers.google.cloud.operators.cloud_build import (
     CloudBuildRunBuildTriggerOperator,
     CloudBuildUpdateBuildTriggerOperator,
 )
+from airflow.utils.trigger_rule import TriggerRule
 
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
 PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT")
+TRIGGER_NAME = f"cloud-build-trigger-{ENV_ID}"
 
 DAG_ID = "example_gcp_cloud_build_trigger"
 
@@ -47,7 +49,7 @@ GCP_SOURCE_REPOSITORY_NAME = "test-cloud-build-repo"
 
 # [START howto_operator_gcp_create_build_trigger_body]
 create_build_trigger_body = {
-    "name": f"test-cloud-build-trigger-{ENV_ID}",
+    "name": TRIGGER_NAME,
     "trigger_template": {
         "project_id": PROJECT_ID,
         "repo_name": GCP_SOURCE_REPOSITORY_NAME,
@@ -59,7 +61,7 @@ create_build_trigger_body = {
 
 # [START howto_operator_gcp_update_build_trigger_body]
 update_build_trigger_body = {
-    "name": f"test-cloud-build-trigger-{ENV_ID}",
+    "name": TRIGGER_NAME,
     "trigger_template": {
         "project_id": PROJECT_ID,
         "repo_name": GCP_SOURCE_REPOSITORY_NAME,
@@ -126,10 +128,14 @@ with models.DAG(
         trigger_id=build_trigger_id,
     )
     # [END howto_operator_delete_build_trigger]
+    delete_build_trigger.trigger_rule = TriggerRule.ALL_DONE
 
     # [START howto_operator_list_build_triggers]
     list_build_triggers = CloudBuildListBuildTriggersOperator(
-        task_id="list_build_triggers", project_id=PROJECT_ID, location="global", page_size=5
+        task_id="list_build_triggers",
+        project_id=PROJECT_ID,
+        location="global",
+        page_size=5,
     )
     # [END howto_operator_list_build_triggers]
 


[airflow] 07/12: Improve "other" test category selection (#28630)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6f2544886826880ccce80e48e5b864b6c11eb7dd
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Dec 28 23:09:16 2022 +0100

    Improve "other" test category selection (#28630)
    
    The "Other" test category automatically selects all tests that
    are not included in any of the regular categories. That is to
    make sure that we do not forget to add any directory that
    has been added. However this led to a long directory selection
    for "Other" category including system tests that have been
    automatically added there. However those tests are always skipped
    in regular tests and collecting those tests during "Other"
    execution is not needed and slows it down.
    
    Similarly, system test changes were treated as an "other change"
    for incoming PRs. This meant that any change to system tests
    would trigger "all tests" rather than a selective subset of
    them - the same as any core change.
    
    However, system tests are also part of the documentation,
    so any change to system tests should trigger the docs
    build.
    
    This change improves it in a few ways:
    
    * "Other" tests no longer include "System Tests" - they are
      treated the same way as the other test categories (but not
      included in test category selection for now, until we get
      a good way to integrate System Tests with Breeze).
    
    * They are also no longer treated as an "other" change
      when considering which tests to run. Changes to system tests
      will not trigger "all" tests, only those that the accompanying
      changes would trigger.
    
    * Changes that touch only system tests do, however, trigger the docs
      build, because that is triggered by any source change.
    
    * The __pycache__ directories are removed from the list of
      "Other" packages to run.
    
    * In order to make sure system tests are pytest-collectable,
      we perform pytest collection for all tests right after downloading
      the CI images and verifying them. This makes sure that the
      tests are collectable before we even attempt to run them - this way
      we avoid unnecessary machine spin-up and Breeze installation for the
      multiple jobs that run the tests. This slows down feedback time
      a little, but should increase overall robustness of the test suite.
    
    * An old, unused nosetests collection script that used to do the same
      has been removed (it was discovered during implementation).
    
    Noticed in: #28319
    
    (cherry picked from commit e8657ce5596575a0408d97f6b4a72e7d185edc7f)
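
    As a rough illustration of the new collection check (this is only a
    sketch, not the script added by this commit; the "tests" path and the
    exit handling here are assumptions), a collection-only pytest run can
    act as a fast smoke test before any test job is started:

        import subprocess
        import sys

        # Collect tests without executing them; a non-zero exit code means
        # at least one test module could not be imported/collected.
        result = subprocess.run(
            ["pytest", "--collect-only", "-qqqq", "--disable-warnings", "tests"],
            check=False,
        )
        if result.returncode != 0:
            print("Test collection failed - fix the failing imports first.")
            sys.exit(result.returncode)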
---
 .github/workflows/ci.yml                           |  6 +++--
 Dockerfile.ci                                      |  4 +++-
 .../src/airflow_breeze/utils/selective_checks.py   | 10 ++++++--
 dev/breeze/tests/test_selective_checks.py          | 25 +++++++++++++++-----
 scripts/docker/entrypoint_ci.sh                    |  4 +++-
 scripts/in_container/run_extract_tests.sh          | 25 --------------------
 ...est_collection.py => test_pytest_collection.py} | 27 ++++++++++++++++------
 7 files changed, 57 insertions(+), 44 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 4c8bc96151..ce5004c7d9 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -540,6 +540,8 @@ jobs:
         env:
           PYTHON_VERSIONS: ${{ needs.build-info.outputs.python-versions-list-as-string }}
           DEBUG_RESOURCES: ${{needs.build-info.outputs.debug-resources}}
+      - name: "Tests Pytest collection: ${{matrix.python-version}}"
+        run: breeze shell "python /opt/airflow/scripts/in_container/test_pytest_collection.py"
       - name: "Fix ownership"
         run: breeze ci fix-ownership
         if: always()
@@ -867,7 +869,7 @@ jobs:
       - name: "Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         run: breeze testing tests --run-in-parallel
       - name: "Tests ARM Pytest collection: ${{matrix.python-version}}"
-        run: breeze shell "python /opt/airflow/scripts/in_container/test_arm_pytest_collection.py"
+        run: breeze shell "python /opt/airflow/scripts/in_container/test_pytest_collection.py" arm
       - name: "Post Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         uses: ./.github/actions/post_tests
 
@@ -992,7 +994,7 @@ jobs:
       - name: "Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         run: breeze testing tests --run-in-parallel
       - name: "Tests ARM Pytest collection: ${{matrix.python-version}}"
-        run: breeze shell "python /opt/airflow/scripts/in_container/test_arm_pytest_collection.py"
+        run: breeze shell "python /opt/airflow/scripts/in_container/test_pytest_collection.py" arm
       - name: "Post Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         uses: ./.github/actions/post_tests
 
diff --git a/Dockerfile.ci b/Dockerfile.ci
index 72b01bb098..369faab2ab 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -881,7 +881,7 @@ declare -a SELECTED_TESTS CLI_TESTS API_TESTS PROVIDERS_TESTS CORE_TESTS WWW_TES
 
 function find_all_other_tests() {
     local all_tests_dirs
-    all_tests_dirs=$(find "tests" -type d)
+    all_tests_dirs=$(find "tests" -type d ! -name '__pycache__')
     all_tests_dirs=$(echo "${all_tests_dirs}" | sed "/tests$/d" )
     all_tests_dirs=$(echo "${all_tests_dirs}" | sed "/tests\/dags/d" )
     local path
@@ -915,6 +915,7 @@ else
     WWW_TESTS=("tests/www")
     HELM_CHART_TESTS=("tests/charts")
     INTEGRATION_TESTS=("tests/integration")
+    SYSTEM_TESTS=("tests/system")
     ALL_TESTS=("tests")
     ALL_PRESELECTED_TESTS=(
         "${CLI_TESTS[@]}"
@@ -925,6 +926,7 @@ else
         "${CORE_TESTS[@]}"
         "${ALWAYS_TESTS[@]}"
         "${WWW_TESTS[@]}"
+        "${SYSTEM_TESTS[@]}"
     )
 
     NO_PROVIDERS_INTEGRATION_TESTS=(
diff --git a/dev/breeze/src/airflow_breeze/utils/selective_checks.py b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
index 2aee00491b..75fb5f0faa 100644
--- a/dev/breeze/src/airflow_breeze/utils/selective_checks.py
+++ b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
@@ -70,6 +70,7 @@ class FileGroupForCi(Enum):
     SETUP_FILES = "setup_files"
     DOC_FILES = "doc_files"
     WWW_FILES = "www_files"
+    SYSTEM_TEST_FILES = "system_tests"
     KUBERNETES_FILES = "kubernetes_files"
     ALL_PYTHON_FILES = "all_python_files"
     ALL_SOURCE_FILES = "all_sources_for_tests"
@@ -159,6 +160,9 @@ CI_FILE_GROUP_MATCHES = HashableDict(
             "^tests",
             "^kubernetes_tests",
         ],
+        FileGroupForCi.SYSTEM_TEST_FILES: [
+            "^tests/system/",
+        ],
     }
 )
 
@@ -178,7 +182,6 @@ TEST_TYPE_MATCHES = HashableDict(
         SelectiveUnitTestTypes.PROVIDERS: [
             "^airflow/providers/",
             "^tests/providers/",
-            "^tests/system/",
         ],
         SelectiveUnitTestTypes.WWW: ["^airflow/www", "^tests/www"],
     }
@@ -523,9 +526,12 @@ class SelectiveChecks:
         )
 
         kubernetes_files = self._matching_files(FileGroupForCi.KUBERNETES_FILES, CI_FILE_GROUP_MATCHES)
+        system_test_files = self._matching_files(FileGroupForCi.SYSTEM_TEST_FILES, CI_FILE_GROUP_MATCHES)
         all_source_files = self._matching_files(FileGroupForCi.ALL_SOURCE_FILES, CI_FILE_GROUP_MATCHES)
 
-        remaining_files = set(all_source_files) - set(matched_files) - set(kubernetes_files)
+        remaining_files = (
+            set(all_source_files) - set(matched_files) - set(kubernetes_files) - set(system_test_files)
+        )
         count_remaining_files = len(remaining_files)
         if count_remaining_files > 0:
             get_console().print(
diff --git a/dev/breeze/tests/test_selective_checks.py b/dev/breeze/tests/test_selective_checks.py
index 5bf2ac41c3..05fcfa3567 100644
--- a/dev/breeze/tests/test_selective_checks.py
+++ b/dev/breeze/tests/test_selective_checks.py
@@ -177,7 +177,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                 (
                     "INTHEWILD.md",
                     "chart/aaaa.txt",
-                    "tests/system/providers/airbyte/file.py",
+                    "tests/providers/airbyte/file.py",
                 ),
                 {
                     "all-python-versions": "['3.7']",
@@ -214,10 +214,9 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str):
                     "docs-build": "true",
                     "run-kubernetes-tests": "true",
                     "upgrade-to-newer-dependencies": "false",
-                    "test-types": "Always Providers",
+                    "test-types": "Always",
                 },
-                id="Helm tests, all providers as common util system file changed, kubernetes tests and "
-                "docs should run even if unimportant files were added",
+                id="Docs should run even if unimportant files were added",
             )
         ),
         (
@@ -465,7 +464,7 @@ def test_expected_output_full_tests_needed(
                 "skip-provider-tests": "true",
                 "test-types": "API Always CLI Core Other WWW",
             },
-            id="All tests except Providers and should run if core file changed in non-main branch",
+            id="All tests except Providers should run if core file changed in non-main branch",
         ),
     ],
 )
@@ -501,6 +500,20 @@ def test_expected_output_pull_request_v2_3(
             },
             id="Nothing should run if only non-important files changed",
         ),
+        pytest.param(
+            ("tests/system/any_file.py",),
+            {
+                "all-python-versions": "['3.7']",
+                "all-python-versions-list-as-string": "3.7",
+                "image-build": "true",
+                "needs-helm-tests": "false",
+                "run-tests": "true",
+                "docs-build": "true",
+                "upgrade-to-newer-dependencies": "false",
+                "test-types": "Always",
+            },
+            id="Only Always and docs build should run if only system tests changed",
+        ),
         pytest.param(
             (
                 "airflow/cli/test.py",
@@ -538,7 +551,7 @@ def test_expected_output_pull_request_v2_3(
                 "skip-provider-tests": "false",
                 "test-types": "API Always CLI Core Other Providers WWW",
             },
-            id="All tests except should run if core file changed",
+            id="All tests should run if core file changed",
         ),
     ],
 )
diff --git a/scripts/docker/entrypoint_ci.sh b/scripts/docker/entrypoint_ci.sh
index 3f19ce4bae..9454626d6c 100755
--- a/scripts/docker/entrypoint_ci.sh
+++ b/scripts/docker/entrypoint_ci.sh
@@ -333,7 +333,7 @@ declare -a SELECTED_TESTS CLI_TESTS API_TESTS PROVIDERS_TESTS CORE_TESTS WWW_TES
 # - so that we do not skip any in the future if new directories are added
 function find_all_other_tests() {
     local all_tests_dirs
-    all_tests_dirs=$(find "tests" -type d)
+    all_tests_dirs=$(find "tests" -type d ! -name '__pycache__')
     all_tests_dirs=$(echo "${all_tests_dirs}" | sed "/tests$/d" )
     all_tests_dirs=$(echo "${all_tests_dirs}" | sed "/tests\/dags/d" )
     local path
@@ -367,6 +367,7 @@ else
     WWW_TESTS=("tests/www")
     HELM_CHART_TESTS=("tests/charts")
     INTEGRATION_TESTS=("tests/integration")
+    SYSTEM_TESTS=("tests/system")
     ALL_TESTS=("tests")
     ALL_PRESELECTED_TESTS=(
         "${CLI_TESTS[@]}"
@@ -377,6 +378,7 @@ else
         "${CORE_TESTS[@]}"
         "${ALWAYS_TESTS[@]}"
         "${WWW_TESTS[@]}"
+        "${SYSTEM_TESTS[@]}"
     )
 
     NO_PROVIDERS_INTEGRATION_TESTS=(
diff --git a/scripts/in_container/run_extract_tests.sh b/scripts/in_container/run_extract_tests.sh
deleted file mode 100755
index dc7b972a76..0000000000
--- a/scripts/in_container/run_extract_tests.sh
+++ /dev/null
@@ -1,25 +0,0 @@
-#!/usr/bin/env bash
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-# shellcheck source=scripts/in_container/_in_container_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/_in_container_script_init.sh"
-
-TMP_FILE=$(mktemp)
-
-nosetests --collect-only --with-xunit --xunit-file="${TMP_FILE}"
-
-python "${AIRFLOW_SOURCES}/tests/test_utils/get_all_tests.py" "${TMP_FILE}" | sort >> "${HOME}/all_tests.txt"
diff --git a/scripts/in_container/test_arm_pytest_collection.py b/scripts/in_container/test_pytest_collection.py
similarity index 73%
rename from scripts/in_container/test_arm_pytest_collection.py
rename to scripts/in_container/test_pytest_collection.py
index 43277c5562..e5c9d44407 100755
--- a/scripts/in_container/test_arm_pytest_collection.py
+++ b/scripts/in_container/test_pytest_collection.py
@@ -20,15 +20,18 @@ from __future__ import annotations
 import json
 import re
 import subprocess
+import sys
 from pathlib import Path
 
 from rich.console import Console
 
 AIRFLOW_SOURCES_ROOT = Path(__file__).parents[2].resolve()
 
-if __name__ == "__main__":
-    console = Console(width=400, color_system="standard")
+console = Console(width=400, color_system="standard")
+
 
+def remove_packages_missing_on_arm():
+    console.print("[bright_blue]Removing packages missing on ARM.")
     provider_dependencies = json.loads(
         (AIRFLOW_SOURCES_ROOT / "generated" / "provider_dependencies.json").read_text()
     )
@@ -43,11 +46,21 @@ if __name__ == "__main__":
         + "\n"
     )
     subprocess.run(["pip", "uninstall", "-y"] + all_dependencies_to_remove)
+
+
+if __name__ == "__main__":
+    arm = False
+    if len(sys.argv) > 1 and sys.argv[1].lower() == "arm":
+        arm = True
+        remove_packages_missing_on_arm()
     result = subprocess.run(["pytest", "--collect-only", "-qqqq", "--disable-warnings", "tests"], check=False)
     if result.returncode != 0:
-        console.print("\n[red]Test collection in ARM environment failed.")
-        console.print(
-            "[yellow]You should wrap the failing imports in try/except/skip clauses\n"
-            "See similar examples as skipped tests right above.\n"
-        )
+        console.print("\n[red]Test collection failed.")
+        if arm:
+            console.print(
+                "[yellow]You should wrap the failing imports in try/except/skip clauses\n"
+                "See similar examples as skipped tests right above.\n"
+            )
+        else:
+            console.print("[yellow]Please add missing packages\n")
         exit(result.returncode)


[airflow] 11/12: Switch to ruff for faster static checks (#28893)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 14fdfc07acc9da01b0de4afa915cba8288d8d954
Author: Ash Berlin-Taylor <as...@apache.org>
AuthorDate: Thu Jan 12 21:40:23 2023 +0000

    Switch to ruff for faster static checks (#28893)
    
    Gone are:
    
    - isort
    - pyupgrade
    - pydocstyle
    - yesqa
    - autoflake
    - flake8
    
    All replaced with [ruff](https://github.com/charliermarsh/ruff). A large
    part of ruff's performance comes from the fact that it makes very good
    use of multiple cores, and since most docker virtual machines have only
    one or two cores, I have chosen to run it directly, not inside the breeze
    docker container, so we get the full benefit of the speed.
    
    * Work around namespace packages issue for providers
    
    Ruff is currently detecting "google" as a the name of the current
    package, so it thinks it goes in the "first" party import section
    
    (cherry picked from commit ce858a5d719fb1dff85ad7e4747f0777404d1f56)
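
    As a hedged sketch of how the new check can be exercised locally
    outside of pre-commit (the path list below is a placeholder; the flags
    mirror the hook entry added in .pre-commit-config.yaml further down):

        import subprocess
        import sys

        # Run ruff with autofix over the given paths, mirroring the
        # pre-commit entry: ruff --fix --no-update-check --force-exclude
        paths = sys.argv[1:] or ["airflow", "tests"]
        result = subprocess.run(
            ["ruff", "--fix", "--no-update-check", "--force-exclude", *paths],
            check=False,
        )
        sys.exit(result.returncode)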
---
 .flake8                                            |   8 --
 .github/boring-cyborg.yml                          |   1 -
 .github/workflows/ci.yml                           |   1 +
 .pre-commit-config.yaml                            |  66 +++----------
 .rat-excludes                                      |   1 -
 STATIC_CODE_CHECKS.rst                             |  12 +--
 airflow/cli/commands/connection_command.py         |   2 +-
 airflow/compat/functools.pyi                       |   1 +
 airflow/decorators/__init__.pyi                    |  13 ++-
 airflow/example_dags/example_sensor_decorator.py   |   1 +
 airflow/example_dags/tutorial_taskflow_api.py      |   1 +
 airflow/hooks/dbapi.py                             |   6 +-
 airflow/migrations/db_types.pyi                    |   1 +
 airflow/providers/amazon/aws/hooks/emr.py          |  10 +-
 airflow/providers/amazon/aws/operators/sns.py      |   2 +-
 .../amazon/aws/transfers/dynamodb_to_s3.py         |   4 +-
 .../providers/cncf/kubernetes/utils/__init__.py    |   2 +
 .../providers/google/cloud/operators/dataproc.py   |   2 +-
 .../google/cloud/operators/kubernetes_engine.py    |   2 +-
 airflow/providers/microsoft/azure/hooks/wasb.py    |   2 +-
 airflow/providers/odbc/hooks/odbc.py               |   2 +-
 airflow/utils/context.pyi                          |   5 +-
 airflow/utils/log/action_logger.py                 |   1 +
 airflow/utils/process_utils.py                     |   4 +-
 .../src/airflow_breeze/commands/main_command.py    |   4 +-
 dev/breeze/src/airflow_breeze/global_constants.py  |   9 +-
 dev/breeze/src/airflow_breeze/pre_commit_ids.py    |   8 +-
 .../pre_commit_ids_TEMPLATE.py.jinja2              |   1 +
 docs/apache-airflow/img/airflow_erd.sha256         |   2 +-
 docs/build_docs.py                                 |   6 +-
 docs/exts/provider_init_hack.py                    |   4 +-
 docs/spelling_wordlist.txt                         |   1 +
 images/breeze/output-commands-hash.txt             |   2 +-
 images/breeze/output_static-checks.svg             |  58 ++++++-----
 provider_packages/.flake8                          |   1 -
 pyproject.toml                                     | 108 ++++++++++++++++++---
 .../pre_commit_check_pre_commit_hooks.py           |  10 +-
 scripts/ci/pre_commit/pre_commit_flake8.py         |  72 --------------
 scripts/in_container/run_flake8.sh                 |  20 ----
 setup.py                                           |   8 +-
 tests/api_connexion/endpoints/test_dag_endpoint.py |  15 ---
 .../providers/google/suite/hooks/test_calendar.py  |   3 +-
 .../cncf/kubernetes/example_spark_kubernetes.py    |   3 +-
 .../google/cloud/bigtable/example_bigtable.py      |   6 +-
 tests/test_utils/get_all_tests.py                  |   4 +-
 45 files changed, 205 insertions(+), 290 deletions(-)

diff --git a/.flake8 b/.flake8
deleted file mode 100644
index 14de564a32..0000000000
--- a/.flake8
+++ /dev/null
@@ -1,8 +0,0 @@
-[flake8]
-max-line-length = 110
-ignore = E203,E231,E731,W504,I001,W503
-exclude = .svn,CVS,.bzr,.hg,.git,__pycache__,.eggs,*.egg,node_modules
-format = ${cyan}%(path)s${reset}:${yellow_bold}%(row)d${reset}:${green_bold}%(col)d${reset}: ${red_bold}%(code)s${reset} %(text)s
-per-file-ignores =
-    airflow/models/__init__.py:F401
-    airflow/models/sqla_models.py:F401
diff --git a/.github/boring-cyborg.yml b/.github/boring-cyborg.yml
index 7a444cf253..de8a944d70 100644
--- a/.github/boring-cyborg.yml
+++ b/.github/boring-cyborg.yml
@@ -96,7 +96,6 @@ labelPRBasedOnFilePath:
     - .asf.yaml
     - .bash_completion
     - .dockerignore
-    - .flake8
     - .hadolint.yaml
     - .pre-commit-config.yaml
     - .rat-excludes
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index ce5004c7d9..589582ae8a 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -583,6 +583,7 @@ jobs:
           COLUMNS: "250"
           SKIP_GROUP_OUTPUT: "true"
           DEFAULT_BRANCH: ${{ needs.build-info.outputs.default-branch }}
+          RUFF_FORMAT: "github"
       - name: "Fix ownership"
         run: breeze ci fix-ownership
         if: always()
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 93ca966d0b..07e5d18d58 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -156,11 +156,17 @@ repos:
         additional_dependencies: ['pyyaml']
         pass_filenames: false
         require_serial: true
-  - repo: https://github.com/PyCQA/isort
-    rev: 5.11.2
-    hooks:
-      - id: isort
-        name: Run isort to sort imports in Python files
+      - id: ruff
+        name: ruff
+        language: python
+        require_serial: true
+        pass_filenames: true
+        # Since ruff makes use of multiple cores we _purposefully_ don't run this in docker so it can use the
+        # host CPU to its fullest
+        entry: ruff --fix --no-update-check --force-exclude
+        additional_dependencies: ['ruff>=0.0.219']
+        files: \.pyi?$
+        exclude: ^airflow/_vendor/
   - repo: https://github.com/psf/black
     rev: 22.12.0
     hooks:
@@ -223,14 +229,6 @@ repos:
           - "4"
         files: ^chart/values\.schema\.json$|^chart/values_schema\.schema\.json$
         pass_filenames: true
-  # TODO: Bump to Python 3.8 when support for Python 3.7 is dropped in Airflow.
-  - repo: https://github.com/asottile/pyupgrade
-    rev: v3.3.1
-    hooks:
-      - id: pyupgrade
-        name: Upgrade Python code automatically
-        args: ["--py37-plus"]
-        exclude: ^airflow/_vendor/
   - repo: https://github.com/pre-commit/pygrep-hooks
     rev: v1.9.0
     hooks:
@@ -248,35 +246,6 @@ repos:
         entry: yamllint -c yamllint-config.yml --strict
         types: [yaml]
         exclude: ^.*init_git_sync\.template\.yaml$|^.*airflow\.template\.yaml$|^chart/(?:templates|files)/.*\.yaml$|openapi/.*\.yaml$|^\.pre-commit-config\.yaml$|^airflow/_vendor/
-  - repo: https://github.com/pycqa/pydocstyle
-    rev: 6.1.1
-    hooks:
-      - id: pydocstyle
-        name: Run pydocstyle
-        args:
-          - --convention=pep257
-          - --add-ignore=D100,D102,D103,D104,D105,D107,D205,D400,D401
-        exclude: |
-          (?x)
-          ^tests/.*\.py$|
-          ^scripts/.*\.py$|
-          ^dev|
-          ^provider_packages|
-          ^docker_tests|
-          ^kubernetes_tests|
-          .*example_dags/.*|
-          ^chart/.*\.py$|
-          ^airflow/_vendor/
-        additional_dependencies: ['toml']
-  - repo: https://github.com/asottile/yesqa
-    rev: v1.4.0
-    hooks:
-      - id: yesqa
-        name: Remove unnecessary noqa statements
-        exclude: |
-          (?x)
-          ^airflow/_vendor/
-        additional_dependencies: ['flake8>=4.0.1']
   - repo: https://github.com/ikamensh/flynt
     rev: '0.77'
     hooks:
@@ -315,11 +284,6 @@ repos:
         types: [file, text]
         exclude: ^airflow/_vendor/|^clients/gen/go\.sh$|^\.gitmodules$
         additional_dependencies: ['rich>=12.4.4']
-      - id: static-check-autoflake
-        name: Remove all unused code
-        entry: autoflake --remove-all-unused-imports --ignore-init-module-imports --in-place
-        language: python
-        additional_dependencies: ['autoflake']
       - id: lint-openapi
         name: Lint OpenAPI using spectral
         language: docker_image
@@ -891,14 +855,6 @@ repos:
         exclude: ^docs/rtd-deprecation
         require_serial: true
         additional_dependencies: ['rich>=12.4.4', 'inputimeout']
-      - id: run-flake8
-        name: Run flake8
-        language: python
-        entry: ./scripts/ci/pre_commit/pre_commit_flake8.py
-        files: \.py$
-        pass_filenames: true
-        exclude: ^airflow/_vendor/
-        additional_dependencies: ['rich>=12.4.4', 'inputimeout']
       - id: check-provider-yaml-valid
         name: Validate provider.yaml files
         pass_filenames: false
diff --git a/.rat-excludes b/.rat-excludes
index 1e16d61a67..138e8a0787 100644
--- a/.rat-excludes
+++ b/.rat-excludes
@@ -15,7 +15,6 @@
 .codespellignorelines
 .eslintrc
 .eslintignore
-.flake8
 .rat-excludes
 .stylelintignore
 .stylelintrc
diff --git a/STATIC_CODE_CHECKS.rst b/STATIC_CODE_CHECKS.rst
index 1b4732cd44..a31a29171e 100644
--- a/STATIC_CODE_CHECKS.rst
+++ b/STATIC_CODE_CHECKS.rst
@@ -260,8 +260,6 @@ require Breeze Docker image to be build locally.
 |                                                           | * Add license for all md files                                   |         |
 |                                                           | * Add license for all other files                                |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
-| isort                                                     | Run isort to sort imports in Python files                        |         |
-+-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | lint-chart-schema                                         | Lint chart/values.schema.json file                               |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | lint-css                                                  | stylelint                                                        |         |
@@ -286,17 +284,13 @@ require Breeze Docker image to be build locally.
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | pretty-format-json                                        | Format json files                                                |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
-| pydocstyle                                                | Run pydocstyle                                                   |         |
-+-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | python-no-log-warn                                        | Check if there are no deprecate log warn                         |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
-| pyupgrade                                                 | Upgrade Python code automatically                                |         |
-+-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | replace-bad-characters                                    | Replace bad characters                                           |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | rst-backticks                                             | Check if RST files use double backticks for code                 |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
-| run-flake8                                                | Run flake8                                                       | *       |
+| ruff                                                      | ruff                                                             |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | run-mypy                                                  | * Run mypy for dev                                               | *       |
 |                                                           | * Run mypy for core                                              |         |
@@ -305,8 +299,6 @@ require Breeze Docker image to be build locally.
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | run-shellcheck                                            | Check Shell scripts syntax correctness                           |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
-| static-check-autoflake                                    | Remove all unused code                                           |         |
-+-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | trailing-whitespace                                       | Remove trailing whitespace at end of line                        |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | ts-compile-and-lint-javascript                            | TS types generation and ESLint against current UI files          |         |
@@ -340,8 +332,6 @@ require Breeze Docker image to be build locally.
 | update-version                                            | Update version to the latest version in the documentation        |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | yamllint                                                  | Check YAML files with yamllint                                   |         |
-+-----------------------------------------------------------+------------------------------------------------------------------+---------+
-| yesqa                                                     | Remove unnecessary noqa statements                               |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 
   .. END AUTO-GENERATED STATIC CHECK LIST
diff --git a/airflow/cli/commands/connection_command.py b/airflow/cli/commands/connection_command.py
index c19490da93..9206ae2098 100644
--- a/airflow/cli/commands/connection_command.py
+++ b/airflow/cli/commands/connection_command.py
@@ -209,7 +209,7 @@ def connections_add(args):
     if has_json and has_uri:
         raise SystemExit("Cannot supply both conn-uri and conn-json")
 
-    if has_type and not (args.conn_type in _get_connection_types()):
+    if has_type and args.conn_type not in _get_connection_types():
         warnings.warn(f"The type provided to --conn-type is invalid: {args.conn_type}")
         warnings.warn(
             f"Supported --conn-types are:{_get_connection_types()}."
diff --git a/airflow/compat/functools.pyi b/airflow/compat/functools.pyi
index 8dabbd6004..32cbbaa431 100644
--- a/airflow/compat/functools.pyi
+++ b/airflow/compat/functools.pyi
@@ -18,6 +18,7 @@
 
 # This stub exists to work around false linter errors due to python/mypy#10408.
 # TODO: Remove this file after the upstream fix is available in our toolchain.
+from __future__ import annotations
 
 from typing import Callable, TypeVar
 
diff --git a/airflow/decorators/__init__.pyi b/airflow/decorators/__init__.pyi
index b0edc7d2c2..c6a90139a8 100644
--- a/airflow/decorators/__init__.pyi
+++ b/airflow/decorators/__init__.pyi
@@ -18,9 +18,10 @@
 # dynamically generated task decorators. Functions declared in this stub do not
 # necessarily exist at run time. See "Creating Custom @task Decorators"
 # documentation for more details.
+from __future__ import annotations
 
 from datetime import timedelta
-from typing import Any, Callable, Iterable, Mapping, Union, overload
+from typing import Any, Callable, Iterable, Mapping, overload
 
 from kubernetes.client import models as k8s
 
@@ -30,6 +31,7 @@ from airflow.decorators.external_python import external_python_task
 from airflow.decorators.python import python_task
 from airflow.decorators.python_virtualenv import virtualenv_task
 from airflow.decorators.sensor import sensor_task
+from airflow.decorators.short_circuit import short_circuit_task
 from airflow.decorators.task_group import task_group
 from airflow.kubernetes.secret import Secret
 from airflow.models.dag import dag
@@ -98,8 +100,8 @@ class TaskDecoratorCollection:
         multiple_outputs: bool | None = None,
         # 'python_callable', 'op_args' and 'op_kwargs' since they are filled by
         # _PythonVirtualenvDecoratedOperator.
-        requirements: Union[None, Iterable[str], str] = None,
-        python_version: Union[None, str, int, float] = None,
+        requirements: None | Iterable[str] | str = None,
+        python_version: None | str | int | float = None,
         use_dill: bool = False,
         system_site_packages: bool = True,
         templates_dict: Mapping[str, Any] | None = None,
@@ -263,7 +265,8 @@ class TaskDecoratorCollection:
             None - No networking for this container
             container:<name|id> - Use the network stack of another container specified via <name|id>
             host - Use the host network stack. Incompatible with `port_bindings`
-            '<network-name>|<network-id>' - Connects the container to user created network(using `docker network create` command)
+            '<network-name>|<network-id>' - Connects the container to user created network(using `docker
+            network create` command)
         :param tls_ca_cert: Path to a PEM-encoded certificate authority
             to secure the docker connection.
         :param tls_client_cert: Path to the PEM-encoded certificate
@@ -448,6 +451,6 @@ class TaskDecoratorCollection:
         :param max_wait: maximum wait interval between pokes, can be ``timedelta`` or ``float`` seconds
         """
     @overload
-    def sensor(self, python_callable: Optional[FParams, FReturn] = None) -> Task[FParams, FReturn]: ...
+    def sensor(self, python_callable: FParams | FReturn | None = None) -> Task[FParams, FReturn]: ...
 
 task: TaskDecoratorCollection
diff --git a/airflow/example_dags/example_sensor_decorator.py b/airflow/example_dags/example_sensor_decorator.py
index 2197a6c53a..2ead792850 100644
--- a/airflow/example_dags/example_sensor_decorator.py
+++ b/airflow/example_dags/example_sensor_decorator.py
@@ -27,6 +27,7 @@ import pendulum
 from airflow.decorators import dag, task
 from airflow.sensors.base import PokeReturnValue
 
+
 # [END import_module]
 
 
diff --git a/airflow/example_dags/tutorial_taskflow_api.py b/airflow/example_dags/tutorial_taskflow_api.py
index f41f729af8..27a28f4b79 100644
--- a/airflow/example_dags/tutorial_taskflow_api.py
+++ b/airflow/example_dags/tutorial_taskflow_api.py
@@ -25,6 +25,7 @@ import pendulum
 
 from airflow.decorators import dag, task
 
+
 # [END import_module]
 
 
diff --git a/airflow/hooks/dbapi.py b/airflow/hooks/dbapi.py
index cd4a39af8d..b4cd1be667 100644
--- a/airflow/hooks/dbapi.py
+++ b/airflow/hooks/dbapi.py
@@ -20,8 +20,10 @@ from __future__ import annotations
 import warnings
 
 from airflow.exceptions import RemovedInAirflow3Warning
-from airflow.providers.common.sql.hooks.sql import ConnectorProtocol  # noqa
-from airflow.providers.common.sql.hooks.sql import DbApiHook  # noqa
+from airflow.providers.common.sql.hooks.sql import (
+    ConnectorProtocol,  # noqa
+    DbApiHook,  # noqa
+)
 
 warnings.warn(
     "This module is deprecated. Please use `airflow.providers.common.sql.hooks.sql`.",
diff --git a/airflow/migrations/db_types.pyi b/airflow/migrations/db_types.pyi
index bdde6f9692..7fa9ff24b3 100644
--- a/airflow/migrations/db_types.pyi
+++ b/airflow/migrations/db_types.pyi
@@ -16,6 +16,7 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+from __future__ import annotations
 
 import sqlalchemy as sa
 
diff --git a/airflow/providers/amazon/aws/hooks/emr.py b/airflow/providers/amazon/aws/hooks/emr.py
index 5423dd1af8..171a07263b 100644
--- a/airflow/providers/amazon/aws/hooks/emr.py
+++ b/airflow/providers/amazon/aws/hooks/emr.py
@@ -357,7 +357,7 @@ class EmrContainerHook(AwsBaseHook):
         Submit a job to the EMR Containers API and return the job ID.
         A job run is a unit of work, such as a Spark jar, PySpark script,
         or SparkSQL query, that you submit to Amazon EMR on EKS.
-        See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/emr-containers.html#EMRContainers.Client.start_job_run  # noqa: E501
+        See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/emr-containers.html#EMRContainers.Client.start_job_run
 
         :param name: The name of the job run.
         :param execution_role_arn: The IAM role ARN associated with the job run.
@@ -369,7 +369,7 @@ class EmrContainerHook(AwsBaseHook):
             Use this if you want to specify a unique ID to prevent two jobs from getting started.
         :param tags: The tags assigned to job runs.
         :return: Job ID
-        """
+        """  # noqa: E501
         params = {
             "name": name,
             "virtualClusterId": self.virtual_cluster_id,
@@ -422,10 +422,12 @@ class EmrContainerHook(AwsBaseHook):
     def check_query_status(self, job_id: str) -> str | None:
         """
         Fetch the status of submitted job run. Returns None or one of valid query states.
-        See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/emr-containers.html#EMRContainers.Client.describe_job_run  # noqa: E501
+
+        See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/emr-containers.html#EMRContainers.Client.describe_job_run
+
         :param job_id: Id of submitted job run
         :return: str
-        """
+        """  # noqa: E501
         try:
             response = self.conn.describe_job_run(
                 virtualClusterId=self.virtual_cluster_id,
diff --git a/airflow/providers/amazon/aws/operators/sns.py b/airflow/providers/amazon/aws/operators/sns.py
index 99525c4cb6..2f5b9844bf 100644
--- a/airflow/providers/amazon/aws/operators/sns.py
+++ b/airflow/providers/amazon/aws/operators/sns.py
@@ -15,9 +15,9 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+"""Publish message to SNS queue"""
 from __future__ import annotations
 
-"""Publish message to SNS queue"""
 from typing import TYPE_CHECKING, Sequence
 
 from airflow.models import BaseOperator
diff --git a/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py b/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py
index 155f5439a6..017c897778 100644
--- a/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py
+++ b/airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py
@@ -69,7 +69,7 @@ class DynamoDBToS3Operator(BaseOperator):
     :param dynamodb_table_name: Dynamodb table to replicate data from
     :param s3_bucket_name: S3 bucket to replicate data to
     :param file_size: Flush file to s3 if file size >= file_size
-    :param dynamodb_scan_kwargs: kwargs pass to <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Table.scan>  # noqa: E501
+    :param dynamodb_scan_kwargs: kwargs pass to <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Table.scan>
     :param s3_key_prefix: Prefix of s3 object key
     :param process_func: How we transforms a dynamodb item to bytes. By default we dump the json
     :param aws_conn_id: The Airflow connection used for AWS credentials.
@@ -77,7 +77,7 @@ class DynamoDBToS3Operator(BaseOperator):
         running Airflow in a distributed manner and aws_conn_id is None or
         empty, then default boto3 configuration would be used (and must be
         maintained on each worker node).
-    """
+    """  # noqa: E501
 
     template_fields: Sequence[str] = (
         "s3_bucket_name",
diff --git a/airflow/providers/cncf/kubernetes/utils/__init__.py b/airflow/providers/cncf/kubernetes/utils/__init__.py
index 84e243c6db..69d825b440 100644
--- a/airflow/providers/cncf/kubernetes/utils/__init__.py
+++ b/airflow/providers/cncf/kubernetes/utils/__init__.py
@@ -14,4 +14,6 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from __future__ import annotations
+
 __all__ = ["xcom_sidecar", "pod_manager"]
diff --git a/airflow/providers/google/cloud/operators/dataproc.py b/airflow/providers/google/cloud/operators/dataproc.py
index 24ca7de401..3bb7684ed0 100644
--- a/airflow/providers/google/cloud/operators/dataproc.py
+++ b/airflow/providers/google/cloud/operators/dataproc.py
@@ -129,7 +129,7 @@ class ClusterGenerator:
         ``projects/[PROJECT_STORING_KEYS]/locations/[LOCATION]/keyRings/[KEY_RING_NAME]/cryptoKeys/[KEY_NAME]`` # noqa
     :param enable_component_gateway: Provides access to the web interfaces of default and selected optional
         components on the cluster.
-    """
+    """  # noqa: E501
 
     def __init__(
         self,
diff --git a/airflow/providers/google/cloud/operators/kubernetes_engine.py b/airflow/providers/google/cloud/operators/kubernetes_engine.py
index 045935771c..6e057f6481 100644
--- a/airflow/providers/google/cloud/operators/kubernetes_engine.py
+++ b/airflow/providers/google/cloud/operators/kubernetes_engine.py
@@ -210,7 +210,7 @@ class GKECreateClusterOperator(BaseOperator):
     def _check_input(self) -> None:
         if (
             not all([self.project_id, self.location, self.body])
-            or (isinstance(self.body, dict) and not ("name" in self.body))
+            or (isinstance(self.body, dict) and "name" not in self.body)
             or (
                 isinstance(self.body, dict)
                 and ("initial_node_count" not in self.body and "node_pools" not in self.body)
diff --git a/airflow/providers/microsoft/azure/hooks/wasb.py b/airflow/providers/microsoft/azure/hooks/wasb.py
index 27680a5b69..ff1914061c 100644
--- a/airflow/providers/microsoft/azure/hooks/wasb.py
+++ b/airflow/providers/microsoft/azure/hooks/wasb.py
@@ -442,7 +442,7 @@ class WasbHook(BaseHook):
             self.log.info("Deleted container: %s", container_name)
         except ResourceNotFoundError:
             self.log.info("Unable to delete container %s (not found)", container_name)
-        except:  # noqa: E722
+        except:
             self.log.info("Error deleting container: %s", container_name)
             raise
 
diff --git a/airflow/providers/odbc/hooks/odbc.py b/airflow/providers/odbc/hooks/odbc.py
index 20e8e8864e..b1d754965e 100644
--- a/airflow/providers/odbc/hooks/odbc.py
+++ b/airflow/providers/odbc/hooks/odbc.py
@@ -145,7 +145,7 @@ class OdbcHook(DbApiHook):
 
             extra_exclude = {"driver", "dsn", "connect_kwargs", "sqlalchemy_scheme"}
             extra_params = {
-                k: v for k, v in self.connection.extra_dejson.items() if not k.lower() in extra_exclude
+                k: v for k, v in self.connection.extra_dejson.items() if k.lower() not in extra_exclude
             }
             for k, v in extra_params.items():
                 conn_str += f"{k}={v};"
diff --git a/airflow/utils/context.pyi b/airflow/utils/context.pyi
index c7bab20c85..838162649a 100644
--- a/airflow/utils/context.pyi
+++ b/airflow/utils/context.pyi
@@ -24,8 +24,9 @@
 # attributes are injected at runtime, and giving them a class would trigger
 # undefined attribute errors from Mypy. Hopefully there will be a mechanism to
 # declare "these are defined, but don't error if others are accessed" someday.
+from __future__ import annotations
 
-from typing import Any, Collection, Container, Iterable, Mapping, Union, overload
+from typing import Any, Collection, Container, Iterable, Mapping, overload
 
 from pendulum import DateTime
 
@@ -61,7 +62,7 @@ class Context(TypedDict, total=False):
     data_interval_start: DateTime
     ds: str
     ds_nodash: str
-    exception: Union[KeyboardInterrupt, Exception, str, None]
+    exception: KeyboardInterrupt | Exception | str | None
     execution_date: DateTime
     expanded_ti_count: int | None
     inlets: list
diff --git a/airflow/utils/log/action_logger.py b/airflow/utils/log/action_logger.py
index 1968604fe2..66e71c4653 100644
--- a/airflow/utils/log/action_logger.py
+++ b/airflow/utils/log/action_logger.py
@@ -15,6 +15,7 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+from __future__ import annotations
 
 
 def action_event_from_permission(prefix: str, permission: str) -> str:
diff --git a/airflow/utils/process_utils.py b/airflow/utils/process_utils.py
index 98ff1e9147..6cbd18416e 100644
--- a/airflow/utils/process_utils.py
+++ b/airflow/utils/process_utils.py
@@ -30,9 +30,9 @@ import sys
 from airflow.utils.platform import IS_WINDOWS
 
 if not IS_WINDOWS:
-    import tty
-    import termios
     import pty
+    import termios
+    import tty
 
 from contextlib import contextmanager
 from typing import Generator
diff --git a/dev/breeze/src/airflow_breeze/commands/main_command.py b/dev/breeze/src/airflow_breeze/commands/main_command.py
index 3761c4dabc..60a7598244 100644
--- a/dev/breeze/src/airflow_breeze/commands/main_command.py
+++ b/dev/breeze/src/airflow_breeze/commands/main_command.py
@@ -141,7 +141,7 @@ def check_for_python_emulation():
                 prompt="Are you REALLY sure you want to continue? (answer with y otherwise we exit in 20s)\n",
                 timeout=20,
             )
-            if not user_status.upper() in ["Y", "YES"]:
+            if user_status.upper() not in ["Y", "YES"]:
                 sys.exit(1)
     except TimeoutOccurred:
         get_console().print("\nNo answer, exiting...")
@@ -189,7 +189,7 @@ def check_for_rosetta_environment():
                 prompt="Are you REALLY sure you want to continue? (answer with y otherwise we exit in 20s)\n",
                 timeout=20,
             )
-            if not user_status.upper() in ["Y", "YES"]:
+            if user_status.upper() not in ["Y", "YES"]:
                 sys.exit(1)
     except TimeoutOccurred:
         get_console().print("\nNo answer, exiting...")
diff --git a/dev/breeze/src/airflow_breeze/global_constants.py b/dev/breeze/src/airflow_breeze/global_constants.py
index 6079f2d928..323efd1c19 100644
--- a/dev/breeze/src/airflow_breeze/global_constants.py
+++ b/dev/breeze/src/airflow_breeze/global_constants.py
@@ -14,17 +14,16 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-from __future__ import annotations
-
-import json
-from pathlib import Path
-
 """
 Global constants that are used by all other Breeze components.
 """
+from __future__ import annotations
+
+import json
 import platform
 from enum import Enum
 from functools import lru_cache
+from pathlib import Path
 
 from airflow_breeze.utils.host_info_utils import Architecture
 from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT
diff --git a/dev/breeze/src/airflow_breeze/pre_commit_ids.py b/dev/breeze/src/airflow_breeze/pre_commit_ids.py
index 851e68be39..3a95ec574f 100644
--- a/dev/breeze/src/airflow_breeze/pre_commit_ids.py
+++ b/dev/breeze/src/airflow_breeze/pre_commit_ids.py
@@ -20,6 +20,7 @@
 #
 # IF YOU WANT TO MODIFY IT, YOU SHOULD MODIFY THE TEMPLATE
 # `pre_commit_ids_TEMPLATE.py.jinja2` IN the `dev/breeze/src/airflow_breeze` DIRECTORY
+from __future__ import annotations
 
 PRE_COMMIT_LIST = [
     "all",
@@ -78,7 +79,6 @@ PRE_COMMIT_LIST = [
     "flynt",
     "identity",
     "insert-license",
-    "isort",
     "lint-chart-schema",
     "lint-css",
     "lint-dockerfile",
@@ -88,15 +88,12 @@ PRE_COMMIT_LIST = [
     "lint-openapi",
     "mixed-line-ending",
     "pretty-format-json",
-    "pydocstyle",
     "python-no-log-warn",
-    "pyupgrade",
     "replace-bad-characters",
     "rst-backticks",
-    "run-flake8",
+    "ruff",
     "run-mypy",
     "run-shellcheck",
-    "static-check-autoflake",
     "trailing-whitespace",
     "ts-compile-and-lint-javascript",
     "update-black-version",
@@ -114,5 +111,4 @@ PRE_COMMIT_LIST = [
     "update-vendored-in-k8s-json-schema",
     "update-version",
     "yamllint",
-    "yesqa",
 ]
diff --git a/dev/breeze/src/airflow_breeze/pre_commit_ids_TEMPLATE.py.jinja2 b/dev/breeze/src/airflow_breeze/pre_commit_ids_TEMPLATE.py.jinja2
index 12e32af0b9..714e731ece 100644
--- a/dev/breeze/src/airflow_breeze/pre_commit_ids_TEMPLATE.py.jinja2
+++ b/dev/breeze/src/airflow_breeze/pre_commit_ids_TEMPLATE.py.jinja2
@@ -20,5 +20,6 @@
 #
 # IF YOU WANT TO MODIFY IT, YOU SHOULD MODIFY THE TEMPLATE
 # `pre_commit_ids_TEMPLATE.py.jinja2` IN the `dev/breeze/src/airflow_breeze` DIRECTORY
+from __future__ import annotations
 
 PRE_COMMIT_LIST= {{ PRE_COMMIT_IDS }}
diff --git a/docs/apache-airflow/img/airflow_erd.sha256 b/docs/apache-airflow/img/airflow_erd.sha256
index 5b0ac5fc48..2e63d0c0a0 100644
--- a/docs/apache-airflow/img/airflow_erd.sha256
+++ b/docs/apache-airflow/img/airflow_erd.sha256
@@ -1 +1 @@
-a9c9af1ba1a690ea1c77aba3458aff1ef6f7e776f759dae227641422ba6a5856
\ No newline at end of file
+edb1bcac449e2d38c4523cea6094e812da491a01c40cf9f79024d85e69977893
diff --git a/docs/build_docs.py b/docs/build_docs.py
index 273858be76..bc66fc793f 100755
--- a/docs/build_docs.py
+++ b/docs/build_docs.py
@@ -30,6 +30,9 @@ from collections import defaultdict
 from itertools import filterfalse, tee
 from typing import Callable, Iterable, NamedTuple, TypeVar
 
+from rich.console import Console
+from tabulate import tabulate
+
 from docs.exts.docs_build import dev_index_generator, lint_checks
 from docs.exts.docs_build.code_utils import CONSOLE_WIDTH, PROVIDER_INIT_FILE
 from docs.exts.docs_build.docs_builder import DOCS_DIR, AirflowDocsBuilder, get_available_packages
@@ -39,9 +42,6 @@ from docs.exts.docs_build.github_action_utils import with_group
 from docs.exts.docs_build.package_filter import process_package_filters
 from docs.exts.docs_build.spelling_checks import SpellingError, display_spelling_error_summary
 
-from rich.console import Console
-from tabulate import tabulate
-
 TEXT_RED = "\033[31m"
 TEXT_RESET = "\033[0m"
 
diff --git a/docs/exts/provider_init_hack.py b/docs/exts/provider_init_hack.py
index be34d13b3a..fa082afc49 100644
--- a/docs/exts/provider_init_hack.py
+++ b/docs/exts/provider_init_hack.py
@@ -14,13 +14,13 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-from __future__ import annotations
-
 """
 Bugs in sphinx-autoapi using metaclasses prevent us from upgrading to 1.3
 which has implicit namespace support. Until that time, we make it look
 like a real package for building docs
 """
+from __future__ import annotations
+
 import os
 
 from sphinx.application import Sphinx
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 8bd3aa4b75..916675ef3b 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -750,6 +750,7 @@ IRSA
 isfile
 ish
 isn
+isort
 iterable
 iterables
 iteratively
diff --git a/images/breeze/output-commands-hash.txt b/images/breeze/output-commands-hash.txt
index 3ac57898ce..d159616b63 100644
--- a/images/breeze/output-commands-hash.txt
+++ b/images/breeze/output-commands-hash.txt
@@ -55,7 +55,7 @@ setup:version:123b462a421884dc2320ffc5e54b2478
 setup:f383b9236f6141f95276136ccd9217f5
 shell:affbf6f7f469408d0af47f75c6a38f6c
 start-airflow:109728919a0dd5c5ff5640ae86ba9e90
-static-checks:6c18cfc471ad4118a11fc84d41abb747
+static-checks:06708a5e0c50a6fc6cd18c2413431168
 stop:e5aa686b4e53707ced4039d8414d5cd6
 testing:docker-compose-tests:b86c044b24138af0659a05ed6331576c
 testing:helm-tests:94a442e7f3f63b34c4831a84d165690a
diff --git a/images/breeze/output_static-checks.svg b/images/breeze/output_static-checks.svg
index 87a81e7c32..7934d746c9 100644
--- a/images/breeze/output_static-checks.svg
+++ b/images/breeze/output_static-checks.svg
@@ -1,4 +1,4 @@
-<svg class="rich-terminal" viewBox="0 0 1482 1367.6" xmlns="http://www.w3.org/2000/svg">
+<svg class="rich-terminal" viewBox="0 0 1482 1343.1999999999998" xmlns="http://www.w3.org/2000/svg">
     <!-- Generated with Rich https://www.textualize.io -->
     <style>
 
@@ -43,7 +43,7 @@
 
     <defs>
     <clipPath id="breeze-static-checks-clip-terminal">
-      <rect x="0" y="0" width="1463.0" height="1316.6" />
+      <rect x="0" y="0" width="1463.0" height="1292.1999999999998" />
     </clipPath>
     <clipPath id="breeze-static-checks-line-0">
     <rect x="0" y="1.5" width="1464" height="24.65"/>
@@ -201,12 +201,9 @@
 <clipPath id="breeze-static-checks-line-51">
     <rect x="0" y="1245.9" width="1464" height="24.65"/>
             </clipPath>
-<clipPath id="breeze-static-checks-line-52">
-    <rect x="0" y="1270.3" width="1464" height="24.65"/>
-            </clipPath>
     </defs>
 
-    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1365.6" rx="8"/><text class="breeze-static-checks-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;static-checks</text>
+    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1341.2" rx="8"/><text class="breeze-static-checks-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;static-checks</text>
             <g transform="translate(26,22)">
             <circle cx="0" cy="0" r="7" fill="#ff5f57"/>
             <circle cx="22" cy="0" r="7" fill="#febc2e"/>
@@ -245,31 +242,30 @@
 </text><text class="breeze-static-checks-r5" x="0" y="654.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-26)">│</text><text class="breeze-static-checks-r7" x="366" y="654.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-26)">check-system-tests-tocs&#160;|&#160;check-xml&#160;|&#160;codespell&#160;|&#160;compile-www-assets&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="br [...]
 </text><text class="breeze-static-checks-r5" x="0" y="678.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-27)">│</text><text class="breeze-static-checks-r7" x="366" y="678.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-27)">compile-www-assets-dev&#160;|&#160;create-missing-init-py-files-tests&#160;|&#160;debug-statements&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="678.8" textLength="12.2 [...]
 </text><text class="breeze-static-checks-r5" x="0" y="703.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-28)">│</text><text class="breeze-static-checks-r7" x="366" y="703.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-28)">detect-private-key&#160;|&#160;doctoc&#160;|&#160;end-of-file-fixer&#160;|&#160;fix-encoding-pragma&#160;|&#160;flynt&#160;|&#160;identity</text><text class="breeze-static-checks-r5" x="1451.8" y="703.2" textLength="12.2" clip-path="ur [...]
-</text><text class="breeze-static-checks-r5" x="0" y="727.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-29)">│</text><text class="breeze-static-checks-r7" x="366" y="727.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-29)">|&#160;insert-license&#160;|&#160;isort&#160;|&#160;lint-chart-schema&#160;|&#160;lint-css&#160;|&#160;lint-dockerfile&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-stati [...]
-</text><text class="breeze-static-checks-r5" x="0" y="752" textLength="12.2" clip-path="url(#breeze-static-checks-line-30)">│</text><text class="breeze-static-checks-r7" x="366" y="752" textLength="1073.6" clip-path="url(#breeze-static-checks-line-30)">lint-helm-chart&#160;|&#160;lint-json-schema&#160;|&#160;lint-markdown&#160;|&#160;lint-openapi&#160;|&#160;mixed-line-ending&#160;|&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="752" textLength="12.2" clip-path="url(#bre [...]
-</text><text class="breeze-static-checks-r5" x="0" y="776.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-31)">│</text><text class="breeze-static-checks-r7" x="366" y="776.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-31)">pretty-format-json&#160;|&#160;pydocstyle&#160;|&#160;python-no-log-warn&#160;|&#160;pyupgrade&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</ [...]
-</text><text class="breeze-static-checks-r5" x="0" y="800.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-32)">│</text><text class="breeze-static-checks-r7" x="366" y="800.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-32)">replace-bad-characters&#160;|&#160;rst-backticks&#160;|&#160;run-flake8&#160;|&#160;run-mypy&#160;|&#160;run-shellcheck&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="800.8" t [...]
-</text><text class="breeze-static-checks-r5" x="0" y="825.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-33)">│</text><text class="breeze-static-checks-r7" x="366" y="825.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-33)">static-check-autoflake&#160;|&#160;trailing-whitespace&#160;|&#160;ts-compile-and-lint-javascript&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="825.2" textLength= [...]
-</text><text class="breeze-static-checks-r5" x="0" y="849.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-34)">│</text><text class="breeze-static-checks-r7" x="366" y="849.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-34)">update-black-version&#160;|&#160;update-breeze-cmd-output&#160;|&#160;update-breeze-readme-config-hash&#160;|&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="849.6" textLength="12.2" clip-path="url(#br [...]
-</text><text class="breeze-static-checks-r5" x="0" y="874" textLength="12.2" clip-path="url(#breeze-static-checks-line-35)">│</text><text class="breeze-static-checks-r7" x="366" y="874" textLength="1073.6" clip-path="url(#breeze-static-checks-line-35)">update-er-diagram&#160;|&#160;update-extras&#160;|&#160;update-in-the-wild-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-stat [...]
-</text><text class="breeze-static-checks-r5" x="0" y="898.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-36)">│</text><text class="breeze-static-checks-r7" x="366" y="898.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-36)">update-inlined-dockerfile-scripts&#160;|&#160;update-local-yml-file&#160;|&#160;update-migration-references&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="898.4" textLength="12.2" clip-path="url(#breeze-static-checks-l [...]
-</text><text class="breeze-static-checks-r5" x="0" y="922.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-37)">│</text><text class="breeze-static-checks-r7" x="366" y="922.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-37)">|&#160;update-providers-dependencies&#160;|&#160;update-spelling-wordlist-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451 [...]
-</text><text class="breeze-static-checks-r5" x="0" y="947.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-38)">│</text><text class="breeze-static-checks-r7" x="366" y="947.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-38)">update-supported-versions&#160;|&#160;update-vendored-in-k8s-json-schema&#160;|&#160;update-version&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="947.2" textLength="12.2" cli [...]
-</text><text class="breeze-static-checks-r5" x="0" y="971.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-39)">│</text><text class="breeze-static-checks-r7" x="366" y="971.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-39)">yamllint&#160;|&#160;yesqa)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#16 [...]
-</text><text class="breeze-static-checks-r5" x="0" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">│</text><text class="breeze-static-checks-r4" x="24.4" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">-</text><text class="breeze-static-checks-r4" x="36.6" y="996" textLength="61" clip-path="url(#breeze-static-checks-line-40)">-file</text><text class="breeze-static-checks-r6" x="317.2" y="996" textLength="24.4" clip-path="url(#breeze-stati [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1020.4" textLength="48.8" clip-path="url(#breeze-static-checks-line-41)">-all</text><text class="breeze-static-checks-r4" x="85.4" y="1020.4" textLength="73.2" clip-path="url(# [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1044.8" textLength="61" clip-path="url(#breeze-static-checks-line-42)">-show</text><text class="breeze-static-checks-r4" x="97.6" y="1044.8" textLength="195.2" clip-path="url(# [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1069.2" textLength="61" clip-path="url(#breeze-static-checks-line-43)">-last</text><text class="breeze-static-checks-r4" x="97.6" y="1069.2" textLength="85.4" clip-path="url(#b [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1093.6" textLength="85.4" clip-path="url(#breeze-static-checks-line-44)">-commit</text><text class="breeze-static-checks-r4" x="122" y="1093.6" textLength="48.8" clip-path="url [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1118" textLength="12.2" clip-path="url(#breeze-static-checks-line-45)">│</text><text class="breeze-static-checks-r2" x="366" y="1118" textLength="292.8" clip-path="url(#breeze-static-checks-line-45)">Mutually&#160;exclusive&#160;with&#160;</text><text class="breeze-static-checks-r4" x="658.8" y="1118" textLength="12.2" clip-path="url(#breeze-static-checks-line-45)">-</text><text class="breeze-static-checks-r4" x="671" y="1118" textLen [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1142.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">│</text><text class="breeze-static-checks-r7" x="366" y="1142.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-46)">(TEXT)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160 [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1166.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1166.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1166.8" textLength="85.4" clip-path="url(#breeze-static-checks-line-47)">-github</text><text class="breeze-static-checks-r4" x="122" y="1166.8" textLength="134.2" clip-path="ur [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1191.2" textLength="1464" clip-path="url(#breeze-static-checks-line-48)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1191.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-48)">
-</text><text class="breeze-static-checks-r5" x="0" y="1215.6" textLength="24.4" clip-path="url(#breeze-static-checks-line-49)">╭─</text><text class="breeze-static-checks-r5" x="24.4" y="1215.6" textLength="195.2" clip-path="url(#breeze-static-checks-line-49)">&#160;Common&#160;options&#160;</text><text class="breeze-static-checks-r5" x="219.6" y="1215.6" textLength="1220" clip-path="url(#breeze-static-checks-line-49)">────────────────────────────────────────────────────────────────────── [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1240" textLength="97.6" clip-path="url(#breeze-static-checks-line-50)">-verbose</text><text class="breeze-static-checks-r6" x="158.6" y="1240" textLength="24.4" clip-path="url(#bre [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1264.4" textLength="48.8" clip-path="url(#breeze-static-checks-line-51)">-dry</text><text class="breeze-static-checks-r4" x="85.4" y="1264.4" textLength="48.8" clip-path="url(# [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1288.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1288.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1288.8" textLength="61" clip-path="url(#breeze-static-checks-line-52)">-help</text><text class="breeze-static-checks-r6" x="158.6" y="1288.8" textLength="24.4" clip-path="url(# [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1313.2" textLength="1464" clip-path="url(#breeze-static-checks-line-53)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1313.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-53)">
+</text><text class="breeze-static-checks-r5" x="0" y="727.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-29)">│</text><text class="breeze-static-checks-r7" x="366" y="727.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-29)">|&#160;insert-license&#160;|&#160;lint-chart-schema&#160;|&#160;lint-css&#160;|&#160;lint-dockerfile&#160;|&#160;lint-helm-chart&#160;|&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="727.6" textLength="12.2 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="752" textLength="12.2" clip-path="url(#breeze-static-checks-line-30)">│</text><text class="breeze-static-checks-r7" x="366" y="752" textLength="1073.6" clip-path="url(#breeze-static-checks-line-30)">lint-json-schema&#160;|&#160;lint-markdown&#160;|&#160;lint-openapi&#160;|&#160;mixed-line-ending&#160;|&#160;pretty-format-json</text><text class="breeze-static-checks-r5" x="1451.8" y="752" textLength="12.2" clip-path="url(#breeze-static [...]
+</text><text class="breeze-static-checks-r5" x="0" y="776.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-31)">│</text><text class="breeze-static-checks-r7" x="366" y="776.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-31)">|&#160;python-no-log-warn&#160;|&#160;replace-bad-characters&#160;|&#160;rst-backticks&#160;|&#160;ruff&#160;|&#160;run-mypy&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="776 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="800.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-32)">│</text><text class="breeze-static-checks-r7" x="366" y="800.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-32)">run-shellcheck&#160;|&#160;trailing-whitespace&#160;|&#160;ts-compile-and-lint-javascript&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-che [...]
+</text><text class="breeze-static-checks-r5" x="0" y="825.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-33)">│</text><text class="breeze-static-checks-r7" x="366" y="825.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-33)">update-black-version&#160;|&#160;update-breeze-cmd-output&#160;|&#160;update-breeze-readme-config-hash&#160;|&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="825.2" textLength="12.2" clip-path="url(#br [...]
+</text><text class="breeze-static-checks-r5" x="0" y="849.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-34)">│</text><text class="breeze-static-checks-r7" x="366" y="849.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-34)">update-er-diagram&#160;|&#160;update-extras&#160;|&#160;update-in-the-wild-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze- [...]
+</text><text class="breeze-static-checks-r5" x="0" y="874" textLength="12.2" clip-path="url(#breeze-static-checks-line-35)">│</text><text class="breeze-static-checks-r7" x="366" y="874" textLength="1073.6" clip-path="url(#breeze-static-checks-line-35)">update-inlined-dockerfile-scripts&#160;|&#160;update-local-yml-file&#160;|&#160;update-migration-references&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="874" textLength="12.2" clip-path="url(#breeze-static-checks-line-35 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="898.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-36)">│</text><text class="breeze-static-checks-r7" x="366" y="898.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-36)">|&#160;update-providers-dependencies&#160;|&#160;update-spelling-wordlist-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="922.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-37)">│</text><text class="breeze-static-checks-r7" x="366" y="922.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-37)">update-supported-versions&#160;|&#160;update-vendored-in-k8s-json-schema&#160;|&#160;update-version&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="922.8" textLength="12.2" cli [...]
+</text><text class="breeze-static-checks-r5" x="0" y="947.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-38)">│</text><text class="breeze-static-checks-r7" x="366" y="947.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-38)">yamllint)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#16 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="971.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-39)">│</text><text class="breeze-static-checks-r4" x="24.4" y="971.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-39)">-</text><text class="breeze-static-checks-r4" x="36.6" y="971.6" textLength="61" clip-path="url(#breeze-static-checks-line-39)">-file</text><text class="breeze-static-checks-r6" x="317.2" y="971.6" textLength="24.4" clip-path="url(#bree [...]
+</text><text class="breeze-static-checks-r5" x="0" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">│</text><text class="breeze-static-checks-r4" x="24.4" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">-</text><text class="breeze-static-checks-r4" x="36.6" y="996" textLength="48.8" clip-path="url(#breeze-static-checks-line-40)">-all</text><text class="breeze-static-checks-r4" x="85.4" y="996" textLength="73.2" clip-path="url(#breeze-stati [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1020.4" textLength="61" clip-path="url(#breeze-static-checks-line-41)">-show</text><text class="breeze-static-checks-r4" x="97.6" y="1020.4" textLength="195.2" clip-path="url(# [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1044.8" textLength="61" clip-path="url(#breeze-static-checks-line-42)">-last</text><text class="breeze-static-checks-r4" x="97.6" y="1044.8" textLength="85.4" clip-path="url(#b [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1069.2" textLength="85.4" clip-path="url(#breeze-static-checks-line-43)">-commit</text><text class="breeze-static-checks-r4" x="122" y="1069.2" textLength="48.8" clip-path="url [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">│</text><text class="breeze-static-checks-r2" x="366" y="1093.6" textLength="292.8" clip-path="url(#breeze-static-checks-line-44)">Mutually&#160;exclusive&#160;with&#160;</text><text class="breeze-static-checks-r4" x="658.8" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">-</text><text class="breeze-static-checks-r4" x="671" y="1093.6" [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1118" textLength="12.2" clip-path="url(#breeze-static-checks-line-45)">│</text><text class="breeze-static-checks-r7" x="366" y="1118" textLength="1073.6" clip-path="url(#breeze-static-checks-line-45)">(TEXT)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#1 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1142.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1142.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1142.4" textLength="85.4" clip-path="url(#breeze-static-checks-line-46)">-github</text><text class="breeze-static-checks-r4" x="122" y="1142.4" textLength="134.2" clip-path="ur [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1166.8" textLength="1464" clip-path="url(#breeze-static-checks-line-47)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1166.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">
+</text><text class="breeze-static-checks-r5" x="0" y="1191.2" textLength="24.4" clip-path="url(#breeze-static-checks-line-48)">╭─</text><text class="breeze-static-checks-r5" x="24.4" y="1191.2" textLength="195.2" clip-path="url(#breeze-static-checks-line-48)">&#160;Common&#160;options&#160;</text><text class="breeze-static-checks-r5" x="219.6" y="1191.2" textLength="1220" clip-path="url(#breeze-static-checks-line-48)">────────────────────────────────────────────────────────────────────── [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1215.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-49)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1215.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-49)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1215.6" textLength="97.6" clip-path="url(#breeze-static-checks-line-49)">-verbose</text><text class="breeze-static-checks-r6" x="158.6" y="1215.6" textLength="24.4" clip-path=" [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1240" textLength="48.8" clip-path="url(#breeze-static-checks-line-50)">-dry</text><text class="breeze-static-checks-r4" x="85.4" y="1240" textLength="48.8" clip-path="url(#breeze-s [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1264.4" textLength="61" clip-path="url(#breeze-static-checks-line-51)">-help</text><text class="breeze-static-checks-r6" x="158.6" y="1264.4" textLength="24.4" clip-path="url(# [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1288.8" textLength="1464" clip-path="url(#breeze-static-checks-line-52)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1288.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">
 </text>
     </g>
     </g>
diff --git a/provider_packages/.flake8 b/provider_packages/.flake8
deleted file mode 120000
index cb0568d647..0000000000
--- a/provider_packages/.flake8
+++ /dev/null
@@ -1 +0,0 @@
-../.flake8
\ No newline at end of file
diff --git a/pyproject.toml b/pyproject.toml
index 0f951ba318..722d50de53 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -25,16 +25,102 @@ target-version = ['py37', 'py38', 'py39', 'py310']
 [build-system]
 requires = ['setuptools==67.2.0']
 build-backend = "setuptools.build_meta"
-[tool.isort]
-add_imports = ["from __future__ import annotations"]
-append_only = true
-line_length = 110
-combine_as_imports = true
-default_section = "THIRDPARTY"
-known_first_party = ["airflow", "airflow_breeze", "docker_tests", "docs", "kubernetes_tests", "tests"]
+
+[tool.ruff]
+typing-modules = ["airflow.typing_compat"]
+line-length = 110
+extend-exclude = [
+    ".eggs",
+    "airflow/_vendor/*",
+
+    # The files generated by stubgen aren't 100% valid syntax it turns out, and we don't ship them, so we can
+    # ignore them in ruff
+    "airflow/providers/common/sql/*/*.pyi"
+]
+
+# TODO: Bump to Python 3.8 when support for Python 3.7 is dropped in Airflow.
+target-version = "py37"
+
+extend-select = [
+    "I", # Missing required import (auto-fixable)
+    "UP", # Pyupgrade
+    "RUF100", # Unused noqa (auto-fixable)
+
+    # We ignore more pydocstyle than we enable, so be more selective at what we enable
+    "D101",
+    "D106",
+    "D2",
+    "D3",
+    # "D401", # Not enabled by ruff, but we don't want it
+    "D402",
+    "D403",
+    "D412",
+    "D419"
+]
+extend-ignore = [
+    "D203",
+    "D205",
+    "D212",
+    "D213",
+    "D214",
+    "D215",
+    "E731",
+]
+
+[tool.ruff.isort]
+known-first-party = ["airflow", "airflow_breeze", "docker_tests", "docs", "kubernetes_tests", "tests"]
+required-imports = ["from __future__ import annotations"]
+combine-as-imports = true
+
+# TODO: for now, https://github.com/charliermarsh/ruff/issues/1817
+known-third-party = [
+    "asana",
+    "atlassian",
+    "celery",
+    "cloudant",
+    "databricks",
+    "datadog",
+    "docker",
+    "elasticsearch",
+    "github",
+    "google",
+    "grpc",
+    "jenkins",
+    "mysql",
+    "neo4j",
+    "papermill",
+    "redis",
+    "sendgrid",
+    "snowflake",
+    "telegram",
+    "trino",
+]
+
+[tool.ruff.per-file-ignores]
+"airflow/models/__init__.py" = ["F401"]
+"airflow/models/sqla_models.py" = ["F401"]
+
+
 # The test_python.py is needed because adding __future__.annotations breaks runtime checks that are
 # needed for the test to work
-skip = ["build", ".tox", "venv", "tests/decorators/test_python.py"]
-lines_between_types = 0
-skip_glob = ["*.pyi"]
-profile = "black"
+"tests/decorators/test_python.py" = ["I002"]
+
+# Ignore pydoc style from these
+"*.pyi" = ["D"]
+"tests/*" = ["D"]
+"scripts/*" = ["D"]
+"dev/*" = ["D"]
+"docs/*" = ["D"]
+"provider_packages/*" = ["D"]
+"docker_tests/*" = ["D"]
+"kubernetes_tests/*" = ["D"]
+"*/example_dags/*" = ["D"]
+"chart/*" = ["D"]
+
+# All of the modules which have an extra license header (i.e. that we copy from another project) need to
+# ignore E402 -- module level import not at top level
+"airflow/api/auth/backend/kerberos_auth.py" = ["E402"]
+"airflow/security/kerberos.py" = ["E402"]
+"airflow/security/utils.py" = ["E402"]
+"tests/providers/elasticsearch/log/elasticmock/__init__.py" = ["E402"]
+"tests/providers/elasticsearch/log/elasticmock/utilities/__init__.py" = ["E402"]
diff --git a/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py b/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py
index 2cee66a4b2..d6e32a0937 100755
--- a/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py
+++ b/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py
@@ -119,10 +119,10 @@ def black_mode():
 
     return Mode(
         target_versions=target_versions,
-        line_length=bool(config.get("line_length", Mode.line_length)),
-        is_pyi=bool(config.get("is_pyi", Mode.is_pyi)),
-        string_normalization=not bool(config.get("skip_string_normalization", not Mode.string_normalization)),
-        preview=bool(config.get("preview", Mode.preview)),
+        line_length=config.get("line_length", Mode.line_length),
+        is_pyi=config.get("is_pyi", False),
+        string_normalization=not config.get("skip_string_normalization", False),
+        preview=config.get("preview", False),
     )
 
 
@@ -170,7 +170,7 @@ def main():
     parser = argparse.ArgumentParser()
     parser.add_argument("--max-length", help="Max length for hook names")
     args = parser.parse_args()
-    max_length = int(args.max_length) or 70
+    max_length = int(args.max_length or 70)
     content = yaml.safe_load(PRE_COMMIT_YAML_FILE.read_text())
     errors, hooks, image_hooks = get_errors_and_hooks(content, max_length)
     if errors:
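The two changes above are behavioural fixes rather than style ones. A quick illustration (values hypothetical) of what the old expressions did:

    # old: wrapping the configured width in bool() collapses it to True/False,
    # so black effectively received a line length of 1 instead of 110
    bool({"line_length": 110}.get("line_length", 88))   # -> True
    # new: the configured value is passed through unchanged
    {"line_length": 110}.get("line_length", 88)         # -> 110

    # old: "int(args.max_length) or 70" raises TypeError when --max-length is
    # omitted (int(None) is not valid); the new form falls back to 70 first
    int(None or 70)                                      # -> 70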
diff --git a/scripts/ci/pre_commit/pre_commit_flake8.py b/scripts/ci/pre_commit/pre_commit_flake8.py
deleted file mode 100755
index 1a09c9f474..0000000000
--- a/scripts/ci/pre_commit/pre_commit_flake8.py
+++ /dev/null
@@ -1,72 +0,0 @@
-#!/usr/bin/env python
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-from __future__ import annotations
-
-import os
-import sys
-from pathlib import Path
-
-if __name__ not in ("__main__", "__mp_main__"):
-    raise SystemExit(
-        "This file is intended to be executed as an executable program. You cannot use it as a module."
-        f"To run this script, run the ./{__file__} command"
-    )
-
-AIRFLOW_SOURCES = Path(__file__).parents[3].resolve()
-GITHUB_REPOSITORY = os.environ.get("GITHUB_REPOSITORY", "apache/airflow")
-os.environ["SKIP_GROUP_OUTPUT"] = "true"
-
-if __name__ == "__main__":
-    sys.path.insert(0, str(Path(__file__).parent.resolve()))  # make sure common_precommit_utils is imported
-    from common_precommit_utils import filter_out_providers_on_non_main_branch
-
-    sys.path.insert(0, str(AIRFLOW_SOURCES / "dev" / "breeze" / "src"))
-    from airflow_breeze.global_constants import MOUNT_SELECTED
-    from airflow_breeze.utils.console import get_console
-    from airflow_breeze.utils.docker_command_utils import get_extra_docker_flags
-    from airflow_breeze.utils.run_utils import get_ci_image_for_pre_commits, run_command
-
-    files_to_test = filter_out_providers_on_non_main_branch(sys.argv[1:])
-    if not files_to_test:
-        print("No files to tests. Quitting")
-        sys.exit(0)
-    airflow_image = get_ci_image_for_pre_commits()
-    cmd_result = run_command(
-        [
-            "docker",
-            "run",
-            "-t",
-            *get_extra_docker_flags(MOUNT_SELECTED),
-            "-e",
-            "SKIP_ENVIRONMENT_INITIALIZATION=true",
-            "-e",
-            "BACKEND=sqlite",
-            "--pull",
-            "never",
-            airflow_image,
-            "/opt/airflow/scripts/in_container/run_flake8.sh",
-            *files_to_test,
-        ],
-        check=False,
-    )
-    if cmd_result.returncode != 0:
-        get_console().print(
-            "[warning]If you see strange stacktraces above, "
-            "run `breeze ci-image build --python 3.7` and try again."
-        )
-    sys.exit(cmd_result.returncode)
diff --git a/scripts/in_container/run_flake8.sh b/scripts/in_container/run_flake8.sh
deleted file mode 100755
index f6c7baa3e1..0000000000
--- a/scripts/in_container/run_flake8.sh
+++ /dev/null
@@ -1,20 +0,0 @@
-#!/usr/bin/env bash
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-# shellcheck source=scripts/in_container/_in_container_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/_in_container_script_init.sh"
-flake8 "$@"
diff --git a/setup.py b/setup.py
index 622d630c05..d9279da7b0 100644
--- a/setup.py
+++ b/setup.py
@@ -366,15 +366,8 @@ devel_only = [
     "click>=8.0",
     "coverage",
     "filelock",
-    "flake8>=3.9.0",
-    "flake8-colors",
-    "flake8-implicit-str-concat",
     "gitpython",
     "ipdb",
-    # make sure that we are using stable sorting order from 5.* version (some changes were introduced
-    # in 5.11.3. Black is not compatible yet, so we need to limit isort
-    # we can remove the limit when black and isort agree on the order
-    "isort==5.11.2",
     "jira",
     "jsondiff",
     "mongomock",
@@ -399,6 +392,7 @@ devel_only = [
     "pytest-httpx",
     "requests_mock",
     "rich-click>=1.5",
+    "ruff>=0.0.219",
     "semver",
     "time-machine",
     "towncrier",
diff --git a/tests/api_connexion/endpoints/test_dag_endpoint.py b/tests/api_connexion/endpoints/test_dag_endpoint.py
index 3aa60abd33..a6ace057f4 100644
--- a/tests/api_connexion/endpoints/test_dag_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_endpoint.py
@@ -1190,21 +1190,6 @@ class TestPatchDags(TestDagEndpoint):
                     "timetable_description": None,
                     "has_import_errors": False,
                     "pickle_id": None,
-                    "next_dagrun": None,
-                    "has_task_concurrency_limits": True,
-                    "next_dagrun_data_interval_start": None,
-                    "next_dagrun_data_interval_end": None,
-                    "max_active_runs": 16,
-                    "next_dagrun_create_after": None,
-                    "last_expired": None,
-                    "max_active_tasks": 16,
-                    "last_pickled": None,
-                    "default_view": None,
-                    "last_parsed_time": None,
-                    "scheduler_lock": None,
-                    "timetable_description": None,
-                    "has_import_errors": False,
-                    "pickle_id": None,
                 },
                 {
                     "dag_id": "TEST_DAG_DELETED_1",
diff --git a/tests/providers/google/suite/hooks/test_calendar.py b/tests/providers/google/suite/hooks/test_calendar.py
index f472b3da3f..6f51351715 100644
--- a/tests/providers/google/suite/hooks/test_calendar.py
+++ b/tests/providers/google/suite/hooks/test_calendar.py
@@ -15,11 +15,10 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-from __future__ import annotations
-
 """
 Unit Tests for the Google Calendar Hook
 """
+from __future__ import annotations
 
 import unittest
 from unittest import mock
diff --git a/tests/system/providers/cncf/kubernetes/example_spark_kubernetes.py b/tests/system/providers/cncf/kubernetes/example_spark_kubernetes.py
index ba2e21ee23..03fd0a34e5 100644
--- a/tests/system/providers/cncf/kubernetes/example_spark_kubernetes.py
+++ b/tests/system/providers/cncf/kubernetes/example_spark_kubernetes.py
@@ -15,8 +15,6 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-from __future__ import annotations
-
 """
 This is an example DAG which uses SparkKubernetesOperator and SparkKubernetesSensor.
 In this example, we create two tasks which execute sequentially.
@@ -26,6 +24,7 @@ and the second task is to check the final state of the sparkApplication that sub
 Spark-on-k8s operator is required to be already installed on Kubernetes
 https://github.com/GoogleCloudPlatform/spark-on-k8s-operator
 """
+from __future__ import annotations
 
 import os
 from datetime import datetime, timedelta
diff --git a/tests/system/providers/google/cloud/bigtable/example_bigtable.py b/tests/system/providers/google/cloud/bigtable/example_bigtable.py
index 96a3c1c450..b105634118 100644
--- a/tests/system/providers/google/cloud/bigtable/example_bigtable.py
+++ b/tests/system/providers/google/cloud/bigtable/example_bigtable.py
@@ -29,7 +29,7 @@ This DAG relies on the following environment variables:
 * CBT_INSTANCE_ID - desired ID of a Cloud Bigtable instance
 * CBT_INSTANCE_DISPLAY_NAME - desired human-readable display name of the Instance
 * CBT_INSTANCE_TYPE - type of the Instance, e.g. 1 for DEVELOPMENT
-    See https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance # noqa E501
+    See https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance
 * CBT_INSTANCE_LABELS - labels to add for the Instance
 * CBT_CLUSTER_ID - desired ID of the main Cluster created for the Instance
 * CBT_CLUSTER_ZONE - zone in which main Cluster will be created. e.g. europe-west1-b
@@ -37,10 +37,10 @@ This DAG relies on the following environment variables:
 * CBT_CLUSTER_NODES - initial amount of nodes of the Cluster
 * CBT_CLUSTER_NODES_UPDATED - amount of nodes for BigtableClusterUpdateOperator
 * CBT_CLUSTER_STORAGE_TYPE - storage for the Cluster, e.g. 1 for SSD
-    See https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance.cluster # noqa E501
+    See https://googleapis.github.io/google-cloud-python/latest/bigtable/instance.html#google.cloud.bigtable.instance.Instance.cluster
 * CBT_TABLE_ID - desired ID of the Table
 * CBT_POKE_INTERVAL - number of seconds between every attempt of Sensor check
-"""
+"""  # noqa: E501
 from __future__ import annotations
 
 import os
diff --git a/tests/test_utils/get_all_tests.py b/tests/test_utils/get_all_tests.py
index ff2b6e4975..1ed04ed64d 100644
--- a/tests/test_utils/get_all_tests.py
+++ b/tests/test_utils/get_all_tests.py
@@ -16,11 +16,11 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-from __future__ import annotations
-
 """
 Gets all tests cases from xunit file.
 """
+from __future__ import annotations
+
 import sys
 from xml.etree import ElementTree
 


[airflow] 02/12: Replace freezegun with time-machine (#28193)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2ecbfd0118ac1db9bf87ed9a44853b11d6b972f0
Author: Ash Berlin-Taylor <as...@apache.org>
AuthorDate: Mon Dec 12 17:38:03 2022 +0000

    Replace freezegun with time-machine (#28193)
    
    The primary driver for this was a niggle that the durations output for
    one test was reporting over 52 years:
    
    > 1670340356.40s call     tests/jobs/test_base_job.py::TestBaseJob::test_heartbeat
    
    It turns out this was caused by freezegun, but time_machine fixes this.
    It also might be a bit faster, but that isn't a noticeable difference for
    us. (No runtime difference for the changed files, but it does make
    collection quicker: from 10s to 8s)
    
    (cherry picked from commit 4d0fd8ef6adc35f683c7561f05688a65fd7451f4)
---
 setup.py                                           |   3 +-
 tests/api/client/test_local_client.py              |   4 +-
 .../endpoints/test_dag_run_endpoint.py             |   4 +-
 tests/api_connexion/schemas/test_dataset_schema.py |   4 +-
 tests/conftest.py                                  |  20 ++--
 tests/core/test_sentry.py                          |   4 +-
 tests/dag_processing/test_manager.py               |   6 +-
 tests/executors/test_celery_executor.py            |   6 +-
 tests/jobs/test_scheduler_job.py                   |   4 +-
 tests/models/test_dag.py                           |   6 +-
 tests/models/test_dagbag.py                        |  14 +--
 tests/models/test_taskinstance.py                  |  18 ++--
 tests/models/test_timestamp.py                     |   6 +-
 tests/operators/test_datetime.py                   |  16 +--
 tests/operators/test_latest_only_operator.py       |   4 +-
 tests/operators/test_weekday.py                    |  10 +-
 tests/providers/amazon/aws/hooks/test_eks.py       |  10 +-
 .../amazon/aws/sensors/test_s3_keys_unchanged.py   |  24 ++---
 .../amazon/aws/utils/test_eks_get_token.py         |   4 +-
 .../elasticsearch/log/test_es_task_handler.py      |  15 ++-
 .../test_cloud_storage_transfer_service.py         |   4 +-
 tests/sensors/test_base.py                         | 107 ++++++++++-----------
 tests/sensors/test_time_sensor.py                  |  11 +--
 tests/ti_deps/deps/test_not_in_retry_period_dep.py |   6 +-
 tests/ti_deps/deps/test_runnable_exec_date_dep.py  |   6 +-
 tests/timetables/test_interval_timetable.py        |   8 +-
 tests/timetables/test_trigger_timetable.py         |   6 +-
 tests/utils/log/test_file_processor_handler.py     |   8 +-
 tests/utils/test_serve_logs.py                     |  14 +--
 tests/www/test_security.py                         |  10 +-
 tests/www/views/test_views_grid.py                 |  23 +++--
 tests/www/views/test_views_tasks.py                |  17 ++--
 32 files changed, 194 insertions(+), 208 deletions(-)
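
Before the per-file changes, a minimal sketch of the substitution applied throughout this commit (the snippet is illustrative; the exact call sites are in the diffs that follow):

    # freezegun style (removed):
    #     from freezegun import freeze_time
    #     with freeze_time("2020-01-01"):
    #         ...
    # time-machine style (added); tick=False pins the clock completely,
    # while omitting it lets time keep advancing from the travelled point:
    import time_machine

    with time_machine.travel("2020-01-01", tick=False):
        ...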

diff --git a/setup.py b/setup.py
index 5ed4bb756f..a132eb861c 100644
--- a/setup.py
+++ b/setup.py
@@ -339,7 +339,6 @@ mypy_dependencies = [
     "types-croniter",
     "types-Deprecated",
     "types-docutils",
-    "types-freezegun",
     "types-paramiko",
     "types-protobuf",
     "types-python-dateutil",
@@ -371,7 +370,6 @@ devel_only = [
     "flake8-colors",
     "flake8-implicit-str-concat",
     "flaky",
-    "freezegun",
     "gitpython",
     "ipdb",
     # make sure that we are using stable sorting order from 5.* version (some changes were introduced
@@ -403,6 +401,7 @@ devel_only = [
     "requests_mock",
     "rich-click>=1.5",
     "semver",
+    "time-machine",
     "towncrier",
     "twine",
     "wheel",
diff --git a/tests/api/client/test_local_client.py b/tests/api/client/test_local_client.py
index 70188ba5e1..d079f6c510 100644
--- a/tests/api/client/test_local_client.py
+++ b/tests/api/client/test_local_client.py
@@ -24,7 +24,7 @@ from unittest.mock import patch
 
 import pendulum
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow.api.client.local_client import Client
 from airflow.example_dags import example_bash_operator
@@ -72,7 +72,7 @@ class TestLocalClient:
             run_after=pendulum.instance(EXECDATE_NOFRACTIONS)
         )
 
-        with freeze_time(EXECDATE):
+        with time_machine.travel(EXECDATE, tick=False):
             # no execution date, execution date should be set automatically
 
             self.client.trigger_dag(dag_id=test_dag_id)
diff --git a/tests/api_connexion/endpoints/test_dag_run_endpoint.py b/tests/api_connexion/endpoints/test_dag_run_endpoint.py
index 93c9c6a1d0..a80e8b9f4a 100644
--- a/tests/api_connexion/endpoints/test_dag_run_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_run_endpoint.py
@@ -20,7 +20,7 @@ from datetime import timedelta
 from unittest import mock
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 from parameterized import parameterized
 
 from airflow.api_connexion.exceptions import EXCEPTIONS_LINK_MAP
@@ -1350,7 +1350,7 @@ class TestPatchDagRunState(TestDagRunEndpoint):
         }
 
     @pytest.mark.parametrize("invalid_state", ["running"])
-    @freeze_time(TestDagRunEndpoint.default_time)
+    @time_machine.travel(TestDagRunEndpoint.default_time)
     def test_should_response_400_for_non_existing_dag_run_state(self, invalid_state, dag_maker):
         dag_id = "TEST_DAG_ID"
         dag_run_id = "TEST_DAG_RUN_ID"
diff --git a/tests/api_connexion/schemas/test_dataset_schema.py b/tests/api_connexion/schemas/test_dataset_schema.py
index 5147768aed..85deb129f3 100644
--- a/tests/api_connexion/schemas/test_dataset_schema.py
+++ b/tests/api_connexion/schemas/test_dataset_schema.py
@@ -16,7 +16,7 @@
 # under the License.
 from __future__ import annotations
 
-from freezegun import freeze_time
+import time_machine
 
 from airflow.api_connexion.schemas.dataset_schema import (
     DatasetCollection,
@@ -37,7 +37,7 @@ class TestDatasetSchemaBase:
         clear_db_dags()
         clear_db_datasets()
         self.timestamp = "2022-06-10T12:02:44+00:00"
-        self.freezer = freeze_time(self.timestamp)
+        self.freezer = time_machine.travel(self.timestamp, tick=False)
         self.freezer.start()
 
     def teardown_method(self) -> None:
diff --git a/tests/conftest.py b/tests/conftest.py
index bdaec7da0f..74d412f849 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -24,8 +24,8 @@ from contextlib import ExitStack, suppress
 from datetime import datetime, timedelta
 from typing import TYPE_CHECKING
 
-import freezegun
 import pytest
+import time_machine
 
 # We should set these before loading _any_ of the rest of airflow so that the
 # unit test mode config is set as early as possible.
@@ -400,7 +400,7 @@ def pytest_runtest_setup(item):
 @pytest.fixture
 def frozen_sleep(monkeypatch):
     """
-    Use freezegun to "stub" sleep, so that it takes no time, but that
+    Use time-machine to "stub" sleep, so that it takes no time, but that
     ``datetime.now()`` appears to move forwards
 
     If your module under test does ``import time`` and then ``time.sleep``::
@@ -416,21 +416,21 @@ def frozen_sleep(monkeypatch):
             monkeypatch.setattr('my_mod.sleep', frozen_sleep)
             my_mod.fn_under_test()
     """
-    freezegun_control = None
+    traveller = None
 
     def fake_sleep(seconds):
-        nonlocal freezegun_control
+        nonlocal traveller
         utcnow = datetime.utcnow()
-        if freezegun_control is not None:
-            freezegun_control.stop()
-        freezegun_control = freezegun.freeze_time(utcnow + timedelta(seconds=seconds))
-        freezegun_control.start()
+        if traveller is not None:
+            traveller.stop()
+        traveller = time_machine.travel(utcnow + timedelta(seconds=seconds))
+        traveller.start()
 
     monkeypatch.setattr("time.sleep", fake_sleep)
     yield fake_sleep
 
-    if freezegun_control is not None:
-        freezegun_control.stop()
+    if traveller is not None:
+        traveller.stop()
 
 
 @pytest.fixture(scope="session")
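
A minimal usage sketch for the migrated frozen_sleep fixture above (the test name is illustrative): time.sleep() returns immediately while datetime.utcnow() appears to jump forward by the requested amount.

    import time
    from datetime import datetime, timedelta

    def test_sleep_is_instant(frozen_sleep):
        before = datetime.utcnow()
        time.sleep(60)  # patched by the fixture: no real delay
        assert datetime.utcnow() - before >= timedelta(seconds=60)
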
diff --git a/tests/core/test_sentry.py b/tests/core/test_sentry.py
index 2b29a1c704..0e8607f2ad 100644
--- a/tests/core/test_sentry.py
+++ b/tests/core/test_sentry.py
@@ -22,7 +22,7 @@ import importlib
 from unittest import mock
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 from sentry_sdk import configure_scope
 
 from airflow.operators.python import PythonOperator
@@ -129,7 +129,7 @@ class TestSentryHook:
             for key, value in scope._tags.items():
                 assert TEST_SCOPE[key] == value
 
-    @freeze_time(CRUMB_DATE.isoformat())
+    @time_machine.travel(CRUMB_DATE)
     def test_add_breadcrumbs(self, sentry, task_instance):
         """
         Test adding breadcrumbs.
diff --git a/tests/dag_processing/test_manager.py b/tests/dag_processing/test_manager.py
index 1cbfb7275d..862dfdb2c8 100644
--- a/tests/dag_processing/test_manager.py
+++ b/tests/dag_processing/test_manager.py
@@ -34,7 +34,7 @@ from unittest import mock
 from unittest.mock import MagicMock, PropertyMock
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 from sqlalchemy import func
 
 from airflow.callbacks.callback_requests import CallbackRequest, DagCallbackRequest, SlaCallbackRequest
@@ -470,7 +470,7 @@ class TestDagFileProcessorManager:
         manager._file_stats = {
             "file_1.py": DagFileStat(1, 0, last_finish_time, timedelta(seconds=1.0), 1),
         }
-        with freeze_time(freezed_base_time):
+        with time_machine.travel(freezed_base_time):
             manager.set_file_paths(dag_files)
             assert manager._file_path_queue == collections.deque()
             # File Path Queue will be empty as the "modified time" < "last finish time"
@@ -481,7 +481,7 @@ class TestDagFileProcessorManager:
         # than the last_parse_time but still less than now - min_file_process_interval
         file_1_new_mtime = freezed_base_time - timedelta(seconds=5)
         file_1_new_mtime_ts = file_1_new_mtime.timestamp()
-        with freeze_time(freezed_base_time):
+        with time_machine.travel(freezed_base_time):
             manager.set_file_paths(dag_files)
             assert manager._file_path_queue == collections.deque()
             # File Path Queue will be empty as the "modified time" < "last finish time"
diff --git a/tests/executors/test_celery_executor.py b/tests/executors/test_celery_executor.py
index cbbe64c564..2a52776dec 100644
--- a/tests/executors/test_celery_executor.py
+++ b/tests/executors/test_celery_executor.py
@@ -27,9 +27,9 @@ from unittest import mock
 # leave this it is used by the test worker
 import celery.contrib.testing.tasks  # noqa: F401
 import pytest
+import time_machine
 from celery import Celery
 from celery.result import AsyncResult
-from freezegun import freeze_time
 from kombu.asynchronous import set_event_loop
 from parameterized import parameterized
 
@@ -162,7 +162,7 @@ class TestCeleryExecutor:
         assert executor.try_adopt_task_instances(tis) == tis
 
     @pytest.mark.backend("mysql", "postgres")
-    @freeze_time("2020-01-01")
+    @time_machine.travel("2020-01-01", tick=False)
     def test_try_adopt_task_instances(self):
         start_date = timezone.utcnow() - timedelta(days=2)
 
@@ -270,7 +270,7 @@ class TestCeleryExecutor:
         assert ti.external_executor_id is None
 
     @pytest.mark.backend("mysql", "postgres")
-    @freeze_time("2020-01-01")
+    @time_machine.travel("2020-01-01", tick=False)
     def test_pending_tasks_timeout_with_appropriate_config_setting(self):
         start_date = timezone.utcnow() - timedelta(days=2)
 
diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index b661293ea2..9d1b4dd452 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -30,7 +30,7 @@ from unittest.mock import MagicMock, patch
 
 import psutil
 import pytest
-from freezegun import freeze_time
+import time_machine
 from sqlalchemy import func
 
 import airflow.example_dags
@@ -3290,7 +3290,7 @@ class TestSchedulerJob:
 
         assert dag3.get_last_dagrun().creating_job_id == self.scheduler_job.id
 
-    @freeze_time(DEFAULT_DATE + datetime.timedelta(days=1, seconds=9))
+    @time_machine.travel(DEFAULT_DATE + datetime.timedelta(days=1, seconds=9), tick=False)
     @mock.patch("airflow.jobs.scheduler_job.Stats.timing")
     def test_start_dagruns(self, stats_timing, dag_maker):
         """
diff --git a/tests/models/test_dag.py b/tests/models/test_dag.py
index 855ecc4b6f..6fc846f20a 100644
--- a/tests/models/test_dag.py
+++ b/tests/models/test_dag.py
@@ -34,8 +34,8 @@ from unittest.mock import patch
 import jinja2
 import pendulum
 import pytest
+import time_machine
 from dateutil.relativedelta import relativedelta
-from freezegun import freeze_time
 from sqlalchemy import inspect
 
 import airflow
@@ -2122,7 +2122,7 @@ my_postgres_conn:
         # The DR should be scheduled in the last 2 hours, not 6 hours ago
         assert next_date == six_hours_ago_to_the_hour
 
-    @freeze_time(timezone.datetime(2020, 1, 5))
+    @time_machine.travel(timezone.datetime(2020, 1, 5), tick=False)
     def test_next_dagrun_info_timedelta_schedule_and_catchup_false(self):
         """
         Test that the dag file processor does not create multiple dagruns
@@ -2142,7 +2142,7 @@ my_postgres_conn:
         next_info = dag.next_dagrun_info(next_info.data_interval)
         assert next_info and next_info.logical_date == timezone.datetime(2020, 1, 5)
 
-    @freeze_time(timezone.datetime(2020, 5, 4))
+    @time_machine.travel(timezone.datetime(2020, 5, 4))
     def test_next_dagrun_info_timedelta_schedule_and_catchup_true(self):
         """
         Test that the dag file processor creates multiple dagruns
diff --git a/tests/models/test_dagbag.py b/tests/models/test_dagbag.py
index 102ad0b518..dbeaba5dd1 100644
--- a/tests/models/test_dagbag.py
+++ b/tests/models/test_dagbag.py
@@ -32,7 +32,7 @@ from unittest import mock
 from unittest.mock import patch
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 from sqlalchemy import func
 from sqlalchemy.exc import OperationalError
 
@@ -879,7 +879,7 @@ class TestDagBag:
         """
         db_clean_up()
         session = settings.Session()
-        with freeze_time(tz.datetime(2020, 1, 5, 0, 0, 0)) as frozen_time:
+        with time_machine.travel(tz.datetime(2020, 1, 5, 0, 0, 0), tick=False) as frozen_time:
             dagbag = DagBag(
                 dag_folder=os.path.join(TEST_DAGS_FOLDER, "test_example_bash_operator.py"),
                 include_examples=False,
@@ -889,7 +889,7 @@ class TestDagBag:
 
             def _sync_to_db():
                 mock_sync_perm_for_dag.reset_mock()
-                frozen_time.tick(20)
+                frozen_time.shift(20)
                 dagbag.sync_to_db(session=session)
 
             dag = dagbag.dags["test_example_bash_operator"]
@@ -950,7 +950,7 @@ class TestDagBag:
         Serialized DAG table after 'min_serialized_dag_fetch_interval' seconds are passed.
         """
 
-        with freeze_time(tz.datetime(2020, 1, 5, 0, 0, 0)):
+        with time_machine.travel((tz.datetime(2020, 1, 5, 0, 0, 0)), tick=False):
             example_bash_op_dag = DagBag(include_examples=True).dags.get("example_bash_operator")
             SerializedDagModel.write_dag(dag=example_bash_op_dag)
 
@@ -962,18 +962,18 @@ class TestDagBag:
 
         # Check that if min_serialized_dag_fetch_interval has not passed we do not fetch the DAG
         # from DB
-        with freeze_time(tz.datetime(2020, 1, 5, 0, 0, 4)):
+        with time_machine.travel((tz.datetime(2020, 1, 5, 0, 0, 4)), tick=False):
             with assert_queries_count(0):
                 assert dag_bag.get_dag("example_bash_operator").tags == ["example", "example2"]
 
         # Make a change in the DAG and write Serialized DAG to the DB
-        with freeze_time(tz.datetime(2020, 1, 5, 0, 0, 6)):
+        with time_machine.travel((tz.datetime(2020, 1, 5, 0, 0, 6)), tick=False):
             example_bash_op_dag.tags += ["new_tag"]
             SerializedDagModel.write_dag(dag=example_bash_op_dag)
 
         # Since min_serialized_dag_fetch_interval is passed verify that calling 'dag_bag.get_dag'
         # fetches the Serialized DAG from DB
-        with freeze_time(tz.datetime(2020, 1, 5, 0, 0, 8)):
+        with time_machine.travel((tz.datetime(2020, 1, 5, 0, 0, 8)), tick=False):
             with assert_queries_count(2):
                 updated_ser_dag_1 = dag_bag.get_dag("example_bash_operator")
                 updated_ser_dag_1_update_time = dag_bag.dags_last_fetched["example_bash_operator"]
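
When travel() is used as a context manager, the coordinates object it yields takes over from freezegun's frozen-time handle, with shift() replacing tick(), as in the dagbag test above. A small sketch:

    from datetime import datetime, timedelta, timezone

    import time_machine

    with time_machine.travel(datetime(2020, 1, 5, tzinfo=timezone.utc), tick=False) as frozen_time:
        before = datetime.now(timezone.utc)
        frozen_time.shift(20)  # formerly frozen_time.tick(20)
        assert datetime.now(timezone.utc) - before == timedelta(seconds=20)
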
diff --git a/tests/models/test_taskinstance.py b/tests/models/test_taskinstance.py
index 80b6d6d302..9bcfe5f677 100644
--- a/tests/models/test_taskinstance.py
+++ b/tests/models/test_taskinstance.py
@@ -33,7 +33,7 @@ from uuid import uuid4
 
 import pendulum
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow import models, settings
 from airflow.decorators import task, task_group
@@ -564,11 +564,11 @@ class TestTaskInstance:
         assert ti.next_kwargs is None
         assert ti.state == state
 
-    @freeze_time("2021-09-19 04:56:35", as_kwarg="frozen_time")
-    def test_retry_delay(self, dag_maker, frozen_time=None):
+    def test_retry_delay(self, dag_maker, time_machine):
         """
         Test that retry delays are respected
         """
+        time_machine.move_to("2021-09-19 04:56:35", tick=False)
         with dag_maker(dag_id="test_retry_handling"):
             task = BashOperator(
                 task_id="test_retry_handling_op",
@@ -593,12 +593,12 @@ class TestTaskInstance:
         assert ti.try_number == 2
 
         # second run -- still up for retry because retry_delay hasn't expired
-        frozen_time.tick(delta=datetime.timedelta(seconds=3))
+        time_machine.coordinates.shift(3)
         run_with_error(ti)
         assert ti.state == State.UP_FOR_RETRY
 
         # third run -- failed
-        frozen_time.tick(delta=datetime.datetime.resolution)
+        time_machine.coordinates.shift(datetime.datetime.resolution)
         run_with_error(ti)
         assert ti.state == State.FAILED
 
@@ -756,7 +756,7 @@ class TestTaskInstance:
             expected_try_number,
             expected_task_reschedule_count,
         ):
-            with freeze_time(run_date):
+            with time_machine.travel(run_date, tick=False):
                 try:
                     ti.run()
                 except AirflowException:
@@ -856,7 +856,7 @@ class TestTaskInstance:
             expected_task_reschedule_count,
         ):
             ti.refresh_from_task(task)
-            with freeze_time(run_date):
+            with time_machine.travel(run_date, tick=False):
                 try:
                     ti.run()
                 except AirflowException:
@@ -955,7 +955,7 @@ class TestTaskInstance:
             expected_task_reschedule_count,
         ):
             ti.refresh_from_task(task)
-            with freeze_time(run_date):
+            with time_machine.travel(run_date, tick=False):
                 try:
                     ti.run()
                 except AirflowException:
@@ -1023,7 +1023,7 @@ class TestTaskInstance:
             expected_try_number,
             expected_task_reschedule_count,
         ):
-            with freeze_time(run_date):
+            with time_machine.travel(run_date, tick=False):
                 try:
                     ti.run()
                 except AirflowException:
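
Several tests above drop the freeze_time decorator in favour of the pytest time_machine fixture that ships with the library, which lets the test move and shift the clock from inside its body. A compact sketch of that pattern:

    import datetime

    def test_clock_can_be_shifted(time_machine):
        time_machine.move_to("2021-09-19 04:56:35", tick=False)
        start = datetime.datetime.utcnow()
        time_machine.coordinates.shift(3)  # advance the pinned clock by 3 seconds
        assert datetime.datetime.utcnow() - start == datetime.timedelta(seconds=3)
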
diff --git a/tests/models/test_timestamp.py b/tests/models/test_timestamp.py
index 0313183f12..2315e25dd4 100644
--- a/tests/models/test_timestamp.py
+++ b/tests/models/test_timestamp.py
@@ -18,7 +18,7 @@ from __future__ import annotations
 
 import pendulum
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow.models import Log, TaskInstance
 from airflow.operators.empty import EmptyOperator
@@ -53,7 +53,7 @@ def add_log(execdate, session, dag_maker, timezone_override=None):
 @provide_session
 def test_timestamp_behaviour(dag_maker, session=None):
     execdate = timezone.utcnow()
-    with freeze_time(execdate):
+    with time_machine.travel(execdate, tick=False):
         current_time = timezone.utcnow()
         old_log = add_log(execdate, session, dag_maker)
         session.expunge(old_log)
@@ -65,7 +65,7 @@ def test_timestamp_behaviour(dag_maker, session=None):
 @provide_session
 def test_timestamp_behaviour_with_timezone(dag_maker, session=None):
     execdate = timezone.utcnow()
-    with freeze_time(execdate):
+    with time_machine.travel(execdate, tick=False):
         current_time = timezone.utcnow()
         old_log = add_log(execdate, session, dag_maker, timezone_override=pendulum.timezone("Europe/Warsaw"))
         session.expunge(old_log)
diff --git a/tests/operators/test_datetime.py b/tests/operators/test_datetime.py
index 242442e8ea..027c4c1f0f 100644
--- a/tests/operators/test_datetime.py
+++ b/tests/operators/test_datetime.py
@@ -20,8 +20,8 @@ from __future__ import annotations
 import datetime
 import unittest
 
-import freezegun
 import pytest
+import time_machine
 
 from airflow.exceptions import AirflowException
 from airflow.models import DAG, DagRun, TaskInstance as TI
@@ -113,7 +113,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                 dag=self.dag,
             )
 
-    @freezegun.freeze_time("2020-07-07 10:54:05")
+    @time_machine.travel("2020-07-07 10:54:05")
     def test_branch_datetime_operator_falls_within_range(self):
         """Check BranchDateTimeOperator branch operation"""
         for target_lower, target_upper in self.targets:
@@ -143,7 +143,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                 self.branch_op.target_upper = target_upper
 
                 for date in dates:
-                    with freezegun.freeze_time(date):
+                    with time_machine.travel(date):
                         self.branch_op.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE)
 
                         self._assert_task_ids_match_states(
@@ -154,7 +154,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                             }
                         )
 
-    @freezegun.freeze_time("2020-07-07 10:54:05")
+    @time_machine.travel("2020-07-07 10:54:05")
     def test_branch_datetime_operator_upper_comparison_within_range(self):
         """Check BranchDateTimeOperator branch operation"""
         for _, target_upper in self.targets:
@@ -172,7 +172,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                     }
                 )
 
-    @freezegun.freeze_time("2020-07-07 10:54:05")
+    @time_machine.travel("2020-07-07 10:54:05")
     def test_branch_datetime_operator_lower_comparison_within_range(self):
         """Check BranchDateTimeOperator branch operation"""
         for target_lower, _ in self.targets:
@@ -190,7 +190,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                     }
                 )
 
-    @freezegun.freeze_time("2020-07-07 12:00:00")
+    @time_machine.travel("2020-07-07 12:00:00")
     def test_branch_datetime_operator_upper_comparison_outside_range(self):
         """Check BranchDateTimeOperator branch operation"""
         for _, target_upper in self.targets:
@@ -208,7 +208,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                     }
                 )
 
-    @freezegun.freeze_time("2020-07-07 09:00:00")
+    @time_machine.travel("2020-07-07 09:00:00")
     def test_branch_datetime_operator_lower_comparison_outside_range(self):
         """Check BranchDateTimeOperator branch operation"""
         for target_lower, _ in self.targets:
@@ -226,7 +226,7 @@ class TestBranchDateTimeOperator(unittest.TestCase):
                     }
                 )
 
-    @freezegun.freeze_time("2020-12-01 09:00:00")
+    @time_machine.travel("2020-12-01 09:00:00")
     def test_branch_datetime_operator_use_task_logical_date(self):
         """Check if BranchDateTimeOperator uses task execution date"""
         in_between_date = timezone.datetime(2020, 7, 7, 10, 30, 0)
diff --git a/tests/operators/test_latest_only_operator.py b/tests/operators/test_latest_only_operator.py
index cf1c3ca18f..8c0e3d0ae1 100644
--- a/tests/operators/test_latest_only_operator.py
+++ b/tests/operators/test_latest_only_operator.py
@@ -19,7 +19,7 @@ from __future__ import annotations
 
 import datetime
 
-from freezegun import freeze_time
+import time_machine
 
 from airflow import settings
 from airflow.models import DagRun, TaskInstance
@@ -64,7 +64,7 @@ class TestLatestOnlyOperator:
             default_args={"owner": "airflow", "start_date": DEFAULT_DATE},
             schedule=INTERVAL,
         )
-        self.freezer = freeze_time(FROZEN_NOW)
+        self.freezer = time_machine.travel(FROZEN_NOW, tick=False)
         self.freezer.start()
 
     def teardown_method(self):
diff --git a/tests/operators/test_weekday.py b/tests/operators/test_weekday.py
index 046e593578..1c843c94c9 100644
--- a/tests/operators/test_weekday.py
+++ b/tests/operators/test_weekday.py
@@ -20,7 +20,7 @@ from __future__ import annotations
 import datetime
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 from parameterized import parameterized
 
 from airflow.exceptions import AirflowException
@@ -94,7 +94,7 @@ class TestBranchDayOfWeekOperator:
     @pytest.mark.parametrize(
         "weekday", TEST_CASE_BRANCH_FOLLOW_TRUE.values(), ids=TEST_CASE_BRANCH_FOLLOW_TRUE.keys()
     )
-    @freeze_time("2021-01-25")  # Monday
+    @time_machine.travel("2021-01-25")  # Monday
     def test_branch_follow_true(self, weekday):
         """Checks if BranchDayOfWeekOperator follows true branch"""
         print(datetime.datetime.now())
@@ -131,7 +131,7 @@ class TestBranchDayOfWeekOperator:
             },
         )
 
-    @freeze_time("2021-01-25")  # Monday
+    @time_machine.travel("2021-01-25")  # Monday
     def test_branch_follow_true_with_execution_date(self):
         """Checks if BranchDayOfWeekOperator follows true branch when set use_task_logical_date"""
 
@@ -166,7 +166,7 @@ class TestBranchDayOfWeekOperator:
             },
         )
 
-    @freeze_time("2021-01-25")  # Monday
+    @time_machine.travel("2021-01-25")  # Monday
     def test_branch_follow_false(self):
         """Checks if BranchDayOfWeekOperator follow false branch"""
 
@@ -245,7 +245,7 @@ class TestBranchDayOfWeekOperator:
                 dag=self.dag,
             )
 
-    @freeze_time("2021-01-25")  # Monday
+    @time_machine.travel("2021-01-25")  # Monday
     def test_branch_xcom_push_true_branch(self):
         """Check if BranchDayOfWeekOperator push to xcom value of follow_task_ids_if_true"""
         branch_op = BranchDayOfWeekOperator(
diff --git a/tests/providers/amazon/aws/hooks/test_eks.py b/tests/providers/amazon/aws/hooks/test_eks.py
index 3d3e51f94a..157c870973 100644
--- a/tests/providers/amazon/aws/hooks/test_eks.py
+++ b/tests/providers/amazon/aws/hooks/test_eks.py
@@ -25,10 +25,10 @@ from unittest import mock
 from urllib.parse import ParseResult, urlsplit
 
 import pytest
+import time_machine
 import yaml
 from _pytest._code import ExceptionInfo
 from botocore.exceptions import ClientError
-from freezegun import freeze_time
 from moto import mock_eks
 from moto.core import DEFAULT_ACCOUNT_ID
 from moto.core.exceptions import AWSError
@@ -298,7 +298,7 @@ class TestEksHooks:
             arn_under_test=generated_test_data.cluster_describe_output[ClusterAttributes.ARN],
         )
 
-    @freeze_time(FROZEN_TIME)
+    @time_machine.travel(FROZEN_TIME, tick=False)
     def test_create_cluster_generates_valid_cluster_created_timestamp(self, cluster_builder) -> None:
         _, generated_test_data = cluster_builder()
 
@@ -515,7 +515,7 @@ class TestEksHooks:
             arn_under_test=generated_test_data.nodegroup_describe_output[NodegroupAttributes.ARN],
         )
 
-    @freeze_time(FROZEN_TIME)
+    @time_machine.travel(FROZEN_TIME)
     def test_create_nodegroup_generates_valid_nodegroup_created_timestamp(self, nodegroup_builder) -> None:
         _, generated_test_data = nodegroup_builder()
 
@@ -523,7 +523,7 @@ class TestEksHooks:
 
         assert iso_date(result_time) == FROZEN_TIME
 
-    @freeze_time(FROZEN_TIME)
+    @time_machine.travel(FROZEN_TIME)
     def test_create_nodegroup_generates_valid_nodegroup_modified_timestamp(self, nodegroup_builder) -> None:
         _, generated_test_data = nodegroup_builder()
 
@@ -917,7 +917,7 @@ class TestEksHooks:
             arn_under_test=generated_test_data.fargate_describe_output[FargateProfileAttributes.ARN],
         )
 
-    @freeze_time(FROZEN_TIME)
+    @time_machine.travel(FROZEN_TIME)
     def test_create_fargate_profile_generates_valid_created_timestamp(self, fargate_profile_builder) -> None:
         _, generated_test_data = fargate_profile_builder()
 
diff --git a/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py b/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py
index 251f8d6258..726c4de7db 100644
--- a/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py
+++ b/tests/providers/amazon/aws/sensors/test_s3_keys_unchanged.py
@@ -21,7 +21,7 @@ from datetime import datetime
 from unittest import mock
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow.models.dag import DAG, AirflowException
 from airflow.providers.amazon.aws.sensors.s3 import S3KeysUnchangedSensor
@@ -68,7 +68,7 @@ class TestS3KeysUnchangedSensor:
             dag=self.dag,
         ).render_template_fields({})
 
-    @freeze_time(DEFAULT_DATE, auto_tick_seconds=10)
+    @time_machine.travel(DEFAULT_DATE)
     def test_files_deleted_between_pokes_throw_error(self):
         self.sensor.allow_delete = False
         self.sensor.is_keys_unchanged({"a", "b"})
@@ -98,19 +98,19 @@ class TestS3KeysUnchangedSensor:
             ),
         ],
     )
-    @freeze_time(DEFAULT_DATE, auto_tick_seconds=10)
-    def test_key_changes(self, current_objects, expected_returns, inactivity_periods):
-        assert self.sensor.is_keys_unchanged(current_objects[0]) == expected_returns[0]
-        assert self.sensor.inactivity_seconds == inactivity_periods[0]
-        assert self.sensor.is_keys_unchanged(current_objects[1]) == expected_returns[1]
-        assert self.sensor.inactivity_seconds == inactivity_periods[1]
-        assert self.sensor.is_keys_unchanged(current_objects[2]) == expected_returns[2]
-        assert self.sensor.inactivity_seconds == inactivity_periods[2]
+    def test_key_changes(self, current_objects, expected_returns, inactivity_periods, time_machine):
+        time_machine.move_to(DEFAULT_DATE)
+        for current, expected, period in zip(current_objects, expected_returns, inactivity_periods):
+            assert self.sensor.is_keys_unchanged(current) == expected
+            assert self.sensor.inactivity_seconds == period
+            time_machine.coordinates.shift(10)
 
-    @freeze_time(DEFAULT_DATE, auto_tick_seconds=10)
     @mock.patch("airflow.providers.amazon.aws.sensors.s3.S3Hook")
-    def test_poke_succeeds_on_upload_complete(self, mock_hook):
+    def test_poke_succeeds_on_upload_complete(self, mock_hook, time_machine):
+        time_machine.move_to(DEFAULT_DATE)
         mock_hook.return_value.list_keys.return_value = {"a"}
         assert not self.sensor.poke(dict())
+        time_machine.coordinates.shift(10)
         assert not self.sensor.poke(dict())
+        time_machine.coordinates.shift(10)
         assert self.sensor.poke(dict())
diff --git a/tests/providers/amazon/aws/utils/test_eks_get_token.py b/tests/providers/amazon/aws/utils/test_eks_get_token.py
index 700bd18373..f4b71cb94b 100644
--- a/tests/providers/amazon/aws/utils/test_eks_get_token.py
+++ b/tests/providers/amazon/aws/utils/test_eks_get_token.py
@@ -23,12 +23,12 @@ import runpy
 from unittest import mock
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 
 class TestGetEksToken:
     @mock.patch("airflow.providers.amazon.aws.hooks.eks.EksHook")
-    @freeze_time("1995-02-14")
+    @time_machine.travel("1995-02-14", tick=False)
     @pytest.mark.parametrize(
         "args, expected_aws_conn_id, expected_region_name",
         [
diff --git a/tests/providers/elasticsearch/log/test_es_task_handler.py b/tests/providers/elasticsearch/log/test_es_task_handler.py
index bdf274732e..5e23f0bd92 100644
--- a/tests/providers/elasticsearch/log/test_es_task_handler.py
+++ b/tests/providers/elasticsearch/log/test_es_task_handler.py
@@ -27,7 +27,6 @@ from unittest import mock
 from urllib.parse import quote
 
 import elasticsearch
-import freezegun
 import pendulum
 import pytest
 
@@ -500,7 +499,7 @@ class TestElasticsearchTaskHandler:
         assert self.es_task_handler.supports_external_link == expected
 
     @mock.patch("sys.__stdout__", new_callable=io.StringIO)
-    def test_dynamic_offset(self, stdout_mock, ti):
+    def test_dynamic_offset(self, stdout_mock, ti, time_machine):
         # arrange
         handler = ElasticsearchTaskHandler(
             base_log_folder=self.local_log_location,
@@ -524,12 +523,12 @@ class TestElasticsearchTaskHandler:
         t2, t3 = t1 + pendulum.duration(seconds=5), t1 + pendulum.duration(seconds=10)
 
         # act
-        with freezegun.freeze_time(t1):
-            ti.log.info("Test")
-        with freezegun.freeze_time(t2):
-            ti.log.info("Test2")
-        with freezegun.freeze_time(t3):
-            ti.log.info("Test3")
+        time_machine.move_to(t1, tick=False)
+        ti.log.info("Test")
+        time_machine.move_to(t2, tick=False)
+        ti.log.info("Test2")
+        time_machine.move_to(t3, tick=False)
+        ti.log.info("Test3")
 
         # assert
         first_log, second_log, third_log = map(json.loads, stdout_mock.getvalue().strip().split("\n"))
diff --git a/tests/providers/google/cloud/operators/test_cloud_storage_transfer_service.py b/tests/providers/google/cloud/operators/test_cloud_storage_transfer_service.py
index d696ffa14d..3c45636214 100644
--- a/tests/providers/google/cloud/operators/test_cloud_storage_transfer_service.py
+++ b/tests/providers/google/cloud/operators/test_cloud_storage_transfer_service.py
@@ -24,8 +24,8 @@ from datetime import date, time
 from unittest import mock
 
 import pytest
+import time_machine
 from botocore.credentials import Credentials
-from freezegun import freeze_time
 from parameterized import parameterized
 
 from airflow.exceptions import AirflowException
@@ -189,7 +189,7 @@ class TestTransferJobPreprocessor(unittest.TestCase):
         TransferJobPreprocessor(body=body).process_body()
         assert body[SCHEDULE][START_TIME_OF_DAY] == DICT_TIME
 
-    @freeze_time("2018-10-15")
+    @time_machine.travel("2018-10-15", tick=False)
     def test_should_set_default_schedule(self):
         body = {}
         TransferJobPreprocessor(body=body, default_schedule=True).process_body()
diff --git a/tests/sensors/test_base.py b/tests/sensors/test_base.py
index 8545b99f59..7417c183d8 100644
--- a/tests/sensors/test_base.py
+++ b/tests/sensors/test_base.py
@@ -21,7 +21,7 @@ from datetime import timedelta
 from unittest.mock import Mock, patch
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow.exceptions import AirflowException, AirflowRescheduleException, AirflowSensorTimeout
 from airflow.models import TaskReschedule
@@ -158,14 +158,14 @@ class TestBaseSensor:
             if ti.task_id == DUMMY_OP:
                 assert ti.state == State.NONE
 
-    def test_ok_with_reschedule(self, make_sensor):
+    def test_ok_with_reschedule(self, make_sensor, time_machine):
         sensor, dr = make_sensor(return_value=None, poke_interval=10, timeout=25, mode="reschedule")
         sensor.poke = Mock(side_effect=[False, False, True])
 
         # first poke returns False and task is re-scheduled
         date1 = timezone.utcnow()
-        with freeze_time(date1):
-            self._run(sensor)
+        time_machine.move_to(date1, tick=False)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -183,9 +183,9 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # second poke returns False and task is re-scheduled
+        time_machine.coordinates.shift(sensor.poke_interval)
         date2 = date1 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date2):
-            self._run(sensor)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -203,9 +203,8 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # third poke returns True and task succeeds
-        date3 = date2 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date3):
-            self._run(sensor)
+        time_machine.coordinates.shift(sensor.poke_interval)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -216,13 +215,13 @@ class TestBaseSensor:
             if ti.task_id == DUMMY_OP:
                 assert ti.state == State.NONE
 
-    def test_fail_with_reschedule(self, make_sensor):
+    def test_fail_with_reschedule(self, make_sensor, time_machine):
         sensor, dr = make_sensor(return_value=False, poke_interval=10, timeout=5, mode="reschedule")
 
         # first poke returns False and task is re-scheduled
         date1 = timezone.utcnow()
-        with freeze_time(date1):
-            self._run(sensor)
+        time_machine.move_to(date1, tick=False)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -232,10 +231,9 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # second poke returns False, timeout occurs
-        date2 = date1 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date2):
-            with pytest.raises(AirflowSensorTimeout):
-                self._run(sensor)
+        time_machine.coordinates.shift(sensor.poke_interval)
+        with pytest.raises(AirflowSensorTimeout):
+            self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -244,15 +242,15 @@ class TestBaseSensor:
             if ti.task_id == DUMMY_OP:
                 assert ti.state == State.NONE
 
-    def test_soft_fail_with_reschedule(self, make_sensor):
+    def test_soft_fail_with_reschedule(self, make_sensor, time_machine):
         sensor, dr = make_sensor(
             return_value=False, poke_interval=10, timeout=5, soft_fail=True, mode="reschedule"
         )
 
         # first poke returns False and task is re-scheduled
         date1 = timezone.utcnow()
-        with freeze_time(date1):
-            self._run(sensor)
+        time_machine.move_to(date1, tick=False)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -262,9 +260,8 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # second poke returns False, timeout occurs
-        date2 = date1 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date2):
-            self._run(sensor)
+        time_machine.coordinates.shift(sensor.poke_interval)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -273,7 +270,7 @@ class TestBaseSensor:
             if ti.task_id == DUMMY_OP:
                 assert ti.state == State.NONE
 
-    def test_ok_with_reschedule_and_retry(self, make_sensor):
+    def test_ok_with_reschedule_and_retry(self, make_sensor, time_machine):
         sensor, dr = make_sensor(
             return_value=None,
             poke_interval=10,
@@ -286,8 +283,8 @@ class TestBaseSensor:
 
         # first poke returns False and task is re-scheduled
         date1 = timezone.utcnow()
-        with freeze_time(date1):
-            self._run(sensor)
+        time_machine.move_to(date1, tick=False)
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -303,10 +300,9 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # second poke times out and task instance is failed
-        date2 = date1 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date2):
-            with pytest.raises(AirflowSensorTimeout):
-                self._run(sensor)
+        time_machine.coordinates.shift(sensor.poke_interval)
+        with pytest.raises(AirflowSensorTimeout):
+            self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -319,9 +315,9 @@ class TestBaseSensor:
         sensor.clear()
 
         # third poke returns False and task is rescheduled again
-        date3 = date2 + timedelta(seconds=sensor.poke_interval) + sensor.retry_delay
-        with freeze_time(date3):
-            self._run(sensor)
+        date3 = date1 + timedelta(seconds=sensor.poke_interval) * 2 + sensor.retry_delay
+        time_machine.coordinates.shift(sensor.poke_interval + sensor.retry_delay.total_seconds())
+        self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -337,9 +333,10 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # fourth poke return True and task succeeds
-        date4 = date3 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date4):
-            self._run(sensor)
+
+        time_machine.coordinates.shift(sensor.poke_interval)
+        self._run(sensor)
+
         tis = dr.get_task_instances()
         assert len(tis) == 2
         for ti in tis:
@@ -368,7 +365,7 @@ class TestBaseSensor:
         )
 
         # first poke returns False and task is re-scheduled
-        with freeze_time(date1):
+        with time_machine.travel(date1, tick=False):
             self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
@@ -385,7 +382,7 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # second poke returns False and task is re-scheduled
-        with freeze_time(date2):
+        with time_machine.travel(date2, tick=False):
             self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
@@ -402,7 +399,7 @@ class TestBaseSensor:
                 assert ti.state == State.NONE
 
         # third poke returns True and task succeeds
-        with freeze_time(date3):
+        with time_machine.travel(date3, tick=False):
             self._run(sensor)
         tis = dr.get_task_instances()
         assert len(tis) == 2
@@ -418,7 +415,7 @@ class TestBaseSensor:
 
         # poke returns False and AirflowRescheduleException is raised
         date1 = timezone.utcnow()
-        with freeze_time(date1):
+        with time_machine.travel(date1, tick=False):
             self._run(sensor, test_mode=True)
         tis = dr.get_task_instances()
         assert len(tis) == 2
@@ -545,7 +542,7 @@ class TestBaseSensor:
         sensor, _ = make_sensor(poke_interval=60 * 60 * 24, mode="reschedule", return_value=False)
 
         # A few hours until TIMESTAMP's limit, the next poke will take us over.
-        with freeze_time(datetime(2038, 1, 19, tzinfo=timezone.utc)):
+        with time_machine.travel(datetime(2038, 1, 19, tzinfo=timezone.utc), tick=False):
             with pytest.raises(AirflowSensorTimeout) as ctx:
                 self._run(sensor)
         assert str(ctx.value) == (
@@ -553,7 +550,7 @@ class TestBaseSensor:
             "since it is over MySQL's TIMESTAMP storage limit."
         )
 
-    def test_reschedule_and_retry_timeout(self, make_sensor):
+    def test_reschedule_and_retry_timeout(self, make_sensor, time_machine):
         """
         Test mode="reschedule", retries and timeout configurations interact correctly.
 
@@ -606,42 +603,40 @@ class TestBaseSensor:
 
         # first poke returns False and task is re-scheduled
         date1 = timezone.utcnow()
-        with freeze_time(date1):
-            self._run(sensor)
+        time_machine.move_to(date1, tick=False)
+        self._run(sensor)
         assert_ti_state(1, 2, State.UP_FOR_RESCHEDULE)
 
         # second poke raises RuntimeError and task instance retries
-        date2 = date1 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date2), pytest.raises(RuntimeError):
+        time_machine.coordinates.shift(sensor.poke_interval)
+        with pytest.raises(RuntimeError):
             self._run(sensor)
         assert_ti_state(2, 2, State.UP_FOR_RETRY)
 
         # third poke returns False and task is rescheduled again
-        date3 = date2 + sensor.retry_delay + timedelta(seconds=1)
-        with freeze_time(date3):
-            self._run(sensor)
+        time_machine.coordinates.shift(sensor.retry_delay + timedelta(seconds=1))
+        self._run(sensor)
         assert_ti_state(2, 2, State.UP_FOR_RESCHEDULE)
 
         # fourth poke times out and raises AirflowSensorTimeout
-        date4 = date3 + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date4), pytest.raises(AirflowSensorTimeout):
+        time_machine.coordinates.shift(sensor.poke_interval)
+        with pytest.raises(AirflowSensorTimeout):
             self._run(sensor)
         assert_ti_state(3, 2, State.FAILED)
 
         # Clear the failed sensor
         sensor.clear()
 
-        date_i = date4 + timedelta(seconds=20)
+        time_machine.coordinates.shift(20)
 
         for _ in range(3):
-            date_i += timedelta(seconds=sensor.poke_interval)
-            with freeze_time(date_i):
-                self._run(sensor)
+            time_machine.coordinates.shift(sensor.poke_interval)
+            self._run(sensor)
             assert_ti_state(3, 4, State.UP_FOR_RESCHEDULE)
 
         # Last poke times out and raises AirflowSensorTimeout
-        date8 = date_i + timedelta(seconds=sensor.poke_interval)
-        with freeze_time(date8), pytest.raises(AirflowSensorTimeout):
+        time_machine.coordinates.shift(sensor.poke_interval)
+        with pytest.raises(AirflowSensorTimeout):
             self._run(sensor)
         assert_ti_state(4, 4, State.FAILED)
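
coordinates.shift() accepts either a plain number of seconds or a timedelta, which is why the sensor tests above can pass sensor.poke_interval and sensor.retry_delay + timedelta(seconds=1) to it directly. A small sketch:

    from datetime import datetime, timedelta, timezone

    import time_machine

    with time_machine.travel(datetime(2020, 1, 1, tzinfo=timezone.utc), tick=False) as frozen:
        frozen.shift(10)                    # seconds
        frozen.shift(timedelta(minutes=5))  # timedelta
        assert datetime.now(timezone.utc) == datetime(2020, 1, 1, 0, 5, 10, tzinfo=timezone.utc)
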
 
diff --git a/tests/sensors/test_time_sensor.py b/tests/sensors/test_time_sensor.py
index 70b875311c..2ccfdd2c42 100644
--- a/tests/sensors/test_time_sensor.py
+++ b/tests/sensors/test_time_sensor.py
@@ -20,9 +20,9 @@ from __future__ import annotations
 from datetime import datetime, time
 from unittest.mock import patch
 
-import freezegun
 import pendulum
 import pytest
+import time_machine
 
 from airflow.exceptions import TaskDeferred
 from airflow.models.dag import DAG
@@ -35,10 +35,6 @@ DEFAULT_DATE_WO_TZ = datetime(2015, 1, 1)
 DEFAULT_DATE_WITH_TZ = datetime(2015, 1, 1, tzinfo=pendulum.tz.timezone(DEFAULT_TIMEZONE))
 
 
-@patch(
-    "airflow.sensors.time_sensor.timezone.utcnow",
-    return_value=timezone.datetime(2020, 1, 1, 23, 0).replace(tzinfo=timezone.utc),
-)
 class TestTimeSensor:
     @pytest.mark.parametrize(
         "default_timezone, start_date, expected",
@@ -48,7 +44,8 @@ class TestTimeSensor:
             (DEFAULT_TIMEZONE, DEFAULT_DATE_WO_TZ, False),
         ],
     )
-    def test_timezone(self, mock_utcnow, default_timezone, start_date, expected):
+    @time_machine.travel(timezone.datetime(2020, 1, 1, 23, 0).replace(tzinfo=timezone.utc))
+    def test_timezone(self, default_timezone, start_date, expected):
         with patch("airflow.settings.TIMEZONE", pendulum.timezone(default_timezone)):
             dag = DAG("test", default_args={"start_date": start_date})
             op = TimeSensor(task_id="test", target_time=time(10, 0), dag=dag)
@@ -56,7 +53,7 @@ class TestTimeSensor:
 
 
 class TestTimeSensorAsync:
-    @freezegun.freeze_time("2020-07-07 00:00:00")
+    @time_machine.travel("2020-07-07 00:00:00", tick=False)
     def test_task_is_deferred(self):
         with DAG("test_task_is_deferred", start_date=timezone.datetime(2020, 1, 1, 23, 0)):
             op = TimeSensorAsync(task_id="test", target_time=time(10, 0))
diff --git a/tests/ti_deps/deps/test_not_in_retry_period_dep.py b/tests/ti_deps/deps/test_not_in_retry_period_dep.py
index 107376d2a3..2abf42273a 100644
--- a/tests/ti_deps/deps/test_not_in_retry_period_dep.py
+++ b/tests/ti_deps/deps/test_not_in_retry_period_dep.py
@@ -20,7 +20,7 @@ from __future__ import annotations
 from datetime import timedelta
 from unittest.mock import Mock
 
-from freezegun import freeze_time
+import time_machine
 
 from airflow.models import TaskInstance
 from airflow.ti_deps.deps.not_in_retry_period_dep import NotInRetryPeriodDep
@@ -35,7 +35,7 @@ class TestNotInRetryPeriodDep:
         ti.end_date = end_date
         return ti
 
-    @freeze_time("2016-01-01 15:44")
+    @time_machine.travel("2016-01-01 15:44")
     def test_still_in_retry_period(self):
         """
         Task instances that are in their retry period should fail this dep
@@ -44,7 +44,7 @@ class TestNotInRetryPeriodDep:
         assert ti.is_premature
         assert not NotInRetryPeriodDep().is_met(ti=ti)
 
-    @freeze_time("2016-01-01 15:46")
+    @time_machine.travel("2016-01-01 15:46")
     def test_retry_period_finished(self):
         """
         Task instance's that have had their retry period elapse should pass this dep
diff --git a/tests/ti_deps/deps/test_runnable_exec_date_dep.py b/tests/ti_deps/deps/test_runnable_exec_date_dep.py
index eebde0cd42..4df559e489 100644
--- a/tests/ti_deps/deps/test_runnable_exec_date_dep.py
+++ b/tests/ti_deps/deps/test_runnable_exec_date_dep.py
@@ -20,7 +20,7 @@ from __future__ import annotations
 from unittest.mock import Mock, patch
 
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow import settings
 from airflow.models import DagRun, TaskInstance
@@ -36,7 +36,7 @@ def clean_db(session):
     session.query(TaskInstance).delete()
 
 
-@freeze_time("2016-11-01")
+@time_machine.travel("2016-11-01")
 @pytest.mark.parametrize(
     "allow_trigger_in_future,schedule_interval,execution_date,is_met",
     [
@@ -74,7 +74,7 @@ def test_exec_date_dep(
         assert RunnableExecDateDep().is_met(ti=ti) == is_met
 
 
-@freeze_time("2016-01-01")
+@time_machine.travel("2016-01-01")
 def test_exec_date_after_end_date(session, dag_maker, create_dummy_dag):
     """
     If the dag's execution date is in the future this dep should fail
diff --git a/tests/timetables/test_interval_timetable.py b/tests/timetables/test_interval_timetable.py
index 3c324672a7..596c274cf7 100644
--- a/tests/timetables/test_interval_timetable.py
+++ b/tests/timetables/test_interval_timetable.py
@@ -20,9 +20,9 @@ from __future__ import annotations
 import datetime
 
 import dateutil.relativedelta
-import freezegun
 import pendulum
 import pytest
+import time_machine
 
 from airflow.exceptions import AirflowTimetableInvalid
 from airflow.settings import TIMEZONE
@@ -52,7 +52,7 @@ DELTA_FROM_MIDNIGHT = datetime.timedelta(minutes=30, hours=16)
     "last_automated_data_interval",
     [pytest.param(None, id="first-run"), pytest.param(PREV_DATA_INTERVAL, id="subsequent")],
 )
-@freezegun.freeze_time(CURRENT_TIME)
+@time_machine.travel(CURRENT_TIME)
 def test_no_catchup_first_starts_at_current_time(
     last_automated_data_interval: DataInterval | None,
 ) -> None:
@@ -73,7 +73,7 @@ def test_no_catchup_first_starts_at_current_time(
     "catchup",
     [pytest.param(True, id="catchup_true"), pytest.param(False, id="catchup_false")],
 )
-@freezegun.freeze_time(CURRENT_TIME)
+@time_machine.travel(CURRENT_TIME)
 def test_new_schedule_interval_next_info_starts_at_new_time(
     earliest: pendulum.DateTime | None,
     catchup: bool,
@@ -100,7 +100,7 @@ def test_new_schedule_interval_next_info_starts_at_new_time(
     "last_automated_data_interval",
     [pytest.param(None, id="first-run"), pytest.param(PREV_DATA_INTERVAL, id="subsequent")],
 )
-@freezegun.freeze_time(CURRENT_TIME)
+@time_machine.travel(CURRENT_TIME)
 def test_no_catchup_next_info_starts_at_current_time(
     timetable: Timetable,
     last_automated_data_interval: DataInterval | None,
diff --git a/tests/timetables/test_trigger_timetable.py b/tests/timetables/test_trigger_timetable.py
index cabb1198ef..6f1d44479f 100644
--- a/tests/timetables/test_trigger_timetable.py
+++ b/tests/timetables/test_trigger_timetable.py
@@ -20,10 +20,10 @@ import datetime
 import typing
 
 import dateutil.relativedelta
-import freezegun
 import pendulum
 import pendulum.tz
 import pytest
+import time_machine
 
 from airflow.exceptions import AirflowTimetableInvalid
 from airflow.timetables.base import DagRunInfo, DataInterval, TimeRestriction
@@ -63,7 +63,7 @@ DELTA_FROM_MIDNIGHT = datetime.timedelta(minutes=30, hours=16)
         ),
     ],
 )
-@freezegun.freeze_time(CURRENT_TIME)
+@time_machine.travel(CURRENT_TIME)
 def test_daily_cron_trigger_no_catchup_first_starts_at_next_schedule(
     last_automated_data_interval: DataInterval | None,
     next_start_time: pendulum.DateTime,
@@ -105,7 +105,7 @@ def test_hourly_cron_trigger_no_catchup_next_info(
     earliest: pendulum.DateTime,
     expected: DagRunInfo,
 ) -> None:
-    with freezegun.freeze_time(current_time):
+    with time_machine.travel(current_time):
         next_info = HOURLY_CRON_TRIGGER_TIMETABLE.next_dagrun_info(
             last_automated_data_interval=PREV_DATA_INTERVAL_EXACT,
             restriction=TimeRestriction(earliest=earliest, latest=None, catchup=False),
diff --git a/tests/utils/log/test_file_processor_handler.py b/tests/utils/log/test_file_processor_handler.py
index 1d479c71da..e11ff35b0c 100644
--- a/tests/utils/log/test_file_processor_handler.py
+++ b/tests/utils/log/test_file_processor_handler.py
@@ -21,7 +21,7 @@ import os
 import shutil
 from datetime import timedelta
 
-from freezegun import freeze_time
+import time_machine
 
 from airflow.utils import timezone
 from airflow.utils.log.file_processor_handler import FileProcessorHandler
@@ -77,13 +77,13 @@ class TestFileProcessorHandler:
 
         link = os.path.join(self.base_log_folder, "latest")
 
-        with freeze_time(date1):
+        with time_machine.travel(date1, tick=False):
             handler.set_context(filename=os.path.join(self.dag_dir, "log1"))
             assert os.path.islink(link)
             assert os.path.basename(os.readlink(link)) == date1
             assert os.path.exists(os.path.join(link, "log1"))
 
-        with freeze_time(date2):
+        with time_machine.travel(date2, tick=False):
             handler.set_context(filename=os.path.join(self.dag_dir, "log2"))
             assert os.path.islink(link)
             assert os.path.basename(os.readlink(link)) == date2
@@ -104,7 +104,7 @@ class TestFileProcessorHandler:
             os.remove(link)
         os.makedirs(link)
 
-        with freeze_time(date1):
+        with time_machine.travel(date1, tick=False):
             handler.set_context(filename=os.path.join(self.dag_dir, "log1"))
 
     def teardown_method(self):
diff --git a/tests/utils/test_serve_logs.py b/tests/utils/test_serve_logs.py
index e306c50e2b..5288e646ed 100644
--- a/tests/utils/test_serve_logs.py
+++ b/tests/utils/test_serve_logs.py
@@ -21,7 +21,7 @@ from typing import TYPE_CHECKING
 
 import jwt
 import pytest
-from freezegun import freeze_time
+import time_machine
 
 from airflow.utils.jwt_signer import JWTSigner
 from airflow.utils.serve_logs import create_app
@@ -92,7 +92,7 @@ class TestServeLogs:
         assert response.status_code == 403
 
     def test_forbidden_expired(self, client: FlaskClient, signer):
-        with freeze_time("2010-01-14"):
+        with time_machine.travel("2010-01-14"):
             token = signer.generate_signed_token({"filename": "sample.log"})
         assert (
             client.get(
@@ -105,7 +105,7 @@ class TestServeLogs:
         )
 
     def test_forbidden_future(self, client: FlaskClient, signer):
-        with freeze_time(datetime.datetime.utcnow() + datetime.timedelta(seconds=3600)):
+        with time_machine.travel(datetime.datetime.utcnow() + datetime.timedelta(seconds=3600)):
             token = signer.generate_signed_token({"filename": "sample.log"})
         assert (
             client.get(
@@ -118,7 +118,7 @@ class TestServeLogs:
         )
 
     def test_ok_with_short_future_skew(self, client: FlaskClient, signer):
-        with freeze_time(datetime.datetime.utcnow() + datetime.timedelta(seconds=1)):
+        with time_machine.travel(datetime.datetime.utcnow() + datetime.timedelta(seconds=1)):
             token = signer.generate_signed_token({"filename": "sample.log"})
         assert (
             client.get(
@@ -131,7 +131,7 @@ class TestServeLogs:
         )
 
     def test_ok_with_short_past_skew(self, client: FlaskClient, signer):
-        with freeze_time(datetime.datetime.utcnow() - datetime.timedelta(seconds=31)):
+        with time_machine.travel(datetime.datetime.utcnow() - datetime.timedelta(seconds=31)):
             token = signer.generate_signed_token({"filename": "sample.log"})
         assert (
             client.get(
@@ -144,7 +144,7 @@ class TestServeLogs:
         )
 
     def test_forbidden_with_long_future_skew(self, client: FlaskClient, signer):
-        with freeze_time(datetime.datetime.utcnow() + datetime.timedelta(seconds=10)):
+        with time_machine.travel(datetime.datetime.utcnow() + datetime.timedelta(seconds=10)):
             token = signer.generate_signed_token({"filename": "sample.log"})
         assert (
             client.get(
@@ -157,7 +157,7 @@ class TestServeLogs:
         )
 
     def test_forbidden_with_long_past_skew(self, client: FlaskClient, signer):
-        with freeze_time(datetime.datetime.utcnow() - datetime.timedelta(seconds=40)):
+        with time_machine.travel(datetime.datetime.utcnow() - datetime.timedelta(seconds=40)):
             token = signer.generate_signed_token({"filename": "sample.log"})
         assert (
             client.get(
diff --git a/tests/www/test_security.py b/tests/www/test_security.py
index b34842a92d..31e6ca3b46 100644
--- a/tests/www/test_security.py
+++ b/tests/www/test_security.py
@@ -23,9 +23,9 @@ import logging
 from unittest import mock
 
 import pytest
+import time_machine
 from flask_appbuilder import SQLA, Model, expose, has_access
 from flask_appbuilder.views import BaseView, ModelView
-from freezegun import freeze_time
 from sqlalchemy import Column, Date, Float, Integer, String
 
 from airflow.exceptions import AirflowException
@@ -904,7 +904,7 @@ def old_user():
     return user
 
 
-@freeze_time(datetime.datetime(1985, 11, 5, 1, 24, 0))  # Get the Delorean, doc!
+@time_machine.travel(datetime.datetime(1985, 11, 5, 1, 24, 0), tick=False)
 def test_update_user_auth_stat_first_successful_auth(mock_security_manager, new_user):
     mock_security_manager.update_user_auth_stat(new_user, success=True)
 
@@ -914,7 +914,7 @@ def test_update_user_auth_stat_first_successful_auth(mock_security_manager, new_
     assert mock_security_manager.update_user.called_once
 
 
-@freeze_time(datetime.datetime(1985, 11, 5, 1, 24, 0))
+@time_machine.travel(datetime.datetime(1985, 11, 5, 1, 24, 0), tick=False)
 def test_update_user_auth_stat_subsequent_successful_auth(mock_security_manager, old_user):
     mock_security_manager.update_user_auth_stat(old_user, success=True)
 
@@ -924,7 +924,7 @@ def test_update_user_auth_stat_subsequent_successful_auth(mock_security_manager,
     assert mock_security_manager.update_user.called_once
 
 
-@freeze_time(datetime.datetime(1985, 11, 5, 1, 24, 0))
+@time_machine.travel(datetime.datetime(1985, 11, 5, 1, 24, 0), tick=False)
 def test_update_user_auth_stat_first_unsuccessful_auth(mock_security_manager, new_user):
     mock_security_manager.update_user_auth_stat(new_user, success=False)
 
@@ -934,7 +934,7 @@ def test_update_user_auth_stat_first_unsuccessful_auth(mock_security_manager, ne
     assert mock_security_manager.update_user.called_once
 
 
-@freeze_time(datetime.datetime(1985, 11, 5, 1, 24, 0))
+@time_machine.travel(datetime.datetime(1985, 11, 5, 1, 24, 0), tick=False)
 def test_update_user_auth_stat_subsequent_unsuccessful_auth(mock_security_manager, old_user):
     mock_security_manager.update_user_auth_stat(old_user, success=False)
 
diff --git a/tests/www/views/test_views_grid.py b/tests/www/views/test_views_grid.py
index 40dd1ce917..bc17ddc5de 100644
--- a/tests/www/views/test_views_grid.py
+++ b/tests/www/views/test_views_grid.py
@@ -17,8 +17,6 @@
 # under the License.
 from __future__ import annotations
 
-from datetime import datetime, timedelta
-
 import pendulum
 import pytest
 from dateutil.tz import UTC
@@ -29,6 +27,7 @@ from airflow.models import DagBag
 from airflow.models.dagrun import DagRun
 from airflow.models.dataset import DatasetDagRunQueue, DatasetEvent, DatasetModel
 from airflow.operators.empty import EmptyOperator
+from airflow.utils import timezone
 from airflow.utils.state import DagRunState, TaskInstanceState
 from airflow.utils.task_group import TaskGroup
 from airflow.utils.types import DagRunType
@@ -129,6 +128,14 @@ def test_no_runs(admin_client, dag_without_runs):
     }
 
 
+# Create this as a fixture so that it is applied before the `dag_with_runs` fixture is!
+@pytest.fixture
+def freeze_time_for_dagruns(time_machine):
+    time_machine.move_to("2022-01-02T00:00:00+00:00", tick=False)
+    yield
+
+
+@pytest.mark.usefixtures("freeze_time_for_dagruns")
 def test_one_run(admin_client, dag_with_runs: list[DagRun], session):
     """
     Test a DAG with complex interaction of states:
@@ -159,22 +166,14 @@ def test_one_run(admin_client, dag_with_runs: list[DagRun], session):
 
     assert resp.status_code == 200, resp.json
 
-    # We cannot use freezegun here as it does not play well with Flask 2.2 and SqlAlchemy
-    # Unlike real datetime, when FakeDatetime is used, it coerces to
-    # '2020-08-06 09:00:00+00:00' which is rejected by MySQL for EXPIRY Column
-    current_date_placeholder = "2022-01-02T00:00:00+00:00"
-    actual_date_in_json = datetime.fromisoformat(resp.json["dag_runs"][0]["end_date"])
-    assert datetime.now(tz=UTC) - actual_date_in_json < timedelta(minutes=5)
-    res = resp.json
-    res["dag_runs"][0]["end_date"] = current_date_placeholder
-    assert res == {
+    assert resp.json == {
         "dag_runs": [
             {
                 "conf": None,
                 "conf_is_json": False,
                 "data_interval_end": "2016-01-02T00:00:00+00:00",
                 "data_interval_start": "2016-01-01T00:00:00+00:00",
-                "end_date": current_date_placeholder,
+                "end_date": timezone.utcnow().isoformat(),
                 "execution_date": "2016-01-01T00:00:00+00:00",
                 "external_trigger": False,
                 "last_scheduling_decision": None,
diff --git a/tests/www/views/test_views_tasks.py b/tests/www/views/test_views_tasks.py
index ab3b40f0b2..1543db30f6 100644
--- a/tests/www/views/test_views_tasks.py
+++ b/tests/www/views/test_views_tasks.py
@@ -22,10 +22,9 @@ import json
 import re
 import unittest.mock
 import urllib.parse
-from datetime import timedelta
 
-import freezegun
 import pytest
+import time_machine
 
 from airflow import settings
 from airflow.exceptions import AirflowException
@@ -61,7 +60,7 @@ def reset_dagruns():
 
 @pytest.fixture(autouse=True)
 def init_dagruns(app, reset_dagruns):
-    with freezegun.freeze_time(DEFAULT_DATE):
+    with time_machine.travel(DEFAULT_DATE, tick=False):
         app.dag_bag.get_dag("example_bash_operator").create_dagrun(
             run_id=DEFAULT_DAGRUN,
             run_type=DagRunType.SCHEDULED,
@@ -563,7 +562,7 @@ def test_run_with_runnable_states(_, admin_client, session, state):
     "airflow.executors.executor_loader.ExecutorLoader.get_default_executor",
     return_value=_ForceHeartbeatCeleryExecutor(),
 )
-def test_run_ignoring_deps_sets_queued_dttm(_, admin_client, session):
+def test_run_ignoring_deps_sets_queued_dttm(_, admin_client, session, time_machine):
     task_id = "runme_0"
     session.query(TaskInstance).filter(TaskInstance.task_id == task_id).update(
         {"state": State.SCHEDULED, "queued_dttm": None}
@@ -579,15 +578,13 @@ def test_run_ignoring_deps_sets_queued_dttm(_, admin_client, session):
         dag_run_id=DEFAULT_DAGRUN,
         origin="/home",
     )
+    now = timezone.utcnow()
+
+    time_machine.move_to(now, tick=False)
     resp = admin_client.post("run", data=form, follow_redirects=True)
 
     assert resp.status_code == 200
-    # We cannot use freezegun here as it does not play well with Flask 2.2 and SqlAlchemy
-    # Unlike real datetime, when FakeDatetime is used, it coerces to
-    # '2020-08-06 09:00:00+00:00' which is rejected by MySQL for EXPIRY Column
-    assert timezone.utcnow() - session.query(TaskInstance.queued_dttm).filter(
-        TaskInstance.task_id == task_id
-    ).scalar() < timedelta(minutes=5)
+    assert session.query(TaskInstance.queued_dttm).filter(TaskInstance.task_id == task_id).scalar() == now
 
 
 @pytest.mark.parametrize("state", QUEUEABLE_STATES)
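
For readers unfamiliar with time-machine, the two usage patterns in the diffs above are the context manager (time_machine.travel) and the pytest fixture (time_machine.move_to). A minimal, self-contained sketch of both, assuming the time-machine package is installed and its pytest plugin is active; the test name and timestamp are illustrative only:

    import datetime

    import time_machine

    FROZEN = "2022-01-02T00:00:00+00:00"

    # Context-manager style, as in init_dagruns above: everything inside the
    # block sees the frozen clock.
    with time_machine.travel(FROZEN, tick=False):
        assert datetime.datetime.now(tz=datetime.timezone.utc).isoformat() == FROZEN

    # Fixture style, as in test_run_ignoring_deps_sets_queued_dttm above: the
    # `time_machine` pytest fixture exposes move_to() and is rolled back
    # automatically when the test finishes.
    def test_uses_frozen_clock(time_machine):
        time_machine.move_to(FROZEN, tick=False)
        assert datetime.datetime.now(tz=datetime.timezone.utc).isoformat() == FROZEN

With tick=False the clock is fully frozen, which is what lets the grid-view test above compare end_date against timezone.utcnow() exactly instead of using the old "within five minutes" workaround.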


[airflow] 10/12: Update black version automatically in pre-commit configuration (#28578)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 587b14456925d182f7eaadd2bc625d57c1953c79
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Dec 24 16:08:40 2022 +0100

    Update black version automatically in pre-commit configuration (#28578)
    
    Follow up after #28576
    
    (cherry picked from commit 11575931e6d1f824154c84f71fb1dc77dd1b638f)
---
 .pre-commit-config.yaml                            | 15 +++++--
 STATIC_CODE_CHECKS.rst                             |  2 +
 dev/breeze/src/airflow_breeze/pre_commit_ids.py    |  1 +
 images/breeze/output-commands-hash.txt             |  2 +-
 images/breeze/output_static-checks.svg             | 48 ++++++++++++----------
 images/breeze/output_stop.svg                      | 24 +++++------
 .../pre_commit/pre_commit_update_black_version.py  | 37 +++++++++++++++++
 7 files changed, 91 insertions(+), 38 deletions(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index e188a72a5d..93ca966d0b 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -146,12 +146,21 @@ repos:
           - --fuzzy-match-generates-todo
         files: >
           \.cfg$|\.conf$|\.ini$|\.ldif$|\.properties$|\.readthedocs$|\.service$|\.tf$|Dockerfile.*$
+  - repo: local
+    hooks:
+      - id: update-black-version
+        name: Update black versions everywhere
+        entry: ./scripts/ci/pre_commit/pre_commit_update_black_version.py
+        language: python
+        files: ^.pre-commit-config.yaml$
+        additional_dependencies: ['pyyaml']
+        pass_filenames: false
+        require_serial: true
   - repo: https://github.com/PyCQA/isort
     rev: 5.11.2
     hooks:
       - id: isort
         name: Run isort to sort imports in Python files
-  # Keep version of black in sync wit blacken-docs and pre-commit-hook-names
   - repo: https://github.com/psf/black
     rev: 22.12.0
     hooks:
@@ -171,7 +180,7 @@ repos:
           - --target-version=py39
           - --target-version=py310
         alias: black
-        additional_dependencies: [black==22.3.0]
+        additional_dependencies: [black==22.12.0]
   - repo: https://github.com/pre-commit/pre-commit-hooks
     rev: v4.2.0
     hooks:
@@ -649,7 +658,7 @@ repos:
           - --max-length=64
         language: python
         files: ^\.pre-commit-config\.yaml$|^scripts/ci/pre_commit/pre_commit_check_pre_commit_hook_names\.py$
-        additional_dependencies: ['pyyaml', 'jinja2', 'black==22.3.0', 'tabulate', 'rich>=12.4.4']
+        additional_dependencies: ['pyyaml', 'jinja2', 'black==22.12.0', 'tabulate', 'rich>=12.4.4']
         require_serial: true
         pass_filenames: false
       - id: update-breeze-readme-config-hash
diff --git a/STATIC_CODE_CHECKS.rst b/STATIC_CODE_CHECKS.rst
index 8433fa3395..1b4732cd44 100644
--- a/STATIC_CODE_CHECKS.rst
+++ b/STATIC_CODE_CHECKS.rst
@@ -311,6 +311,8 @@ require Breeze Docker image to be build locally.
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | ts-compile-and-lint-javascript                            | TS types generation and ESLint against current UI files          |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
+| update-black-version                                      | Update black versions everywhere                                 |         |
++-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | update-breeze-cmd-output                                  | Update output of breeze commands in BREEZE.rst                   |         |
 +-----------------------------------------------------------+------------------------------------------------------------------+---------+
 | update-breeze-readme-config-hash                          | Update Breeze README.md with config files hash                   |         |
diff --git a/dev/breeze/src/airflow_breeze/pre_commit_ids.py b/dev/breeze/src/airflow_breeze/pre_commit_ids.py
index 4914cb0914..851e68be39 100644
--- a/dev/breeze/src/airflow_breeze/pre_commit_ids.py
+++ b/dev/breeze/src/airflow_breeze/pre_commit_ids.py
@@ -99,6 +99,7 @@ PRE_COMMIT_LIST = [
     "static-check-autoflake",
     "trailing-whitespace",
     "ts-compile-and-lint-javascript",
+    "update-black-version",
     "update-breeze-cmd-output",
     "update-breeze-readme-config-hash",
     "update-er-diagram",
diff --git a/images/breeze/output-commands-hash.txt b/images/breeze/output-commands-hash.txt
index b721a519c5..3ac57898ce 100644
--- a/images/breeze/output-commands-hash.txt
+++ b/images/breeze/output-commands-hash.txt
@@ -55,7 +55,7 @@ setup:version:123b462a421884dc2320ffc5e54b2478
 setup:f383b9236f6141f95276136ccd9217f5
 shell:affbf6f7f469408d0af47f75c6a38f6c
 start-airflow:109728919a0dd5c5ff5640ae86ba9e90
-static-checks:7a39e28c87fbca0a9fae0ebfe1591b71
+static-checks:6c18cfc471ad4118a11fc84d41abb747
 stop:e5aa686b4e53707ced4039d8414d5cd6
 testing:docker-compose-tests:b86c044b24138af0659a05ed6331576c
 testing:helm-tests:94a442e7f3f63b34c4831a84d165690a
diff --git a/images/breeze/output_static-checks.svg b/images/breeze/output_static-checks.svg
index 3cd8b66bbc..87a81e7c32 100644
--- a/images/breeze/output_static-checks.svg
+++ b/images/breeze/output_static-checks.svg
@@ -1,4 +1,4 @@
-<svg class="rich-terminal" viewBox="0 0 1482 1343.1999999999998" xmlns="http://www.w3.org/2000/svg">
+<svg class="rich-terminal" viewBox="0 0 1482 1367.6" xmlns="http://www.w3.org/2000/svg">
     <!-- Generated with Rich https://www.textualize.io -->
     <style>
 
@@ -43,7 +43,7 @@
 
     <defs>
     <clipPath id="breeze-static-checks-clip-terminal">
-      <rect x="0" y="0" width="1463.0" height="1292.1999999999998" />
+      <rect x="0" y="0" width="1463.0" height="1316.6" />
     </clipPath>
     <clipPath id="breeze-static-checks-line-0">
     <rect x="0" y="1.5" width="1464" height="24.65"/>
@@ -201,9 +201,12 @@
 <clipPath id="breeze-static-checks-line-51">
     <rect x="0" y="1245.9" width="1464" height="24.65"/>
             </clipPath>
+<clipPath id="breeze-static-checks-line-52">
+    <rect x="0" y="1270.3" width="1464" height="24.65"/>
+            </clipPath>
     </defs>
 
-    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1341.2" rx="8"/><text class="breeze-static-checks-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;static-checks</text>
+    <rect fill="#292929" stroke="rgba(255,255,255,0.35)" stroke-width="1" x="1" y="1" width="1480" height="1365.6" rx="8"/><text class="breeze-static-checks-title" fill="#c5c8c6" text-anchor="middle" x="740" y="27">Command:&#160;static-checks</text>
             <g transform="translate(26,22)">
             <circle cx="0" cy="0" r="7" fill="#ff5f57"/>
             <circle cx="22" cy="0" r="7" fill="#febc2e"/>
@@ -247,25 +250,26 @@
 </text><text class="breeze-static-checks-r5" x="0" y="776.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-31)">│</text><text class="breeze-static-checks-r7" x="366" y="776.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-31)">pretty-format-json&#160;|&#160;pydocstyle&#160;|&#160;python-no-log-warn&#160;|&#160;pyupgrade&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</ [...]
 </text><text class="breeze-static-checks-r5" x="0" y="800.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-32)">│</text><text class="breeze-static-checks-r7" x="366" y="800.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-32)">replace-bad-characters&#160;|&#160;rst-backticks&#160;|&#160;run-flake8&#160;|&#160;run-mypy&#160;|&#160;run-shellcheck&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="800.8" t [...]
 </text><text class="breeze-static-checks-r5" x="0" y="825.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-33)">│</text><text class="breeze-static-checks-r7" x="366" y="825.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-33)">static-check-autoflake&#160;|&#160;trailing-whitespace&#160;|&#160;ts-compile-and-lint-javascript&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="825.2" textLength= [...]
-</text><text class="breeze-static-checks-r5" x="0" y="849.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-34)">│</text><text class="breeze-static-checks-r7" x="366" y="849.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-34)">update-breeze-cmd-output&#160;|&#160;update-breeze-readme-config-hash&#160;|&#160;update-er-diagram&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="849.6" textLength="12.2" cli [...]
-</text><text class="breeze-static-checks-r5" x="0" y="874" textLength="12.2" clip-path="url(#breeze-static-checks-line-35)">│</text><text class="breeze-static-checks-r7" x="366" y="874" textLength="1073.6" clip-path="url(#breeze-static-checks-line-35)">update-extras&#160;|&#160;update-in-the-wild-to-be-sorted&#160;|&#160;update-inlined-dockerfile-scripts&#160;|&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="874" textLength="12.2" clip-path="url(#breeze-static [...]
-</text><text class="breeze-static-checks-r5" x="0" y="898.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-36)">│</text><text class="breeze-static-checks-r7" x="366" y="898.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-36)">update-local-yml-file&#160;|&#160;update-migration-references&#160;|&#160;update-providers-dependencies&#160;|&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="898.4" textLength="12.2" clip-path="url(#breeze- [...]
-</text><text class="breeze-static-checks-r5" x="0" y="922.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-37)">│</text><text class="breeze-static-checks-r7" x="366" y="922.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-37)">update-spelling-wordlist-to-be-sorted&#160;|&#160;update-supported-versions&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze- [...]
-</text><text class="breeze-static-checks-r5" x="0" y="947.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-38)">│</text><text class="breeze-static-checks-r7" x="366" y="947.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-38)">update-vendored-in-k8s-json-schema&#160;|&#160;update-version&#160;|&#160;yamllint&#160;|&#160;yesqa)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-stati [...]
-</text><text class="breeze-static-checks-r5" x="0" y="971.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-39)">│</text><text class="breeze-static-checks-r4" x="24.4" y="971.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-39)">-</text><text class="breeze-static-checks-r4" x="36.6" y="971.6" textLength="61" clip-path="url(#breeze-static-checks-line-39)">-file</text><text class="breeze-static-checks-r6" x="317.2" y="971.6" textLength="24.4" clip-path="url(#bree [...]
-</text><text class="breeze-static-checks-r5" x="0" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">│</text><text class="breeze-static-checks-r4" x="24.4" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">-</text><text class="breeze-static-checks-r4" x="36.6" y="996" textLength="48.8" clip-path="url(#breeze-static-checks-line-40)">-all</text><text class="breeze-static-checks-r4" x="85.4" y="996" textLength="73.2" clip-path="url(#breeze-stati [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1020.4" textLength="61" clip-path="url(#breeze-static-checks-line-41)">-show</text><text class="breeze-static-checks-r4" x="97.6" y="1020.4" textLength="195.2" clip-path="url(# [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1044.8" textLength="61" clip-path="url(#breeze-static-checks-line-42)">-last</text><text class="breeze-static-checks-r4" x="97.6" y="1044.8" textLength="85.4" clip-path="url(#b [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1069.2" textLength="85.4" clip-path="url(#breeze-static-checks-line-43)">-commit</text><text class="breeze-static-checks-r4" x="122" y="1069.2" textLength="48.8" clip-path="url [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">│</text><text class="breeze-static-checks-r2" x="366" y="1093.6" textLength="292.8" clip-path="url(#breeze-static-checks-line-44)">Mutually&#160;exclusive&#160;with&#160;</text><text class="breeze-static-checks-r4" x="658.8" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">-</text><text class="breeze-static-checks-r4" x="671" y="1093.6" [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1118" textLength="12.2" clip-path="url(#breeze-static-checks-line-45)">│</text><text class="breeze-static-checks-r7" x="366" y="1118" textLength="1073.6" clip-path="url(#breeze-static-checks-line-45)">(TEXT)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#1 [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1142.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1142.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1142.4" textLength="85.4" clip-path="url(#breeze-static-checks-line-46)">-github</text><text class="breeze-static-checks-r4" x="122" y="1142.4" textLength="134.2" clip-path="ur [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1166.8" textLength="1464" clip-path="url(#breeze-static-checks-line-47)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1166.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">
-</text><text class="breeze-static-checks-r5" x="0" y="1191.2" textLength="24.4" clip-path="url(#breeze-static-checks-line-48)">╭─</text><text class="breeze-static-checks-r5" x="24.4" y="1191.2" textLength="195.2" clip-path="url(#breeze-static-checks-line-48)">&#160;Common&#160;options&#160;</text><text class="breeze-static-checks-r5" x="219.6" y="1191.2" textLength="1220" clip-path="url(#breeze-static-checks-line-48)">────────────────────────────────────────────────────────────────────── [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1215.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-49)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1215.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-49)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1215.6" textLength="97.6" clip-path="url(#breeze-static-checks-line-49)">-verbose</text><text class="breeze-static-checks-r6" x="158.6" y="1215.6" textLength="24.4" clip-path=" [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1240" textLength="48.8" clip-path="url(#breeze-static-checks-line-50)">-dry</text><text class="breeze-static-checks-r4" x="85.4" y="1240" textLength="48.8" clip-path="url(#breeze-s [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1264.4" textLength="61" clip-path="url(#breeze-static-checks-line-51)">-help</text><text class="breeze-static-checks-r6" x="158.6" y="1264.4" textLength="24.4" clip-path="url(# [...]
-</text><text class="breeze-static-checks-r5" x="0" y="1288.8" textLength="1464" clip-path="url(#breeze-static-checks-line-52)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1288.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">
+</text><text class="breeze-static-checks-r5" x="0" y="849.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-34)">│</text><text class="breeze-static-checks-r7" x="366" y="849.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-34)">update-black-version&#160;|&#160;update-breeze-cmd-output&#160;|&#160;update-breeze-readme-config-hash&#160;|&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="849.6" textLength="12.2" clip-path="url(#br [...]
+</text><text class="breeze-static-checks-r5" x="0" y="874" textLength="12.2" clip-path="url(#breeze-static-checks-line-35)">│</text><text class="breeze-static-checks-r7" x="366" y="874" textLength="1073.6" clip-path="url(#breeze-static-checks-line-35)">update-er-diagram&#160;|&#160;update-extras&#160;|&#160;update-in-the-wild-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-stat [...]
+</text><text class="breeze-static-checks-r5" x="0" y="898.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-36)">│</text><text class="breeze-static-checks-r7" x="366" y="898.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-36)">update-inlined-dockerfile-scripts&#160;|&#160;update-local-yml-file&#160;|&#160;update-migration-references&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="898.4" textLength="12.2" clip-path="url(#breeze-static-checks-l [...]
+</text><text class="breeze-static-checks-r5" x="0" y="922.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-37)">│</text><text class="breeze-static-checks-r7" x="366" y="922.8" textLength="1073.6" clip-path="url(#breeze-static-checks-line-37)">|&#160;update-providers-dependencies&#160;|&#160;update-spelling-wordlist-to-be-sorted&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="947.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-38)">│</text><text class="breeze-static-checks-r7" x="366" y="947.2" textLength="1073.6" clip-path="url(#breeze-static-checks-line-38)">update-supported-versions&#160;|&#160;update-vendored-in-k8s-json-schema&#160;|&#160;update-version&#160;|&#160;&#160;&#160;&#160;&#160;&#160;&#160;</text><text class="breeze-static-checks-r5" x="1451.8" y="947.2" textLength="12.2" cli [...]
+</text><text class="breeze-static-checks-r5" x="0" y="971.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-39)">│</text><text class="breeze-static-checks-r7" x="366" y="971.6" textLength="1073.6" clip-path="url(#breeze-static-checks-line-39)">yamllint&#160;|&#160;yesqa)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#16 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">│</text><text class="breeze-static-checks-r4" x="24.4" y="996" textLength="12.2" clip-path="url(#breeze-static-checks-line-40)">-</text><text class="breeze-static-checks-r4" x="36.6" y="996" textLength="61" clip-path="url(#breeze-static-checks-line-40)">-file</text><text class="breeze-static-checks-r6" x="317.2" y="996" textLength="24.4" clip-path="url(#breeze-stati [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1020.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-41)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1020.4" textLength="48.8" clip-path="url(#breeze-static-checks-line-41)">-all</text><text class="breeze-static-checks-r4" x="85.4" y="1020.4" textLength="73.2" clip-path="url(# [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1044.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-42)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1044.8" textLength="61" clip-path="url(#breeze-static-checks-line-42)">-show</text><text class="breeze-static-checks-r4" x="97.6" y="1044.8" textLength="195.2" clip-path="url(# [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1069.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-43)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1069.2" textLength="61" clip-path="url(#breeze-static-checks-line-43)">-last</text><text class="breeze-static-checks-r4" x="97.6" y="1069.2" textLength="85.4" clip-path="url(#b [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1093.6" textLength="12.2" clip-path="url(#breeze-static-checks-line-44)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1093.6" textLength="85.4" clip-path="url(#breeze-static-checks-line-44)">-commit</text><text class="breeze-static-checks-r4" x="122" y="1093.6" textLength="48.8" clip-path="url [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1118" textLength="12.2" clip-path="url(#breeze-static-checks-line-45)">│</text><text class="breeze-static-checks-r2" x="366" y="1118" textLength="292.8" clip-path="url(#breeze-static-checks-line-45)">Mutually&#160;exclusive&#160;with&#160;</text><text class="breeze-static-checks-r4" x="658.8" y="1118" textLength="12.2" clip-path="url(#breeze-static-checks-line-45)">-</text><text class="breeze-static-checks-r4" x="671" y="1118" textLen [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1142.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-46)">│</text><text class="breeze-static-checks-r7" x="366" y="1142.4" textLength="1073.6" clip-path="url(#breeze-static-checks-line-46)">(TEXT)&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160;&#160 [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1166.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1166.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-47)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1166.8" textLength="85.4" clip-path="url(#breeze-static-checks-line-47)">-github</text><text class="breeze-static-checks-r4" x="122" y="1166.8" textLength="134.2" clip-path="ur [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1191.2" textLength="1464" clip-path="url(#breeze-static-checks-line-48)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1191.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-48)">
+</text><text class="breeze-static-checks-r5" x="0" y="1215.6" textLength="24.4" clip-path="url(#breeze-static-checks-line-49)">╭─</text><text class="breeze-static-checks-r5" x="24.4" y="1215.6" textLength="195.2" clip-path="url(#breeze-static-checks-line-49)">&#160;Common&#160;options&#160;</text><text class="breeze-static-checks-r5" x="219.6" y="1215.6" textLength="1220" clip-path="url(#breeze-static-checks-line-49)">────────────────────────────────────────────────────────────────────── [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1240" textLength="12.2" clip-path="url(#breeze-static-checks-line-50)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1240" textLength="97.6" clip-path="url(#breeze-static-checks-line-50)">-verbose</text><text class="breeze-static-checks-r6" x="158.6" y="1240" textLength="24.4" clip-path="url(#bre [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1264.4" textLength="12.2" clip-path="url(#breeze-static-checks-line-51)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1264.4" textLength="48.8" clip-path="url(#breeze-static-checks-line-51)">-dry</text><text class="breeze-static-checks-r4" x="85.4" y="1264.4" textLength="48.8" clip-path="url(# [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1288.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">│</text><text class="breeze-static-checks-r4" x="24.4" y="1288.8" textLength="12.2" clip-path="url(#breeze-static-checks-line-52)">-</text><text class="breeze-static-checks-r4" x="36.6" y="1288.8" textLength="61" clip-path="url(#breeze-static-checks-line-52)">-help</text><text class="breeze-static-checks-r6" x="158.6" y="1288.8" textLength="24.4" clip-path="url(# [...]
+</text><text class="breeze-static-checks-r5" x="0" y="1313.2" textLength="1464" clip-path="url(#breeze-static-checks-line-53)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-static-checks-r2" x="1464" y="1313.2" textLength="12.2" clip-path="url(#breeze-static-checks-line-53)">
 </text>
     </g>
     </g>
diff --git a/images/breeze/output_stop.svg b/images/breeze/output_stop.svg
index 81e2abb315..3197abd935 100644
--- a/images/breeze/output_stop.svg
+++ b/images/breeze/output_stop.svg
@@ -35,8 +35,8 @@
     .breeze-stop-r1 { fill: #c5c8c6;font-weight: bold }
 .breeze-stop-r2 { fill: #c5c8c6 }
 .breeze-stop-r3 { fill: #d0b344;font-weight: bold }
-.breeze-stop-r4 { fill: #868887 }
-.breeze-stop-r5 { fill: #68a0b3;font-weight: bold }
+.breeze-stop-r4 { fill: #68a0b3;font-weight: bold }
+.breeze-stop-r5 { fill: #868887 }
 .breeze-stop-r6 { fill: #98a84b;font-weight: bold }
     </style>
 
@@ -96,19 +96,19 @@
     
     <g class="breeze-stop-matrix">
     <text class="breeze-stop-r2" x="1464" y="20" textLength="12.2" clip-path="url(#breeze-stop-line-0)">
-</text><text class="breeze-stop-r3" x="12.2" y="44.4" textLength="85.4" clip-path="url(#breeze-stop-line-1)">Usage:&#160;</text><text class="breeze-stop-r1" x="97.6" y="44.4" textLength="256.2" clip-path="url(#breeze-stop-line-1)">breeze&#160;stop&#160;[OPTIONS]</text><text class="breeze-stop-r2" x="1464" y="44.4" textLength="12.2" clip-path="url(#breeze-stop-line-1)">
+</text><text class="breeze-stop-r3" x="12.2" y="44.4" textLength="85.4" clip-path="url(#breeze-stop-line-1)">Usage:&#160;</text><text class="breeze-stop-r1" x="97.6" y="44.4" textLength="158.6" clip-path="url(#breeze-stop-line-1)">breeze&#160;stop&#160;[</text><text class="breeze-stop-r4" x="256.2" y="44.4" textLength="85.4" clip-path="url(#breeze-stop-line-1)">OPTIONS</text><text class="breeze-stop-r1" x="341.6" y="44.4" textLength="12.2" clip-path="url(#breeze-stop-line-1)">]</text><te [...]
 </text><text class="breeze-stop-r2" x="1464" y="68.8" textLength="12.2" clip-path="url(#breeze-stop-line-2)">
 </text><text class="breeze-stop-r2" x="12.2" y="93.2" textLength="390.4" clip-path="url(#breeze-stop-line-3)">Stop&#160;running&#160;breeze&#160;environment.</text><text class="breeze-stop-r2" x="1464" y="93.2" textLength="12.2" clip-path="url(#breeze-stop-line-3)">
 </text><text class="breeze-stop-r2" x="1464" y="117.6" textLength="12.2" clip-path="url(#breeze-stop-line-4)">
-</text><text class="breeze-stop-r4" x="0" y="142" textLength="24.4" clip-path="url(#breeze-stop-line-5)">╭─</text><text class="breeze-stop-r4" x="24.4" y="142" textLength="146.4" clip-path="url(#breeze-stop-line-5)">&#160;Stop&#160;flags&#160;</text><text class="breeze-stop-r4" x="170.8" y="142" textLength="1268.8" clip-path="url(#breeze-stop-line-5)">────────────────────────────────────────────────────────────────────────────────────────────────────────</text><text class="breeze-stop-r4 [...]
-</text><text class="breeze-stop-r4" x="0" y="166.4" textLength="12.2" clip-path="url(#breeze-stop-line-6)">│</text><text class="breeze-stop-r5" x="24.4" y="166.4" textLength="12.2" clip-path="url(#breeze-stop-line-6)">-</text><text class="breeze-stop-r5" x="36.6" y="166.4" textLength="109.8" clip-path="url(#breeze-stop-line-6)">-preserve</text><text class="breeze-stop-r5" x="146.4" y="166.4" textLength="97.6" clip-path="url(#breeze-stop-line-6)">-volumes</text><text class="breeze-stop-r6 [...]
-</text><text class="breeze-stop-r4" x="0" y="190.8" textLength="12.2" clip-path="url(#breeze-stop-line-7)">│</text><text class="breeze-stop-r5" x="24.4" y="190.8" textLength="12.2" clip-path="url(#breeze-stop-line-7)">-</text><text class="breeze-stop-r5" x="36.6" y="190.8" textLength="97.6" clip-path="url(#breeze-stop-line-7)">-cleanup</text><text class="breeze-stop-r5" x="134.2" y="190.8" textLength="134.2" clip-path="url(#breeze-stop-line-7)">-mypy-cache</text><text class="breeze-stop- [...]
-</text><text class="breeze-stop-r4" x="0" y="215.2" textLength="1464" clip-path="url(#breeze-stop-line-8)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-stop-r2" x="1464" y="215.2" textLength="12.2" clip-path="url(#breeze-stop-line-8)">
-</text><text class="breeze-stop-r4" x="0" y="239.6" textLength="24.4" clip-path="url(#breeze-stop-line-9)">╭─</text><text class="breeze-stop-r4" x="24.4" y="239.6" textLength="195.2" clip-path="url(#breeze-stop-line-9)">&#160;Common&#160;options&#160;</text><text class="breeze-stop-r4" x="219.6" y="239.6" textLength="1220" clip-path="url(#breeze-stop-line-9)">────────────────────────────────────────────────────────────────────────────────────────────────────</text><text class="breeze-sto [...]
-</text><text class="breeze-stop-r4" x="0" y="264" textLength="12.2" clip-path="url(#breeze-stop-line-10)">│</text><text class="breeze-stop-r5" x="24.4" y="264" textLength="12.2" clip-path="url(#breeze-stop-line-10)">-</text><text class="breeze-stop-r5" x="36.6" y="264" textLength="97.6" clip-path="url(#breeze-stop-line-10)">-verbose</text><text class="breeze-stop-r6" x="158.6" y="264" textLength="24.4" clip-path="url(#breeze-stop-line-10)">-v</text><text class="breeze-stop-r2" x="207.4"  [...]
-</text><text class="breeze-stop-r4" x="0" y="288.4" textLength="12.2" clip-path="url(#breeze-stop-line-11)">│</text><text class="breeze-stop-r5" x="24.4" y="288.4" textLength="12.2" clip-path="url(#breeze-stop-line-11)">-</text><text class="breeze-stop-r5" x="36.6" y="288.4" textLength="48.8" clip-path="url(#breeze-stop-line-11)">-dry</text><text class="breeze-stop-r5" x="85.4" y="288.4" textLength="48.8" clip-path="url(#breeze-stop-line-11)">-run</text><text class="breeze-stop-r6" x="15 [...]
-</text><text class="breeze-stop-r4" x="0" y="312.8" textLength="12.2" clip-path="url(#breeze-stop-line-12)">│</text><text class="breeze-stop-r5" x="24.4" y="312.8" textLength="12.2" clip-path="url(#breeze-stop-line-12)">-</text><text class="breeze-stop-r5" x="36.6" y="312.8" textLength="61" clip-path="url(#breeze-stop-line-12)">-help</text><text class="breeze-stop-r6" x="158.6" y="312.8" textLength="24.4" clip-path="url(#breeze-stop-line-12)">-h</text><text class="breeze-stop-r2" x="207. [...]
-</text><text class="breeze-stop-r4" x="0" y="337.2" textLength="1464" clip-path="url(#breeze-stop-line-13)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-stop-r2" x="1464" y="337.2" textLength="12.2" clip-path="url(#breeze-stop-line-13)">
+</text><text class="breeze-stop-r5" x="0" y="142" textLength="24.4" clip-path="url(#breeze-stop-line-5)">╭─</text><text class="breeze-stop-r5" x="24.4" y="142" textLength="146.4" clip-path="url(#breeze-stop-line-5)">&#160;Stop&#160;flags&#160;</text><text class="breeze-stop-r5" x="170.8" y="142" textLength="1268.8" clip-path="url(#breeze-stop-line-5)">────────────────────────────────────────────────────────────────────────────────────────────────────────</text><text class="breeze-stop-r5 [...]
+</text><text class="breeze-stop-r5" x="0" y="166.4" textLength="12.2" clip-path="url(#breeze-stop-line-6)">│</text><text class="breeze-stop-r4" x="24.4" y="166.4" textLength="12.2" clip-path="url(#breeze-stop-line-6)">-</text><text class="breeze-stop-r4" x="36.6" y="166.4" textLength="109.8" clip-path="url(#breeze-stop-line-6)">-preserve</text><text class="breeze-stop-r4" x="146.4" y="166.4" textLength="97.6" clip-path="url(#breeze-stop-line-6)">-volumes</text><text class="breeze-stop-r6 [...]
+</text><text class="breeze-stop-r5" x="0" y="190.8" textLength="12.2" clip-path="url(#breeze-stop-line-7)">│</text><text class="breeze-stop-r4" x="24.4" y="190.8" textLength="12.2" clip-path="url(#breeze-stop-line-7)">-</text><text class="breeze-stop-r4" x="36.6" y="190.8" textLength="97.6" clip-path="url(#breeze-stop-line-7)">-cleanup</text><text class="breeze-stop-r4" x="134.2" y="190.8" textLength="134.2" clip-path="url(#breeze-stop-line-7)">-mypy-cache</text><text class="breeze-stop- [...]
+</text><text class="breeze-stop-r5" x="0" y="215.2" textLength="1464" clip-path="url(#breeze-stop-line-8)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-stop-r2" x="1464" y="215.2" textLength="12.2" clip-path="url(#breeze-stop-line-8)">
+</text><text class="breeze-stop-r5" x="0" y="239.6" textLength="24.4" clip-path="url(#breeze-stop-line-9)">╭─</text><text class="breeze-stop-r5" x="24.4" y="239.6" textLength="195.2" clip-path="url(#breeze-stop-line-9)">&#160;Common&#160;options&#160;</text><text class="breeze-stop-r5" x="219.6" y="239.6" textLength="1220" clip-path="url(#breeze-stop-line-9)">────────────────────────────────────────────────────────────────────────────────────────────────────</text><text class="breeze-sto [...]
+</text><text class="breeze-stop-r5" x="0" y="264" textLength="12.2" clip-path="url(#breeze-stop-line-10)">│</text><text class="breeze-stop-r4" x="24.4" y="264" textLength="12.2" clip-path="url(#breeze-stop-line-10)">-</text><text class="breeze-stop-r4" x="36.6" y="264" textLength="97.6" clip-path="url(#breeze-stop-line-10)">-verbose</text><text class="breeze-stop-r6" x="158.6" y="264" textLength="24.4" clip-path="url(#breeze-stop-line-10)">-v</text><text class="breeze-stop-r2" x="207.4"  [...]
+</text><text class="breeze-stop-r5" x="0" y="288.4" textLength="12.2" clip-path="url(#breeze-stop-line-11)">│</text><text class="breeze-stop-r4" x="24.4" y="288.4" textLength="12.2" clip-path="url(#breeze-stop-line-11)">-</text><text class="breeze-stop-r4" x="36.6" y="288.4" textLength="48.8" clip-path="url(#breeze-stop-line-11)">-dry</text><text class="breeze-stop-r4" x="85.4" y="288.4" textLength="48.8" clip-path="url(#breeze-stop-line-11)">-run</text><text class="breeze-stop-r6" x="15 [...]
+</text><text class="breeze-stop-r5" x="0" y="312.8" textLength="12.2" clip-path="url(#breeze-stop-line-12)">│</text><text class="breeze-stop-r4" x="24.4" y="312.8" textLength="12.2" clip-path="url(#breeze-stop-line-12)">-</text><text class="breeze-stop-r4" x="36.6" y="312.8" textLength="61" clip-path="url(#breeze-stop-line-12)">-help</text><text class="breeze-stop-r6" x="158.6" y="312.8" textLength="24.4" clip-path="url(#breeze-stop-line-12)">-h</text><text class="breeze-stop-r2" x="207. [...]
+</text><text class="breeze-stop-r5" x="0" y="337.2" textLength="1464" clip-path="url(#breeze-stop-line-13)">╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯</text><text class="breeze-stop-r2" x="1464" y="337.2" textLength="12.2" clip-path="url(#breeze-stop-line-13)">
 </text>
     </g>
     </g>
diff --git a/scripts/ci/pre_commit/pre_commit_update_black_version.py b/scripts/ci/pre_commit/pre_commit_update_black_version.py
new file mode 100755
index 0000000000..5e4fda5ed3
--- /dev/null
+++ b/scripts/ci/pre_commit/pre_commit_update_black_version.py
@@ -0,0 +1,37 @@
+#!/usr/bin/env python
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import re
+from pathlib import Path
+
+import yaml
+
+AIRFLOW_SOURCES = Path(__file__).parents[3].resolve()
+
+
+if __name__ == "__main__":
+    PRE_COMMIT_CONFIG_FILE = AIRFLOW_SOURCES / ".pre-commit-config.yaml"
+    pre_commit_content = yaml.safe_load(PRE_COMMIT_CONFIG_FILE.read_text())
+    for repo in pre_commit_content["repos"]:
+        if repo["repo"] == "https://github.com/psf/black":
+            black_version = repo["rev"]
+            pre_commit_text = PRE_COMMIT_CONFIG_FILE.read_text()
+            pre_commit_text = re.sub(r"black==[0-9\.]*", f"black=={black_version}", pre_commit_text)
+            PRE_COMMIT_CONFIG_FILE.write_text(pre_commit_text)
+            break
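
In short, the new hook reads the pinned rev of the https://github.com/psf/black repo from .pre-commit-config.yaml and rewrites every black==X.Y.Z pin in that same file to match it, so the additional_dependencies entries can no longer drift (as in the hunks above, where 22.3.0 lingered next to rev 22.12.0). A rough, self-contained illustration of the substitution it performs; the sample line is made up for demonstration:

    import re

    sample = "additional_dependencies: ['pyyaml', 'jinja2', 'black==22.3.0', 'rich>=12.4.4']"
    print(re.sub(r"black==[0-9\.]*", "black==22.12.0", sample))
    # additional_dependencies: ['pyyaml', 'jinja2', 'black==22.12.0', 'rich>=12.4.4']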


[airflow] 05/12: add hostname argument to DockerOperator (#27822)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3784f2e25bda3c33b604212949d8bfca27924e94
Author: skabbit <sk...@gmail.com>
AuthorDate: Tue Nov 29 13:08:48 2022 +0400

    add hostname argument to DockerOperator (#27822)
    
    (cherry picked from commit 1aa3da543a3f9229527a5de2807053e15b2bfea7)
---
 airflow/providers/docker/operators/docker.py    | 4 ++++
 tests/providers/docker/operators/test_docker.py | 7 +++++++
 2 files changed, 11 insertions(+)

diff --git a/airflow/providers/docker/operators/docker.py b/airflow/providers/docker/operators/docker.py
index 509713680a..e02a813024 100644
--- a/airflow/providers/docker/operators/docker.py
+++ b/airflow/providers/docker/operators/docker.py
@@ -136,6 +136,7 @@ class DockerOperator(BaseOperator):
         greater than 0. If omitted uses system default.
     :param tty: Allocate pseudo-TTY to the container
         This needs to be set to see logs of the Docker container.
+    :param hostname: Optional hostname for the container.
     :param privileged: Give extended privileges to this container.
     :param cap_add: Include container capabilities
     :param retrieve_output: Should this docker image consistently attempt to pull from and output
@@ -194,6 +195,7 @@ class DockerOperator(BaseOperator):
         auto_remove: str = "never",
         shm_size: int | None = None,
         tty: bool = False,
+        hostname: str | None = None,
         privileged: bool = False,
         cap_add: Iterable[str] | None = None,
         extra_hosts: dict[str, str] | None = None,
@@ -251,6 +253,7 @@ class DockerOperator(BaseOperator):
         self.docker_conn_id = docker_conn_id
         self.shm_size = shm_size
         self.tty = tty
+        self.hostname = hostname
         self.privileged = privileged
         self.cap_add = cap_add
         self.extra_hosts = extra_hosts
@@ -342,6 +345,7 @@ class DockerOperator(BaseOperator):
             entrypoint=self.format_command(self.entrypoint),
             working_dir=self.working_dir,
             tty=self.tty,
+            hostname=self.hostname,
         )
         logstream = self.cli.attach(container=self.container["Id"], stdout=True, stderr=True, stream=True)
         try:
diff --git a/tests/providers/docker/operators/test_docker.py b/tests/providers/docker/operators/test_docker.py
index f17c6a01eb..0430e0ae2c 100644
--- a/tests/providers/docker/operators/test_docker.py
+++ b/tests/providers/docker/operators/test_docker.py
@@ -101,6 +101,7 @@ class TestDockerOperator:
             host_tmp_dir="/host/airflow",
             container_name="test_container",
             tty=True,
+            hostname="test.contrainer.host",
             device_requests=[DeviceRequest(count=-1, capabilities=[["gpu"]])],
             log_opts_max_file="5",
             log_opts_max_size="10m",
@@ -127,6 +128,7 @@ class TestDockerOperator:
             entrypoint=["sh", "-c"],
             working_dir="/container/path",
             tty=True,
+            hostname="test.contrainer.host",
         )
         self.client_mock.create_host_config.assert_called_once_with(
             mounts=[
@@ -183,6 +185,7 @@ class TestDockerOperator:
             shm_size=1000,
             host_tmp_dir="/host/airflow",
             container_name="test_container",
+            hostname="test.contrainer.host",
             tty=True,
         )
         operator.execute(None)
@@ -201,6 +204,7 @@ class TestDockerOperator:
             entrypoint=["sh", "-c"],
             working_dir="/container/path",
             tty=True,
+            hostname="test.contrainer.host",
         )
         self.client_mock.create_host_config.assert_called_once_with(
             mounts=[
@@ -294,6 +298,7 @@ class TestDockerOperator:
                     entrypoint=["sh", "-c"],
                     working_dir="/container/path",
                     tty=True,
+                    hostname=None,
                 ),
                 call(
                     command="env",
@@ -305,6 +310,7 @@ class TestDockerOperator:
                     entrypoint=["sh", "-c"],
                     working_dir="/container/path",
                     tty=True,
+                    hostname=None,
                 ),
             ]
         )
@@ -403,6 +409,7 @@ class TestDockerOperator:
             entrypoint=["sh", "-c"],
             working_dir="/container/path",
             tty=True,
+            hostname=None,
         )
         stringio_mock.assert_called_once_with("UNIT=FILE\nPRIVATE=FILE\nVAR=VALUE")
         self.dotenv_mock.assert_called_once_with(stream="UNIT=FILE\nPRIVATE=FILE\nVAR=VALUE")
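
With the new argument, the hostname the container reports can be set straight from the operator. A minimal usage sketch, assuming the Docker provider is installed; the DAG id, image and hostname below are illustrative and not taken from the commit:

    import pendulum

    from airflow import DAG
    from airflow.providers.docker.operators.docker import DockerOperator

    with DAG(
        dag_id="docker_hostname_example",
        start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
        schedule=None,
    ):
        DockerOperator(
            task_id="print_hostname",
            image="python:3.9-slim",
            command="hostname",
            hostname="my-task-container",  # new argument; defaults to None
        )

Leaving hostname unset keeps the previous behaviour, which is why the pre-existing assertions above only gain hostname=None.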


[airflow] 12/12: Make static checks generated file more stable across the board (#29080)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4e2af12f99dbb816d341751490921f3eca34b047
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Mon Jan 23 12:48:33 2023 +0100

    Make static checks generated file more stable across the board (#29080)
    
    There were a couple of problems with the static checks that generate source
    files, including the generated stubs in the common.sql package:
    
    * black formatting was implemented in multiple separate scripts,
      making it harder to fix problems in all of them
    * generated stub files were not formatted with is_pyi=True, and
      black had no way to figure it out because it was working on strings
    * black formatting was not consistently applied in all places
    * the EOL at the end of the generated stub files was missing, leading to the
      EOL fixer adding it after generation, which required multiple pre-commit
      passes
    * there was an (already unused) deprecated dev dict generator that used
      its own black formatting.
    
    There were also a couple of problems with the files generated by
    stubgen itself:
    
    * Union was missing in the generated stubs (this is a known issue
      with stubgen: https://github.com/python/mypy/issues/12929)
    * IntelliJ complained about the missing Incomplete import from _typeshed
    
    This PR fixes all of these problems:
    
    * black formatting is now consistently extracted and applied everywhere
    * when needed, the is_pyi flag is passed to black so that it knows
      that a .pyi file is being formatted
    * an EOL is added at the end of the file when the file is generated
    * Union is added to the generated stub
    * noqa is added to the _typeshed import
    * the dict generator is removed
    
    As a result, the generated stub files are fully importable
    (no errors reported by the IntelliJ IDE) and consistently formatted
    every time.
    
    (cherry picked from commit 129f0820cd03c721ebebe3461489f255bb9e752c)
---
 .pre-commit-config.yaml                            |  21 +-
 dev/deprecations/generate_deprecated_dicts.py      | 217 ---------------------
 dev/provider_packages/prepare_provider_packages.py |  18 +-
 .../ci/pre_commit/common_precommit_black_utils.py  |  44 +++++
 scripts/ci/pre_commit/common_precommit_utils.py    |   3 +-
 .../pre_commit_check_pre_commit_hooks.py           |  70 +++----
 .../ci/pre_commit/pre_commit_compile_www_assets.py |   3 +-
 scripts/ci/pre_commit/pre_commit_insert_extras.py  |   4 +-
 .../ci/pre_commit/pre_commit_local_yml_mounts.py   |  18 +-
 scripts/ci/pre_commit/pre_commit_mypy.py           |  13 +-
 10 files changed, 105 insertions(+), 306 deletions(-)
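
The core of the change is a single helper that loads black's settings from pyproject.toml once and pushes all generated content through black's Python API, passing is_pyi for stub files. A sketch of such a helper, under the assumption that it mirrors the new scripts/ci/pre_commit/common_precommit_black_utils.py; the exact contents are inferred from the commit description and the deleted generator below, not copied from the repository:

    from __future__ import annotations

    import os
    from functools import lru_cache
    from pathlib import Path

    from black import Mode, format_str, parse_pyproject_toml, target_version_option_callback

    AIRFLOW_SOURCES_ROOT = Path(__file__).parents[3].resolve()


    @lru_cache(maxsize=None)
    def black_mode(is_pyi: bool = False) -> Mode:
        # Read line length and target versions once from pyproject.toml.
        config = parse_pyproject_toml(os.fspath(AIRFLOW_SOURCES_ROOT / "pyproject.toml"))
        target_versions = set(
            target_version_option_callback(None, None, tuple(config.get("target_version", ())))
        )
        return Mode(
            target_versions=target_versions,
            line_length=config.get("line_length", Mode.line_length),
            is_pyi=is_pyi,  # tells black the content is a .pyi stub
        )


    def black_format(content: str, is_pyi: bool = False) -> str:
        # Generated files (including stubs) are formatted in memory before writing,
        # with a trailing newline so the end-of-file fixer has nothing left to do.
        formatted = format_str(content, mode=black_mode(is_pyi=is_pyi))
        return formatted if formatted.endswith("\n") else formatted + "\n"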

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 07e5d18d58..4d06c974dc 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -146,6 +146,13 @@ repos:
           - --fuzzy-match-generates-todo
         files: >
           \.cfg$|\.conf$|\.ini$|\.ldif$|\.properties$|\.readthedocs$|\.service$|\.tf$|Dockerfile.*$
+  - repo: https://github.com/psf/black
+    rev: 22.12.0
+    hooks:
+      - id: black
+        name: Run black (python formatter)
+        args: [--config=./pyproject.toml]
+        exclude: ^airflow/_vendor/|^airflow/contrib/
   - repo: local
     hooks:
       - id: update-black-version
@@ -167,15 +174,8 @@ repos:
         additional_dependencies: ['ruff>=0.0.219']
         files: \.pyi?$
         exclude: ^airflow/_vendor/
-  - repo: https://github.com/psf/black
-    rev: 22.12.0
-    hooks:
-      - id: black
-        name: Run black (python formatter)
-        args: [--config=./pyproject.toml]
-        exclude: ^airflow/_vendor/|^airflow/contrib/
   - repo: https://github.com/asottile/blacken-docs
-    rev: v1.12.1
+    rev: 1.13.0
     hooks:
       - id: blacken-docs
         name: Run black on python code blocks in documentation files
@@ -230,7 +230,7 @@ repos:
         files: ^chart/values\.schema\.json$|^chart/values_schema\.schema\.json$
         pass_filenames: true
   - repo: https://github.com/pre-commit/pygrep-hooks
-    rev: v1.9.0
+    rev: v1.10.0
     hooks:
       - id: rst-backticks
         name: Check if RST files use double backticks for code
@@ -239,7 +239,7 @@ repos:
         name: Check if there are no deprecate log warn
         exclude: ^airflow/_vendor/
   - repo: https://github.com/adrienverge/yamllint
-    rev: v1.26.3
+    rev: v1.29.0
     hooks:
       - id: yamllint
         name: Check YAML files with yamllint
@@ -344,6 +344,7 @@ repos:
         language: python
         files: ^setup\.py$|^INSTALL$|^CONTRIBUTING\.rst$
         pass_filenames: false
+        additional_dependencies: ['rich>=12.4.4']
       - id: check-extras-order
         name: Check order of extras in Dockerfile
         entry: ./scripts/ci/pre_commit/pre_commit_check_order_dockerfile_extras.py
diff --git a/dev/deprecations/generate_deprecated_dicts.py b/dev/deprecations/generate_deprecated_dicts.py
deleted file mode 100644
index b705fee48b..0000000000
--- a/dev/deprecations/generate_deprecated_dicts.py
+++ /dev/null
@@ -1,217 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-from __future__ import annotations
-
-import ast
-import os
-from collections import defaultdict
-from functools import lru_cache
-from pathlib import Path
-from typing import NamedTuple
-
-from jinja2 import BaseLoader, Environment
-from rich.console import Console
-
-if __name__ not in ("__main__", "__mp_main__"):
-    raise SystemExit(
-        "This file is intended to be executed as an executable program. You cannot use it as a module_path."
-        f"To run this script, run the ./{__file__} command [FILE] ..."
-    )
-
-AIRFLOW_SOURCES_ROOT = Path(__file__).parents[2].resolve()
-CONTRIB_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "contrib"
-
-
-@lru_cache(maxsize=None)
-def black_mode():
-    from black import Mode, parse_pyproject_toml, target_version_option_callback
-
-    config = parse_pyproject_toml(os.path.join(AIRFLOW_SOURCES_ROOT, "pyproject.toml"))
-
-    target_versions = set(
-        target_version_option_callback(None, None, tuple(config.get("target_version", ()))),
-    )
-
-    return Mode(
-        target_versions=target_versions,
-        line_length=config.get("line_length", Mode.line_length),
-        is_pyi=bool(config.get("is_pyi", Mode.is_pyi)),
-        string_normalization=not bool(config.get("skip_string_normalization", not Mode.string_normalization)),
-        experimental_string_processing=bool(
-            config.get("experimental_string_processing", Mode.experimental_string_processing)
-        ),
-    )
-
-
-def black_format(content) -> str:
-    from black import format_str
-
-    return format_str(content, mode=black_mode())
-
-
-class Import(NamedTuple):
-    module_path: str
-    name: str
-    alias: str
-
-
-class ImportedClass(NamedTuple):
-    module_path: str
-    name: str
-
-
-def get_imports(path: Path):
-    root = ast.parse(path.read_text())
-    imports: dict[str, ImportedClass] = {}
-    for node in ast.iter_child_nodes(root):
-        if isinstance(node, ast.Import):
-            module_array: list[str] = []
-        elif isinstance(node, ast.ImportFrom) and node.module:
-            module_array = node.module.split(".")
-        elif isinstance(node, ast.ClassDef):
-            for base in node.bases:
-                res = imports.get(base.id)  # type: ignore[attr-defined]
-                if res:
-                    yield Import(module_path=res.module_path, name=res.name, alias=node.name)
-            continue
-        else:
-            continue
-        for n in node.names:  # type: ignore[attr-defined]
-            imported_as = n.asname if n.asname else n.name
-            module_path = ".".join(module_array)
-            imports[imported_as] = ImportedClass(module_path=module_path, name=n.name)
-            yield Import(module_path, n.name, imported_as)
-
-
-DEPRECATED_CLASSES_TEMPLATE = """
-__deprecated_classes = {
-{%- for module_path, package_imports in package_imports.items() %}
-    '{{module_path}}': {
-{%- for import_item in package_imports %}
-        '{{import_item.alias}}': '{{import_item.module_path}}.{{import_item.name}}',
-{%- endfor %}
-    },
-{%- endfor %}
-}
-"""
-
-DEPRECATED_MODULES = [
-    "airflow/hooks/base_hook.py",
-    "airflow/hooks/dbapi_hook.py",
-    "airflow/hooks/docker_hook.py",
-    "airflow/hooks/druid_hook.py",
-    "airflow/hooks/hdfs_hook.py",
-    "airflow/hooks/hive_hooks.py",
-    "airflow/hooks/http_hook.py",
-    "airflow/hooks/jdbc_hook.py",
-    "airflow/hooks/mssql_hook.py",
-    "airflow/hooks/mysql_hook.py",
-    "airflow/hooks/oracle_hook.py",
-    "airflow/hooks/pig_hook.py",
-    "airflow/hooks/postgres_hook.py",
-    "airflow/hooks/presto_hook.py",
-    "airflow/hooks/S3_hook.py",
-    "airflow/hooks/samba_hook.py",
-    "airflow/hooks/slack_hook.py",
-    "airflow/hooks/sqlite_hook.py",
-    "airflow/hooks/webhdfs_hook.py",
-    "airflow/hooks/zendesk_hook.py",
-    "airflow/operators/bash_operator.py",
-    "airflow/operators/branch_operator.py",
-    "airflow/operators/check_operator.py",
-    "airflow/operators/dagrun_operator.py",
-    "airflow/operators/docker_operator.py",
-    "airflow/operators/druid_check_operator.py",
-    "airflow/operators/dummy.py",
-    "airflow/operators/dummy_operator.py",
-    "airflow/operators/email_operator.py",
-    "airflow/operators/gcs_to_s3.py",
-    "airflow/operators/google_api_to_s3_transfer.py",
-    "airflow/operators/hive_operator.py",
-    "airflow/operators/hive_stats_operator.py",
-    "airflow/operators/hive_to_druid.py",
-    "airflow/operators/hive_to_mysql.py",
-    "airflow/operators/hive_to_samba_operator.py",
-    "airflow/operators/http_operator.py",
-    "airflow/operators/jdbc_operator.py",
-    "airflow/operators/latest_only_operator.py",
-    "airflow/operators/mssql_operator.py",
-    "airflow/operators/mssql_to_hive.py",
-    "airflow/operators/mysql_operator.py",
-    "airflow/operators/mysql_to_hive.py",
-    "airflow/operators/oracle_operator.py",
-    "airflow/operators/papermill_operator.py",
-    "airflow/operators/pig_operator.py",
-    "airflow/operators/postgres_operator.py",
-    "airflow/operators/presto_check_operator.py",
-    "airflow/operators/presto_to_mysql.py",
-    "airflow/operators/python_operator.py",
-    "airflow/operators/redshift_to_s3_operator.py",
-    "airflow/operators/s3_file_transform_operator.py",
-    "airflow/operators/s3_to_hive_operator.py",
-    "airflow/operators/s3_to_redshift_operator.py",
-    "airflow/operators/slack_operator.py",
-    "airflow/operators/sql.py",
-    "airflow/operators/sql_branch_operator.py",
-    "airflow/operators/sqlite_operator.py",
-    "airflow/operators/subdag_operator.py",
-    "airflow/sensors/base_sensor_operator.py",
-    "airflow/sensors/date_time_sensor.py",
-    "airflow/sensors/external_task_sensor.py",
-    "airflow/sensors/hdfs_sensor.py",
-    "airflow/sensors/hive_partition_sensor.py",
-    "airflow/sensors/http_sensor.py",
-    "airflow/sensors/metastore_partition_sensor.py",
-    "airflow/sensors/named_hive_partition_sensor.py",
-    "airflow/sensors/s3_key_sensor.py",
-    "airflow/sensors/sql.py",
-    "airflow/sensors/sql_sensor.py",
-    "airflow/sensors/time_delta_sensor.py",
-    "airflow/sensors/web_hdfs_sensor.py",
-    "airflow/utils/log/cloudwatch_task_handler.py",
-    "airflow/utils/log/es_task_handler.py",
-    "airflow/utils/log/gcs_task_handler.py",
-    "airflow/utils/log/s3_task_handler.py",
-    "airflow/utils/log/stackdriver_task_handler.py",
-    "airflow/utils/log/wasb_task_handler.py",
-]
-
-CONTRIB_FILES = (AIRFLOW_SOURCES_ROOT / "airflow" / "contrib").rglob("*.py")
-
-
-if __name__ == "__main__":
-    console = Console(color_system="standard", width=300)
-    all_deprecated_imports: dict[str, dict[str, list[Import]]] = defaultdict(lambda: defaultdict(list))
-    # delete = True
-    delete = False
-    # for file in DEPRECATED_MODULES:
-    for file in CONTRIB_FILES:
-        file_path = AIRFLOW_SOURCES_ROOT / file
-        if not file_path.exists() or file.name == "__init__.py":
-            continue
-        original_module = os.fspath(file_path.parent.relative_to(AIRFLOW_SOURCES_ROOT)).replace(os.sep, ".")
-        for _import in get_imports(file_path):
-            module_name = file_path.name[: -len(".py")]
-            if _import.name not in ["warnings", "RemovedInAirflow3Warning"]:
-                all_deprecated_imports[original_module][module_name].append(_import)
-        if delete:
-            file_path.unlink()
-
-    for module_path, package_imports in all_deprecated_imports.items():
-        console.print(f"[yellow]Import dictionary for {module_path}:\n")
-        template = Environment(loader=BaseLoader()).from_string(DEPRECATED_CLASSES_TEMPLATE)
-        print(black_format(template.render(package_imports=dict(sorted(package_imports.items())))))
diff --git a/dev/provider_packages/prepare_provider_packages.py b/dev/provider_packages/prepare_provider_packages.py
index 96c00dffea..ac27bcafa1 100755
--- a/dev/provider_packages/prepare_provider_packages.py
+++ b/dev/provider_packages/prepare_provider_packages.py
@@ -46,6 +46,7 @@ from typing import Any, Generator, Iterable, NamedTuple
 import jsonschema
 import rich_click as click
 import semver as semver
+from black import Mode, TargetVersion, format_str, parse_pyproject_toml
 from packaging.version import Version
 from rich.console import Console
 from rich.syntax import Syntax
@@ -1393,29 +1394,16 @@ def update_commits_rst(
 
 
 @lru_cache(maxsize=None)
-def black_mode():
-    from black import Mode, parse_pyproject_toml, target_version_option_callback
-
+def black_mode() -> Mode:
     config = parse_pyproject_toml(os.path.join(AIRFLOW_SOURCES_ROOT_PATH, "pyproject.toml"))
-
-    target_versions = set(
-        target_version_option_callback(None, None, tuple(config.get("target_version", ()))),
-    )
-
+    target_versions = {TargetVersion[val.upper()] for val in config.get("target_version", ())}
     return Mode(
         target_versions=target_versions,
         line_length=config.get("line_length", Mode.line_length),
-        is_pyi=bool(config.get("is_pyi", Mode.is_pyi)),
-        string_normalization=not bool(config.get("skip_string_normalization", not Mode.string_normalization)),
-        experimental_string_processing=bool(
-            config.get("experimental_string_processing", Mode.experimental_string_processing)
-        ),
     )
 
 
 def black_format(content) -> str:
-    from black import format_str
-
     return format_str(content, mode=black_mode())
 
 
diff --git a/scripts/ci/pre_commit/common_precommit_black_utils.py b/scripts/ci/pre_commit/common_precommit_black_utils.py
new file mode 100644
index 0000000000..c9d0f77122
--- /dev/null
+++ b/scripts/ci/pre_commit/common_precommit_black_utils.py
@@ -0,0 +1,44 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import os
+import sys
+from functools import lru_cache
+from pathlib import Path
+
+from black import Mode, TargetVersion, format_str, parse_pyproject_toml
+
+sys.path.insert(0, str(Path(__file__).parent.resolve()))  # make sure common_precommit_utils is imported
+
+from common_precommit_utils import AIRFLOW_BREEZE_SOURCES_PATH  # isort: skip # noqa E402
+
+
+@lru_cache(maxsize=None)
+def black_mode(is_pyi: bool = Mode.is_pyi) -> Mode:
+    config = parse_pyproject_toml(os.fspath(AIRFLOW_BREEZE_SOURCES_PATH / "pyproject.toml"))
+    target_versions = {TargetVersion[val.upper()] for val in config.get("target_version", ())}
+
+    return Mode(
+        target_versions=target_versions,
+        line_length=config.get("line_length", Mode.line_length),
+        is_pyi=is_pyi,
+    )
+
+
+def black_format(content: str, is_pyi: bool = Mode.is_pyi) -> str:
+    return format_str(content, mode=black_mode(is_pyi=is_pyi))
diff --git a/scripts/ci/pre_commit/common_precommit_utils.py b/scripts/ci/pre_commit/common_precommit_utils.py
index 3dc1fceba5..29109a4c34 100644
--- a/scripts/ci/pre_commit/common_precommit_utils.py
+++ b/scripts/ci/pre_commit/common_precommit_utils.py
@@ -22,7 +22,8 @@ import os
 import re
 from pathlib import Path
 
-AIRFLOW_SOURCES_ROOT = Path(__file__).parents[3].resolve()
+AIRFLOW_SOURCES_ROOT_PATH = Path(__file__).parents[3].resolve()
+AIRFLOW_BREEZE_SOURCES_PATH = AIRFLOW_SOURCES_ROOT_PATH / "dev" / "breeze"
 
 
 def read_airflow_version() -> str:
diff --git a/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py b/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py
index d6e32a0937..1c98d5cb41 100755
--- a/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py
+++ b/scripts/ci/pre_commit/pre_commit_check_pre_commit_hooks.py
@@ -26,22 +26,24 @@ import sys
 from pathlib import Path
 
 sys.path.insert(0, str(Path(__file__).parent.resolve()))  # make sure common_precommit_utils is imported
+from common_precommit_utils import (  # isort: skip # noqa: E402
+    AIRFLOW_BREEZE_SOURCES_PATH,
+    AIRFLOW_SOURCES_ROOT_PATH,
+    insert_documentation,
+)
+from common_precommit_black_utils import black_format  # isort: skip # noqa E402
 
 from collections import defaultdict  # noqa: E402
-from functools import lru_cache  # noqa: E402
 from typing import Any  # noqa: E402
 
 import yaml  # noqa: E402
-from common_precommit_utils import insert_documentation  # noqa: E402
 from rich.console import Console  # noqa: E402
 from tabulate import tabulate  # noqa: E402
 
 console = Console(width=400, color_system="standard")
 
-AIRFLOW_SOURCES_PATH = Path(__file__).parents[3].resolve()
-AIRFLOW_BREEZE_SOURCES_PATH = AIRFLOW_SOURCES_PATH / "dev" / "breeze"
 PRE_COMMIT_IDS_PATH = AIRFLOW_BREEZE_SOURCES_PATH / "src" / "airflow_breeze" / "pre_commit_ids.py"
-PRE_COMMIT_YAML_FILE = AIRFLOW_SOURCES_PATH / ".pre-commit-config.yaml"
+PRE_COMMIT_YAML_FILE = AIRFLOW_SOURCES_ROOT_PATH / ".pre-commit-config.yaml"
 
 
 def get_errors_and_hooks(content: Any, max_length: int) -> tuple[list[str], dict[str, list[str]], list[str]]:
@@ -75,6 +77,22 @@ def get_errors_and_hooks(content: Any, max_length: int) -> tuple[list[str], dict
     return errors, hooks, image_hooks
 
 
+def prepare_pre_commit_ids_py_file(pre_commit_ids):
+    PRE_COMMIT_IDS_PATH.write_text(
+        black_format(
+            content=render_template(
+                searchpath=AIRFLOW_BREEZE_SOURCES_PATH / "src" / "airflow_breeze",
+                template_name="pre_commit_ids",
+                context={"PRE_COMMIT_IDS": pre_commit_ids},
+                extension=".py",
+                autoescape=False,
+                keep_trailing_newline=True,
+            ),
+            is_pyi=False,
+        )
+    )
+
+
 def render_template(
     searchpath: Path,
     template_name: str,
@@ -107,46 +125,6 @@ def render_template(
     return content
 
 
-@lru_cache(maxsize=None)
-def black_mode():
-    from black import Mode, parse_pyproject_toml, target_version_option_callback
-
-    config = parse_pyproject_toml(AIRFLOW_BREEZE_SOURCES_PATH / "pyproject.toml")
-
-    target_versions = set(
-        target_version_option_callback(None, None, tuple(config.get("target_version", ()))),
-    )
-
-    return Mode(
-        target_versions=target_versions,
-        line_length=config.get("line_length", Mode.line_length),
-        is_pyi=config.get("is_pyi", False),
-        string_normalization=not config.get("skip_string_normalization", False),
-        preview=config.get("preview", False),
-    )
-
-
-def black_format(content) -> str:
-    from black import format_str
-
-    return format_str(content, mode=black_mode())
-
-
-def prepare_pre_commit_ids_py_file(pre_commit_ids):
-    PRE_COMMIT_IDS_PATH.write_text(
-        black_format(
-            render_template(
-                searchpath=AIRFLOW_BREEZE_SOURCES_PATH / "src" / "airflow_breeze",
-                template_name="pre_commit_ids",
-                context={"PRE_COMMIT_IDS": pre_commit_ids},
-                extension=".py",
-                autoescape=False,
-                keep_trailing_newline=True,
-            )
-        )
-    )
-
-
 def update_static_checks_array(hooks: dict[str, list[str]], image_hooks: list[str]):
     rows = []
     hook_ids = list(hooks.keys())
@@ -159,7 +137,7 @@ def update_static_checks_array(hooks: dict[str, list[str]], image_hooks: list[st
         rows.append((hook_id, formatted_hook_description, " * " if hook_id in image_hooks else "  "))
     formatted_table = "\n" + tabulate(rows, tablefmt="grid", headers=("ID", "Description", "Image")) + "\n\n"
     insert_documentation(
-        file_path=AIRFLOW_SOURCES_PATH / "STATIC_CODE_CHECKS.rst",
+        file_path=AIRFLOW_SOURCES_ROOT_PATH / "STATIC_CODE_CHECKS.rst",
         content=formatted_table.splitlines(keepends=True),
         header="  .. BEGIN AUTO-GENERATED STATIC CHECK LIST",
         footer="  .. END AUTO-GENERATED STATIC CHECK LIST",
diff --git a/scripts/ci/pre_commit/pre_commit_compile_www_assets.py b/scripts/ci/pre_commit/pre_commit_compile_www_assets.py
index 27975f6b8c..4733a1460b 100755
--- a/scripts/ci/pre_commit/pre_commit_compile_www_assets.py
+++ b/scripts/ci/pre_commit/pre_commit_compile_www_assets.py
@@ -23,7 +23,8 @@ import sys
 from pathlib import Path
 
 sys.path.insert(0, str(Path(__file__).parent.resolve()))  # make sure common_precommit_utils is imported
-from common_precommit_utils import get_directory_hash  # isort: skip # noqa
+from common_precommit_utils import get_directory_hash  # isort: skip # noqa E402
+from common_precommit_black_utils import black_format  # isort: skip # noqa E402
 
 AIRFLOW_SOURCES_PATH = Path(__file__).parents[3].resolve()
 WWW_HASH_FILE = AIRFLOW_SOURCES_PATH / ".build" / "www" / "hash.txt"
diff --git a/scripts/ci/pre_commit/pre_commit_insert_extras.py b/scripts/ci/pre_commit/pre_commit_insert_extras.py
index fac926f611..3e08bd674d 100755
--- a/scripts/ci/pre_commit/pre_commit_insert_extras.py
+++ b/scripts/ci/pre_commit/pre_commit_insert_extras.py
@@ -27,8 +27,8 @@ sys.path.insert(0, str(Path(__file__).parent.resolve()))  # make sure common_pre
 sys.path.insert(0, str(AIRFLOW_SOURCES_DIR))  # make sure setup is imported from Airflow
 # flake8: noqa: F401
 
-from common_precommit_utils import insert_documentation  # isort: skip
-from setup import EXTRAS_DEPENDENCIES  # isort:skip
+from common_precommit_utils import insert_documentation  # isort: skip # noqa E402
+from setup import EXTRAS_DEPENDENCIES  # isort:skip # noqa
 
 sys.path.append(str(AIRFLOW_SOURCES_DIR))
 
diff --git a/scripts/ci/pre_commit/pre_commit_local_yml_mounts.py b/scripts/ci/pre_commit/pre_commit_local_yml_mounts.py
index 0f9f954959..6efba5a6aa 100755
--- a/scripts/ci/pre_commit/pre_commit_local_yml_mounts.py
+++ b/scripts/ci/pre_commit/pre_commit_local_yml_mounts.py
@@ -20,18 +20,20 @@ from __future__ import annotations
 import sys
 from pathlib import Path
 
-AIRFLOW_SOURCES_DIR = Path(__file__).parents[3].resolve()
-
 sys.path.insert(0, str(Path(__file__).parent.resolve()))  # make sure common_precommit_utils is imported
-sys.path.insert(0, str(AIRFLOW_SOURCES_DIR))  # make sure setup is imported from Airflow
+
+from common_precommit_utils import AIRFLOW_SOURCES_ROOT_PATH  # isort: skip # noqa E402
+
+sys.path.insert(0, str(AIRFLOW_SOURCES_ROOT_PATH))  # make sure setup is imported from Airflow
 sys.path.insert(
-    0, str(AIRFLOW_SOURCES_DIR / "dev" / "breeze" / "src")
+    0, str(AIRFLOW_SOURCES_ROOT_PATH / "dev" / "breeze" / "src")
 )  # make sure setup is imported from Airflow
 # flake8: noqa: F401
+from airflow_breeze.utils.docker_command_utils import VOLUMES_FOR_SELECTED_MOUNTS  # isort: skip # noqa E402
 
-from common_precommit_utils import insert_documentation  # isort: skip
+from common_precommit_utils import insert_documentation  # isort: skip # noqa E402
 
-sys.path.append(str(AIRFLOW_SOURCES_DIR))
+sys.path.append(str(AIRFLOW_SOURCES_ROOT_PATH))
 
 MOUNTS_HEADER = (
     "        # START automatically generated volumes from "
@@ -43,9 +45,7 @@ MOUNTS_FOOTER = (
 )
 
 if __name__ == "__main__":
-    from airflow_breeze.utils.docker_command_utils import VOLUMES_FOR_SELECTED_MOUNTS
-
-    local_mount_file_path = AIRFLOW_SOURCES_DIR / "scripts" / "ci" / "docker-compose" / "local.yml"
+    local_mount_file_path = AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "ci" / "docker-compose" / "local.yml"
     PREFIX = "      "
     volumes = []
     for (src, dest) in VOLUMES_FOR_SELECTED_MOUNTS:
diff --git a/scripts/ci/pre_commit/pre_commit_mypy.py b/scripts/ci/pre_commit/pre_commit_mypy.py
index f6a08e8815..5c9de455b9 100755
--- a/scripts/ci/pre_commit/pre_commit_mypy.py
+++ b/scripts/ci/pre_commit/pre_commit_mypy.py
@@ -37,11 +37,14 @@ if __name__ == "__main__":
     from common_precommit_utils import filter_out_providers_on_non_main_branch
 
     sys.path.insert(0, str(AIRFLOW_SOURCES / "dev" / "breeze" / "src"))
-    from airflow_breeze.global_constants import MOUNT_SELECTED
-    from airflow_breeze.utils.console import get_console
-    from airflow_breeze.utils.docker_command_utils import get_extra_docker_flags
-    from airflow_breeze.utils.path_utils import create_mypy_volume_if_needed
-    from airflow_breeze.utils.run_utils import get_ci_image_for_pre_commits, run_command
+    from airflow_breeze.global_constants import MOUNT_SELECTED  # isort: skip
+    from airflow_breeze.utils.console import get_console  # isort: skip
+    from airflow_breeze.utils.docker_command_utils import get_extra_docker_flags  # isort: skip
+    from airflow_breeze.utils.path_utils import create_mypy_volume_if_needed  # isort: skip
+    from airflow_breeze.utils.run_utils import (
+        get_ci_image_for_pre_commits,
+        run_command,
+    )
 
     files_to_test = filter_out_providers_on_non_main_branch(sys.argv[1:])
     if files_to_test == ["--namespace-packages"]:


[airflow] 04/12: Fix discoverability of tests for ARM in Breeze (#28432)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7787fdb10f7d6de369648108232ae79fba8c963c
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sun Dec 18 18:03:46 2022 +0100

    Fix discoverability of tests for ARM in Breeze (#28432)
    
    On ARM processors, Breeze lacks support for several components
    (because they do not have supported ARM binaries available):
    
    * MySQL
    * MSSQL
    * LevelDB
    * Azure Service Bus
    
    When you attempted to run pytest on a group of tests that import
    one of those, collection failed and none of the tests could run,
    even the ones that did not depend on the missing component.
    
    This change uses pytest's module-level skip, and local imports
    where the affected tests are inter-mixed with other tests in the
    same module, to avoid import errors during collection.
    
    The try/except pattern is preferred over pytest.importorskip
    because we already use try/except in a number of other places and
    are familiar with the pattern, while importorskip behaves a bit
    unexpectedly (it returns the imported module, so you do not see
    the usual `import nnnn`). Also, we often wrap more than one
    import in a single try/except, where importorskip would print
    duplicated skip messages.
    
    We also add a separate command in CI that only performs test
    collection and checks that all tests are collectable after
    uninstalling those libraries. This prevents the problem from
    reappearing.
    
    Isort fixes are implemented for the recently released isort version.
    
    (cherry picked from commit 2a78f50b36eb7d0e4589633d12458eabbf82418d)
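    
    As a minimal sketch of the pattern (this example mirrors the
    MsSqlHook change in the diffs below), the optional import is
    attempted at module level and the whole test module is skipped
    when the dependency is unavailable:
    
        import pytest
    
        try:
            from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
        except ImportError:
            pytest.skip("MSSQL not available", allow_module_level=True)
    
    Where such tests share a module with tests that do not need the
    dependency, the import is moved inside the test function instead,
    so it is only resolved when that particular test runs.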
---
 .github/workflows/ci.yml                           |  6 ++-
 scripts/in_container/test_arm_pytest_collection.py | 53 ++++++++++++++++++++++
 tests/operators/test_generic_transfer.py           |  6 ++-
 .../apache/hive/transfers/test_mssql_to_hive.py    | 12 +++--
 .../apache/hive/transfers/test_mysql_to_hive.py    |  8 +++-
 .../cloud/transfers/test_bigquery_to_mssql.py      |  9 +++-
 .../google/cloud/transfers/test_mssql_to_gcs.py    |  8 +++-
 .../google/cloud/transfers/test_mysql_to_gcs.py    | 11 +++--
 .../google/cloud/triggers/test_cloud_build.py      |  2 -
 .../providers/google/leveldb/hooks/test_leveldb.py |  7 ++-
 .../google/leveldb/operators/test_leveldb.py       | 11 ++++-
 tests/providers/microsoft/azure/hooks/test_asb.py  |  8 +++-
 .../microsoft/azure/operators/test_asb.py          |  6 ++-
 .../providers/microsoft/mssql/hooks/test_mssql.py  |  6 ++-
 .../microsoft/mssql/operators/test_mssql.py        | 10 +++-
 tests/providers/mysql/hooks/test_mysql.py          | 11 ++++-
 .../mysql/transfers/test_vertica_to_mysql.py       |  8 +++-
 .../cloud/bigquery/example_bigquery_to_mssql.py    |  8 +++-
 .../google/cloud/gcs/example_mssql_to_gcs.py       |  9 +++-
 .../google/cloud/gcs/example_mysql_to_gcs.py       |  9 +++-
 .../providers/google/leveldb/example_leveldb.py    | 10 +++-
 .../microsoft/azure/example_azure_service_bus.py   | 30 +++++++-----
 .../providers/microsoft/mssql/example_mssql.py     | 10 +++-
 23 files changed, 212 insertions(+), 46 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 63138de93f..4c8bc96151 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -864,8 +864,10 @@ jobs:
         uses: ./.github/actions/prepare_breeze_and_image
       - name: "Migration Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         uses: ./.github/actions/migration_tests
-      - name: "Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}} (w/Kerberos)"
+      - name: "Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         run: breeze testing tests --run-in-parallel
+      - name: "Tests ARM Pytest collection: ${{matrix.python-version}}"
+        run: breeze shell "python /opt/airflow/scripts/in_container/test_arm_pytest_collection.py"
       - name: "Post Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         uses: ./.github/actions/post_tests
 
@@ -989,6 +991,8 @@ jobs:
         uses: ./.github/actions/migration_tests
       - name: "Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         run: breeze testing tests --run-in-parallel
+      - name: "Tests ARM Pytest collection: ${{matrix.python-version}}"
+        run: breeze shell "python /opt/airflow/scripts/in_container/test_arm_pytest_collection.py"
       - name: "Post Tests: ${{matrix.python-version}}:${{needs.build-info.outputs.test-types}}"
         uses: ./.github/actions/post_tests
 
diff --git a/scripts/in_container/test_arm_pytest_collection.py b/scripts/in_container/test_arm_pytest_collection.py
new file mode 100755
index 0000000000..43277c5562
--- /dev/null
+++ b/scripts/in_container/test_arm_pytest_collection.py
@@ -0,0 +1,53 @@
+#!/usr/bin/env python
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+import json
+import re
+import subprocess
+from pathlib import Path
+
+from rich.console import Console
+
+AIRFLOW_SOURCES_ROOT = Path(__file__).parents[2].resolve()
+
+if __name__ == "__main__":
+    console = Console(width=400, color_system="standard")
+
+    provider_dependencies = json.loads(
+        (AIRFLOW_SOURCES_ROOT / "generated" / "provider_dependencies.json").read_text()
+    )
+    all_dependencies_to_remove = []
+    for provider in provider_dependencies:
+        for dependency in provider_dependencies[provider]["deps"]:
+            if 'platform_machine != "aarch64"' in dependency:
+                all_dependencies_to_remove.append(re.split(r"[~<>=;]", dependency)[0])
+    console.print(
+        "\n[bright_blue]Uninstalling ARM-incompatible libraries "
+        + " ".join(all_dependencies_to_remove)
+        + "\n"
+    )
+    subprocess.run(["pip", "uninstall", "-y"] + all_dependencies_to_remove)
+    result = subprocess.run(["pytest", "--collect-only", "-qqqq", "--disable-warnings", "tests"], check=False)
+    if result.returncode != 0:
+        console.print("\n[red]Test collection in ARM environment failed.")
+        console.print(
+            "[yellow]You should wrap the failing imports in try/except/skip clauses\n"
+            "See similar examples as skipped tests right above.\n"
+        )
+        exit(result.returncode)
diff --git a/tests/operators/test_generic_transfer.py b/tests/operators/test_generic_transfer.py
index 202478fe48..aaf7c07262 100644
--- a/tests/operators/test_generic_transfer.py
+++ b/tests/operators/test_generic_transfer.py
@@ -26,10 +26,8 @@ from parameterized import parameterized
 
 from airflow.models.dag import DAG
 from airflow.operators.generic_transfer import GenericTransfer
-from airflow.providers.mysql.hooks.mysql import MySqlHook
 from airflow.providers.postgres.hooks.postgres import PostgresHook
 from airflow.utils import timezone
-from tests.providers.mysql.hooks.test_mysql import MySqlContext
 
 DEFAULT_DATE = timezone.datetime(2015, 1, 1)
 DEFAULT_DATE_ISO = DEFAULT_DATE.isoformat()
@@ -45,6 +43,8 @@ class TestMySql(unittest.TestCase):
         self.dag = dag
 
     def tearDown(self):
+        from airflow.providers.mysql.hooks.mysql import MySqlHook
+
         drop_tables = {"test_mysql_to_mysql", "test_airflow"}
         with closing(MySqlHook().get_conn()) as conn:
             for table in drop_tables:
@@ -59,6 +59,8 @@ class TestMySql(unittest.TestCase):
         ]
     )
     def test_mysql_to_mysql(self, client):
+        from tests.providers.mysql.hooks.test_mysql import MySqlContext
+
         with MySqlContext(client):
             sql = "SELECT * FROM connection;"
             op = GenericTransfer(
diff --git a/tests/providers/apache/hive/transfers/test_mssql_to_hive.py b/tests/providers/apache/hive/transfers/test_mssql_to_hive.py
index 53e49207e5..5ad8b329b6 100644
--- a/tests/providers/apache/hive/transfers/test_mssql_to_hive.py
+++ b/tests/providers/apache/hive/transfers/test_mssql_to_hive.py
@@ -20,11 +20,17 @@ from __future__ import annotations
 from collections import OrderedDict
 from unittest.mock import Mock, PropertyMock, patch
 
-import pymssql
+import pytest
 
-from airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator
+try:
+    import pymssql
 
+    from airflow.providers.apache.hive.transfers.mssql_to_hive import MsSqlToHiveOperator
+except ImportError:
+    pytest.skip("MSSQL not available", allow_module_level=True)
 
+
+@pytest.mark.backend("mssql")
 class TestMsSqlToHiveTransfer:
     def setup_method(self):
         self.kwargs = dict(sql="sql", hive_table="table", task_id="test_mssql_to_hive", dag=None)
@@ -36,13 +42,11 @@ class TestMsSqlToHiveTransfer:
         assert mapped_type == "INT"
 
     def test_type_map_decimal(self):
-
         mapped_type = MsSqlToHiveOperator(**self.kwargs).type_map(pymssql.DECIMAL.value)
 
         assert mapped_type == "FLOAT"
 
     def test_type_map_number(self):
-
         mapped_type = MsSqlToHiveOperator(**self.kwargs).type_map(pymssql.NUMBER.value)
 
         assert mapped_type == "INT"
diff --git a/tests/providers/apache/hive/transfers/test_mysql_to_hive.py b/tests/providers/apache/hive/transfers/test_mysql_to_hive.py
index 06f3259afc..0e601263ab 100644
--- a/tests/providers/apache/hive/transfers/test_mysql_to_hive.py
+++ b/tests/providers/apache/hive/transfers/test_mysql_to_hive.py
@@ -25,10 +25,14 @@ from unittest import mock
 import pytest
 
 from airflow.providers.apache.hive.hooks.hive import HiveCliHook
-from airflow.providers.apache.hive.transfers.mysql_to_hive import MySqlToHiveOperator
-from airflow.providers.mysql.hooks.mysql import MySqlHook
 from airflow.utils import timezone
 
+try:
+    from airflow.providers.apache.hive.transfers.mysql_to_hive import MySqlToHiveOperator
+    from airflow.providers.mysql.hooks.mysql import MySqlHook
+except ImportError:
+    pytest.skip("MysQL and/or hive not available", allow_module_level=True)
+
 DEFAULT_DATE = timezone.datetime(2015, 1, 1)
 DEFAULT_DATE_ISO = DEFAULT_DATE.isoformat()
 DEFAULT_DATE_DS = DEFAULT_DATE_ISO[:10]
diff --git a/tests/providers/google/cloud/transfers/test_bigquery_to_mssql.py b/tests/providers/google/cloud/transfers/test_bigquery_to_mssql.py
index 9554749622..3c4e8f0ecc 100644
--- a/tests/providers/google/cloud/transfers/test_bigquery_to_mssql.py
+++ b/tests/providers/google/cloud/transfers/test_bigquery_to_mssql.py
@@ -20,7 +20,13 @@ from __future__ import annotations
 import unittest
 from unittest import mock
 
-from airflow.providers.google.cloud.transfers.bigquery_to_mssql import BigQueryToMsSqlOperator
+import pytest
+
+try:
+    from airflow.providers.google.cloud.transfers.bigquery_to_mssql import BigQueryToMsSqlOperator
+except ImportError:
+    pytest.skip("MSSQL not available", allow_module_level=True)
+
 
 TASK_ID = "test-bq-create-table-operator"
 TEST_PROJECT_ID = "test-project"
@@ -29,6 +35,7 @@ TEST_TABLE_ID = "test-table-id"
 TEST_DAG_ID = "test-bigquery-operators"
 
 
+@pytest.mark.backend("mssql")
 class TestBigQueryToMsSqlOperator(unittest.TestCase):
     @mock.patch("airflow.providers.google.cloud.transfers.bigquery_to_mssql.BigQueryHook")
     def test_execute_good_request_to_bq(self, mock_hook):
diff --git a/tests/providers/google/cloud/transfers/test_mssql_to_gcs.py b/tests/providers/google/cloud/transfers/test_mssql_to_gcs.py
index 9bd9e43b12..f2aeb218e3 100644
--- a/tests/providers/google/cloud/transfers/test_mssql_to_gcs.py
+++ b/tests/providers/google/cloud/transfers/test_mssql_to_gcs.py
@@ -21,9 +21,14 @@ import datetime
 import unittest
 from unittest import mock
 
+import pytest
 from parameterized import parameterized
 
-from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator
+try:
+    from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator
+except ImportError:
+    pytest.skip("MSSQL not available", allow_module_level=True)
+
 
 TASK_ID = "test-mssql-to-gcs"
 MSSQL_CONN_ID = "mssql_conn_test"
@@ -49,6 +54,7 @@ SCHEMA_JSON = [
 ]
 
 
+@pytest.mark.backend("mssql")
 class TestMsSqlToGoogleCloudStorageOperator(unittest.TestCase):
     @parameterized.expand(
         [
diff --git a/tests/providers/google/cloud/transfers/test_mysql_to_gcs.py b/tests/providers/google/cloud/transfers/test_mysql_to_gcs.py
index 0101548843..4688b9bf77 100644
--- a/tests/providers/google/cloud/transfers/test_mysql_to_gcs.py
+++ b/tests/providers/google/cloud/transfers/test_mysql_to_gcs.py
@@ -23,11 +23,8 @@ import unittest
 from unittest import mock
 
 import pytest
-from MySQLdb import ProgrammingError
 from parameterized import parameterized
 
-from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator
-
 TASK_ID = "test-mysql-to-gcs"
 MYSQL_CONN_ID = "mysql_conn_test"
 TZ_QUERY = "SET time_zone = '+00:00'"
@@ -69,7 +66,15 @@ CUSTOM_SCHEMA_JSON = [
     b'{"mode": "REQUIRED", "name": "some_num", "type": "TIMESTAMP"}]',
 ]
 
+try:
+    from MySQLdb import ProgrammingError
+
+    from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator
+except ImportError:
+    pytest.skip("MySQL not available", allow_module_level=True)
+
 
+@pytest.mark.backend("mysql")
 class TestMySqlToGoogleCloudStorageOperator(unittest.TestCase):
     def test_init(self):
         """Test MySqlToGoogleCloudStorageOperator instance is properly initialized."""
diff --git a/tests/providers/google/cloud/triggers/test_cloud_build.py b/tests/providers/google/cloud/triggers/test_cloud_build.py
index 62203ddacb..f3658e37ba 100644
--- a/tests/providers/google/cloud/triggers/test_cloud_build.py
+++ b/tests/providers/google/cloud/triggers/test_cloud_build.py
@@ -83,8 +83,6 @@ TEST_BUILD_INSTANCE = dict(
     warnings=[],
 )
 
-pytest.hook = CloudBuildAsyncHook(gcp_conn_id="google_cloud_default")
-
 
 @pytest.fixture
 def hook():
diff --git a/tests/providers/google/leveldb/hooks/test_leveldb.py b/tests/providers/google/leveldb/hooks/test_leveldb.py
index dd15b66fb5..a0fd5b6ddf 100644
--- a/tests/providers/google/leveldb/hooks/test_leveldb.py
+++ b/tests/providers/google/leveldb/hooks/test_leveldb.py
@@ -22,7 +22,12 @@ from unittest import mock
 
 import pytest
 
-from airflow.providers.google.leveldb.hooks.leveldb import LevelDBHook, LevelDBHookException
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+try:
+    from airflow.providers.google.leveldb.hooks.leveldb import LevelDBHook, LevelDBHookException
+except AirflowOptionalProviderFeatureException:
+    pytest.skip("LevelDB not available", allow_module_level=True)
 
 
 class TestLevelDBHook(unittest.TestCase):
diff --git a/tests/providers/google/leveldb/operators/test_leveldb.py b/tests/providers/google/leveldb/operators/test_leveldb.py
index b2993f7ffd..3becdf9a86 100644
--- a/tests/providers/google/leveldb/operators/test_leveldb.py
+++ b/tests/providers/google/leveldb/operators/test_leveldb.py
@@ -36,8 +36,15 @@ from __future__ import annotations
 import unittest
 from unittest import mock
 
-from airflow.providers.google.leveldb.hooks.leveldb import LevelDBHook
-from airflow.providers.google.leveldb.operators.leveldb import LevelDBOperator
+import pytest
+
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+try:
+    from airflow.providers.google.leveldb.hooks.leveldb import LevelDBHook
+    from airflow.providers.google.leveldb.operators.leveldb import LevelDBOperator
+except AirflowOptionalProviderFeatureException:
+    pytest.skip("LevelDB not available", allow_module_level=True)
 
 
 class TestLevelDBOperator(unittest.TestCase):
diff --git a/tests/providers/microsoft/azure/hooks/test_asb.py b/tests/providers/microsoft/azure/hooks/test_asb.py
index fc5c8dba21..61ce05a0b9 100644
--- a/tests/providers/microsoft/azure/hooks/test_asb.py
+++ b/tests/providers/microsoft/azure/hooks/test_asb.py
@@ -19,8 +19,12 @@ from __future__ import annotations
 from unittest import mock
 
 import pytest
-from azure.servicebus import ServiceBusClient, ServiceBusMessage, ServiceBusMessageBatch
-from azure.servicebus.management import ServiceBusAdministrationClient
+
+try:
+    from azure.servicebus import ServiceBusClient, ServiceBusMessage, ServiceBusMessageBatch
+    from azure.servicebus.management import ServiceBusAdministrationClient
+except ImportError:
+    pytest.skip("Azure Service Bus not available", allow_module_level=True)
 
 from airflow.models import Connection
 from airflow.providers.microsoft.azure.hooks.asb import AdminClientHook, MessageHook
diff --git a/tests/providers/microsoft/azure/operators/test_asb.py b/tests/providers/microsoft/azure/operators/test_asb.py
index d7b162ec32..e35f7f9cf7 100644
--- a/tests/providers/microsoft/azure/operators/test_asb.py
+++ b/tests/providers/microsoft/azure/operators/test_asb.py
@@ -19,7 +19,11 @@ from __future__ import annotations
 from unittest import mock
 
 import pytest
-from azure.servicebus import ServiceBusMessage
+
+try:
+    from azure.servicebus import ServiceBusMessage
+except ImportError:
+    pytest.skip("Azure Service Bus not available", allow_module_level=True)
 
 from airflow.providers.microsoft.azure.operators.asb import (
     ASBReceiveSubscriptionMessageOperator,
diff --git a/tests/providers/microsoft/mssql/hooks/test_mssql.py b/tests/providers/microsoft/mssql/hooks/test_mssql.py
index 0b899fe370..6bffab744b 100644
--- a/tests/providers/microsoft/mssql/hooks/test_mssql.py
+++ b/tests/providers/microsoft/mssql/hooks/test_mssql.py
@@ -23,7 +23,11 @@ from urllib.parse import quote_plus
 import pytest
 
 from airflow.models import Connection
-from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
+
+try:
+    from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
+except ImportError:
+    pytest.skip("MSSQL not available", allow_module_level=True)
 
 PYMSSQL_CONN = Connection(
     conn_type="mssql", host="ip", schema="share", login="username", password="password", port=8081
diff --git a/tests/providers/microsoft/mssql/operators/test_mssql.py b/tests/providers/microsoft/mssql/operators/test_mssql.py
index 5f0955ac03..62865c85f5 100644
--- a/tests/providers/microsoft/mssql/operators/test_mssql.py
+++ b/tests/providers/microsoft/mssql/operators/test_mssql.py
@@ -20,9 +20,15 @@ from __future__ import annotations
 from unittest import mock
 from unittest.mock import MagicMock, Mock
 
+import pytest
+
 from airflow import AirflowException
-from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
-from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
+
+try:
+    from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
+    from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
+except ImportError:
+    pytest.skip("MSSQL not available", allow_module_level=True)
 
 
 class TestMsSqlOperator:
diff --git a/tests/providers/mysql/hooks/test_mysql.py b/tests/providers/mysql/hooks/test_mysql.py
index 4c4d991864..eefc34c51b 100644
--- a/tests/providers/mysql/hooks/test_mysql.py
+++ b/tests/providers/mysql/hooks/test_mysql.py
@@ -23,12 +23,19 @@ import uuid
 from contextlib import closing
 from unittest import mock
 
-import MySQLdb.cursors
 import pytest
 
 from airflow.models import Connection
 from airflow.models.dag import DAG
-from airflow.providers.mysql.hooks.mysql import MySqlHook
+
+try:
+    import MySQLdb.cursors
+
+    from airflow.providers.mysql.hooks.mysql import MySqlHook
+except ImportError:
+    pytest.skip("MySQL not available", allow_module_level=True)
+
+
 from airflow.utils import timezone
 from tests.test_utils.asserts import assert_equal_ignore_multiple_spaces
 
diff --git a/tests/providers/mysql/transfers/test_vertica_to_mysql.py b/tests/providers/mysql/transfers/test_vertica_to_mysql.py
index e13c5d0578..82997a46f4 100644
--- a/tests/providers/mysql/transfers/test_vertica_to_mysql.py
+++ b/tests/providers/mysql/transfers/test_vertica_to_mysql.py
@@ -20,8 +20,14 @@ from __future__ import annotations
 import datetime
 from unittest import mock
 
+import pytest
+
 from airflow.models.dag import DAG
-from airflow.providers.mysql.transfers.vertica_to_mysql import VerticaToMySqlOperator
+
+try:
+    from airflow.providers.mysql.transfers.vertica_to_mysql import VerticaToMySqlOperator
+except ImportError:
+    pytest.skip("MySQL not available", allow_module_level=True)
 
 
 def mock_get_conn():
diff --git a/tests/system/providers/google/cloud/bigquery/example_bigquery_to_mssql.py b/tests/system/providers/google/cloud/bigquery/example_bigquery_to_mssql.py
index d9a409791e..c3747ec589 100644
--- a/tests/system/providers/google/cloud/bigquery/example_bigquery_to_mssql.py
+++ b/tests/system/providers/google/cloud/bigquery/example_bigquery_to_mssql.py
@@ -23,13 +23,19 @@ from __future__ import annotations
 import os
 from datetime import datetime
 
+import pytest
+
 from airflow import models
 from airflow.providers.google.cloud.operators.bigquery import (
     BigQueryCreateEmptyDatasetOperator,
     BigQueryCreateEmptyTableOperator,
     BigQueryDeleteDatasetOperator,
 )
-from airflow.providers.google.cloud.transfers.bigquery_to_mssql import BigQueryToMsSqlOperator
+
+try:
+    from airflow.providers.google.cloud.transfers.bigquery_to_mssql import BigQueryToMsSqlOperator
+except ImportError:
+    pytest.skip("MySQL not available", allow_module_level=True)
 
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
 PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "example-project")
diff --git a/tests/system/providers/google/cloud/gcs/example_mssql_to_gcs.py b/tests/system/providers/google/cloud/gcs/example_mssql_to_gcs.py
index b762970e72..76b9a68095 100644
--- a/tests/system/providers/google/cloud/gcs/example_mssql_to_gcs.py
+++ b/tests/system/providers/google/cloud/gcs/example_mssql_to_gcs.py
@@ -19,9 +19,16 @@ from __future__ import annotations
 import os
 from datetime import datetime
 
+import pytest
+
 from airflow import models
 from airflow.providers.google.cloud.operators.gcs import GCSCreateBucketOperator, GCSDeleteBucketOperator
-from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator
+
+try:
+    from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator
+except ImportError:
+    pytest.skip("MSSQL not available", allow_module_level=True)
+
 from airflow.utils.trigger_rule import TriggerRule
 
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
diff --git a/tests/system/providers/google/cloud/gcs/example_mysql_to_gcs.py b/tests/system/providers/google/cloud/gcs/example_mysql_to_gcs.py
index 55b4dd1b57..97404f1a8a 100644
--- a/tests/system/providers/google/cloud/gcs/example_mysql_to_gcs.py
+++ b/tests/system/providers/google/cloud/gcs/example_mysql_to_gcs.py
@@ -19,9 +19,16 @@ from __future__ import annotations
 import os
 from datetime import datetime
 
+import pytest
+
 from airflow import models
 from airflow.providers.google.cloud.operators.gcs import GCSCreateBucketOperator, GCSDeleteBucketOperator
-from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator
+
+try:
+    from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator
+except ImportError:
+    pytest.skip("MySQL not available", allow_module_level=True)
+
 from airflow.utils.trigger_rule import TriggerRule
 
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
diff --git a/tests/system/providers/google/leveldb/example_leveldb.py b/tests/system/providers/google/leveldb/example_leveldb.py
index e199ccbe7b..2662c38940 100644
--- a/tests/system/providers/google/leveldb/example_leveldb.py
+++ b/tests/system/providers/google/leveldb/example_leveldb.py
@@ -23,8 +23,16 @@ from __future__ import annotations
 import os
 from datetime import datetime
 
+import pytest
+
 from airflow import models
-from airflow.providers.google.leveldb.operators.leveldb import LevelDBOperator
+from airflow.exceptions import AirflowOptionalProviderFeatureException
+
+try:
+    from airflow.providers.google.leveldb.operators.leveldb import LevelDBOperator
+except AirflowOptionalProviderFeatureException:
+    pytest.skip("LevelDB not available", allow_module_level=True)
+
 from airflow.utils.trigger_rule import TriggerRule
 
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
diff --git a/tests/system/providers/microsoft/azure/example_azure_service_bus.py b/tests/system/providers/microsoft/azure/example_azure_service_bus.py
index a3c51bd2dc..7c4a99786d 100644
--- a/tests/system/providers/microsoft/azure/example_azure_service_bus.py
+++ b/tests/system/providers/microsoft/azure/example_azure_service_bus.py
@@ -19,20 +19,26 @@ from __future__ import annotations
 import os
 from datetime import datetime, timedelta
 
+import pytest
+
 from airflow import DAG
 from airflow.models.baseoperator import chain
-from airflow.providers.microsoft.azure.operators.asb import (
-    ASBReceiveSubscriptionMessageOperator,
-    AzureServiceBusCreateQueueOperator,
-    AzureServiceBusDeleteQueueOperator,
-    AzureServiceBusReceiveMessageOperator,
-    AzureServiceBusSendMessageOperator,
-    AzureServiceBusSubscriptionCreateOperator,
-    AzureServiceBusSubscriptionDeleteOperator,
-    AzureServiceBusTopicCreateOperator,
-    AzureServiceBusTopicDeleteOperator,
-    AzureServiceBusUpdateSubscriptionOperator,
-)
+
+try:
+    from airflow.providers.microsoft.azure.operators.asb import (
+        ASBReceiveSubscriptionMessageOperator,
+        AzureServiceBusCreateQueueOperator,
+        AzureServiceBusDeleteQueueOperator,
+        AzureServiceBusReceiveMessageOperator,
+        AzureServiceBusSendMessageOperator,
+        AzureServiceBusSubscriptionCreateOperator,
+        AzureServiceBusSubscriptionDeleteOperator,
+        AzureServiceBusTopicCreateOperator,
+        AzureServiceBusTopicDeleteOperator,
+        AzureServiceBusUpdateSubscriptionOperator,
+    )
+except ImportError:
+    pytest.skip("Azure Service Bus not available", allow_module_level=True)
 
 EXECUTION_TIMEOUT = int(os.getenv("EXECUTION_TIMEOUT", 6))
 
diff --git a/tests/system/providers/microsoft/mssql/example_mssql.py b/tests/system/providers/microsoft/mssql/example_mssql.py
index 4b59da30ed..8d7fad9f25 100644
--- a/tests/system/providers/microsoft/mssql/example_mssql.py
+++ b/tests/system/providers/microsoft/mssql/example_mssql.py
@@ -24,9 +24,15 @@ from __future__ import annotations
 import os
 from datetime import datetime
 
+import pytest
+
 from airflow import DAG
-from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
-from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
+
+try:
+    from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook
+    from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
+except ImportError:
+    pytest.skip("MSSQL provider not available", allow_module_level=True)
 
 ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
 DAG_ID = "example_mssql"


[airflow] 08/12: Variables set in variables.env are automatically exported (#28633)

Posted by ep...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-5-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 56b0b76b3af4c27be436d9aa051cf7b1b6cac268
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Dec 28 23:28:50 2022 +0100

    Variables set in variables.env are automatically exported (#28633)
    
    The variables set in variables.env were not automatically exported;
    sourcing the file with allexport enabled now ensures they are.
    
    (cherry picked from commit 94b3b897e2f94902777c4b24fb10c915279d8967)
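    
    A minimal sketch of what the fix does (variable names follow the
    script in the diff below): wrapping the source call in allexport
    marks every variable assigned in the sourced file for export:
    
        set -o allexport
        # every VAR=value assignment in variables.env is now exported
        source "${VARIABLES_ENV_FILE}"
        set +o allexport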
---
 scripts/in_container/configure_environment.sh | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/scripts/in_container/configure_environment.sh b/scripts/in_container/configure_environment.sh
index d9638b3219..6d0cbae41e 100644
--- a/scripts/in_container/configure_environment.sh
+++ b/scripts/in_container/configure_environment.sh
@@ -36,8 +36,10 @@ if [[ -d "${AIRFLOW_BREEZE_CONFIG_DIR}" && \
     echo
     echo "${COLOR_BLUE}Sourcing environment variables from ${VARIABLES_ENV_FILE} in ${AIRFLOW_BREEZE_CONFIG_DIR}${COLOR_RESET}"
     echo
+    set -o allexport
      # shellcheck disable=1090
     source "${VARIABLES_ENV_FILE}"
+    set +o allexport
     popd >/dev/null 2>&1 || exit 1
 fi