Posted to commits@airflow.apache.org by as...@apache.org on 2021/03/19 15:06:02 UTC

[airflow] branch v2-0-test updated (178dde3 -> 831c7ec)

This is an automated email from the ASF dual-hosted git repository.

ash pushed a change to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


    from 178dde3  By default PIP will install all packages in .local folder (#14125)
     new ac43056  Note that the DB must be using UTF-8 (#14742)
     new f4cc5c5  Fix running child tasks in a subdag after clearing a successful subdag (#14776)
     new b583736  [AIRFLOW-6076] fix dag.cli() KeyError (#13647)
     new 04ae0f6  Add more flexibility with FAB menu links (#13903)
     new 118f86c  Speed up clear_task_instances by doing a single sql delete for TaskReschedule (#14048)
     new 9d91058  Fix typos in concept docs (#14130)
     new 472077e  [AIRFLOW-7044] Host key can be specified via SSH connection extras. (#12944)
     new 05326e2  Sync DB Migrations in Master with 2.0.1 (#14155)
     new 76be86e  Log migrations info in consistent way (#14158)
     new 8c95675  Make TaskInstance.pool_slots not nullable with a default of 1 (#14406)
     new 7156d6c  Rename last_scheduler_run into last_parsed_time, and ensure it's updated in DB (#14581)
     new 7790b2f  Use `Lax` for `cookie_samesite` when empty string is passed (#14183)
     new 1858a94  Fix comparison dagTZ with localTZ (#14204)
     new 36ff9c5  Fix indentation in code block in Taskflow API doc (#14241)
     new 3f36fa9  Make airflow dags show command display TaskGroup (#14269)
     new 44a261a  Fix bug allowing task instances to survive when dagrun_timeout is exceeded (#14321)
     new 6dd7559  Scheduler should not fail when invalid executor_config is passed (#14323)
     new 0cb2a96  BugFix: Fix taskInstance API call fails if a task is removed from running DAG (#14381)
     new 040f7d8  Fix crash when user clicks on  "Task Instance Details" caused by start_date being None (#14416)
     new 041c9d2  BugFix: Serialize max_retry_delay as a timedelta (#14436)
     new 97d98bb  Gracefully handle missing start_date and end_date for DagRun (#14452)
     new 99f1022  Fix statsd metrics not sending when using daemon mode (#14454)
     new 62725ce  Fix logging error with task error when JSON logging is enabled (#14456)
     new 5dd51dc  Bugfix: Fix wrong output of tags and owners in dag detail API endpoint (#14490)
     new 87c26b4  BugFix: TypeError in monitor_pod (#14513)
     new 051d239  Add plain format output to cli tables (#14546)
     new a156053  BugFix: fix DAG doc display (especially for TaskFlow DAGs) (#14564)
     new bfe57d3  Bugfix: Plugins endpoint was unauthenticated (#14570)
     new d6b40e2  Replace Graph View Screenshot to show Auto-refresh (#14571)
     new 0fc6ca3  Default to Celery Task model when backend model does not exist (#14612)
     new b175785  Bump version to match node dependency (#14624)
     new 3a0fb37  Remember expanded task groups in localStorage (#14661)
     new f395cd8  Update Flask-AppBuilder dependency to allow 3.2 (and all 3.x series) (#14665)
     new 654248e  Fix minor issues in 'Concepts' doc (#14679)
     new ecd3e3c  Webserver: Allow Filtering TaskInstances by queued_dttm (#14708)
     new 59afddd  Fix KubernetesExecutor issue with deleted pending pods (#14810)
     new 6ae0cb0  Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils (#14812)
     new eab3b9e  Suggest using $http_host instead of $host (#14814)
     new 8bdfe07  Bump Redoc to resolve vulnerability in sub-dependency (#14608)
     new 71b44a2  fix lossing duration < 1 secs in tree (#13537)
     new 9f33dfb  Fix tests for all urllib versions with only '&' as separator (#14710)
     new 831c7ec  Webserver: Sanitize string passed to origin param (#14738)

The 42 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 UPDATING.md                                        |   2 +-
 airflow/api_connexion/openapi/v1.yaml              |   1 +
 airflow/api_connexion/schemas/dag_schema.py        |  36 +-
 airflow/cli/cli_parser.py                          |  39 +-
 airflow/cli/simple_table.py                        |  11 +
 airflow/executors/celery_executor.py               |   4 +-
 airflow/executors/kubernetes_executor.py           |  14 +-
 airflow/jobs/scheduler_job.py                      |  14 +-
 airflow/kubernetes/pod_launcher.py                 |   7 +-
 ...e42bb497a22_rename_last_scheduler_run_column.py |  65 +++
 ...da_increase_size_of_connection_extra_field_.py} |  24 +-
 ...8c147f_remove_can_read_permission_on_config_.py |   6 +
 .../8646922c8a04_change_default_pool_slots_to_1.py |  93 ++++
 airflow/models/baseoperator.py                     |   9 +-
 airflow/models/connection.py                       |   2 +-
 airflow/models/dag.py                              |  17 +-
 airflow/models/dagrun.py                           |   6 +
 airflow/models/taskinstance.py                     |  47 +-
 airflow/providers/sftp/hooks/sftp.py               |   6 +
 airflow/providers/ssh/hooks/ssh.py                 |  18 +-
 airflow/serialization/schema.json                  |   1 +
 airflow/serialization/serialized_objects.py        |   2 +-
 airflow/stats.py                                   |  27 +-
 airflow/utils/dot_renderer.py                      | 120 +++-
 airflow/utils/state.py                             |   1 +
 airflow/www/app.py                                 |  12 +-
 airflow/www/extensions/init_views.py               |   9 +-
 airflow/www/package.json                           |   2 +-
 airflow/www/security.py                            |   9 +-
 airflow/www/static/js/task-instances.js            |   2 +-
 airflow/www/templates/airflow/graph.html           | 117 +++-
 airflow/www/utils.py                               |   2 +
 airflow/www/views.py                               |  32 +-
 airflow/www/yarn.lock                              | 616 ++++++---------------
 .../connections/ssh.rst                            |   6 +-
 docs/apache-airflow/concepts.rst                   |  32 +-
 docs/apache-airflow/howto/run-behind-proxy.rst     |   4 +-
 docs/apache-airflow/howto/set-up-database.rst      |   9 +
 docs/apache-airflow/img/graph.png                  | Bin 118674 -> 225347 bytes
 docs/apache-airflow/plugins.rst                    |  19 +-
 docs/apache-airflow/tutorial_taskflow_api.rst      |   2 +-
 docs/apache-airflow/usage-cli.rst                  |   1 +
 docs/conf.py                                       |   2 +-
 docs/spelling_wordlist.txt                         |   2 +
 setup.cfg                                          |   5 +-
 tests/api_connexion/endpoints/test_dag_endpoint.py |  28 +-
 .../endpoints/test_task_instance_endpoint.py       |  31 ++
 tests/api_connexion/schemas/test_dag_schema.py     |   3 +-
 tests/cli/test_cli_parser.py                       |  26 +
 tests/core/test_stats.py                           |   1 +
 tests/executors/test_kubernetes_executor.py        | 112 ++++
 tests/jobs/test_scheduler_job.py                   |  63 +++
 tests/kubernetes/test_pod_launcher.py              |  18 +-
 tests/models/test_cleartasks.py                    |  47 +-
 tests/models/test_dag.py                           |  70 ++-
 tests/models/test_taskinstance.py                  |  33 ++
 tests/plugins/test_plugin.py                       |  12 +-
 tests/plugins/test_plugins_manager.py              |  27 +-
 tests/providers/sftp/hooks/test_sftp.py            |  41 +-
 tests/providers/ssh/hooks/test_ssh.py              |  93 ++++
 tests/serialization/test_dag_serialization.py      |   4 +
 tests/utils/test_dot_renderer.py                   | 101 +++-
 tests/www/test_app.py                              |   6 +
 tests/www/test_utils.py                            |  11 +
 tests/www/test_views.py                            |  33 +-
 65 files changed, 1570 insertions(+), 645 deletions(-)
 create mode 100644 airflow/migrations/versions/2e42bb497a22_rename_last_scheduler_run_column.py
 copy airflow/migrations/versions/{fe461863935f_increase_length_for_connection_password.py => 449b4072c2da_increase_size_of_connection_extra_field_.py} (76%)
 create mode 100644 airflow/migrations/versions/8646922c8a04_change_default_pool_slots_to_1.py

[airflow] 38/42: Suggest using $http_host instead of $host (#14814)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit eab3b9e865b8fa143e6a17dc8d3c18860c8d7ec5
Author: Miguel Carbajo Berrocal <em...@gmail.com>
AuthorDate: Tue Mar 16 09:40:39 2021 -0400

    Suggest using $http_host instead of $host (#14814)
    
    If the reverse proxy is not running on port 80, using $host won't forward the port of the HTTP request to Airflow, so it won't build the correct redirect URL.
    
    E.g.
    I have nginx and Airflow running in Docker on the default ports. My port mapping for nginx is 7003:80.
    The request http://myserver:7003/myorg/airflow/ will redirect to http://myserver/myorg/airflow/admin/ instead of http://myserver:7003/myorg/airflow/admin/.
    
    (cherry picked from commit 9cb6553935c247fbfe23c7584dd4d9f579c6bf55)
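
A minimal Python sketch (with assumed host names and ports, not Airflow code) of why the port disappears: nginx's $host variable drops any port from the client's Host header, while $http_host forwards the header verbatim, and Airflow builds its redirect URLs from whatever Host value it receives.

```
# Assumed values: what each nginx variable would forward as the Host header.
HOST_VAR = "myserver"            # `proxy_set_header Host $host;` (port dropped)
HTTP_HOST_VAR = "myserver:7003"  # `proxy_set_header Host $http_host;` (port kept)


def build_redirect(host_header: str, path: str = "/myorg/airflow/admin/") -> str:
    """Rebuild a redirect URL the way a Host-header-based redirect would."""
    return f"http://{host_header}{path}"


print(build_redirect(HOST_VAR))       # http://myserver/myorg/airflow/admin/  (port lost)
print(build_redirect(HTTP_HOST_VAR))  # http://myserver:7003/myorg/airflow/admin/
```
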
---
 docs/apache-airflow/howto/run-behind-proxy.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/apache-airflow/howto/run-behind-proxy.rst b/docs/apache-airflow/howto/run-behind-proxy.rst
index eea0eb7..2901ed0 100644
--- a/docs/apache-airflow/howto/run-behind-proxy.rst
+++ b/docs/apache-airflow/howto/run-behind-proxy.rst
@@ -47,7 +47,7 @@ Your reverse proxy (ex: nginx) should be configured as follow:
 
         location /myorg/airflow/ {
             proxy_pass http://localhost:8080;
-            proxy_set_header Host $host;
+            proxy_set_header Host $http_host;
             proxy_redirect off;
             proxy_http_version 1.1;
             proxy_set_header Upgrade $http_upgrade;
@@ -64,7 +64,7 @@ Your reverse proxy (ex: nginx) should be configured as follow:
           location /myorg/flower/ {
               rewrite ^/myorg/flower/(.*)$ /$1 break;  # remove prefix from http header
               proxy_pass http://localhost:5555;
-              proxy_set_header Host $host;
+              proxy_set_header Host $http_host;
               proxy_redirect off;
               proxy_http_version 1.1;
               proxy_set_header Upgrade $http_upgrade;

[airflow] 18/42: BugFix: Fix taskInstance API call fails if a task is removed from running DAG (#14381)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0cb2a967ed40f2f3bf6eeb8795d7b6e38555b3a3
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Mon Mar 8 22:24:59 2021 +0100

    BugFix: Fix taskInstance API call fails if a task is removed from running DAG (#14381)
    
    Closes: #14331
    
    (cherry picked from commit 7418679591e5df4ceaab6c471bc6d4a975201871)
---
 airflow/api_connexion/openapi/v1.yaml              |  1 +
 airflow/utils/state.py                             |  1 +
 .../endpoints/test_task_instance_endpoint.py       | 31 ++++++++++++++++++++++
 3 files changed, 33 insertions(+)

diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 0da5925..83dae6a 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -2498,6 +2498,7 @@ components:
         - queued
         - none
         - scheduled
+        - removed
 
     DagState:
       description: DAG State.
diff --git a/airflow/utils/state.py b/airflow/utils/state.py
index 681cbc5..d5300e1 100644
--- a/airflow/utils/state.py
+++ b/airflow/utils/state.py
@@ -57,6 +57,7 @@ class State:
         NONE,
         SCHEDULED,
         SENSING,
+        REMOVED,
     )
 
     dag_states = (
diff --git a/tests/api_connexion/endpoints/test_task_instance_endpoint.py b/tests/api_connexion/endpoints/test_task_instance_endpoint.py
index 84c957f..4f8028e 100644
--- a/tests/api_connexion/endpoints/test_task_instance_endpoint.py
+++ b/tests/api_connexion/endpoints/test_task_instance_endpoint.py
@@ -167,6 +167,37 @@ class TestGetTaskInstance(TestTaskInstanceEndpoint):
         }
 
     @provide_session
+    def test_should_respond_200_with_task_state_in_removed(self, session):
+        self.create_task_instances(session, task_instances=[{"state": State.REMOVED}], update_extras=True)
+        response = self.client.get(
+            "/api/v1/dags/example_python_operator/dagRuns/TEST_DAG_RUN_ID/taskInstances/print_the_context",
+            environ_overrides={"REMOTE_USER": "test"},
+        )
+        assert response.status_code == 200
+        assert response.json == {
+            "dag_id": "example_python_operator",
+            "duration": 10000.0,
+            "end_date": "2020-01-03T00:00:00+00:00",
+            "execution_date": "2020-01-01T00:00:00+00:00",
+            "executor_config": "{}",
+            "hostname": "",
+            "max_tries": 0,
+            "operator": "PythonOperator",
+            "pid": 100,
+            "pool": "default_pool",
+            "pool_slots": 1,
+            "priority_weight": 6,
+            "queue": "default_queue",
+            "queued_when": None,
+            "sla_miss": None,
+            "start_date": "2020-01-02T00:00:00+00:00",
+            "state": "removed",
+            "task_id": "print_the_context",
+            "try_number": 0,
+            "unixname": getpass.getuser(),
+        }
+
+    @provide_session
     def test_should_respond_200_task_instance_with_sla(self, session):
         self.create_task_instances(session)
         session.query()

[airflow] 05/42: Speed up clear_task_instances by doing a single sql delete for TaskReschedule (#14048)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 118f86c394f97fa628fc2069bb31ba29d70e37d8
Author: yuqian90 <yu...@gmail.com>
AuthorDate: Wed Feb 10 22:07:18 2021 +0800

    Speed up clear_task_instances by doing a single sql delete for TaskReschedule (#14048)
    
    Clearing a large number of tasks takes a long time. Most of the time (more than 95%) is spent at this line in clear_task_instances. This slowness sometimes causes the webserver to time out because the web_server_worker_timeout is hit.
    
    ```
            # Clear all reschedules related to the ti to clear
            session.query(TR).filter(
                TR.dag_id == ti.dag_id,
                TR.task_id == ti.task_id,
                TR.execution_date == ti.execution_date,
                TR.try_number == ti.try_number,
            ).delete()
    ```
    This line was very slow because it deleted the TaskReschedule rows in a for loop, one by one.
    
    This PR changes the code to delete the TaskReschedule rows in a single SQL query with a set of OR conditions. It is effectively doing the same thing, but much faster.
    
    Profiling showed a large speed improvement (roughly 40 to 50 times faster) compared to the first iteration, so the overall performance should now be about 300 times faster than the original for-loop deletion.
    
    (cherry picked from commit 9036ce20c140520d3f9d5e0f83b5ebfded07fa7c)
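
A standalone sketch (with made-up task instance keys, not the actual Airflow models) of the grouping idea described above: bucket the task instances into a nested dict keyed by dag_id, execution_date and try_number, and let that structure drive a single hierarchical OR/AND WHERE clause instead of one DELETE per row.

```
from collections import defaultdict

# Made-up (dag_id, execution_date, try_number, task_id) keys for illustration.
tis = [
    ("dag_a", "2021-01-01", 1, "extract"),
    ("dag_a", "2021-01-01", 1, "load"),
    ("dag_a", "2021-01-02", 2, "extract"),
]

# Bucket into dag_id -> execution_date -> try_number -> {task_ids}; each leaf
# becomes one `TR.task_id IN (...)` term of the single DELETE's WHERE clause,
# instead of issuing one DELETE per task instance.
task_id_by_key = defaultdict(lambda: defaultdict(lambda: defaultdict(set)))
for dag_id, execution_date, try_number, task_id in tis:
    task_id_by_key[dag_id][execution_date][try_number].add(task_id)

for dag_id, dates in task_id_by_key.items():
    for execution_date, tries in dates.items():
        for try_number, task_ids in tries.items():
            print(dag_id, execution_date, try_number, sorted(task_ids))
```
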
---
 airflow/models/taskinstance.py  | 37 ++++++++++++++++++++++++++------
 tests/models/test_cleartasks.py | 47 ++++++++++++++++++++++++++++++++++++++++-
 2 files changed, 77 insertions(+), 7 deletions(-)

diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index d671a01..c7d7ff7 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -24,6 +24,7 @@ import os
 import pickle
 import signal
 import warnings
+from collections import defaultdict
 from datetime import datetime, timedelta
 from tempfile import NamedTemporaryFile
 from typing import IO, Any, Dict, Iterable, List, NamedTuple, Optional, Tuple, Union
@@ -146,6 +147,7 @@ def clear_task_instances(
     :param dag: DAG object
     """
     job_ids = []
+    task_id_by_key = defaultdict(lambda: defaultdict(lambda: defaultdict(set)))
     for ti in tis:
         if ti.state == State.RUNNING:
             if ti.job_id:
@@ -166,13 +168,36 @@ def clear_task_instances(
                 ti.max_tries = max(ti.max_tries, ti.prev_attempted_tries)
             ti.state = State.NONE
             session.merge(ti)
+
+        task_id_by_key[ti.dag_id][ti.execution_date][ti.try_number].add(ti.task_id)
+
+    if task_id_by_key:
         # Clear all reschedules related to the ti to clear
-        session.query(TR).filter(
-            TR.dag_id == ti.dag_id,
-            TR.task_id == ti.task_id,
-            TR.execution_date == ti.execution_date,
-            TR.try_number == ti.try_number,
-        ).delete()
+
+        # This is an optimization for the common case where all tis are for a small number
+        # of dag_id, execution_date and try_number. Use a nested dict of dag_id,
+        # execution_date, try_number and task_id to construct the where clause in a
+        # hierarchical manner. This speeds up the delete statement by more than 40x for
+        # large number of tis (50k+).
+        conditions = or_(
+            and_(
+                TR.dag_id == dag_id,
+                or_(
+                    and_(
+                        TR.execution_date == execution_date,
+                        or_(
+                            and_(TR.try_number == try_number, TR.task_id.in_(task_ids))
+                            for try_number, task_ids in task_tries.items()
+                        ),
+                    )
+                    for execution_date, task_tries in dates.items()
+                ),
+            )
+            for dag_id, dates in task_id_by_key.items()
+        )
+
+        delete_qry = TR.__table__.delete().where(conditions)
+        session.execute(delete_qry)
 
     if job_ids:
         from airflow.jobs.base_job import BaseJob
diff --git a/tests/models/test_cleartasks.py b/tests/models/test_cleartasks.py
index f54bacc..1c5606e 100644
--- a/tests/models/test_cleartasks.py
+++ b/tests/models/test_cleartasks.py
@@ -20,8 +20,9 @@ import datetime
 import unittest
 
 from airflow import settings
-from airflow.models import DAG, TaskInstance as TI, clear_task_instances
+from airflow.models import DAG, TaskInstance as TI, TaskReschedule, clear_task_instances
 from airflow.operators.dummy import DummyOperator
+from airflow.sensors.python import PythonSensor
 from airflow.utils.session import create_session
 from airflow.utils.state import State
 from airflow.utils.types import DagRunType
@@ -138,6 +139,50 @@ class TestClearTasks(unittest.TestCase):
         assert ti1.try_number == 2
         assert ti1.max_tries == 2
 
+    def test_clear_task_instances_with_task_reschedule(self):
+        """Test that TaskReschedules are deleted correctly when TaskInstances are cleared"""
+
+        with DAG(
+            'test_clear_task_instances_with_task_reschedule',
+            start_date=DEFAULT_DATE,
+            end_date=DEFAULT_DATE + datetime.timedelta(days=10),
+        ) as dag:
+            task0 = PythonSensor(task_id='0', python_callable=lambda: False, mode="reschedule")
+            task1 = PythonSensor(task_id='1', python_callable=lambda: False, mode="reschedule")
+
+        ti0 = TI(task=task0, execution_date=DEFAULT_DATE)
+        ti1 = TI(task=task1, execution_date=DEFAULT_DATE)
+
+        dag.create_dagrun(
+            execution_date=ti0.execution_date,
+            state=State.RUNNING,
+            run_type=DagRunType.SCHEDULED,
+        )
+
+        ti0.run()
+        ti1.run()
+
+        with create_session() as session:
+
+            def count_task_reschedule(task_id):
+                return (
+                    session.query(TaskReschedule)
+                    .filter(
+                        TaskReschedule.dag_id == dag.dag_id,
+                        TaskReschedule.task_id == task_id,
+                        TaskReschedule.execution_date == DEFAULT_DATE,
+                        TaskReschedule.try_number == 1,
+                    )
+                    .count()
+                )
+
+            assert count_task_reschedule(ti0.task_id) == 1
+            assert count_task_reschedule(ti1.task_id) == 1
+            qry = session.query(TI).filter(TI.dag_id == dag.dag_id, TI.task_id == ti0.task_id).all()
+            clear_task_instances(qry, session, dag=dag)
+            assert count_task_reschedule(ti0.task_id) == 0
+            assert count_task_reschedule(ti1.task_id) == 1
+
     def test_dag_clear(self):
         dag = DAG(
             'test_dag_clear', start_date=DEFAULT_DATE, end_date=DEFAULT_DATE + datetime.timedelta(days=10)

[airflow] 36/42: Fix KubernetesExecutor issue with deleted pending pods (#14810)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 59afddd2f27aa0529c602da825e1ee429fe3934b
Author: Jed Cunningham <66...@users.noreply.github.com>
AuthorDate: Mon Mar 15 15:16:49 2021 -0600

    Fix KubernetesExecutor issue with deleted pending pods (#14810)
    
    This change treats the deletion of a pending KubernetesExecutor worker pod
    as a failure. This allows the affected task to follow its configured retry
    rules, as one would expect.
    
    (cherry picked from commit a639dd364865da7367f342d5721a5f46a7188a29)
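
A condensed sketch of the new decision rule (not the real KubernetesJobWatcher API, just the phase/event logic described above): a Pending worker pod that is DELETED is now reported as failed, so the task's own retry settings take over.

```
# Condensed sketch: a Pending worker pod that gets DELETED is reported as
# failed, so the task's own `retries` / `retry_delay` settings apply, instead
# of the pod being put back up for reschedule.
def resolve_state(phase: str, event_type: str):
    if phase == "Pending":
        return "failed" if event_type == "DELETED" else None  # None: just log and keep waiting
    if phase == "Failed":
        return "failed"
    return None  # Running and other phases are handled elsewhere


assert resolve_state("Pending", "DELETED") == "failed"
assert resolve_state("Pending", "MODIFIED") is None
```
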
---
 airflow/executors/kubernetes_executor.py    |  6 +-
 tests/executors/test_kubernetes_executor.py | 92 +++++++++++++++++++++++++++++
 2 files changed, 94 insertions(+), 4 deletions(-)

diff --git a/airflow/executors/kubernetes_executor.py b/airflow/executors/kubernetes_executor.py
index fd5c6fa..c42531a 100644
--- a/airflow/executors/kubernetes_executor.py
+++ b/airflow/executors/kubernetes_executor.py
@@ -194,10 +194,8 @@ class KubernetesJobWatcher(multiprocessing.Process, LoggingMixin):
         """Process status response"""
         if status == 'Pending':
             if event['type'] == 'DELETED':
-                self.log.info('Event: Failed to start pod %s, will reschedule', pod_id)
-                self.watcher_queue.put(
-                    (pod_id, namespace, State.UP_FOR_RESCHEDULE, annotations, resource_version)
-                )
+                self.log.info('Event: Failed to start pod %s', pod_id)
+                self.watcher_queue.put((pod_id, namespace, State.FAILED, annotations, resource_version))
             else:
                 self.log.info('Event: %s Pending', pod_id)
         elif status == 'Failed':
diff --git a/tests/executors/test_kubernetes_executor.py b/tests/executors/test_kubernetes_executor.py
index dc7cbbb..68b0006 100644
--- a/tests/executors/test_kubernetes_executor.py
+++ b/tests/executors/test_kubernetes_executor.py
@@ -34,6 +34,7 @@ try:
     from airflow.executors.kubernetes_executor import (
         AirflowKubernetesScheduler,
         KubernetesExecutor,
+        KubernetesJobWatcher,
         create_pod_id,
         get_base_pod_from_template,
     )
@@ -328,3 +329,94 @@ class TestKubernetesExecutor(unittest.TestCase):
         executor.adopt_launched_task(mock_kube_client, pod=pod, pod_ids=pod_ids)
         assert not mock_kube_client.patch_namespaced_pod.called
         assert pod_ids == {"foobar": {}}
+
+
+class TestKubernetesJobWatcher(unittest.TestCase):
+    def setUp(self):
+        self.watcher = KubernetesJobWatcher(
+            namespace="airflow",
+            multi_namespace_mode=False,
+            watcher_queue=mock.MagicMock(),
+            resource_version="0",
+            scheduler_job_id="123",
+            kube_config=mock.MagicMock(),
+        )
+        self.kube_client = mock.MagicMock()
+        self.core_annotations = {
+            "dag_id": "dag",
+            "task_id": "task",
+            "execution_date": "dt",
+            "try_number": "1",
+        }
+        self.pod = k8s.V1Pod(
+            metadata=k8s.V1ObjectMeta(
+                name="foo",
+                annotations={"airflow-worker": "bar", **self.core_annotations},
+                namespace="airflow",
+                resource_version="456",
+            ),
+            status=k8s.V1PodStatus(phase="Pending"),
+        )
+        self.events = []
+
+    def _run(self):
+        with mock.patch('airflow.executors.kubernetes_executor.watch') as mock_watch:
+            mock_watch.Watch.return_value.stream.return_value = self.events
+            latest_resource_version = self.watcher._run(
+                self.kube_client,
+                self.watcher.resource_version,
+                self.watcher.scheduler_job_id,
+                self.watcher.kube_config,
+            )
+            assert self.pod.metadata.resource_version == latest_resource_version
+
+    def assert_watcher_queue_called_once_with_state(self, state):
+        self.watcher.watcher_queue.put.assert_called_once_with(
+            (
+                self.pod.metadata.name,
+                self.watcher.namespace,
+                state,
+                self.core_annotations,
+                self.pod.metadata.resource_version,
+            )
+        )
+
+    def test_process_status_pending(self):
+        self.events.append({"type": 'MODIFIED', "object": self.pod})
+
+        self._run()
+        self.watcher.watcher_queue.put.assert_not_called()
+
+    def test_process_status_pending_deleted(self):
+        self.events.append({"type": 'DELETED', "object": self.pod})
+
+        self._run()
+        self.assert_watcher_queue_called_once_with_state(State.FAILED)
+
+    def test_process_status_failed(self):
+        self.pod.status.phase = "Failed"
+        self.events.append({"type": 'MODIFIED', "object": self.pod})
+
+        self._run()
+        self.assert_watcher_queue_called_once_with_state(State.FAILED)
+
+    def test_process_status_succeeded(self):
+        self.pod.status.phase = "Succeeded"
+        self.events.append({"type": 'MODIFIED', "object": self.pod})
+
+        self._run()
+        self.assert_watcher_queue_called_once_with_state(None)
+
+    def test_process_status_running(self):
+        self.pod.status.phase = "Running"
+        self.events.append({"type": 'MODIFIED', "object": self.pod})
+
+        self._run()
+        self.watcher.watcher_queue.put.assert_not_called()
+
+    def test_process_status_catchall(self):
+        self.pod.status.phase = "Unknown"
+        self.events.append({"type": 'MODIFIED', "object": self.pod})
+
+        self._run()
+        self.watcher.watcher_queue.put.assert_not_called()

[airflow] 31/42: Bump version to match node dependency (#14624)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b175785ad6556cb0e0a83e5180b80f75977958ea
Author: Ryan Hamilton <ry...@ryanahamilton.com>
AuthorDate: Fri Mar 5 08:35:43 2021 -0500

    Bump version to match node dependency (#14624)
    
    (cherry picked from commit 86a54b763dc458d5c24caba399dd07eac913ed9c)
---
 docs/conf.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/conf.py b/docs/conf.py
index 411796c..cc6dad8 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -533,4 +533,4 @@ if PACKAGE_NAME == 'apache-airflow':
     ]
 
     # Options for script updater
-    redoc_script_url = "https://cdn.jsdelivr.net/npm/redoc@2.0.0-rc.30/bundles/redoc.standalone.js"
+    redoc_script_url = "https://cdn.jsdelivr.net/npm/redoc@2.0.0-rc.48/bundles/redoc.standalone.js"

[airflow] 15/42: Make airflow dags show command display TaskGroup (#14269)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3f36fa9692cbcb13caca5b4f23c78f1967171e50
Author: yuqian90 <yu...@gmail.com>
AuthorDate: Thu Feb 25 23:23:15 2021 +0800

    Make airflow dags show command display TaskGroup (#14269)
    
    closes: #13053
    
    Make `airflow dags show` display TaskGroup.
    
    (cherry picked from commit c71f707d24a9196d33b91a7a2a9e3384698e5193)
---
 airflow/utils/dot_renderer.py    | 120 ++++++++++++++++++++++++++++++---------
 tests/utils/test_dot_renderer.py | 101 +++++++++++++++++++++++++++++++-
 2 files changed, 191 insertions(+), 30 deletions(-)

diff --git a/airflow/utils/dot_renderer.py b/airflow/utils/dot_renderer.py
index 990c7a7..4123f99 100644
--- a/airflow/utils/dot_renderer.py
+++ b/airflow/utils/dot_renderer.py
@@ -17,13 +17,17 @@
 # specific language governing permissions and limitations
 # under the License.
 """Renderer DAG (tasks and dependencies) to the graphviz object."""
-from typing import List, Optional
+from typing import Dict, List, Optional
 
 import graphviz
 
 from airflow.models import TaskInstance
+from airflow.models.baseoperator import BaseOperator
 from airflow.models.dag import DAG
+from airflow.models.taskmixin import TaskMixin
 from airflow.utils.state import State
+from airflow.utils.task_group import TaskGroup
+from airflow.www.views import dag_edges
 
 
 def _refine_color(color: str):
@@ -42,6 +46,88 @@ def _refine_color(color: str):
     return color
 
 
+def _draw_task(task: BaseOperator, parent_graph: graphviz.Digraph, states_by_task_id: Dict[str, str]) -> None:
+    """Draw a single task on the given parent_graph"""
+    if states_by_task_id:
+        state = states_by_task_id.get(task.task_id, State.NONE)
+        color = State.color_fg(state)
+        fill_color = State.color(state)
+    else:
+        color = task.ui_fgcolor
+        fill_color = task.ui_color
+
+    parent_graph.node(
+        task.task_id,
+        _attributes={
+            "label": task.label,
+            "shape": "rectangle",
+            "style": "filled,rounded",
+            "color": _refine_color(color),
+            "fillcolor": _refine_color(fill_color),
+        },
+    )
+
+
+def _draw_task_group(
+    task_group: TaskGroup, parent_graph: graphviz.Digraph, states_by_task_id: Dict[str, str]
+) -> None:
+    """Draw the given task_group and its children on the given parent_graph"""
+    # Draw joins
+    if task_group.upstream_group_ids or task_group.upstream_task_ids:
+        parent_graph.node(
+            task_group.upstream_join_id,
+            _attributes={
+                "label": "",
+                "shape": "circle",
+                "style": "filled,rounded",
+                "color": _refine_color(task_group.ui_fgcolor),
+                "fillcolor": _refine_color(task_group.ui_color),
+                "width": "0.2",
+                "height": "0.2",
+            },
+        )
+
+    if task_group.downstream_group_ids or task_group.downstream_task_ids:
+        parent_graph.node(
+            task_group.downstream_join_id,
+            _attributes={
+                "label": "",
+                "shape": "circle",
+                "style": "filled,rounded",
+                "color": _refine_color(task_group.ui_fgcolor),
+                "fillcolor": _refine_color(task_group.ui_color),
+                "width": "0.2",
+                "height": "0.2",
+            },
+        )
+
+    # Draw children
+    for child in sorted(task_group.children.values(), key=lambda t: t.label):
+        _draw_nodes(child, parent_graph, states_by_task_id)
+
+
+def _draw_nodes(node: TaskMixin, parent_graph: graphviz.Digraph, states_by_task_id: Dict[str, str]) -> None:
+    """Draw the node and its children on the given parent_graph recursively."""
+    if isinstance(node, BaseOperator):
+        _draw_task(node, parent_graph, states_by_task_id)
+    else:
+        # Draw TaskGroup
+        if node.is_root:
+            # No need to draw background for root TaskGroup.
+            _draw_task_group(node, parent_graph, states_by_task_id)
+        else:
+            with parent_graph.subgraph(name=f"cluster_{node.group_id}") as sub:
+                sub.attr(
+                    shape="rectangle",
+                    style="filled",
+                    color=_refine_color(node.ui_fgcolor),
+                    # Partially transparent CornflowerBlue
+                    fillcolor="#6495ed7f",
+                    label=node.label,
+                )
+                _draw_task_group(node, sub, states_by_task_id)
+
+
 def render_dag(dag: DAG, tis: Optional[List[TaskInstance]] = None) -> graphviz.Digraph:
     """
     Renders the DAG object to the DOT object.
@@ -66,30 +152,10 @@ def render_dag(dag: DAG, tis: Optional[List[TaskInstance]] = None) -> graphviz.D
     states_by_task_id = None
     if tis is not None:
         states_by_task_id = {ti.task_id: ti.state for ti in tis}
-    for task in dag.tasks:
-        node_attrs = {
-            "shape": "rectangle",
-            "style": "filled,rounded",
-        }
-        if states_by_task_id is None:
-            node_attrs.update(
-                {
-                    "color": _refine_color(task.ui_fgcolor),
-                    "fillcolor": _refine_color(task.ui_color),
-                }
-            )
-        else:
-            state = states_by_task_id.get(task.task_id, State.NONE)
-            node_attrs.update(
-                {
-                    "color": State.color_fg(state),
-                    "fillcolor": State.color(state),
-                }
-            )
-        dot.node(
-            task.task_id,
-            _attributes=node_attrs,
-        )
-        for downstream_task_id in task.downstream_task_ids:
-            dot.edge(task.task_id, downstream_task_id)
+
+    _draw_nodes(dag.task_group, dot, states_by_task_id)
+
+    for edge in dag_edges(dag):
+        dot.edge(edge["source_id"], edge["target_id"])
+
     return dot
diff --git a/tests/utils/test_dot_renderer.py b/tests/utils/test_dot_renderer.py
index b030623..ca3ea01 100644
--- a/tests/utils/test_dot_renderer.py
+++ b/tests/utils/test_dot_renderer.py
@@ -23,9 +23,11 @@ from unittest import mock
 from airflow.models import TaskInstance
 from airflow.models.dag import DAG
 from airflow.operators.bash import BashOperator
+from airflow.operators.dummy import DummyOperator
 from airflow.operators.python import PythonOperator
 from airflow.utils import dot_renderer
 from airflow.utils.state import State
+from airflow.utils.task_group import TaskGroup
 
 START_DATE = datetime.datetime.now()
 
@@ -72,9 +74,16 @@ class TestDotRenderer(unittest.TestCase):
         source = dot.source
         # Should render DAG title
         assert "label=DAG_ID" in source
-        assert 'first [color=black fillcolor=tan shape=rectangle style="filled,rounded"]' in source
-        assert 'second [color=white fillcolor=green shape=rectangle style="filled,rounded"]' in source
-        assert 'third [color=black fillcolor=lime shape=rectangle style="filled,rounded"]' in source
+        assert (
+            'first [color=black fillcolor=tan label=first shape=rectangle style="filled,rounded"]' in source
+        )
+        assert (
+            'second [color=white fillcolor=green label=second shape=rectangle style="filled,rounded"]'
+            in source
+        )
+        assert (
+            'third [color=black fillcolor=lime label=third shape=rectangle style="filled,rounded"]' in source
+        )
 
     def test_should_render_dag_orientation(self):
         orientation = "TB"
@@ -105,3 +114,89 @@ class TestDotRenderer(unittest.TestCase):
         # Should render DAG title with orientation
         assert "label=DAG_ID" in source
         assert f'label=DAG_ID labelloc=t rankdir={orientation}' in source
+
+    def test_render_task_group(self):
+        with DAG(dag_id="example_task_group", start_date=START_DATE) as dag:
+            start = DummyOperator(task_id="start")
+
+            with TaskGroup("section_1", tooltip="Tasks for section_1") as section_1:
+                task_1 = DummyOperator(task_id="task_1")
+                task_2 = BashOperator(task_id="task_2", bash_command='echo 1')
+                task_3 = DummyOperator(task_id="task_3")
+
+                task_1 >> [task_2, task_3]
+
+            with TaskGroup("section_2", tooltip="Tasks for section_2") as section_2:
+                task_1 = DummyOperator(task_id="task_1")
+
+                with TaskGroup("inner_section_2", tooltip="Tasks for inner_section2"):
+                    task_2 = BashOperator(task_id="task_2", bash_command='echo 1')
+                    task_3 = DummyOperator(task_id="task_3")
+                    task_4 = DummyOperator(task_id="task_4")
+
+                    [task_2, task_3] >> task_4
+
+            end = DummyOperator(task_id='end')
+
+            start >> section_1 >> section_2 >> end
+
+        dot = dot_renderer.render_dag(dag)
+
+        assert dot.source == '\n'.join(
+            [
+                'digraph example_task_group {',
+                '\tgraph [label=example_task_group labelloc=t rankdir=LR]',
+                '\tend [color="#000000" fillcolor="#e8f7e4" label=end shape=rectangle '
+                'style="filled,rounded"]',
+                '\tsubgraph cluster_section_1 {',
+                '\t\tcolor="#000000" fillcolor="#6495ed7f" label=section_1 shape=rectangle style=filled',
+                '\t\t"section_1.upstream_join_id" [color="#000000" fillcolor=CornflowerBlue height=0.2 '
+                'label="" shape=circle style="filled,rounded" width=0.2]',
+                '\t\t"section_1.downstream_join_id" [color="#000000" fillcolor=CornflowerBlue height=0.2 '
+                'label="" shape=circle style="filled,rounded" width=0.2]',
+                '\t\t"section_1.task_1" [color="#000000" fillcolor="#e8f7e4" label=task_1 shape=rectangle '
+                'style="filled,rounded"]',
+                '\t\t"section_1.task_2" [color="#000000" fillcolor="#f0ede4" label=task_2 shape=rectangle '
+                'style="filled,rounded"]',
+                '\t\t"section_1.task_3" [color="#000000" fillcolor="#e8f7e4" label=task_3 shape=rectangle '
+                'style="filled,rounded"]',
+                '\t}',
+                '\tsubgraph cluster_section_2 {',
+                '\t\tcolor="#000000" fillcolor="#6495ed7f" label=section_2 shape=rectangle style=filled',
+                '\t\t"section_2.upstream_join_id" [color="#000000" fillcolor=CornflowerBlue height=0.2 '
+                'label="" shape=circle style="filled,rounded" width=0.2]',
+                '\t\t"section_2.downstream_join_id" [color="#000000" fillcolor=CornflowerBlue height=0.2 '
+                'label="" shape=circle style="filled,rounded" width=0.2]',
+                '\t\tsubgraph "cluster_section_2.inner_section_2" {',
+                '\t\t\tcolor="#000000" fillcolor="#6495ed7f" label=inner_section_2 shape=rectangle '
+                'style=filled',
+                '\t\t\t"section_2.inner_section_2.task_2" [color="#000000" fillcolor="#f0ede4" label=task_2 '
+                'shape=rectangle style="filled,rounded"]',
+                '\t\t\t"section_2.inner_section_2.task_3" [color="#000000" fillcolor="#e8f7e4" label=task_3 '
+                'shape=rectangle style="filled,rounded"]',
+                '\t\t\t"section_2.inner_section_2.task_4" [color="#000000" fillcolor="#e8f7e4" label=task_4 '
+                'shape=rectangle style="filled,rounded"]',
+                '\t\t}',
+                '\t\t"section_2.task_1" [color="#000000" fillcolor="#e8f7e4" label=task_1 shape=rectangle '
+                'style="filled,rounded"]',
+                '\t}',
+                '\tstart [color="#000000" fillcolor="#e8f7e4" label=start shape=rectangle '
+                'style="filled,rounded"]',
+                '\t"section_1.downstream_join_id" -> "section_2.upstream_join_id"',
+                '\t"section_1.task_1" -> "section_1.task_2"',
+                '\t"section_1.task_1" -> "section_1.task_3"',
+                '\t"section_1.task_2" -> "section_1.downstream_join_id"',
+                '\t"section_1.task_3" -> "section_1.downstream_join_id"',
+                '\t"section_1.upstream_join_id" -> "section_1.task_1"',
+                '\t"section_2.downstream_join_id" -> end',
+                '\t"section_2.inner_section_2.task_2" -> "section_2.inner_section_2.task_4"',
+                '\t"section_2.inner_section_2.task_3" -> "section_2.inner_section_2.task_4"',
+                '\t"section_2.inner_section_2.task_4" -> "section_2.downstream_join_id"',
+                '\t"section_2.task_1" -> "section_2.downstream_join_id"',
+                '\t"section_2.upstream_join_id" -> "section_2.inner_section_2.task_2"',
+                '\t"section_2.upstream_join_id" -> "section_2.inner_section_2.task_3"',
+                '\t"section_2.upstream_join_id" -> "section_2.task_1"',
+                '\tstart -> "section_1.upstream_join_id"',
+                '}',
+            ]
+        )

[airflow] 14/42: Fix indentation in code block in Taskflow API doc (#14241)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 36ff9c5af1c8b5c985870e91276ee98e6b5ca199
Author: Kan Ouivirach <ka...@odds.team>
AuthorDate: Tue Feb 16 23:59:01 2021 +0700

    Fix indentation in code block in Taskflow API doc (#14241)
    
    (cherry picked from commit a68f0739e990514ce7b1f6510af738fec5f87b9b)
---
 docs/apache-airflow/tutorial_taskflow_api.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/apache-airflow/tutorial_taskflow_api.rst b/docs/apache-airflow/tutorial_taskflow_api.rst
index b089e03..ea52be1 100644
--- a/docs/apache-airflow/tutorial_taskflow_api.rst
+++ b/docs/apache-airflow/tutorial_taskflow_api.rst
@@ -188,7 +188,7 @@ Building this dependency is shown in the code below:
 .. code-block:: python
 
     @task()
-        def extract_from_file():
+    def extract_from_file():
         """
         #### Extract from file task
         A simple Extract task to get data ready for the rest of the data

[airflow] 35/42: Webserver: Allow Filtering TaskInstances by queued_dttm (#14708)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ecd3e3c79bf8ef5ed86bd8ce1a79a2ce71a29c0e
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Mar 12 08:17:49 2021 +0000

    Webserver: Allow Filtering TaskInstances by queued_dttm (#14708)
    
    We allow filtering TaskInstances in the Webserver by queued_dttm,
    similar to start_date and end_date.
    
    This helps in debugging issues more quickly than having to get access to the DB.
    
    (cherry picked from commit d2c2a2285c176ef232452e72a28e355667b8b50b)
---
 airflow/www/views.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index fbee413..ab31607 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -3572,6 +3572,7 @@ class TaskInstanceModelView(AirflowModelView):
         'operator',
         'start_date',
         'end_date',
+        'queued_dttm',
     ]
 
     edit_columns = [

[airflow] 13/42: Fix comparison dagTZ with localTZ (#14204)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 1858a940c4fed61c87baaac726db192844788e1f
Author: Alexander Millin <mi...@gmail.com>
AuthorDate: Mon Feb 15 18:35:09 2021 +0300

    Fix comparison dagTZ with localTZ (#14204)
    
    (cherry picked from commit fe0ee585d11474a0c99e51a4400dc16f643ea14b)
---
 airflow/www/static/js/task-instances.js | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/www/static/js/task-instances.js b/airflow/www/static/js/task-instances.js
index 1f71fa1..be9df24 100644
--- a/airflow/www/static/js/task-instances.js
+++ b/airflow/www/static/js/task-instances.js
@@ -35,7 +35,7 @@ function generateTooltipDateTimes(startDate, endDate, dagTZ) {
   }
 
   const tzFormat = 'z (Z)';
-  const localTZ = moment.defaultZone.name;
+  const localTZ = moment.defaultZone.name.toUpperCase();
   startDate = moment.utc(startDate);
   endDate = moment.utc(endDate);
   dagTZ = dagTZ.toUpperCase();

[airflow] 19/42: Fix crash when user clicks on "Task Instance Details" caused by start_date being None (#14416)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 040f7d85883d73c9e652c7e6ad0fca67f9445366
Author: yuqian90 <yu...@gmail.com>
AuthorDate: Thu Feb 25 23:06:45 2021 +0800

    Fix crash when user clicks on  "Task Instance Details" caused by start_date being None (#14416)
    
    This fixes an error that happens when a user clicks on "Task Instance Details" for a TaskInstance whose previous TaskInstance has not yet run, e.g. because:
    
    - the previous TaskInstance's dependencies are not yet met,
    - the scheduler is busy, or
    - the previous TaskInstance was marked success without running.
    
    This bug was caused by #12910. It affects Airflow 2.0.0 and 2.0.1.
    
    (cherry picked from commit 21f297425ae85ce89e21477d55b51d5560f47bf8)
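
A small standalone illustration (using a hypothetical stand-in class, not the real TaskInstance) of why the extra start_date check matters: the old expression passed None to pendulum.instance(), which raises, while the new one short-circuits.

```
import pendulum


class PrevTI:
    """Hypothetical stand-in for a previous TaskInstance that never started."""

    start_date = None


prev_ti = PrevTI()

# Old expression: `prev_ti and pendulum.instance(prev_ti.start_date)` would
# call pendulum.instance(None) here and raise, crashing "Task Instance Details".
# The fixed expression short-circuits on the missing start_date instead:
result = prev_ti and prev_ti.start_date and pendulum.instance(prev_ti.start_date)
print(result)  # None
```
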
---
 airflow/models/taskinstance.py    |  3 ++-
 tests/models/test_taskinstance.py | 33 +++++++++++++++++++++++++++++++++
 2 files changed, 35 insertions(+), 1 deletion(-)

diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index 3ceb5a3..ed7a0be 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -823,7 +823,8 @@ class TaskInstance(Base, LoggingMixin):  # pylint: disable=R0902,R0904
         """
         self.log.debug("previous_start_date was called")
         prev_ti = self.get_previous_ti(state=state, session=session)
-        return prev_ti and pendulum.instance(prev_ti.start_date)
+        # prev_ti may not exist and prev_ti.start_date may be None.
+        return prev_ti and prev_ti.start_date and pendulum.instance(prev_ti.start_date)
 
     @property
     def previous_start_date_success(self) -> Optional[pendulum.DateTime]:
diff --git a/tests/models/test_taskinstance.py b/tests/models/test_taskinstance.py
index cd99b02..b9ec2c8 100644
--- a/tests/models/test_taskinstance.py
+++ b/tests/models/test_taskinstance.py
@@ -1488,6 +1488,39 @@ class TestTaskInstance(unittest.TestCase):
         assert ti_list[3].get_previous_start_date(state=State.SUCCESS) == ti_list[1].start_date
         assert ti_list[3].get_previous_start_date(state=State.SUCCESS) != ti_list[2].start_date
 
+    def test_get_previous_start_date_none(self):
+        """
+        Test that get_previous_start_date() can handle TaskInstance with no start_date.
+        """
+        with DAG("test_get_previous_start_date_none", start_date=DEFAULT_DATE, schedule_interval=None) as dag:
+            task = DummyOperator(task_id="op")
+
+        day_1 = DEFAULT_DATE
+        day_2 = DEFAULT_DATE + datetime.timedelta(days=1)
+
+        # Create a DagRun for day_1 and day_2. Calling ti_2.get_previous_start_date()
+        # should return the start_date of ti_1 (which is None because ti_1 was not run).
+        # It should not raise an error.
+        dagrun_1 = dag.create_dagrun(
+            execution_date=day_1,
+            state=State.RUNNING,
+            run_type=DagRunType.MANUAL,
+        )
+
+        dagrun_2 = dag.create_dagrun(
+            execution_date=day_2,
+            state=State.RUNNING,
+            run_type=DagRunType.MANUAL,
+        )
+
+        ti_1 = dagrun_1.get_task_instance(task.task_id)
+        ti_2 = dagrun_2.get_task_instance(task.task_id)
+        ti_1.task = task
+        ti_2.task = task
+
+        assert ti_2.get_previous_start_date() == ti_1.start_date
+        assert ti_1.start_date is None
+
     def test_pendulum_template_dates(self):
         dag = models.DAG(
             dag_id='test_pendulum_template_dates',

[airflow] 08/42: Sync DB Migrations in Master with 2.0.1 (#14155)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 05326e22cfa9532601cb097417b8a8b8cd535ca8
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Tue Feb 9 21:01:46 2021 +0000

    Sync DB Migrations in Master with 2.0.1 (#14155)
    
    `449b4072c2da_increase_size_of_connection_extra_field_.py` does not exist in 2.0.1,
    
    so we need to update its down_revision
    
    ```
    INFO  [alembic.runtime.migration] Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection
    INFO  [alembic.runtime.migration] Running upgrade 61ec73d9401f -> 64a7d6477aae, fix description field in connection to be text
    INFO  [alembic.runtime.migration] Running upgrade 64a7d6477aae -> e959f08ac86c, Change field in DagCode to MEDIUMTEXT for MySql
    INFO  [alembic.runtime.migration] Running upgrade e959f08ac86c -> 82b7c48c147f, Remove can_read permission on config resource for User and Viewer role
    [2021-02-09 19:17:31,307] {migration.py:555} INFO - Running upgrade 82b7c48c147f -> 449b4072c2da, Increase size of connection.extra field to handle multiple RSA keys
    ```
    
    (cherry picked from commit e7f176d2dccbc18005db01f97952757c624ef233)
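
For reference, a minimal sketch of the Alembic convention involved (the values mirror the diff below): each migration module names its parent via down_revision, so re-ordering a revision in the chain amounts to re-pointing that field.

```
# Alembic orders migrations by following `down_revision` pointers; each
# migration module declares its own id and its parent's id.
revision = '449b4072c2da'
down_revision = '82b7c48c147f'  # previously 'e959f08ac86c'; re-pointed to match 2.0.1
branch_labels = None
depends_on = None
```
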
---
 .../versions/449b4072c2da_increase_size_of_connection_extra_field_.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py b/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py
index d3d9432..808d435 100644
--- a/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py
+++ b/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py
@@ -19,7 +19,7 @@
 """Increase size of connection.extra field to handle multiple RSA keys
 
 Revision ID: 449b4072c2da
-Revises: e959f08ac86c
+Revises: 82b7c48c147f
 Create Date: 2020-03-16 19:02:55.337710
 
 """
@@ -29,7 +29,7 @@ from alembic import op
 
 # revision identifiers, used by Alembic.
 revision = '449b4072c2da'
-down_revision = 'e959f08ac86c'
+down_revision = '82b7c48c147f'
 branch_labels = None
 depends_on = None
 

[airflow] 03/42: [AIRFLOW-6076] fix dag.cli() KeyError (#13647)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b583736a1ee909e5c2d5255ae04870e3ad434c06
Author: penggongkui <49...@users.noreply.github.com>
AuthorDate: Thu Mar 18 19:24:11 2021 +0800

    [AIRFLOW-6076] fix dag.cli() KeyError (#13647)
    
    (cherry picked from commit b24a1babd4271d74ba13b1dc0a9cf8da001b3f77)
---
 airflow/cli/cli_parser.py    | 33 +++++++++++++++++++++++++++++----
 tests/cli/test_cli_parser.py | 26 ++++++++++++++++++++++++++
 2 files changed, 55 insertions(+), 4 deletions(-)

diff --git a/airflow/cli/cli_parser.py b/airflow/cli/cli_parser.py
index b198234..fae516e 100644
--- a/airflow/cli/cli_parser.py
+++ b/airflow/cli/cli_parser.py
@@ -24,7 +24,7 @@ import os
 import textwrap
 from argparse import Action, ArgumentError, RawTextHelpFormatter
 from functools import lru_cache
-from typing import Callable, Dict, Iterable, List, NamedTuple, Optional, Set, Union
+from typing import Callable, Dict, Iterable, List, NamedTuple, Optional, Union
 
 from airflow import settings
 from airflow.cli.commands.legacy_commands import check_legacy_command
@@ -1511,7 +1511,31 @@ airflow_commands: List[CLICommand] = [
     ),
 ]
 ALL_COMMANDS_DICT: Dict[str, CLICommand] = {sp.name: sp for sp in airflow_commands}
-DAG_CLI_COMMANDS: Set[str] = {'list_tasks', 'backfill', 'test', 'run', 'pause', 'unpause', 'list_dag_runs'}
+
+
+def _remove_dag_id_opt(command: ActionCommand):
+    cmd = command._asdict()
+    cmd['args'] = (arg for arg in command.args if arg is not ARG_DAG_ID)
+    return ActionCommand(**cmd)
+
+
+dag_cli_commands: List[CLICommand] = [
+    GroupCommand(
+        name='dags',
+        help='Manage DAGs',
+        subcommands=[
+            _remove_dag_id_opt(sp)
+            for sp in DAGS_COMMANDS
+            if sp.name in ['backfill', 'list-runs', 'pause', 'unpause']
+        ],
+    ),
+    GroupCommand(
+        name='tasks',
+        help='Manage tasks',
+        subcommands=[_remove_dag_id_opt(sp) for sp in TASKS_COMMANDS if sp.name in ['list', 'test', 'run']],
+    ),
+]
+DAG_CLI_DICT: Dict[str, CLICommand] = {sp.name: sp for sp in dag_cli_commands}
 
 
 class AirflowHelpFormatter(argparse.HelpFormatter):
@@ -1563,10 +1587,11 @@ def get_parser(dag_parser: bool = False) -> argparse.ArgumentParser:
     subparsers = parser.add_subparsers(dest='subcommand', metavar="GROUP_OR_COMMAND")
     subparsers.required = True
 
-    subparser_list = DAG_CLI_COMMANDS if dag_parser else ALL_COMMANDS_DICT.keys()
+    command_dict = DAG_CLI_DICT if dag_parser else ALL_COMMANDS_DICT
+    subparser_list = command_dict.keys()
     sub_name: str
     for sub_name in sorted(subparser_list):
-        sub: CLICommand = ALL_COMMANDS_DICT[sub_name]
+        sub: CLICommand = command_dict[sub_name]
         _add_command(subparsers, sub)
     return parser
 
diff --git a/tests/cli/test_cli_parser.py b/tests/cli/test_cli_parser.py
index 1c2e2aa..0ba5dff 100644
--- a/tests/cli/test_cli_parser.py
+++ b/tests/cli/test_cli_parser.py
@@ -144,6 +144,16 @@ class TestCli(TestCase):
         assert "Commands" in stdout
         assert "Groups" in stdout
 
+    def test_dag_parser_commands_and_comamnd_group_sections(self):
+        parser = cli_parser.get_parser(dag_parser=True)
+
+        with contextlib.redirect_stdout(io.StringIO()) as stdout:
+            with self.assertRaises(SystemExit):
+                parser.parse_args(['--help'])
+            stdout = stdout.getvalue()
+        self.assertIn("Commands", stdout)
+        self.assertIn("Groups", stdout)
+
     def test_should_display_help(self):
         parser = cli_parser.get_parser()
 
@@ -160,6 +170,22 @@ class TestCli(TestCase):
             with pytest.raises(SystemExit):
                 parser.parse_args([*cmd_args, '--help'])
 
+    def test_dag_cli_should_display_help(self):
+        parser = cli_parser.get_parser(dag_parser=True)
+
+        all_command_as_args = [
+            command_as_args
+            for top_command in cli_parser.dag_cli_commands
+            for command_as_args in (
+                [[top_command.name]]
+                if isinstance(top_command, cli_parser.ActionCommand)
+                else [[top_command.name, nested_command.name] for nested_command in top_command.subcommands]
+            )
+        ]
+        for cmd_args in all_command_as_args:
+            with self.assertRaises(SystemExit):
+                parser.parse_args([*cmd_args, '--help'])
+
     def test_positive_int(self):
         assert 1 == cli_parser.positive_int('1')
 

[airflow] 02/42: Fix running child tasks in a subdag after clearing a successful subdag (#14776)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f4cc5c50f350801f43fc17152605d45cc169b452
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Thu Mar 18 11:38:52 2021 +0100

    Fix running child tasks in a subdag after clearing a successful subdag (#14776)
    
    After successfully running a SubDAG, clearing it
    (including downstream+recursive) doesn't trigger the inner tasks.
    Instead, the subdag is marked successful and the inner tasks all
    stay cleared and aren't re-run.
    
    The above problem occurs because the DagRun state of the subdags is not updated
    after clearing. This PR solves it by updating the DagRun state of all DAGs,
    including subdags, when include_subdags is True.
    
    (cherry picked from commit 052163516bf91ab7bb53f4ec3c7b5621df515820)
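
A self-contained sketch of the underlying query change (using a hypothetical in-memory model rather than Airflow's DagRun, and assuming SQLAlchemy 1.3/1.4 as pinned in this release): updating by a list of dag_ids with synchronize_session='fetch' resets the parent and subdag runs in one statement.

```
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()


class Run(Base):
    """Hypothetical stand-in for DagRun, reduced to dag_id and state."""

    __tablename__ = 'run'
    id = Column(Integer, primary_key=True)
    dag_id = Column(String)
    state = Column(String)


engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Run(dag_id='parent', state='success'), Run(dag_id='parent.sub', state='success')])
session.commit()

# The fix's idea: update every dag_id collected while clearing (parent plus
# subdags), not just the parent's, so the subdag runs are reset too.
dag_ids = ['parent', 'parent.sub']
session.query(Run).filter(Run.dag_id.in_(dag_ids)).update(
    {Run.state: 'running'}, synchronize_session='fetch'
)
print([(r.dag_id, r.state) for r in session.query(Run)])  # both rows now 'running'
```
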
---
 airflow/models/dag.py    | 10 +++++++--
 tests/models/test_dag.py | 55 ++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 63 insertions(+), 2 deletions(-)

diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 8bb32db..d77cdfc 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1116,13 +1116,15 @@ class DAG(LoggingMixin):
         session: Session = None,
         start_date: Optional[datetime] = None,
         end_date: Optional[datetime] = None,
+        dag_ids: List[str] = None,
     ) -> None:
-        query = session.query(DagRun).filter_by(dag_id=self.dag_id)
+        dag_ids = dag_ids or [self.dag_id]
+        query = session.query(DagRun).filter(DagRun.dag_id.in_(dag_ids))
         if start_date:
             query = query.filter(DagRun.execution_date >= start_date)
         if end_date:
             query = query.filter(DagRun.execution_date <= end_date)
-        query.update({DagRun.state: state})
+        query.update({DagRun.state: state}, synchronize_session='fetch')
 
     @provide_session
     def clear(
@@ -1183,11 +1185,13 @@ class DAG(LoggingMixin):
         """
         TI = TaskInstance
         tis = session.query(TI)
+        dag_ids = []
         if include_subdags:
             # Crafting the right filter for dag_id and task_ids combo
             conditions = []
             for dag in self.subdags + [self]:
                 conditions.append((TI.dag_id == dag.dag_id) & TI.task_id.in_(dag.task_ids))
+                dag_ids.append(dag.dag_id)
             tis = tis.filter(or_(*conditions))
         else:
             tis = session.query(TI).filter(TI.dag_id == self.dag_id)
@@ -1327,11 +1331,13 @@ class DAG(LoggingMixin):
                 dag=self,
                 activate_dag_runs=False,  # We will set DagRun state later.
             )
+
             self.set_dag_runs_state(
                 session=session,
                 start_date=start_date,
                 end_date=end_date,
                 state=dag_run_state,
+                dag_ids=dag_ids,
             )
         else:
             count = 0
diff --git a/tests/models/test_dag.py b/tests/models/test_dag.py
index 60171d8..c923241 100644
--- a/tests/models/test_dag.py
+++ b/tests/models/test_dag.py
@@ -1299,6 +1299,61 @@ class TestDag(unittest.TestCase):
         assert dagrun.state == dag_run_state
 
     @parameterized.expand(
+        [
+            (State.NONE,),
+            (State.RUNNING,),
+        ]
+    )
+    def test_clear_set_dagrun_state_for_subdag(self, dag_run_state):
+        dag_id = 'test_clear_set_dagrun_state_subdag'
+        self._clean_up(dag_id)
+        task_id = 't1'
+        dag = DAG(dag_id, start_date=DEFAULT_DATE, max_active_runs=1)
+        t_1 = DummyOperator(task_id=task_id, dag=dag)
+        subdag = DAG(dag_id + '.test', start_date=DEFAULT_DATE, max_active_runs=1)
+        SubDagOperator(task_id='test', subdag=subdag, dag=dag)
+        t_2 = DummyOperator(task_id='task', dag=subdag)
+
+        session = settings.Session()
+        dagrun_1 = dag.create_dagrun(
+            run_type=DagRunType.BACKFILL_JOB,
+            state=State.FAILED,
+            start_date=DEFAULT_DATE,
+            execution_date=DEFAULT_DATE,
+        )
+        dagrun_2 = subdag.create_dagrun(
+            run_type=DagRunType.BACKFILL_JOB,
+            state=State.FAILED,
+            start_date=DEFAULT_DATE,
+            execution_date=DEFAULT_DATE,
+        )
+        session.merge(dagrun_1)
+        session.merge(dagrun_2)
+        task_instance_1 = TI(t_1, execution_date=DEFAULT_DATE, state=State.RUNNING)
+        task_instance_2 = TI(t_2, execution_date=DEFAULT_DATE, state=State.RUNNING)
+        session.merge(task_instance_1)
+        session.merge(task_instance_2)
+        session.commit()
+
+        dag.clear(
+            start_date=DEFAULT_DATE,
+            end_date=DEFAULT_DATE + datetime.timedelta(days=1),
+            dag_run_state=dag_run_state,
+            include_subdags=True,
+            include_parentdag=False,
+            session=session,
+        )
+
+        dagrun = (
+            session.query(
+                DagRun,
+            )
+            .filter(DagRun.dag_id == subdag.dag_id)
+            .one()
+        )
+        assert dagrun.state == dag_run_state
+
+    @parameterized.expand(
         [(state, State.NONE) for state in State.task_states if state != State.RUNNING]
         + [(State.RUNNING, State.SHUTDOWN)]
     )  # type: ignore

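A hedged usage sketch of the behaviour this change restores: clearing a parent DAG with `include_subdags=True` now also resets the subdag's DagRun, so the inner tasks are re-run. The DAG id and dates below are illustrative, not taken from the commit.

```python
# Illustrative only: dag ids and dates are made up; State.NONE mirrors the
# dag_run_state exercised by the new test above.
from datetime import datetime

from airflow.models import DagBag
from airflow.utils.state import State

dagbag = DagBag()                       # parses the local dags folder
dag = dagbag.get_dag("parent_dag")      # assumed to contain a SubDagOperator
dag.clear(
    start_date=datetime(2021, 1, 1),
    end_date=datetime(2021, 1, 2),
    include_subdags=True,               # subdag DagRuns are now reset too
    dag_run_state=State.NONE,
)
```
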
[airflow] 22/42: Fix statsd metrics not sending when using daemon mode (#14454)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 99f10224f0b468ebf08df3ef2f8d0fa0538169f4
Author: Jun <Ju...@users.noreply.github.com>
AuthorDate: Fri Feb 26 22:45:56 2021 +0800

    Fix statsd metrics not sending when using daemon mode (#14454)
    
    It seems that the daemonContext closes the statsd socket.
    
    ```
        return self.statsd.incr(stat, count, rate)
      File "/usr/local/lib/python3.8/site-packages/statsd/client/base.py", line 35, in incr
        self._send_stat(stat, '%s|c' % count, rate)
      File "/usr/local/lib/python3.8/site-packages/statsd/client/base.py", line 59, in _send_stat
        self._after(self._prepare(stat, value, rate))
      File "/usr/local/lib/python3.8/site-packages/statsd/client/base.py", line 74, in _after
        self._send(data)
      File "/opt/airflow/airflow/stats.py", line 40, in _send
        self._sock.sendto(data.encode('ascii'), self._addr)
    OSError: [Errno 9] Bad file descriptor
    ```
    
    (cherry picked from commit 0aa597e2ffd71d3587f629c0a1cb3d904e07b6e6)
---
 airflow/stats.py         | 27 +++++++++++++++------------
 tests/core/test_stats.py |  1 +
 2 files changed, 16 insertions(+), 12 deletions(-)

diff --git a/airflow/stats.py b/airflow/stats.py
index 7472647..8207220 100644
--- a/airflow/stats.py
+++ b/airflow/stats.py
@@ -348,25 +348,28 @@ class SafeDogStatsdLogger:
 
 
 class _Stats(type):
+    factory = None
     instance: Optional[StatsLogger] = None
 
     def __getattr__(cls, name):
+        if not cls.instance:
+            try:
+                cls.instance = cls.factory()
+            except (socket.gaierror, ImportError) as e:
+                log.error("Could not configure StatsClient: %s, using DummyStatsLogger instead.", e)
+                cls.instance = DummyStatsLogger()
         return getattr(cls.instance, name)
 
     def __init__(cls, *args, **kwargs):
         super().__init__(cls)
-        if cls.__class__.instance is None:
-            try:
-                is_datadog_enabled_defined = conf.has_option('metrics', 'statsd_datadog_enabled')
-                if is_datadog_enabled_defined and conf.getboolean('metrics', 'statsd_datadog_enabled'):
-                    cls.__class__.instance = cls.get_dogstatsd_logger()
-                elif conf.getboolean('metrics', 'statsd_on'):
-                    cls.__class__.instance = cls.get_statsd_logger()
-                else:
-                    cls.__class__.instance = DummyStatsLogger()
-            except (socket.gaierror, ImportError) as e:
-                log.error("Could not configure StatsClient: %s, using DummyStatsLogger instead.", e)
-                cls.__class__.instance = DummyStatsLogger()
+        if cls.__class__.factory is None:
+            is_datadog_enabled_defined = conf.has_option('metrics', 'statsd_datadog_enabled')
+            if is_datadog_enabled_defined and conf.getboolean('metrics', 'statsd_datadog_enabled'):
+                cls.__class__.factory = cls.get_dogstatsd_logger
+            elif conf.getboolean('metrics', 'statsd_on'):
+                cls.__class__.factory = cls.get_statsd_logger
+            else:
+                cls.__class__.factory = DummyStatsLogger
 
     @classmethod
     def get_statsd_logger(cls):
diff --git a/tests/core/test_stats.py b/tests/core/test_stats.py
index 428192b..b635e62 100644
--- a/tests/core/test_stats.py
+++ b/tests/core/test_stats.py
@@ -136,6 +136,7 @@ class TestStats(unittest.TestCase):
             ),
         ):
             importlib.reload(airflow.stats)
+            airflow.stats.Stats.incr("dummy_key")
 
     def tearDown(self) -> None:
         # To avoid side-effect

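The essence of the fix above is lazy construction: at import time the metaclass only remembers which factory to use, and the real client is built on first use, so the UDP socket is opened after any daemonization. A generic sketch of that pattern follows; it is not Airflow's exact code, and the statsd host/port are assumptions.

```python
import statsd  # assumed installed; StatsClient(host, port) is its documented constructor


class _LazyStats(type):
    factory = None
    instance = None

    def __getattr__(cls, name):
        # Build the client on first metric call, i.e. after any fork/daemonize.
        if cls.instance is None:
            cls.instance = cls.factory()
        return getattr(cls.instance, name)


class Stats(metaclass=_LazyStats):
    pass


# Import/configuration time: only choose the factory, do not open a socket yet.
Stats.factory = lambda: statsd.StatsClient(host="localhost", port=8125)

# First use (inside the daemonized process): the client and its socket are created here.
Stats.incr("scheduler.heartbeat")
```
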
[airflow] 24/42: Bugfix: Fix wrong output of tags and owners in dag detail API endpoint (#14490)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5dd51dc903933457e3f2978c22d8e0f98eb24ff1
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Wed Mar 3 15:39:02 2021 +0100

    Bugfix: Fix wrong output of tags and owners in dag detail API endpoint (#14490)
    
    * fix wrong output of tags and owners in dag detail endpoint
    
    * fixup! fix wrong output of tags and owners in dag detail endpoint
    
    * fixup! fixup! fix wrong output of tags and owners in dag detail endpoint
    
    (cherry picked from commit 4424d10f05fa268b54c81ef8b96a0745643690b6)
---
 airflow/api_connexion/schemas/dag_schema.py        | 36 ++++++++++++++++------
 tests/api_connexion/endpoints/test_dag_endpoint.py | 28 ++++++++++-------
 tests/api_connexion/schemas/test_dag_schema.py     |  3 +-
 3 files changed, 46 insertions(+), 21 deletions(-)

diff --git a/airflow/api_connexion/schemas/dag_schema.py b/airflow/api_connexion/schemas/dag_schema.py
index b15fbd6..aabd215 100644
--- a/airflow/api_connexion/schemas/dag_schema.py
+++ b/airflow/api_connexion/schemas/dag_schema.py
@@ -21,6 +21,7 @@ from itsdangerous import URLSafeSerializer
 from marshmallow import Schema, fields
 from marshmallow_sqlalchemy import SQLAlchemySchema, auto_field
 
+from airflow import DAG
 from airflow.api_connexion.schemas.common_schema import ScheduleIntervalSchema, TimeDeltaSchema, TimezoneField
 from airflow.configuration import conf
 from airflow.models.dag import DagModel, DagTag
@@ -73,15 +74,32 @@ class DAGSchema(SQLAlchemySchema):
 class DAGDetailSchema(DAGSchema):
     """DAG details"""
 
-    timezone = TimezoneField(dump_only=True)
-    catchup = fields.Boolean(dump_only=True)
-    orientation = fields.String(dump_only=True)
-    concurrency = fields.Integer(dump_only=True)
-    start_date = fields.DateTime(dump_only=True)
-    dag_run_timeout = fields.Nested(TimeDeltaSchema, dump_only=True, attribute="dagrun_timeout")
-    doc_md = fields.String(dump_only=True)
-    default_view = fields.String(dump_only=True)
-    params = fields.Dict(dump_only=True)
+    owners = fields.Method("get_owners", dump_only=True)
+    timezone = TimezoneField()
+    catchup = fields.Boolean()
+    orientation = fields.String()
+    concurrency = fields.Integer()
+    start_date = fields.DateTime()
+    dag_run_timeout = fields.Nested(TimeDeltaSchema, attribute="dagrun_timeout")
+    doc_md = fields.String()
+    default_view = fields.String()
+    params = fields.Dict()
+    tags = fields.Method("get_tags", dump_only=True)
+
+    @staticmethod
+    def get_tags(obj: DAG):
+        """Dumps tags as objects"""
+        tags = obj.tags
+        if tags:
+            return [DagTagSchema().dump(dict(name=tag)) for tag in tags]
+        return []
+
+    @staticmethod
+    def get_owners(obj: DAG):
+        """Convert owners attribute to DAG representation"""
+        if not getattr(obj, 'owner', None):
+            return []
+        return obj.owner.split(",")
 
 
 class DAGCollection(NamedTuple):
diff --git a/tests/api_connexion/endpoints/test_dag_endpoint.py b/tests/api_connexion/endpoints/test_dag_endpoint.py
index 146720c..6041b5f 100644
--- a/tests/api_connexion/endpoints/test_dag_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_endpoint.py
@@ -75,7 +75,13 @@ class TestDagEndpoint(unittest.TestCase):
             access_control={'TestGranularDag': [permissions.ACTION_CAN_EDIT, permissions.ACTION_CAN_READ]},
         )
 
-        with DAG(cls.dag_id, start_date=datetime(2020, 6, 15), doc_md="details", params={"foo": 1}) as dag:
+        with DAG(
+            cls.dag_id,
+            start_date=datetime(2020, 6, 15),
+            doc_md="details",
+            params={"foo": 1},
+            tags=['example'],
+        ) as dag:
             DummyOperator(task_id=cls.task_id)
 
         with DAG(cls.dag2_id, start_date=datetime(2020, 6, 15)) as dag2:  # no doc_md
@@ -212,7 +218,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "is_paused": None,
             "is_subdag": False,
             "orientation": "LR",
-            "owners": [],
+            "owners": ['airflow'],
             "params": {"foo": 1},
             "schedule_interval": {
                 "__type": "TimeDelta",
@@ -221,7 +227,7 @@ class TestGetDagDetails(TestDagEndpoint):
                 "seconds": 0,
             },
             "start_date": "2020-06-15T00:00:00+00:00",
-            "tags": None,
+            "tags": [{'name': 'example'}],
             "timezone": "Timezone('UTC')",
         }
         assert response.json == expected
@@ -244,7 +250,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "is_paused": None,
             "is_subdag": False,
             "orientation": "LR",
-            "owners": [],
+            "owners": ['airflow'],
             "params": {},
             "schedule_interval": {
                 "__type": "TimeDelta",
@@ -253,7 +259,7 @@ class TestGetDagDetails(TestDagEndpoint):
                 "seconds": 0,
             },
             "start_date": "2020-06-15T00:00:00+00:00",
-            "tags": None,
+            "tags": [],
             "timezone": "Timezone('UTC')",
         }
         assert response.json == expected
@@ -276,7 +282,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "is_paused": None,
             "is_subdag": False,
             "orientation": "LR",
-            "owners": [],
+            "owners": ['airflow'],
             "params": {},
             "schedule_interval": {
                 "__type": "TimeDelta",
@@ -285,7 +291,7 @@ class TestGetDagDetails(TestDagEndpoint):
                 "seconds": 0,
             },
             "start_date": None,
-            "tags": None,
+            "tags": [],
             "timezone": "Timezone('UTC')",
         }
         assert response.json == expected
@@ -313,7 +319,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "is_paused": None,
             "is_subdag": False,
             "orientation": "LR",
-            "owners": [],
+            "owners": ['airflow'],
             "params": {"foo": 1},
             "schedule_interval": {
                 "__type": "TimeDelta",
@@ -322,7 +328,7 @@ class TestGetDagDetails(TestDagEndpoint):
                 "seconds": 0,
             },
             "start_date": "2020-06-15T00:00:00+00:00",
-            "tags": None,
+            "tags": [{'name': 'example'}],
             "timezone": "Timezone('UTC')",
         }
         response = client.get(
@@ -349,11 +355,11 @@ class TestGetDagDetails(TestDagEndpoint):
             'is_paused': None,
             'is_subdag': False,
             'orientation': 'LR',
-            'owners': [],
+            'owners': ['airflow'],
             "params": {"foo": 1},
             'schedule_interval': {'__type': 'TimeDelta', 'days': 1, 'microseconds': 0, 'seconds': 0},
             'start_date': '2020-06-15T00:00:00+00:00',
-            'tags': None,
+            'tags': [{'name': 'example'}],
             'timezone': "Timezone('UTC')",
         }
         assert response.json == expected
diff --git a/tests/api_connexion/schemas/test_dag_schema.py b/tests/api_connexion/schemas/test_dag_schema.py
index 2811bf3..96aba1f 100644
--- a/tests/api_connexion/schemas/test_dag_schema.py
+++ b/tests/api_connexion/schemas/test_dag_schema.py
@@ -107,6 +107,7 @@ class TestDAGDetailSchema:
             orientation="LR",
             default_view="duration",
             params={"foo": 1},
+            tags=['example1', 'example2'],
         )
         schema = DAGDetailSchema()
         expected = {
@@ -126,7 +127,7 @@ class TestDAGDetailSchema:
             'params': {'foo': 1},
             'schedule_interval': {'__type': 'TimeDelta', 'days': 1, 'seconds': 0, 'microseconds': 0},
             'start_date': '2020-06-19T00:00:00+00:00',
-            'tags': None,
+            'tags': [{'name': "example1"}, {'name': "example2"}],
             'timezone': "Timezone('UTC')",
         }
         assert schema.dump(dag) == expected

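The schema change relies on marshmallow's `fields.Method` to shape the output at dump time. A self-contained sketch of that mechanism, using plain dicts instead of Airflow's DAG objects, might look like this:

```python
from marshmallow import Schema, fields


class TagSchema(Schema):
    name = fields.String()


class DagDetailSketchSchema(Schema):
    owners = fields.Method("get_owners", dump_only=True)
    tags = fields.Method("get_tags", dump_only=True)

    @staticmethod
    def get_tags(obj):
        # Turn a list of tag names into a list of {"name": ...} objects.
        return [TagSchema().dump({"name": tag}) for tag in (obj.get("tags") or [])]

    @staticmethod
    def get_owners(obj):
        # Split a comma-separated owner string into a list.
        owner = obj.get("owner")
        return owner.split(",") if owner else []


result = DagDetailSketchSchema().dump({"tags": ["example"], "owner": "airflow"})
# result == {'owners': ['airflow'], 'tags': [{'name': 'example'}]}  (key order may vary)
```
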
[airflow] 09/42: Log migrations info in consistent way (#14158)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 76be86e12a435d68eb7fbe4d69ab5e8a3be40cb7
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Feb 10 17:22:52 2021 +0000

    Log migrations info in consistent way (#14158)
    
    Same as #13458, but for the `82b7c48c147f_remove_can_read_permission_on_config_.py` migration.

    This migration changes the logging handlers, so each subsequent migration
    is formatted differently when running airflow db reset. This commit fixes
    that behavior.
    
    (cherry picked from commit e423e3873f6752fd27f4361e56afca3b12f82d68)
---
 .../versions/82b7c48c147f_remove_can_read_permission_on_config_.py  | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/airflow/migrations/versions/82b7c48c147f_remove_can_read_permission_on_config_.py b/airflow/migrations/versions/82b7c48c147f_remove_can_read_permission_on_config_.py
index 5e85ee4..85d0872 100644
--- a/airflow/migrations/versions/82b7c48c147f_remove_can_read_permission_on_config_.py
+++ b/airflow/migrations/versions/82b7c48c147f_remove_can_read_permission_on_config_.py
@@ -23,6 +23,7 @@ Revises: e959f08ac86c
 Create Date: 2021-02-04 12:45:58.138224
 
 """
+import logging
 
 from airflow.security import permissions
 from airflow.www.app import create_app
@@ -36,6 +37,9 @@ depends_on = None
 
 def upgrade():
     """Remove can_read permission on config resource for User and Viewer role"""
+    log = logging.getLogger()
+    handlers = log.handlers[:]
+
     appbuilder = create_app(config={'FAB_UPDATE_PERMS': False}).appbuilder
     roles_to_modify = [role for role in appbuilder.sm.get_all_roles() if role.name in ["User", "Viewer"]]
     can_read_on_config_perm = appbuilder.sm.find_permission_view_menu(
@@ -48,6 +52,8 @@ def upgrade():
         ):
             appbuilder.sm.del_permission_role(role, can_read_on_config_perm)
 
+    log.handlers = handlers
+
 
 def downgrade():
     """Add can_read permission on config resource for User and Viewer role"""

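The migration above works by snapshotting the root logger's handlers before `create_app()` and restoring them afterwards. The same idea can be written as a small, generic context manager; this is a sketch, not part of the commit.

```python
import contextlib
import logging


@contextlib.contextmanager
def preserve_root_handlers():
    """Restore the root logger's handlers after code that reconfigures logging."""
    root = logging.getLogger()
    saved = root.handlers[:]
    try:
        yield
    finally:
        root.handlers = saved


with preserve_root_handlers():
    # Stand-in for create_app(); any handler swapping here is undone on exit.
    logging.getLogger().handlers = [logging.NullHandler()]
```
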
[airflow] 04/42: Add more flexibility with FAB menu links (#13903)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 04ae0f6c217b5adef083c17217d2d6b21ae3bb75
Author: Jed Cunningham <66...@users.noreply.github.com>
AuthorDate: Wed Feb 17 04:49:04 2021 -0700

    Add more flexibility with FAB menu links (#13903)
    
    Airflow can be more flexible about the links plugins are allowed to add. Currently, you cannot add a top-level link, add a link with a label, or even omit category_icon (which isn't used anyway).
    
    This PR gives plugin authors the flexibility to add any link FAB supports.
    
    (cherry picked from commit ef0c17baa7ce04f7d8adfca5911f1b85b1a9857a)
---
 airflow/www/extensions/init_views.py  |  9 ++-------
 docs/apache-airflow/plugins.rst       | 19 ++++++++++++-------
 tests/plugins/test_plugin.py          | 12 ++++++++----
 tests/plugins/test_plugins_manager.py | 27 +++++++++++++++++++--------
 tests/www/test_views.py               |  2 +-
 5 files changed, 42 insertions(+), 27 deletions(-)

diff --git a/airflow/www/extensions/init_views.py b/airflow/www/extensions/init_views.py
index f9736e6..0dcf8c7 100644
--- a/airflow/www/extensions/init_views.py
+++ b/airflow/www/extensions/init_views.py
@@ -121,13 +121,8 @@ def init_plugins(app):
             appbuilder.add_view_no_menu(view["view"])
 
     for menu_link in sorted(plugins_manager.flask_appbuilder_menu_links, key=lambda x: x["name"]):
-        log.debug("Adding menu link %s", menu_link["name"])
-        appbuilder.add_link(
-            menu_link["name"],
-            href=menu_link["href"],
-            category=menu_link["category"],
-            category_icon=menu_link["category_icon"],
-        )
+        log.debug("Adding menu link %s to %s", menu_link["name"], menu_link["href"])
+        appbuilder.add_link(**menu_link)
 
     for blue_print in plugins_manager.flask_blueprints:
         log.debug("Adding blueprint %s:%s", blue_print["name"], blue_print["blueprint"].import_name)
diff --git a/docs/apache-airflow/plugins.rst b/docs/apache-airflow/plugins.rst
index 80708b9..687270a 100644
--- a/docs/apache-airflow/plugins.rst
+++ b/docs/apache-airflow/plugins.rst
@@ -115,7 +115,7 @@ looks like:
         flask_blueprints = []
         # A list of dictionaries containing FlaskAppBuilder BaseView object and some metadata. See example below
         appbuilder_views = []
-        # A list of dictionaries containing FlaskAppBuilder BaseView object and some metadata. See example below
+        # A list of dictionaries containing kwargs for FlaskAppBuilder add_link. See example below
         appbuilder_menu_items = []
         # A callback to perform actions when airflow starts and the plugin is loaded.
         # NOTE: Ensure your plugin has *args, and **kwargs in the method definition
@@ -210,11 +210,16 @@ definitions in Airflow.
         "view": v_appbuilder_nomenu_view
     }
 
-    # Creating a flask appbuilder Menu Item
-    appbuilder_mitem = {"name": "Google",
-                        "category": "Search",
-                        "category_icon": "fa-th",
-                        "href": "https://www.google.com"}
+    # Creating flask appbuilder Menu Items
+    appbuilder_mitem = {
+        "name": "Google",
+        "href": "https://www.google.com",
+        "category": "Search",
+    }
+    appbuilder_mitem_toplevel = {
+        "name": "Apache",
+        "href": "https://www.apache.org/",
+    }
 
     # A global operator extra link that redirect you to
     # task logs stored in S3
@@ -247,7 +252,7 @@ definitions in Airflow.
         macros = [plugin_macro]
         flask_blueprints = [bp]
         appbuilder_views = [v_appbuilder_package, v_appbuilder_nomenu_package]
-        appbuilder_menu_items = [appbuilder_mitem]
+        appbuilder_menu_items = [appbuilder_mitem, appbuilder_mitem_toplevel]
         global_operator_extra_links = [GoogleLink(),]
         operator_extra_links = [S3LogLink(), ]
 
diff --git a/tests/plugins/test_plugin.py b/tests/plugins/test_plugin.py
index ae725f3..d52d8e5 100644
--- a/tests/plugins/test_plugin.py
+++ b/tests/plugins/test_plugin.py
@@ -77,12 +77,16 @@ v_appbuilder_package = {"name": "Test View", "category": "Test Plugin", "view":
 
 v_nomenu_appbuilder_package = {"view": v_appbuilder_view}
 
-# Creating a flask appbuilder Menu Item
+# Creating flask appbuilder Menu Items
 appbuilder_mitem = {
     "name": "Google",
-    "category": "Search",
-    "category_icon": "fa-th",
     "href": "https://www.google.com",
+    "category": "Search",
+}
+appbuilder_mitem_toplevel = {
+    "name": "apache",
+    "href": "https://www.apache.org/",
+    "label": "The Apache Software Foundation",
 }
 
 # Creating a flask blueprint to intergrate the templates and static folder
@@ -105,7 +109,7 @@ class AirflowTestPlugin(AirflowPlugin):
     macros = [plugin_macro]
     flask_blueprints = [bp]
     appbuilder_views = [v_appbuilder_package]
-    appbuilder_menu_items = [appbuilder_mitem]
+    appbuilder_menu_items = [appbuilder_mitem, appbuilder_mitem_toplevel]
     global_operator_extra_links = [
         AirflowLink(),
         GithubLink(),
diff --git a/tests/plugins/test_plugins_manager.py b/tests/plugins/test_plugins_manager.py
index d454754..f730f17 100644
--- a/tests/plugins/test_plugins_manager.py
+++ b/tests/plugins/test_plugins_manager.py
@@ -77,21 +77,32 @@ class TestPluginsRBAC(unittest.TestCase):
             assert len(plugin_views) == 1
 
     def test_flaskappbuilder_menu_links(self):
-        from tests.plugins.test_plugin import appbuilder_mitem
+        from tests.plugins.test_plugin import appbuilder_mitem, appbuilder_mitem_toplevel
 
-        # menu item should exist matching appbuilder_mitem
-        links = [
+        # menu item (category) should exist matching appbuilder_mitem.category
+        categories = [
             menu_item
             for menu_item in self.appbuilder.menu.menu
             if menu_item.name == appbuilder_mitem['category']
         ]
+        assert len(categories) == 1
 
-        assert len(links) == 1
+        # menu link should be a child in the category
+        category = categories[0]
+        assert category.name == appbuilder_mitem['category']
+        assert category.childs[0].name == appbuilder_mitem['name']
+        assert category.childs[0].href == appbuilder_mitem['href']
 
-        # menu link should also have a link matching the name of the package.
-        link = links[0]
-        assert link.name == appbuilder_mitem['category']
-        assert link.childs[0].name == appbuilder_mitem['name']
+        # a top level link isn't nested in a category
+        top_levels = [
+            menu_item
+            for menu_item in self.appbuilder.menu.menu
+            if menu_item.name == appbuilder_mitem_toplevel['name']
+        ]
+        assert len(top_levels) == 1
+        link = top_levels[0]
+        assert link.href == appbuilder_mitem_toplevel['href']
+        assert link.label == appbuilder_mitem_toplevel['label']
 
     def test_app_blueprints(self):
         from tests.plugins.test_plugin import bp
diff --git a/tests/www/test_views.py b/tests/www/test_views.py
index 5011547..efcb46e 100644
--- a/tests/www/test_views.py
+++ b/tests/www/test_views.py
@@ -488,7 +488,7 @@ class TestAirflowBaseViews(TestBase):
         )
 
     def test_index(self):
-        with assert_queries_count(42):
+        with assert_queries_count(43):
             resp = self.client.get('/', follow_redirects=True)
         self.check_content_in_response('DAGs', resp)
 
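With this change, the dictionaries in `appbuilder_menu_items` are forwarded as keyword arguments to FAB's `add_link`, so a plugin can register both a categorised link and a labelled top-level link. A sketch mirroring the documentation example above (the plugin class name is illustrative):

```python
from airflow.plugins_manager import AirflowPlugin

appbuilder_mitem = {
    "name": "Google",
    "href": "https://www.google.com",
    "category": "Search",
}
appbuilder_mitem_toplevel = {
    "name": "Apache",
    "href": "https://www.apache.org/",
    "label": "The Apache Software Foundation",
}


class MenuLinksPlugin(AirflowPlugin):
    name = "menu_links_plugin"  # plugin name is illustrative
    appbuilder_menu_items = [appbuilder_mitem, appbuilder_mitem_toplevel]
```
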

[airflow] 23/42: Fix logging error with task error when JSON logging is enabled (#14456)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 62725cea85b8636791f76710bfc538d3d22deab5
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Thu Feb 25 13:25:04 2021 +0000

    Fix logging error with task error when JSON logging is enabled (#14456)
    
    If the JSON logging mode for Elasticsearch logs is enabled, the
    handle_failure function would fail, as it treated the exception
    object as the message and tried to JSON-serialize it (a string was
    expected) -- which fails with:
    
    ```
    TypeError: Object of type type is not JSON serializable
    ...
      File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1150, in handle_failure
        self.log.exception(error)
    ...
      File "/usr/local/lib/python3.7/site-packages/airflow/utils/log/file_task_handler.py", line 63, in emit
        self.handler.emit(record)
    ```
    
    (cherry picked from commit 258ec5d95e98eac09ecc7658dcd5226c9afe14c6)
---
 airflow/models/taskinstance.py | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index ed7a0be..119116e 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -1478,7 +1478,10 @@ class TaskInstance(Base, LoggingMixin):  # pylint: disable=R0902,R0904
             test_mode = self.test_mode
 
         if error:
-            self.log.exception(error)
+            if isinstance(error, Exception):
+                self.log.exception("Task failed with exception")
+            else:
+                self.log.error("%s", error)
             # external monitoring process provides pickle file so _run_raw_task
             # can send its runtime errors for access by failure callback
             if error_file:

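The pattern adopted above keeps the log message a plain string and lets the logging framework carry the exception, which is what structured/JSON formatters expect. A standalone sketch:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("sketch")


def report_failure(error):
    if isinstance(error, Exception):
        # Message stays a serializable string; the traceback is attached as exc_info.
        log.exception("Task failed with exception")
    else:
        log.error("%s", error)


try:
    raise ValueError("boom")
except ValueError as err:
    report_failure(err)                 # string message + traceback
report_failure("upstream task failed")  # plain string message
```
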
[airflow] 41/42: Fix tests for all urllib versions with only '&' as separator (#14710)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9f33dfb809b90e00a2533fb2138bad8bdbdbfa2b
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Thu Mar 11 01:39:28 2021 +0000

    Fix tests for all urllib versions with only '&' as separator (#14710)
    
    Turns out #14698 did not fix the issue as Master failed again. After
    digging a bit more I found that the CVE was fixed in all
    Python versions: 3.6.13, 3.7.10 & 3.8.8
    
    The solution in this PR/commit checks the `parse_qsl` behavior with the
    following tests:
    
    ```
    ❯ docker run -it python:3.8-slim bash
    root@41120dfd035e:/# python
    Python 3.8.8 (default, Feb 19 2021, 18:07:06)
    >>> from urllib.parse import parse_qsl
    >>> parse_qsl(";a=b")
    [(';a', 'b')]
    >>>
    ```
    
    ```
    ❯ docker run -it python:3.8.7-slim bash
    root@68e527725610:/# python
    Python 3.8.7 (default, Feb  9 2021, 08:21:15)
    >>> from urllib.parse import parse_qsl
    >>> parse_qsl(";a=b")
    [('a', 'b')]
    >>>
    ```
    
    (cherry picked from commit 7bd9d477dd7c59b8efb7183050de58bcfd6fdd43)
---
 tests/www/test_views.py | 20 +++++++++++++++++---
 1 file changed, 17 insertions(+), 3 deletions(-)

diff --git a/tests/www/test_views.py b/tests/www/test_views.py
index d284314..ce4478c 100644
--- a/tests/www/test_views.py
+++ b/tests/www/test_views.py
@@ -32,7 +32,7 @@ from datetime import datetime as dt, timedelta
 from typing import Any, Dict, Generator, List, NamedTuple
 from unittest import mock
 from unittest.mock import PropertyMock
-from urllib.parse import quote_plus
+from urllib.parse import parse_qsl, quote_plus
 
 import jinja2
 import pytest
@@ -2757,7 +2757,7 @@ class TestTriggerDag(TestBase):
             ("http://google.com", "/home"),
             (
                 "%2Ftree%3Fdag_id%3Dexample_bash_operator';alert(33)//",
-                "/tree?dag_id=example_bash_operator%27&amp;alert%2833%29%2F%2F=",
+                "/tree?dag_id=example_bash_operator%27%3Balert%2833%29%2F%2F",
             ),
             ("%2Ftree%3Fdag_id%3Dexample_bash_operator", "/tree?dag_id=example_bash_operator"),
             ("%2Fgraph%3Fdag_id%3Dexample_bash_operator", "/graph?dag_id=example_bash_operator"),
@@ -2766,6 +2766,13 @@ class TestTriggerDag(TestBase):
     def test_trigger_dag_form_origin_url(self, test_origin, expected_origin):
         test_dag_id = "example_bash_operator"
 
+        # https://github.com/python/cpython/pull/24297/files
+        # Check if tests are running with a Python version containing the above fix
+        # where ";" is removed as a separator
+        if parse_qsl(";a=b") != [(';a', 'b')]:
+            expected_url = expected_origin.replace("%3B", "&")
+            expected_url += "="
+
         resp = self.client.get(f'trigger?dag_id={test_dag_id}&origin={test_origin}')
         self.check_content_in_response(
             '<button type="button" class="btn" onclick="location.href = \'{}\'; return false">'.format(
@@ -3298,7 +3305,7 @@ class TestHelperFunctions(TestBase):
             (
                 "http://localhost:8080/trigger?dag_id=test_dag&origin=%2Ftree%3Fdag_id%test_dag';alert(33)//",
                 "http://localhost:8080/trigger?dag_id=test_dag&origin=%2Ftree%3F"
-                "dag_id%25test_dag%27&alert%2833%29%2F%2F=",
+                "dag_id%25test_dag%27%3Balert%2833%29%2F%2F",
             ),
             (
                 "http://localhost:8080/trigger?dag_id=test_dag&origin=%2Ftree%3Fdag_id%test_dag",
@@ -3308,6 +3315,13 @@ class TestHelperFunctions(TestBase):
     )
     @mock.patch("airflow.www.views.url_for")
     def test_get_safe_url(self, test_url, expected_url, mock_url_for):
+        # https://github.com/python/cpython/pull/24297/files
+        # Check if tests are running with a Python version containing the above fix
+        # where ";" is removed as a separator
+        if parse_qsl(";a=b") != [(';a', 'b')]:
+            expected_url = expected_url.replace("%3B", "&")
+            expected_url += "="
+
         mock_url_for.return_value = "/home"
         with self.app.test_request_context(base_url="http://localhost:8080"):
             assert get_safe_url(test_url) == expected_url

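The runtime probe used in these tests boils down to checking how `parse_qsl` treats a semicolon; a tiny sketch:

```python
from urllib.parse import parse_qsl

# On patched interpreters (3.6.13+/3.7.10+/3.8.8+) ';' is no longer a separator,
# so ";a=b" parses as a single pair with the ';' kept in the key.
semicolon_is_separator = parse_qsl(";a=b") == [("a", "b")]
print("';' treated as separator:", semicolon_is_separator)
```
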
[airflow] 10/42: Make TaskInstance.pool_slots not nullable with a default of 1 (#14406)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8c956756e7b60ee265a73309cb3a245966a7477c
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Thu Feb 25 02:56:40 2021 +0000

    Make TaskInstance.pool_slots not nullable with a default of 1 (#14406)
    
    closes https://github.com/apache/airflow/issues/13799
    
    Without it, the migration from 1.10.14 to 2.0.0 can fail with the following error for old TIs:
    
    ```
    Traceback (most recent call last):
      File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute
        self._run_scheduler_loop()
      File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1377, in _run_scheduler_loop
        num_queued_tis = self._do_scheduling(session)
      File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1533, in _do_scheduling
        num_queued_tis = self._critical_section_execute_task_instances(session=session)
      File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1132, in _critical_section_execute_task_instances
        queued_tis = self._executable_task_instances_to_queued(max_tis, session=session)
      File "/usr/local/lib/python3.6/dist-packages/airflow/utils/session.py", line 62, in wrapper
        return func(*args, **kwargs)
      File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1034, in _executable_task_instances_to_queued
        if task_instance.pool_slots > open_slots:
    TypeError: '>' not supported between instances of 'NoneType' and 'int'
    ```
    
    The workaround was to run manually:
    
    ```
    UPDATE task_instance SET pool_slots = 1 WHERE pool_slots IS NULL;
    ```
    
    This commit adds a DB migration to change the value to 1 for records with a NULL value, and makes the column NOT NULLABLE.
    
    This bug was caused by https://github.com/apache/airflow/pull/7160
    
    (cherry picked from commit f763b7c3aa9cdac82b5d77e21e1840fbe931257a)
---
 .../8646922c8a04_change_default_pool_slots_to_1.py | 93 ++++++++++++++++++++++
 airflow/models/taskinstance.py                     |  2 +-
 2 files changed, 94 insertions(+), 1 deletion(-)

diff --git a/airflow/migrations/versions/8646922c8a04_change_default_pool_slots_to_1.py b/airflow/migrations/versions/8646922c8a04_change_default_pool_slots_to_1.py
new file mode 100644
index 0000000..bf49873
--- /dev/null
+++ b/airflow/migrations/versions/8646922c8a04_change_default_pool_slots_to_1.py
@@ -0,0 +1,93 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""Change default pool_slots to 1
+
+Revision ID: 8646922c8a04
+Revises: 449b4072c2da
+Create Date: 2021-02-23 23:19:22.409973
+
+"""
+
+import dill
+import sqlalchemy as sa
+from alembic import op
+from sqlalchemy import Column, Float, Integer, PickleType, String
+
+# revision identifiers, used by Alembic.
+from sqlalchemy.ext.declarative import declarative_base
+
+from airflow.models.base import COLLATION_ARGS
+from airflow.utils.sqlalchemy import UtcDateTime
+
+revision = '8646922c8a04'
+down_revision = '449b4072c2da'
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+BATCH_SIZE = 5000
+ID_LEN = 250
+
+
+class TaskInstance(Base):  # noqa: D101  # type: ignore
+    __tablename__ = "task_instance"
+
+    task_id = Column(String(ID_LEN, **COLLATION_ARGS), primary_key=True)
+    dag_id = Column(String(ID_LEN, **COLLATION_ARGS), primary_key=True)
+    execution_date = Column(UtcDateTime, primary_key=True)
+    start_date = Column(UtcDateTime)
+    end_date = Column(UtcDateTime)
+    duration = Column(Float)
+    state = Column(String(20))
+    _try_number = Column('try_number', Integer, default=0)
+    max_tries = Column(Integer)
+    hostname = Column(String(1000))
+    unixname = Column(String(1000))
+    job_id = Column(Integer)
+    pool = Column(String(50), nullable=False)
+    pool_slots = Column(Integer, default=1)
+    queue = Column(String(256))
+    priority_weight = Column(Integer)
+    operator = Column(String(1000))
+    queued_dttm = Column(UtcDateTime)
+    queued_by_job_id = Column(Integer)
+    pid = Column(Integer)
+    executor_config = Column(PickleType(pickler=dill))
+    external_executor_id = Column(String(ID_LEN, **COLLATION_ARGS))
+
+
+def upgrade():
+    """Change default pool_slots to 1 and make pool_slots not nullable"""
+    connection = op.get_bind()
+    sessionmaker = sa.orm.sessionmaker()
+    session = sessionmaker(bind=connection)
+
+    session.query(TaskInstance).filter(TaskInstance.pool_slots.is_(None)).update(
+        {TaskInstance.pool_slots: 1}, synchronize_session=False
+    )
+    session.commit()
+
+    with op.batch_alter_table("task_instance", schema=None) as batch_op:
+        batch_op.alter_column("pool_slots", existing_type=sa.Integer, nullable=False)
+
+
+def downgrade():
+    """Unapply Change default pool_slots to 1"""
+    with op.batch_alter_table("task_instance", schema=None) as batch_op:
+        batch_op.alter_column("pool_slots", existing_type=sa.Integer, nullable=True)
diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index c7d7ff7..3ceb5a3 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -273,7 +273,7 @@ class TaskInstance(Base, LoggingMixin):  # pylint: disable=R0902,R0904
     unixname = Column(String(1000))
     job_id = Column(Integer)
     pool = Column(String(50), nullable=False)
-    pool_slots = Column(Integer, default=1)
+    pool_slots = Column(Integer, default=1, nullable=False)
     queue = Column(String(256))
     priority_weight = Column(Integer)
     operator = Column(String(1000))

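The upgrade above backfills NULL `pool_slots` to 1 through an ORM session before tightening the column. The same backfill can also be expressed with SQLAlchemy Core; a hedged sketch of just the upgrade step (a fragment, not a complete migration module):

```python
import sqlalchemy as sa
from alembic import op
from sqlalchemy.sql import column, table


def upgrade():
    task_instance = table("task_instance", column("pool_slots", sa.Integer))
    # Backfill NULLs to the default of 1 before adding the NOT NULL constraint.
    op.execute(
        task_instance.update()
        .where(task_instance.c.pool_slots.is_(None))
        .values(pool_slots=1)
    )
    with op.batch_alter_table("task_instance") as batch_op:
        batch_op.alter_column("pool_slots", existing_type=sa.Integer(), nullable=False)
```
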
[airflow] 06/42: Fix typos in concept docs (#14130)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9d910586b0cf9d3a250df599bb8e679189cab0ce
Author: José Coto <jl...@users.noreply.github.com>
AuthorDate: Mon Feb 8 19:47:43 2021 +0100

    Fix typos in concept docs (#14130)
    
    (cherry picked from commit 3d3a219ca9fe52086a0f6f637aa09c6a8ef28631)
---
 docs/apache-airflow/concepts.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/apache-airflow/concepts.rst b/docs/apache-airflow/concepts.rst
index 0522c0f..c48714a 100644
--- a/docs/apache-airflow/concepts.rst
+++ b/docs/apache-airflow/concepts.rst
@@ -464,7 +464,7 @@ Airflow provides many built-in operators for many common tasks, including:
 
 There are also other, commonly used operators that are installed together with airflow automatically,
 by pre-installing some :doc:`apache-airflow-providers:index` packages (they are always available no
-matter which extras you chose when installing Apache Airflow:
+matter which extras you chose when installing Apache Airflow):
 
 - :class:`~airflow.providers.http.operators.http.SimpleHttpOperator` - sends an HTTP request
 - :class:`~airflow.providers.sqlite.operators.sqlite.SqliteOperator` - SQLite DB operator
@@ -484,7 +484,7 @@ Some examples of popular operators are:
 - :class:`~airflow.providers.docker.operators.docker.DockerOperator`
 - :class:`~airflow.providers.apache.hive.operators.hive.HiveOperator`
 - :class:`~airflow.providers.amazon.aws.operators.s3_file_transform.S3FileTransformOperator`
-- :class:`~airflow.providers.mysql.transfers.presto_to_mysql.PrestoToMySqlOperator`,
+- :class:`~airflow.providers.mysql.transfers.presto_to_mysql.PrestoToMySqlOperator`
 - :class:`~airflow.providers.slack.operators.slack.SlackAPIOperator`
 
 But there are many, many more - you can see the list of those by following the providers documentation

[airflow] 34/42: Fix minor issues in 'Concepts' doc (#14679)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 654248e51e2bbbb749281782eedc1df28d84709d
Author: Xiaodong DENG <xd...@apache.org>
AuthorDate: Tue Mar 9 15:33:54 2021 +0100

    Fix minor issues in 'Concepts' doc (#14679)
    
    (cherry picked from commit 99aab051600715a1ad029ce45a197a8492e5a151)
---
 docs/apache-airflow/concepts.rst | 28 ++++++++++++++--------------
 1 file changed, 14 insertions(+), 14 deletions(-)

diff --git a/docs/apache-airflow/concepts.rst b/docs/apache-airflow/concepts.rst
index c48714a..6b5b2c3 100644
--- a/docs/apache-airflow/concepts.rst
+++ b/docs/apache-airflow/concepts.rst
@@ -99,7 +99,7 @@ logical workflow.
 Scope
 -----
 
-Airflow will load any ``DAG`` object it can import from a DAGfile. Critically,
+Airflow will load any ``DAG`` object it can import from a DAG file. Critically,
 that means the DAG must appear in ``globals()``. Consider the following two
 DAGs. Only ``dag_1`` will be loaded; the other one only appears in a local
 scope.
@@ -134,7 +134,7 @@ any of its operators. This makes it easy to apply a common parameter to many ope
 
     dag = DAG('my_dag', default_args=default_args)
     op = DummyOperator(task_id='dummy', dag=dag)
-    print(op.owner) # Airflow
+    print(op.owner) # airflow
 
 .. _concepts:context_manager:
 
@@ -160,9 +160,9 @@ TaskFlow API
 .. versionadded:: 2.0.0
 
 Airflow 2.0 adds a new style of authoring dags called the TaskFlow API which removes a lot of the boilerplate
-around creating PythonOperators, managing dependencies between task and accessing XCom values. (During
+around creating PythonOperators, managing dependencies between task and accessing XCom values (During
 development this feature was called "Functional DAGs", so if you see or hear any references to that, it's the
-same thing)
+same thing).
 
 Outputs and inputs are sent between tasks using :ref:`XCom values <concepts:xcom>`. In addition, you can wrap
 functions as tasks using the :ref:`task decorator <concepts:task_decorator>`. Airflow will also automatically
@@ -221,7 +221,7 @@ Example DAG with decorator:
     :end-before: [END dag_decorator_usage]
 
 .. note:: Note that Airflow will only load DAGs that appear in ``globals()`` as noted in :ref:`scope section <concepts:scope>`.
-  This means you need to make sure to have a variable for your returned DAG is in the module scope.
+  This means you need to make sure to have a variable for your returned DAG in the module scope.
   Otherwise Airflow won't detect your decorated DAG.
 
 .. _concepts:executor_config:
@@ -229,7 +229,7 @@ Example DAG with decorator:
 ``executor_config``
 ===================
 
-The ``executor_config`` is an argument placed into operators that allow airflow users to override tasks
+The ``executor_config`` is an argument placed into operators that allow Airflow users to override tasks
 before launch. Currently this is primarily used by the :class:`KubernetesExecutor`, but will soon be available
 for other overrides.
 
@@ -252,7 +252,7 @@ execution_date
 The ``execution_date`` is the *logical* date and time which the DAG Run, and its task instances, are running for.
 
 This allows task instances to process data for the desired *logical* date & time.
-While a task_instance or DAG run might have an *actual* start date of now,
+While a task instance or DAG run might have an *actual* start date of now,
 their *logical* date might be 3 months ago because we are busy reloading something.
 
 In the prior example the ``execution_date`` was 2016-01-01 for the first DAG Run and 2016-01-02 for the second.
@@ -454,7 +454,7 @@ This is a subtle but very important point: in general, if two operators need to
 share information, like a filename or small amount of data, you should consider
 combining them into a single operator. If it absolutely can't be avoided,
 Airflow does have a feature for operator cross-communication called XCom that is
-described in the section :ref:`XComs <concepts:xcom>`
+described in the section :ref:`XComs <concepts:xcom>`.
 
 Airflow provides many built-in operators for many common tasks, including:
 
@@ -530,7 +530,7 @@ There are currently 3 different modes for how a sensor operates:
 
 How to use:
 
-For ``poke|schedule`` mode, you can configure them at the task level by supplying the ``mode`` parameter,
+For ``poke|reschedule`` mode, you can configure them at the task level by supplying the ``mode`` parameter,
 i.e. ``S3KeySensor(task_id='check-bucket', mode='reschedule', ...)``.
 
 For ``smart sensor``, you need to configure it in ``airflow.cfg``, for example:
@@ -545,7 +545,7 @@ For ``smart sensor``, you need to configure it in ``airflow.cfg``, for example:
     shards = 5
     sensors_enabled = NamedHivePartitionSensor, MetastorePartitionSensor
 
-For more information on how to configure ``smart-sensor`` and its architecture, see:
+For more information on how to configure ``smart sensor`` and its architecture, see:
 :doc:`Smart Sensor Architecture and Configuration<smart-sensor>`
 
 DAG Assignment
@@ -655,11 +655,11 @@ Relationship Builders
 
 *Moved in Airflow 2.0*
 
-In Airflow 2.0 those two methods moved from ``airflow.utils.helpers`` to ``airflow.models.baseoperator``.
-
 ``chain`` and ``cross_downstream`` function provide easier ways to set relationships
 between operators in specific situation.
 
+In Airflow 2.0 those two methods moved from ``airflow.utils.helpers`` to ``airflow.models.baseoperator``.
+
 When setting a relationship between two lists,
 if we want all operators in one list to be upstream to all operators in the other,
 we cannot use a single bitshift composition. Instead we have to split one of the lists:
@@ -736,7 +736,7 @@ be conceptualized like this:
 - Operator: A class that acts as a template for carrying out some work.
 - Task: Defines work by implementing an operator, written in Python.
 - Task Instance: An instance of a task - that has been assigned to a DAG and has a
-  state associated with a specific DAG run (i.e for a specific execution_date).
+  state associated with a specific DAG run (i.e. for a specific execution_date).
 - execution_date: The logical date and time for a DAG Run and its Task Instances.
 
 By combining ``DAGs`` and ``Operators`` to create ``TaskInstances``, you can
@@ -1634,7 +1634,7 @@ A ``.airflowignore`` file specifies the directories or files in ``DAG_FOLDER``
 or ``PLUGINS_FOLDER`` that Airflow should intentionally ignore.
 Each line in ``.airflowignore`` specifies a regular expression pattern,
 and directories or files whose names (not DAG id) match any of the patterns
-would be ignored (under the hood,``Pattern.search()`` is used to match the pattern).
+would be ignored (under the hood, ``Pattern.search()`` is used to match the pattern).
 Overall it works like a ``.gitignore`` file.
 Use the ``#`` character to indicate a comment; all characters
 on a line following a ``#`` will be ignored.

[airflow] 33/42: Update Flask-AppBuilder dependency to allow 3.2 (and all 3.x series) (#14665)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f395cd8cee726d2fcd0eca5de85c76f89da3fc08
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Mon Mar 8 17:12:05 2021 +0000

    Update Flask-AppBuilder dependency to allow 3.2 (and all 3.x series) (#14665)
    
    (cherry picked from commit 97b5e4cd6c001ec1a1597606f4e9f1c0fbea20d2)
---
 setup.cfg | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.cfg b/setup.cfg
index b0f13c3..267d972 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -92,7 +92,7 @@ install_requires =
     cryptography>=0.9.3
     dill>=0.2.2, <0.4
     flask>=1.1.0, <2.0
-    flask-appbuilder~=3.1.1
+    flask-appbuilder~=3.1,>=3.1.1
     flask-caching>=1.5.0, <2.0.0
     flask-login>=0.3, <0.5
     flask-wtf>=0.14.3, <0.15

[airflow] 42/42: Webserver: Sanitize string passed to origin param (#14738)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 831c7ecd260eda4452c6e6c53317c270d9e0865a
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Mar 12 09:48:59 2021 +0000

    Webserver: Sanitize string passed to origin param (#14738)
    
    Follow-up of #12459 & #10334
    
    Since https://github.com/python/cpython/pull/24297/files (bpo-42967)
    also removed ';' as a query argument separator, we remove query arguments
    containing semicolons.
    
    (cherry picked from commit 409c249121bd9c8902fc2ba551b21873ab41f953)
---
 airflow/www/views.py    | 12 +++++++++++-
 tests/www/test_views.py | 27 +++++++++------------------
 2 files changed, 20 insertions(+), 19 deletions(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index 4c272b5..f9f7f5c 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -129,8 +129,18 @@ def get_safe_url(url):
 
     parsed = urlparse(url)
 
+    # If the url is relative & it contains semicolon, redirect it to homepage to avoid
+    # potential XSS. (Similar to https://github.com/python/cpython/pull/24297/files (bpo-42967))
+    if parsed.netloc == '' and parsed.scheme == '' and ';' in unquote(url):
+        return url_for('Airflow.index')
+
     query = parse_qsl(parsed.query, keep_blank_values=True)
-    url = parsed._replace(query=urlencode(query)).geturl()
+
+    # Remove all the query elements containing semicolon
+    # As part of https://github.com/python/cpython/pull/24297/files (bpo-42967)
+    # semicolon was already removed as a separator for query arguments by default
+    sanitized_query = [query_arg for query_arg in query if ';' not in query_arg[1]]
+    url = parsed._replace(query=urlencode(sanitized_query)).geturl()
 
     if parsed.scheme in valid_schemes and parsed.netloc in valid_netlocs:
         return url
diff --git a/tests/www/test_views.py b/tests/www/test_views.py
index ce4478c..f67c478 100644
--- a/tests/www/test_views.py
+++ b/tests/www/test_views.py
@@ -32,7 +32,7 @@ from datetime import datetime as dt, timedelta
 from typing import Any, Dict, Generator, List, NamedTuple
 from unittest import mock
 from unittest.mock import PropertyMock
-from urllib.parse import parse_qsl, quote_plus
+from urllib.parse import quote_plus
 
 import jinja2
 import pytest
@@ -2755,9 +2755,10 @@ class TestTriggerDag(TestBase):
         [
             ("javascript:alert(1)", "/home"),
             ("http://google.com", "/home"),
+            ("36539'%3balert(1)%2f%2f166", "/home"),
             (
                 "%2Ftree%3Fdag_id%3Dexample_bash_operator';alert(33)//",
-                "/tree?dag_id=example_bash_operator%27%3Balert%2833%29%2F%2F",
+                "/home",
             ),
             ("%2Ftree%3Fdag_id%3Dexample_bash_operator", "/tree?dag_id=example_bash_operator"),
             ("%2Fgraph%3Fdag_id%3Dexample_bash_operator", "/graph?dag_id=example_bash_operator"),
@@ -2766,13 +2767,6 @@ class TestTriggerDag(TestBase):
     def test_trigger_dag_form_origin_url(self, test_origin, expected_origin):
         test_dag_id = "example_bash_operator"
 
-        # https://github.com/python/cpython/pull/24297/files
-        # Check if tests are running with a Python version containing the above fix
-        # where ";" is removed as a separator
-        if parse_qsl(";a=b") != [(';a', 'b')]:
-            expected_url = expected_origin.replace("%3B", "&")
-            expected_url += "="
-
         resp = self.client.get(f'trigger?dag_id={test_dag_id}&origin={test_origin}')
         self.check_content_in_response(
             '<button type="button" class="btn" onclick="location.href = \'{}\'; return false">'.format(
@@ -3302,10 +3296,14 @@ class TestHelperFunctions(TestBase):
         [
             ("", "/home"),
             ("http://google.com", "/home"),
+            ("36539'%3balert(1)%2f%2f166", "/home"),
+            (
+                "http://localhost:8080/trigger?dag_id=test&origin=36539%27%3balert(1)%2f%2f166&abc=2",
+                "http://localhost:8080/trigger?dag_id=test&abc=2",
+            ),
             (
                 "http://localhost:8080/trigger?dag_id=test_dag&origin=%2Ftree%3Fdag_id%test_dag';alert(33)//",
-                "http://localhost:8080/trigger?dag_id=test_dag&origin=%2Ftree%3F"
-                "dag_id%25test_dag%27%3Balert%2833%29%2F%2F",
+                "http://localhost:8080/trigger?dag_id=test_dag",
             ),
             (
                 "http://localhost:8080/trigger?dag_id=test_dag&origin=%2Ftree%3Fdag_id%test_dag",
@@ -3315,13 +3313,6 @@ class TestHelperFunctions(TestBase):
     )
     @mock.patch("airflow.www.views.url_for")
     def test_get_safe_url(self, test_url, expected_url, mock_url_for):
-        # https://github.com/python/cpython/pull/24297/files
-        # Check if tests are running with a Python version containing the above fix
-        # where ";" is removed as a separator
-        if parse_qsl(";a=b") != [(';a', 'b')]:
-            expected_url = expected_url.replace("%3B", "&")
-            expected_url += "="
-
         mock_url_for.return_value = "/home"
         with self.app.test_request_context(base_url="http://localhost:8080"):
             assert get_safe_url(test_url) == expected_url

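For reference, the sanitisation logic being tested can be summarised in a standalone form. This is a sketch rather than the exact Airflow implementation: `/home` stands in for the redirect to `Airflow.index`, and the scheme/netloc whitelist check is omitted.

```python
from urllib.parse import parse_qsl, unquote, urlencode, urlparse


def sanitize_origin(url: str, fallback: str = "/home") -> str:
    parsed = urlparse(url)
    # Relative URL still containing a raw ';' after unquoting -> redirect home.
    if parsed.netloc == "" and parsed.scheme == "" and ";" in unquote(url):
        return fallback
    # Drop query arguments whose values contain a semicolon.
    query = parse_qsl(parsed.query, keep_blank_values=True)
    cleaned = [(k, v) for k, v in query if ";" not in v]
    return parsed._replace(query=urlencode(cleaned)).geturl()


print(sanitize_origin("36539'%3balert(1)%2f%2f166"))  # -> /home
print(sanitize_origin(
    "http://localhost:8080/trigger?dag_id=test&origin=36539%27%3balert(1)%2f%2f166&abc=2"
))  # -> http://localhost:8080/trigger?dag_id=test&abc=2 (where ';' is not a separator)
```
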
[airflow] 17/42: Scheduler should not fail when invalid executor_config is passed (#14323)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6dd75596eb619dfa5e398153fe83b52368af1748
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Sat Feb 20 00:46:39 2021 +0000

    Scheduler should not fail when invalid executor_config is passed (#14323)
    
    closes #14182
    
    (cherry picked from commit e0ee91e15f8385e34e3d7dfc8a6365e350ea7083)
---
 airflow/executors/kubernetes_executor.py    |  8 +++++++-
 tests/executors/test_kubernetes_executor.py | 20 ++++++++++++++++++++
 2 files changed, 27 insertions(+), 1 deletion(-)

diff --git a/airflow/executors/kubernetes_executor.py b/airflow/executors/kubernetes_executor.py
index 88e26be..fd5c6fa 100644
--- a/airflow/executors/kubernetes_executor.py
+++ b/airflow/executors/kubernetes_executor.py
@@ -490,7 +490,13 @@ class KubernetesExecutor(BaseExecutor, LoggingMixin):
     ) -> None:
         """Executes task asynchronously"""
         self.log.info('Add task %s with command %s with executor_config %s', key, command, executor_config)
-        kube_executor_config = PodGenerator.from_obj(executor_config)
+        try:
+            kube_executor_config = PodGenerator.from_obj(executor_config)
+        except Exception:  # pylint: disable=broad-except
+            self.log.error("Invalid executor_config for %s", key)
+            self.fail(key=key, info="Invalid executor_config passed")
+            return
+
         if executor_config:
             pod_template_file = executor_config.get("pod_template_override", None)
         else:
diff --git a/tests/executors/test_kubernetes_executor.py b/tests/executors/test_kubernetes_executor.py
index 9abb328..dc7cbbb 100644
--- a/tests/executors/test_kubernetes_executor.py
+++ b/tests/executors/test_kubernetes_executor.py
@@ -196,6 +196,26 @@ class TestKubernetesExecutor(unittest.TestCase):
 
     @mock.patch('airflow.executors.kubernetes_executor.KubernetesJobWatcher')
     @mock.patch('airflow.executors.kubernetes_executor.get_kube_client')
+    def test_invalid_executor_config(self, mock_get_kube_client, mock_kubernetes_job_watcher):
+        executor = self.kubernetes_executor
+        executor.start()
+
+        assert executor.event_buffer == {}
+        executor.execute_async(
+            key=('dag', 'task', datetime.utcnow(), 1),
+            queue=None,
+            command=['airflow', 'tasks', 'run', 'true', 'some_parameter'],
+            executor_config=k8s.V1Pod(
+                spec=k8s.V1PodSpec(
+                    containers=[k8s.V1Container(name="base", image="myimage", image_pull_policy="Always")]
+                )
+            ),
+        )
+
+        assert list(executor.event_buffer.values())[0][1] == "Invalid executor_config passed"
+
+    @mock.patch('airflow.executors.kubernetes_executor.KubernetesJobWatcher')
+    @mock.patch('airflow.executors.kubernetes_executor.get_kube_client')
     def test_change_state_running(self, mock_get_kube_client, mock_kubernetes_job_watcher):
         executor = self.kubernetes_executor
         executor.start()
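
To see the pattern in isolation: a minimal, self-contained sketch (not Airflow code) of what this patch does, namely parsing the user-supplied executor_config inside try/except and failing that single task instead of letting the exception escape and crash the scheduler. FakeExecutor and parse_config below are hypothetical stand-ins for KubernetesExecutor and PodGenerator.from_obj.

```
# Hedged sketch of the "fail the task, not the scheduler" pattern from this patch.
# FakeExecutor and parse_config are illustrative stand-ins, not Airflow APIs.
from typing import Any, Dict, Tuple

TaskKey = Tuple[str, str]  # simplified (dag_id, task_id) key


def parse_config(executor_config: Any) -> Dict[str, Any]:
    """Stand-in for PodGenerator.from_obj: raises on malformed input."""
    if not isinstance(executor_config, dict):
        raise TypeError(f"unsupported executor_config type: {type(executor_config)!r}")
    return executor_config


class FakeExecutor:
    def __init__(self) -> None:
        self.event_buffer: Dict[TaskKey, Tuple[str, str]] = {}

    def fail(self, key: TaskKey, info: str) -> None:
        self.event_buffer[key] = ("failed", info)

    def execute_async(self, key: TaskKey, executor_config: Any) -> None:
        try:
            kube_executor_config = parse_config(executor_config)
        except Exception:
            # Same shape as the patch: record the failure and return instead
            # of propagating the exception into the scheduler loop.
            self.fail(key=key, info="Invalid executor_config passed")
            return
        self.event_buffer[key] = ("queued", str(kube_executor_config))


executor = FakeExecutor()
executor.execute_async(key=("dag", "task"), executor_config=object())  # malformed config
assert executor.event_buffer[("dag", "task")][1] == "Invalid executor_config passed"
```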

[airflow] 25/42: BugFix: TypeError in monitor_pod (#14513)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 87c26b464a7bba388c4830982fd11e7ae5384fa8
Author: Emil Ejbyfeldt <ee...@liveintent.com>
AuthorDate: Mon Mar 1 14:58:01 2021 +0100

    BugFix: TypeError in monitor_pod (#14513)
    
    If the log read is interrupted before any logs are produced then
    `last_log_time` will not be set and the line
    `delta = pendulum.now() - last_log_time` will fail with
    ```
    TypeError: unsupported operand type(s) for -: 'DateTime' and 'NoneType'
    ```
    
    This commit fixes this issue by only updating `read_logs_since_sec` if
    `last_log_time` has been set.
    
    (cherry picked from commit 45a0ac2e01c174754f4e6612c8e4d3125061d096)
---
 airflow/kubernetes/pod_launcher.py    |  7 ++++---
 tests/kubernetes/test_pod_launcher.py | 18 +++++++++++++++++-
 2 files changed, 21 insertions(+), 4 deletions(-)

diff --git a/airflow/kubernetes/pod_launcher.py b/airflow/kubernetes/pod_launcher.py
index 02194d7..3d663d2 100644
--- a/airflow/kubernetes/pod_launcher.py
+++ b/airflow/kubernetes/pod_launcher.py
@@ -140,9 +140,10 @@ class PodLauncher(LoggingMixin):
                     break
 
                 self.log.warning('Pod %s log read interrupted', pod.metadata.name)
-                delta = pendulum.now() - last_log_time
-                # Prefer logs duplication rather than loss
-                read_logs_since_sec = math.ceil(delta.total_seconds())
+                if last_log_time:
+                    delta = pendulum.now() - last_log_time
+                    # Prefer logs duplication rather than loss
+                    read_logs_since_sec = math.ceil(delta.total_seconds())
         result = None
         if self.extract_xcom:
             while self.base_container_is_running(pod):
diff --git a/tests/kubernetes/test_pod_launcher.py b/tests/kubernetes/test_pod_launcher.py
index 9e7cc82..6e40264 100644
--- a/tests/kubernetes/test_pod_launcher.py
+++ b/tests/kubernetes/test_pod_launcher.py
@@ -21,7 +21,7 @@ import pytest
 from requests.exceptions import BaseHTTPError
 
 from airflow.exceptions import AirflowException
-from airflow.kubernetes.pod_launcher import PodLauncher
+from airflow.kubernetes.pod_launcher import PodLauncher, PodStatus
 
 
 class TestPodLauncher(unittest.TestCase):
@@ -170,6 +170,22 @@ class TestPodLauncher(unittest.TestCase):
             ]
         )
 
+    def test_monitor_pod_empty_logs(self):
+        mock.sentinel.metadata = mock.MagicMock()
+        running_status = mock.MagicMock()
+        running_status.configure_mock(**{'name': 'base', 'state.running': True})
+        pod_info_running = mock.MagicMock(**{'status.container_statuses': [running_status]})
+        pod_info_succeeded = mock.MagicMock(**{'status.phase': PodStatus.SUCCEEDED})
+
+        def pod_state_gen():
+            yield pod_info_running
+            while True:
+                yield pod_info_succeeded
+
+        self.mock_kube_client.read_namespaced_pod.side_effect = pod_state_gen()
+        self.mock_kube_client.read_namespaced_pod_log.return_value = iter(())
+        self.pod_launcher.monitor_pod(mock.sentinel, get_logs=True)
+
     def test_read_pod_retries_fails(self):
         mock.sentinel.metadata = mock.MagicMock()
         self.mock_kube_client.read_namespaced_pod.side_effect = [
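
The failure mode and the fix are easy to reproduce outside Kubernetes. A hedged sketch, using pendulum directly; next_read_logs_since_sec is a hypothetical helper that isolates the guarded recomputation from the patch:

```
import math
from typing import Optional

import pendulum


def next_read_logs_since_sec(last_log_time: Optional[pendulum.DateTime], current: int = 60) -> int:
    """Recompute the log look-back window after an interrupted read.

    Before the fix, `pendulum.now() - last_log_time` was evaluated
    unconditionally and raised TypeError when last_log_time was None,
    i.e. when the read was interrupted before any log line arrived.
    """
    if last_log_time:
        delta = pendulum.now() - last_log_time
        # Prefer log duplication over log loss, as in the patch.
        return math.ceil(delta.total_seconds())
    return current  # keep the previous window when nothing has been read yet


assert next_read_logs_since_sec(None) == 60
assert next_read_logs_since_sec(pendulum.now().subtract(seconds=5)) >= 5
```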

[airflow] 16/42: Fix bug allowing task instances to survive when dagrun_timeout is exceeded (#14321)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 44a261a76f4074d676fde3ba68e337a3fa9088cb
Author: Ryan Hatter <25...@users.noreply.github.com>
AuthorDate: Thu Mar 4 19:45:06 2021 -0500

    Fix bug allowing task instances to survive when dagrun_timeout is exceeded (#14321)
    
    closes: #12912
    related: #13407
    (cherry picked from commit 09327ba6b371aa68cf681747c73a7a0f4968c173)
---
 airflow/jobs/scheduler_job.py    | 14 +++++++--
 tests/jobs/test_scheduler_job.py | 63 ++++++++++++++++++++++++++++++++++++++++
 2 files changed, 74 insertions(+), 3 deletions(-)

diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index f25ae02..ae91b0d 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -1717,10 +1717,18 @@ class SchedulerJob(BaseJob):  # pylint: disable=too-many-instance-attributes
             and dag.dagrun_timeout
             and dag_run.start_date < timezone.utcnow() - dag.dagrun_timeout
         ):
-            dag_run.state = State.FAILED
-            dag_run.end_date = timezone.utcnow()
-            self.log.info("Run %s of %s has timed-out", dag_run.run_id, dag_run.dag_id)
+            dag_run.set_state(State.FAILED)
+            unfinished_task_instances = (
+                session.query(TI)
+                .filter(TI.dag_id == dag_run.dag_id)
+                .filter(TI.execution_date == dag_run.execution_date)
+                .filter(TI.state.in_(State.unfinished))
+            )
+            for task_instance in unfinished_task_instances:
+                task_instance.state = State.SKIPPED
+                session.merge(task_instance)
             session.flush()
+            self.log.info("Run %s of %s has timed-out", dag_run.run_id, dag_run.dag_id)
 
             # Work out if we should allow creating a new DagRun now?
             self._update_dag_next_dagruns([session.query(DagModel).get(dag_run.dag_id)], session)
diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index 3867833..e0a4723 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -23,6 +23,7 @@ import shutil
 import unittest
 from datetime import timedelta
 from tempfile import NamedTemporaryFile, mkdtemp
+from time import sleep
 from unittest import mock
 from unittest.mock import MagicMock, patch
 from zipfile import ZipFile
@@ -3813,6 +3814,68 @@ class TestSchedulerJob(unittest.TestCase):
         ti = run2.get_task_instance(task1.task_id, session)
         assert ti.state == State.QUEUED
 
+    def test_do_schedule_max_active_runs_dag_timed_out(self):
+        """Test that tasks are set to a finished state when their DAG times out"""
+
+        dag = DAG(
+            dag_id='test_max_active_run_with_dag_timed_out',
+            start_date=DEFAULT_DATE,
+            schedule_interval='@once',
+            max_active_runs=1,
+            catchup=True,
+        )
+        dag.dagrun_timeout = datetime.timedelta(seconds=1)
+
+        with dag:
+            task1 = BashOperator(
+                task_id='task1',
+                bash_command=' for((i=1;i<=600;i+=1)); do sleep "$i";  done',
+            )
+
+        session = settings.Session()
+        dagbag = DagBag(
+            dag_folder=os.devnull,
+            include_examples=False,
+            read_dags_from_db=True,
+        )
+
+        dagbag.bag_dag(dag=dag, root_dag=dag)
+        dagbag.sync_to_db(session=session)
+
+        run1 = dag.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE,
+            state=State.RUNNING,
+            session=session,
+        )
+        run1_ti = run1.get_task_instance(task1.task_id, session)
+        run1_ti.state = State.RUNNING
+
+        sleep(1)
+
+        run2 = dag.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE + timedelta(seconds=10),
+            state=State.RUNNING,
+            session=session,
+        )
+
+        dag.sync_to_db(session=session)
+
+        job = SchedulerJob(subdir=os.devnull)
+        job.executor = MockExecutor()
+        job.processor_agent = mock.MagicMock(spec=DagFileProcessorAgent)
+
+        _ = job._do_scheduling(session)
+
+        assert run1.state == State.FAILED
+        assert run1_ti.state == State.SKIPPED
+        assert run2.state == State.RUNNING
+
+        _ = job._do_scheduling(session)
+        run2_ti = run2.get_task_instance(task1.task_id, session)
+        assert run2_ti.state == State.QUEUED
+
     def test_do_schedule_max_active_runs_task_removed(self):
         """Test that tasks in removed state don't count as actively running."""
 

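For readers who want the gist without the SQLAlchemy plumbing, here is a hedged, in-memory sketch of the timeout handling in the commit above: fail the run and move its unfinished task instances to a terminal state so they stop occupying slots. The DagRun/TaskInstance dataclasses are simplified stand-ins, not the ORM models.

```
from dataclasses import dataclass, field
from typing import List

UNFINISHED = {"none", "scheduled", "queued", "running", "up_for_retry"}


@dataclass
class TaskInstance:
    task_id: str
    state: str = "running"


@dataclass
class DagRun:
    run_id: str
    state: str = "running"
    task_instances: List[TaskInstance] = field(default_factory=list)


def timeout_dag_run(dag_run: DagRun) -> None:
    """Mirror of the patched behaviour: fail the run, skip its unfinished tasks."""
    dag_run.state = "failed"
    for ti in dag_run.task_instances:
        if ti.state in UNFINISHED:
            # The real patch uses State.SKIPPED and merges each TI back via the session.
            ti.state = "skipped"


run = DagRun("scheduled__1", task_instances=[TaskInstance("task1"), TaskInstance("task2", "success")])
timeout_dag_run(run)
assert run.state == "failed"
assert [ti.state for ti in run.task_instances] == ["skipped", "success"]
```
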
[airflow] 07/42: [AIRFLOW-7044] Host key can be specified via SSH connection extras. (#12944)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 472077ec5df36f83490739cc102c2b709ad7db37
Author: Andreas Franzén <an...@devil.se>
AuthorDate: Fri Jan 8 12:02:53 2021 +0100

    [AIRFLOW-7044] Host key can be specified via SSH connection extras. (#12944)
    
    (cherry picked from commit 52339a55c054bddd1d46253575274a3d5d141ebe)
---
 ...2da_increase_size_of_connection_extra_field_.py | 56 +++++++++++++
 airflow/models/connection.py                       |  2 +-
 airflow/providers/sftp/hooks/sftp.py               |  6 ++
 airflow/providers/ssh/hooks/ssh.py                 | 18 ++++-
 .../connections/ssh.rst                            |  6 +-
 docs/spelling_wordlist.txt                         |  1 +
 tests/providers/sftp/hooks/test_sftp.py            | 41 +++++++++-
 tests/providers/ssh/hooks/test_ssh.py              | 93 ++++++++++++++++++++++
 8 files changed, 216 insertions(+), 7 deletions(-)

diff --git a/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py b/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py
new file mode 100644
index 0000000..d3d9432
--- /dev/null
+++ b/airflow/migrations/versions/449b4072c2da_increase_size_of_connection_extra_field_.py
@@ -0,0 +1,56 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""Increase size of connection.extra field to handle multiple RSA keys
+
+Revision ID: 449b4072c2da
+Revises: e959f08ac86c
+Create Date: 2020-03-16 19:02:55.337710
+
+"""
+
+import sqlalchemy as sa
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = '449b4072c2da'
+down_revision = 'e959f08ac86c'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+    """Apply increase_length_for_connection_password"""
+    with op.batch_alter_table('connection', schema=None) as batch_op:
+        batch_op.alter_column(
+            'extra',
+            existing_type=sa.VARCHAR(length=5000),
+            type_=sa.TEXT(),
+            existing_nullable=True,
+        )
+
+
+def downgrade():
+    """Unapply increase_length_for_connection_password"""
+    with op.batch_alter_table('connection', schema=None) as batch_op:
+        batch_op.alter_column(
+            'extra',
+            existing_type=sa.TEXT(),
+            type_=sa.VARCHAR(length=5000),
+            existing_nullable=True,
+        )
diff --git a/airflow/models/connection.py b/airflow/models/connection.py
index 1159a44..c030571 100644
--- a/airflow/models/connection.py
+++ b/airflow/models/connection.py
@@ -102,7 +102,7 @@ class Connection(Base, LoggingMixin):  # pylint: disable=too-many-instance-attri
     port = Column(Integer())
     is_encrypted = Column(Boolean, unique=False, default=False)
     is_extra_encrypted = Column(Boolean, unique=False, default=False)
-    _extra = Column('extra', String(5000))
+    _extra = Column('extra', Text())
 
     def __init__(  # pylint: disable=too-many-arguments
         self,
diff --git a/airflow/providers/sftp/hooks/sftp.py b/airflow/providers/sftp/hooks/sftp.py
index 498f362..e2a991e 100644
--- a/airflow/providers/sftp/hooks/sftp.py
+++ b/airflow/providers/sftp/hooks/sftp.py
@@ -115,6 +115,12 @@ class SFTPHook(SSHHook):
             cnopts = pysftp.CnOpts()
             if self.no_host_key_check:
                 cnopts.hostkeys = None
+            else:
+                if self.host_key is not None:
+                    cnopts.hostkeys.add(self.remote_host, 'ssh-rsa', self.host_key)
+                else:
+                    pass  # will fallback to system host keys if none explicitly specified in conn extra
+
             cnopts.compression = self.compress
             cnopts.ciphers = self.ciphers
             conn_params = {
diff --git a/airflow/providers/ssh/hooks/ssh.py b/airflow/providers/ssh/hooks/ssh.py
index d420b1b..1b35db3 100644
--- a/airflow/providers/ssh/hooks/ssh.py
+++ b/airflow/providers/ssh/hooks/ssh.py
@@ -19,6 +19,7 @@
 import getpass
 import os
 import warnings
+from base64 import decodebytes
 from io import StringIO
 from typing import Dict, Optional, Tuple, Union
 
@@ -30,7 +31,7 @@ from airflow.exceptions import AirflowException
 from airflow.hooks.base import BaseHook
 
 
-class SSHHook(BaseHook):
+class SSHHook(BaseHook):  # pylint: disable=too-many-instance-attributes
     """
     Hook for ssh remote execution using Paramiko.
     ref: https://github.com/paramiko/paramiko
@@ -72,7 +73,7 @@ class SSHHook(BaseHook):
             },
         }
 
-    def __init__(
+    def __init__(  # pylint: disable=too-many-statements
         self,
         ssh_conn_id: Optional[str] = None,
         remote_host: Optional[str] = None,
@@ -99,6 +100,7 @@ class SSHHook(BaseHook):
         self.no_host_key_check = True
         self.allow_host_key_change = False
         self.host_proxy = None
+        self.host_key = None
         self.look_for_keys = True
 
         # Placeholder for deprecated __enter__
@@ -149,7 +151,9 @@ class SSHHook(BaseHook):
                     and str(extra_options["look_for_keys"]).lower() == 'false'
                 ):
                     self.look_for_keys = False
-
+                if "host_key" in extra_options and self.no_host_key_check is False:
+                    decoded_host_key = decodebytes(extra_options["host_key"].encode('utf-8'))
+                    self.host_key = paramiko.RSAKey(data=decoded_host_key)
         if self.pkey and self.key_file:
             raise AirflowException(
                 "Params key_file and private_key both provided.  Must provide no more than one."
@@ -198,10 +202,18 @@ class SSHHook(BaseHook):
                 'This wont protect against Man-In-The-Middle attacks'
             )
             client.load_system_host_keys()
+
         if self.no_host_key_check:
             self.log.warning('No Host Key Verification. This wont protect against Man-In-The-Middle attacks')
             # Default is RejectPolicy
             client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
+        else:
+            if self.host_key is not None:
+                client_host_keys = client.get_host_keys()
+                client_host_keys.add(self.remote_host, 'ssh-rsa', self.host_key)
+            else:
+                pass  # will fallback to system host keys if none explicitly specified in conn extra
+
         connect_kwargs = dict(
             hostname=self.remote_host,
             username=self.username,
diff --git a/docs/apache-airflow-providers-ssh/connections/ssh.rst b/docs/apache-airflow-providers-ssh/connections/ssh.rst
index 54e902e..f320381 100644
--- a/docs/apache-airflow-providers-ssh/connections/ssh.rst
+++ b/docs/apache-airflow-providers-ssh/connections/ssh.rst
@@ -47,9 +47,10 @@ Extra (optional)
     * ``private_key_passphrase`` - Content of the private key passphrase used to decrypt the private key.
     * ``timeout`` - An optional timeout (in seconds) for the TCP connect. Default is ``10``.
     * ``compress`` - ``true`` to ask the remote client/server to compress traffic; ``false`` to refuse compression. Default is ``true``.
-    * ``no_host_key_check`` - Set to ``false`` to restrict connecting to hosts with no entries in ``~/.ssh/known_hosts`` (Hosts file). This provides maximum protection against trojan horse attacks, but can be troublesome when the ``/etc/ssh/ssh_known_hosts`` file is poorly maintained or connections to new hosts are frequently made. This option forces the user to manually add all new hosts. Default is ``true``, ssh will automatically add new host keys to the user known hosts files.
+    * ``no_host_key_check`` - Set to ``false`` to restrict connecting to hosts with either no entries in ``~/.ssh/known_hosts`` (Hosts file) or not present in the ``host_key`` extra. This provides maximum protection against trojan horse attacks, but can be troublesome when the ``/etc/ssh/ssh_known_hosts`` file is poorly maintained or connections to new hosts are frequently made. This option forces the user to manually add all new hosts. Default is ``true``, ssh will automatically add new [...]
     * ``allow_host_key_change`` - Set to ``true`` if you want to allow connecting to hosts that has host key changed or when you get 'REMOTE HOST IDENTIFICATION HAS CHANGED' error.  This wont protect against Man-In-The-Middle attacks. Other possible solution is to remove the host entry from ``~/.ssh/known_hosts`` file. Default is ``false``.
     * ``look_for_keys`` - Set to ``false`` if you want to disable searching for discoverable private key files in ``~/.ssh/``
+    * ``host_key`` - The base64 encoded ssh-rsa public key of the host, as you would find in the ``known_hosts`` file. Specifying this, along with ``no_host_key_check=False`` allows you to only make the connection if the public key of the endpoint matches this value.
 
     Example "extras" field:
 
@@ -59,9 +60,10 @@ Extra (optional)
           "key_file": "/home/airflow/.ssh/id_rsa",
           "timeout": "10",
           "compress": "false",
+          "look_for_keys": "false",
           "no_host_key_check": "false",
           "allow_host_key_change": "false",
-          "look_for_keys": "false"
+          "host_key": "AAAHD...YDWwq=="
        }
 
     When specifying the connection as URI (in :envvar:`AIRFLOW_CONN_{CONN_ID}` variable) you should specify it
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 238021e..84f1860 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -1157,6 +1157,7 @@ rootcss
 rowid
 rpc
 rshift
+rsa
 rst
 rtype
 ru
diff --git a/tests/providers/sftp/hooks/test_sftp.py b/tests/providers/sftp/hooks/test_sftp.py
index 45097e6..9211c30 100644
--- a/tests/providers/sftp/hooks/test_sftp.py
+++ b/tests/providers/sftp/hooks/test_sftp.py
@@ -15,12 +15,14 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-
+import json
 import os
 import shutil
 import unittest
+from io import StringIO
 from unittest import mock
 
+import paramiko
 import pysftp
 from parameterized import parameterized
 
@@ -28,6 +30,15 @@ from airflow.models import Connection
 from airflow.providers.sftp.hooks.sftp import SFTPHook
 from airflow.utils.session import provide_session
 
+
+def generate_host_key(pkey: paramiko.PKey):
+    key_fh = StringIO()
+    pkey.write_private_key(key_fh)
+    key_fh.seek(0)
+    key_obj = paramiko.RSAKey(file_obj=key_fh)
+    return key_obj.get_base64()
+
+
 TMP_PATH = '/tmp'
 TMP_DIR_FOR_TESTS = 'tests_sftp_hook_dir'
 SUB_DIR = "sub_dir"
@@ -35,6 +46,9 @@ TMP_FILE_FOR_TESTS = 'test_file.txt'
 
 SFTP_CONNECTION_USER = "root"
 
+TEST_PKEY = paramiko.RSAKey.generate(4096)
+TEST_HOST_KEY = generate_host_key(pkey=TEST_PKEY)
+
 
 class TestSFTPHook(unittest.TestCase):
     @provide_session
@@ -178,6 +192,31 @@ class TestSFTPHook(unittest.TestCase):
         hook = SFTPHook()
         self.assertEqual(hook.no_host_key_check, False)
 
+    @mock.patch('airflow.providers.sftp.hooks.sftp.SFTPHook.get_connection')
+    def test_host_key_default(self, get_connection):
+        connection = Connection(login='login', host='host')
+        get_connection.return_value = connection
+        hook = SFTPHook()
+        self.assertEqual(hook.host_key, None)
+
+    @mock.patch('airflow.providers.sftp.hooks.sftp.SFTPHook.get_connection')
+    def test_host_key(self, get_connection):
+        connection = Connection(
+            login='login',
+            host='host',
+            extra=json.dumps({"host_key": TEST_HOST_KEY, "no_host_key_check": False}),
+        )
+        get_connection.return_value = connection
+        hook = SFTPHook()
+        self.assertEqual(hook.host_key.get_base64(), TEST_HOST_KEY)
+
+    @mock.patch('airflow.providers.sftp.hooks.sftp.SFTPHook.get_connection')
+    def test_host_key_with_no_host_key_check(self, get_connection):
+        connection = Connection(login='login', host='host', extra=json.dumps({"host_key": TEST_HOST_KEY}))
+        get_connection.return_value = connection
+        hook = SFTPHook()
+        self.assertEqual(hook.host_key, None)
+
     @parameterized.expand(
         [
             (os.path.join(TMP_PATH, TMP_DIR_FOR_TESTS), True),
diff --git a/tests/providers/ssh/hooks/test_ssh.py b/tests/providers/ssh/hooks/test_ssh.py
index 027de40..fea52bc 100644
--- a/tests/providers/ssh/hooks/test_ssh.py
+++ b/tests/providers/ssh/hooks/test_ssh.py
@@ -51,8 +51,17 @@ def generate_key_string(pkey: paramiko.PKey, passphrase: Optional[str] = None):
     return key_str
 
 
+def generate_host_key(pkey: paramiko.PKey):
+    key_fh = StringIO()
+    pkey.write_private_key(key_fh)
+    key_fh.seek(0)
+    key_obj = paramiko.RSAKey(file_obj=key_fh)
+    return key_obj.get_base64()
+
+
 TEST_PKEY = paramiko.RSAKey.generate(4096)
 TEST_PRIVATE_KEY = generate_key_string(pkey=TEST_PKEY)
+TEST_HOST_KEY = generate_host_key(pkey=TEST_PKEY)
 
 PASSPHRASE = ''.join(random.choice(string.ascii_letters) for i in range(10))
 TEST_ENCRYPTED_PRIVATE_KEY = generate_key_string(pkey=TEST_PKEY, passphrase=PASSPHRASE)
@@ -63,6 +72,10 @@ class TestSSHHook(unittest.TestCase):
     CONN_SSH_WITH_PRIVATE_KEY_PASSPHRASE_EXTRA = 'ssh_with_private_key_passphrase_extra'
     CONN_SSH_WITH_EXTRA = 'ssh_with_extra'
     CONN_SSH_WITH_EXTRA_FALSE_LOOK_FOR_KEYS = 'ssh_with_extra_false_look_for_keys'
+    CONN_SSH_WITH_HOST_KEY_EXTRA = 'ssh_with_host_key_extra'
+    CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE = 'ssh_with_host_key_and_no_host_key_check_false'
+    CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_TRUE = 'ssh_with_host_key_and_no_host_key_check_true'
+    CONN_SSH_WITH_NO_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE = 'ssh_with_no_host_key_and_no_host_key_check_false'
 
     @classmethod
     def tearDownClass(cls) -> None:
@@ -70,6 +83,11 @@ class TestSSHHook(unittest.TestCase):
             conns_to_reset = [
                 cls.CONN_SSH_WITH_PRIVATE_KEY_EXTRA,
                 cls.CONN_SSH_WITH_PRIVATE_KEY_PASSPHRASE_EXTRA,
+                cls.CONN_SSH_WITH_EXTRA,
+                cls.CONN_SSH_WITH_HOST_KEY_EXTRA,
+                cls.CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE,
+                cls.CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_TRUE,
+                cls.CONN_SSH_WITH_NO_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE,
             ]
             connections = session.query(Connection).filter(Connection.conn_id.in_(conns_to_reset))
             connections.delete(synchronize_session=False)
@@ -116,6 +134,42 @@ class TestSSHHook(unittest.TestCase):
                 ),
             )
         )
+        db.merge_conn(
+            Connection(
+                conn_id=cls.CONN_SSH_WITH_HOST_KEY_EXTRA,
+                host='localhost',
+                conn_type='ssh',
+                extra=json.dumps({"private_key": TEST_PRIVATE_KEY, "host_key": TEST_HOST_KEY}),
+            )
+        )
+        db.merge_conn(
+            Connection(
+                conn_id=cls.CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE,
+                host='remote_host',
+                conn_type='ssh',
+                extra=json.dumps(
+                    {"private_key": TEST_PRIVATE_KEY, "host_key": TEST_HOST_KEY, "no_host_key_check": False}
+                ),
+            )
+        )
+        db.merge_conn(
+            Connection(
+                conn_id=cls.CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_TRUE,
+                host='remote_host',
+                conn_type='ssh',
+                extra=json.dumps(
+                    {"private_key": TEST_PRIVATE_KEY, "host_key": TEST_HOST_KEY, "no_host_key_check": True}
+                ),
+            )
+        )
+        db.merge_conn(
+            Connection(
+                conn_id=cls.CONN_SSH_WITH_NO_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE,
+                host='remote_host',
+                conn_type='ssh',
+                extra=json.dumps({"private_key": TEST_PRIVATE_KEY, "no_host_key_check": False}),
+            )
+        )
 
     @mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
     def test_ssh_connection_with_password(self, ssh_mock):
@@ -344,3 +398,42 @@ class TestSSHHook(unittest.TestCase):
                 sock=None,
                 look_for_keys=True,
             )
+
+    @mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+    def test_ssh_connection_with_host_key_extra(self, ssh_client):
+        hook = SSHHook(ssh_conn_id=self.CONN_SSH_WITH_HOST_KEY_EXTRA)
+        assert hook.host_key is None  # Since default no_host_key_check = True unless explicit override
+        with hook.get_conn():
+            assert ssh_client.return_value.connect.called is True
+            assert ssh_client.return_value.get_host_keys.return_value.add.called is False
+
+    @mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+    def test_ssh_connection_with_host_key_where_no_host_key_check_is_true(self, ssh_client):
+        hook = SSHHook(ssh_conn_id=self.CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_TRUE)
+        assert hook.host_key is None
+        with hook.get_conn():
+            assert ssh_client.return_value.connect.called is True
+            assert ssh_client.return_value.get_host_keys.return_value.add.called is False
+
+    @mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+    def test_ssh_connection_with_host_key_where_no_host_key_check_is_false(self, ssh_client):
+        hook = SSHHook(ssh_conn_id=self.CONN_SSH_WITH_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE)
+        assert hook.host_key.get_base64() == TEST_HOST_KEY
+        with hook.get_conn():
+            assert ssh_client.return_value.connect.called is True
+            assert ssh_client.return_value.get_host_keys.return_value.add.called is True
+            assert ssh_client.return_value.get_host_keys.return_value.add.call_args == mock.call(
+                hook.remote_host, 'ssh-rsa', hook.host_key
+            )
+
+    @mock.patch('airflow.providers.ssh.hooks.ssh.paramiko.SSHClient')
+    def test_ssh_connection_with_no_host_key_where_no_host_key_check_is_false(self, ssh_client):
+        hook = SSHHook(ssh_conn_id=self.CONN_SSH_WITH_NO_HOST_KEY_AND_NO_HOST_KEY_CHECK_FALSE)
+        assert hook.host_key is None
+        with hook.get_conn():
+            assert ssh_client.return_value.connect.called is True
+            assert ssh_client.return_value.get_host_keys.return_value.add.called is False
+
+
+if __name__ == '__main__':
+    unittest.main()
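
A hedged example of the connection "extras" this change makes SSHHook and SFTPHook honor. The key value is a truncated placeholder (use the base64 body of the host's ssh-rsa line from known_hosts), and the conn_id/host names are made up:

```
import json

from airflow.models import Connection

extra = {
    "no_host_key_check": "false",           # must be false, otherwise host_key is ignored
    "host_key": "AAAAB3NzaC1yc2E...WwQ==",  # placeholder base64 public key body
    "look_for_keys": "false",
}

# Either paste the JSON into the connection's "Extra" field in the UI,
# or build the Connection object programmatically, as the tests above do.
conn = Connection(
    conn_id="my_ssh",
    conn_type="ssh",
    host="remote_host",
    login="airflow",
    extra=json.dumps(extra),
)
print(json.dumps(extra, indent=2))
```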

[airflow] 26/42: Add plain format output to cli tables (#14546)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 051d239d95c78c97c9591a1a213aa0f0369db79d
Author: Tomek Urbaszek <tu...@gmail.com>
AuthorDate: Tue Mar 2 20:12:53 2021 +0100

    Add plain format output to cli tables (#14546)
    
    Add plain format output to cli tables so users can use standard
    linux utilities like awk, xargs etc.
    
    closes: #14517
    (cherry picked from commit 0ef084c3b70089b9b061090f7d88ce86e3651ed4)
---
 airflow/cli/cli_parser.py         |  6 +++---
 airflow/cli/simple_table.py       | 11 +++++++++++
 docs/apache-airflow/usage-cli.rst |  1 +
 docs/spelling_wordlist.txt        |  1 +
 4 files changed, 16 insertions(+), 3 deletions(-)

diff --git a/airflow/cli/cli_parser.py b/airflow/cli/cli_parser.py
index fae516e..e62346a 100644
--- a/airflow/cli/cli_parser.py
+++ b/airflow/cli/cli_parser.py
@@ -178,9 +178,9 @@ ARG_OUTPUT = Arg(
         "-o",
         "--output",
     ),
-    help="Output format. Allowed values: json, yaml, table (default: table)",
-    metavar="(table, json, yaml)",
-    choices=("table", "json", "yaml"),
+    help="Output format. Allowed values: json, yaml, plain, table (default: table)",
+    metavar="(table, json, yaml, plain)",
+    choices=("table", "json", "yaml", "plain"),
     default="table",
 )
 ARG_COLOR = Arg(
diff --git a/airflow/cli/simple_table.py b/airflow/cli/simple_table.py
index 696b9bf..3851272 100644
--- a/airflow/cli/simple_table.py
+++ b/airflow/cli/simple_table.py
@@ -23,6 +23,7 @@ from rich.box import ASCII_DOUBLE_HEAD
 from rich.console import Console
 from rich.syntax import Syntax
 from rich.table import Table
+from tabulate import tabulate
 
 from airflow.plugins_manager import PluginsDirectorySource
 
@@ -56,6 +57,15 @@ class AirflowConsole(Console):
             table.add_row(*[str(d) for d in row.values()])
         self.print(table)
 
+    def print_as_plain_table(self, data: List[Dict]):
+        """Renders a list of dictionaries as a simple table that can be easily piped"""
+        if not data:
+            self.print("No data found")
+            return
+        rows = [d.values() for d in data]
+        output = tabulate(rows, tablefmt="plain", headers=data[0].keys())
+        print(output)
+
     # pylint: disable=too-many-return-statements
     def _normalize_data(self, value: Any, output: str) -> Optional[Union[list, str, dict]]:
         if isinstance(value, (tuple, list)):
@@ -76,6 +86,7 @@ class AirflowConsole(Console):
             "json": self.print_as_json,
             "yaml": self.print_as_yaml,
             "table": self.print_as_table,
+            "plain": self.print_as_plain_table,
         }
         renderer = output_to_renderer.get(output)
         if not renderer:
diff --git a/docs/apache-airflow/usage-cli.rst b/docs/apache-airflow/usage-cli.rst
index 5dadab8..34f7ae7 100644
--- a/docs/apache-airflow/usage-cli.rst
+++ b/docs/apache-airflow/usage-cli.rst
@@ -182,6 +182,7 @@ Some Airflow commands like ``airflow dags list`` or ``airflow tasks states-for-d
 which allow users to change the formatting of command's output. Possible options:
 
   - ``table`` - renders the information as a plain text table
+  - ``plain`` - renders the information as a simple table which can be parsed by standard linux utilities
   - ``json`` - renders the information in form of json string
   - ``yaml`` - render the information in form of valid yaml
 
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 84f1860..322ce7b 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -928,6 +928,7 @@ licence
 licences
 lifecycle
 lineterminator
+linux
 livy
 localExecutor
 localexecutor
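
A short, hedged sketch of what the new renderer does under the hood, assuming only the tabulate dependency shown in the diff; the sample rows are made up:

```
from tabulate import tabulate

data = [
    {"dag_id": "example_bash_operator", "is_paused": "True"},
    {"dag_id": "example_python_operator", "is_paused": "False"},
]
rows = [d.values() for d in data]
print(tabulate(rows, tablefmt="plain", headers=list(data[0].keys())))
# dag_id                    is_paused
# example_bash_operator     True
# example_python_operator   False
#
# With no box-drawing characters, the output pipes cleanly into awk/cut/xargs,
# e.g.:  airflow dags list -o plain | awk 'NR > 1 {print $1}'
```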

[airflow] 27/42: BugFix: fix DAG doc display (especially for TaskFlow DAGs) (#14564)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a1560534f6936ac2758dd64d15eeb980b91ff11a
Author: Xiaodong DENG <xd...@apache.org>
AuthorDate: Wed Mar 3 00:11:07 2021 +0100

    BugFix: fix DAG doc display (especially for TaskFlow DAGs) (#14564)
    
    Because of how TaskFlow DAGs are constructed, their __doc__ lines
    may start with spaces. This fails markdown.markdown(), and the
    doc in Markdown format cannot be transformed into HTML properly,
    and further fails the doc display in the UI.
    
    This commit fixes this by always doing left strip for each line for the doc md.
    
    (cherry picked from commit 22e3a4cc01d31024ffaa6c9c7767eec8467a9df7)
---
 airflow/www/utils.py    |  2 ++
 tests/www/test_utils.py | 11 +++++++++++
 2 files changed, 13 insertions(+)

diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index 265a12f..afd94c6 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -326,6 +326,8 @@ def wrapped_markdown(s, css_class=None):
     if s is None:
         return None
 
+    s = '\n'.join(line.lstrip() for line in s.split('\n'))
+
     return Markup(f'<div class="{css_class}" >' + markdown.markdown(s, extensions=['tables']) + "</div>")
 
 
diff --git a/tests/www/test_utils.py b/tests/www/test_utils.py
index f6e53e4..5ced73a 100644
--- a/tests/www/test_utils.py
+++ b/tests/www/test_utils.py
@@ -245,3 +245,14 @@ class TestWrappedMarkdown(unittest.TestCase):
             '</td>\n<td>14m</td>\n</tr>\n</tbody>\n'
             '</table></div>'
         ) == rendered
+
+    def test_wrapped_markdown_with_indented_lines(self):
+        rendered = wrapped_markdown(
+            """
+                # header
+                1st line
+                2nd line
+            """
+        )
+
+        assert '<div class="None" ><h1>header</h1>\n<p>1st line\n2nd line</p></div>' == rendered
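
A standalone, hedged reproduction of why indented docstrings break rendering: markdown treats lines indented by four or more spaces as a code block, so left-stripping each line (as the patch does) restores the intended heading and paragraph.

```
import markdown

doc = """
    # header
    1st line
    2nd line
"""

print(markdown.markdown(doc))
# -> <pre><code># header ...</code></pre>   (rendered as a literal code block)

stripped = "\n".join(line.lstrip() for line in doc.split("\n"))
print(markdown.markdown(stripped))
# -> <h1>header</h1>\n<p>1st line\n2nd line</p>   (rendered as intended)
```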

[airflow] 40/42: fix losing duration < 1 secs in tree (#13537)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 71b44a2244521582d94a4298d514234738e1b1e1
Author: Yujia Yang <33...@users.noreply.github.com>
AuthorDate: Fri Feb 19 00:14:44 2021 +0800

    fix losing duration < 1 secs in tree (#13537)
    
    truncate duration to 3dp when duration < 10
    
    Update airflow/www/views.py
    
    Co-authored-by: Ash Berlin-Taylor <as...@firemirror.com>
    
    fix import, retrigger ci
    
    (cherry picked from commit 8f21fb1bf77fc67e37dc13613778ff1e6fa87cea)
---
 airflow/www/views.py    | 10 +++++++++-
 tests/www/test_views.py | 14 +++++++++++++-
 2 files changed, 22 insertions(+), 2 deletions(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index ab31607..4c272b5 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -111,6 +111,14 @@ FILTER_TAGS_COOKIE = 'tags_filter'
 FILTER_STATUS_COOKIE = 'dag_status_filter'
 
 
+def truncate_task_duration(task_duration):
+    """
+    Cast the task_duration to an int if task_duration > 10s, as an optimization for large/huge dags;
+    otherwise keep it as a float with 3dp
+    """
+    return int(task_duration) if task_duration > 10.0 else round(task_duration, 3)
+
+
 def get_safe_url(url):
     """Given a user-supplied URL, ensure it points to our web server"""
     valid_schemes = ['http', 'https', '']
@@ -1921,7 +1929,7 @@ class Airflow(AirflowBaseView):  # noqa: D101  pylint: disable=too-many-public-m
                 # round to seconds to reduce payload size
                 task_instance_data[2] = int(task_instance.start_date.timestamp())
                 if task_instance.duration is not None:
-                    task_instance_data[3] = int(task_instance.duration)
+                    task_instance_data[3] = truncate_task_duration(task_instance.duration)
 
             return task_instance_data
 
diff --git a/tests/www/test_views.py b/tests/www/test_views.py
index b391e56..d284314 100644
--- a/tests/www/test_views.py
+++ b/tests/www/test_views.py
@@ -62,7 +62,7 @@ from airflow.utils.state import State
 from airflow.utils.timezone import datetime
 from airflow.utils.types import DagRunType
 from airflow.www import app as application
-from airflow.www.views import ConnectionModelView, get_safe_url
+from airflow.www.views import ConnectionModelView, get_safe_url, truncate_task_duration
 from tests.test_utils import fab_utils
 from tests.test_utils.asserts import assert_queries_count
 from tests.test_utils.config import conf_vars
@@ -3311,3 +3311,15 @@ class TestHelperFunctions(TestBase):
         mock_url_for.return_value = "/home"
         with self.app.test_request_context(base_url="http://localhost:8080"):
             assert get_safe_url(test_url) == expected_url
+
+    @parameterized.expand(
+        [
+            (0.12345, 0.123),
+            (0.12355, 0.124),
+            (3.12, 3.12),
+            (9.99999, 10.0),
+            (10.01232, 10),
+        ]
+    )
+    def test_truncate_task_duration(self, test_duration, expected_duration):
+        assert truncate_task_duration(test_duration) == expected_duration

[airflow] 30/42: Default to Celery Task model when backend model does not exist (#14612)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0fc6ca33fd698c123875454b64299b9af4dd4877
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Mar 5 19:34:17 2021 +0000

    Default to Celery Task model when backend model does not exist (#14612)
    
    closes https://github.com/apache/airflow/issues/14586
    
    We added this feature in https://github.com/apache/airflow/pull/12336
    but it looks like for some users this attribute does not exist.
    
    I am not sure whether they are using a different Celery DB backend,
    but even Celery >= 5 contains this attribute
    (https://github.com/celery/celery/blob/v5.0.5/celery/backends/database/__init__.py#L66),
    as does Celery 4. This commit falls back to the Celery Task model when an
    AttributeError occurs.
    
    (cherry picked from commit 33910d6c699b5528db4be40d31199626dafed912)
---
 airflow/executors/celery_executor.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/airflow/executors/celery_executor.py b/airflow/executors/celery_executor.py
index 8bbaed1..a670294 100644
--- a/airflow/executors/celery_executor.py
+++ b/airflow/executors/celery_executor.py
@@ -35,7 +35,7 @@ from typing import Any, Dict, List, Mapping, MutableMapping, Optional, Set, Tupl
 
 from celery import Celery, Task, states as celery_states
 from celery.backends.base import BaseKeyValueStoreBackend
-from celery.backends.database import DatabaseBackend, session_cleanup
+from celery.backends.database import DatabaseBackend, Task as TaskDb, session_cleanup
 from celery.result import AsyncResult
 from celery.signals import import_modules as celery_import_modules
 from setproctitle import setproctitle  # pylint: disable=no-name-in-module
@@ -567,7 +567,7 @@ class BulkStateFetcher(LoggingMixin):
     def _get_many_from_db_backend(self, async_tasks) -> Mapping[str, EventBufferValueType]:
         task_ids = _tasks_list_to_task_ids(async_tasks)
         session = app.backend.ResultSession()
-        task_cls = app.backend.task_cls
+        task_cls = getattr(app.backend, "task_cls", TaskDb)
         with session_cleanup(session):
             tasks = session.query(task_cls).filter(task_cls.task_id.in_(task_ids)).all()
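
A hedged sketch of the fallback pattern, with stand-in classes instead of real Celery backends:

```
class DefaultTask:          # stand-in for celery.backends.database.Task
    pass


class CustomTask(DefaultTask):
    pass


class NewBackend:
    task_cls = CustomTask   # backends that expose the attribute


class OldBackend:
    pass                    # no task_cls attribute -> previously an AttributeError


def resolve_task_cls(backend) -> type:
    # Same idea as: task_cls = getattr(app.backend, "task_cls", TaskDb)
    return getattr(backend, "task_cls", DefaultTask)


assert resolve_task_cls(NewBackend()) is CustomTask
assert resolve_task_cls(OldBackend()) is DefaultTask
```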
 

[airflow] 21/42: Gracefully handle missing start_date and end_date for DagRun (#14452)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 97d98bbbe391095cb2534cdd1377395e751b1810
Author: Rik Heijdens <Ri...@users.noreply.github.com>
AuthorDate: Thu Feb 25 15:40:38 2021 +0100

    Gracefully handle missing start_date and end_date for DagRun (#14452)
    
    closes: #14384
    
    This PR fixes two issues:
    
    1) A TypeError that would be raised from _emit_duration_stats_for_finished_state() when the scheduler transitions a DagRun from a running state into a success or failed state if the DagRun did not have a start_date or end_date set.
    
    2) An issue with the DagRunEditForm, which would clear the start_date and end_date of a DagRun when the form was used to transition it from a failed state back into a running state (or any other state). If the scheduler then determined that the DagRun should have been in the failed or success state (e.g. because the task instances weren't cleared), this would lead to a scheduler crash.
    
    (cherry picked from commit 997a009715fb82c241a47405cc8647d23580af25)
---
 airflow/models/dagrun.py | 6 ++++++
 airflow/www/views.py     | 2 +-
 2 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/airflow/models/dagrun.py b/airflow/models/dagrun.py
index fe7b29c..fae58e1 100644
--- a/airflow/models/dagrun.py
+++ b/airflow/models/dagrun.py
@@ -611,6 +611,12 @@ class DagRun(Base, LoggingMixin):
     def _emit_duration_stats_for_finished_state(self):
         if self.state == State.RUNNING:
             return
+        if self.start_date is None:
+            self.log.warning('Failed to record duration of %s: start_date is not set.', self)
+            return
+        if self.end_date is None:
+            self.log.warning('Failed to record duration of %s: end_date is not set.', self)
+            return
 
         duration = self.end_date - self.start_date
         if self.state is State.SUCCESS:
diff --git a/airflow/www/views.py b/airflow/www/views.py
index c1155ee..78dbbea 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -3290,7 +3290,7 @@ class DagRunModelView(AirflowModelView):
         'external_trigger',
         'conf',
     ]
-    edit_columns = ['state', 'dag_id', 'execution_date', 'run_id', 'conf']
+    edit_columns = ['state', 'dag_id', 'execution_date', 'start_date', 'end_date', 'run_id', 'conf']
 
     base_order = ('execution_date', 'desc')
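
A hedged sketch of the guard added to _emit_duration_stats_for_finished_state; finished_run_duration is a hypothetical helper that isolates the None checks:

```
from datetime import datetime, timedelta, timezone
from typing import Optional


def finished_run_duration(start_date: Optional[datetime], end_date: Optional[datetime]) -> Optional[timedelta]:
    """Return the run duration, or None (the real code logs a warning) if a timestamp is missing."""
    if start_date is None:
        return None
    if end_date is None:
        return None
    return end_date - start_date


now = datetime.now(timezone.utc)
assert finished_run_duration(None, now) is None                      # no longer raises TypeError
assert finished_run_duration(now - timedelta(minutes=3), now) == timedelta(minutes=3)
```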
 

[airflow] 29/42: Replace Graph View Screenshot to show Auto-refresh (#14571)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d6b40e279f483911d150d82fa3b8a511390876a6
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed Mar 3 10:48:28 2021 +0000

    Replace Graph View Screenshot to show Auto-refresh (#14571)
    
    (cherry picked from commit bcc8b5e4a674a4f13f362061609a2716a9a45700)
---
 docs/apache-airflow/img/graph.png | Bin 118674 -> 225347 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)

diff --git a/docs/apache-airflow/img/graph.png b/docs/apache-airflow/img/graph.png
index d883959..23279f8 100644
Binary files a/docs/apache-airflow/img/graph.png and b/docs/apache-airflow/img/graph.png differ

[airflow] 11/42: Rename last_scheduler_run into last_parsed_time, and ensure it's updated in DB (#14581)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7156d6cdbc9882999eecfc19f791168dd9bc1d88
Author: Xiaodong DENG <xd...@apache.org>
AuthorDate: Fri Mar 5 23:05:32 2021 +0100

    Rename last_scheduler_run into last_parsed_time, and ensure it's updated in DB (#14581)
    
    - Fix functionality
      last_scheduler_run was missed in the process of
      migrating from sync_to_db/bulk_sync_to_db to bulk_write_to_db.
    
      This issue breaks the DAG.deactivate_stale_dags() method
      and blocks users from checking the last schedule time of each DAG in the DB
    
    - Change name last_scheduler_run to last_parsed_time,
      to better reflect what it does now.
      Migration script is added, and codebase is updated
    
    - To ensure the migration scripts can work,
      we have to limit the columns needed in create_dag_specific_permissions(),
      so migration 2c6edca13270 can work with the renamed column.
    
    Co-authored-by: Kaxil Naik <ka...@gmail.com>
    (cherry picked from commit c2a0cb958835d0cecd90f82311e2aa8b1bbd22a0)
---
 ...e42bb497a22_rename_last_scheduler_run_column.py | 65 ++++++++++++++++++++++
 airflow/models/dag.py                              |  7 ++-
 airflow/www/security.py                            |  9 +--
 airflow/www/views.py                               |  2 +-
 tests/models/test_dag.py                           | 15 +++--
 5 files changed, 86 insertions(+), 12 deletions(-)

diff --git a/airflow/migrations/versions/2e42bb497a22_rename_last_scheduler_run_column.py b/airflow/migrations/versions/2e42bb497a22_rename_last_scheduler_run_column.py
new file mode 100644
index 0000000..97d8ff6
--- /dev/null
+++ b/airflow/migrations/versions/2e42bb497a22_rename_last_scheduler_run_column.py
@@ -0,0 +1,65 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""rename last_scheduler_run column
+
+Revision ID: 2e42bb497a22
+Revises: 8646922c8a04
+Create Date: 2021-03-04 19:50:38.880942
+
+"""
+
+import sqlalchemy as sa
+from alembic import op
+from sqlalchemy.dialects import mssql
+
+# revision identifiers, used by Alembic.
+revision = '2e42bb497a22'
+down_revision = '8646922c8a04'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+    """Apply rename last_scheduler_run column"""
+    conn = op.get_bind()
+    if conn.dialect.name == "mssql":
+        with op.batch_alter_table('dag') as batch_op:
+            batch_op.alter_column(
+                'last_scheduler_run', new_column_name='last_parsed_time', type_=mssql.DATETIME2(precision=6)
+            )
+    else:
+        with op.batch_alter_table('dag') as batch_op:
+            batch_op.alter_column(
+                'last_scheduler_run', new_column_name='last_parsed_time', type_=sa.TIMESTAMP(timezone=True)
+            )
+
+
+def downgrade():
+    """Unapply rename last_scheduler_run column"""
+    conn = op.get_bind()
+    if conn.dialect.name == "mssql":
+        with op.batch_alter_table('dag') as batch_op:
+            batch_op.alter_column(
+                'last_parsed_time', new_column_name='last_scheduler_run', type_=mssql.DATETIME2(precision=6)
+            )
+    else:
+        with op.batch_alter_table('dag') as batch_op:
+            batch_op.alter_column(
+                'last_parsed_time', new_column_name='last_scheduler_run', type_=sa.TIMESTAMP(timezone=True)
+            )
diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index d77cdfc..47fc34b 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1882,6 +1882,7 @@ class DAG(LoggingMixin):
                 orm_dag.fileloc = dag.fileloc
                 orm_dag.owners = dag.owner
             orm_dag.is_active = True
+            orm_dag.last_parsed_time = timezone.utcnow()
             orm_dag.default_view = dag.default_view
             orm_dag.description = dag.description
             orm_dag.schedule_interval = dag.schedule_interval
@@ -1966,13 +1967,13 @@ class DAG(LoggingMixin):
         """
         for dag in (
             session.query(DagModel)
-            .filter(DagModel.last_scheduler_run < expiration_date, DagModel.is_active)
+            .filter(DagModel.last_parsed_time < expiration_date, DagModel.is_active)
             .all()
         ):
             log.info(
                 "Deactivating DAG ID %s since it was last touched by the scheduler at %s",
                 dag.dag_id,
-                dag.last_scheduler_run.isoformat(),
+                dag.last_parsed_time.isoformat(),
             )
             dag.is_active = False
             session.merge(dag)
@@ -2075,7 +2076,7 @@ class DagModel(Base):
     # Whether that DAG was seen on the last DagBag load
     is_active = Column(Boolean, default=False)
     # Last time the scheduler started
-    last_scheduler_run = Column(UtcDateTime)
+    last_parsed_time = Column(UtcDateTime)
     # Last time this DAG was pickled
     last_pickled = Column(UtcDateTime)
     # Time when the DAG last received a refresh signal
diff --git a/airflow/www/security.py b/airflow/www/security.py
index 09af167..5201ef6 100644
--- a/airflow/www/security.py
+++ b/airflow/www/security.py
@@ -474,15 +474,16 @@ class AirflowSecurityManager(SecurityManager, LoggingMixin):  # pylint: disable=
         :return: None.
         """
         perms = self.get_all_permissions()
-        dag_models = (
-            session.query(models.DagModel)
+        rows = (
+            session.query(models.DagModel.dag_id)
             .filter(or_(models.DagModel.is_active, models.DagModel.is_paused))
             .all()
         )
 
-        for dag in dag_models:
+        for row in rows:
+            dag_id = row[0]
             for perm_name in self.DAG_PERMS:
-                dag_resource_name = self.prefixed_dag_id(dag.dag_id)
+                dag_resource_name = self.prefixed_dag_id(dag_id)
                 if dag_resource_name and perm_name and (dag_resource_name, perm_name) not in perms:
                     self._merge_perm(perm_name, dag_resource_name)
 
diff --git a/airflow/www/views.py b/airflow/www/views.py
index b7c5372..c1155ee 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -3729,7 +3729,7 @@ class DagModelView(AirflowModelView):
     list_columns = [
         'dag_id',
         'is_paused',
-        'last_scheduler_run',
+        'last_parsed_time',
         'last_expired',
         'scheduler_lock',
         'fileloc',
diff --git a/tests/models/test_dag.py b/tests/models/test_dag.py
index c923241..0aae371 100644
--- a/tests/models/test_dag.py
+++ b/tests/models/test_dag.py
@@ -646,15 +646,19 @@ class TestDag(unittest.TestCase):
                 ('dag-bulk-sync-2', 'test-dag'),
                 ('dag-bulk-sync-3', 'test-dag'),
             } == set(session.query(DagTag.dag_id, DagTag.name).all())
+
+            for row in session.query(DagModel.last_parsed_time).all():
+                assert row[0] is not None
+
         # Re-sync should do fewer queries
-        with assert_queries_count(3):
+        with assert_queries_count(4):
             DAG.bulk_write_to_db(dags)
-        with assert_queries_count(3):
+        with assert_queries_count(4):
             DAG.bulk_write_to_db(dags)
         # Adding tags
         for dag in dags:
             dag.tags.append("test-dag2")
-        with assert_queries_count(4):
+        with assert_queries_count(5):
             DAG.bulk_write_to_db(dags)
         with create_session() as session:
             assert {'dag-bulk-sync-0', 'dag-bulk-sync-1', 'dag-bulk-sync-2', 'dag-bulk-sync-3'} == {
@@ -673,7 +677,7 @@ class TestDag(unittest.TestCase):
         # Removing tags
         for dag in dags:
             dag.tags.remove("test-dag")
-        with assert_queries_count(4):
+        with assert_queries_count(5):
             DAG.bulk_write_to_db(dags)
         with create_session() as session:
             assert {'dag-bulk-sync-0', 'dag-bulk-sync-1', 'dag-bulk-sync-2', 'dag-bulk-sync-3'} == {
@@ -686,6 +690,9 @@ class TestDag(unittest.TestCase):
                 ('dag-bulk-sync-3', 'test-dag2'),
             } == set(session.query(DagTag.dag_id, DagTag.name).all())
 
+            for row in session.query(DagModel.last_parsed_time).all():
+                assert row[0] is not None
+
     def test_bulk_write_to_db_max_active_runs(self):
         """
         Test that DagModel.next_dagrun_create_after is set to NULL when the dag cannot be created due to max
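
A hedged, in-memory sketch of what DAG.deactivate_stale_dags relies on after this change: every bulk_write_to_db refreshes last_parsed_time, so DAGs whose timestamp falls behind the expiration cutoff can be deactivated. DagModelStub is a simplified stand-in for the ORM model.

```
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import List


@dataclass
class DagModelStub:
    dag_id: str
    last_parsed_time: datetime
    is_active: bool = True


def deactivate_stale(dags: List[DagModelStub], expiration_date: datetime) -> None:
    for dag in dags:
        if dag.is_active and dag.last_parsed_time < expiration_date:
            dag.is_active = False


now = datetime.now(timezone.utc)
dags = [
    DagModelStub("fresh", last_parsed_time=now),
    DagModelStub("stale", last_parsed_time=now - timedelta(days=2)),
]
deactivate_stale(dags, expiration_date=now - timedelta(hours=1))
assert [(d.dag_id, d.is_active) for d in dags] == [("fresh", True), ("stale", False)]
```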

[airflow] 39/42: Bump Redoc to resolve vulnerability in sub-dependency (#14608)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8bdfe078e5992014cbe24b6011441299256149b4
Author: Ryan Hamilton <ry...@ryanahamilton.com>
AuthorDate: Thu Mar 4 13:11:30 2021 -0500

    Bump Redoc to resolve vulnerability in sub-dependency (#14608)
    
    (cherry picked from commit 1c23e91f08771858313d89086227146f4359b6e9)
---
 airflow/www/package.json |   2 +-
 airflow/www/yarn.lock    | 616 +++++++++++++----------------------------------
 2 files changed, 168 insertions(+), 450 deletions(-)

diff --git a/airflow/www/package.json b/airflow/www/package.json
index 5bf6530..57a7b1d 100644
--- a/airflow/www/package.json
+++ b/airflow/www/package.json
@@ -72,7 +72,7 @@
     "lodash": "^4.17.20",
     "moment-timezone": "^0.5.28",
     "nvd3": "^1.8.6",
-    "redoc": "^2.0.0-rc.30",
+    "redoc": "^2.0.0-rc.48",
     "url-search-params-polyfill": "^8.1.0"
   }
 }
diff --git a/airflow/www/yarn.lock b/airflow/www/yarn.lock
index 8625cd5..7e56643 100644
--- a/airflow/www/yarn.lock
+++ b/airflow/www/yarn.lock
@@ -96,13 +96,6 @@
   dependencies:
     "@babel/types" "^7.8.3"
 
-"@babel/helper-module-imports@^7.0.0":
-  version "7.10.4"
-  resolved "https://registry.yarnpkg.com/@babel/helper-module-imports/-/helper-module-imports-7.10.4.tgz#4c5c54be04bd31670a7382797d75b9fa2e5b5620"
-  integrity sha512-nEQJHqYavI217oD9+s5MUBzk6x1IlvoS9WTPfgG43CbMEeStE0v+r+TucWdx8KFGowPGvyOkDT9+7DHedIDnVw==
-  dependencies:
-    "@babel/types" "^7.10.4"
-
 "@babel/helper-module-imports@^7.8.3":
   version "7.8.3"
   resolved "https://registry.yarnpkg.com/@babel/helper-module-imports/-/helper-module-imports-7.8.3.tgz#7fe39589b39c016331b6b8c3f441e8f0b1419498"
@@ -200,13 +193,20 @@
   resolved "https://registry.yarnpkg.com/@babel/parser/-/parser-7.9.4.tgz#68a35e6b0319bbc014465be43828300113f2f2e8"
   integrity sha512-bC49otXX6N0/VYhgOMh4gnP26E9xnDZK3TmbNpxYzzz9BQLBosQwfyOe9/cXUU3txYhTzLCbcqd5c8y/OmCjHA==
 
-"@babel/runtime@^7.0.0", "@babel/runtime@^7.7.2", "@babel/runtime@^7.9.2":
+"@babel/runtime@^7.0.0":
   version "7.10.5"
   resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.10.5.tgz#303d8bd440ecd5a491eae6117fd3367698674c5c"
   integrity sha512-otddXKhdNn7d0ptoFRHtMLa8LqDxLYwTjB4nYgM1yy5N6gU/MUf8zqyyLltCH3yAVitBzmwK4us+DD0l/MauAg==
   dependencies:
     regenerator-runtime "^0.13.4"
 
+"@babel/runtime@^7.12.5":
+  version "7.13.9"
+  resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.13.9.tgz#97dbe2116e2630c489f22e0656decd60aaa1fcee"
+  integrity sha512-aY2kU+xgJ3dJ1eU6FMB9EH8dIe8dmusF1xEku52joLvw6eAFN0AI+WxCLDnpev2LEejWBAy2sBvBOBAjI3zmvA==
+  dependencies:
+    regenerator-runtime "^0.13.4"
+
 "@babel/runtime@^7.8.7":
   version "7.9.2"
   resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.9.2.tgz#d90df0583a3a252f09aaa619665367bae518db06"
@@ -280,52 +280,10 @@
     lodash "^4.17.13"
     to-fast-properties "^2.0.0"
 
-"@emotion/babel-utils@^0.6.4":
-  version "0.6.10"
-  resolved "https://registry.yarnpkg.com/@emotion/babel-utils/-/babel-utils-0.6.10.tgz#83dbf3dfa933fae9fc566e54fbb45f14674c6ccc"
-  integrity sha512-/fnkM/LTEp3jKe++T0KyTszVGWNKPNOUJfjNKLO17BzQ6QPxgbg3whayom1Qr2oLFH3V92tDymU+dT5q676uow==
-  dependencies:
-    "@emotion/hash" "^0.6.6"
-    "@emotion/memoize" "^0.6.6"
-    "@emotion/serialize" "^0.9.1"
-    convert-source-map "^1.5.1"
-    find-root "^1.1.0"
-    source-map "^0.7.2"
-
-"@emotion/hash@^0.6.2", "@emotion/hash@^0.6.6":
-  version "0.6.6"
-  resolved "https://registry.yarnpkg.com/@emotion/hash/-/hash-0.6.6.tgz#62266c5f0eac6941fece302abad69f2ee7e25e44"
-  integrity sha512-ojhgxzUHZ7am3D2jHkMzPpsBAiB005GF5YU4ea+8DNPybMk01JJUM9V9YRlF/GE95tcOm8DxQvWA2jq19bGalQ==
-
-"@emotion/memoize@^0.6.1", "@emotion/memoize@^0.6.6":
-  version "0.6.6"
-  resolved "https://registry.yarnpkg.com/@emotion/memoize/-/memoize-0.6.6.tgz#004b98298d04c7ca3b4f50ca2035d4f60d2eed1b"
-  integrity sha512-h4t4jFjtm1YV7UirAFuSuFGyLa+NNxjdkq6DpFLANNQY5rHueFZHVY+8Cu1HYVP6DrheB0kv4m5xPjo7eKT7yQ==
-
-"@emotion/serialize@^0.9.1":
-  version "0.9.1"
-  resolved "https://registry.yarnpkg.com/@emotion/serialize/-/serialize-0.9.1.tgz#a494982a6920730dba6303eb018220a2b629c145"
-  integrity sha512-zTuAFtyPvCctHBEL8KZ5lJuwBanGSutFEncqLn/m9T1a6a93smBStK+bZzcNPgj4QS8Rkw9VTwJGhRIUVO8zsQ==
-  dependencies:
-    "@emotion/hash" "^0.6.6"
-    "@emotion/memoize" "^0.6.6"
-    "@emotion/unitless" "^0.6.7"
-    "@emotion/utils" "^0.8.2"
-
-"@emotion/stylis@^0.7.0":
-  version "0.7.1"
-  resolved "https://registry.yarnpkg.com/@emotion/stylis/-/stylis-0.7.1.tgz#50f63225e712d99e2b2b39c19c70fff023793ca5"
-  integrity sha512-/SLmSIkN13M//53TtNxgxo57mcJk/UJIDFRKwOiLIBEyBHEcipgR6hNMQ/59Sl4VjCJ0Z/3zeAZyvnSLPG/1HQ==
-
-"@emotion/unitless@^0.6.2", "@emotion/unitless@^0.6.7":
-  version "0.6.7"
-  resolved "https://registry.yarnpkg.com/@emotion/unitless/-/unitless-0.6.7.tgz#53e9f1892f725b194d5e6a1684a7b394df592397"
-  integrity sha512-Arj1hncvEVqQ2p7Ega08uHLr1JuRYBuO5cIvcA+WWEQ5+VmkOE3ZXzl04NbQxeQpWX78G7u6MqxKuNX3wvYZxg==
-
-"@emotion/utils@^0.8.2":
-  version "0.8.2"
-  resolved "https://registry.yarnpkg.com/@emotion/utils/-/utils-0.8.2.tgz#576ff7fb1230185b619a75d258cbc98f0867a8dc"
-  integrity sha512-rLu3wcBWH4P5q1CGoSSH/i9hrXs7SlbRLkoq9IGuoPYNGQvDJ3pt/wmOM+XgYjIDRMVIdkUWt0RsfzF50JfnCw==
+"@exodus/schemasafe@^1.0.0-rc.2":
+  version "1.0.0-rc.3"
+  resolved "https://registry.yarnpkg.com/@exodus/schemasafe/-/schemasafe-1.0.0-rc.3.tgz#dda2fbf3dafa5ad8c63dadff7e01d3fdf4736025"
+  integrity sha512-GoXw0U2Qaa33m3eUcxuHnHpNvHjNlLo0gtV091XBpaRINaB4X6FGCG5XKxSFNFiPpugUDqNruHzaqpTdDm4AOg==
 
 "@nodelib/fs.scandir@2.1.3":
   version "2.1.3"
@@ -355,6 +313,11 @@
   dependencies:
     mkdirp "^1.0.4"
 
+"@redocly/react-dropdown-aria@^2.0.11":
+  version "2.0.11"
+  resolved "https://registry.yarnpkg.com/@redocly/react-dropdown-aria/-/react-dropdown-aria-2.0.11.tgz#532b864b329237e646abe45d0f8edc923e77370a"
+  integrity sha512-rmuSC2JFFl4DkPDdGVrmffT9KcbG2AB5jvhxPIrOc1dO9mHRMUUftQY35KZlvWqqSSqVn+AM+J9dhiTo1ZqR8A==
+
 "@stylelint/postcss-css-in-js@^0.37.1":
   version "0.37.2"
   resolved "https://registry.yarnpkg.com/@stylelint/postcss-css-in-js/-/postcss-css-in-js-0.37.2.tgz#7e5a84ad181f4234a2480803422a47b8749af3d2"
@@ -631,11 +594,6 @@
   resolved "https://registry.yarnpkg.com/@xtuc/long/-/long-4.2.2.tgz#d291c6a4e97989b5c61d9acf396ae4fe133a718d"
   integrity sha512-NuHqBY1PB/D8xU6s/thBgOAiAP7HOYDQ32+BFZILJ8ivkUkAHQnWfn6WhL79Owj1qmUnoN/YPhktdIoucipkAQ==
 
-abbrev@1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/abbrev/-/abbrev-1.1.1.tgz#f8f2c887ad10bf67f634f005b6987fed3179aac8"
-  integrity sha512-nne9/IiQ/hzIhY6pdDnbBtz7DjPTKrY00P/zvPSm5pOFkl6xuGrGnXn/VtTNNfNtAfZ9/1RtehkszU9qcTii0Q==
-
 acorn-jsx@^5.2.0:
   version "5.2.0"
   resolved "https://registry.yarnpkg.com/acorn-jsx/-/acorn-jsx-5.2.0.tgz#4c66069173d6fdd68ed85239fc256226182b2ebe"
@@ -704,11 +662,6 @@ ansi-regex@^2.0.0:
   resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-2.1.1.tgz#c3b33ab5ee360d86e0e628f0468ae7ef27d654df"
   integrity sha1-w7M6te42DYbg5ijwRorn7yfWVN8=
 
-ansi-regex@^3.0.0:
-  version "3.0.0"
-  resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-3.0.0.tgz#ed0317c322064f79466c02966bddb605ab37d998"
-  integrity sha1-7QMXwyIGT3lGbAKWa922Bas32Zg=
-
 ansi-regex@^4.1.0:
   version "4.1.0"
   resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-4.1.0.tgz#8b9f8f08cf1acb843756a839ca8c7e3168c51997"
@@ -975,38 +928,6 @@ babel-plugin-css-modules-transform@^1.6.1:
     css-modules-require-hook "^4.0.6"
     mkdirp "^0.5.1"
 
-babel-plugin-emotion@^9.2.11:
-  version "9.2.11"
-  resolved "https://registry.yarnpkg.com/babel-plugin-emotion/-/babel-plugin-emotion-9.2.11.tgz#319c005a9ee1d15bb447f59fe504c35fd5807728"
-  integrity sha512-dgCImifnOPPSeXod2znAmgc64NhaaOjGEHROR/M+lmStb3841yK1sgaDYAYMnlvWNz8GnpwIPN0VmNpbWYZ+VQ==
-  dependencies:
-    "@babel/helper-module-imports" "^7.0.0"
-    "@emotion/babel-utils" "^0.6.4"
-    "@emotion/hash" "^0.6.2"
-    "@emotion/memoize" "^0.6.1"
-    "@emotion/stylis" "^0.7.0"
-    babel-plugin-macros "^2.0.0"
-    babel-plugin-syntax-jsx "^6.18.0"
-    convert-source-map "^1.5.0"
-    find-root "^1.1.0"
-    mkdirp "^0.5.1"
-    source-map "^0.5.7"
-    touch "^2.0.1"
-
-babel-plugin-macros@^2.0.0:
-  version "2.8.0"
-  resolved "https://registry.yarnpkg.com/babel-plugin-macros/-/babel-plugin-macros-2.8.0.tgz#0f958a7cc6556b1e65344465d99111a1e5e10138"
-  integrity sha512-SEP5kJpfGYqYKpBrj5XU3ahw5p5GOHJ0U5ssOSQ/WBVdwkD2Dzlce95exQTs3jOVWPPKLBN2rlEWkCK7dSmLvg==
-  dependencies:
-    "@babel/runtime" "^7.7.2"
-    cosmiconfig "^6.0.0"
-    resolve "^1.12.0"
-
-babel-plugin-syntax-jsx@^6.18.0:
-  version "6.18.0"
-  resolved "https://registry.yarnpkg.com/babel-plugin-syntax-jsx/-/babel-plugin-syntax-jsx-6.18.0.tgz#0af32a9a6e13ca7a3fd5069e62d7b0f58d0d8946"
-  integrity sha1-CvMqmm4Tyno/1QaeYtew9Y0NiUY=
-
 babel-polyfill@^6.26.0:
   version "6.26.0"
   resolved "https://registry.yarnpkg.com/babel-polyfill/-/babel-polyfill-6.26.0.tgz#379937abc67d7895970adc621f284cd966cf2153"
@@ -1606,15 +1527,6 @@ clipboard@^2.0.0:
     select "^1.1.2"
     tiny-emitter "^2.0.0"
 
-cliui@^4.0.0:
-  version "4.1.0"
-  resolved "https://registry.yarnpkg.com/cliui/-/cliui-4.1.0.tgz#348422dbe82d800b3022eef4f6ac10bf2e4d1b49"
-  integrity sha512-4FG+RSG9DL7uEwRUZXZn3SS34DiDPfzP0VOiEwtUWlE+AR2EIg+hSyvrIgUUfhdgR/UkAeW2QHgeP+hWrXs7jQ==
-  dependencies:
-    string-width "^2.1.1"
-    strip-ansi "^4.0.0"
-    wrap-ansi "^2.0.0"
-
 cliui@^5.0.0:
   version "5.0.0"
   resolved "https://registry.yarnpkg.com/cliui/-/cliui-5.0.0.tgz#deefcfdb2e800784aa34f46fa08e06851c7bbbc5"
@@ -1633,6 +1545,15 @@ cliui@^6.0.0:
     strip-ansi "^6.0.0"
     wrap-ansi "^6.2.0"
 
+cliui@^7.0.2:
+  version "7.0.4"
+  resolved "https://registry.yarnpkg.com/cliui/-/cliui-7.0.4.tgz#a0265ee655476fc807aea9df3df8df7783808b4f"
+  integrity sha512-OcRE68cOsVMXp1Yvonl/fzkQOyjLSu/8bhPDfQt0e0/Eb283TKP20Fs2MqoPsr9SwA595rRCA+QMzYc9nBP+JQ==
+  dependencies:
+    string-width "^4.2.0"
+    strip-ansi "^6.0.0"
+    wrap-ansi "^7.0.0"
+
 clone-regexp@^2.1.0:
   version "2.2.0"
   resolved "https://registry.yarnpkg.com/clone-regexp/-/clone-regexp-2.2.0.tgz#7d65e00885cd8796405c35a737e7a86b7429e36f"
@@ -1664,11 +1585,6 @@ code-error-fragment@0.0.230:
   resolved "https://registry.yarnpkg.com/code-error-fragment/-/code-error-fragment-0.0.230.tgz#d736d75c832445342eca1d1fedbf17d9618b14d7"
   integrity sha512-cadkfKp6932H8UkhzE/gcUqhRMNf8jHzkAN7+5Myabswaghu4xABTgPHDCjW+dBAJxj/SpkTYokpzDqY4pCzQw==
 
-code-point-at@^1.0.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/code-point-at/-/code-point-at-1.1.0.tgz#0d070b4d043a5bea33a2f1a40e2edb3d9a4ccf77"
-  integrity sha1-DQcLTQQ6W+ozovGkDi7bPZpMz3c=
-
 collapse-white-space@^1.0.2:
   version "1.0.6"
   resolved "https://registry.yarnpkg.com/collapse-white-space/-/collapse-white-space-1.0.6.tgz#e63629c0016665792060dbbeb79c42239d2c5287"
@@ -1777,7 +1693,7 @@ contains-path@^0.1.0:
   resolved "https://registry.yarnpkg.com/contains-path/-/contains-path-0.1.0.tgz#fe8cf184ff6670b6baef01a9d4861a5cbec4120a"
   integrity sha1-/ozxhP9mcLa67wGp1IYaXL7EEgo=
 
-convert-source-map@^1.5.0, convert-source-map@^1.5.1, convert-source-map@^1.7.0:
+convert-source-map@^1.5.1, convert-source-map@^1.7.0:
   version "1.7.0"
   resolved "https://registry.yarnpkg.com/convert-source-map/-/convert-source-map-1.7.0.tgz#17a2cb882d7f77d3490585e2ce6c524424a3a442"
   integrity sha512-4FJkXzKXEDB1snCFZlLP4gpC3JILicCpGbzG9f9G7tGqGCzETQ2hWPrcinA9oU4wtf2biUaEH5065UnMeR33oA==
@@ -1862,19 +1778,6 @@ create-ecdh@^4.0.0:
     bn.js "^4.1.0"
     elliptic "^6.0.0"
 
-create-emotion@^9.2.12:
-  version "9.2.12"
-  resolved "https://registry.yarnpkg.com/create-emotion/-/create-emotion-9.2.12.tgz#0fc8e7f92c4f8bb924b0fef6781f66b1d07cb26f"
-  integrity sha512-P57uOF9NL2y98Xrbl2OuiDQUZ30GVmASsv5fbsjF4Hlraip2kyAvMm+2PoYUvFFw03Fhgtxk3RqZSm2/qHL9hA==
-  dependencies:
-    "@emotion/hash" "^0.6.2"
-    "@emotion/memoize" "^0.6.1"
-    "@emotion/stylis" "^0.7.0"
-    "@emotion/unitless" "^0.6.2"
-    csstype "^2.5.2"
-    stylis "^3.5.0"
-    stylis-rule-sheet "^0.0.10"
-
 create-hash@^1.1.0, create-hash@^1.1.2, create-hash@^1.2.0:
   version "1.2.0"
   resolved "https://registry.yarnpkg.com/create-hash/-/create-hash-1.2.0.tgz#889078af11a63756bcfb59bd221996be3a9ef196"
@@ -1898,7 +1801,7 @@ create-hmac@^1.1.0, create-hmac@^1.1.4, create-hmac@^1.1.7:
     safe-buffer "^5.0.1"
     sha.js "^2.4.8"
 
-cross-spawn@^6.0.0, cross-spawn@^6.0.5:
+cross-spawn@^6.0.5:
   version "6.0.5"
   resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-6.0.5.tgz#4a5ec7c64dfae22c3a14124dbacdee846d80cbc4"
   integrity sha512-eTVLrBSt7fjbDygz805pMnstIs2VTBNkRm0qxZd+M7A5XDdxVRWO5MxGBXZhjY4cqLYLdtrGqRf8mBPmzwSpWQ==
@@ -2115,11 +2018,6 @@ csso@^4.0.2:
   dependencies:
     css-tree "1.0.0-alpha.39"
 
-csstype@^2.5.2:
-  version "2.6.11"
-  resolved "https://registry.yarnpkg.com/csstype/-/csstype-2.6.11.tgz#452f4d024149ecf260a852b025e36562a253ffc5"
-  integrity sha512-l8YyEC9NBkSm783PFTvh0FmJy7s5pFKrDp49ZL7zBGX3fWkO+N4EEyan1qqp8cwPLDcD0OSdyY6hAMoxp34JFw==
-
 cyclist@^1.0.1:
   version "1.0.1"
   resolved "https://registry.yarnpkg.com/cyclist/-/cyclist-1.0.1.tgz#596e9698fd0c80e12038c2b82d6eb1b35b6224d9"
@@ -2596,7 +2494,7 @@ domhandler@^3.0.0:
   dependencies:
     domelementtype "^2.0.1"
 
-dompurify@^2.0.8:
+dompurify@^2.0.12:
   version "2.2.6"
   resolved "https://registry.yarnpkg.com/dompurify/-/dompurify-2.2.6.tgz#54945dc5c0b45ce5ae228705777e8e59d7b2edc4"
   integrity sha512-7b7ZArhhH0SP6W2R9cqK6RjaU82FZ2UPM7RO8qN1b1wyvC/NY1FNWcX1Pu00fFOAnzEORtwXe4bPaClg6pUybQ==
@@ -2678,14 +2576,6 @@ emojis-list@^3.0.0:
   resolved "https://registry.yarnpkg.com/emojis-list/-/emojis-list-3.0.0.tgz#5570662046ad29e2e916e71aae260abdff4f6a78"
   integrity sha512-/kyM18EfinwXZbno9FyUGeFh87KC8HRQBQGildHZbEuRyWFOmv1U10o9BBp8XVZDVNNuQKyIGIu5ZYAAXJ0V2Q==
 
-emotion@^9.2.6:
-  version "9.2.12"
-  resolved "https://registry.yarnpkg.com/emotion/-/emotion-9.2.12.tgz#53925aaa005614e65c6e43db8243c843574d1ea9"
-  integrity sha512-hcx7jppaI8VoXxIWEhxpDW7I+B4kq9RNzQLmsrF6LY8BGKqe2N+gFAQr0EfuFucFlPs2A9HM4+xNj4NeqEWIOQ==
-  dependencies:
-    babel-plugin-emotion "^9.2.11"
-    create-emotion "^9.2.12"
-
 end-of-stream@^1.0.0, end-of-stream@^1.1.0:
   version "1.4.4"
   resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.4.tgz#5ae64a5f45057baf3626ec14da0ca5e4b2431eb0"
@@ -2831,7 +2721,7 @@ escalade@^3.0.1:
   resolved "https://registry.yarnpkg.com/escalade/-/escalade-3.0.2.tgz#6a580d70edb87880f22b4c91d0d56078df6962c4"
   integrity sha512-gPYAU37hYCUhW5euPeR+Y74F7BL+IBsV93j5cvGriSaD1aG6MGsqsV1yamRdrWrb2j3aiZvb0X+UBOWpx3JWtQ==
 
-escalade@^3.1.0:
+escalade@^3.1.0, escalade@^3.1.1:
   version "3.1.1"
   resolved "https://registry.yarnpkg.com/escalade/-/escalade-3.1.1.tgz#d8cfdc7000965c5a0174b4a82eaa5c0552742e40"
   integrity sha512-k0er2gUkLf8O0zKJiAhmkTnJlTvINGv7ygDNPbeIsX/TJjGJZHuh9B2UxbsaEkmlEo9MfhrSzmhIlhRlI2GXnw==
@@ -3040,10 +2930,10 @@ esutils@^2.0.2:
   resolved "https://registry.yarnpkg.com/esutils/-/esutils-2.0.3.tgz#74d2eb4de0b8da1293711910d50775b9b710ef64"
   integrity sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==
 
-eventemitter3@^4.0.0:
-  version "4.0.4"
-  resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-4.0.4.tgz#b5463ace635a083d018bdc7c917b4c5f10a85384"
-  integrity sha512-rlaVLnVxtxvoyLsQQFBx53YmXHDxRIzzTLbdfxqi4yocpSjAxXwkU0cScM5JgSKMqEhrZpnvQ2D9gjylR0AimQ==
+eventemitter3@^4.0.4:
+  version "4.0.7"
+  resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-4.0.7.tgz#2de9b68f6528d5644ef5c59526a1b4a07306169f"
+  integrity sha512-8guHBZCwKnFhYdHr2ysuRWErTwhoN2X8XELRlrRwpmfeY2jjuUN4taQMsULKUVo1K4DvZl+0pgfyoysHxvmvEw==
 
 events@^3.0.0:
   version "3.2.0"
@@ -3058,19 +2948,6 @@ evp_bytestokey@^1.0.0, evp_bytestokey@^1.0.3:
     md5.js "^1.3.4"
     safe-buffer "^5.1.1"
 
-execa@^1.0.0:
-  version "1.0.0"
-  resolved "https://registry.yarnpkg.com/execa/-/execa-1.0.0.tgz#c6236a5bb4df6d6f15e88e7f017798216749ddd8"
-  integrity sha512-adbxcyWV46qiHyvSp50TKt05tB4tK3HcmF7/nxfAdhnox83seTDbwnaqKO4sXRy7roHAIFqJP/Rw/AuEbX61LA==
-  dependencies:
-    cross-spawn "^6.0.0"
-    get-stream "^4.0.0"
-    is-stream "^1.1.0"
-    npm-run-path "^2.0.0"
-    p-finally "^1.0.0"
-    signal-exit "^3.0.0"
-    strip-eof "^1.0.0"
-
 execall@^2.0.0:
   version "2.0.0"
   resolved "https://registry.yarnpkg.com/execall/-/execall-2.0.0.tgz#16a06b5fe5099df7d00be5d9c06eecded1663b45"
@@ -3253,11 +3130,6 @@ find-cache-dir@^3.3.1:
     make-dir "^3.0.2"
     pkg-dir "^4.1.0"
 
-find-root@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/find-root/-/find-root-1.1.0.tgz#abcfc8ba76f708c42a97b3d685b7e9450bfb9ce4"
-  integrity sha512-NKfW6bec6GfKc0SGx1e07QZY9PE99u0Bft/0rzSD5k3sO/vwkVUpDUKVm5Gpp5Ue3YfShPFTX2070tDs5kB9Ng==
-
 find-up@^2.0.0, find-up@^2.1.0:
   version "2.1.0"
   resolved "https://registry.yarnpkg.com/find-up/-/find-up-2.1.0.tgz#45d1b7e506c717ddd482775a2b77920a3c0c57a7"
@@ -3408,12 +3280,7 @@ gensync@^1.0.0-beta.1:
   resolved "https://registry.yarnpkg.com/gensync/-/gensync-1.0.0-beta.1.tgz#58f4361ff987e5ff6e1e7a210827aa371eaac269"
   integrity sha512-r8EC6NO1sngH/zdD9fiRDLdcgnbayXah+mLgManTaIZJqEC1MZstmnox8KpnI2/fxQwrp5OpCOYWLp4rBl4Jcg==
 
-get-caller-file@^1.0.1:
-  version "1.0.3"
-  resolved "https://registry.yarnpkg.com/get-caller-file/-/get-caller-file-1.0.3.tgz#f978fa4c90d1dfe7ff2d6beda2a515e713bdcf4a"
-  integrity sha512-3t6rVToeoZfYSGd8YoLFR2DJkiQrIiUrGcjvFX2mDw3bn6k2OtwHN0TNCLbBO+w8qTvimhDkv+LSscbJY1vE6w==
-
-get-caller-file@^2.0.1:
+get-caller-file@^2.0.1, get-caller-file@^2.0.5:
   version "2.0.5"
   resolved "https://registry.yarnpkg.com/get-caller-file/-/get-caller-file-2.0.5.tgz#4f94412a82db32f36e3b0b9741f8a97feb031f7e"
   integrity sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==
@@ -3423,13 +3290,6 @@ get-stdin@^8.0.0:
   resolved "https://registry.yarnpkg.com/get-stdin/-/get-stdin-8.0.0.tgz#cbad6a73feb75f6eeb22ba9e01f89aa28aa97a53"
   integrity sha512-sY22aA6xchAzprjyqmSEQv4UbAAzRN0L2dQB0NlN5acTTK9Don6nhoc3eAbUnpZiCANAMfd/+40kVdKfFygohg==
 
-get-stream@^4.0.0:
-  version "4.1.0"
-  resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-4.1.0.tgz#c1b255575f3dc21d59bfc79cd3d2b46b1c3a54b5"
-  integrity sha512-GMat4EJ5161kIy2HevLlr4luNjBgvmj413KaQA7jt4V8B4RDsfpHk7WQ9GVqfYyyx8OS/L66Kox+rJRNklLK7w==
-  dependencies:
-    pump "^3.0.0"
-
 get-value@^2.0.3, get-value@^2.0.6:
   version "2.0.6"
   resolved "https://registry.yarnpkg.com/get-value/-/get-value-2.0.6.tgz#dc15ca1c672387ca76bd37ac0a395ba2042a2c28"
@@ -3914,11 +3774,6 @@ invariant@^2.2.2:
   dependencies:
     loose-envify "^1.0.0"
 
-invert-kv@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/invert-kv/-/invert-kv-2.0.0.tgz#7393f5afa59ec9ff5f67a27620d11c226e3eec02"
-  integrity sha512-wPVv/y/QQ/Uiirj/vh3oP+1Ww+AWehmi1g5fFWGPF6IpCBCDVrhgHRMvrLfdYcwDh3QJbGXDW4JAuzxElLSqKA==
-
 is-absolute-url@^2.0.0:
   version "2.1.0"
   resolved "https://registry.yarnpkg.com/is-absolute-url/-/is-absolute-url-2.1.0.tgz#50530dfb84fcc9aa7dbe7852e83a37b93b9f2aa6"
@@ -4088,13 +3943,6 @@ is-finite@^1.0.0:
   dependencies:
     number-is-nan "^1.0.0"
 
-is-fullwidth-code-point@^1.0.0:
-  version "1.0.0"
-  resolved "https://registry.yarnpkg.com/is-fullwidth-code-point/-/is-fullwidth-code-point-1.0.0.tgz#ef9e31386f031a7f0d643af82fde50c457ef00cb"
-  integrity sha1-754xOG8DGn8NZDr4L95QxFfvAMs=
-  dependencies:
-    number-is-nan "^1.0.0"
-
 is-fullwidth-code-point@^2.0.0:
   version "2.0.0"
   resolved "https://registry.yarnpkg.com/is-fullwidth-code-point/-/is-fullwidth-code-point-2.0.0.tgz#a3b30a5c4f199183167aaab93beefae3ddfb654f"
@@ -4213,11 +4061,6 @@ is-resolvable@^1.0.0:
   resolved "https://registry.yarnpkg.com/is-resolvable/-/is-resolvable-1.1.0.tgz#fb18f87ce1feb925169c9a407c19318a3206ed88"
   integrity sha512-qgDYXFSR5WvEfuS5dMj6oTMEbrrSaM0CrFk2Yiq/gXnBvD9pMa2jGXxyhGLfvhZpuMZe18CJpFxAt3CRs42NMg==
 
-is-stream@^1.1.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/is-stream/-/is-stream-1.1.0.tgz#12d4a3dd4e68e0b79ceb8dbc84173ae80d91ca44"
-  integrity sha1-EtSj3U5o4Lec6428hBc66A2RykQ=
-
 is-string@^1.0.5:
   version "1.0.5"
   resolved "https://registry.yarnpkg.com/is-string/-/is-string-1.0.5.tgz#40493ed198ef3ff477b8c7f92f644ec82a5cd3a6"
@@ -4434,13 +4277,6 @@ last-call-webpack-plugin@^3.0.0:
     lodash "^4.17.5"
     webpack-sources "^1.1.0"
 
-lcid@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/lcid/-/lcid-2.0.0.tgz#6ef5d2df60e52f82eb228a4c373e8d1f397253cf"
-  integrity sha512-avPEb8P8EGnwXKClwsNUgryVjllcRqtMYa49NTsbQagYuT1DcXnl1915oxWjoyGrXR6zH/Y0Zc96xWsPcoDKeA==
-  dependencies:
-    invert-kv "^2.0.0"
-
 leven@^3.1.0:
   version "3.1.0"
   resolved "https://registry.yarnpkg.com/leven/-/leven-3.1.0.tgz#77891de834064cccba82ae7842bb6b14a13ed7f2"
@@ -4624,13 +4460,6 @@ make-dir@^3.0.2:
   dependencies:
     semver "^6.0.0"
 
-map-age-cleaner@^0.1.1:
-  version "0.1.3"
-  resolved "https://registry.yarnpkg.com/map-age-cleaner/-/map-age-cleaner-0.1.3.tgz#7d583a7306434c055fe474b0f45078e6e1b4b92a"
-  integrity sha512-bJzx6nMoP6PDLPBFmg7+xRKeFZvFboMrGlxmNj9ClvX53KrmvM5bXFXEWjbz4cz1AFn+jWJ9z/DJSz7hrs0w3w==
-  dependencies:
-    p-defer "^1.0.0"
-
 map-cache@^0.2.2:
   version "0.2.2"
   resolved "https://registry.yarnpkg.com/map-cache/-/map-cache-0.2.2.tgz#c32abd0bd6525d9b051645bb4f26ac5dc98a0dbf"
@@ -4706,15 +4535,6 @@ mdn-data@2.0.6:
   resolved "https://registry.yarnpkg.com/mdn-data/-/mdn-data-2.0.6.tgz#852dc60fcaa5daa2e8cf6c9189c440ed3e042978"
   integrity sha512-rQvjv71olwNHgiTbfPZFkJtjNMciWgswYeciZhtvWLO8bmX3TnhyA62I6sTWOyZssWHJJjY6/KiWwqQsWWsqOA==
 
-mem@^4.0.0:
-  version "4.3.0"
-  resolved "https://registry.yarnpkg.com/mem/-/mem-4.3.0.tgz#461af497bc4ae09608cdb2e60eefb69bff744178"
-  integrity sha512-qX2bG48pTqYRVmDB37rn/6PT7LcR8T7oAX3bf99u1Tt1nzxYfxkgqDwUwolPlXweM0XzBOBFzSx4kfp7KP1s/w==
-  dependencies:
-    map-age-cleaner "^0.1.1"
-    mimic-fn "^2.0.0"
-    p-is-promise "^2.0.0"
-
 memoize-one@~5.1.1:
   version "5.1.1"
   resolved "https://registry.yarnpkg.com/memoize-one/-/memoize-one-5.1.1.tgz#047b6e3199b508eaec03504de71229b8eb1d75c0"
@@ -4807,11 +4627,6 @@ mime-types@^2.1.26:
   dependencies:
     mime-db "1.44.0"
 
-mimic-fn@^2.0.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-2.1.0.tgz#7ed2c2ccccaf84d3ffcb7a69b57711fc2083401b"
-  integrity sha512-OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg==
-
 min-indent@^1.0.0:
   version "1.0.0"
   resolved "https://registry.yarnpkg.com/min-indent/-/min-indent-1.0.0.tgz#cfc45c37e9ec0d8f0a0ec3dd4ef7f7c3abe39256"
@@ -4930,17 +4745,17 @@ mkdirp@^1.0.3, mkdirp@^1.0.4:
   resolved "https://registry.yarnpkg.com/mkdirp/-/mkdirp-1.0.4.tgz#3eb5ed62622756d79a5f0e2a221dfebad75c2f7e"
   integrity sha512-vVqVZQyf3WLx2Shd0qJ9xuvqgAyKPLAiqITEtqW0oIUjzo3PePDd6fW9iFz30ef7Ysp/oiWqbhszeGWW2T6Gzw==
 
-mobx-react-lite@^1.4.2:
-  version "1.5.2"
-  resolved "https://registry.yarnpkg.com/mobx-react-lite/-/mobx-react-lite-1.5.2.tgz#c4395b0568b9cb16f07669d8869cc4efa1b8656d"
-  integrity sha512-PyZmARqqWtpuQaAoHF5pKX7h6TKNLwq6vtovm4zZvG6sEbMRHHSqioGXSeQbpRmG8Kw8uln3q/W1yMO5IfL5Sg==
+mobx-react-lite@^3.2.0:
+  version "3.2.0"
+  resolved "https://registry.yarnpkg.com/mobx-react-lite/-/mobx-react-lite-3.2.0.tgz#331d7365a6b053378dfe9c087315b4e41c5df69f"
+  integrity sha512-q5+UHIqYCOpBoFm/PElDuOhbcatvTllgRp3M1s+Hp5j0Z6XNgDbgqxawJ0ZAUEyKM8X1zs70PCuhAIzX1f4Q/g==
 
-mobx-react@6.1.5:
-  version "6.1.5"
-  resolved "https://registry.yarnpkg.com/mobx-react/-/mobx-react-6.1.5.tgz#66a6f67bfe845216abc05d3aea47ceec8e31e2dd"
-  integrity sha512-EfWoXmGE2CfozH4Xirb65+il1ynHFCmxBSUabMSf+511YfjVs6QRcCrHkiVw+Il8iWp1gIyfa9qKkUgbDA9/2w==
+mobx-react@^7.0.5:
+  version "7.1.0"
+  resolved "https://registry.yarnpkg.com/mobx-react/-/mobx-react-7.1.0.tgz#d947cada3cfad294b4b6f692e969c242b9c16eaf"
+  integrity sha512-DxvA6VXmnZ+N9f/UTtolWtdRnAAQY2iHWTSPLktfpj8NKlXUe4dabBAjuXrBcZUM8GjLWnxD1ZEjssXq1M0RAw==
   dependencies:
-    mobx-react-lite "^1.4.2"
+    mobx-react-lite "^3.2.0"
 
 moment-locales-webpack-plugin@^1.2.0:
   version "1.2.0"
@@ -5085,13 +4900,6 @@ node-releases@^1.1.61:
   resolved "https://registry.yarnpkg.com/node-releases/-/node-releases-1.1.63.tgz#db6dbb388544c31e888216304e8fd170efee3ff5"
   integrity sha512-ukW3iCfQaoxJkSPN+iK7KznTeqDGVJatAEuXsJERYHa9tn/KaT5lBdIyxQjLEVTzSkyjJEuQ17/vaEjrOauDkg==
 
-nopt@~1.0.10:
-  version "1.0.10"
-  resolved "https://registry.yarnpkg.com/nopt/-/nopt-1.0.10.tgz#6ddd21bd2a31417b92727dd585f8a6f37608ebee"
-  integrity sha1-bd0hvSoxQXuScn3Vhfim83YI6+4=
-  dependencies:
-    abbrev "1"
-
 normalize-package-data@^2.3.2, normalize-package-data@^2.5.0:
   version "2.5.0"
   resolved "https://registry.yarnpkg.com/normalize-package-data/-/normalize-package-data-2.5.0.tgz#e66db1838b200c1dfc233225d12cb36520e234a8"
@@ -5139,13 +4947,6 @@ normalize-url@^3.0.0:
   resolved "https://registry.yarnpkg.com/normalize-url/-/normalize-url-3.3.0.tgz#b2e1c4dc4f7c6d57743df733a4f5978d18650559"
   integrity sha512-U+JJi7duF1o+u2pynbp2zXDW2/PADgC30f0GsHZtRh+HOcXHnw137TrNlyxxRvWW5fjKd3bcLHPxofWuCjaeZg==
 
-npm-run-path@^2.0.0:
-  version "2.0.2"
-  resolved "https://registry.yarnpkg.com/npm-run-path/-/npm-run-path-2.0.2.tgz#35a9232dfa35d7067b4cb2ddf2357b1871536c5f"
-  integrity sha1-NakjLfo11wZ7TLLd8jV7GHFTbF8=
-  dependencies:
-    path-key "^2.0.0"
-
 nth-check@^1.0.2:
   version "1.0.2"
   resolved "https://registry.yarnpkg.com/nth-check/-/nth-check-1.0.2.tgz#b2bd295c37e3dd58a3bf0700376663ba4d9cf05c"
@@ -5168,50 +4969,51 @@ nvd3@^1.8.6:
   resolved "https://registry.yarnpkg.com/nvd3/-/nvd3-1.8.6.tgz#2d3eba74bf33363b5101ebf1d093c59a53ae73c4"
   integrity sha1-LT66dL8zNjtRAevx0JPFmlOuc8Q=
 
-oas-kit-common@^1.0.7, oas-kit-common@^1.0.8:
+oas-kit-common@^1.0.8:
   version "1.0.8"
   resolved "https://registry.yarnpkg.com/oas-kit-common/-/oas-kit-common-1.0.8.tgz#6d8cacf6e9097967a4c7ea8bcbcbd77018e1f535"
   integrity sha512-pJTS2+T0oGIwgjGpw7sIRU8RQMcUoKCDWFLdBqKB2BNmGpbBMH2sdqAaOXUg8OzonZHU0L7vfJu1mJFEiYDWOQ==
   dependencies:
     fast-safe-stringify "^2.0.7"
 
-oas-linter@^3.1.0:
-  version "3.1.3"
-  resolved "https://registry.yarnpkg.com/oas-linter/-/oas-linter-3.1.3.tgz#1526b3da32a1bbf124d720f27fd4eb9971cebfff"
-  integrity sha512-jFWBHjSoqODGo7cKA/VWqqWSLbHNtnyCEpa2nMMS64SzCUbZDk63Oe7LqQZ2qJA0K2VRreYLt6cVkYy6MqNRDg==
+oas-linter@^3.1.3:
+  version "3.2.1"
+  resolved "https://registry.yarnpkg.com/oas-linter/-/oas-linter-3.2.1.tgz#1a6d9117d146805b58e56df479861de0293b6e5b"
+  integrity sha512-e5G6bbq3Nrfxm+SDPR5AiZ6n2smVUmhLA1OgI2/Bl8e2ywfWsKw/yuqrwiXXiNHb1wdM/GyPMX6QjCGJODlaaA==
   dependencies:
+    "@exodus/schemasafe" "^1.0.0-rc.2"
     should "^13.2.1"
-    yaml "^1.8.3"
+    yaml "^1.10.0"
 
-oas-resolver@^2.3.0:
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/oas-resolver/-/oas-resolver-2.4.1.tgz#46948226f73e514ac6733f166cc559e800e4389b"
-  integrity sha512-rRmUv9mDTKPtsB2OGaoNMK4BC1Q/pL+tWRPKRjXJEBoLmfegJhecOZPBtIR0gKEVQb9iAA0MqulkgY43EiCFDg==
+oas-resolver@^2.4.3:
+  version "2.5.4"
+  resolved "https://registry.yarnpkg.com/oas-resolver/-/oas-resolver-2.5.4.tgz#81fa1aaa7e2387ab2dba1045827e9d7b79822326"
+  integrity sha512-1vIj5Wkjmi+kZj5sFamt95LkuXoalmoKUohtaUQoCQZjLfPFaY8uZ7nw6IZaWuE6eLON2b6xrXhxD4hiTdYl0g==
   dependencies:
     node-fetch-h2 "^2.3.0"
     oas-kit-common "^1.0.8"
-    reftools "^1.1.3"
-    yaml "^1.8.3"
-    yargs "^15.3.1"
+    reftools "^1.1.8"
+    yaml "^1.10.0"
+    yargs "^16.1.1"
 
-oas-schema-walker@^1.1.3:
-  version "1.1.4"
-  resolved "https://registry.yarnpkg.com/oas-schema-walker/-/oas-schema-walker-1.1.4.tgz#4b9d090c3622039741334d3e138510ff38197618"
-  integrity sha512-foVDDS0RJYMfhQEDh/WdBuCzydTcsCnGo9EeD8SpWq1uW10JXiz+8SfYVDA7LO87kjmlnTRZle/2gr5qxabaEA==
+oas-schema-walker@^1.1.5:
+  version "1.1.5"
+  resolved "https://registry.yarnpkg.com/oas-schema-walker/-/oas-schema-walker-1.1.5.tgz#74c3cd47b70ff8e0b19adada14455b5d3ac38a22"
+  integrity sha512-2yucenq1a9YPmeNExoUa9Qwrt9RFkjqaMAA1X+U7sbb0AqBeTIdMHky9SQQ6iN94bO5NW0W4TRYXerG+BdAvAQ==
 
-oas-validator@^3.4.0:
-  version "3.4.0"
-  resolved "https://registry.yarnpkg.com/oas-validator/-/oas-validator-3.4.0.tgz#7633b02e495af4a4e0224b249288b0928748476d"
-  integrity sha512-l/SxykuACi2U51osSsBXTxdsFc8Fw41xI7AsZkzgVgWJAzoEFaaNptt35WgY9C3757RUclsm6ye5GvSyYoozLQ==
+oas-validator@^4.0.8:
+  version "4.0.8"
+  resolved "https://registry.yarnpkg.com/oas-validator/-/oas-validator-4.0.8.tgz#4f1a4d6bd9e030ad07db03fd7a7bc3a91aabcc7d"
+  integrity sha512-bIt8erTyclF7bkaySTtQ9sppqyVc+mAlPi7vPzCLVHJsL9nrivQjc/jHLX/o+eGbxHd6a6YBwuY/Vxa6wGsiuw==
   dependencies:
     ajv "^5.5.2"
     better-ajv-errors "^0.6.7"
     call-me-maybe "^1.0.1"
-    oas-kit-common "^1.0.7"
-    oas-linter "^3.1.0"
-    oas-resolver "^2.3.0"
-    oas-schema-walker "^1.1.3"
-    reftools "^1.1.0"
+    oas-kit-common "^1.0.8"
+    oas-linter "^3.1.3"
+    oas-resolver "^2.4.3"
+    oas-schema-walker "^1.1.5"
+    reftools "^1.1.5"
     should "^13.2.1"
     yaml "^1.8.3"
 
@@ -5329,10 +5131,10 @@ ono@^4.0.11:
   dependencies:
     format-util "^1.0.3"
 
-openapi-sampler@^1.0.0-beta.16:
-  version "1.0.0-beta.16"
-  resolved "https://registry.yarnpkg.com/openapi-sampler/-/openapi-sampler-1.0.0-beta.16.tgz#7813524d5b88d222efb772ceb5a809075d6d9174"
-  integrity sha512-05+GvwMagTY7GxoDQoWJfmAUFlxfebciiEzqKmu4iq6+MqBEn62AMUkn0CTxyKhnUGIaR2KXjTeslxIeJwVIOw==
+openapi-sampler@^1.0.0-beta.18:
+  version "1.0.0-beta.18"
+  resolved "https://registry.yarnpkg.com/openapi-sampler/-/openapi-sampler-1.0.0-beta.18.tgz#9e0845616a669e048860625ea5c10d0f554f1b53"
+  integrity sha512-nG/0kvvSY5FbrU5A+Dbp1xTQN++7pKIh87/atryZlxrzDuok5Y6TCbpxO1jYqpUKLycE4ReKGHCywezngG6xtQ==
   dependencies:
     json-pointer "^0.6.0"
 
@@ -5366,35 +5168,11 @@ os-homedir@^1.0.0:
   resolved "https://registry.yarnpkg.com/os-homedir/-/os-homedir-1.0.2.tgz#ffbc4988336e0e833de0c168c7ef152121aa7fb3"
   integrity sha1-/7xJiDNuDoM94MFox+8VISGqf7M=
 
-os-locale@^3.0.0:
-  version "3.1.0"
-  resolved "https://registry.yarnpkg.com/os-locale/-/os-locale-3.1.0.tgz#a802a6ee17f24c10483ab9935719cef4ed16bf1a"
-  integrity sha512-Z8l3R4wYWM40/52Z+S265okfFj8Kt2cC2MKY+xNi3kFs+XGI7WXu/I309QQQYbRW4ijiZ+yxs9pqEhJh0DqW3Q==
-  dependencies:
-    execa "^1.0.0"
-    lcid "^2.0.0"
-    mem "^4.0.0"
-
 os-tmpdir@^1.0.1:
   version "1.0.2"
   resolved "https://registry.yarnpkg.com/os-tmpdir/-/os-tmpdir-1.0.2.tgz#bbe67406c79aa85c5cfec766fe5734555dfa1274"
   integrity sha1-u+Z0BseaqFxc/sdm/lc0VV36EnQ=
 
-p-defer@^1.0.0:
-  version "1.0.0"
-  resolved "https://registry.yarnpkg.com/p-defer/-/p-defer-1.0.0.tgz#9f6eb182f6c9aa8cd743004a7d4f96b196b0fb0c"
-  integrity sha1-n26xgvbJqozXQwBKfU+WsZaw+ww=
-
-p-finally@^1.0.0:
-  version "1.0.0"
-  resolved "https://registry.yarnpkg.com/p-finally/-/p-finally-1.0.0.tgz#3fbcfb15b899a44123b34b6dcc18b724336a2cae"
-  integrity sha1-P7z7FbiZpEEjs0ttzBi3JDNqLK4=
-
-p-is-promise@^2.0.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/p-is-promise/-/p-is-promise-2.1.0.tgz#918cebaea248a62cf7ffab8e3bca8c5f882fc42e"
-  integrity sha512-Y3W0wlRPK8ZMRbNq97l4M5otioeA5lm1z7bkNkxCka8HSPjR0xRWmpCmc9utiaLP9Jb1eD8BgeIxTW4AIF45Pg==
-
 p-limit@^1.1.0:
   version "1.3.0"
   resolved "https://registry.yarnpkg.com/p-limit/-/p-limit-1.3.0.tgz#b86bd5f0c25690911c7590fcbfc2010d54b3ccb8"
@@ -5569,7 +5347,7 @@ path-is-inside@^1.0.2:
   resolved "https://registry.yarnpkg.com/path-is-inside/-/path-is-inside-1.0.2.tgz#365417dede44430d1c11af61027facf074bdfc53"
   integrity sha1-NlQX3t5EQw0cEa9hAn+s8HS9/FM=
 
-path-key@^2.0.0, path-key@^2.0.1:
+path-key@^2.0.1:
   version "2.0.1"
   resolved "https://registry.yarnpkg.com/path-key/-/path-key-2.0.1.tgz#411cadb574c5a140d3a4b1910d40d80cc9f40b40"
   integrity sha1-QRyttXTFoUDTpLGRDUDYDMn0C0A=
@@ -5660,12 +5438,12 @@ pkg-dir@^4.1.0:
   dependencies:
     find-up "^4.0.0"
 
-polished@^3.4.4:
-  version "3.6.5"
-  resolved "https://registry.yarnpkg.com/polished/-/polished-3.6.5.tgz#dbefdde64c675935ec55119fe2a2ab627ca82e9c"
-  integrity sha512-VwhC9MlhW7O5dg/z7k32dabcAFW1VI2+7fSe8cE/kXcfL7mVdoa5UxciYGW2sJU78ldDLT6+ROEKIZKFNTnUXQ==
+polished@^3.6.5:
+  version "3.7.1"
+  resolved "https://registry.yarnpkg.com/polished/-/polished-3.7.1.tgz#d1addc87ee16eb5b413c6165eda37600cccb9c11"
+  integrity sha512-/QgHrNGYwIA4mwxJ/7FSvalUJsm7KNfnXiScVSEG2Xa5qxDeBn4nmdjN2pW00mkM2Tts64ktc47U8F7Ed1BRAA==
   dependencies:
-    "@babel/runtime" "^7.9.2"
+    "@babel/runtime" "^7.12.5"
 
 posix-character-classes@^0.1.0:
   version "0.1.1"
@@ -6127,10 +5905,10 @@ prepend-http@^1.0.0:
   resolved "https://registry.yarnpkg.com/prepend-http/-/prepend-http-1.0.4.tgz#d4f4562b0ce3696e41ac52d0e002e57a635dc6dc"
   integrity sha1-1PRWKwzjaW5BrFLQ4ALlemNdxtw=
 
-prismjs@^1.19.0:
-  version "1.21.0"
-  resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.21.0.tgz#36c086ec36b45319ec4218ee164c110f9fc015a3"
-  integrity sha512-uGdSIu1nk3kej2iZsLyDoJ7e9bnPzIgY0naW/HdknGj61zScaprVEVGHrPoXqI+M9sP0NDnTK2jpkvmldpuqDw==
+prismjs@^1.22.0:
+  version "1.23.0"
+  resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.23.0.tgz#d3b3967f7d72440690497652a9d40ff046067f33"
+  integrity sha512-c29LVsqOaLbBHuIbsTxaKENh1N2EQBOHaWv7gkHN4dgRbxSREqDnDbtFJYdpPauS4YCplMSNCABQ6Eeor69bAA==
   optionalDependencies:
     clipboard "^2.0.0"
 
@@ -6268,22 +6046,15 @@ randomfill@^1.0.3:
     randombytes "^2.0.5"
     safe-buffer "^5.1.0"
 
-react-dropdown-aria@^2.0.6:
-  version "2.0.6"
-  resolved "https://registry.yarnpkg.com/react-dropdown-aria/-/react-dropdown-aria-2.0.6.tgz#40cec5edd97a591d2f29e8c05aa8c53230e2aa6e"
-  integrity sha512-/9NlFopChlSKmuGL2P6S3oDwl9ddXcbNLnd1a7POov4f5/oGtSc3qBFmS4wH5xmLJe/38MhPOKF3e2q3laRi1g==
-  dependencies:
-    emotion "^9.2.6"
-
 react-is@^16.8.1:
   version "16.13.1"
   resolved "https://registry.yarnpkg.com/react-is/-/react-is-16.13.1.tgz#789729a4dc36de2999dc156dd6c1d9c18cea56a4"
   integrity sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==
 
-react-tabs@^3.1.0:
-  version "3.1.1"
-  resolved "https://registry.yarnpkg.com/react-tabs/-/react-tabs-3.1.1.tgz#b363a239f76046bb2158875a1e5921b11064052f"
-  integrity sha512-HpySC29NN1BkzBAnOC+ajfzPbTaVZcSWzMSjk56uAhPC/rBGtli8lTysR4CfPAyEE/hfweIzagOIoJ7nu80yng==
+react-tabs@^3.1.1:
+  version "3.2.0"
+  resolved "https://registry.yarnpkg.com/react-tabs/-/react-tabs-3.2.0.tgz#0fd8d595ef26d3684da876c27a3cc90392dffb40"
+  integrity sha512-q7oNapNRoYTQq8gDhApXwdBheuuN5qQ4YvUaQUAkb6OSSttJulBAvxJ0FS6W5uojvMxbbIZKu1f2I+GXISoLjw==
   dependencies:
     clsx "^1.1.0"
     prop-types "^15.5.0"
@@ -6379,40 +6150,40 @@ redent@^3.0.0:
     indent-string "^4.0.0"
     strip-indent "^3.0.0"
 
-redoc@^2.0.0-rc.30:
-  version "2.0.0-rc.33"
-  resolved "https://registry.yarnpkg.com/redoc/-/redoc-2.0.0-rc.33.tgz#df43f533bb0cc283cc209d69d2a91404a24bd8d1"
-  integrity sha512-1KLdnOU1aBIddgNBcEIU29h3VqXoTT493gT5hjyHg6sE91x9qEVWPYM2A+eETQFz5ygTwkBCp6xZDxVs+HIA9w==
+redoc@^2.0.0-rc.48:
+  version "2.0.0-rc.48"
+  resolved "https://registry.yarnpkg.com/redoc/-/redoc-2.0.0-rc.48.tgz#5303cff67af5cba8a2b48dc1347a9854d45be835"
+  integrity sha512-shArJWhNG2gQ0XKxW8WcfG8peNOtxbZ86CqxgrR9P7MnE5ESAo559CH/PSlezePeVLpcC0C9tcimOfSN5MaAvA==
   dependencies:
+    "@redocly/react-dropdown-aria" "^2.0.11"
     "@types/node" "^13.11.1"
     classnames "^2.2.6"
     decko "^1.2.0"
-    dompurify "^2.0.8"
-    eventemitter3 "^4.0.0"
+    dompurify "^2.0.12"
+    eventemitter3 "^4.0.4"
     json-pointer "^0.6.0"
     json-schema-ref-parser "^6.1.0"
     lunr "2.3.8"
     mark.js "^8.11.1"
     marked "^0.7.0"
     memoize-one "~5.1.1"
-    mobx-react "6.1.5"
-    openapi-sampler "^1.0.0-beta.16"
+    mobx-react "^7.0.5"
+    openapi-sampler "^1.0.0-beta.18"
     perfect-scrollbar "^1.4.0"
-    polished "^3.4.4"
-    prismjs "^1.19.0"
+    polished "^3.6.5"
+    prismjs "^1.22.0"
     prop-types "^15.7.2"
-    react-dropdown-aria "^2.0.6"
-    react-tabs "^3.1.0"
-    slugify "^1.4.0"
+    react-tabs "^3.1.1"
+    slugify "^1.4.4"
     stickyfill "^1.1.1"
-    swagger2openapi "^5.3.4"
-    tslib "^1.11.1"
+    swagger2openapi "^6.2.1"
+    tslib "^2.0.0"
     url-template "^2.0.8"
 
-reftools@^1.1.0, reftools@^1.1.3:
-  version "1.1.3"
-  resolved "https://registry.yarnpkg.com/reftools/-/reftools-1.1.3.tgz#f430d11677d81ae97b8dbb3836713bb52b1cd0a7"
-  integrity sha512-JTlhKmSzqE/gt5Z5RX25yZDq67MlRRtTz1gLy/NY+wPDx1e1vEJsv1PoNrpKZBwitcEMXs2k7pzmbmraP1ZMAQ==
+reftools@^1.1.5, reftools@^1.1.8:
+  version "1.1.8"
+  resolved "https://registry.yarnpkg.com/reftools/-/reftools-1.1.8.tgz#cc08fd67eb913d779fd330657d010cc080c7d643"
+  integrity sha512-Yvz9NH8uFHzD/AXX82Li1GdAP6FzDBxEZw+njerNBBQv/XHihqsWAjNfXtaq4QD2l4TEZVnp4UbktdYSegAM3g==
 
 regenerate@^1.2.1:
   version "1.4.0"
@@ -6551,11 +6322,6 @@ require-directory@^2.1.1:
   resolved "https://registry.yarnpkg.com/require-directory/-/require-directory-2.1.1.tgz#8c64ad5fd30dab1c976e2344ffe7f792a6a6df42"
   integrity sha1-jGStX9MNqxyXbiNE/+f3kqam30I=
 
-require-main-filename@^1.0.1:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/require-main-filename/-/require-main-filename-1.0.1.tgz#97f717b69d48784f5f526a6c5aa8ffdda055a4d1"
-  integrity sha1-l/cXtp1IeE9fUmpsWqj/3aBVpNE=
-
 require-main-filename@^2.0.0:
   version "2.0.0"
   resolved "https://registry.yarnpkg.com/require-main-filename/-/require-main-filename-2.0.0.tgz#d0b329ecc7cc0f61649f62215be69af54aa8989b"
@@ -6863,11 +6629,6 @@ should@^13.2.1:
     should-type-adaptors "^1.0.1"
     should-util "^1.0.0"
 
-signal-exit@^3.0.0:
-  version "3.0.3"
-  resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.3.tgz#a1410c2edd8f077b08b4e253c8eacfcaf057461c"
-  integrity sha512-VUJ49FC8U1OxwZLxIbTTrDvLnf/6TDgxZcK8wxR8zs13xpx7xbG60ndBlhNrFi2EMuFRoeDoJO7wthSLq42EjA==
-
 signal-exit@^3.0.2:
   version "3.0.2"
   resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.2.tgz#b5fdc08f1287ea1178628e415e25132b73646c6d"
@@ -6899,10 +6660,10 @@ slice-ansi@^2.1.0:
     astral-regex "^1.0.0"
     is-fullwidth-code-point "^2.0.0"
 
-slugify@^1.4.0:
-  version "1.4.4"
-  resolved "https://registry.yarnpkg.com/slugify/-/slugify-1.4.4.tgz#2f032ffa52b1e1ca2a27737c1ce47baae3d0883a"
-  integrity sha512-N2+9NJ8JzfRMh6PQLrBeDEnVDQZSytE/W4BTC4fNNPmO90Uu58uNwSlIJSs+lmPgWsaAF79WLhVPe5tuy7spjw==
+slugify@^1.4.4:
+  version "1.4.7"
+  resolved "https://registry.yarnpkg.com/slugify/-/slugify-1.4.7.tgz#e42359d505afd84a44513280868e31202a79a628"
+  integrity sha512-tf+h5W1IrjNm/9rKKj0JU2MDMruiopx0jjVA5zCdBtcGjfp0+c5rHw/zADLC3IeKlGHtVbHtpfzvYA0OYT+HKg==
 
 snapdragon-node@^2.0.1:
   version "2.1.1"
@@ -6987,7 +6748,7 @@ source-map@^0.6.0, source-map@^0.6.1, source-map@~0.6.1:
   resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263"
   integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==
 
-source-map@^0.7.2, source-map@^0.7.3:
+source-map@^0.7.3:
   version "0.7.3"
   resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.7.3.tgz#5302f8169031735226544092e64981f751750383"
   integrity sha512-CkCj6giN3S+n9qrYiBTX5gystlENnRW5jZeNLHpe6aue+SrHcG5VYwujhW9s4dY31mEGsxBDrHR6oI69fTXsaQ==
@@ -7109,23 +6870,6 @@ strict-uri-encode@^1.0.0:
   resolved "https://registry.yarnpkg.com/strict-uri-encode/-/strict-uri-encode-1.1.0.tgz#279b225df1d582b1f54e65addd4352e18faa0713"
   integrity sha1-J5siXfHVgrH1TmWt3UNS4Y+qBxM=
 
-string-width@^1.0.1:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/string-width/-/string-width-1.0.2.tgz#118bdf5b8cdc51a2a7e70d211e07e2b0b9b107d3"
-  integrity sha1-EYvfW4zcUaKn5w0hHgfisLmxB9M=
-  dependencies:
-    code-point-at "^1.0.0"
-    is-fullwidth-code-point "^1.0.0"
-    strip-ansi "^3.0.0"
-
-string-width@^2.0.0, string-width@^2.1.1:
-  version "2.1.1"
-  resolved "https://registry.yarnpkg.com/string-width/-/string-width-2.1.1.tgz#ab93f27a8dc13d28cac815c462143a6d9012ae9e"
-  integrity sha512-nOqH59deCq9SRHlxq1Aw85Jnt4w6KvLKqWVik6oA9ZklXLNIOlqg4F2yrT1MVaTjAqvVwdfeZ7w7aCvJD7ugkw==
-  dependencies:
-    is-fullwidth-code-point "^2.0.0"
-    strip-ansi "^4.0.0"
-
 string-width@^3.0.0, string-width@^3.1.0:
   version "3.1.0"
   resolved "https://registry.yarnpkg.com/string-width/-/string-width-3.1.0.tgz#22767be21b62af1081574306f69ac51b62203961"
@@ -7201,20 +6945,13 @@ stringify-entities@^3.0.0:
     is-decimal "^1.0.2"
     is-hexadecimal "^1.0.0"
 
-strip-ansi@^3.0.0, strip-ansi@^3.0.1:
+strip-ansi@^3.0.0:
   version "3.0.1"
   resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-3.0.1.tgz#6a385fb8853d952d5ff05d0e8aaf94278dc63dcf"
   integrity sha1-ajhfuIU9lS1f8F0Oiq+UJ43GPc8=
   dependencies:
     ansi-regex "^2.0.0"
 
-strip-ansi@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-4.0.0.tgz#a8479022eb1ac368a871389b635262c505ee368f"
-  integrity sha1-qEeQIusaw2iocTibY1JixQXuNo8=
-  dependencies:
-    ansi-regex "^3.0.0"
-
 strip-ansi@^5.0.0, strip-ansi@^5.1.0, strip-ansi@^5.2.0:
   version "5.2.0"
   resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-5.2.0.tgz#8c9a536feb6afc962bdfa5b104a5091c1ad9c0ae"
@@ -7239,11 +6976,6 @@ strip-comments@^2.0.1:
   resolved "https://registry.yarnpkg.com/strip-comments/-/strip-comments-2.0.1.tgz#4ad11c3fbcac177a67a40ac224ca339ca1c1ba9b"
   integrity sha512-ZprKx+bBLXv067WTCALv8SSz5l2+XhpYCsVtSqlMnkAXMWDq+/ekVbl1ghqP9rUHTzv6sm/DwCOiYutU/yp1fw==
 
-strip-eof@^1.0.0:
-  version "1.0.0"
-  resolved "https://registry.yarnpkg.com/strip-eof/-/strip-eof-1.0.0.tgz#bb43ff5598a6eb05d89b59fcd129c983313606bf"
-  integrity sha1-u0P/VZim6wXYm1n80SnJgzE2Br8=
-
 strip-indent@^3.0.0:
   version "3.0.0"
   resolved "https://registry.yarnpkg.com/strip-indent/-/strip-indent-3.0.0.tgz#c32e1cee940b6b3432c771bc2c54bcce73cd3001"
@@ -7344,16 +7076,6 @@ stylelint@^13.6.1:
     v8-compile-cache "^2.1.1"
     write-file-atomic "^3.0.3"
 
-stylis-rule-sheet@^0.0.10:
-  version "0.0.10"
-  resolved "https://registry.yarnpkg.com/stylis-rule-sheet/-/stylis-rule-sheet-0.0.10.tgz#44e64a2b076643f4b52e5ff71efc04d8c3c4a430"
-  integrity sha512-nTbZoaqoBnmK+ptANthb10ZRZOGC+EmTLLUxeYIuHNkEKcmKgXX1XWKkUBT2Ac4es3NybooPe0SmvKdhKJZAuw==
-
-stylis@^3.5.0:
-  version "3.5.4"
-  resolved "https://registry.yarnpkg.com/stylis/-/stylis-3.5.4.tgz#f665f25f5e299cf3d64654ab949a57c768b73fbe"
-  integrity sha512-8/3pSmthWM7lsPBKv7NXkzn2Uc9W7NotcwGNpJaa3k7WMM1XDCA4MgT5k/8BIexd5ydZdboXtU90XH9Ec4Bv/Q==
-
 sugarss@^2.0.0:
   version "2.0.0"
   resolved "https://registry.yarnpkg.com/sugarss/-/sugarss-2.0.0.tgz#ddd76e0124b297d40bf3cca31c8b22ecb43bc61d"
@@ -7411,22 +7133,22 @@ svgo@^1.0.0:
     unquote "~1.1.1"
     util.promisify "~1.0.0"
 
-swagger2openapi@^5.3.4:
-  version "5.4.0"
-  resolved "https://registry.yarnpkg.com/swagger2openapi/-/swagger2openapi-5.4.0.tgz#1e1c8909f7966b1f455bf1b66490093ac1c0029c"
-  integrity sha512-f5QqfXawiVijhjMtYqWZ55ESHPZFqrPC8L9idhIiuSX8O2qsa1i4MVGtCM3TQF+Smzr/6WfT/7zBuzG3aTgPAA==
+swagger2openapi@^6.2.1:
+  version "6.2.3"
+  resolved "https://registry.yarnpkg.com/swagger2openapi/-/swagger2openapi-6.2.3.tgz#4a8059f89d851aee4c9ab178f9b7190debd904e2"
+  integrity sha512-cUUktzLpK69UwpMbcTzjMw2ns9RZChfxh56AHv6+hTx3StPOX2foZjPgds3HlJcINbxosYYBn/D3cG8nwcCWwQ==
   dependencies:
     better-ajv-errors "^0.6.1"
     call-me-maybe "^1.0.1"
     node-fetch-h2 "^2.3.0"
     node-readfiles "^0.2.0"
-    oas-kit-common "^1.0.7"
-    oas-resolver "^2.3.0"
-    oas-schema-walker "^1.1.3"
-    oas-validator "^3.4.0"
-    reftools "^1.1.0"
+    oas-kit-common "^1.0.8"
+    oas-resolver "^2.4.3"
+    oas-schema-walker "^1.1.5"
+    oas-validator "^4.0.8"
+    reftools "^1.1.5"
     yaml "^1.8.3"
-    yargs "^12.0.5"
+    yargs "^15.3.1"
 
 table@^5.2.3, table@^5.4.6:
   version "5.4.6"
@@ -7556,13 +7278,6 @@ to-regex@^3.0.1, to-regex@^3.0.2:
     regex-not "^1.0.2"
     safe-regex "^1.1.0"
 
-touch@^2.0.1:
-  version "2.0.2"
-  resolved "https://registry.yarnpkg.com/touch/-/touch-2.0.2.tgz#ca0b2a3ae3211246a61b16ba9e6cbf1596287164"
-  integrity sha512-qjNtvsFXTRq7IuMLweVgFxmEuQ6gLbRs2jQxL80TtZ31dEKWYIxRXquij6w6VimyDek5hD3PytljHmEtAs2u0A==
-  dependencies:
-    nopt "~1.0.10"
-
 trim-newlines@^3.0.0:
   version "3.0.0"
   resolved "https://registry.yarnpkg.com/trim-newlines/-/trim-newlines-3.0.0.tgz#79726304a6a898aa8373427298d54c2ee8b1cb30"
@@ -7598,11 +7313,16 @@ tsconfig-paths@^3.9.0:
     minimist "^1.2.0"
     strip-bom "^3.0.0"
 
-tslib@^1.11.1, tslib@^1.9.0:
+tslib@^1.9.0:
   version "1.13.0"
   resolved "https://registry.yarnpkg.com/tslib/-/tslib-1.13.0.tgz#c881e13cc7015894ed914862d276436fa9a47043"
   integrity sha512-i/6DQjL8Xf3be4K/E6Wgpekn5Qasl1usyw++dAA35Ue5orEn65VIxOA+YvNNl9HV3qv70T7CNwjODHZrLwvd1Q==
 
+tslib@^2.0.0:
+  version "2.1.0"
+  resolved "https://registry.yarnpkg.com/tslib/-/tslib-2.1.0.tgz#da60860f1c2ecaa5703ab7d39bc05b6bf988b97a"
+  integrity sha512-hcVC3wYEziELGGmEEXue7D75zbwIIVUMWAVbHItGPx0ziyXxrOMQx4rQEVEV45Ut/1IotuEvwqPopzIOkDMf0A==
+
 tty-browserify@0.0.0:
   version "0.0.0"
   resolved "https://registry.yarnpkg.com/tty-browserify/-/tty-browserify-0.0.0.tgz#a157ba402da24e9bf957f9aa69d524eed42901a6"
@@ -8008,14 +7728,6 @@ worker-farm@^1.7.0:
   dependencies:
     errno "~0.1.7"
 
-wrap-ansi@^2.0.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-2.1.0.tgz#d8fc3d284dd05794fe84973caecdd1cf824fdd85"
-  integrity sha1-2Pw9KE3QV5T+hJc8rs3Rz4JP3YU=
-  dependencies:
-    string-width "^1.0.1"
-    strip-ansi "^3.0.1"
-
 wrap-ansi@^5.1.0:
   version "5.1.0"
   resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-5.1.0.tgz#1fd1f67235d5b6d0fee781056001bfb694c03b09"
@@ -8034,6 +7746,15 @@ wrap-ansi@^6.2.0:
     string-width "^4.1.0"
     strip-ansi "^6.0.0"
 
+wrap-ansi@^7.0.0:
+  version "7.0.0"
+  resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
+  integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==
+  dependencies:
+    ansi-styles "^4.0.0"
+    string-width "^4.1.0"
+    strip-ansi "^6.0.0"
+
 wrappy@1:
   version "1.0.2"
   resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
@@ -8061,11 +7782,16 @@ xtend@^4.0.0, xtend@^4.0.1, xtend@~4.0.1:
   resolved "https://registry.yarnpkg.com/xtend/-/xtend-4.0.2.tgz#bb72779f5fa465186b1f438f674fa347fdb5db54"
   integrity sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ==
 
-"y18n@^3.2.1 || ^4.0.0", y18n@^4.0.0:
+y18n@^4.0.0:
   version "4.0.0"
   resolved "https://registry.yarnpkg.com/y18n/-/y18n-4.0.0.tgz#95ef94f85ecc81d007c264e190a120f0a3c8566b"
   integrity sha512-r9S/ZyXu/Xu9q1tYlpsLIsa3EeLXXk0VwlxqTcFRfg9EhMW+17kbt9G0NrgCmhGb5vT2hyhJZLfDGx+7+5Uj/w==
 
+y18n@^5.0.5:
+  version "5.0.5"
+  resolved "https://registry.yarnpkg.com/y18n/-/y18n-5.0.5.tgz#8769ec08d03b1ea2df2500acef561743bbb9ab18"
+  integrity sha512-hsRUr4FFrvhhRH12wOdfs38Gy7k2FFzB9qgN9v3aLykRq0dRcdcpz5C9FxdS2NuhOrI/628b/KSTJ3rwHysYSg==
+
 yallist@^3.0.2:
   version "3.1.1"
   resolved "https://registry.yarnpkg.com/yallist/-/yallist-3.1.1.tgz#dbb7daf9bfd8bac9ab45ebf602b8cbad0d5d08fd"
@@ -8076,6 +7802,11 @@ yallist@^4.0.0:
   resolved "https://registry.yarnpkg.com/yallist/-/yallist-4.0.0.tgz#9bb92790d9c0effec63be73519e11a35019a3a72"
   integrity sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==
 
+yaml@^1.10.0, yaml@^1.8.3:
+  version "1.10.0"
+  resolved "https://registry.yarnpkg.com/yaml/-/yaml-1.10.0.tgz#3b593add944876077d4d683fee01081bd9fff31e"
+  integrity sha512-yr2icI4glYaNG+KWONODapy2/jDdMSDnrONSjblABjD9B4Z5LgiircSt8m8sRZFNi08kG9Sm0uSHtEmP3zaEGg==
+
 yaml@^1.7.2:
   version "1.8.3"
   resolved "https://registry.yarnpkg.com/yaml/-/yaml-1.8.3.tgz#2f420fca58b68ce3a332d0ca64be1d191dd3f87a"
@@ -8083,19 +7814,6 @@ yaml@^1.7.2:
   dependencies:
     "@babel/runtime" "^7.8.7"
 
-yaml@^1.8.3:
-  version "1.10.0"
-  resolved "https://registry.yarnpkg.com/yaml/-/yaml-1.10.0.tgz#3b593add944876077d4d683fee01081bd9fff31e"
-  integrity sha512-yr2icI4glYaNG+KWONODapy2/jDdMSDnrONSjblABjD9B4Z5LgiircSt8m8sRZFNi08kG9Sm0uSHtEmP3zaEGg==
-
-yargs-parser@^11.1.1:
-  version "11.1.1"
-  resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-11.1.1.tgz#879a0865973bca9f6bab5cbdf3b1c67ec7d3bcf4"
-  integrity sha512-C6kB/WJDiaxONLJQnF8ccx9SEeoTTLek8RVbaOIsrAUS8VrBEXfmeSnCZxygc+XC2sNMBIwOOnfcxiynjHsVSQ==
-  dependencies:
-    camelcase "^5.0.0"
-    decamelize "^1.2.0"
-
 yargs-parser@^13.1.2:
   version "13.1.2"
   resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-13.1.2.tgz#130f09702ebaeef2650d54ce6e3e5706f7a4fb38"
@@ -8112,23 +7830,10 @@ yargs-parser@^18.1.2, yargs-parser@^18.1.3:
     camelcase "^5.0.0"
     decamelize "^1.2.0"
 
-yargs@^12.0.5:
-  version "12.0.5"
-  resolved "https://registry.yarnpkg.com/yargs/-/yargs-12.0.5.tgz#05f5997b609647b64f66b81e3b4b10a368e7ad13"
-  integrity sha512-Lhz8TLaYnxq/2ObqHDql8dX8CJi97oHxrjUcYtzKbbykPtVW9WB+poxI+NM2UIzsMgNCZTIf0AQwsjK5yMAqZw==
-  dependencies:
-    cliui "^4.0.0"
-    decamelize "^1.2.0"
-    find-up "^3.0.0"
-    get-caller-file "^1.0.1"
-    os-locale "^3.0.0"
-    require-directory "^2.1.1"
-    require-main-filename "^1.0.1"
-    set-blocking "^2.0.0"
-    string-width "^2.0.0"
-    which-module "^2.0.0"
-    y18n "^3.2.1 || ^4.0.0"
-    yargs-parser "^11.1.1"
+yargs-parser@^20.2.2:
+  version "20.2.6"
+  resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-20.2.6.tgz#69f920addf61aafc0b8b89002f5d66e28f2d8b20"
+  integrity sha512-AP1+fQIWSM/sMiET8fyayjx/J+JmTPt2Mr0FkrgqB4todtfa53sOsrSAcIrJRD5XS20bKUwaDIuMkWKCEiQLKA==
 
 yargs@^13.3.2:
   version "13.3.2"
@@ -8162,3 +7867,16 @@ yargs@^15.3.1:
     which-module "^2.0.0"
     y18n "^4.0.0"
     yargs-parser "^18.1.2"
+
+yargs@^16.1.1:
+  version "16.2.0"
+  resolved "https://registry.yarnpkg.com/yargs/-/yargs-16.2.0.tgz#1c82bf0f6b6a66eafce7ef30e376f49a12477f66"
+  integrity sha512-D1mvvtDG0L5ft/jGWkLpG1+m0eQxOfaBvTNELraWj22wSVUMWxZUvYgJYcKh6jGGIkJFhH4IZPQhR4TKpc8mBw==
+  dependencies:
+    cliui "^7.0.2"
+    escalade "^3.1.1"
+    get-caller-file "^2.0.5"
+    require-directory "^2.1.1"
+    string-width "^4.2.0"
+    y18n "^5.0.5"
+    yargs-parser "^20.2.2"

[airflow] 28/42: Bugfix: Plugins endpoint was unauthenticated (#14570)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bfe57d3fcdd6dde925a5207a3ba04a1b1cde7a4d
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Tue Mar 2 23:48:10 2021 +0000

    Bugfix: Plugins endpoint was unauthenticated (#14570)
    
    The plugins endpoint was missing an auth check
    
    (cherry picked from commit 0a969db2b025709505f8043721c83218a73bb84d)
---
 airflow/www/views.py    | 5 +++++
 tests/www/test_views.py | 6 ++++++
 2 files changed, 11 insertions(+)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index 78dbbea..fbee413 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2969,6 +2969,11 @@ class PluginView(AirflowBaseView):
     ]
 
     @expose('/plugin')
+    @auth.has_access(
+        [
+            (permissions.ACTION_CAN_READ, permissions.RESOURCE_PLUGIN),
+        ]
+    )
     def list(self):
         """List loaded plugins."""
         plugins_manager.ensure_plugins_loaded()
diff --git a/tests/www/test_views.py b/tests/www/test_views.py
index efcb46e..b391e56 100644
--- a/tests/www/test_views.py
+++ b/tests/www/test_views.py
@@ -361,6 +361,12 @@ class TestPluginView(TestBase):
         self.check_content_in_response("source", resp)
         self.check_content_in_response("<em>test-entrypoint-testpluginview==1.0.0:</em> <Mock id=", resp)
 
+    def test_endpoint_should_not_be_unauthenticated(self):
+        self.logout()
+        resp = self.client.get('/plugin', follow_redirects=True)
+        self.check_content_not_in_response("test_plugin", resp)
+        self.check_content_in_response("Sign In - Airflow", resp)
+
 
 class TestPoolModelView(TestBase):
     def setUp(self):

[airflow] 32/42: Remember expanded task groups in localStorage (#14661)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3a0fb37dd82724a4b0b602d436e6a0c464ba00d9
Author: yuqian90 <yu...@gmail.com>
AuthorDate: Fri Mar 12 23:11:58 2021 +0800

    Remember expanded task groups in localStorage (#14661)
    
    * Save expanded and focused state of TaskGroups in localStorage
    
    * Restore focus on refresh
    
    * Address style issues according to comments
    
    (cherry picked from commit 456a7ddfd1da797d493abd8a57d36d05424eaaa6)
---
 airflow/www/templates/airflow/graph.html | 117 +++++++++++++++++++++++++++++--
 1 file changed, 110 insertions(+), 7 deletions(-)

diff --git a/airflow/www/templates/airflow/graph.html b/airflow/www/templates/airflow/graph.html
index 844ce38..807cef1 100644
--- a/airflow/www/templates/airflow/graph.html
+++ b/airflow/www/templates/airflow/graph.html
@@ -235,7 +235,7 @@
         });
 
         d3.selectAll("g.node").on("mouseout", function (d) {
-          d3.select(this).selectAll("rect").style("stroke", null);
+          d3.select(this).selectAll("rect,circle").style("stroke", null);
           highlight_nodes(g.predecessors(d), null, initialStrokeWidth)
           highlight_nodes(g.successors(d), null, initialStrokeWidth)
           d3.selectAll("g.node")
@@ -244,6 +244,7 @@
             .style("stroke-width", initialStrokeWidth);
           d3.selectAll("g.edgePath")
             .style("opacity", 1);
+          localStorage.removeItem(focused_group_key(dag_id));
         });
         updateNodesStates(task_instances);
         setUpZoomSupport();
@@ -417,6 +418,8 @@
               .style("opacity", 1);
           d3.selectAll('.js-state-legend-item')
               .style("background-color", null);
+
+          localStorage.removeItem(focused_group_key(dag_id));
       }
 
       function focusState(state, node, color){
@@ -591,6 +594,22 @@
         return children
       }
 
+      // Return list of all task group ids in the given task group including the given group itself.
+      function get_all_group_ids(group) {
+        var children = [group.id];
+
+        for (const [key, val] of Object.entries(group.children)) {
+          if (val.children != undefined) {
+            // group
+            const sub_group_children = get_all_group_ids(val)
+            for (const id of sub_group_children) {
+              children.push(id);
+            }
+          }
+        }
+        return children;
+      }
+
 
       // Return the state for the node based on the state of its taskinstance or that of its
       // children if it's a group node
@@ -626,6 +645,16 @@
         return "no_status"
       }
 
+      // Returns the key used to store expanded task group ids in localStorage
+      function expanded_groups_key(dag_id) {
+          return `expanded_groups_${dag_id}`;
+      }
+
+      // Returns the key used to store the focused task group id in localStorage
+      function focused_group_key(dag_id) {
+          return `focused_group_${dag_id}`;
+      }
+
       // Focus the graph on the expanded/collapsed node
       function focus_group(node_id) {
         if(node_id != null && zoom != null) {
@@ -668,11 +697,13 @@
                       .style("opacity", 0.2).duration(duration)
               }
             });
+
+            localStorage.setItem(focused_group_key(dag_id), node_id);
         }
       }
 
       // Expands a group node
-      function expand_group(node_id, node) {
+      function expand_group(node_id, node, focus=true) {
         node.children.forEach(function (val) {
           // Set children nodes
           g.setNode(val.id, val.value)
@@ -706,17 +737,22 @@
         })
 
         draw()
-        focus_group(node_id)
+
+        if (focus) {
+          focus_group(node_id);
+        }
+
+        save_expanded_group(node_id)
     }
 
     // Remove the node with this node_id from g.
     function remove_node(node_id) {
-      if(g.hasNode(node_id)) {
+      if (g.hasNode(node_id)) {
           node = g.node(node_id)
           if(node.children != undefined) {
             // If the child is an expanded group node, remove children too.
             node.children.forEach(function (child) {
-              remove_node(child.id)
+              remove_node(child.id);
             })
           }
       }
@@ -745,10 +781,77 @@
 
         draw()
         focus_group(node_id)
+
+        remove_expanded_group(node_id, node);
       }
 
-      expand_group(null, nodes)
+    function get_saved_groups(dag_id) {
+        // expanded_groups is a Set
+        try {
+            var expanded_groups = new Set(JSON.parse(localStorage.getItem(expanded_groups_key(dag_id))));
+        } catch {
+            var expanded_groups = new Set();
+        }
+
+        return expanded_groups;
+    }
+
+    // Clean up invalid group_ids from saved_group_ids (e.g. due to DAG changes)
+    function prune_invalid_saved_group_ids() {
+        // All the group_ids in the whole DAG
+        const all_group_ids = new Set(get_all_group_ids(nodes));
+        var expanded_groups = get_saved_groups(dag_id);
+        expanded_groups = Array.from(expanded_groups).filter(group_id => all_group_ids.has(group_id));
+        localStorage.setItem(expanded_groups_key(dag_id), JSON.stringify(expanded_groups));
+    }
+
+    // Remember the expanded groups in local storage so that it can be used to restore the expanded state
+    // of task groups.
+    function save_expanded_group(node_id) {
+        // expanded_groups is a Set
+        var expanded_groups = get_saved_groups(dag_id);
+        expanded_groups.add(node_id)
+        localStorage.setItem(expanded_groups_key(dag_id), JSON.stringify(Array.from(expanded_groups)));
+    }
+
+    // Remove the node_id from the expanded state
+    function remove_expanded_group(node_id, node) {
+        var expanded_groups = get_saved_groups(dag_id);
+        const child_group_ids = get_all_group_ids(node);
+        child_group_ids.forEach(child_id => expanded_groups.delete(child_id));
+        localStorage.setItem(expanded_groups_key(dag_id), JSON.stringify(Array.from(expanded_groups)));
+    }
+
+    // Restore previously expanded task groups
+    function expand_saved_groups(expanded_groups, node) {
+        if (node.children == undefined) {
+            return;
+        }
+
+        node.children.forEach(function (child_node) {
+            if(expanded_groups.has(child_node.id)) {
+                expand_group(child_node.id, g.node(child_node.id), false);
+
+                expand_saved_groups(expanded_groups, child_node);
+            }
+        });
+    }
+
+    prune_invalid_saved_group_ids();
+    const focus_node_id = localStorage.getItem(focused_group_key(dag_id));
+    const expanded_groups = get_saved_groups(dag_id);
+
+    // Always expand the root node
+    expand_group(null, nodes);
+
+    // Expand the nodes that were previously expanded
+    expand_saved_groups(expanded_groups, nodes);
+
+    // Restore focus (if available)
+    if(g.hasNode(focus_node_id)) {
+      focus_group(focus_node_id);
+    }
 
-      initRefresh();
+    initRefresh();
   </script>
 {% endblock %}

[airflow] 20/42: BugFix: Serialize max_retry_delay as a timedelta (#14436)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 041c9d27e373d8e778ce8f2b08f86b76a845d064
Author: Paul Vickers <pa...@outlook.com>
AuthorDate: Fri Feb 26 11:42:00 2021 +0000

    BugFix: Serialize max_retry_delay as a timedelta (#14436)
    
    closes: #13086, #14212
    
    (cherry picked from commit 59c459fa2a6aafc133db4a89980fb3d3d0d25589)
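
A minimal sketch (hypothetical DAG and task ids, not part of this commit) of the pattern the fix covers: `max_retry_delay` caps the exponential backoff between retries and, with this change, round-trips through DAG serialization as a `timedelta`; a bare number is still interpreted as seconds.

```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.dummy import DummyOperator

with DAG(
    dag_id="retry_backoff_example",  # hypothetical dag_id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    default_args={
        "retries": 3,
        "retry_delay": timedelta(minutes=5),
        "retry_exponential_backoff": True,
        # Caps the exponentially growing delay; now serialized as a timedelta.
        "max_retry_delay": timedelta(minutes=30),
    },
) as dag:
    DummyOperator(task_id="example_task")
```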
---
 airflow/models/baseoperator.py                | 9 ++++++++-
 airflow/serialization/schema.json             | 1 +
 airflow/serialization/serialized_objects.py   | 2 +-
 tests/serialization/test_dag_serialization.py | 4 ++++
 4 files changed, 14 insertions(+), 2 deletions(-)

diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
index 64ed4c5..06094a1 100644
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -353,7 +353,7 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, metaclass=BaseOperatorMeta
         retries: Optional[int] = conf.getint('core', 'default_task_retries', fallback=0),
         retry_delay: timedelta = timedelta(seconds=300),
         retry_exponential_backoff: bool = False,
-        max_retry_delay: Optional[datetime] = None,
+        max_retry_delay: Optional[timedelta] = None,
         start_date: Optional[datetime] = None,
         end_date: Optional[datetime] = None,
         depends_on_past: bool = False,
@@ -460,6 +460,13 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, metaclass=BaseOperatorMeta
             self.retry_delay = timedelta(seconds=retry_delay)  # noqa
         self.retry_exponential_backoff = retry_exponential_backoff
         self.max_retry_delay = max_retry_delay
+        if max_retry_delay:
+            if isinstance(max_retry_delay, timedelta):
+                self.max_retry_delay = max_retry_delay
+            else:
+                self.log.debug("Max_retry_delay isn't timedelta object, assuming secs")
+                self.max_retry_delay = timedelta(seconds=max_retry_delay)  # noqa
+
         self.params = params or {}  # Available in templates!
         self.priority_weight = priority_weight
         if not WeightRule.is_valid(weight_rule):
diff --git a/airflow/serialization/schema.json b/airflow/serialization/schema.json
index c831334..0fbe20f 100644
--- a/airflow/serialization/schema.json
+++ b/airflow/serialization/schema.json
@@ -145,6 +145,7 @@
         "execution_timeout": { "$ref": "#/definitions/timedelta" },
         "retry_delay": { "$ref": "#/definitions/timedelta" },
         "retry_exponential_backoff": { "type": "boolean" },
+        "max_retry_delay": { "$ref": "#/definitions/timedelta" },
         "params": { "$ref": "#/definitions/dict" },
         "priority_weight": { "type": "number" },
         "weight_rule": { "type": "string" },
diff --git a/airflow/serialization/serialized_objects.py b/airflow/serialization/serialized_objects.py
index a5e9f15..d609c09 100644
--- a/airflow/serialization/serialized_objects.py
+++ b/airflow/serialization/serialized_objects.py
@@ -452,7 +452,7 @@ class SerializedBaseOperator(BaseOperator, BaseSerialization):
                 v = set(v)
             elif k == "subdag":
                 v = SerializedDAG.deserialize_dag(v)
-            elif k in {"retry_delay", "execution_timeout", "sla"}:
+            elif k in {"retry_delay", "execution_timeout", "sla", "max_retry_delay"}:
                 v = cls._deserialize_timedelta(v)
             elif k in encoded_op["template_fields"]:
                 pass
diff --git a/tests/serialization/test_dag_serialization.py b/tests/serialization/test_dag_serialization.py
index 2046e22..a775f5b 100644
--- a/tests/serialization/test_dag_serialization.py
+++ b/tests/serialization/test_dag_serialization.py
@@ -60,6 +60,7 @@ serialized_simple_dag_ground_truth = {
                 "depends_on_past": False,
                 "retries": 1,
                 "retry_delay": {"__type": "timedelta", "__var": 300.0},
+                "max_retry_delay": {"__type": "timedelta", "__var": 600.0},
                 "sla": {"__type": "timedelta", "__var": 100.0},
             },
         },
@@ -85,6 +86,7 @@ serialized_simple_dag_ground_truth = {
                 "owner": "airflow",
                 "retries": 1,
                 "retry_delay": 300.0,
+                "max_retry_delay": 600.0,
                 "sla": 100.0,
                 "_downstream_task_ids": [],
                 "_inlets": [],
@@ -113,6 +115,7 @@ serialized_simple_dag_ground_truth = {
                 "task_id": "custom_task",
                 "retries": 1,
                 "retry_delay": 300.0,
+                "max_retry_delay": 600.0,
                 "sla": 100.0,
                 "_downstream_task_ids": [],
                 "_inlets": [],
@@ -160,6 +163,7 @@ def make_simple_dag():
         default_args={
             "retries": 1,
             "retry_delay": timedelta(minutes=5),
+            "max_retry_delay": timedelta(minutes=10),
             "depends_on_past": False,
             "sla": timedelta(seconds=100),
         },

[airflow] 37/42: Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils (#14812)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6ae0cb019fe94e2cd86d6cccc33b33a3a0922bfc
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Mon Mar 15 21:28:06 2021 +0100

    Pin SQLAlchemy to <1.4 due to breakage of sqlalchemy-utils (#14812)
    
    The 1.4 release of SQLAlchemy breaks sqlalchemy-utils.
    
    This change pins it to < 1.4
    
    Fixes #14811
    
    (cherry picked from commit c29f6fb76b9d87c50713ae94fda805b9f789a01d)
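
For environments that already resolved SQLAlchemy 1.4 and hit #14811, manually downgrading to a matching version (an illustrative command, not part of this commit) restores a working install until a release carrying this pin is available:

```
pip install "SQLAlchemy>=1.3.18,<1.4"
```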
---
 setup.cfg | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/setup.cfg b/setup.cfg
index 267d972..ed533ca 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -130,7 +130,8 @@ install_requires =
     requests>=2.20.0
     rich==9.2.0
     setproctitle>=1.1.8, <2
-    sqlalchemy>=1.3.18, <2
+    # SQLAlchemy 1.4 breaks sqlalchemy-utils https://github.com/kvesteri/sqlalchemy-utils/issues/505
+    sqlalchemy>=1.3.18, <1.4
     sqlalchemy_jsonfield~=1.0
     tabulate>=0.7.5, <0.9
     tenacity~=6.2.0

[airflow] 12/42: Use `Lax` for `cookie_samesite` when empty string is passed (#14183)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7790b2f68831091b036eed9d7f88cbba80c7d425
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Thu Feb 11 00:20:40 2021 +0000

    Use `Lax` for `cookie_samesite` when empty string is passed (#14183)
    
    closes https://github.com/apache/airflow/issues/13971
    
    The value of `[webserver] cookie_samesite` was changed to `Lax` in >=2.0
    from `''` (empty string) in 1.10.x.
    
    This causes the following error for users migrating from 1.10.x to 2.0
    if the old airflow.cfg already exists.
    
    ```
    Traceback (most recent call last):
    File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
    File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1953, in full_dispatch_request
    return self.finalize_request(rv)
    File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1970, in finalize_request
    response = self.process_response(response)
    File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2269, in process_response
    self.session_interface.save_session(self, ctx.session, response)
    File "/usr/local/lib/python3.9/site-packages/flask/sessions.py", line 379, in save_session
    response.set_cookie(
    File "/usr/local/lib/python3.9/site-packages/werkzeug/wrappers/base_response.py", line 468, in set_cookie
    dump_cookie(
    File "/usr/local/lib/python3.9/site-packages/werkzeug/http.py", line 1217, in dump_cookie
    raise ValueError("SameSite must be 'Strict', 'Lax', or 'None'.")
    ValueError: SameSite must be 'Strict', 'Lax', or 'None'.**
    ```
    
    This commit takes care of it by using `Lax` when the value is an empty string (``).
    
    (cherry picked from commit 4336f4cfdbd843085672b8e49367cf1b9ab4a432)
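
A quick way to confirm the effective value after upgrading (a sketch assuming an initialized Airflow 2.0 environment; it mirrors the new unit test rather than being part of the commit itself):

```
from airflow.www import app as application

# Build the webserver app the same way the new test does. Even if airflow.cfg
# still carries the old empty-string value, the session cookie policy is Lax.
flask_app = application.cached_app(testing=True)
print(flask_app.config["SESSION_COOKIE_SAMESITE"])  # "Lax"
```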
---
 UPDATING.md           |  2 +-
 airflow/www/app.py    | 12 +++++++++++-
 tests/www/test_app.py |  6 ++++++
 3 files changed, 18 insertions(+), 2 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index 60dc211..d024d51 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -268,7 +268,7 @@ def execution_date_fn(execution_date, ds_nodash, dag):
 ### The default value for `[webserver] cookie_samesite` has been changed to `Lax`
 
 As [recommended](https://flask.palletsprojects.com/en/1.1.x/config/#SESSION_COOKIE_SAMESITE) by Flask, the
-`[webserver] cookie_samesite` has been changed to `Lax` from `None`.
+`[webserver] cookie_samesite` has been changed to `Lax` from `''` (empty string) .
 
 #### Changes to import paths
 
diff --git a/airflow/www/app.py b/airflow/www/app.py
index 77b5a51..aa9b5ed 100644
--- a/airflow/www/app.py
+++ b/airflow/www/app.py
@@ -16,6 +16,7 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+import warnings
 from datetime import timedelta
 from typing import Optional
 
@@ -79,7 +80,16 @@ def create_app(config=None, testing=False, app_name="Airflow"):
 
     flask_app.config['SESSION_COOKIE_HTTPONLY'] = True
     flask_app.config['SESSION_COOKIE_SECURE'] = conf.getboolean('webserver', 'COOKIE_SECURE')
-    flask_app.config['SESSION_COOKIE_SAMESITE'] = conf.get('webserver', 'COOKIE_SAMESITE')
+
+    cookie_samesite_config = conf.get('webserver', 'COOKIE_SAMESITE')
+    if cookie_samesite_config == "":
+        warnings.warn(
+            "Old deprecated value found for `cookie_samesite` option in `[webserver]` section. "
+            "Using `Lax` instead. Change the value to `Lax` in airflow.cfg to remove this warning.",
+            DeprecationWarning,
+        )
+        cookie_samesite_config = "Lax"
+    flask_app.config['SESSION_COOKIE_SAMESITE'] = cookie_samesite_config
 
     if config:
         flask_app.config.from_mapping(config)
diff --git a/tests/www/test_app.py b/tests/www/test_app.py
index b731db5..dddfb71 100644
--- a/tests/www/test_app.py
+++ b/tests/www/test_app.py
@@ -233,6 +233,12 @@ class TestApp(unittest.TestCase):
         app = application.cached_app(testing=True)
         assert app.config['PERMANENT_SESSION_LIFETIME'] == timedelta(minutes=3600)
 
+    @conf_vars({('webserver', 'cookie_samesite'): ''})
+    @mock.patch("airflow.www.app.app", None)
+    def test_correct_default_is_set_for_cookie_samesite(self):
+        app = application.cached_app(testing=True)
+        assert app.config['SESSION_COOKIE_SAMESITE'] == 'Lax'
+
 
 class TestFlaskCli(unittest.TestCase):
     def test_flask_cli_should_display_routes(self):

[airflow] 01/42: Note that the DB must be using UTF-8 (#14742)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ac43056777af2e5206fa50ebcb0cf56ad86eea42
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Fri Mar 12 11:37:00 2021 +0000

    Note that the DB must be using UTF-8 (#14742)
    
    Without it, non-ASCII characters in a serialized DAG will cause an error.
    
    (cherry picked from commit b40beb3036b8221053fdb7ab537a45afccf0bd8e)
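
As an illustration only (the database name follows the doc's existing example; the statements are not part of this commit), creating the metadata database with a UTF-8 character set looks roughly like this:

```
-- MySQL
CREATE DATABASE airflow_db CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

-- PostgreSQL (TEMPLATE template0 may be needed if the cluster's default
-- template uses a non-UTF-8 locale)
CREATE DATABASE airflow_db WITH ENCODING 'UTF8' TEMPLATE template0;
```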
---
 docs/apache-airflow/howto/set-up-database.rst | 9 +++++++++
 1 file changed, 9 insertions(+)

diff --git a/docs/apache-airflow/howto/set-up-database.rst b/docs/apache-airflow/howto/set-up-database.rst
index 153ca80..58e8123 100644
--- a/docs/apache-airflow/howto/set-up-database.rst
+++ b/docs/apache-airflow/howto/set-up-database.rst
@@ -116,6 +116,11 @@ In the example below, a database ``airflow_db`` and user  with username ``airflo
    CREATE USER 'airflow_user' IDENTIFIED BY 'airflow_pass';
    GRANT ALL PRIVILEGES ON airflow_db.* TO 'airflow_user';
 
+
+.. note::
+
+   The database must use a UTF-8 character set
+
 We rely on more strict ANSI SQL settings for MySQL in order to have sane defaults.
 Make sure to have specified ``explicit_defaults_for_timestamp=1`` option under ``[mysqld]`` section
 in your ``my.cnf`` file. You can also activate these options with the ``--explicit-defaults-for-timestamp`` switch passed to ``mysqld`` executable
@@ -150,6 +155,10 @@ In the example below, a database ``airflow_db`` and user  with username ``airflo
    CREATE USER airflow_user WITH PASSWORD 'airflow_pass';
    GRANT ALL PRIVILEGES ON DATABASE airflow_db TO airflow_user;
 
+.. note::
+
+   The database must use a UTF-8 character set
+
 You may need to update your Postgres ``pg_hba.conf`` to add the
 ``airflow`` user to the database access control list; and to reload
 the database configuration to load your change. See