Posted to commits@airflow.apache.org by as...@apache.org on 2021/06/22 13:45:50 UTC

[airflow] branch v2-1-test updated (abc86c4 -> 46cfeee)

This is an automated email from the ASF dual-hosted git repository.

ash pushed a change to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard abc86c4  Don't fail to log if we can't redact something (#16118)
 discard fb714d6  set max tree width to 1200px (#16067)
 discard 789aace  Ensure that we don't try to mask empty string in logs (#16057)
    omit 779811e  Fill the "job_id" field for `airflow task run` without `--local`/`--raw` for KubeExecutor (#16108)
    omit 194762f  Fixes problem where conf variable was used before initialization (#16088)
    omit 3c169fb  Fix apply defaults for task decorator (#16085)
    omit fcf95fe  Parse recently modified files even if just parsed (#16075)
    omit 47ca826  Ensure that we don't try to mask empty string in logs (#16057)
    omit 554a13d  Don't die when masking `log.exception` when there is no exception (#16047)
    omit ec03f81  Restores apply_defaults import in base_sensor_operator (#16040)
    omit 79c99e9  Fix auto-refresh in tree view when webserver UI is not in ``/`` (#16018)
    omit 8dcbdcd  Fix Celery executor getting stuck randomly because of reset_signals in multiprocessing (#15989)
    omit 589be33  Fix dag.clear() to set multiple dags to running when necessary (#15382)
    omit bdf9bc2  Fix dag.clear() to set multiple dags to running when necessary (#15382)
     new 955de9b  Fix dag.clear() to set multiple dags to running when necessary (#15382)
     new dc26721  Fix auto-refresh in tree view when webserver UI is not in ``/`` (#16018)
     new 2cd66a8  Restores apply_defaults import in base_sensor_operator (#16040)
     new 19332cf  Don't die when masking `log.exception` when there is no exception (#16047)
     new c47171b  Ensure that we don't try to mask empty string in logs (#16057)
     new 734f1dc  Parse recently modified files even if just parsed (#16075)
     new 4e431ec  Fix apply defaults for task decorator (#16085)
     new 9a3fb62  Fixes problem where conf variable was used before initialization (#16088)
     new b9e5a2d  Fill the "job_id" field for `airflow task run` without `--local`/`--raw` for KubeExecutor (#16108)
     new bff528b  Ensure that we don't try to mask empty string in logs (#16057)
     new eefa563  set max tree width to 1200px (#16067)
     new 7603ef6  Don't fail to log if we can't redact something (#16118)
     new c6313e4  Fix Orphaned tasks stuck in CeleryExecutor as running (#16550)
     new 1ca495c  Fix tasks in an infinite slots pool were never scheduled (#15247)
     new e58a6a9  Add `passphrase` and `private_key` to default sensitive field names (#16392)
     new acc824f  Fix templated default/example values in config ref docs (#16442)
     new 7b9dd0b  add num_runs query param for tree refresh (#16437)
     new c3bc645  Validate retries value on init for better errors (#16415)
     new 4c06aae  Clean Markdown with dedent to respect indents (#16414)
     new b578120  Fix normalize-url vulnerability (#16375)
     new fb62867  Make task ID on legend have enough width and width of line chart to be 100%.  (#15915)
     new 19468d9  Queue tasks with higher priority and earlier execution_date first. (#15210)
     new 77060cd  Support remote logging in elasticsearch with filebeat 7 (#14625)
     new dbf3064  Make REST API List DAGs endpoint consistent with UI/CLI behaviour (#16318)
     new 5f478ec  Don't show stale Serialized DAGs if they are deleted in DB (#16368)
     new e22b84d  Adding `only_active` parameter to /dags endpoint (#14306)
     new 777fd9b  Correctly handle None returns from Query.scalar() (#16345)
     new 215492d  Tree View UI for larger DAGs & more consistent spacing in Tree View (#16522)
     new 7c094fa  Backfill: Don't create a DagRun if no tasks match task regex (#16461)
     new 4c37aea  Switch to built-in data structures in SecretsMasker (#16424)
     new 6a5e676  Avoid recursing too deep when redacting logs (#16491)
     new 8813c3d  Allow null value for operator field in task_instance schema(REST API) (#16516)
     new e32f22a  Fix unsuccessful KubernetesPod final_state call when `is_delete_operator_pod=True` (#15490)
     new 446e66b  Fix DAG run state not updated while DAG is paused (#16343)
     new 8114542  Fix Dag Details start date bug (#16206)
     new fc30a4c  Fix CLI connections import and migrate logic from secrets to Connection model (#15425)
     new c7a3977  Ensure that `dag_run.conf` is a dict (#15057)
     new 46cfeee  Add back-compat layer to clear_task_instances (#16582)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (abc86c4)
            \
             N -- N -- N   refs/heads/v2-1-test (46cfeee)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 38 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
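
For readers reconstructing this locally, the "omit"/"discard" distinction is
checkable: a commit is only gone for good once no ref still reaches it. Below
is a minimal sketch that drives plain git from Python to list the refs still
containing a commit; the SHAs are taken from the list above, and the local
repository path is an assumption.

    import subprocess

    def refs_containing(repo, sha):
        """List the branches/tags in `repo` that still reach commit `sha`."""
        out = subprocess.run(
            ["git", "-C", repo, "for-each-ref", "--contains", sha,
             "--format=%(refname:short)"],
            capture_output=True, text=True, check=True,
        ).stdout
        return out.split()

    # An "omit" commit is still reachable from some other ref, so this prints
    # a non-empty list; a "discard" commit prints [] and will eventually be
    # garbage-collected once the reflog entries expire.
    print(refs_containing("airflow", "779811e"))  # omitted above
    print(refs_containing("airflow", "abc86c4"))  # discarded above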


Summary of changes:
 UPDATING.md                                        |   11 +-
 airflow/api_connexion/endpoints/dag_endpoint.py    |   17 +-
 airflow/api_connexion/openapi/v1.yaml              |   12 +
 airflow/api_connexion/schemas/dag_schema.py        |   13 +
 airflow/cli/commands/connection_command.py         |   29 +-
 airflow/cli/commands/dag_command.py                |    4 +
 airflow/config_templates/airflow_local_settings.py |    4 +
 airflow/config_templates/config.yml                |   14 +
 airflow/config_templates/default_airflow.cfg       |    6 +
 airflow/configuration.py                           |    2 +-
 airflow/executors/celery_executor.py               |    4 +-
 airflow/jobs/local_task_job.py                     |   15 +
 airflow/jobs/scheduler_job.py                      |    1 +
 airflow/models/baseoperator.py                     |    8 +
 airflow/models/connection.py                       |    6 +-
 airflow/models/dag.py                              |    6 +
 airflow/models/dagbag.py                           |   11 +-
 airflow/models/pool.py                             |    6 +-
 airflow/models/serialized_dag.py                   |   10 +-
 airflow/models/taskinstance.py                     |   15 +-
 .../cncf/kubernetes/operators/kubernetes_pod.py    |   25 +-
 .../cncf/kubernetes/utils/pod_launcher.py          |    7 +-
 .../providers/elasticsearch/log/es_task_handler.py |   20 +-
 airflow/typing_compat.py                           |    3 +-
 airflow/utils/log/secrets_masker.py                |   70 +-
 airflow/www/api/experimental/endpoints.py          |    6 +
 airflow/www/package.json                           |    4 +-
 airflow/www/static/js/tree.js                      |    6 +-
 airflow/www/templates/airflow/dag_details.html     |   15 +-
 airflow/www/templates/airflow/tree.html            |    5 +
 airflow/www/templates/airflow/trigger.html         |    2 +-
 airflow/www/utils.py                               |    5 +-
 airflow/www/views.py                               |   41 +-
 airflow/www/yarn.lock                              | 1027 +++++++-------------
 docs/conf.py                                       |   14 +-
 kubernetes_tests/test_kubernetes_pod_operator.py   |   10 +-
 .../test_kubernetes_pod_operator_backcompat.py     |    6 +-
 tests/api_connexion/endpoints/test_dag_endpoint.py |  103 +-
 .../endpoints/test_dag_run_endpoint.py             |   20 +
 .../endpoints/test_task_instance_endpoint.py       |    9 +-
 tests/api_connexion/schemas/test_dag_schema.py     |    5 +
 tests/cli/commands/test_connection_command.py      |   66 +-
 tests/core/test_core.py                            |   46 +
 tests/executors/test_celery_executor.py            |    2 +
 tests/jobs/test_local_task_job.py                  |   42 +-
 tests/jobs/test_scheduler_job.py                   |  194 ++++
 tests/models/test_dagbag.py                        |   31 +-
 tests/models/test_pool.py                          |    4 +-
 .../kubernetes/operators/test_kubernetes_pod.py    |   11 +-
 .../elasticsearch/log/test_es_task_handler.py      |   61 +-
 tests/utils/log/test_secrets_masker.py             |   16 -
 tests/www/api/experimental/test_endpoints.py       |   24 +-
 tests/www/test_utils.py                            |   36 +
 tests/www/views/test_views_trigger_dag.py          |   11 +
 54 files changed, 1288 insertions(+), 853 deletions(-)

[airflow] 21/38: Make task ID on legend have enough width and width of line chart to be 100%. (#15915)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit fb628678e7dc223a5684c29ea538bd22d3541ed9
Author: PengMingHua905001 <84...@users.noreply.github.com>
AuthorDate: Thu Jun 10 23:38:03 2021 +0800

    Make task ID on legend have enough width and width of line chart to be 100%.  (#15915)
    
    * Make task ID on legend have enough width and width of line chart to be 100%.
    
    * Fix pylint errors.
    
    (cherry picked from commit 6e9e56246b216a43eabb050c5b220f3665de6305)
---
 airflow/www/views.py | 31 +++++++++++++++++++++++++++----
 1 file changed, 27 insertions(+), 4 deletions(-)

diff --git a/airflow/www/views.py b/airflow/www/views.py
index 6d8a90e..eadec6c 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -431,6 +431,10 @@ class AirflowBaseView(BaseView):  # noqa: D101
         'macros': macros,
     }
 
+    line_chart_attr = {
+        'legend.maxKeyLength': 200,
+    }
+
     def render_template(self, *args, **kwargs):
         return super().render_template(
             *args,
@@ -2270,8 +2274,18 @@ class Airflow(AirflowBaseView):  # noqa: D101  pylint: disable=too-many-public-m
         if root:
             dag = dag.sub_dag(task_ids_or_regex=root, include_upstream=True, include_downstream=False)
         chart_height = wwwutils.get_chart_height(dag)
-        chart = nvd3.lineChart(name="lineChart", x_is_date=True, height=chart_height, width="1200")
-        cum_chart = nvd3.lineChart(name="cumLineChart", x_is_date=True, height=chart_height, width="1200")
+        chart = nvd3.lineChart(
+            name="lineChart",
+            x_is_date=True,
+            height=chart_height,
+            chart_attr=self.line_chart_attr
+        )
+        cum_chart = nvd3.lineChart(
+            name="cumLineChart",
+            x_is_date=True,
+            height=chart_height,
+            chart_attr=self.line_chart_attr
+        )
 
         y_points = defaultdict(list)
         x_points = defaultdict(list)
@@ -2390,7 +2404,11 @@ class Airflow(AirflowBaseView):  # noqa: D101  pylint: disable=too-many-public-m
 
         chart_height = wwwutils.get_chart_height(dag)
         chart = nvd3.lineChart(
-            name="lineChart", x_is_date=True, y_axis_format='d', height=chart_height, width="1200"
+            name="lineChart",
+            x_is_date=True,
+            y_axis_format='d',
+            height=chart_height,
+            chart_attr=self.line_chart_attr
         )
 
         for task in dag.tasks:
@@ -2460,7 +2478,12 @@ class Airflow(AirflowBaseView):  # noqa: D101  pylint: disable=too-many-public-m
             dag = dag.sub_dag(task_ids_or_regex=root, include_upstream=True, include_downstream=False)
 
         chart_height = wwwutils.get_chart_height(dag)
-        chart = nvd3.lineChart(name="lineChart", x_is_date=True, height=chart_height, width="1200")
+        chart = nvd3.lineChart(
+            name="lineChart",
+            x_is_date=True,
+            height=chart_height,
+            chart_attr=self.line_chart_attr
+        )
         y_points = {}
         x_points = {}
         for task in dag.tasks:

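The patch works because python-nvd3 forwards each `chart_attr` entry into the
generated nvd3 JavaScript (roughly `chart.legend.maxKeyLength(200);`), and
because dropping the hard-coded `width="1200"` lets the SVG default to 100% of
its container. A minimal standalone sketch of the same pattern - it assumes
the python-nvd3 package, and the series name and output file are illustrative:

    from datetime import datetime, timedelta

    import nvd3

    line_chart_attr = {
        # Emitted into the chart's JS config: long task IDs get legend room.
        'legend.maxKeyLength': 200,
    }

    # Note: no width argument, so the chart stretches to fill its container.
    chart = nvd3.lineChart(
        name="lineChart",
        x_is_date=True,
        height=600,
        chart_attr=line_chart_attr,
    )

    base = datetime(2021, 6, 1)
    xdata = [int((base + timedelta(days=i)).timestamp() * 1000) for i in range(5)]
    chart.add_serie(name="a_very_long_task_id_needing_legend_room",
                    x=xdata, y=[1, 3, 2, 5, 4])
    chart.buildhtml()
    with open("chart.html", "w") as fh:
        fh.write(chart.htmlcontent)
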
[airflow] 08/38: Fixes problem where conf variable was used before initialization (#16088)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9a3fb628c864788d671fbe79cf15337fb18aa970
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Thu May 27 18:37:56 2021 +0200

    Fixes problem where conf variable was used before initialization (#16088)
    
    There was a problem that when we initialized the configuration we ran
    validate(), which - among other things - checked whether the connection
    is `sqlite`. When the SQLAlchemy connection was configured not via a
    variable but via a secrets manager, the lookup fell back to the secrets
    backend, which itself has to be configured via conf and initialized.
    The problem is that at that point the "conf" object is not yet created,
    because the "validate()" method has not finished yet and
    "initialize_config" has not yet returned.
    This left the snake eating its own tail.
    
    This PR defers the validate() call until after the secret backends have
    been initialized. The effect is that the secret backends might be
    initialized with configuration that is not valid, but there are no real
    negative consequences of this.
    
    Fixes: #16079
    Fixes: #15685
    
    (cherry picked from commit 65519ab83ddf4bd6fc30c435b5bfccefcb14d596)
---
 airflow/configuration.py      | 4 +---
 tests/www/views/test_views.py | 3 ++-
 2 files changed, 3 insertions(+), 4 deletions(-)

diff --git a/airflow/configuration.py b/airflow/configuration.py
index 4420dda..53e76a6 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -873,9 +873,6 @@ def initialize_config():
 
         log.info('Creating new FAB webserver config file in: %s', WEBSERVER_CONFIG)
         shutil.copy(_default_config_file_path('default_webserver_config.py'), WEBSERVER_CONFIG)
-
-    conf.validate()
-
     return conf
 
 
@@ -1114,6 +1111,7 @@ WEBSERVER_CONFIG = ''  # Set by initialize_config
 
 conf = initialize_config()
 secrets_backend_list = initialize_secrets_backends()
+conf.validate()
 
 
 PY37 = sys.version_info >= (3, 7)
diff --git a/tests/www/views/test_views.py b/tests/www/views/test_views.py
index 738bf26..eac1e5e 100644
--- a/tests/www/views/test_views.py
+++ b/tests/www/views/test_views.py
@@ -44,7 +44,8 @@ def test_configuration_do_not_expose_config(admin_client):
 @mock.patch.dict(os.environ, {"AIRFLOW__CORE__UNIT_TEST_MODE": "False"})
 def test_configuration_expose_config(admin_client):
     # make sure config is initialized (without unit test mote)
-    initialize_config()
+    conf = initialize_config()
+    conf.validate()
     with conf_vars({('webserver', 'expose_config'): 'True'}):
         resp = admin_client.get('configuration', follow_redirects=True)
     check_content_in_response(['Airflow Configuration', 'Running Configuration'], resp)
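
The ordering is the whole fix: validate() may need to read sql_alchemy_conn,
a config read can fall through to the secrets backends, and the backends can
only be built after `conf` exists. A stripped-down sketch of the same
deferred-validation pattern - the class and function bodies are illustrative,
not Airflow's actual internals:

    secrets_backend_list = []

    class Conf:
        def get(self, section, key):
            # Options not set locally fall back to the secrets backends,
            # so those must be initialized before any lookup happens.
            for backend in secrets_backend_list:
                value = backend.get_config(section, key)
                if value is not None:
                    return value
            return None

        def validate(self):
            # E.g. check whether sql_alchemy_conn points at sqlite. This
            # calls get(), and may therefore touch the secrets backends.
            self.get("core", "sql_alchemy_conn")

    def initialize_config():
        # Before the fix, validate() ran in here - while the module-level
        # name `conf` was still unbound and no backends existed yet.
        return Conf()

    def initialize_secrets_backends():
        return []  # built from `conf` in the real code

    conf = initialize_config()
    secrets_backend_list = initialize_secrets_backends()
    conf.validate()  # deferred: both conf and the backends now exist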

[airflow] 20/38: Fix normalize-url vulnerability (#16375)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit b578120da30a450f2310d5971f0192e67e144e59
Author: Brent Bovenzi <br...@gmail.com>
AuthorDate: Fri Jun 11 00:57:41 2021 -0500

    Fix normalize-url vulnerability (#16375)
    
    Update two packages that used a highly vulnerable version of normalize-url
    
    See https://github.com/facebook/create-react-app/issues/11054
    
    (cherry picked from commit 70bf1b12821e5ac3869cee27ef54b3ee5cc66f47)
---
 airflow/www/package.json |    4 +-
 airflow/www/yarn.lock    | 1027 +++++++++++++++++-----------------------------
 2 files changed, 376 insertions(+), 655 deletions(-)

diff --git a/airflow/www/package.json b/airflow/www/package.json
index 12e808c..1d00d3c 100644
--- a/airflow/www/package.json
+++ b/airflow/www/package.json
@@ -44,9 +44,9 @@
     "eslint-plugin-standard": "^4.0.1",
     "file-loader": "^6.0.0",
     "imports-loader": "^1.1.0",
-    "mini-css-extract-plugin": "0.9.0",
+    "mini-css-extract-plugin": "1.6.0",
     "moment-locales-webpack-plugin": "^1.2.0",
-    "optimize-css-assets-webpack-plugin": "^5.0.4",
+    "optimize-css-assets-webpack-plugin": "6.0.0",
     "style-loader": "^1.2.1",
     "stylelint": "^13.6.1",
     "stylelint-config-standard": "^20.0.0",
diff --git a/airflow/www/yarn.lock b/airflow/www/yarn.lock
index 3416510..2c3a1e0 100644
--- a/airflow/www/yarn.lock
+++ b/airflow/www/yarn.lock
@@ -337,6 +337,11 @@
     remark "^13.0.0"
     unist-util-find-all-after "^3.0.2"
 
+"@trysound/sax@0.1.1":
+  version "0.1.1"
+  resolved "https://registry.yarnpkg.com/@trysound/sax/-/sax-0.1.1.tgz#3348564048e7a2d7398c935d466c0414ebb6a669"
+  integrity sha512-Z6DoceYb/1xSg5+e+ZlPZ9v0N16ZvZ+wYMraFue4HYrE4ttONKtsvruIRf6t9TBR0YvSOfi1hUU0fJfBLCDYow==
+
 "@types/anymatch@*":
   version "1.3.1"
   resolved "https://registry.yarnpkg.com/@types/anymatch/-/anymatch-1.3.1.tgz#336badc1beecb9dacc38bea2cf32adf627a8421a"
@@ -360,6 +365,11 @@
   resolved "https://registry.yarnpkg.com/@types/json-schema/-/json-schema-7.0.5.tgz#dcce4430e64b443ba8945f0290fb564ad5bac6dd"
   integrity sha512-7+2BITlgjgDhH0vvwZU/HZJVyk+2XUlvxXe8dFMedNX/aMkaOq++rMAFXc0tM7ij15QaWlbdQASBR9dihi+bDQ==
 
+"@types/json-schema@^7.0.6":
+  version "7.0.7"
+  resolved "https://registry.yarnpkg.com/@types/json-schema/-/json-schema-7.0.7.tgz#98a993516c859eb0d5c4c8f098317a9ea68db9ad"
+  integrity sha512-cxWFQVseBm6O9Gbw1IWb8r6OS4OhSt3hPZLkFApLjM8TEXROBuQGLAH2i2gZpcXdLBIrpXuTDhH7Vbm1iXmNGA==
+
 "@types/json5@^0.0.29":
   version "0.0.29"
   resolved "https://registry.yarnpkg.com/@types/json5/-/json5-0.0.29.tgz#ee28707ae94e11d2b827bcbe5270bcea7f3e71ee"
@@ -402,11 +412,6 @@
   resolved "https://registry.yarnpkg.com/@types/parse-json/-/parse-json-4.0.0.tgz#2f8bb441434d163b35fb8ffdccd7138927ffb8c0"
   integrity sha512-//oorEZjL6sbPcKUaCdIGlIUeH26mgzimjBB77G6XRgnDl/L5wOnpyBGRe/Mmf5CVW3PwEBE1NjiMZ/ssFh4wA==
 
-"@types/q@^1.5.1":
-  version "1.5.4"
-  resolved "https://registry.yarnpkg.com/@types/q/-/q-1.5.4.tgz#15925414e0ad2cd765bfef58842f7e26a7accb24"
-  integrity sha512-1HcDas8SEj4z1Wc696tH56G8OlRaH/sqZOynNNB+HF0WOeXPaxTtbYzJY2oEfiUxjSKjhCKr+MvR7dCHcEelug==
-
 "@types/source-list-map@*":
   version "0.1.2"
   resolved "https://registry.yarnpkg.com/@types/source-list-map/-/source-list-map-0.1.2.tgz#0078836063ffaf17412349bba364087e0ac02ec9"
@@ -638,6 +643,11 @@ ajv-keywords@^3.1.0, ajv-keywords@^3.4.1:
   resolved "https://registry.yarnpkg.com/ajv-keywords/-/ajv-keywords-3.5.1.tgz#b83ca89c5d42d69031f424cad49aada0236c6957"
   integrity sha512-KWcq3xN8fDjSB+IMoh2VaXVhRI0BBGxoYp3rx7Pkb6z0cFjYR9Q9l4yZqqals0/zsioCmocC5H6UvsGD4MoIBA==
 
+ajv-keywords@^3.5.2:
+  version "3.5.2"
+  resolved "https://registry.yarnpkg.com/ajv-keywords/-/ajv-keywords-3.5.2.tgz#31f29da5ab6e00d1c2d329acf7b5929614d5014d"
+  integrity sha512-5p6WTN0DdTGVQk6VjcEju19IgaHudalcfabD7yhDGeA6bcQnmL+CpveLJq/3hvfwd1aof6L386Ougkx6RfyMIQ==
+
 ajv@^5.5.2:
   version "5.5.2"
   resolved "https://registry.yarnpkg.com/ajv/-/ajv-5.5.2.tgz#73b5eeca3fab653e3d3f9422b341ad42205dc965"
@@ -658,7 +668,7 @@ ajv@^6.1.0, ajv@^6.10.0, ajv@^6.12.2:
     json-schema-traverse "^0.4.1"
     uri-js "^4.2.2"
 
-ajv@^6.10.2:
+ajv@^6.10.2, ajv@^6.12.5:
   version "6.12.6"
   resolved "https://registry.yarnpkg.com/ajv/-/ajv-6.12.6.tgz#baf5a62e802b07d977034586f8c3baf5adf26df4"
   integrity sha512-j3fVLgvTo527anyYyJOGTYJbG+vnnQYvE0m5mmkc1TK+nxAppkCLMIL0aZ4dblVCNoGShhm+kzE4ZUykBoMg4g==
@@ -678,7 +688,7 @@ ajv@^8.0.1:
     require-from-string "^2.0.2"
     uri-js "^4.2.2"
 
-alphanum-sort@^1.0.0:
+alphanum-sort@^1.0.2:
   version "1.0.2"
   resolved "https://registry.yarnpkg.com/alphanum-sort/-/alphanum-sort-1.0.2.tgz#97a1119649b211ad33691d9f9f486a8ec9fbe0a3"
   integrity sha1-l6ERlkmyEa0zaR2fn0hqjsn74KM=
@@ -1130,7 +1140,7 @@ bn.js@^5.1.1:
   resolved "https://registry.yarnpkg.com/bn.js/-/bn.js-5.1.2.tgz#c9686902d3c9a27729f43ab10f9d79c2004da7b0"
   integrity sha512-40rZaf3bUNKTVYu9sIeeEGOg7g14Yvnj9kH7b50EiwX0Q7A6umbvfI5tvHaOERH0XigqKkfLkFQxzb4e6CIXnA==
 
-boolbase@^1.0.0, boolbase@~1.0.0:
+boolbase@^1.0.0:
   version "1.0.0"
   resolved "https://registry.yarnpkg.com/boolbase/-/boolbase-1.0.0.tgz#68dff5fbe60c51eb37725ea9e3ed310dcc1e776e"
   integrity sha1-aN/1++YMUes3cl6p4+0xDcwed24=
@@ -1252,7 +1262,7 @@ browserslist@^4.0.0:
     escalade "^3.1.0"
     node-releases "^1.1.61"
 
-browserslist@^4.12.0, browserslist@^4.14.5:
+browserslist@^4.12.0, browserslist@^4.14.5, browserslist@^4.16.0, browserslist@^4.16.6:
   version "4.16.6"
   resolved "https://registry.yarnpkg.com/browserslist/-/browserslist-4.16.6.tgz#d7901277a5a88e554ed305b183ec9b0c08f66fa2"
   integrity sha512-Wspk/PqO+4W9qp5iUTJsa1B/QrYn1keNCcEP5OvP7WBwT4KaDly0uONYmC6Xa3Z5IqnUgS0KcgLYu1l74x0ZXQ==
@@ -1351,25 +1361,6 @@ call-me-maybe@^1.0.1:
   resolved "https://registry.yarnpkg.com/call-me-maybe/-/call-me-maybe-1.0.1.tgz#26d208ea89e37b5cbde60250a15f031c16a4d66b"
   integrity sha1-JtII6onje1y95gJQoV8DHBak1ms=
 
-caller-callsite@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/caller-callsite/-/caller-callsite-2.0.0.tgz#847e0fce0a223750a9a027c54b33731ad3154134"
-  integrity sha1-hH4PzgoiN1CpoCfFSzNzGtMVQTQ=
-  dependencies:
-    callsites "^2.0.0"
-
-caller-path@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/caller-path/-/caller-path-2.0.0.tgz#468f83044e369ab2010fac5f06ceee15bb2cb1f4"
-  integrity sha1-Ro+DBE42mrIBD6xfBs7uFbsssfQ=
-  dependencies:
-    caller-callsite "^2.0.0"
-
-callsites@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/callsites/-/callsites-2.0.0.tgz#06eb84f00eea413da86affefacbffb36093b3c50"
-  integrity sha1-BuuE8A7qQT2oav/vrL/7Ngk7PFA=
-
 callsites@^3.0.0:
   version "3.1.0"
   resolved "https://registry.yarnpkg.com/callsites/-/callsites-3.1.0.tgz#b3630abd8943432f54b3f0519238e33cd7df2f73"
@@ -1399,15 +1390,10 @@ caniuse-api@^3.0.0:
     lodash.memoize "^4.1.2"
     lodash.uniq "^4.5.0"
 
-caniuse-lite@^1.0.0, caniuse-lite@^1.0.30001135:
-  version "1.0.30001148"
-  resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001148.tgz#dc97c7ed918ab33bf8706ddd5e387287e015d637"
-  integrity sha512-E66qcd0KMKZHNJQt9hiLZGE3J4zuTqE1OnU53miEVtylFbwOEmeA5OsRu90noZful+XGSQOni1aT2tiqu/9yYw==
-
-caniuse-lite@^1.0.30001109, caniuse-lite@^1.0.30001219:
-  version "1.0.30001228"
-  resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001228.tgz#bfdc5942cd3326fa51ee0b42fbef4da9d492a7fa"
-  integrity sha512-QQmLOGJ3DEgokHbMSA8cj2a+geXqmnpyOFT0lhQV6P3/YOJvGDEwoedcwxEQ30gJIwIIunHIicunJ2rzK5gB2A==
+caniuse-lite@^1.0.0, caniuse-lite@^1.0.30001109, caniuse-lite@^1.0.30001135, caniuse-lite@^1.0.30001219:
+  version "1.0.30001236"
+  resolved "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001236.tgz"
+  integrity sha512-o0PRQSrSCGJKCPZcgMzl5fUaj5xHe8qA2m4QRvnyY4e1lITqoNkr7q/Oh1NcpGSy0Th97UZ35yoKcINPoq7YOQ==
 
 chalk@^1.1.3:
   version "1.1.3"
@@ -1600,15 +1586,6 @@ co@^4.6.0:
   resolved "https://registry.yarnpkg.com/co/-/co-4.6.0.tgz#6ea6bdf3d853ae54ccb8e47bfa0bf3f9031fb184"
   integrity sha1-bqa989hTrlTMuOR7+gvz+QMfsYQ=
 
-coa@^2.0.2:
-  version "2.0.2"
-  resolved "https://registry.yarnpkg.com/coa/-/coa-2.0.2.tgz#43f6c21151b4ef2bf57187db0d73de229e3e7ec3"
-  integrity sha512-q5/jG+YQnSy4nRTV4F7lPepBJZ8qBNJJDBuJdoejDyLXgmL7IEo+Le2JDZudFTFt7mrCqIRaSjws4ygRCTCAXA==
-  dependencies:
-    "@types/q" "^1.5.1"
-    chalk "^2.4.1"
-    q "^1.1.2"
-
 code-error-fragment@0.0.230:
   version "0.0.230"
   resolved "https://registry.yarnpkg.com/code-error-fragment/-/code-error-fragment-0.0.230.tgz#d736d75c832445342eca1d1fedbf17d9618b14d7"
@@ -1627,7 +1604,7 @@ collection-visit@^1.0.0:
     map-visit "^1.0.0"
     object-visit "^1.0.0"
 
-color-convert@^1.9.0, color-convert@^1.9.1:
+color-convert@^1.9.0:
   version "1.9.3"
   resolved "https://registry.yarnpkg.com/color-convert/-/color-convert-1.9.3.tgz#bb71850690e1f136567de629d2d5471deda4c1e8"
   integrity sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==
@@ -1646,26 +1623,15 @@ color-name@1.1.3:
   resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.3.tgz#a7d0558bd89c42f795dd42328f740831ca53bc25"
   integrity sha1-p9BVi9icQveV3UIyj3QIMcpTvCU=
 
-color-name@^1.0.0, color-name@~1.1.4:
+color-name@~1.1.4:
   version "1.1.4"
   resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.4.tgz#c2a09a87acbde69543de6f63fa3995c826c536a2"
   integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==
 
-color-string@^1.5.4:
-  version "1.5.4"
-  resolved "https://registry.yarnpkg.com/color-string/-/color-string-1.5.4.tgz#dd51cd25cfee953d138fe4002372cc3d0e504cb6"
-  integrity sha512-57yF5yt8Xa3czSEW1jfQDE79Idk0+AkN/4KWad6tbdxUmAs3MvjxlWSWD4deYytcRfoZ9nhKyFl1kj5tBvidbw==
-  dependencies:
-    color-name "^1.0.0"
-    simple-swizzle "^0.2.2"
-
-color@^3.0.0:
-  version "3.1.3"
-  resolved "https://registry.yarnpkg.com/color/-/color-3.1.3.tgz#ca67fb4e7b97d611dcde39eceed422067d91596e"
-  integrity sha512-xgXAcTHa2HeFCGLE9Xs/R82hujGtu9Jd9x4NW3T34+OMs7VoPsjwzRczKHvTAHeJwWFwX5j15+MgAppE8ztObQ==
-  dependencies:
-    color-convert "^1.9.1"
-    color-string "^1.5.4"
+colord@^2.0.1:
+  version "2.0.1"
+  resolved "https://registry.yarnpkg.com/colord/-/colord-2.0.1.tgz#1e7fb1f9fa1cf74f42c58cb9c20320bab8435aa0"
+  integrity sha512-vm5YpaWamD0Ov6TSG0GGmUIwstrWcfKQV/h2CmbR7PbNu41+qdB5PW9lpzhjedrpm08uuYvcXi0Oel1RLZIJuA==
 
 colorette@^1.2.1, colorette@^1.2.2:
   version "1.2.2"
@@ -1677,6 +1643,11 @@ commander@2, commander@^2.20.0:
   resolved "https://registry.yarnpkg.com/commander/-/commander-2.20.3.tgz#fd485e84c03eb4881c20722ba48035e8531aeb33"
   integrity sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==
 
+commander@^7.1.0:
+  version "7.2.0"
+  resolved "https://registry.yarnpkg.com/commander/-/commander-7.2.0.tgz#a36cb57d0b501ce108e4d20559a150a391d97ab7"
+  integrity sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw==
+
 commondir@^1.0.1:
   version "1.0.1"
   resolved "https://registry.yarnpkg.com/commondir/-/commondir-1.0.1.tgz#ddd800da0c66127393cca5950ea968a3aaf1253b"
@@ -1785,16 +1756,6 @@ core-util-is@~1.0.0:
   resolved "https://registry.yarnpkg.com/core-util-is/-/core-util-is-1.0.2.tgz#b5fd54220aa2bc5ab57aab7140c940754503c1a7"
   integrity sha1-tf1UIgqivFq1eqtxQMlAdUUDwac=
 
-cosmiconfig@^5.0.0:
-  version "5.2.1"
-  resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-5.2.1.tgz#040f726809c591e77a17c0a3626ca45b4f168b1a"
-  integrity sha512-H65gsXo1SKjf8zmrJ67eJk8aIRKV5ff2D4uKZIBZShbhGSpEmsQOPW/SKMKYhSTrqR7ufy6RP69rPogdaPh/kA==
-  dependencies:
-    import-fresh "^2.0.0"
-    is-directory "^0.3.1"
-    js-yaml "^3.13.1"
-    parse-json "^4.0.0"
-
 cosmiconfig@^7.0.0:
   version "7.0.0"
   resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-7.0.0.tgz#ef9b44d773959cae63ddecd122de23853b60f8d3"
@@ -1874,17 +1835,21 @@ crypto-browserify@^3.11.0:
     randombytes "^2.0.0"
     randomfill "^1.0.3"
 
-css-color-names@0.0.4, css-color-names@^0.0.4:
+css-color-names@^0.0.4:
   version "0.0.4"
   resolved "https://registry.yarnpkg.com/css-color-names/-/css-color-names-0.0.4.tgz#808adc2e79cf84738069b646cb20ec27beb629e0"
   integrity sha1-gIrcLnnPhHOAabZGyyDsJ762KeA=
 
-css-declaration-sorter@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/css-declaration-sorter/-/css-declaration-sorter-4.0.1.tgz#c198940f63a76d7e36c1e71018b001721054cb22"
-  integrity sha512-BcxQSKTSEEQUftYpBVnsH4SF05NTuBokb19/sBt6asXGKZ/6VP7PLG1CBCkFDYOnhXhPh0jMhO6xZ71oYHXHBA==
+css-color-names@^1.0.1:
+  version "1.0.1"
+  resolved "https://registry.yarnpkg.com/css-color-names/-/css-color-names-1.0.1.tgz#6ff7ee81a823ad46e020fa2fd6ab40a887e2ba67"
+  integrity sha512-/loXYOch1qU1biStIFsHH8SxTmOseh1IJqFvy8IujXOm1h+QjUdDhkzOrR5HG8K8mlxREj0yfi8ewCHx0eMxzA==
+
+css-declaration-sorter@^6.0.3:
+  version "6.0.3"
+  resolved "https://registry.yarnpkg.com/css-declaration-sorter/-/css-declaration-sorter-6.0.3.tgz#9dfd8ea0df4cc7846827876fafb52314890c21a9"
+  integrity sha512-52P95mvW1SMzuRZegvpluT6yEv0FqQusydKQPZsNN5Q7hh8EwQvN8E2nwuJ16BBvNN6LcoIZXu/Bk58DAhrrxw==
   dependencies:
-    postcss "^7.0.1"
     timsort "^0.3.0"
 
 css-loader@^3.4.2:
@@ -1924,20 +1889,16 @@ css-modules-require-hook@^4.0.6:
     postcss-modules-values "^1.1.1"
     seekout "^1.0.1"
 
-css-select-base-adapter@^0.1.1:
-  version "0.1.1"
-  resolved "https://registry.yarnpkg.com/css-select-base-adapter/-/css-select-base-adapter-0.1.1.tgz#3b2ff4972cc362ab88561507a95408a1432135d7"
-  integrity sha512-jQVeeRG70QI08vSTwf1jHxp74JoZsr2XSgETae8/xC8ovSnL2WF87GTLO86Sbwdt2lK4Umg4HnnwMO4YF3Ce7w==
-
-css-select@^2.0.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/css-select/-/css-select-2.1.0.tgz#6a34653356635934a81baca68d0255432105dbef"
-  integrity sha512-Dqk7LQKpwLoH3VovzZnkzegqNSuAziQyNZUcrdDM401iY+R5NkGBXGmtO05/yaXQziALuPogeG0b7UAgjnTJTQ==
+css-select@^3.1.2:
+  version "3.1.2"
+  resolved "https://registry.yarnpkg.com/css-select/-/css-select-3.1.2.tgz#d52cbdc6fee379fba97fb0d3925abbd18af2d9d8"
+  integrity sha512-qmss1EihSuBNWNNhHjxzxSfJoFBM/lERB/Q4EnsJQQC62R2evJDW481091oAdOr9uh46/0n4nrg0It5cAnj1RA==
   dependencies:
     boolbase "^1.0.0"
-    css-what "^3.2.1"
-    domutils "^1.7.0"
-    nth-check "^1.0.2"
+    css-what "^4.0.0"
+    domhandler "^4.0.0"
+    domutils "^2.4.3"
+    nth-check "^2.0.0"
 
 css-selector-tokenizer@^0.7.0:
   version "0.7.1"
@@ -1948,26 +1909,18 @@ css-selector-tokenizer@^0.7.0:
     fastparse "^1.1.1"
     regexpu-core "^1.0.0"
 
-css-tree@1.0.0-alpha.37:
-  version "1.0.0-alpha.37"
-  resolved "https://registry.yarnpkg.com/css-tree/-/css-tree-1.0.0-alpha.37.tgz#98bebd62c4c1d9f960ec340cf9f7522e30709a22"
-  integrity sha512-DMxWJg0rnz7UgxKT0Q1HU/L9BeJI0M6ksor0OgqOnF+aRCDWg/N2641HmVyU9KVIu0OVVWOb2IpC9A+BJRnejg==
-  dependencies:
-    mdn-data "2.0.4"
-    source-map "^0.6.1"
-
-css-tree@1.0.0-alpha.39:
-  version "1.0.0-alpha.39"
-  resolved "https://registry.yarnpkg.com/css-tree/-/css-tree-1.0.0-alpha.39.tgz#2bff3ffe1bb3f776cf7eefd91ee5cba77a149eeb"
-  integrity sha512-7UvkEYgBAHRG9Nt980lYxjsTrCyHFN53ky3wVsDkiMdVqylqRt+Zc+jm5qw7/qyOvN2dHSYtX0e4MbCCExSvnA==
+css-tree@^1.1.2:
+  version "1.1.3"
+  resolved "https://registry.yarnpkg.com/css-tree/-/css-tree-1.1.3.tgz#eb4870fb6fd7707327ec95c2ff2ab09b5e8db91d"
+  integrity sha512-tRpdppF7TRazZrjJ6v3stzv93qxRcSsFmW6cX0Zm2NVKpxE1WV1HblnghVv9TreireHkqI/VDEsfolRF1p6y7Q==
   dependencies:
-    mdn-data "2.0.6"
+    mdn-data "2.0.14"
     source-map "^0.6.1"
 
-css-what@^3.2.1:
-  version "3.4.2"
-  resolved "https://registry.yarnpkg.com/css-what/-/css-what-3.4.2.tgz#ea7026fcb01777edbde52124e21f327e7ae950e4"
-  integrity sha512-ACUm3L0/jiZTqfzRM3Hi9Q8eZqd6IK37mMWPLz9PJxkLWllYeRf+EHUSHYEtFop2Eqytaq1FizFVh7XfBnXCDQ==
+css-what@^4.0.0:
+  version "4.0.0"
+  resolved "https://registry.yarnpkg.com/css-what/-/css-what-4.0.0.tgz#35e73761cab2eeb3d3661126b23d7aa0e8432233"
+  integrity sha512-teijzG7kwYfNVsUh2H/YN62xW3KK9YhXEgSlbxMlcyjPNvdKJqFx5lrwlJgoFP1ZHlB89iGDlo/JyshKeRhv5A==
 
 cssesc@^0.1.0:
   version "0.1.0"
@@ -1979,80 +1932,61 @@ cssesc@^3.0.0:
   resolved "https://registry.yarnpkg.com/cssesc/-/cssesc-3.0.0.tgz#37741919903b868565e1c09ea747445cd18983ee"
   integrity sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==
 
-cssnano-preset-default@^4.0.7:
-  version "4.0.7"
-  resolved "https://registry.yarnpkg.com/cssnano-preset-default/-/cssnano-preset-default-4.0.7.tgz#51ec662ccfca0f88b396dcd9679cdb931be17f76"
-  integrity sha512-x0YHHx2h6p0fCl1zY9L9roD7rnlltugGu7zXSKQx6k2rYw0Hi3IqxcoAGF7u9Q5w1nt7vK0ulxV8Lo+EvllGsA==
-  dependencies:
-    css-declaration-sorter "^4.0.1"
-    cssnano-util-raw-cache "^4.0.1"
-    postcss "^7.0.0"
-    postcss-calc "^7.0.1"
-    postcss-colormin "^4.0.3"
-    postcss-convert-values "^4.0.1"
-    postcss-discard-comments "^4.0.2"
-    postcss-discard-duplicates "^4.0.2"
-    postcss-discard-empty "^4.0.1"
-    postcss-discard-overridden "^4.0.1"
-    postcss-merge-longhand "^4.0.11"
-    postcss-merge-rules "^4.0.3"
-    postcss-minify-font-values "^4.0.2"
-    postcss-minify-gradients "^4.0.2"
-    postcss-minify-params "^4.0.2"
-    postcss-minify-selectors "^4.0.2"
-    postcss-normalize-charset "^4.0.1"
-    postcss-normalize-display-values "^4.0.2"
-    postcss-normalize-positions "^4.0.2"
-    postcss-normalize-repeat-style "^4.0.2"
-    postcss-normalize-string "^4.0.2"
-    postcss-normalize-timing-functions "^4.0.2"
-    postcss-normalize-unicode "^4.0.1"
-    postcss-normalize-url "^4.0.1"
-    postcss-normalize-whitespace "^4.0.2"
-    postcss-ordered-values "^4.1.2"
-    postcss-reduce-initial "^4.0.3"
-    postcss-reduce-transforms "^4.0.2"
-    postcss-svgo "^4.0.2"
-    postcss-unique-selectors "^4.0.1"
-
-cssnano-util-get-arguments@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/cssnano-util-get-arguments/-/cssnano-util-get-arguments-4.0.0.tgz#ed3a08299f21d75741b20f3b81f194ed49cc150f"
-  integrity sha1-7ToIKZ8h11dBsg87gfGU7UnMFQ8=
-
-cssnano-util-get-match@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/cssnano-util-get-match/-/cssnano-util-get-match-4.0.0.tgz#c0e4ca07f5386bb17ec5e52250b4f5961365156d"
-  integrity sha1-wOTKB/U4a7F+xeUiULT1lhNlFW0=
-
-cssnano-util-raw-cache@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/cssnano-util-raw-cache/-/cssnano-util-raw-cache-4.0.1.tgz#b26d5fd5f72a11dfe7a7846fb4c67260f96bf282"
-  integrity sha512-qLuYtWK2b2Dy55I8ZX3ky1Z16WYsx544Q0UWViebptpwn/xDBmog2TLg4f+DBMg1rJ6JDWtn96WHbOKDWt1WQA==
-  dependencies:
-    postcss "^7.0.0"
-
-cssnano-util-same-parent@^4.0.0:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/cssnano-util-same-parent/-/cssnano-util-same-parent-4.0.1.tgz#574082fb2859d2db433855835d9a8456ea18bbf3"
-  integrity sha512-WcKx5OY+KoSIAxBW6UBBRay1U6vkYheCdjyVNDm85zt5K9mHoGOfsOsqIszfAqrQQFIIKgjh2+FDgIj/zsl21Q==
+cssnano-preset-default@^5.1.3:
+  version "5.1.3"
+  resolved "https://registry.yarnpkg.com/cssnano-preset-default/-/cssnano-preset-default-5.1.3.tgz#caa54183a8c8df03124a9e23f374ab89df5a9a99"
+  integrity sha512-qo9tX+t4yAAZ/yagVV3b+QBKeLklQbmgR3wI7mccrDcR+bEk9iHgZN1E7doX68y9ThznLya3RDmR+nc7l6/2WQ==
+  dependencies:
+    css-declaration-sorter "^6.0.3"
+    cssnano-utils "^2.0.1"
+    postcss-calc "^8.0.0"
+    postcss-colormin "^5.2.0"
+    postcss-convert-values "^5.0.1"
+    postcss-discard-comments "^5.0.1"
+    postcss-discard-duplicates "^5.0.1"
+    postcss-discard-empty "^5.0.1"
+    postcss-discard-overridden "^5.0.1"
+    postcss-merge-longhand "^5.0.2"
+    postcss-merge-rules "^5.0.2"
+    postcss-minify-font-values "^5.0.1"
+    postcss-minify-gradients "^5.0.1"
+    postcss-minify-params "^5.0.1"
+    postcss-minify-selectors "^5.1.0"
+    postcss-normalize-charset "^5.0.1"
+    postcss-normalize-display-values "^5.0.1"
+    postcss-normalize-positions "^5.0.1"
+    postcss-normalize-repeat-style "^5.0.1"
+    postcss-normalize-string "^5.0.1"
+    postcss-normalize-timing-functions "^5.0.1"
+    postcss-normalize-unicode "^5.0.1"
+    postcss-normalize-url "^5.0.2"
+    postcss-normalize-whitespace "^5.0.1"
+    postcss-ordered-values "^5.0.2"
+    postcss-reduce-initial "^5.0.1"
+    postcss-reduce-transforms "^5.0.1"
+    postcss-svgo "^5.0.2"
+    postcss-unique-selectors "^5.0.1"
+
+cssnano-utils@^2.0.1:
+  version "2.0.1"
+  resolved "https://registry.yarnpkg.com/cssnano-utils/-/cssnano-utils-2.0.1.tgz#8660aa2b37ed869d2e2f22918196a9a8b6498ce2"
+  integrity sha512-i8vLRZTnEH9ubIyfdZCAdIdgnHAUeQeByEeQ2I7oTilvP9oHO6RScpeq3GsFUVqeB8uZgOQ9pw8utofNn32hhQ==
 
-cssnano@^4.1.10:
-  version "4.1.10"
-  resolved "https://registry.yarnpkg.com/cssnano/-/cssnano-4.1.10.tgz#0ac41f0b13d13d465487e111b778d42da631b8b2"
-  integrity sha512-5wny+F6H4/8RgNlaqab4ktc3e0/blKutmq8yNlBFXA//nSFFAqAngjNVRzUvCgYROULmZZUoosL/KSoZo5aUaQ==
+cssnano@^5.0.2:
+  version "5.0.6"
+  resolved "https://registry.yarnpkg.com/cssnano/-/cssnano-5.0.6.tgz#2a91ad34c6521ae31eab3da9c90108ea3093535d"
+  integrity sha512-NiaLH/7yqGksFGsFNvSRe2IV/qmEBAeDE64dYeD8OBrgp6lE8YoMeQJMtsv5ijo6MPyhuoOvFhI94reahBRDkw==
   dependencies:
-    cosmiconfig "^5.0.0"
-    cssnano-preset-default "^4.0.7"
-    is-resolvable "^1.0.0"
-    postcss "^7.0.0"
+    cosmiconfig "^7.0.0"
+    cssnano-preset-default "^5.1.3"
+    is-resolvable "^1.1.0"
 
-csso@^4.0.2:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/csso/-/csso-4.0.3.tgz#0d9985dc852c7cc2b2cacfbbe1079014d1a8e903"
-  integrity sha512-NL3spysxUkcrOgnpsT4Xdl2aiEiBG6bXswAABQVHcMrfjjBisFOKwLDOmf4wf32aPdcJws1zds2B0Rg+jqMyHQ==
+csso@^4.2.0:
+  version "4.2.0"
+  resolved "https://registry.yarnpkg.com/csso/-/csso-4.2.0.tgz#ea3a561346e8dc9f546d6febedd50187cf389529"
+  integrity sha512-wvlcdIbf6pwKEk7vHj8/Bkc0B4ylXZruLvOgs9doS5eOsOpuodOV2zJChSpkp+pRpYQLQMeF04nr3Z68Sta9jA==
   dependencies:
-    css-tree "1.0.0-alpha.39"
+    css-tree "^1.1.2"
 
 cyclist@^1.0.1:
   version "1.0.1"
@@ -2525,6 +2459,15 @@ dom-serializer@0, dom-serializer@^0.2.1:
     domelementtype "^2.0.1"
     entities "^2.0.0"
 
+dom-serializer@^1.0.1:
+  version "1.3.2"
+  resolved "https://registry.yarnpkg.com/dom-serializer/-/dom-serializer-1.3.2.tgz#6206437d32ceefaec7161803230c7a20bc1b4d91"
+  integrity sha512-5c54Bk5Dw4qAxNOI1pFEizPSjVsx5+bpJKmL2kPn8JhBUq2q09tTCa3mjijun2NfK78NMouDYNMBkOrPZiS+ig==
+  dependencies:
+    domelementtype "^2.0.1"
+    domhandler "^4.2.0"
+    entities "^2.0.0"
+
 domain-browser@^1.1.1:
   version "1.2.0"
   resolved "https://registry.yarnpkg.com/domain-browser/-/domain-browser-1.2.0.tgz#3d31f50191a6749dd1375a7f522e823d42e54eda"
@@ -2535,7 +2478,7 @@ domelementtype@1, domelementtype@^1.3.1:
   resolved "https://registry.yarnpkg.com/domelementtype/-/domelementtype-1.3.1.tgz#d048c44b37b0d10a7f2a3d5fee3f4333d790481f"
   integrity sha512-BSKB+TSpMpFI/HOxCNr1O8aMOTZ8hT3pM3GQ0w/mWRmkhEDSFJkkyzz4XQsBV44BChwGkrDfMyjVD0eA2aFV3w==
 
-domelementtype@^2.0.1:
+domelementtype@^2.0.1, domelementtype@^2.2.0:
   version "2.2.0"
   resolved "https://registry.yarnpkg.com/domelementtype/-/domelementtype-2.2.0.tgz#9a0b6c2782ed6a1c7323d42267183df9bd8b1d57"
   integrity sha512-DtBMo82pv1dFtUmHyr48beiuq792Sxohr+8Hm9zoxklYPfa6n0Z3Byjj2IV7bmr2IyqClnqEQhfgHJJ5QF0R5A==
@@ -2561,6 +2504,13 @@ domhandler@^3.0.0:
   dependencies:
     domelementtype "^2.0.1"
 
+domhandler@^4.0.0, domhandler@^4.2.0:
+  version "4.2.0"
+  resolved "https://registry.yarnpkg.com/domhandler/-/domhandler-4.2.0.tgz#f9768a5f034be60a89a27c2e4d0f74eba0d8b059"
+  integrity sha512-zk7sgt970kzPks2Bf+dwT/PLzghLnsivb9CcxkvR8Mzr66Olr0Ofd8neSbglHJHaHa2MadfoSdNlKYAaafmWfA==
+  dependencies:
+    domelementtype "^2.2.0"
+
 dompurify@^2.0.12:
   version "2.2.6"
   resolved "https://registry.yarnpkg.com/dompurify/-/dompurify-2.2.6.tgz#54945dc5c0b45ce5ae228705777e8e59d7b2edc4"
@@ -2574,7 +2524,7 @@ domutils@1.5:
     dom-serializer "0"
     domelementtype "1"
 
-domutils@^1.5.1, domutils@^1.7.0:
+domutils@^1.5.1:
   version "1.7.0"
   resolved "https://registry.yarnpkg.com/domutils/-/domutils-1.7.0.tgz#56ea341e834e06e6748af7a1cb25da67ea9f8c2a"
   integrity sha512-Lgd2XcJ/NjEw+7tFvfKxOzCYKZsdct5lczQ2ZaQY8Djz7pfAD3Gbp8ySJWtreII/vDlMVmxwa6pHmdxIYgttDg==
@@ -2591,12 +2541,14 @@ domutils@^2.0.0:
     domelementtype "^2.0.1"
     domhandler "^3.0.0"
 
-dot-prop@^5.2.0:
-  version "5.3.0"
-  resolved "https://registry.yarnpkg.com/dot-prop/-/dot-prop-5.3.0.tgz#90ccce708cd9cd82cc4dc8c3ddd9abdd55b20e88"
-  integrity sha512-QM8q3zDe58hqUqjraQOmzZ1LIH9SWQJTlEKCH4kJ2oQvLZk7RbQXvtDM2XEq3fwkV9CCvvH4LA0AV+ogFsBM2Q==
+domutils@^2.4.3:
+  version "2.7.0"
+  resolved "https://registry.yarnpkg.com/domutils/-/domutils-2.7.0.tgz#8ebaf0c41ebafcf55b0b72ec31c56323712c5442"
+  integrity sha512-8eaHa17IwJUPAiB+SoTYBo5mCdeMgdcAoXJ59m6DT1vw+5iLS3gNoqYaRowaBKtGVrOF1Jz4yDTgYKLK2kvfJg==
   dependencies:
-    is-obj "^2.0.0"
+    dom-serializer "^1.0.1"
+    domelementtype "^2.2.0"
+    domhandler "^4.2.0"
 
 duplexify@^3.4.2, duplexify@^3.6.0:
   version "3.7.1"
@@ -2747,41 +2699,6 @@ es-abstract@^1.17.0-next.1:
     string.prototype.trimleft "^2.1.0"
     string.prototype.trimright "^2.1.0"
 
-es-abstract@^1.17.2:
-  version "1.17.7"
-  resolved "https://registry.yarnpkg.com/es-abstract/-/es-abstract-1.17.7.tgz#a4de61b2f66989fc7421676c1cb9787573ace54c"
-  integrity sha512-VBl/gnfcJ7OercKA9MVaegWsBHFjV492syMudcnQZvt/Dw8ezpcOHYZXa/J96O8vx+g4x65YKhxOwDUh63aS5g==
-  dependencies:
-    es-to-primitive "^1.2.1"
-    function-bind "^1.1.1"
-    has "^1.0.3"
-    has-symbols "^1.0.1"
-    is-callable "^1.2.2"
-    is-regex "^1.1.1"
-    object-inspect "^1.8.0"
-    object-keys "^1.1.1"
-    object.assign "^4.1.1"
-    string.prototype.trimend "^1.0.1"
-    string.prototype.trimstart "^1.0.1"
-
-es-abstract@^1.18.0-next.0:
-  version "1.18.0-next.1"
-  resolved "https://registry.yarnpkg.com/es-abstract/-/es-abstract-1.18.0-next.1.tgz#6e3a0a4bda717e5023ab3b8e90bec36108d22c68"
-  integrity sha512-I4UGspA0wpZXWENrdA0uHbnhte683t3qT/1VFH9aX2dA5PPSf6QW5HHXf5HImaqPmjXaVeVk4RGWnaylmV7uAA==
-  dependencies:
-    es-to-primitive "^1.2.1"
-    function-bind "^1.1.1"
-    has "^1.0.3"
-    has-symbols "^1.0.1"
-    is-callable "^1.2.2"
-    is-negative-zero "^2.0.0"
-    is-regex "^1.1.1"
-    object-inspect "^1.8.0"
-    object-keys "^1.1.1"
-    object.assign "^4.1.1"
-    string.prototype.trimend "^1.0.1"
-    string.prototype.trimstart "^1.0.1"
-
 es-to-primitive@^1.2.1:
   version "1.2.1"
   resolved "https://registry.yarnpkg.com/es-to-primitive/-/es-to-primitive-1.2.1.tgz#e55cd4c9cdc188bcefb03b366c736323fc5c898a"
@@ -3626,7 +3543,7 @@ has-values@^1.0.0:
     is-number "^3.0.0"
     kind-of "^4.0.0"
 
-has@^1.0.0, has@^1.0.3:
+has@^1.0.3:
   version "1.0.3"
   resolved "https://registry.yarnpkg.com/has/-/has-1.0.3.tgz#722d7cbfc1f6aa8241f16dd814e011e1f41e8796"
   integrity sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==
@@ -3701,11 +3618,6 @@ hsla-regex@^1.0.0:
   resolved "https://registry.yarnpkg.com/hsla-regex/-/hsla-regex-1.0.0.tgz#c1ce7a3168c8c6614033a4b5f7877f3b225f9c38"
   integrity sha1-wc56MWjIxmFAM6S194d/OyJfnDg=
 
-html-comment-regex@^1.1.0:
-  version "1.1.2"
-  resolved "https://registry.yarnpkg.com/html-comment-regex/-/html-comment-regex-1.1.2.tgz#97d4688aeb5c81886a364faa0cad1dda14d433a7"
-  integrity sha512-P+M65QY2JQ5Y0G9KKdlDpo0zK+/OHptU5AaBwUfAIDJZk1MYf32Frm84EcOytfJE0t5JvkAnKlmjsXDnWzCJmQ==
-
 html-tags@^3.1.0:
   version "3.1.0"
   resolved "https://registry.yarnpkg.com/html-tags/-/html-tags-3.1.0.tgz#7b5e6f7e665e9fb41f30007ed9e0d41e97fb2140"
@@ -3800,14 +3712,6 @@ ignore@^5.1.1, ignore@^5.1.4, ignore@^5.1.8:
   resolved "https://registry.yarnpkg.com/ignore/-/ignore-5.1.8.tgz#f150a8b50a34289b33e22f5889abd4d8016f0e57"
   integrity sha512-BMpfD7PpiETpBl/A6S498BaIJ6Y/ABT93ETbby2fP00v4EbvPBXWEoaR1UBPKs3iR53pJY7EtZk5KACI57i1Uw==
 
-import-fresh@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/import-fresh/-/import-fresh-2.0.0.tgz#d81355c15612d386c61f9ddd3922d4304822a546"
-  integrity sha1-2BNVwVYS04bGH53dOSLUMEgipUY=
-  dependencies:
-    caller-path "^2.0.0"
-    resolve-from "^3.0.0"
-
 import-fresh@^3.0.0:
   version "3.2.1"
   resolved "https://registry.yarnpkg.com/import-fresh/-/import-fresh-3.2.1.tgz#633ff618506e793af5ac91bf48b72677e15cbe66"
@@ -3907,10 +3811,10 @@ invariant@^2.2.2:
   dependencies:
     loose-envify "^1.0.0"
 
-is-absolute-url@^2.0.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/is-absolute-url/-/is-absolute-url-2.1.0.tgz#50530dfb84fcc9aa7dbe7852e83a37b93b9f2aa6"
-  integrity sha1-UFMN+4T8yap9vnhS6Do3uTufKqY=
+is-absolute-url@^3.0.3:
+  version "3.0.3"
+  resolved "https://registry.yarnpkg.com/is-absolute-url/-/is-absolute-url-3.0.3.tgz#96c6a22b6a23929b11ea0afb1836c36ad4a5d698"
+  integrity sha512-opmNIX7uFnS96NtPmhWQgQx6/NYFgsUXYMllcfzwWKUMwfo8kku1TvE6hkNcH+Q1ts5cMVrsY7j0bxXQDciu9Q==
 
 is-accessor-descriptor@^0.1.6:
   version "0.1.6"
@@ -3944,11 +3848,6 @@ is-arrayish@^0.2.1:
   resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.2.1.tgz#77c99840527aa8ecb1a8ba697b80645a7a926a9d"
   integrity sha1-d8mYQFJ6qOyxqLppe4BkWnqSap0=
 
-is-arrayish@^0.3.1:
-  version "0.3.2"
-  resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.3.2.tgz#4574a2ae56f7ab206896fb431eaeed066fdf8f03"
-  integrity sha512-eVRqCvVlZbuw3GrM63ovNSNAeA1K16kaR/LRY/92w0zxQ5/1YzwblUX652i4Xs9RwAGjW9d9y6X88t8OaAJfWQ==
-
 is-binary-path@^1.0.0:
   version "1.0.1"
   resolved "https://registry.yarnpkg.com/is-binary-path/-/is-binary-path-1.0.1.tgz#75f16642b480f187a711c814161fd3a4a7655898"
@@ -3983,12 +3882,7 @@ is-callable@^1.2.0:
   resolved "https://registry.yarnpkg.com/is-callable/-/is-callable-1.2.0.tgz#83336560b54a38e35e3a2df7afd0454d691468bb"
   integrity sha512-pyVD9AaGLxtg6srb2Ng6ynWJqkHU9bEM087AKck0w8QwDarTfNcpIYoU8x8Hv2Icm8u6kFJM18Dag8lyqGkviw==
 
-is-callable@^1.2.2:
-  version "1.2.2"
-  resolved "https://registry.yarnpkg.com/is-callable/-/is-callable-1.2.2.tgz#c7c6715cd22d4ddb48d3e19970223aceabb080d9"
-  integrity sha512-dnMqspv5nU3LoewK2N/y7KLtxtakvTuaCsU9FU50/QDmdbHNy/4/JuRtMHqRU22o3q+W89YQndQEeCVwK+3qrA==
-
-is-color-stop@^1.0.0:
+is-color-stop@^1.1.0:
   version "1.1.0"
   resolved "https://registry.yarnpkg.com/is-color-stop/-/is-color-stop-1.1.0.tgz#cfff471aee4dd5c9e158598fbe12967b5cdad345"
   integrity sha1-z/9HGu5N1cnhWFmPvhKWe1za00U=
@@ -4049,11 +3943,6 @@ is-descriptor@^1.0.0, is-descriptor@^1.0.2:
     is-data-descriptor "^1.0.0"
     kind-of "^6.0.2"
 
-is-directory@^0.3.1:
-  version "0.3.1"
-  resolved "https://registry.yarnpkg.com/is-directory/-/is-directory-0.3.1.tgz#61339b6f2475fc772fd9c9d83f5c8575dc154ae1"
-  integrity sha1-YTObbyR1/Hcv2cnYP1yFddwVSuE=
-
 is-extendable@^0.1.0, is-extendable@^0.1.1:
   version "0.1.1"
   resolved "https://registry.yarnpkg.com/is-extendable/-/is-extendable-0.1.1.tgz#62b110e289a471418e3ec36a617d472e301dfc89"
@@ -4107,11 +3996,6 @@ is-hexadecimal@^1.0.0:
   resolved "https://registry.yarnpkg.com/is-hexadecimal/-/is-hexadecimal-1.0.4.tgz#cc35c97588da4bd49a8eedd6bc4082d44dcb23a7"
   integrity sha512-gyPJuv83bHMpocVYoqof5VDiZveEoGoFL8m3BXNb2VW8Xs+rz9kqO8LOQ5DH6EsuvilT1ApazU0pyl+ytbPtlw==
 
-is-negative-zero@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/is-negative-zero/-/is-negative-zero-2.0.0.tgz#9553b121b0fac28869da9ed459e20c7543788461"
-  integrity sha1-lVOxIbD6wohp2p7UWeIMdUN4hGE=
-
 is-number@^3.0.0:
   version "3.0.0"
   resolved "https://registry.yarnpkg.com/is-number/-/is-number-3.0.0.tgz#24fd6201a4782cf50561c810276afc7d12d71195"
@@ -4124,11 +4008,6 @@ is-number@^7.0.0:
   resolved "https://registry.yarnpkg.com/is-number/-/is-number-7.0.0.tgz#7535345b896734d5f80c4d06c50955527a14f12b"
   integrity sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==
 
-is-obj@^2.0.0:
-  version "2.0.0"
-  resolved "https://registry.yarnpkg.com/is-obj/-/is-obj-2.0.0.tgz#473fb05d973705e3fd9620545018ca8e22ef4982"
-  integrity sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==
-
 is-path-cwd@^2.0.0:
   version "2.2.0"
   resolved "https://registry.yarnpkg.com/is-path-cwd/-/is-path-cwd-2.2.0.tgz#67d43b82664a7b5191fd9119127eb300048a9fdb"
@@ -4148,7 +4027,7 @@ is-path-inside@^2.1.0:
   dependencies:
     path-is-inside "^1.0.2"
 
-is-plain-obj@^1.0.0, is-plain-obj@^1.1.0:
+is-plain-obj@^1.1.0:
   version "1.1.0"
   resolved "https://registry.yarnpkg.com/is-plain-obj/-/is-plain-obj-1.1.0.tgz#71a50c8429dfca773c92a390a4a03b39fcd51d3e"
   integrity sha1-caUMhCnfync8kqOQpKA7OfzVHT4=
@@ -4179,19 +4058,12 @@ is-regex@^1.1.0:
   dependencies:
     has-symbols "^1.0.1"
 
-is-regex@^1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/is-regex/-/is-regex-1.1.1.tgz#c6f98aacc546f6cec5468a07b7b153ab564a57b9"
-  integrity sha512-1+QkEcxiLlB7VEyFtyBg94e08OAsvq7FUBgApTq/w2ymCLyKJgDPsybBENVtA7XCQEgEXxKPonG+mvYRxh/LIg==
-  dependencies:
-    has-symbols "^1.0.1"
-
 is-regexp@^2.0.0:
   version "2.1.0"
   resolved "https://registry.yarnpkg.com/is-regexp/-/is-regexp-2.1.0.tgz#cd734a56864e23b956bf4e7c66c396a4c0b22c2d"
   integrity sha512-OZ4IlER3zmRIoB9AqNhEggVxqIH4ofDns5nRrPS6yQxXE1TPCUpFznBfRQmQa8uC+pXqjMnukiJBxCisIxiLGA==
 
-is-resolvable@^1.0.0:
+is-resolvable@^1.1.0:
   version "1.1.0"
   resolved "https://registry.yarnpkg.com/is-resolvable/-/is-resolvable-1.1.0.tgz#fb18f87ce1feb925169c9a407c19318a3206ed88"
   integrity sha512-qgDYXFSR5WvEfuS5dMj6oTMEbrrSaM0CrFk2Yiq/gXnBvD9pMa2jGXxyhGLfvhZpuMZe18CJpFxAt3CRs42NMg==
@@ -4201,13 +4073,6 @@ is-string@^1.0.5:
   resolved "https://registry.yarnpkg.com/is-string/-/is-string-1.0.5.tgz#40493ed198ef3ff477b8c7f92f644ec82a5cd3a6"
   integrity sha512-buY6VNRjhQMiF1qWDouloZlQbRhDPCebwxSjxMjxgemYT46YMd2NR0/H+fBhEfWX4A/w9TBJ+ol+okqJKFE6vQ==
 
-is-svg@^3.0.0:
-  version "3.0.0"
-  resolved "https://registry.yarnpkg.com/is-svg/-/is-svg-3.0.0.tgz#9321dbd29c212e5ca99c4fa9794c714bcafa2f75"
-  integrity sha512-gi4iHK53LR2ujhLVVj+37Ykh9GLqYHX6JOVXbLAucaG/Cqw9xwdFOjDM2qeifLs1sF1npXXFvDu0r5HNgCMrzQ==
-  dependencies:
-    html-comment-regex "^1.1.0"
-
 is-symbol@^1.0.2:
   version "1.0.3"
   resolved "https://registry.yarnpkg.com/is-symbol/-/is-symbol-1.0.3.tgz#38e1014b9e6329be0de9d24a414fd7441ec61937"
@@ -4314,7 +4179,7 @@ jshint@^2.12.0:
     shelljs "0.3.x"
     strip-json-comments "1.0.x"
 
-json-parse-better-errors@^1.0.1, json-parse-better-errors@^1.0.2:
+json-parse-better-errors@^1.0.2:
   version "1.0.2"
   resolved "https://registry.yarnpkg.com/json-parse-better-errors/-/json-parse-better-errors-1.0.2.tgz#bb867cfb3450e69107c131d1c514bab3dc8bcaa9"
   integrity sha512-mrqyZKfX5EhL7hvqcV6WG1yYjnjeuYDzDhhcAAUrq8Po85NBQBJP+ZDUT75qZQ98IkUoBqdkExkukOU7Ts2wrw==
@@ -4479,15 +4344,6 @@ loader-utils@^0.2.16:
     json5 "^0.5.0"
     object-assign "^4.0.1"
 
-loader-utils@^1.1.0:
-  version "1.2.3"
-  resolved "https://registry.yarnpkg.com/loader-utils/-/loader-utils-1.2.3.tgz#1ff5dc6911c9f0a062531a4c04b609406108c2c7"
-  integrity sha512-fkpz8ejdnEMG3s37wGL07iSBDg99O9D5yflE9RGNH3hRdx9SOwYfnGYdZOUIZitN8E+E2vkq3MUMYMvPYl5ZZA==
-  dependencies:
-    big.js "^5.2.2"
-    emojis-list "^2.0.0"
-    json5 "^1.0.1"
-
 loader-utils@^1.2.3, loader-utils@^1.4.0:
   version "1.4.0"
   resolved "https://registry.yarnpkg.com/loader-utils/-/loader-utils-1.4.0.tgz#c579b5e34cb34b1a74edc6c1fb36bfa371d5a613"
@@ -4687,15 +4543,10 @@ mdast-util-to-string@^2.0.0:
   resolved "https://registry.yarnpkg.com/mdast-util-to-string/-/mdast-util-to-string-2.0.0.tgz#b8cfe6a713e1091cb5b728fc48885a4767f8b97b"
   integrity sha512-AW4DRS3QbBayY/jJmD8437V1Gombjf8RSOUCMFBuo5iHi58AGEgVCKQ+ezHkZZDpAQS75hcBMpLqjpJTjtUL7w==
 
-mdn-data@2.0.4:
-  version "2.0.4"
-  resolved "https://registry.yarnpkg.com/mdn-data/-/mdn-data-2.0.4.tgz#699b3c38ac6f1d728091a64650b65d388502fd5b"
-  integrity sha512-iV3XNKw06j5Q7mi6h+9vbx23Tv7JkjEVgKHW4pimwyDGWm0OIQntJJ+u1C6mg6mK1EaTv42XQ7w76yuzH7M2cA==
-
-mdn-data@2.0.6:
-  version "2.0.6"
-  resolved "https://registry.yarnpkg.com/mdn-data/-/mdn-data-2.0.6.tgz#852dc60fcaa5daa2e8cf6c9189c440ed3e042978"
-  integrity sha512-rQvjv71olwNHgiTbfPZFkJtjNMciWgswYeciZhtvWLO8bmX3TnhyA62I6sTWOyZssWHJJjY6/KiWwqQsWWsqOA==
+mdn-data@2.0.14:
+  version "2.0.14"
+  resolved "https://registry.yarnpkg.com/mdn-data/-/mdn-data-2.0.14.tgz#7113fc4281917d63ce29b43446f701e68c25ba50"
+  integrity sha512-dn6wd0uw5GsdswPFfsgMp5NSB0/aDe6fK94YJV/AJDYXL6HVLWBsxeq7js7Ad+mU2K9LAlwpk6kN2D5mwCPVow==
 
 memoize-one@~5.1.1:
   version "5.1.1"
@@ -4801,14 +4652,13 @@ min-indent@^1.0.0:
   resolved "https://registry.yarnpkg.com/min-indent/-/min-indent-1.0.1.tgz#a63f681673b30571fbe8bc25686ae746eefa9869"
   integrity sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg==
 
-mini-css-extract-plugin@0.9.0:
-  version "0.9.0"
-  resolved "https://registry.yarnpkg.com/mini-css-extract-plugin/-/mini-css-extract-plugin-0.9.0.tgz#47f2cf07aa165ab35733b1fc97d4c46c0564339e"
-  integrity sha512-lp3GeY7ygcgAmVIcRPBVhIkf8Us7FZjA+ILpal44qLdSu11wmjKQ3d9k15lfD7pO4esu9eUIAW7qiYIBppv40A==
+mini-css-extract-plugin@1.6.0:
+  version "1.6.0"
+  resolved "https://registry.yarnpkg.com/mini-css-extract-plugin/-/mini-css-extract-plugin-1.6.0.tgz#b4db2525af2624899ed64a23b0016e0036411893"
+  integrity sha512-nPFKI7NSy6uONUo9yn2hIfb9vyYvkFu95qki0e21DQ9uaqNKDP15DGpK0KnV6wDroWxPHtExrdEwx/yDQ8nVRw==
   dependencies:
-    loader-utils "^1.1.0"
-    normalize-url "1.9.1"
-    schema-utils "^1.0.0"
+    loader-utils "^2.0.0"
+    schema-utils "^3.0.0"
     webpack-sources "^1.1.0"
 
 minimalistic-assert@^1.0.0, minimalistic-assert@^1.0.1:
@@ -4902,7 +4752,7 @@ mixin-deep@^1.2.0:
     for-in "^1.0.2"
     is-extendable "^1.0.1"
 
-mkdirp@^0.5.1, mkdirp@^0.5.3, mkdirp@~0.5.1:
+mkdirp@^0.5.1, mkdirp@^0.5.3:
   version "0.5.5"
   resolved "https://registry.yarnpkg.com/mkdirp/-/mkdirp-0.5.5.tgz#d91cefd62d1436ca0f41620e251288d420099def"
   integrity sha512-NKmAlESf6jMGym1++R0Ra7wvhV+wFW63FaSOFPwRahvea0gMUcGUhVeAg/0BC0wiv9ih5NYPB1Wn1UEI1/L+xQ==
@@ -4984,6 +4834,11 @@ nan@^2.12.1:
   resolved "https://registry.yarnpkg.com/nan/-/nan-2.14.1.tgz#d7be34dfa3105b91494c3147089315eff8874b01"
   integrity sha512-isWHgVjnFjh2x2yuJ/tj3JbwoHu3UC2dX5G/88Cm24yB6YopVgxvBObDY7n5xW6ExmFhJpSEQqFPvq9zaXc8Jw==
 
+nanoid@^3.1.23:
+  version "3.1.23"
+  resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.23.tgz#f744086ce7c2bc47ee0a8472574d5c78e4183a81"
+  integrity sha512-FiB0kzdP0FFVGDKlRLEQ1BgDzU87dy5NnzjeW9YZNt+/c3+q82EQDUwniSAUxp/F0gFNI1ZhKU1FqYsMuqZVnw==
+
 nanomatch@^1.2.9:
   version "1.2.13"
   resolved "https://registry.yarnpkg.com/nanomatch/-/nanomatch-1.2.13.tgz#b87a8aa4fc0de8fe6be88895b38983ff265bd119"
@@ -5111,27 +4966,17 @@ normalize-selector@^0.2.0:
   resolved "https://registry.yarnpkg.com/normalize-selector/-/normalize-selector-0.2.0.tgz#d0b145eb691189c63a78d201dc4fdb1293ef0c03"
   integrity sha1-0LFF62kRicY6eNIB3E/bEpPvDAM=
 
-normalize-url@1.9.1:
-  version "1.9.1"
-  resolved "https://registry.yarnpkg.com/normalize-url/-/normalize-url-1.9.1.tgz#2cc0d66b31ea23036458436e3620d85954c66c3c"
-  integrity sha1-LMDWazHqIwNkWENuNiDYWVTGbDw=
-  dependencies:
-    object-assign "^4.0.1"
-    prepend-http "^1.0.0"
-    query-string "^4.1.0"
-    sort-keys "^1.0.0"
-
-normalize-url@^3.0.0:
-  version "3.3.0"
-  resolved "https://registry.yarnpkg.com/normalize-url/-/normalize-url-3.3.0.tgz#b2e1c4dc4f7c6d57743df733a4f5978d18650559"
-  integrity sha512-U+JJi7duF1o+u2pynbp2zXDW2/PADgC30f0GsHZtRh+HOcXHnw137TrNlyxxRvWW5fjKd3bcLHPxofWuCjaeZg==
+normalize-url@^6.0.1:
+  version "6.0.1"
+  resolved "https://registry.yarnpkg.com/normalize-url/-/normalize-url-6.0.1.tgz#a4f27f58cf8c7b287b440b8a8201f42d0b00d256"
+  integrity sha512-VU4pzAuh7Kip71XEmO9aNREYAdMHFGTVj/i+CaTImS8x0i1d3jUZkXhqluy/PRgjPLMgsLQulYY3PJ/aSbSjpQ==
 
-nth-check@^1.0.2:
-  version "1.0.2"
-  resolved "https://registry.yarnpkg.com/nth-check/-/nth-check-1.0.2.tgz#b2bd295c37e3dd58a3bf0700376663ba4d9cf05c"
-  integrity sha512-WeBOdju8SnzPN5vTUJYxYUxLeXpCaVP5i5e0LF8fg7WORF2Wd7wFX/pk0tYZk7s8T+J7VLy0Da6J1+wCT0AtHg==
+nth-check@^2.0.0:
+  version "2.0.0"
+  resolved "https://registry.yarnpkg.com/nth-check/-/nth-check-2.0.0.tgz#1bb4f6dac70072fc313e8c9cd1417b5074c0a125"
+  integrity sha512-i4sc/Kj8htBrAiH1viZ0TgU8Y5XqCaV/FziYK6TBczxmeKm3AEFWqqF3195yKudrarqy7Zu80Ra5dobFjn9X/Q==
   dependencies:
-    boolbase "~1.0.0"
+    boolbase "^1.0.0"
 
 num2fraction@^1.2.2:
   version "1.2.2"
@@ -5196,7 +5041,7 @@ oas-validator@^4.0.8:
     should "^13.2.1"
     yaml "^1.8.3"
 
-object-assign@^4.0.1, object-assign@^4.1.0, object-assign@^4.1.1:
+object-assign@^4.0.1, object-assign@^4.1.1:
   version "4.1.1"
   resolved "https://registry.yarnpkg.com/object-assign/-/object-assign-4.1.1.tgz#2109adc7965887cfc05cbbd442cac8bfbb360863"
   integrity sha1-IQmtx5ZYh8/AXLvUQsrIv7s2CGM=
@@ -5215,11 +5060,6 @@ object-inspect@^1.7.0:
   resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.7.0.tgz#f4f6bd181ad77f006b5ece60bd0b6f398ff74a67"
   integrity sha512-a7pEHdh1xKIAgTySUGgLMx/xwDZskN1Ud6egYYN3EdRW4ZMPNEDUTF+hwy2LUC+Bl+SyLXANnwz/jyh/qutKUw==
 
-object-inspect@^1.8.0:
-  version "1.8.0"
-  resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.8.0.tgz#df807e5ecf53a609cc6bfe93eac3cc7be5b3a9d0"
-  integrity sha512-jLdtEOB112fORuypAyl/50VRVIBIdVQOSUUGQHzJ4xBSbit81zRarz7GThkEFZy1RceYrWYcPcBFPQwHyAc1gA==
-
 object-keys@^1.0.11, object-keys@^1.0.12, object-keys@^1.1.1:
   version "1.1.1"
   resolved "https://registry.yarnpkg.com/object-keys/-/object-keys-1.1.1.tgz#1c47f272df277f3b1daf061677d9c82e2322c60e"
@@ -5242,16 +5082,6 @@ object.assign@^4.1.0:
     has-symbols "^1.0.0"
     object-keys "^1.0.11"
 
-object.assign@^4.1.1:
-  version "4.1.1"
-  resolved "https://registry.yarnpkg.com/object.assign/-/object.assign-4.1.1.tgz#303867a666cdd41936ecdedfb1f8f3e32a478cdd"
-  integrity sha512-VT/cxmx5yaoHSOTSyrCygIDFco+RsibY2NM0a4RdEeY/4KgqezwFtK1yr3U67xYhqJSlASm2pKhLVzPj2lr4bA==
-  dependencies:
-    define-properties "^1.1.3"
-    es-abstract "^1.18.0-next.0"
-    has-symbols "^1.0.1"
-    object-keys "^1.1.1"
-
 object.entries@^1.1.0:
   version "1.1.1"
   resolved "https://registry.yarnpkg.com/object.entries/-/object.entries-1.1.1.tgz#ee1cf04153de02bb093fec33683900f57ce5399b"
@@ -5271,14 +5101,6 @@ object.entries@^1.1.2:
     es-abstract "^1.17.5"
     has "^1.0.3"
 
-object.getownpropertydescriptors@^2.1.0:
-  version "2.1.0"
-  resolved "https://registry.yarnpkg.com/object.getownpropertydescriptors/-/object.getownpropertydescriptors-2.1.0.tgz#369bf1f9592d8ab89d712dced5cb81c7c5352649"
-  integrity sha512-Z53Oah9A3TdLoblT7VKJaTDdXdT+lQO+cNpKVnya5JDe9uLvzu1YyY1yFDFrcxrlRgWrEFH0jJtD/IbuwjcEVg==
-  dependencies:
-    define-properties "^1.1.3"
-    es-abstract "^1.17.0-next.1"
-
 object.pick@^1.3.0:
   version "1.3.0"
   resolved "https://registry.yarnpkg.com/object.pick/-/object.pick-1.3.0.tgz#87a10ac4c1694bd2e1cbf53591a66141fb5dd747"
@@ -5286,7 +5108,7 @@ object.pick@^1.3.0:
   dependencies:
     isobject "^3.0.1"
 
-object.values@^1.1.0, object.values@^1.1.1:
+object.values@^1.1.1:
   version "1.1.1"
   resolved "https://registry.yarnpkg.com/object.values/-/object.values-1.1.1.tgz#68a99ecde356b7e9295a3c5e0ce31dc8c953de5e"
   integrity sha512-WTa54g2K8iu0kmS/us18jEmdv1a4Wi//BZ/DTVYEcH0XhLM5NYdpDHja3gt57VrZLcNAO2WGA+KpWsDBaHt6eA==
@@ -5317,13 +5139,14 @@ openapi-sampler@^1.0.0-beta.18:
   dependencies:
     json-pointer "^0.6.0"
 
-optimize-css-assets-webpack-plugin@^5.0.4:
-  version "5.0.4"
-  resolved "https://registry.yarnpkg.com/optimize-css-assets-webpack-plugin/-/optimize-css-assets-webpack-plugin-5.0.4.tgz#85883c6528aaa02e30bbad9908c92926bb52dc90"
-  integrity sha512-wqd6FdI2a5/FdoiCNNkEvLeA//lHHfG24Ln2Xm2qqdIk4aOlsR18jwpyOihqQ8849W3qu2DX8fOYxpvTMj+93A==
+optimize-css-assets-webpack-plugin@6.0.0:
+  version "6.0.0"
+  resolved "https://registry.yarnpkg.com/optimize-css-assets-webpack-plugin/-/optimize-css-assets-webpack-plugin-6.0.0.tgz#00acd99d420715ad96ed3d8ad65a8a4df1be233b"
+  integrity sha512-XKVxJuCBSslP1Eyuf1uVtZT3Pkp6jEIkmg7BMcNU/pq6XAnDXTINkYFWmiQWt8+j//FO4dIDd4v+gn0m5VWJIw==
   dependencies:
-    cssnano "^4.1.10"
+    cssnano "^5.0.2"
     last-call-webpack-plugin "^3.0.0"
+    postcss "^8.2.1"
 
 optionator@^0.9.1:
   version "0.9.1"
@@ -5468,14 +5291,6 @@ parse-json@^2.2.0:
   dependencies:
     error-ex "^1.2.0"
 
-parse-json@^4.0.0:
-  version "4.0.0"
-  resolved "https://registry.yarnpkg.com/parse-json/-/parse-json-4.0.0.tgz#be35f5425be1f7f6c747184f98a788cb99477ee0"
-  integrity sha1-vjX1Qlvh9/bHRxhPmKeIy5lHfuA=
-  dependencies:
-    error-ex "^1.3.1"
-    json-parse-better-errors "^1.0.1"
-
 parse-json@^5.0.0:
   version "5.2.0"
   resolved "https://registry.yarnpkg.com/parse-json/-/parse-json-5.2.0.tgz#c76fc66dee54231c962b22bcc8a72cf2f99753cd"
@@ -5634,61 +5449,50 @@ posix-character-classes@^0.1.0:
   resolved "https://registry.yarnpkg.com/posix-character-classes/-/posix-character-classes-0.1.1.tgz#01eac0fe3b5af71a2a6c02feabb8c1fef7e00eab"
   integrity sha1-AerA/jta9xoqbAL+q7jB/vfgDqs=
 
-postcss-calc@^7.0.1:
-  version "7.0.5"
-  resolved "https://registry.yarnpkg.com/postcss-calc/-/postcss-calc-7.0.5.tgz#f8a6e99f12e619c2ebc23cf6c486fdc15860933e"
-  integrity sha512-1tKHutbGtLtEZF6PT4JSihCHfIVldU72mZ8SdZHIYriIZ9fh9k9aWSppaT8rHsyI3dX+KSR+W+Ix9BMY3AODrg==
+postcss-calc@^8.0.0:
+  version "8.0.0"
+  resolved "https://registry.yarnpkg.com/postcss-calc/-/postcss-calc-8.0.0.tgz#a05b87aacd132740a5db09462a3612453e5df90a"
+  integrity sha512-5NglwDrcbiy8XXfPM11F3HeC6hoT9W7GUH/Zi5U/p7u3Irv4rHhdDcIZwG0llHXV4ftsBjpfWMXAnXNl4lnt8g==
   dependencies:
-    postcss "^7.0.27"
     postcss-selector-parser "^6.0.2"
     postcss-value-parser "^4.0.2"
 
-postcss-colormin@^4.0.3:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/postcss-colormin/-/postcss-colormin-4.0.3.tgz#ae060bce93ed794ac71264f08132d550956bd381"
-  integrity sha512-WyQFAdDZpExQh32j0U0feWisZ0dmOtPl44qYmJKkq9xFWY3p+4qnRzCHeNrkeRhwPHz9bQ3mo0/yVkaply0MNw==
+postcss-colormin@^5.2.0:
+  version "5.2.0"
+  resolved "https://registry.yarnpkg.com/postcss-colormin/-/postcss-colormin-5.2.0.tgz#2b620b88c0ff19683f3349f4cf9e24ebdafb2c88"
+  integrity sha512-+HC6GfWU3upe5/mqmxuqYZ9B2Wl4lcoUUNkoaX59nEWV4EtADCMiBqui111Bu8R8IvaZTmqmxrqOAqjbHIwXPw==
   dependencies:
-    browserslist "^4.0.0"
-    color "^3.0.0"
-    has "^1.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    browserslist "^4.16.6"
+    caniuse-api "^3.0.0"
+    colord "^2.0.1"
+    postcss-value-parser "^4.1.0"
 
-postcss-convert-values@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-convert-values/-/postcss-convert-values-4.0.1.tgz#ca3813ed4da0f812f9d43703584e449ebe189a7f"
-  integrity sha512-Kisdo1y77KUC0Jmn0OXU/COOJbzM8cImvw1ZFsBgBgMgb1iL23Zs/LXRe3r+EZqM3vGYKdQ2YJVQ5VkJI+zEJQ==
+postcss-convert-values@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-convert-values/-/postcss-convert-values-5.0.1.tgz#4ec19d6016534e30e3102fdf414e753398645232"
+  integrity sha512-C3zR1Do2BkKkCgC0g3sF8TS0koF2G+mN8xxayZx3f10cIRmTaAnpgpRQZjNekTZxM2ciSPoh2IWJm0VZx8NoQg==
   dependencies:
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-discard-comments@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-discard-comments/-/postcss-discard-comments-4.0.2.tgz#1fbabd2c246bff6aaad7997b2b0918f4d7af4033"
-  integrity sha512-RJutN259iuRf3IW7GZyLM5Sw4GLTOH8FmsXBnv8Ab/Tc2k4SR4qbV4DNbyyY4+Sjo362SyDmW2DQ7lBSChrpkg==
-  dependencies:
-    postcss "^7.0.0"
+postcss-discard-comments@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-discard-comments/-/postcss-discard-comments-5.0.1.tgz#9eae4b747cf760d31f2447c27f0619d5718901fe"
+  integrity sha512-lgZBPTDvWrbAYY1v5GYEv8fEO/WhKOu/hmZqmCYfrpD6eyDWWzAOsl2rF29lpvziKO02Gc5GJQtlpkTmakwOWg==
 
-postcss-discard-duplicates@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-discard-duplicates/-/postcss-discard-duplicates-4.0.2.tgz#3fe133cd3c82282e550fc9b239176a9207b784eb"
-  integrity sha512-ZNQfR1gPNAiXZhgENFfEglF93pciw0WxMkJeVmw8eF+JZBbMD7jp6C67GqJAXVZP2BWbOztKfbsdmMp/k8c6oQ==
-  dependencies:
-    postcss "^7.0.0"
+postcss-discard-duplicates@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-discard-duplicates/-/postcss-discard-duplicates-5.0.1.tgz#68f7cc6458fe6bab2e46c9f55ae52869f680e66d"
+  integrity sha512-svx747PWHKOGpAXXQkCc4k/DsWo+6bc5LsVrAsw+OU+Ibi7klFZCyX54gjYzX4TH+f2uzXjRviLARxkMurA2bA==
 
-postcss-discard-empty@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-discard-empty/-/postcss-discard-empty-4.0.1.tgz#c8c951e9f73ed9428019458444a02ad90bb9f765"
-  integrity sha512-B9miTzbznhDjTfjvipfHoqbWKwd0Mj+/fL5s1QOz06wufguil+Xheo4XpOnc4NqKYBCNqqEzgPv2aPBIJLox0w==
-  dependencies:
-    postcss "^7.0.0"
+postcss-discard-empty@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-discard-empty/-/postcss-discard-empty-5.0.1.tgz#ee136c39e27d5d2ed4da0ee5ed02bc8a9f8bf6d8"
+  integrity sha512-vfU8CxAQ6YpMxV2SvMcMIyF2LX1ZzWpy0lqHDsOdaKKLQVQGVP1pzhrI9JlsO65s66uQTfkQBKBD/A5gp9STFw==
 
-postcss-discard-overridden@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-discard-overridden/-/postcss-discard-overridden-4.0.1.tgz#652aef8a96726f029f5e3e00146ee7a4e755ff57"
-  integrity sha512-IYY2bEDD7g1XM1IDEsUT4//iEYCxAmP5oDSFMVU/JVvT7gh+l4fmjciLqGgwjdWpQIdb0Che2VX00QObS5+cTg==
-  dependencies:
-    postcss "^7.0.0"
+postcss-discard-overridden@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-discard-overridden/-/postcss-discard-overridden-5.0.1.tgz#454b41f707300b98109a75005ca4ab0ff2743ac6"
+  integrity sha512-Y28H7y93L2BpJhrdUR2SR2fnSsT+3TVx1NmVQLbcnZWwIUpJ7mfcTC6Za9M2PG6w8j7UQRfzxqn8jU2VwFxo3Q==
 
 postcss-html@^0.36.0:
   version "0.36.0"
@@ -5709,67 +5513,60 @@ postcss-media-query-parser@^0.2.3:
   resolved "https://registry.yarnpkg.com/postcss-media-query-parser/-/postcss-media-query-parser-0.2.3.tgz#27b39c6f4d94f81b1a73b8f76351c609e5cef244"
   integrity sha1-J7Ocb02U+Bsac7j3Y1HGCeXO8kQ=
 
-postcss-merge-longhand@^4.0.11:
-  version "4.0.11"
-  resolved "https://registry.yarnpkg.com/postcss-merge-longhand/-/postcss-merge-longhand-4.0.11.tgz#62f49a13e4a0ee04e7b98f42bb16062ca2549e24"
-  integrity sha512-alx/zmoeXvJjp7L4mxEMjh8lxVlDFX1gqWHzaaQewwMZiVhLo42TEClKaeHbRf6J7j82ZOdTJ808RtN0ZOZwvw==
+postcss-merge-longhand@^5.0.2:
+  version "5.0.2"
+  resolved "https://registry.yarnpkg.com/postcss-merge-longhand/-/postcss-merge-longhand-5.0.2.tgz#277ada51d9a7958e8ef8cf263103c9384b322a41"
+  integrity sha512-BMlg9AXSI5G9TBT0Lo/H3PfUy63P84rVz3BjCFE9e9Y9RXQZD3+h3YO1kgTNsNJy7bBc1YQp8DmSnwLIW5VPcw==
   dependencies:
-    css-color-names "0.0.4"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
-    stylehacks "^4.0.0"
+    css-color-names "^1.0.1"
+    postcss-value-parser "^4.1.0"
+    stylehacks "^5.0.1"
 
-postcss-merge-rules@^4.0.3:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/postcss-merge-rules/-/postcss-merge-rules-4.0.3.tgz#362bea4ff5a1f98e4075a713c6cb25aefef9a650"
-  integrity sha512-U7e3r1SbvYzO0Jr3UT/zKBVgYYyhAz0aitvGIYOYK5CPmkNih+WDSsS5tvPrJ8YMQYlEMvsZIiqmn7HdFUaeEQ==
+postcss-merge-rules@^5.0.2:
+  version "5.0.2"
+  resolved "https://registry.yarnpkg.com/postcss-merge-rules/-/postcss-merge-rules-5.0.2.tgz#d6e4d65018badbdb7dcc789c4f39b941305d410a"
+  integrity sha512-5K+Md7S3GwBewfB4rjDeol6V/RZ8S+v4B66Zk2gChRqLTCC8yjnHQ601omj9TKftS19OPGqZ/XzoqpzNQQLwbg==
   dependencies:
-    browserslist "^4.0.0"
+    browserslist "^4.16.6"
     caniuse-api "^3.0.0"
-    cssnano-util-same-parent "^4.0.0"
-    postcss "^7.0.0"
-    postcss-selector-parser "^3.0.0"
-    vendors "^1.0.0"
+    cssnano-utils "^2.0.1"
+    postcss-selector-parser "^6.0.5"
+    vendors "^1.0.3"
 
-postcss-minify-font-values@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-minify-font-values/-/postcss-minify-font-values-4.0.2.tgz#cd4c344cce474343fac5d82206ab2cbcb8afd5a6"
-  integrity sha512-j85oO6OnRU9zPf04+PZv1LYIYOprWm6IA6zkXkrJXyRveDEuQggG6tvoy8ir8ZwjLxLuGfNkCZEQG7zan+Hbtg==
+postcss-minify-font-values@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-minify-font-values/-/postcss-minify-font-values-5.0.1.tgz#a90cefbfdaa075bd3dbaa1b33588bb4dc268addf"
+  integrity sha512-7JS4qIsnqaxk+FXY1E8dHBDmraYFWmuL6cgt0T1SWGRO5bzJf8sUoelwa4P88LEWJZweHevAiDKxHlofuvtIoA==
   dependencies:
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-minify-gradients@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-minify-gradients/-/postcss-minify-gradients-4.0.2.tgz#93b29c2ff5099c535eecda56c4aa6e665a663471"
-  integrity sha512-qKPfwlONdcf/AndP1U8SJ/uzIJtowHlMaSioKzebAXSG4iJthlWC9iSWznQcX4f66gIWX44RSA841HTHj3wK+Q==
+postcss-minify-gradients@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-minify-gradients/-/postcss-minify-gradients-5.0.1.tgz#2dc79fd1a1afcb72a9e727bc549ce860f93565d2"
+  integrity sha512-odOwBFAIn2wIv+XYRpoN2hUV3pPQlgbJ10XeXPq8UY2N+9ZG42xu45lTn/g9zZ+d70NKSQD6EOi6UiCMu3FN7g==
   dependencies:
-    cssnano-util-get-arguments "^4.0.0"
-    is-color-stop "^1.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    cssnano-utils "^2.0.1"
+    is-color-stop "^1.1.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-minify-params@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-minify-params/-/postcss-minify-params-4.0.2.tgz#6b9cef030c11e35261f95f618c90036d680db874"
-  integrity sha512-G7eWyzEx0xL4/wiBBJxJOz48zAKV2WG3iZOqVhPet/9geefm/Px5uo1fzlHu+DOjT+m0Mmiz3jkQzVHe6wxAWg==
+postcss-minify-params@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-minify-params/-/postcss-minify-params-5.0.1.tgz#371153ba164b9d8562842fdcd929c98abd9e5b6c"
+  integrity sha512-4RUC4k2A/Q9mGco1Z8ODc7h+A0z7L7X2ypO1B6V8057eVK6mZ6xwz6QN64nHuHLbqbclkX1wyzRnIrdZehTEHw==
   dependencies:
-    alphanum-sort "^1.0.0"
-    browserslist "^4.0.0"
-    cssnano-util-get-arguments "^4.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    alphanum-sort "^1.0.2"
+    browserslist "^4.16.0"
+    cssnano-utils "^2.0.1"
+    postcss-value-parser "^4.1.0"
     uniqs "^2.0.0"
 
-postcss-minify-selectors@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-minify-selectors/-/postcss-minify-selectors-4.0.2.tgz#e2e5eb40bfee500d0cd9243500f5f8ea4262fbd8"
-  integrity sha512-D5S1iViljXBj9kflQo4YutWnJmwm8VvIsU1GeXJGiG9j8CIg9zs4voPMdQDUmIxetUOh60VilsNzCiAFTOqu3g==
+postcss-minify-selectors@^5.1.0:
+  version "5.1.0"
+  resolved "https://registry.yarnpkg.com/postcss-minify-selectors/-/postcss-minify-selectors-5.1.0.tgz#4385c845d3979ff160291774523ffa54eafd5a54"
+  integrity sha512-NzGBXDa7aPsAcijXZeagnJBKBPMYLaJJzB8CQh6ncvyl2sIndLVWfbcDi0SBjRWk5VqEjXvf8tYwzoKf4Z07og==
   dependencies:
-    alphanum-sort "^1.0.0"
-    has "^1.0.0"
-    postcss "^7.0.0"
-    postcss-selector-parser "^3.0.0"
+    alphanum-sort "^1.0.2"
+    postcss-selector-parser "^6.0.5"
 
 postcss-modules-extract-imports@^1.0.0:
   version "1.2.1"
@@ -5844,115 +5641,96 @@ postcss-modules-values@^3.0.0:
     icss-utils "^4.0.0"
     postcss "^7.0.6"
 
-postcss-normalize-charset@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-charset/-/postcss-normalize-charset-4.0.1.tgz#8b35add3aee83a136b0471e0d59be58a50285dd4"
-  integrity sha512-gMXCrrlWh6G27U0hF3vNvR3w8I1s2wOBILvA87iNXaPvSNo5uZAMYsZG7XjCUf1eVxuPfyL4TJ7++SGZLc9A3g==
-  dependencies:
-    postcss "^7.0.0"
+postcss-normalize-charset@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-charset/-/postcss-normalize-charset-5.0.1.tgz#121559d1bebc55ac8d24af37f67bd4da9efd91d0"
+  integrity sha512-6J40l6LNYnBdPSk+BHZ8SF+HAkS4q2twe5jnocgd+xWpz/mx/5Sa32m3W1AA8uE8XaXN+eg8trIlfu8V9x61eg==
 
-postcss-normalize-display-values@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-display-values/-/postcss-normalize-display-values-4.0.2.tgz#0dbe04a4ce9063d4667ed2be476bb830c825935a"
-  integrity sha512-3F2jcsaMW7+VtRMAqf/3m4cPFhPD3EFRgNs18u+k3lTJJlVe7d0YPO+bnwqo2xg8YiRpDXJI2u8A0wqJxMsQuQ==
+postcss-normalize-display-values@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-display-values/-/postcss-normalize-display-values-5.0.1.tgz#62650b965981a955dffee83363453db82f6ad1fd"
+  integrity sha512-uupdvWk88kLDXi5HEyI9IaAJTE3/Djbcrqq8YgjvAVuzgVuqIk3SuJWUisT2gaJbZm1H9g5k2w1xXilM3x8DjQ==
   dependencies:
-    cssnano-util-get-match "^4.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    cssnano-utils "^2.0.1"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-positions@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-positions/-/postcss-normalize-positions-4.0.2.tgz#05f757f84f260437378368a91f8932d4b102917f"
-  integrity sha512-Dlf3/9AxpxE+NF1fJxYDeggi5WwV35MXGFnnoccP/9qDtFrTArZ0D0R+iKcg5WsUd8nUYMIl8yXDCtcrT8JrdA==
+postcss-normalize-positions@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-positions/-/postcss-normalize-positions-5.0.1.tgz#868f6af1795fdfa86fbbe960dceb47e5f9492fe5"
+  integrity sha512-rvzWAJai5xej9yWqlCb1OWLd9JjW2Ex2BCPzUJrbaXmtKtgfL8dBMOOMTX6TnvQMtjk3ei1Lswcs78qKO1Skrg==
   dependencies:
-    cssnano-util-get-arguments "^4.0.0"
-    has "^1.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-repeat-style@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-repeat-style/-/postcss-normalize-repeat-style-4.0.2.tgz#c4ebbc289f3991a028d44751cbdd11918b17910c"
-  integrity sha512-qvigdYYMpSuoFs3Is/f5nHdRLJN/ITA7huIoCyqqENJe9PvPmLhNLMu7QTjPdtnVf6OcYYO5SHonx4+fbJE1+Q==
+postcss-normalize-repeat-style@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-repeat-style/-/postcss-normalize-repeat-style-5.0.1.tgz#cbc0de1383b57f5bb61ddd6a84653b5e8665b2b5"
+  integrity sha512-syZ2itq0HTQjj4QtXZOeefomckiV5TaUO6ReIEabCh3wgDs4Mr01pkif0MeVwKyU/LHEkPJnpwFKRxqWA/7O3w==
   dependencies:
-    cssnano-util-get-arguments "^4.0.0"
-    cssnano-util-get-match "^4.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    cssnano-utils "^2.0.1"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-string@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-string/-/postcss-normalize-string-4.0.2.tgz#cd44c40ab07a0c7a36dc5e99aace1eca4ec2690c"
-  integrity sha512-RrERod97Dnwqq49WNz8qo66ps0swYZDSb6rM57kN2J+aoyEAJfZ6bMx0sx/F9TIEX0xthPGCmeyiam/jXif0eA==
+postcss-normalize-string@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-string/-/postcss-normalize-string-5.0.1.tgz#d9eafaa4df78c7a3b973ae346ef0e47c554985b0"
+  integrity sha512-Ic8GaQ3jPMVl1OEn2U//2pm93AXUcF3wz+OriskdZ1AOuYV25OdgS7w9Xu2LO5cGyhHCgn8dMXh9bO7vi3i9pA==
   dependencies:
-    has "^1.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-timing-functions@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-timing-functions/-/postcss-normalize-timing-functions-4.0.2.tgz#8e009ca2a3949cdaf8ad23e6b6ab99cb5e7d28d9"
-  integrity sha512-acwJY95edP762e++00Ehq9L4sZCEcOPyaHwoaFOhIwWCDfik6YvqsYNxckee65JHLKzuNSSmAdxwD2Cud1Z54A==
+postcss-normalize-timing-functions@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-timing-functions/-/postcss-normalize-timing-functions-5.0.1.tgz#8ee41103b9130429c6cbba736932b75c5e2cb08c"
+  integrity sha512-cPcBdVN5OsWCNEo5hiXfLUnXfTGtSFiBU9SK8k7ii8UD7OLuznzgNRYkLZow11BkQiiqMcgPyh4ZqXEEUrtQ1Q==
   dependencies:
-    cssnano-util-get-match "^4.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    cssnano-utils "^2.0.1"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-unicode@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-unicode/-/postcss-normalize-unicode-4.0.1.tgz#841bd48fdcf3019ad4baa7493a3d363b52ae1cfb"
-  integrity sha512-od18Uq2wCYn+vZ/qCOeutvHjB5jm57ToxRaMeNuf0nWVHaP9Hua56QyMF6fs/4FSUnVIw0CBPsU0K4LnBPwYwg==
+postcss-normalize-unicode@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-unicode/-/postcss-normalize-unicode-5.0.1.tgz#82d672d648a411814aa5bf3ae565379ccd9f5e37"
+  integrity sha512-kAtYD6V3pK0beqrU90gpCQB7g6AOfP/2KIPCVBKJM2EheVsBQmx/Iof+9zR9NFKLAx4Pr9mDhogB27pmn354nA==
   dependencies:
-    browserslist "^4.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    browserslist "^4.16.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-url@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-url/-/postcss-normalize-url-4.0.1.tgz#10e437f86bc7c7e58f7b9652ed878daaa95faae1"
-  integrity sha512-p5oVaF4+IHwu7VpMan/SSpmpYxcJMtkGppYf0VbdH5B6hN8YNmVyJLuY9FmLQTzY3fag5ESUUHDqM+heid0UVA==
+postcss-normalize-url@^5.0.2:
+  version "5.0.2"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-url/-/postcss-normalize-url-5.0.2.tgz#ddcdfb7cede1270740cf3e4dfc6008bd96abc763"
+  integrity sha512-k4jLTPUxREQ5bpajFQZpx8bCF2UrlqOTzP9kEqcEnOfwsRshWs2+oAFIHfDQB8GO2PaUaSE0NlTAYtbluZTlHQ==
   dependencies:
-    is-absolute-url "^2.0.0"
-    normalize-url "^3.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    is-absolute-url "^3.0.3"
+    normalize-url "^6.0.1"
+    postcss-value-parser "^4.1.0"
 
-postcss-normalize-whitespace@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-normalize-whitespace/-/postcss-normalize-whitespace-4.0.2.tgz#bf1d4070fe4fcea87d1348e825d8cc0c5faa7d82"
-  integrity sha512-tO8QIgrsI3p95r8fyqKV+ufKlSHh9hMJqACqbv2XknufqEDhDvbguXGBBqxw9nsQoXWf0qOqppziKJKHMD4GtA==
+postcss-normalize-whitespace@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-normalize-whitespace/-/postcss-normalize-whitespace-5.0.1.tgz#b0b40b5bcac83585ff07ead2daf2dcfbeeef8e9a"
+  integrity sha512-iPklmI5SBnRvwceb/XH568yyzK0qRVuAG+a1HFUsFRf11lEJTiQQa03a4RSCQvLKdcpX7XsI1Gen9LuLoqwiqA==
   dependencies:
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    postcss-value-parser "^4.1.0"
 
-postcss-ordered-values@^4.1.2:
-  version "4.1.2"
-  resolved "https://registry.yarnpkg.com/postcss-ordered-values/-/postcss-ordered-values-4.1.2.tgz#0cf75c820ec7d5c4d280189559e0b571ebac0eee"
-  integrity sha512-2fCObh5UanxvSxeXrtLtlwVThBvHn6MQcu4ksNT2tsaV2Fg76R2CV98W7wNSlX+5/pFwEyaDwKLLoEV7uRybAw==
+postcss-ordered-values@^5.0.2:
+  version "5.0.2"
+  resolved "https://registry.yarnpkg.com/postcss-ordered-values/-/postcss-ordered-values-5.0.2.tgz#1f351426977be00e0f765b3164ad753dac8ed044"
+  integrity sha512-8AFYDSOYWebJYLyJi3fyjl6CqMEG/UVworjiyK1r573I56kb3e879sCJLGvR3merj+fAdPpVplXKQZv+ey6CgQ==
   dependencies:
-    cssnano-util-get-arguments "^4.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    cssnano-utils "^2.0.1"
+    postcss-value-parser "^4.1.0"
 
-postcss-reduce-initial@^4.0.3:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/postcss-reduce-initial/-/postcss-reduce-initial-4.0.3.tgz#7fd42ebea5e9c814609639e2c2e84ae270ba48df"
-  integrity sha512-gKWmR5aUulSjbzOfD9AlJiHCGH6AEVLaM0AV+aSioxUDd16qXP1PCh8d1/BGVvpdWn8k/HiK7n6TjeoXN1F7DA==
+postcss-reduce-initial@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-reduce-initial/-/postcss-reduce-initial-5.0.1.tgz#9d6369865b0f6f6f6b165a0ef5dc1a4856c7e946"
+  integrity sha512-zlCZPKLLTMAqA3ZWH57HlbCjkD55LX9dsRyxlls+wfuRfqCi5mSlZVan0heX5cHr154Dq9AfbH70LyhrSAezJw==
   dependencies:
-    browserslist "^4.0.0"
+    browserslist "^4.16.0"
     caniuse-api "^3.0.0"
-    has "^1.0.0"
-    postcss "^7.0.0"
 
-postcss-reduce-transforms@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-reduce-transforms/-/postcss-reduce-transforms-4.0.2.tgz#17efa405eacc6e07be3414a5ca2d1074681d4e29"
-  integrity sha512-EEVig1Q2QJ4ELpJXMZR8Vt5DQx8/mo+dGWSR7vWXqcob2gQLyQGsionYcGKATXvQzMPn6DSN1vTN7yFximdIAg==
+postcss-reduce-transforms@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-reduce-transforms/-/postcss-reduce-transforms-5.0.1.tgz#93c12f6a159474aa711d5269923e2383cedcf640"
+  integrity sha512-a//FjoPeFkRuAguPscTVmRQUODP+f3ke2HqFNgGPwdYnpeC29RZdCBvGRGTsKpMURb/I3p6jdKoBQ2zI+9Q7kA==
   dependencies:
-    cssnano-util-get-match "^4.0.0"
-    has "^1.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
+    cssnano-utils "^2.0.1"
+    postcss-value-parser "^4.1.0"
 
 postcss-resolve-nested-selector@^0.1.1:
   version "0.1.1"
@@ -5981,15 +5759,6 @@ postcss-scss@^2.1.1:
   dependencies:
     postcss "^7.0.6"
 
-postcss-selector-parser@^3.0.0:
-  version "3.1.2"
-  resolved "https://registry.yarnpkg.com/postcss-selector-parser/-/postcss-selector-parser-3.1.2.tgz#b310f5c4c0fdaf76f94902bbaa30db6aa84f5270"
-  integrity sha512-h7fJ/5uWuRVyOtkO45pnt1Ih40CEleeyCHzipqAZO2e5H20g25Y48uYnFUiShvY4rZWNJ/Bib/KVPmanaCtOhA==
-  dependencies:
-    dot-prop "^5.2.0"
-    indexes-of "^1.0.1"
-    uniq "^1.0.1"
-
 postcss-selector-parser@^6.0.0:
   version "6.0.2"
   resolved "https://registry.yarnpkg.com/postcss-selector-parser/-/postcss-selector-parser-6.0.2.tgz#934cf799d016c83411859e09dcecade01286ec5c"
@@ -5999,7 +5768,7 @@ postcss-selector-parser@^6.0.0:
     indexes-of "^1.0.1"
     uniq "^1.0.1"
 
-postcss-selector-parser@^6.0.2, postcss-selector-parser@^6.0.5:
+postcss-selector-parser@^6.0.2, postcss-selector-parser@^6.0.4, postcss-selector-parser@^6.0.5:
   version "6.0.6"
   resolved "https://registry.yarnpkg.com/postcss-selector-parser/-/postcss-selector-parser-6.0.6.tgz#2c5bba8174ac2f6981ab631a42ab0ee54af332ea"
   integrity sha512-9LXrvaaX3+mcv5xkg5kFwqSzSH1JIObIx51PrndZwlmznwXRfxMddDvo9gve3gVR8ZTKgoFDdWkbRFmEhT4PMg==
@@ -6007,35 +5776,28 @@ postcss-selector-parser@^6.0.2, postcss-selector-parser@^6.0.5:
     cssesc "^3.0.0"
     util-deprecate "^1.0.2"
 
-postcss-svgo@^4.0.2:
-  version "4.0.2"
-  resolved "https://registry.yarnpkg.com/postcss-svgo/-/postcss-svgo-4.0.2.tgz#17b997bc711b333bab143aaed3b8d3d6e3d38258"
-  integrity sha512-C6wyjo3VwFm0QgBy+Fu7gCYOkCmgmClghO+pjcxvrcBKtiKt0uCF+hvbMO1fyv5BMImRK90SMb+dwUnfbGd+jw==
+postcss-svgo@^5.0.2:
+  version "5.0.2"
+  resolved "https://registry.yarnpkg.com/postcss-svgo/-/postcss-svgo-5.0.2.tgz#bc73c4ea4c5a80fbd4b45e29042c34ceffb9257f"
+  integrity sha512-YzQuFLZu3U3aheizD+B1joQ94vzPfE6BNUcSYuceNxlVnKKsOtdo6hL9/zyC168Q8EwfLSgaDSalsUGa9f2C0A==
   dependencies:
-    is-svg "^3.0.0"
-    postcss "^7.0.0"
-    postcss-value-parser "^3.0.0"
-    svgo "^1.0.0"
+    postcss-value-parser "^4.1.0"
+    svgo "^2.3.0"
 
 postcss-syntax@^0.36.2:
   version "0.36.2"
   resolved "https://registry.yarnpkg.com/postcss-syntax/-/postcss-syntax-0.36.2.tgz#f08578c7d95834574e5593a82dfbfa8afae3b51c"
   integrity sha512-nBRg/i7E3SOHWxF3PpF5WnJM/jQ1YpY9000OaVXlAQj6Zp/kIqJxEDWIZ67tAd7NLuk7zqN4yqe9nc0oNAOs1w==
 
-postcss-unique-selectors@^4.0.1:
-  version "4.0.1"
-  resolved "https://registry.yarnpkg.com/postcss-unique-selectors/-/postcss-unique-selectors-4.0.1.tgz#9446911f3289bfd64c6d680f073c03b1f9ee4bac"
-  integrity sha512-+JanVaryLo9QwZjKrmJgkI4Fn8SBgRO6WXQBJi7KiAVPlmxikB5Jzc4EvXMT2H0/m0RjrVVm9rGNhZddm/8Spg==
+postcss-unique-selectors@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/postcss-unique-selectors/-/postcss-unique-selectors-5.0.1.tgz#3be5c1d7363352eff838bd62b0b07a0abad43bfc"
+  integrity sha512-gwi1NhHV4FMmPn+qwBNuot1sG1t2OmacLQ/AX29lzyggnjd+MnVD5uqQmpXO3J17KGL2WAxQruj1qTd3H0gG/w==
   dependencies:
-    alphanum-sort "^1.0.0"
-    postcss "^7.0.0"
+    alphanum-sort "^1.0.2"
+    postcss-selector-parser "^6.0.5"
     uniqs "^2.0.0"
 
-postcss-value-parser@^3.0.0:
-  version "3.3.1"
-  resolved "https://registry.yarnpkg.com/postcss-value-parser/-/postcss-value-parser-3.3.1.tgz#9ff822547e2893213cf1c30efa51ac5fd1ba8281"
-  integrity sha512-pISE66AbVkp4fDQ7VHBwRNXzAAKJjw4Vw7nWI/+Q3vuly7SNfgYXvm6i5IgFylHGK5sP/xHAbB7N49OS4gWNyQ==
-
 postcss-value-parser@^4.0.0, postcss-value-parser@^4.0.2, postcss-value-parser@^4.1.0:
   version "4.1.0"
   resolved "https://registry.yarnpkg.com/postcss-value-parser/-/postcss-value-parser-4.1.0.tgz#443f6a20ced6481a2bda4fa8532a6e55d789a2cb"
@@ -6050,7 +5812,7 @@ postcss@^6.0.1, postcss@^6.0.2:
     source-map "^0.6.1"
     supports-color "^5.4.0"
 
-postcss@^7.0.0, postcss@^7.0.1, postcss@^7.0.14, postcss@^7.0.2, postcss@^7.0.21, postcss@^7.0.26, postcss@^7.0.27, postcss@^7.0.32, postcss@^7.0.35, postcss@^7.0.6:
+postcss@^7.0.14, postcss@^7.0.2, postcss@^7.0.21, postcss@^7.0.26, postcss@^7.0.32, postcss@^7.0.35, postcss@^7.0.6:
   version "7.0.35"
   resolved "https://registry.yarnpkg.com/postcss/-/postcss-7.0.35.tgz#d2be00b998f7f211d8a276974079f2e92b970e24"
   integrity sha512-3QT8bBJeX/S5zKTTjTCIjRF3If4avAT6kqxcASlTWEtAFCb9NH0OUxNDfgZSWdP5fJnBYCMEWkIFfWeugjzYMg==
@@ -6068,16 +5830,20 @@ postcss@^7.0.16, postcss@^7.0.5:
     source-map "^0.6.1"
     supports-color "^6.1.0"
 
+postcss@^8.2.1:
+  version "8.3.1"
+  resolved "https://registry.yarnpkg.com/postcss/-/postcss-8.3.1.tgz#71f380151c227f83b898294a46481f689f86b70a"
+  integrity sha512-9qH0MGjsSm+fjxOi3GnwViL1otfi7qkj+l/WX5gcRGmZNGsIcqc+A5fBkE6PUobEQK4APqYVaES+B3Uti98TCw==
+  dependencies:
+    colorette "^1.2.2"
+    nanoid "^3.1.23"
+    source-map-js "^0.6.2"
+
 prelude-ls@^1.2.1:
   version "1.2.1"
   resolved "https://registry.yarnpkg.com/prelude-ls/-/prelude-ls-1.2.1.tgz#debc6489d7a6e6b0e7611888cec880337d316396"
   integrity sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==
 
-prepend-http@^1.0.0:
-  version "1.0.4"
-  resolved "https://registry.yarnpkg.com/prepend-http/-/prepend-http-1.0.4.tgz#d4f4562b0ce3696e41ac52d0e002e57a635dc6dc"
-  integrity sha1-1PRWKwzjaW5BrFLQ4ALlemNdxtw=
-
 prismjs@^1.22.0:
   version "1.23.0"
   resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.23.0.tgz#d3b3967f7d72440690497652a9d40ff046067f33"
@@ -6176,19 +5942,6 @@ punycode@^2.1.0:
   resolved "https://registry.yarnpkg.com/punycode/-/punycode-2.1.1.tgz#b58b010ac40c22c5657616c8d2c2c02c7bf479ec"
   integrity sha512-XRsRjdf+j5ml+y/6GKHPZbrF/8p2Yga0JPtdqTIY2Xe5ohJPD9saDJJLPvp9+NSBprVvevdXZybnj2cv8OEd0A==
 
-q@^1.1.2:
-  version "1.5.1"
-  resolved "https://registry.yarnpkg.com/q/-/q-1.5.1.tgz#7e32f75b41381291d04611f1bf14109ac00651d7"
-  integrity sha1-fjL3W0E4EpHQRhHxvxQQmsAGUdc=
-
-query-string@^4.1.0:
-  version "4.3.4"
-  resolved "https://registry.yarnpkg.com/query-string/-/query-string-4.3.4.tgz#bbb693b9ca915c232515b228b1a02b609043dbeb"
-  integrity sha1-u7aTucqRXCMlFbIosaArYJBD2+s=
-  dependencies:
-    object-assign "^4.1.0"
-    strict-uri-encode "^1.0.0"
-
 querystring-es3@^0.2.0:
   version "0.2.1"
   resolved "https://registry.yarnpkg.com/querystring-es3/-/querystring-es3-0.2.1.tgz#9ec61f79049875707d69414596fd907a4d711e73"
@@ -6625,11 +6378,6 @@ safe-regex@^1.1.0:
   resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a"
   integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==
 
-sax@~1.2.4:
-  version "1.2.4"
-  resolved "https://registry.yarnpkg.com/sax/-/sax-1.2.4.tgz#2816234e2378bddc4e5354fab5caa895df7100d9"
-  integrity sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw==
-
 schema-utils@^1.0.0:
   version "1.0.0"
   resolved "https://registry.yarnpkg.com/schema-utils/-/schema-utils-1.0.0.tgz#0b79a93204d7b600d4b2850d1f66c2a34951c770"
@@ -6648,6 +6396,15 @@ schema-utils@^2.6.5, schema-utils@^2.6.6, schema-utils@^2.7.0:
     ajv "^6.12.2"
     ajv-keywords "^3.4.1"
 
+schema-utils@^3.0.0:
+  version "3.0.0"
+  resolved "https://registry.yarnpkg.com/schema-utils/-/schema-utils-3.0.0.tgz#67502f6aa2b66a2d4032b4279a2944978a0913ef"
+  integrity sha512-6D82/xSzO094ajanoOSbe4YvXWMfn2A//8Y1+MUqFAJul5Bs+yn36xbK9OtNDcRVSBJ9jjeoXftM6CfztsjOAA==
+  dependencies:
+    "@types/json-schema" "^7.0.6"
+    ajv "^6.12.5"
+    ajv-keywords "^3.5.2"
+
 seekout@^1.0.1:
   version "1.0.2"
   resolved "https://registry.yarnpkg.com/seekout/-/seekout-1.0.2.tgz#09ba9f1bd5b46fbb134718eb19a68382cbb1b9c9"
@@ -6800,13 +6557,6 @@ signal-exit@^3.0.2:
   resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.3.tgz#a1410c2edd8f077b08b4e253c8eacfcaf057461c"
   integrity sha512-VUJ49FC8U1OxwZLxIbTTrDvLnf/6TDgxZcK8wxR8zs13xpx7xbG60ndBlhNrFi2EMuFRoeDoJO7wthSLq42EjA==
 
-simple-swizzle@^0.2.2:
-  version "0.2.2"
-  resolved "https://registry.yarnpkg.com/simple-swizzle/-/simple-swizzle-0.2.2.tgz#a4da6b635ffcccca33f70d17cb92592de95e557a"
-  integrity sha1-pNprY1/8zMoz9w0Xy5JZLeleVXo=
-  dependencies:
-    is-arrayish "^0.3.1"
-
 slash@^1.0.0:
   version "1.0.0"
   resolved "https://registry.yarnpkg.com/slash/-/slash-1.0.0.tgz#c41f2f6c39fc16d1cd17ad4b5d896114ae470d55"
@@ -6870,18 +6620,16 @@ snapdragon@^0.8.1:
     source-map-resolve "^0.5.0"
     use "^3.1.0"
 
-sort-keys@^1.0.0:
-  version "1.1.2"
-  resolved "https://registry.yarnpkg.com/sort-keys/-/sort-keys-1.1.2.tgz#441b6d4d346798f1b4e49e8920adfba0e543f9ad"
-  integrity sha1-RBttTTRnmPG05J6JIK37oOVD+a0=
-  dependencies:
-    is-plain-obj "^1.0.0"
-
 source-list-map@^2.0.0:
   version "2.0.1"
   resolved "https://registry.yarnpkg.com/source-list-map/-/source-list-map-2.0.1.tgz#3993bd873bfc48479cca9ea3a547835c7c154b34"
   integrity sha512-qnQ7gVMxGNxsiL4lEuJwe/To8UnK7fAnmbGEEH8RpLouuKbeEm0lhbQVFIrNSuB+G7tVrAlVsZgETT5nljf+Iw==
 
+source-map-js@^0.6.2:
+  version "0.6.2"
+  resolved "https://registry.yarnpkg.com/source-map-js/-/source-map-js-0.6.2.tgz#0bb5de631b41cfbda6cfba8bd05a80efdfd2385e"
+  integrity sha512-/3GptzWzu0+0MBQFrDKzw/DvvMTUORvgY6k6jd/VS6iCR4RDTKWH6v6WPwQoUO8667uQEf9Oe38DxAYWY5F/Ug==
+
 source-map-resolve@^0.5.0:
   version "0.5.3"
   resolved "https://registry.yarnpkg.com/source-map-resolve/-/source-map-resolve-0.5.3.tgz#190866bece7553e1f8f267a2ee82c606b5509a1a"
@@ -7035,11 +6783,6 @@ stream-shift@^1.0.0:
   resolved "https://registry.yarnpkg.com/stream-shift/-/stream-shift-1.0.1.tgz#d7088281559ab2778424279b0877da3c392d5a3d"
   integrity sha512-AiisoFqQ0vbGcZgQPY1cdP2I76glaVA/RauYR4G4thNFgkTqr90yXTo4LYX60Jl+sIlPNHHdGSwo01AvbKUSVQ==
 
-strict-uri-encode@^1.0.0:
-  version "1.1.0"
-  resolved "https://registry.yarnpkg.com/strict-uri-encode/-/strict-uri-encode-1.1.0.tgz#279b225df1d582b1f54e65addd4352e18faa0713"
-  integrity sha1-J5siXfHVgrH1TmWt3UNS4Y+qBxM=
-
 string-width@^3.0.0, string-width@^3.1.0:
   version "3.1.0"
   resolved "https://registry.yarnpkg.com/string-width/-/string-width-3.1.0.tgz#22767be21b62af1081574306f69ac51b62203961"
@@ -7179,14 +6922,13 @@ style-search@^0.1.0:
   resolved "https://registry.yarnpkg.com/style-search/-/style-search-0.1.0.tgz#7958c793e47e32e07d2b5cafe5c0bf8e12e77902"
   integrity sha1-eVjHk+R+MuB9K1yv5cC/jhLneQI=
 
-stylehacks@^4.0.0:
-  version "4.0.3"
-  resolved "https://registry.yarnpkg.com/stylehacks/-/stylehacks-4.0.3.tgz#6718fcaf4d1e07d8a1318690881e8d96726a71d5"
-  integrity sha512-7GlLk9JwlElY4Y6a/rmbH2MhVlTyVmiJd1PfTCqFaIBEGMYNsrO/v3SeGTdhBThLg4Z+NbOk/qFMwCa+J+3p/g==
+stylehacks@^5.0.1:
+  version "5.0.1"
+  resolved "https://registry.yarnpkg.com/stylehacks/-/stylehacks-5.0.1.tgz#323ec554198520986806388c7fdaebc38d2c06fb"
+  integrity sha512-Es0rVnHIqbWzveU1b24kbw92HsebBepxfcqe5iix7t9j0PQqhs0IxXVXv0pY2Bxa08CgMkzD6OWql7kbGOuEdA==
   dependencies:
-    browserslist "^4.0.0"
-    postcss "^7.0.0"
-    postcss-selector-parser "^3.0.0"
+    browserslist "^4.16.0"
+    postcss-selector-parser "^6.0.4"
 
 stylelint-config-recommended@^3.0.0:
   version "3.0.0"
@@ -7292,24 +7034,18 @@ svg-tags@^1.0.0:
   resolved "https://registry.yarnpkg.com/svg-tags/-/svg-tags-1.0.0.tgz#58f71cee3bd519b59d4b2a843b6c7de64ac04764"
   integrity sha1-WPcc7jvVGbWdSyqEO2x95krAR2Q=
 
-svgo@^1.0.0:
-  version "1.3.2"
-  resolved "https://registry.yarnpkg.com/svgo/-/svgo-1.3.2.tgz#b6dc511c063346c9e415b81e43401145b96d4167"
-  integrity sha512-yhy/sQYxR5BkC98CY7o31VGsg014AKLEPxdfhora76l36hD9Rdy5NZA/Ocn6yayNPgSamYdtX2rFJdcv07AYVw==
+svgo@^2.3.0:
+  version "2.3.0"
+  resolved "https://registry.yarnpkg.com/svgo/-/svgo-2.3.0.tgz#6b3af81d0cbd1e19c83f5f63cec2cb98c70b5373"
+  integrity sha512-fz4IKjNO6HDPgIQxu4IxwtubtbSfGEAJUq/IXyTPIkGhWck/faiiwfkvsB8LnBkKLvSoyNNIY6d13lZprJMc9Q==
   dependencies:
-    chalk "^2.4.1"
-    coa "^2.0.2"
-    css-select "^2.0.0"
-    css-select-base-adapter "^0.1.1"
-    css-tree "1.0.0-alpha.37"
-    csso "^4.0.2"
-    js-yaml "^3.13.1"
-    mkdirp "~0.5.1"
-    object.values "^1.1.0"
-    sax "~1.2.4"
+    "@trysound/sax" "0.1.1"
+    chalk "^4.1.0"
+    commander "^7.1.0"
+    css-select "^3.1.2"
+    css-tree "^1.1.2"
+    csso "^4.2.0"
     stable "^0.1.8"
-    unquote "~1.1.1"
-    util.promisify "~1.0.0"
 
 swagger2openapi@^6.2.1:
   version "6.2.3"
@@ -7612,11 +7348,6 @@ universalify@^0.1.0:
   resolved "https://registry.yarnpkg.com/universalify/-/universalify-0.1.2.tgz#b646f69be3942dabcecc9d6639c80dc105efaa66"
   integrity sha512-rBJeI5CXAlmy1pV+617WB9J63U6XcazHHF2f2dbJix4XzpUF0RS3Zbj0FGIOCAva5P/d/GBOYaACQ1w+0azUkg==
 
-unquote@~1.1.1:
-  version "1.1.1"
-  resolved "https://registry.yarnpkg.com/unquote/-/unquote-1.1.1.tgz#8fded7324ec6e88a0ff8b905e7c098cdc086d544"
-  integrity sha1-j97XMk7G6IoP+LkF58CYzcCG1UQ=
-
 unset-value@^1.0.0:
   version "1.0.0"
   resolved "https://registry.yarnpkg.com/unset-value/-/unset-value-1.0.0.tgz#8376873f7d2335179ffb1e6fc3a8ed0dfc8ab559"
@@ -7679,16 +7410,6 @@ util-deprecate@^1.0.1, util-deprecate@^1.0.2, util-deprecate@~1.0.1:
   resolved "https://registry.yarnpkg.com/util-deprecate/-/util-deprecate-1.0.2.tgz#450d4dc9fa70de732762fbd2d4a28981419a0ccf"
   integrity sha1-RQ1Nyfpw3nMnYvvS1KKJgUGaDM8=
 
-util.promisify@~1.0.0:
-  version "1.0.1"
-  resolved "https://registry.yarnpkg.com/util.promisify/-/util.promisify-1.0.1.tgz#6baf7774b80eeb0f7520d8b81d07982a59abbaee"
-  integrity sha512-g9JpC/3He3bm38zsLupWryXHoEcS22YHthuPQSJdMy6KNrzIRzWqcsHzD/WUnqe45whVou4VIsPew37DoXWNrA==
-  dependencies:
-    define-properties "^1.1.3"
-    es-abstract "^1.17.2"
-    has-symbols "^1.0.1"
-    object.getownpropertydescriptors "^2.1.0"
-
 util@0.10.3:
   version "0.10.3"
   resolved "https://registry.yarnpkg.com/util/-/util-0.10.3.tgz#7afb1afe50805246489e3db7fe0ed379336ac0f9"
@@ -7721,7 +7442,7 @@ validate-npm-package-license@^3.0.1:
     spdx-correct "^3.0.0"
     spdx-expression-parse "^3.0.0"
 
-vendors@^1.0.0:
+vendors@^1.0.3:
   version "1.0.4"
   resolved "https://registry.yarnpkg.com/vendors/-/vendors-1.0.4.tgz#e2b800a53e7a29b93506c3cf41100d16c4c4ad8e"
   integrity sha512-/juG65kTL4Cy2su4P8HjtkTxk6VmJDiOPBufWniqQ6wknac6jNiXS9vU+hO3wgusiyqWlzTbVHi0dyJqRONg3w==

[airflow] 37/38: Ensure that `dag_run.conf` is a dict (#15057)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c7a3977ff7b436749d9ca0fdf53f0a8ca22dbdfa
Author: Jens Scheffler <47...@users.noreply.github.com>
AuthorDate: Tue Jun 22 14:31:37 2021 +0200

    Ensure that `dag_run.conf` is a dict (#15057)
    
    Co-authored-by: Ash Berlin-Taylor <as...@firemirror.com>
    (cherry picked from commit 01c9818405107271ee8341c72b3d2d1e48574e08)
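
For API clients, the practical effect is that the experimental trigger endpoint now rejects a non-dict `conf` up front with HTTP 400 instead of creating a DagRun that fails later. A minimal client-side sketch, assuming a local webserver at localhost:8080 with the experimental API enabled (the base URL and run_id are illustrative assumptions, not part of the commit; the endpoint path and example DAG id come from the tests below):

    import json
    import requests

    BASE = "http://localhost:8080/api/experimental"  # assumed local deployment

    # A string conf is now rejected with 400 before any DagRun is created.
    bad = requests.post(
        f"{BASE}/dags/example_bash_operator/dag_runs",
        data=json.dumps({"conf": "a string, not a dict"}),
        headers={"Content-Type": "application/json"},
    )
    assert bad.status_code == 400

    # A dict conf is accepted as before.
    ok = requests.post(
        f"{BASE}/dags/example_bash_operator/dag_runs",
        data=json.dumps({"run_id": "manual_run_1", "conf": {"param": "value"}}),
        headers={"Content-Type": "application/json"},
    )
    assert ok.status_code == 200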
---
 airflow/www/api/experimental/endpoints.py          |  6 ++++++
 airflow/www/templates/airflow/trigger.html         |  2 +-
 airflow/www/views.py                               |  7 ++++++-
 .../endpoints/test_dag_run_endpoint.py             | 20 ++++++++++++++++++
 tests/www/api/experimental/test_endpoints.py       | 24 +++++++++++++++-------
 tests/www/views/test_views_trigger_dag.py          | 11 ++++++++++
 6 files changed, 61 insertions(+), 9 deletions(-)

diff --git a/airflow/www/api/experimental/endpoints.py b/airflow/www/api/experimental/endpoints.py
index 78cacde..3033964 100644
--- a/airflow/www/api/experimental/endpoints.py
+++ b/airflow/www/api/experimental/endpoints.py
@@ -88,6 +88,12 @@ def trigger_dag(dag_id):
     conf = None
     if 'conf' in data:
         conf = data['conf']
+        if not isinstance(conf, dict):
+            error_message = 'Dag Run conf must be a dictionary object, other types are not supported'
+            log.error(error_message)
+            response = jsonify({'error': error_message})
+            response.status_code = 400
+            return response
 
     execution_date = None
     if 'execution_date' in data and data['execution_date'] is not None:
diff --git a/airflow/www/templates/airflow/trigger.html b/airflow/www/templates/airflow/trigger.html
index c4187f1..80164b8 100644
--- a/airflow/www/templates/airflow/trigger.html
+++ b/airflow/www/templates/airflow/trigger.html
@@ -35,7 +35,7 @@
     <input type="hidden" name="dag_id" value="{{ dag_id }}">
     <input type="hidden" name="origin" value="{{ origin }}">
     <div class="form-group">
-      <label for="conf">Configuration JSON (Optional)</label>
+      <label for="conf">Configuration JSON (Optional, must be a dict object)</label>
       <textarea class="form-control" name="conf" id="json">{{ conf }}</textarea>
     </div>
     <p>
diff --git a/airflow/www/views.py b/airflow/www/views.py
index 424892e..55fd7de 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -1523,8 +1523,13 @@ class Airflow(AirflowBaseView):  # noqa: D101  pylint: disable=too-many-public-m
         if request_conf:
             try:
                 run_conf = json.loads(request_conf)
+                if not isinstance(run_conf, dict):
+                    flash("Invalid JSON configuration, must be a dict", "error")
+                    return self.render_template(
+                        'airflow/trigger.html', dag_id=dag_id, origin=origin, conf=request_conf
+                    )
             except json.decoder.JSONDecodeError:
-                flash("Invalid JSON configuration", "error")
+                flash("Invalid JSON configuration, not parseable", "error")
                 return self.render_template(
                     'airflow/trigger.html', dag_id=dag_id, origin=origin, conf=request_conf
                 )
diff --git a/tests/api_connexion/endpoints/test_dag_run_endpoint.py b/tests/api_connexion/endpoints/test_dag_run_endpoint.py
index 4ea9110..482cbea 100644
--- a/tests/api_connexion/endpoints/test_dag_run_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_run_endpoint.py
@@ -945,6 +945,26 @@ class TestPostDagRun(TestDagRunEndpoint):
         assert response.status_code == 400
         assert response.json['detail'] == expected
 
+    @parameterized.expand(
+        [
+            (
+                {
+                    "dag_run_id": "TEST_DAG_RUN",
+                    "execution_date": "2020-06-11T18:00:00+00:00",
+                    "conf": "some string",
+                },
+                "'some string' is not of type 'object' - 'conf'",
+            )
+        ]
+    )
+    def test_should_response_400_for_non_dict_dagrun_conf(self, data, expected):
+        self._create_dag("TEST_DAG_ID")
+        response = self.client.post(
+            "api/v1/dags/TEST_DAG_ID/dagRuns", json=data, environ_overrides={'REMOTE_USER': "test"}
+        )
+        assert response.status_code == 400
+        assert response.json['detail'] == expected
+
     def test_response_404(self):
         response = self.client.post(
             "api/v1/dags/TEST_DAG_ID/dagRuns",
diff --git a/tests/www/api/experimental/test_endpoints.py b/tests/www/api/experimental/test_endpoints.py
index 8c6734c..a63b0bb 100644
--- a/tests/www/api/experimental/test_endpoints.py
+++ b/tests/www/api/experimental/test_endpoints.py
@@ -148,9 +148,25 @@ class TestApiExperimental(TestBase):
     def test_trigger_dag(self):
         url_template = '/api/experimental/dags/{}/dag_runs'
         run_id = 'my_run' + utcnow().isoformat()
+
+        # Test error for nonexistent dag
+        response = self.client.post(
+            url_template.format('does_not_exist_dag'), data=json.dumps({}), content_type="application/json"
+        )
+        assert 404 == response.status_code
+
+        # Test error for bad conf data
         response = self.client.post(
             url_template.format('example_bash_operator'),
-            data=json.dumps({'run_id': run_id}),
+            data=json.dumps({'conf': 'This is a string not a dict'}),
+            content_type="application/json",
+        )
+        assert 400 == response.status_code
+
+        # Test OK case
+        response = self.client.post(
+            url_template.format('example_bash_operator'),
+            data=json.dumps({'run_id': run_id, 'conf': {'param': 'value'}}),
             content_type="application/json",
         )
         self.assert_deprecated(response)
@@ -168,12 +184,6 @@ class TestApiExperimental(TestBase):
         assert run_id == dag_run_id
         assert dag_run_id == response['run_id']
 
-        # Test error for nonexistent dag
-        response = self.client.post(
-            url_template.format('does_not_exist_dag'), data=json.dumps({}), content_type="application/json"
-        )
-        assert 404 == response.status_code
-
     def test_trigger_dag_for_date(self):
         url_template = '/api/experimental/dags/{}/dag_runs'
         dag_id = 'example_bash_operator'
diff --git a/tests/www/views/test_views_trigger_dag.py b/tests/www/views/test_views_trigger_dag.py
index fdd80b6..b36f891 100644
--- a/tests/www/views/test_views_trigger_dag.py
+++ b/tests/www/views/test_views_trigger_dag.py
@@ -78,6 +78,17 @@ def test_trigger_dag_conf_malformed(admin_client):
     assert run is None
 
 
+def test_trigger_dag_conf_not_dict(admin_client):
+    test_dag_id = "example_bash_operator"
+
+    response = admin_client.post(f'trigger?dag_id={test_dag_id}', data={'conf': 'string and not a dict'})
+    check_content_in_response('must be a dict', response)
+
+    with create_session() as session:
+        run = session.query(DagRun).filter(DagRun.dag_id == test_dag_id).first()
+    assert run is None
+
+
 def test_trigger_dag_form(admin_client):
     test_dag_id = "example_bash_operator"
     resp = admin_client.get(f'trigger?dag_id={test_dag_id}')

[airflow] 32/38: Allow null value for operator field in task_instance schema (REST API) (#16516)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8813c3dc85503e7c9cf8189b1bbe76e886f62bdd
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Fri Jun 18 08:56:05 2021 +0100

    Allow null value for operator field in task_instance schema (REST API) (#16516)
    
    This change makes it possible to get "old" task instances from 1.10.x
    
    (cherry picked from commit 087556f0c210e345ac1749933ff4de38e40478f6)
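
In practice this means consumers of the stable REST API should treat the operator field as optional. A minimal sketch of a tolerant client, assuming a local webserver at localhost:8080 and basic-auth credentials (host and credentials are assumptions for illustration; the endpoint path mirrors the one exercised in the tests below):

    import requests

    # Host and auth are placeholders; adjust to your deployment.
    resp = requests.get(
        "http://localhost:8080/api/v1/dags/example_python_operator"
        "/dagRuns/TEST_DAG_RUN_ID/taskInstances/print_the_context",
        auth=("admin", "admin"),
    )
    ti = resp.json()

    # Task instances recorded by Airflow 1.10.x may carry no operator, so
    # the field can now legitimately be null rather than a schema error.
    operator = ti.get("operator") or "<unknown operator>"
    print(operator)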
---
 airflow/api_connexion/openapi/v1.yaml                        | 1 +
 tests/api_connexion/endpoints/test_task_instance_endpoint.py | 9 ++++++++-
 2 files changed, 9 insertions(+), 1 deletion(-)

diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 02a6690..182f356 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -2137,6 +2137,7 @@ components:
           type: integer
         operator:
           type: string
+          nullable: true
         queued_when:
           type: string
           nullable: true
diff --git a/tests/api_connexion/endpoints/test_task_instance_endpoint.py b/tests/api_connexion/endpoints/test_task_instance_endpoint.py
index 0b35ee3..b28330e 100644
--- a/tests/api_connexion/endpoints/test_task_instance_endpoint.py
+++ b/tests/api_connexion/endpoints/test_task_instance_endpoint.py
@@ -135,6 +135,13 @@ class TestTaskInstanceEndpoint:
 class TestGetTaskInstance(TestTaskInstanceEndpoint):
     def test_should_respond_200(self, session):
         self.create_task_instances(session)
+        # Update ti and set operator to None to
+        # test that operator field is nullable.
+        # This prevents issues when users upgrade to 2.0+
+        # from 1.10.x
+        # https://github.com/apache/airflow/issues/14421
+        session.query(TaskInstance).update({TaskInstance.operator: None}, synchronize_session='fetch')
+        session.commit()
         response = self.client.get(
             "/api/v1/dags/example_python_operator/dagRuns/TEST_DAG_RUN_ID/taskInstances/print_the_context",
             environ_overrides={"REMOTE_USER": "test"},
@@ -148,7 +155,7 @@ class TestGetTaskInstance(TestTaskInstanceEndpoint):
             "executor_config": "{}",
             "hostname": "",
             "max_tries": 0,
-            "operator": "PythonOperator",
+            "operator": None,
             "pid": 100,
             "pool": "default_pool",
             "pool_slots": 1,

[airflow] 01/38: Fix dag.clear() to set multiple dags to running when necessary (#15382)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 955de9bf895757f3d88a17542d67dee39ab40876
Author: yuqian90 <yu...@gmail.com>
AuthorDate: Sat May 29 23:01:39 2021 +0800

    Fix dag.clear() to set multiple dags to running when necessary (#15382)
    
    closes: #14260
    related: #9824
    
    When clearing tasks across DAGs using ExternalTaskMarker, the state of the external DagRun is not set back to running. As a result, cleared tasks in the external DAG will not automatically start if that DagRun is in a Failed or Succeeded state.
    
    Two changes are made to fix the issue:
    
    - clear_task_instances now sets the state of all affected DagRuns to dag_run_state.
    - The DagRun filter in clear_task_instances is fixed too; previously it assumed that the execution_dates for all dag_ids were the same, which is not always correct.
    
    test_external_task_marker_clear_activate is added to make sure the fix does the right thing.
    
    (cherry picked from commit 2bca8a5425c234b04fdf32d6c50ae3a91cd08262)
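
For callers of clear_task_instances, the migration is mechanical: replace the boolean activate_dag_runs flag with dag_run_state, as described in the UPDATING.md entry below. A minimal sketch of the new call signature, where dag, day_1 and day_2 are placeholders from the caller's own code:

    from airflow.models.taskinstance import clear_task_instances
    from airflow.utils.session import create_session
    from airflow.utils.state import State

    with create_session() as session:
        tis = dag.clear(get_tis=True, start_date=day_1, end_date=day_2)
        # Old: clear_task_instances(tis, session, activate_dag_runs=True, dag=dag)
        # New: pass the DagRun state explicitly, or False to leave DagRuns alone.
        clear_task_instances(list(tis), session, dag_run_state=State.RUNNING, dag=dag)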
---
 UPDATING.md                                        | 11 +++
 .../endpoints/task_instance_endpoint.py            | 12 +--
 airflow/models/dag.py                              | 21 ++---
 airflow/models/taskinstance.py                     | 19 +++--
 tests/sensors/test_external_task_sensor.py         | 96 ++++++++++++++++++++++
 5 files changed, 129 insertions(+), 30 deletions(-)

diff --git a/UPDATING.md b/UPDATING.md
index e7a9119..910b489 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -26,6 +26,7 @@ assists users migrating to a new version.
 <!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
 **Table of contents**
 
+- [Main](#main)
 - [Airflow 2.1.0](#airflow-210)
 - [Airflow 2.0.2](#airflow-202)
 - [Airflow 2.0.1](#airflow-201)
@@ -69,6 +70,16 @@ https://developers.google.com/style/inclusive-documentation
 
 -->
 
+## Main
+
+### `activate_dag_runs` argument of the function `clear_task_instances` is replaced with `dag_run_state`
+
+To achieve the previous default behaviour of `clear_task_instances` with `activate_dag_runs=True`, no change is needed. To achieve the previous behaviour of `activate_dag_runs=False`, pass `dag_run_state=False` instead.
+
+### `dag.set_dag_runs_state` is deprecated
+
+The method `set_dag_runs_state` is no longer needed after a bug fix in PR: [#15382](https://github.com/apache/airflow/pull/15382). This method is now deprecated and will be removed in a future version.
+
 ## Airflow 2.1.0
 
 ### New "deprecated_api" extra
diff --git a/airflow/api_connexion/endpoints/task_instance_endpoint.py b/airflow/api_connexion/endpoints/task_instance_endpoint.py
index b84c59a..418bded 100644
--- a/airflow/api_connexion/endpoints/task_instance_endpoint.py
+++ b/airflow/api_connexion/endpoints/task_instance_endpoint.py
@@ -251,18 +251,8 @@ def post_clear_task_instances(dag_id: str, session=None):
     task_instances = dag.clear(get_tis=True, **data)
     if not data["dry_run"]:
         clear_task_instances(
-            task_instances,
-            session,
-            dag=dag,
-            activate_dag_runs=False,  # We will set DagRun state later.
+            task_instances, session, dag=dag, dag_run_state=State.RUNNING if reset_dag_runs else False
         )
-        if reset_dag_runs:
-            dag.set_dag_runs_state(
-                session=session,
-                start_date=data["start_date"],
-                end_date=data["end_date"],
-                state=State.RUNNING,
-            )
     task_instances = task_instances.join(
         DR, and_(DR.dag_id == TI.dag_id, DR.execution_date == TI.execution_date)
     ).add_column(DR.run_id)
diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index c90fb4f..8e96554 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1123,6 +1123,11 @@ class DAG(LoggingMixin):
         end_date: Optional[datetime] = None,
         dag_ids: List[str] = None,
     ) -> None:
+        warnings.warn(
+            "This method is deprecated and will be removed in a future version.",
+            DeprecationWarning,
+            stacklevel=2,
+        )
         dag_ids = dag_ids or [self.dag_id]
         query = session.query(DagRun).filter(DagRun.dag_id.in_(dag_ids))
         if start_date:
@@ -1172,7 +1177,8 @@ class DAG(LoggingMixin):
         :type include_subdags: bool
         :param include_parentdag: Clear tasks in the parent dag of the subdag.
         :type include_parentdag: bool
-        :param dag_run_state: state to set DagRun to
+        :param dag_run_state: state to set DagRun to. If set to False, dagrun state will not
+            be changed.
         :param dry_run: Find the tasks to clear but don't clear them.
         :type dry_run: bool
         :param session: The sqlalchemy session to use
@@ -1193,20 +1199,17 @@ class DAG(LoggingMixin):
         """
         TI = TaskInstance
         tis = session.query(TI)
-        dag_ids = []
         if include_subdags:
             # Crafting the right filter for dag_id and task_ids combo
             conditions = []
             for dag in self.subdags + [self]:
                 conditions.append((TI.dag_id == dag.dag_id) & TI.task_id.in_(dag.task_ids))
-                dag_ids.append(dag.dag_id)
             tis = tis.filter(or_(*conditions))
         else:
             tis = session.query(TI).filter(TI.dag_id == self.dag_id)
             tis = tis.filter(TI.task_id.in_(self.task_ids))
 
         if include_parentdag and self.is_subdag and self.parent_dag is not None:
-            dag_ids.append(self.parent_dag.dag_id)
             p_dag = self.parent_dag.sub_dag(
                 task_ids_or_regex=r"^{}$".format(self.dag_id.split('.')[1]),
                 include_upstream=False,
@@ -1340,15 +1343,7 @@ class DAG(LoggingMixin):
                 tis,
                 session,
                 dag=self,
-                activate_dag_runs=False,  # We will set DagRun state later.
-            )
-
-            self.set_dag_runs_state(
-                session=session,
-                start_date=start_date,
-                end_date=end_date,
-                state=dag_run_state,
-                dag_ids=dag_ids,
+                dag_run_state=dag_run_state,
             )
         else:
             count = 0
diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index a7e94bd..ae7eeef 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -133,7 +133,7 @@ def set_error_file(error_file: str, error: Union[str, Exception]) -> None:
 def clear_task_instances(
     tis,
     session,
-    activate_dag_runs=True,
+    dag_run_state: str = State.RUNNING,
     dag=None,
 ):
     """
@@ -142,7 +142,8 @@ def clear_task_instances(
 
     :param tis: a list of task instances
     :param session: current session
-    :param activate_dag_runs: flag to check for active dag run
+    :param dag_run_state: state to set DagRun to. If set to False, dagrun state will not
+        be changed.
     :param dag: DAG object
     """
     job_ids = []
@@ -204,19 +205,25 @@ def clear_task_instances(
         for job in session.query(BaseJob).filter(BaseJob.id.in_(job_ids)).all():  # noqa
             job.state = State.SHUTDOWN
 
-    if activate_dag_runs and tis:
+    if dag_run_state is not False and tis:
         from airflow.models.dagrun import DagRun  # Avoid circular import
 
+        dates_by_dag_id = defaultdict(set)
+        for instance in tis:
+            dates_by_dag_id[instance.dag_id].add(instance.execution_date)
+
         drs = (
             session.query(DagRun)
             .filter(
-                DagRun.dag_id.in_({ti.dag_id for ti in tis}),
-                DagRun.execution_date.in_({ti.execution_date for ti in tis}),
+                or_(
+                    and_(DagRun.dag_id == dag_id, DagRun.execution_date.in_(dates))
+                    for dag_id, dates in dates_by_dag_id.items()
+                )
             )
             .all()
         )
         for dr in drs:
-            dr.state = State.RUNNING
+            dr.state = dag_run_state
             dr.start_date = timezone.utcnow()
 
 
diff --git a/tests/sensors/test_external_task_sensor.py b/tests/sensors/test_external_task_sensor.py
index 55080c9..19161a7 100644
--- a/tests/sensors/test_external_task_sensor.py
+++ b/tests/sensors/test_external_task_sensor.py
@@ -416,6 +416,53 @@ def dag_bag_ext():
     return dag_bag
 
 
+@pytest.fixture
+def dag_bag_parent_child():
+    """
+    Create a DagBag with two DAGs looking like this. task_1 of child_dag_1 on day 1 depends on
+    task_0 of parent_dag_0 on day 1. Therefore, when task_0 of parent_dag_0 on day 1 and day 2
+    are cleared, parent_dag_0 DagRuns need to be set to running on both days, but child_dag_1
+    only needs to be set to running on day 1.
+
+                   day 1   day 2
+
+     parent_dag_0  task_0  task_0
+                     |
+                     |
+                     v
+     child_dag_1   task_1  task_1
+
+    """
+    dag_bag = DagBag(dag_folder=DEV_NULL, include_examples=False)
+
+    day_1 = DEFAULT_DATE
+
+    dag_0 = DAG("parent_dag_0", start_date=day_1, schedule_interval=None)
+    task_0 = ExternalTaskMarker(
+        task_id="task_0",
+        external_dag_id="child_dag_1",
+        external_task_id="task_1",
+        execution_date=day_1.isoformat(),
+        recursion_depth=3,
+        dag=dag_0,
+    )
+
+    dag_1 = DAG("child_dag_1", start_date=day_1, schedule_interval=None)
+    _ = ExternalTaskSensor(
+        task_id="task_1",
+        external_dag_id=dag_0.dag_id,
+        external_task_id=task_0.task_id,
+        execution_date_fn=lambda execution_date: day_1 if execution_date == day_1 else [],
+        mode='reschedule',
+        dag=dag_1,
+    )
+
+    for dag in [dag_0, dag_1]:
+        dag_bag.bag_dag(dag=dag, root_dag=dag)
+
+    return dag_bag
+
+
 def run_tasks(dag_bag, execution_date=DEFAULT_DATE):
     """
     Run all tasks in the DAGs in the given dag_bag. Return the TaskInstance objects as a dict
@@ -464,6 +511,55 @@ def test_external_task_marker_transitive(dag_bag_ext):
     assert_ti_state_equal(ti_b_3, State.NONE)
 
 
+# pylint: disable=redefined-outer-name
+def test_external_task_marker_clear_activate(dag_bag_parent_child):
+    """
+    Test clearing tasks across DAGs and make sure the right DagRuns are activated.
+    """
+    from airflow.utils.session import create_session
+    from airflow.utils.types import DagRunType
+
+    dag_bag = dag_bag_parent_child
+    day_1 = DEFAULT_DATE
+    day_2 = DEFAULT_DATE + timedelta(days=1)
+
+    run_tasks(dag_bag, execution_date=day_1)
+    run_tasks(dag_bag, execution_date=day_2)
+
+    with create_session() as session:
+        for dag in dag_bag.dags.values():
+            for execution_date in [day_1, day_2]:
+                dagrun = dag.create_dagrun(
+                    State.RUNNING, execution_date, run_type=DagRunType.MANUAL, session=session
+                )
+                dagrun.set_state(State.SUCCESS)
+                session.add(dagrun)
+
+        session.commit()
+
+    # Assert that dagruns of all the affected dags are set to SUCCESS before tasks are cleared.
+    for dag in dag_bag.dags.values():
+        for execution_date in [day_1, day_2]:
+            dagrun = dag.get_dagrun(execution_date=execution_date)
+            assert dagrun.state == State.SUCCESS
+
+    dag_0 = dag_bag.get_dag("parent_dag_0")
+    task_0 = dag_0.get_task("task_0")
+    clear_tasks(dag_bag, dag_0, task_0, start_date=day_1, end_date=day_2)
+
+    # Assert that dagruns of all the affected dags are set to RUNNING after tasks are cleared.
+    # Unaffected dagruns should be left as SUCCESS.
+    dagrun_0_1 = dag_bag.get_dag('parent_dag_0').get_dagrun(execution_date=day_1)
+    dagrun_0_2 = dag_bag.get_dag('parent_dag_0').get_dagrun(execution_date=day_2)
+    dagrun_1_1 = dag_bag.get_dag('child_dag_1').get_dagrun(execution_date=day_1)
+    dagrun_1_2 = dag_bag.get_dag('child_dag_1').get_dagrun(execution_date=day_2)
+
+    assert dagrun_0_1.state == State.RUNNING
+    assert dagrun_0_2.state == State.RUNNING
+    assert dagrun_1_1.state == State.RUNNING
+    assert dagrun_1_2.state == State.SUCCESS
+
+
 def test_external_task_marker_future(dag_bag_ext):
     """
     Test clearing tasks with no end_date. This is the case when users clear tasks with

[airflow] 17/38: add num_runs query param for tree refresh (#16437)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7b9dd0b3a9524563713ed491114c9e4322f50af8
Author: Brent Bovenzi <br...@gmail.com>
AuthorDate: Mon Jun 14 15:02:35 2021 -0500

    add num_runs query param for tree refresh (#16437)
    
    - add `num_runs` as a meta tag and include it in the tree refresh request
    
    (cherry picked from commit 6087a09f89c7da4aac47eab3756a7fe24e3b602b)
---
 airflow/www/static/js/tree.js           | 3 ++-
 airflow/www/templates/airflow/tree.html | 5 +++++
 2 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/airflow/www/static/js/tree.js b/airflow/www/static/js/tree.js
index cc8276d..882d7ce5 100644
--- a/airflow/www/static/js/tree.js
+++ b/airflow/www/static/js/tree.js
@@ -28,6 +28,7 @@ import getMetaValue from './meta_value';
 // dagId comes from dag.html
 const dagId = getMetaValue('dag_id');
 const treeDataUrl = getMetaValue('tree_data');
+const numRuns = getMetaValue('num_runs');
 
 function toDateString(ts) {
   const dt = new Date(ts * 1000);
@@ -412,7 +413,7 @@ document.addEventListener('DOMContentLoaded', () => {
 
   function handleRefresh() {
     $('#loading-dots').css('display', 'inline-block');
-    $.get(`${treeDataUrl}?dag_id=${dagId}`)
+    $.get(`${treeDataUrl}?dag_id=${dagId}&num_runs=${numRuns}`)
       .done(
         (runs) => {
           const newData = {
diff --git a/airflow/www/templates/airflow/tree.html b/airflow/www/templates/airflow/tree.html
index acc68dd..34c4ae7 100644
--- a/airflow/www/templates/airflow/tree.html
+++ b/airflow/www/templates/airflow/tree.html
@@ -33,6 +33,11 @@
   </style>
 {% endblock %}
 
+{% block head_meta %}
+  {{ super() }}
+  <meta name="num_runs" content="{{ num_runs }}">
+{% endblock %}
+
 {% block content %}
   {{ super() }}
   <div class="row dag-view-tools">

[airflow] 24/38: Make REST API List DAGs endpoint consistent with UI/CLI behaviour (#16318)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit dbf306476e066205a28689067ac3508733d8d188
Author: jpyen <29...@users.noreply.github.com>
AuthorDate: Thu Jun 10 11:19:50 2021 -0700

    Make REST API List DAGs endpoint consistent with UI/CLI behaviour (#16318)
    
    Co-authored-by: jpyen <>
    (cherry picked from commit 9ba796ef40fe833aba58f5aa13a63587106d8ffd)
---
 airflow/api_connexion/endpoints/dag_endpoint.py    | 14 ++++++++++----
 tests/api_connexion/endpoints/test_dag_endpoint.py | 19 ++++++++++++++++++-
 2 files changed, 28 insertions(+), 5 deletions(-)

diff --git a/airflow/api_connexion/endpoints/dag_endpoint.py b/airflow/api_connexion/endpoints/dag_endpoint.py
index 5bd6f88..7b19aed 100644
--- a/airflow/api_connexion/endpoints/dag_endpoint.py
+++ b/airflow/api_connexion/endpoints/dag_endpoint.py
@@ -59,11 +59,17 @@ def get_dag_details(dag_id):
 
 @security.requires_access([(permissions.ACTION_CAN_READ, permissions.RESOURCE_DAG)])
 @format_parameters({'limit': check_limit})
-def get_dags(limit, offset=0):
+@provide_session
+def get_dags(limit, session, offset=0):
     """Get all DAGs."""
-    readable_dags = current_app.appbuilder.sm.get_readable_dags(g.user)
-    dags = readable_dags.order_by(DagModel.dag_id).offset(offset).limit(limit).all()
-    total_entries = readable_dags.count()
+    dags_query = session.query(DagModel).filter(~DagModel.is_subdag, DagModel.is_active)
+
+    readable_dags = current_app.appbuilder.sm.get_accessible_dag_ids(g.user)
+
+    dags_query = dags_query.filter(DagModel.dag_id.in_(readable_dags))
+    total_entries = len(dags_query.all())
+
+    dags = dags_query.order_by(DagModel.dag_id).offset(offset).limit(limit).all()
 
     return dags_collection_schema.dump(DAGCollection(dags=dags, total_entries=total_entries))
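A minimal usage sketch of the changed endpoint; the base URL and credentials are assumptions, and basic auth only works if the corresponding API auth backend is enabled:

    # Hypothetical call to the fixed endpoint; deactivated and subdag DAGs
    # are no longer returned, and total_entries counts only readable DAGs.
    import requests

    resp = requests.get(
        "http://localhost:8080/api/v1/dags",
        params={"limit": 100, "offset": 0},
        auth=("admin", "admin"),  # assumed basic-auth backend
    )
    payload = resp.json()
    print(payload["total_entries"], [d["dag_id"] for d in payload["dags"]])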
 
diff --git a/tests/api_connexion/endpoints/test_dag_endpoint.py b/tests/api_connexion/endpoints/test_dag_endpoint.py
index 031290c..f3f7501 100644
--- a/tests/api_connexion/endpoints/test_dag_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_endpoint.py
@@ -121,9 +121,20 @@ class TestDagEndpoint:
                 dag_id=f"TEST_DAG_{num}",
                 fileloc=f"/tmp/dag_{num}.py",
                 schedule_interval="2 2 * * *",
+                is_active=True,
             )
             session.add(dag_model)
 
+    @provide_session
+    def _create_deactivated_dag(self, session=None):
+        dag_model = DagModel(
+            dag_id="TEST_DAG_DELETED_1",
+            fileloc="/tmp/dag_del_1.py",
+            schedule_interval="2 2 * * *",
+            is_active=False,
+        )
+        session.add(dag_model)
+
 
 class TestGetDag(TestDagEndpoint):
     @conf_vars({("webserver", "secret_key"): "mysecret"})
@@ -385,12 +396,18 @@ class TestGetDagDetails(TestDagEndpoint):
 
 
 class TestGetDags(TestDagEndpoint):
-    def test_should_respond_200(self):
+    @provide_session
+    def test_should_respond_200(self, session):
         self._create_dag_models(2)
+        self._create_deactivated_dag()
+
+        dags_query = session.query(DagModel).filter(~DagModel.is_subdag)
+        assert len(dags_query.all()) == 3
 
         response = self.client.get("api/v1/dags", environ_overrides={'REMOTE_USER': "test"})
         file_token = SERIALIZER.dumps("/tmp/dag_1.py")
         file_token2 = SERIALIZER.dumps("/tmp/dag_2.py")
+
         assert response.status_code == 200
         assert {
             "dags": [

[airflow] 11/38: set max tree width to 1200px (#16067)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit eefa56313a795e2c56b249d97178bc15406cf1c3
Author: Brent Bovenzi <br...@gmail.com>
AuthorDate: Tue May 25 16:20:31 2021 -0400

    set max tree width to 1200px (#16067)
    
    the total width of the tree view will depend on the window size as before, but max out at 1200px
    
    (cherry picked from commit f2aa9b58cb012a3bc347f43baeaa41ecdece4cbf)
---
 airflow/www/static/js/tree.js | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/airflow/www/static/js/tree.js b/airflow/www/static/js/tree.js
index 04702a7..cc8276d 100644
--- a/airflow/www/static/js/tree.js
+++ b/airflow/www/static/js/tree.js
@@ -70,7 +70,9 @@ document.addEventListener('DOMContentLoaded', () => {
     if (node.depth > treeDepth) treeDepth = node.depth;
   });
   treeDepth += 1;
-  const squareX = window.innerWidth - (data.instances.length * squareSize) - (treeDepth * 50);
+
+  const innerWidth = window.innerWidth > 1200 ? 1200 : window.innerWidth;
+  const squareX = innerWidth - (data.instances.length * squareSize) - (treeDepth * 50);
 
   const squareSpacing = 2;
   const margin = {

[airflow] 26/38: Adding `only_active` parameter to /dags endpoint (#14306)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e22b84dbb3214038b243968a5f6fba5cd646b85d
Author: Sam Wheating <sa...@shopify.com>
AuthorDate: Mon Jun 14 14:49:11 2021 -0700

    Adding `only_active` parameter to /dags endpoint (#14306)
    
    I noticed that the `/dags` endpoint returns information on all entries in the DAG table, which is often many more DAGs than are active and likely includes DAGs that have been removed from Airflow.
    
    This PR adds a boolean `only_active` parameter to the `/dags` endpoint which will then only return active DAGs.
    
    I also noticed that this endpoint was hitting a deprecated codepath by dumping a `DAG` object to the DAGDetailSchema, thus calling `DAG.is_paused()`. I have updated the schema to call the correct function (`DAG.get_is_paused`), since the deprecated functions may be removed some day.
    
    (cherry picked from commit 9526a249cceb170ddfa68530fdcc786ec3e9e5c2)
---
 airflow/api_connexion/endpoints/dag_endpoint.py    |  7 +-
 airflow/api_connexion/openapi/v1.yaml              | 11 +++
 airflow/api_connexion/schemas/dag_schema.py        | 13 ++++
 airflow/models/dag.py                              |  6 ++
 tests/api_connexion/endpoints/test_dag_endpoint.py | 84 ++++++++++++++++++++++
 tests/api_connexion/schemas/test_dag_schema.py     |  5 ++
 6 files changed, 124 insertions(+), 2 deletions(-)

diff --git a/airflow/api_connexion/endpoints/dag_endpoint.py b/airflow/api_connexion/endpoints/dag_endpoint.py
index 7b19aed..ea4a1e3 100644
--- a/airflow/api_connexion/endpoints/dag_endpoint.py
+++ b/airflow/api_connexion/endpoints/dag_endpoint.py
@@ -60,9 +60,12 @@ def get_dag_details(dag_id):
 @security.requires_access([(permissions.ACTION_CAN_READ, permissions.RESOURCE_DAG)])
 @format_parameters({'limit': check_limit})
 @provide_session
-def get_dags(limit, session, offset=0):
+def get_dags(limit, session, offset=0, only_active=True):
     """Get all DAGs."""
-    dags_query = session.query(DagModel).filter(~DagModel.is_subdag, DagModel.is_active)
+    if only_active:
+        dags_query = session.query(DagModel).filter(~DagModel.is_subdag, DagModel.is_active)
+    else:
+        dags_query = session.query(DagModel).filter(~DagModel.is_subdag)
 
     readable_dags = current_app.appbuilder.sm.get_accessible_dag_ids(g.user)
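Extending the request sketch from the earlier list-DAGs commit, a hedged example of opting back in to deactivated DAGs; the URL and credentials remain assumptions:

    # only_active defaults to true; passing false also returns DAGs that
    # have been removed from disk (is_active == False).
    import requests

    resp = requests.get(
        "http://localhost:8080/api/v1/dags",
        params={"only_active": "false"},
        auth=("admin", "admin"),
    )
    for dag in resp.json()["dags"]:
        print(dag["dag_id"], dag["is_active"])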
 
diff --git a/airflow/api_connexion/openapi/v1.yaml b/airflow/api_connexion/openapi/v1.yaml
index 72c46d6..02a6690 100644
--- a/airflow/api_connexion/openapi/v1.yaml
+++ b/airflow/api_connexion/openapi/v1.yaml
@@ -379,6 +379,13 @@ paths:
         - $ref: '#/components/parameters/PageLimit'
         - $ref: '#/components/parameters/PageOffset'
         - $ref: '#/components/parameters/OrderBy'
+        - name: only_active
+          in: query
+          schema:
+            type: boolean
+            default: true
+          required: false
+          description: Only return active DAGs.
       responses:
         '200':
           description: Success.
@@ -1756,6 +1763,10 @@ components:
           type: boolean
           nullable: true
           description: Whether the DAG is paused.
+        is_active:
+          type: boolean
+          nullable: true
+          description: Whether the DAG is currently seen by the scheduler(s).
         is_subdag:
           description: Whether the DAG is SubDAG.
           type: boolean
diff --git a/airflow/api_connexion/schemas/dag_schema.py b/airflow/api_connexion/schemas/dag_schema.py
index aabd215..bc5a9ca 100644
--- a/airflow/api_connexion/schemas/dag_schema.py
+++ b/airflow/api_connexion/schemas/dag_schema.py
@@ -49,6 +49,7 @@ class DAGSchema(SQLAlchemySchema):
     dag_id = auto_field(dump_only=True)
     root_dag_id = auto_field(dump_only=True)
     is_paused = auto_field()
+    is_active = auto_field(dump_only=True)
     is_subdag = auto_field(dump_only=True)
     fileloc = auto_field(dump_only=True)
     file_token = fields.Method("get_token", dump_only=True)
@@ -85,6 +86,8 @@ class DAGDetailSchema(DAGSchema):
     default_view = fields.String()
     params = fields.Dict()
     tags = fields.Method("get_tags", dump_only=True)
+    is_paused = fields.Method("get_is_paused", dump_only=True)
+    is_active = fields.Method("get_is_active", dump_only=True)
 
     @staticmethod
     def get_tags(obj: DAG):
@@ -101,6 +104,16 @@ class DAGDetailSchema(DAGSchema):
             return []
         return obj.owner.split(",")
 
+    @staticmethod
+    def get_is_paused(obj: DAG):
+        """Checks entry in DAG table to see if this DAG is paused"""
+        return obj.get_is_paused()
+
+    @staticmethod
+    def get_is_active(obj: DAG):
+        """Checks entry in DAG table to see if this DAG is active"""
+        return obj.get_is_active()
+
 
 class DAGCollection(NamedTuple):
     """List of DAGs with metadata"""
diff --git a/airflow/models/dag.py b/airflow/models/dag.py
index 8e96554..616ae6f 100644
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -802,6 +802,12 @@ class DAG(LoggingMixin):
         return self.get_concurrency_reached()
 
     @provide_session
+    def get_is_active(self, session=None) -> Optional[bool]:
+        """Returns a boolean indicating whether this DAG is active"""
+        qry = session.query(DagModel).filter(DagModel.dag_id == self.dag_id)
+        return qry.value(DagModel.is_active)
+
+    @provide_session
     def get_is_paused(self, session=None) -> Optional[None]:
         """Returns a boolean indicating whether this DAG is paused"""
         qry = session.query(DagModel).filter(DagModel.dag_id == self.dag_id)
diff --git a/tests/api_connexion/endpoints/test_dag_endpoint.py b/tests/api_connexion/endpoints/test_dag_endpoint.py
index f3f7501..e11ad72 100644
--- a/tests/api_connexion/endpoints/test_dag_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_endpoint.py
@@ -148,6 +148,7 @@ class TestGetDag(TestDagEndpoint):
             "fileloc": "/tmp/dag_1.py",
             "file_token": 'Ii90bXAvZGFnXzEucHki.EnmIdPaUPo26lHQClbWMbDFD1Pk',
             "is_paused": False,
+            "is_active": True,
             "is_subdag": False,
             "owners": [],
             "root_dag_id": None,
@@ -172,6 +173,7 @@ class TestGetDag(TestDagEndpoint):
             "fileloc": "/tmp/dag_1.py",
             "file_token": 'Ii90bXAvZGFnXzEucHki.EnmIdPaUPo26lHQClbWMbDFD1Pk',
             "is_paused": False,
+            "is_active": False,
             "is_subdag": False,
             "owners": [],
             "root_dag_id": None,
@@ -228,6 +230,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "fileloc": __file__,
             "file_token": FILE_TOKEN,
             "is_paused": None,
+            "is_active": None,
             "is_subdag": False,
             "orientation": "LR",
             "owners": ['airflow'],
@@ -260,6 +263,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "fileloc": __file__,
             "file_token": FILE_TOKEN,
             "is_paused": None,
+            "is_active": None,
             "is_subdag": False,
             "orientation": "LR",
             "owners": ['airflow'],
@@ -292,6 +296,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "fileloc": __file__,
             "file_token": FILE_TOKEN,
             "is_paused": None,
+            "is_active": None,
             "is_subdag": False,
             "orientation": "LR",
             "owners": ['airflow'],
@@ -328,6 +333,7 @@ class TestGetDagDetails(TestDagEndpoint):
             "fileloc": __file__,
             "file_token": FILE_TOKEN,
             "is_paused": None,
+            "is_active": None,
             "is_subdag": False,
             "orientation": "LR",
             "owners": ['airflow'],
@@ -366,6 +372,7 @@ class TestGetDagDetails(TestDagEndpoint):
             'fileloc': __file__,
             "file_token": FILE_TOKEN,
             'is_paused': None,
+            "is_active": None,
             'is_subdag': False,
             'orientation': 'LR',
             'owners': ['airflow'],
@@ -417,6 +424,7 @@ class TestGetDags(TestDagEndpoint):
                     "fileloc": "/tmp/dag_1.py",
                     "file_token": file_token,
                     "is_paused": False,
+                    "is_active": True,
                     "is_subdag": False,
                     "owners": [],
                     "root_dag_id": None,
@@ -432,6 +440,80 @@ class TestGetDags(TestDagEndpoint):
                     "fileloc": "/tmp/dag_2.py",
                     "file_token": file_token2,
                     "is_paused": False,
+                    "is_active": True,
+                    "is_subdag": False,
+                    "owners": [],
+                    "root_dag_id": None,
+                    "schedule_interval": {
+                        "__type": "CronExpression",
+                        "value": "2 2 * * *",
+                    },
+                    "tags": [],
+                },
+            ],
+            "total_entries": 2,
+        } == response.json
+
+    def test_only_active_true_returns_active_dags(self):
+        self._create_dag_models(1)
+        self._create_deactivated_dag()
+        response = self.client.get("api/v1/dags?only_active=True", environ_overrides={'REMOTE_USER': "test"})
+        file_token = SERIALIZER.dumps("/tmp/dag_1.py")
+        assert response.status_code == 200
+        assert {
+            "dags": [
+                {
+                    "dag_id": "TEST_DAG_1",
+                    "description": None,
+                    "fileloc": "/tmp/dag_1.py",
+                    "file_token": file_token,
+                    "is_paused": False,
+                    "is_active": True,
+                    "is_subdag": False,
+                    "owners": [],
+                    "root_dag_id": None,
+                    "schedule_interval": {
+                        "__type": "CronExpression",
+                        "value": "2 2 * * *",
+                    },
+                    "tags": [],
+                }
+            ],
+            "total_entries": 1,
+        } == response.json
+
+    def test_only_active_false_returns_all_dags(self):
+        self._create_dag_models(1)
+        self._create_deactivated_dag()
+        response = self.client.get("api/v1/dags?only_active=False", environ_overrides={'REMOTE_USER': "test"})
+        file_token = SERIALIZER.dumps("/tmp/dag_1.py")
+        file_token_2 = SERIALIZER.dumps("/tmp/dag_del_1.py")
+        assert response.status_code == 200
+        assert {
+            "dags": [
+                {
+                    "dag_id": "TEST_DAG_1",
+                    "description": None,
+                    "fileloc": "/tmp/dag_1.py",
+                    "file_token": file_token,
+                    "is_paused": False,
+                    "is_active": True,
+                    "is_subdag": False,
+                    "owners": [],
+                    "root_dag_id": None,
+                    "schedule_interval": {
+                        "__type": "CronExpression",
+                        "value": "2 2 * * *",
+                    },
+                    "tags": [],
+                },
+                {
+                    "dag_id": "TEST_DAG_DELETED_1",
+                    "description": None,
+                    "fileloc": "/tmp/dag_del_1.py",
+                    "file_token": file_token_2,
+                    "is_paused": False,
+                    "is_active": False,
                     "is_subdag": False,
                     "owners": [],
                     "root_dag_id": None,
@@ -538,6 +620,7 @@ class TestPatchDag(TestDagEndpoint):
             "fileloc": "/tmp/dag_1.py",
             "file_token": self.file_token,
             "is_paused": False,
+            "is_active": False,
             "is_subdag": False,
             "owners": [],
             "root_dag_id": None,
@@ -619,6 +702,7 @@ class TestPatchDag(TestDagEndpoint):
             "fileloc": "/tmp/dag_1.py",
             "file_token": self.file_token,
             "is_paused": False,
+            "is_active": False,
             "is_subdag": False,
             "owners": [],
             "root_dag_id": None,
diff --git a/tests/api_connexion/schemas/test_dag_schema.py b/tests/api_connexion/schemas/test_dag_schema.py
index 96aba1f..4fd8a52 100644
--- a/tests/api_connexion/schemas/test_dag_schema.py
+++ b/tests/api_connexion/schemas/test_dag_schema.py
@@ -39,6 +39,7 @@ class TestDagSchema(unittest.TestCase):
             dag_id="test_dag_id",
             root_dag_id="test_root_dag_id",
             is_paused=True,
+            is_active=True,
             is_subdag=False,
             fileloc="/root/airflow/dags/my_dag.py",
             owners="airflow1,airflow2",
@@ -53,6 +54,7 @@ class TestDagSchema(unittest.TestCase):
             "fileloc": "/root/airflow/dags/my_dag.py",
             "file_token": SERIALIZER.dumps("/root/airflow/dags/my_dag.py"),
             "is_paused": True,
+            "is_active": True,
             "is_subdag": False,
             "owners": ["airflow1", "airflow2"],
             "root_dag_id": "test_root_dag_id",
@@ -76,6 +78,7 @@ class TestDAGCollectionSchema(unittest.TestCase):
                     "file_token": SERIALIZER.dumps("/tmp/a.py"),
                     "is_paused": None,
                     "is_subdag": None,
+                    "is_active": None,
                     "owners": [],
                     "root_dag_id": None,
                     "schedule_interval": None,
@@ -86,6 +89,7 @@ class TestDAGCollectionSchema(unittest.TestCase):
                     "description": None,
                     "fileloc": "/tmp/a.py",
                     "file_token": SERIALIZER.dumps("/tmp/a.py"),
+                    "is_active": None,
                     "is_paused": None,
                     "is_subdag": None,
                     "owners": [],
@@ -120,6 +124,7 @@ class TestDAGDetailSchema:
             'doc_md': 'docs',
             'fileloc': __file__,
             "file_token": SERIALIZER.dumps(__file__),
+            "is_active": None,
             'is_paused': None,
             'is_subdag': False,
             'orientation': 'LR',

[airflow] 03/38: Restores apply_defaults import in base_sensor_operator (#16040)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 2cd66a8e0f6ff8f9db97994ef419a65d18f3966e
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Thu May 27 07:08:34 2021 +0200

    Restores apply_defaults import in base_sensor_operator (#16040)
    
    The GCSToLocalFilesystemOperator in Google Provider <=3.0.0 had the wrong
    import for apply_defaults. It used
    
    `from airflow.sensors.base_sensor_operator import apply_defaults`
    
    instead of
    
    `from airflow.utils.decorators import apply_defaults`
    
    When we removed apply_defaults in #15667, the base_sensor_operator
    import was removed as well, which made the GCSToLocalFilesystemOperator
    stop working in 2.1.0
    
    Fixes: #16035
    (cherry picked from commit 0f8f66eb6bb5fe7f91ecfaa2e93d4c3409813b61)
---
 airflow/sensors/base.py | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/airflow/sensors/base.py b/airflow/sensors/base.py
index 880a4ef..24e3699 100644
--- a/airflow/sensors/base.py
+++ b/airflow/sensors/base.py
@@ -36,6 +36,11 @@ from airflow.models.taskreschedule import TaskReschedule
 from airflow.ti_deps.deps.ready_to_reschedule import ReadyToRescheduleDep
 from airflow.utils import timezone
 
+# We need to keep the import here because the GCSToLocalFilesystemOperator released
+# in Google Provider before 3.0.0 imported apply_defaults from here.
+# See https://github.com/apache/airflow/issues/16035
+from airflow.utils.decorators import apply_defaults  # pylint: disable=unused-import
+
 
 class BaseSensorOperator(BaseOperator, SkipMixin):
     """

[airflow] 34/38: Fix DAG run state not updated while DAG is paused (#16343)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 446e66b052a116f8371be331624c1e1b03299380
Author: Ephraim Anierobi <sp...@gmail.com>
AuthorDate: Fri Jun 18 00:29:00 2021 +0100

    Fix DAG run state not updated while DAG is paused (#16343)
    
    The state of a DAG run does not update while the DAG is paused.
    Tasks kicked off before the DAG was paused continue to run, eventually
    finish, and are marked correctly, but the DAG run state stays in the
    running state until the DAG is unpaused.
    
    This change fixes it by running a check on task exit that updates the
    state of the DagRun (if possible) when the task finished the DagRun
    while the DAG was paused.
    
    Co-authored-by: Ash Berlin-Taylor <as...@firemirror.com>
    (cherry picked from commit 3834df6ade22b33addd47e3ab2165a0b282926fa)
---
 airflow/jobs/local_task_job.py    | 15 ++++++++++++++
 tests/jobs/test_local_task_job.py | 42 +++++++++++++++++++++++++++++++++++++--
 2 files changed, 55 insertions(+), 2 deletions(-)

diff --git a/airflow/jobs/local_task_job.py b/airflow/jobs/local_task_job.py
index 9e68450..efd84d6 100644
--- a/airflow/jobs/local_task_job.py
+++ b/airflow/jobs/local_task_job.py
@@ -160,6 +160,8 @@ class LocalTaskJob(BaseJob):
         if self.task_instance.state != State.SUCCESS:
             error = self.task_runner.deserialize_run_error()
         self.task_instance._run_finished_callback(error=error)  # pylint: disable=protected-access
+        if not self.task_instance.test_mode:
+            self._update_dagrun_state_for_paused_dag()
 
     def on_kill(self):
         self.task_runner.terminate()
@@ -206,3 +208,16 @@ class LocalTaskJob(BaseJob):
                 error = self.task_runner.deserialize_run_error() or "task marked as failed externally"
             ti._run_finished_callback(error=error)  # pylint: disable=protected-access
             self.terminating = True
+
+    @provide_session
+    def _update_dagrun_state_for_paused_dag(self, session=None):
+        """
+        Checks for paused DAGs with DagRuns in the running state and
+        updates the DagRun state if possible
+        """
+        dag = self.task_instance.task.dag
+        if dag.get_is_paused():
+            dag_run = self.task_instance.get_dagrun(session=session)
+            if dag_run:
+                dag_run.dag = dag
+                dag_run.update_state(session=session, execute_callbacks=True)
diff --git a/tests/jobs/test_local_task_job.py b/tests/jobs/test_local_task_job.py
index 9047f8a..82d85f6 100644
--- a/tests/jobs/test_local_task_job.py
+++ b/tests/jobs/test_local_task_job.py
@@ -44,6 +44,7 @@ from airflow.utils.net import get_hostname
 from airflow.utils.session import create_session
 from airflow.utils.state import State
 from airflow.utils.timeout import timeout
+from airflow.utils.types import DagRunType
 from tests.test_utils.asserts import assert_queries_count
 from tests.test_utils.db import clear_db_jobs, clear_db_runs
 from tests.test_utils.mock_executor import MockExecutor
@@ -571,6 +572,43 @@ class TestLocalTaskJob(unittest.TestCase):
         assert task_terminated_externally.value == 1
         assert not process.is_alive()
 
+    def test_task_exit_should_update_state_of_finished_dagruns_with_dag_paused(self):
+        """Test that with DAG paused, DagRun state will update when the tasks finishes the run"""
+        dag = DAG(dag_id='test_dags', start_date=DEFAULT_DATE)
+        op1 = PythonOperator(task_id='dummy', dag=dag, owner='airflow', python_callable=lambda: True)
+
+        session = settings.Session()
+        orm_dag = DagModel(
+            dag_id=dag.dag_id,
+            has_task_concurrency_limits=False,
+            next_dagrun=dag.start_date,
+            next_dagrun_create_after=dag.following_schedule(DEFAULT_DATE),
+            is_active=True,
+            is_paused=True,
+        )
+        session.add(orm_dag)
+        session.flush()
+        # Write Dag to DB
+        dagbag = DagBag(dag_folder="/dev/null", include_examples=False, read_dags_from_db=False)
+        dagbag.bag_dag(dag, root_dag=dag)
+        dagbag.sync_to_db()
+
+        dr = dag.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            state=State.RUNNING,
+            execution_date=DEFAULT_DATE,
+            start_date=DEFAULT_DATE,
+            session=session,
+        )
+        assert dr.state == State.RUNNING
+        ti = TaskInstance(op1, dr.execution_date)
+        job1 = LocalTaskJob(task_instance=ti, ignore_ti_state=True, executor=SequentialExecutor())
+        job1.task_runner = StandardTaskRunner(job1)
+        job1.run()
+        session.add(dr)
+        session.refresh(dr)
+        assert dr.state == State.SUCCESS
+
 
 @pytest.fixture()
 def clean_db_helper():
@@ -589,12 +627,12 @@ class TestLocalTaskJobPerformance:
         task = DummyOperator(task_id='test_state_succeeded1', dag=dag)
 
         dag.clear()
-        dag.create_dagrun(run_id=unique_prefix, state=State.NONE)
+        dag.create_dagrun(run_id=unique_prefix, execution_date=DEFAULT_DATE, state=State.NONE)
 
         ti = TaskInstance(task=task, execution_date=DEFAULT_DATE)
 
         mock_get_task_runner.return_value.return_code.side_effects = return_codes
 
         job = LocalTaskJob(task_instance=ti, executor=MockExecutor())
-        with assert_queries_count(15):
+        with assert_queries_count(16):
             job.run()

[airflow] 25/38: Don't show stale Serialized DAGs if they are deleted in DB (#16368)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5f478eca8c1b804a630f8f2eb71b37c0e4b27445
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Fri Jun 11 21:29:56 2021 +0100

    Don't show stale Serialized DAGs if they are deleted in DB (#16368)
    
    Currently, `DagBag.get_dag()` returns the DAG even if it no longer
    exists in the `serialized_dag` table.
    
    This PR changes that logic to also remove the DAG from the local cache
    when `DagBag.get_dag()` is called. This happens after
    `min_serialized_dag_fetch_secs`.
    
    (cherry picked from commit e3b3c1fd1cf61b5d1bbe7aef11ddc85b9a7aa171)
---
 airflow/models/dagbag.py         | 11 ++++++++++-
 airflow/models/serialized_dag.py |  4 ++--
 tests/models/test_dagbag.py      | 31 ++++++++++++++++++++++++++++++-
 3 files changed, 42 insertions(+), 4 deletions(-)

diff --git a/airflow/models/dagbag.py b/airflow/models/dagbag.py
index b78463b..be2701b 100644
--- a/airflow/models/dagbag.py
+++ b/airflow/models/dagbag.py
@@ -190,6 +190,8 @@ class DagBag(LoggingMixin):
             # 1. if time has come to check if DAG is updated (controlled by min_serialized_dag_fetch_secs)
             # 2. check the last_updated column in SerializedDag table to see if Serialized DAG is updated
             # 3. if (2) is yes, fetch the Serialized DAG.
+            # 4. if (2) returns None (i.e. Serialized DAG is deleted), remove dag from dagbag
+            # if it exists and return None.
             min_serialized_dag_fetch_secs = timedelta(seconds=settings.MIN_SERIALIZED_DAG_FETCH_INTERVAL)
             if (
                 dag_id in self.dags_last_fetched
@@ -199,7 +201,14 @@ class DagBag(LoggingMixin):
                     dag_id=dag_id,
                     session=session,
                 )
-                if sd_last_updated_datetime and sd_last_updated_datetime > self.dags_last_fetched[dag_id]:
+                if not sd_last_updated_datetime:
+                    self.log.warning("Serialized DAG %s no longer exists", dag_id)
+                    del self.dags[dag_id]
+                    del self.dags_last_fetched[dag_id]
+                    del self.dags_hash[dag_id]
+                    return None
+
+                if sd_last_updated_datetime > self.dags_last_fetched[dag_id]:
                     self._add_dag_from_db(dag_id=dag_id, session=session)
 
             return self.dags.get(dag_id)
diff --git a/airflow/models/serialized_dag.py b/airflow/models/serialized_dag.py
index 81448b6..4e8ebc4 100644
--- a/airflow/models/serialized_dag.py
+++ b/airflow/models/serialized_dag.py
@@ -261,7 +261,7 @@ class SerializedDagModel(Base):
 
     @classmethod
     @provide_session
-    def get_last_updated_datetime(cls, dag_id: str, session: Session = None) -> datetime:
+    def get_last_updated_datetime(cls, dag_id: str, session: Session = None) -> Optional[datetime]:
         """
         Get the date when the Serialized DAG associated to DAG was last updated
         in serialized_dag table
@@ -295,7 +295,7 @@ class SerializedDagModel(Base):
         :param session: ORM Session
         :type session: Session
         :return: DAG Hash
-        :rtype: str
+        :rtype: str | None
         """
         return session.query(cls.dag_hash).filter(cls.dag_id == dag_id).scalar()
 
diff --git a/tests/models/test_dagbag.py b/tests/models/test_dagbag.py
index 359cd5c..0c52c49 100644
--- a/tests/models/test_dagbag.py
+++ b/tests/models/test_dagbag.py
@@ -19,7 +19,7 @@ import os
 import shutil
 import textwrap
 import unittest
-from datetime import datetime, timezone
+from datetime import datetime, timedelta, timezone
 from tempfile import NamedTemporaryFile, mkdtemp
 from unittest import mock
 from unittest.mock import patch
@@ -33,6 +33,7 @@ from airflow import models
 from airflow.exceptions import SerializationError
 from airflow.models import DagBag, DagModel
 from airflow.models.serialized_dag import SerializedDagModel
+from airflow.serialization.serialized_objects import SerializedDAG
 from airflow.utils.dates import timezone as tz
 from airflow.utils.session import create_session
 from airflow.www.security import ApplessAirflowSecurityManager
@@ -311,6 +312,34 @@ class TestDagBag(unittest.TestCase):
         assert dag_id == dag.dag_id
         assert 2 == dagbag.process_file_calls
 
+    def test_dag_removed_if_serialized_dag_is_removed(self):
+        """
+        Test that if a DAG does not exist in the serialized_dag table (e.g. the DAG file was
+        removed), it is removed from the DagBag
+        """
+        from airflow.operators.dummy import DummyOperator
+
+        dag = models.DAG(
+            dag_id="test_dag_removed_if_serialized_dag_is_removed",
+            schedule_interval=None,
+            start_date=tz.datetime(2021, 10, 12),
+        )
+
+        with dag:
+            DummyOperator(task_id="task_1")
+
+        dagbag = DagBag(dag_folder=self.empty_dir, include_examples=False, read_dags_from_db=True)
+        dagbag.dags = {dag.dag_id: SerializedDAG.from_dict(SerializedDAG.to_dict(dag))}
+        dagbag.dags_last_fetched = {dag.dag_id: (tz.utcnow() - timedelta(minutes=2))}
+        dagbag.dags_hash = {dag.dag_id: mock.ANY}
+
+        assert SerializedDagModel.has_dag(dag.dag_id) is False
+
+        assert dagbag.get_dag(dag.dag_id) is None
+        assert dag.dag_id not in dagbag.dags
+        assert dag.dag_id not in dagbag.dags_last_fetched
+        assert dag.dag_id not in dagbag.dags_hash
+
     def process_dag(self, create_dag):
         """
         Helper method to process a file generated from the input create_dag function.

[airflow] 28/38: Tree View UI for larger DAGs & more consistent spacing in Tree View (#16522)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 215492d17a8c4ea55a7b2e57b3fd742a0574031c
Author: freget <fr...@googlemail.com>
AuthorDate: Mon Jun 21 17:25:24 2021 +0200

    Tree View UI for larger DAGs & more consistent spacing in Tree View (#16522)
    
    * Made squareX independent of screen width
    
    * Removed the now-unnecessary innerWidth variable.
    
    Co-authored-by: Brent Bovenzi <br...@gmail.com>
    
    Co-authored-by: Schneider, Thilo <t....@fraport.de>
    Co-authored-by: Brent Bovenzi <br...@gmail.com>
    (cherry picked from commit f9786d42f1f861c7a40745c00cd4d3feaf6254a7)
---
 airflow/www/static/js/tree.js | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/airflow/www/static/js/tree.js b/airflow/www/static/js/tree.js
index 882d7ce5..07daf0e 100644
--- a/airflow/www/static/js/tree.js
+++ b/airflow/www/static/js/tree.js
@@ -72,8 +72,7 @@ document.addEventListener('DOMContentLoaded', () => {
   });
   treeDepth += 1;
 
-  const innerWidth = window.innerWidth > 1200 ? 1200 : window.innerWidth;
-  const squareX = innerWidth - (data.instances.length * squareSize) - (treeDepth * 50);
+  const squareX = (treeDepth * 25) + 200;
 
   const squareSpacing = 2;
   const margin = {

[airflow] 31/38: Avoid recursing too deep when redacting logs (#16491)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6a5e6760e8c463b4ae163b2f579262bf1f41a2b0
Author: Tzu-ping Chung <tp...@astronomer.io>
AuthorDate: Thu Jun 17 23:38:04 2021 +0800

    Avoid recursing too deep when redacting logs (#16491)
    
    Fix #16473
    
    (cherry picked from commit 7453d3e81039573f4d621d13439bd6bcc97e6fa5)
---
 airflow/utils/log/secrets_masker.py | 50 +++++++++++++++++++++----------------
 1 file changed, 29 insertions(+), 21 deletions(-)

diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
index 2fd0d0a..6775b18 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -124,6 +124,7 @@ class SecretsMasker(logging.Filter):
     patterns: Set[str]
 
     ALREADY_FILTERED_FLAG = "__SecretsMasker_filtered"
+    MAX_RECURSION_DEPTH = 5
 
     def __init__(self):
         super().__init__()
@@ -169,35 +170,34 @@ class SecretsMasker(logging.Filter):
 
         return True
 
-    def _redact_all(self, item: "RedactableItem") -> "RedactableItem":
-        if isinstance(item, dict):
-            return {dict_key: self._redact_all(subval) for dict_key, subval in item.items()}
-        elif isinstance(item, str):
+    def _redact_all(self, item: "RedactableItem", depth: int) -> "RedactableItem":
+        if depth > self.MAX_RECURSION_DEPTH or isinstance(item, str):
             return '***'
+        if isinstance(item, dict):
+            return {dict_key: self._redact_all(subval, depth + 1) for dict_key, subval in item.items()}
         elif isinstance(item, (tuple, set)):
             # Turn set in to tuple!
-            return tuple(self._redact_all(subval) for subval in item)
+            return tuple(self._redact_all(subval, depth + 1) for subval in item)
         elif isinstance(item, list):
-            return list(self._redact_all(subval) for subval in item)
+            return list(self._redact_all(subval, depth + 1) for subval in item)
         else:
             return item
 
     # pylint: disable=too-many-return-statements
-    def redact(self, item: "RedactableItem", name: str = None) -> "RedactableItem":
-        """
-        Redact an any secrets found in ``item``, if it is a string.
-
-        If ``name`` is given, and it's a "sensitive" name (see
-        :func:`should_hide_value_for_key`) then all string values in the item
-        is redacted.
-
-        """
+    def _redact(self, item: "RedactableItem", name: Optional[str], depth: int) -> "RedactableItem":
+        # Avoid spending too much effort on redacting deeply nested
+        # structures. This also avoids infinite recursion if a structure
+        # has a reference to itself.
+        if depth > self.MAX_RECURSION_DEPTH:
+            return item
         try:
             if name and should_hide_value_for_key(name):
-                return self._redact_all(item)
-
+                return self._redact_all(item, depth)
             if isinstance(item, dict):
-                return {dict_key: self.redact(subval, dict_key) for dict_key, subval in item.items()}
+                return {
+                    dict_key: self._redact(subval, name=dict_key, depth=(depth + 1))
+                    for dict_key, subval in item.items()
+                }
             elif isinstance(item, str):
                 if self.replacer:
                     # We can't replace specific values, but the key-based redacting
@@ -207,9 +207,9 @@ class SecretsMasker(logging.Filter):
                 return item
             elif isinstance(item, (tuple, set)):
                 # Turn set in to tuple!
-                return tuple(self.redact(subval) for subval in item)
+                return tuple(self._redact(subval, name=None, depth=(depth + 1)) for subval in item)
             elif isinstance(item, list):
-                return list(self.redact(subval) for subval in item)
+                return [self._redact(subval, name=None, depth=(depth + 1)) for subval in item]
             else:
                 return item
         # I think this should never happen, but it does not hurt to leave it just in case
@@ -223,8 +223,16 @@ class SecretsMasker(logging.Filter):
             )
             return item
 
-    # pylint: enable=too-many-return-statements
+    def redact(self, item: "RedactableItem", name: Optional[str] = None) -> "RedactableItem":
+        """Redact an any secrets found in ``item``, if it is a string.
 
+        If ``name`` is given, and it's a "sensitive" name (see
+        :func:`should_hide_value_for_key`) then all string values in the item
+        are redacted.
+        """
+        return self._redact(item, name, depth=0)
+
+    # pylint: enable=too-many-return-statements
     def add_mask(self, secret: Union[str, dict, Iterable], name: str = None):
         """Add a new secret to be masked to this filter instance."""
         if isinstance(secret, dict):
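A hedged sketch of the failure mode this guards against: a structure that contains a reference to itself would previously recurse without bound. Values here are made up:

    from airflow.utils.log.secrets_masker import SecretsMasker

    masker = SecretsMasker()
    looped = {"password": "hunter2"}
    looped["self"] = looped  # cycle: the dict contains itself

    redacted = masker.redact(looped)
    # Sensitive keys are still masked; recursion now stops at
    # MAX_RECURSION_DEPTH (5) instead of overflowing the stack.
    print(redacted["password"])  # ***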

[airflow] 36/38: Fix CLI connections import and migrate logic from secrets to Connection model (#15425)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit fc30a4c80ffc19f40ab04cb502581d81b127c55f
Author: natanweinberger <na...@gmail.com>
AuthorDate: Fri Jun 11 07:34:06 2021 -0400

    Fix CLI connections import and migrate logic from secrets to Connection model (#15425)
    
    * Add field 'extra' to Connection init
    
    * Fix connections import CLI
    
    In connections_import, each connection was deserialized and stored into a
    Connection model instance rather than a dictionary, so an erroneous call to the
    dictionary method .items() resulted in an AttributeError. With this fix,
    connection information is loaded from dictionaries directly into the
    Connection constructor and committed to the DB.
    
    * Apply suggestions from code review
    
    * Use load_connections_dict in connections import
    
    Co-authored-by: Ash Berlin-Taylor <as...@firemirror.com>
    (cherry picked from commit 002075af91965416e595880040e138b1d6ddec43)
---
 airflow/cli/commands/connection_command.py    | 29 +++---------
 airflow/models/connection.py                  |  6 ++-
 tests/cli/commands/test_connection_command.py | 66 ++++++++++++++-------------
 3 files changed, 44 insertions(+), 57 deletions(-)

diff --git a/airflow/cli/commands/connection_command.py b/airflow/cli/commands/connection_command.py
index 6e45e2a..8f60022 100644
--- a/airflow/cli/commands/connection_command.py
+++ b/airflow/cli/commands/connection_command.py
@@ -29,8 +29,8 @@ from airflow.cli.simple_table import AirflowConsole
 from airflow.exceptions import AirflowNotFoundException
 from airflow.hooks.base import BaseHook
 from airflow.models import Connection
-from airflow.secrets.local_filesystem import _create_connection, load_connections_dict
-from airflow.utils import cli as cli_utils
+from airflow.secrets.local_filesystem import load_connections_dict
+from airflow.utils import cli as cli_utils, yaml
 from airflow.utils.cli import suppress_logs_and_warning
 from airflow.utils.session import create_session
 
@@ -239,7 +239,7 @@ def connections_delete(args):
 
 @cli_utils.action_logging
 def connections_import(args):
-    """Imports connections from a given file"""
+    """Imports connections from a file"""
     if os.path.exists(args.file):
         _import_helper(args.file)
     else:
@@ -247,31 +247,14 @@ def connections_import(args):
 
 
 def _import_helper(file_path):
-    """Helps import connections from a file"""
+    """Load connections from a file and save them to the DB. On collision, skip."""
     connections_dict = load_connections_dict(file_path)
     with create_session() as session:
-        for conn_id, conn_values in connections_dict.items():
+        for conn_id, conn in connections_dict.items():
             if session.query(Connection).filter(Connection.conn_id == conn_id).first():
                 print(f'Could not import connection {conn_id}: connection already exists.')
                 continue
 
-            allowed_fields = [
-                'extra',
-                'description',
-                'conn_id',
-                'login',
-                'conn_type',
-                'host',
-                'password',
-                'schema',
-                'port',
-                'uri',
-                'extra_dejson',
-            ]
-            filtered_connection_values = {
-                key: value for key, value in conn_values.items() if key in allowed_fields
-            }
-            connection = _create_connection(conn_id, filtered_connection_values)
-            session.add(connection)
+            session.add(conn)
             session.commit()
             print(f'Imported connection {conn_id}')
diff --git a/airflow/models/connection.py b/airflow/models/connection.py
index 9021edb..73d0d8d 100644
--- a/airflow/models/connection.py
+++ b/airflow/models/connection.py
@@ -19,7 +19,7 @@
 import json
 import warnings
 from json import JSONDecodeError
-from typing import Dict, Optional
+from typing import Dict, Optional, Union
 from urllib.parse import parse_qsl, quote, unquote, urlencode, urlparse
 
 from sqlalchemy import Boolean, Column, Integer, String, Text
@@ -117,12 +117,14 @@ class Connection(Base, LoggingMixin):  # pylint: disable=too-many-instance-attri
         password: Optional[str] = None,
         schema: Optional[str] = None,
         port: Optional[int] = None,
-        extra: Optional[str] = None,
+        extra: Optional[Union[str, dict]] = None,
         uri: Optional[str] = None,
     ):
         super().__init__()
         self.conn_id = conn_id
         self.description = description
+        if extra and not isinstance(extra, str):
+            extra = json.dumps(extra)
         if uri and (  # pylint: disable=too-many-boolean-expressions
             conn_type or host or login or password or schema or port or extra
         ):
diff --git a/tests/cli/commands/test_connection_command.py b/tests/cli/commands/test_connection_command.py
index 136811d..5339083 100644
--- a/tests/cli/commands/test_connection_command.py
+++ b/tests/cli/commands/test_connection_command.py
@@ -758,9 +758,9 @@ class TestCliImportConnections(unittest.TestCase):
         ):
             connection_command.connections_import(self.parser.parse_args(["connections", "import", filepath]))
 
-    @mock.patch('airflow.cli.commands.connection_command.load_connections_dict')
+    @mock.patch('airflow.secrets.local_filesystem._parse_secret_file')
     @mock.patch('os.path.exists')
-    def test_cli_connections_import_should_load_connections(self, mock_exists, mock_load_connections_dict):
+    def test_cli_connections_import_should_load_connections(self, mock_exists, mock_parse_secret_file):
         mock_exists.return_value = True
 
         # Sample connections to import
@@ -769,26 +769,26 @@ class TestCliImportConnections(unittest.TestCase):
                 "conn_type": "postgres",
                 "description": "new0 description",
                 "host": "host",
-                "is_encrypted": False,
-                "is_extra_encrypted": False,
                 "login": "airflow",
+                "password": "password",
                 "port": 5432,
                 "schema": "airflow",
+                "extra": "test",
             },
             "new1": {
                 "conn_type": "mysql",
                 "description": "new1 description",
                 "host": "host",
-                "is_encrypted": False,
-                "is_extra_encrypted": False,
                 "login": "airflow",
+                "password": "password",
                 "port": 3306,
                 "schema": "airflow",
+                "extra": "test",
             },
         }
 
-        # We're not testing the behavior of load_connections_dict, assume successfully reads JSON, YAML or env
-        mock_load_connections_dict.return_value = expected_connections
+        # We're not testing the behavior of _parse_secret_file, assume it successfully reads JSON, YAML or env
+        mock_parse_secret_file.return_value = expected_connections
 
         connection_command.connections_import(
             self.parser.parse_args(["connections", "import", 'sample.json'])
@@ -799,14 +799,15 @@ class TestCliImportConnections(unittest.TestCase):
             current_conns = session.query(Connection).all()
 
             comparable_attrs = [
+                "conn_id",
                 "conn_type",
                 "description",
                 "host",
-                "is_encrypted",
-                "is_extra_encrypted",
                 "login",
+                "password",
                 "port",
                 "schema",
+                "extra",
             ]
 
             current_conns_as_dicts = {
@@ -816,80 +817,81 @@ class TestCliImportConnections(unittest.TestCase):
             assert expected_connections == current_conns_as_dicts
 
     @provide_session
-    @mock.patch('airflow.cli.commands.connection_command.load_connections_dict')
+    @mock.patch('airflow.secrets.local_filesystem._parse_secret_file')
     @mock.patch('os.path.exists')
     def test_cli_connections_import_should_not_overwrite_existing_connections(
-        self, mock_exists, mock_load_connections_dict, session=None
+        self, mock_exists, mock_parse_secret_file, session=None
     ):
         mock_exists.return_value = True
 
-        # Add a pre-existing connection "new1"
+        # Add a pre-existing connection "new3"
         merge_conn(
             Connection(
-                conn_id="new1",
+                conn_id="new3",
                 conn_type="mysql",
-                description="mysql description",
+                description="original description",
                 host="mysql",
                 login="root",
-                password="",
+                password="password",
                 schema="airflow",
             ),
             session=session,
         )
 
-        # Sample connections to import, including a collision with "new1"
+        # Sample connections to import, including a collision with "new3"
         expected_connections = {
-            "new0": {
+            "new2": {
                 "conn_type": "postgres",
-                "description": "new0 description",
+                "description": "new2 description",
                 "host": "host",
-                "is_encrypted": False,
-                "is_extra_encrypted": False,
                 "login": "airflow",
+                "password": "password",
                 "port": 5432,
                 "schema": "airflow",
+                "extra": "test",
             },
-            "new1": {
+            "new3": {
                 "conn_type": "mysql",
-                "description": "new1 description",
+                "description": "updated description",
                 "host": "host",
-                "is_encrypted": False,
-                "is_extra_encrypted": False,
                 "login": "airflow",
+                "password": "new password",
                 "port": 3306,
                 "schema": "airflow",
+                "extra": "test",
             },
         }
 
-        # We're not testing the behavior of load_connections_dict, assume successfully reads JSON, YAML or env
-        mock_load_connections_dict.return_value = expected_connections
+        # We're not testing the behavior of _parse_secret_file, assume it successfully reads JSON, YAML or env
+        mock_parse_secret_file.return_value = expected_connections
 
         with redirect_stdout(io.StringIO()) as stdout:
             connection_command.connections_import(
                 self.parser.parse_args(["connections", "import", 'sample.json'])
             )
 
-            assert 'Could not import connection new1: connection already exists.' in stdout.getvalue()
+            assert 'Could not import connection new3: connection already exists.' in stdout.getvalue()
 
         # Verify that the imported connections match the expected, sample connections
         current_conns = session.query(Connection).all()
 
         comparable_attrs = [
+            "conn_id",
             "conn_type",
             "description",
             "host",
-            "is_encrypted",
-            "is_extra_encrypted",
             "login",
+            "password",
             "port",
             "schema",
+            "extra",
         ]
 
         current_conns_as_dicts = {
             current_conn.conn_id: {attr: getattr(current_conn, attr) for attr in comparable_attrs}
             for current_conn in current_conns
         }
-        assert current_conns_as_dicts['new0'] == expected_connections['new0']
+        assert current_conns_as_dicts['new2'] == expected_connections['new2']
 
         # The existing connection's description should not have changed
-        assert current_conns_as_dicts['new1']['description'] == 'new1 description'
+        assert current_conns_as_dicts['new3']['description'] == 'original description'
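
A minimal sketch of the behaviour these tests pin down, using the test's own file name and connection ids (parser and session setup are assumed): importing a file that contains an existing conn_id must leave it untouched and report the collision on stdout.

    from airflow.cli import cli_parser
    from airflow.cli.commands import connection_command

    parser = cli_parser.get_parser()
    # "sample.json" defines "new2" (new) and "new3" (already in the DB)
    connection_command.connections_import(
        parser.parse_args(["connections", "import", "sample.json"])
    )
    # stdout: Could not import connection new3: connection already exists.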

[airflow] 02/38: Fix auto-refresh in tree view When webserver ui is not in ``/`` (#16018)


commit dc267217875d2acc0d981a32651607242f8aacd2
Author: Felipe Lolas <fl...@alumnos.uai.cl>
AuthorDate: Mon May 24 16:22:14 2021 -0400

    Fix auto-refresh in tree view When webserver ui is not in ``/`` (#16018)
    
    Co-authored-by: Felipe Lolas <fe...@bci.cl>
    
    Obtain tree_data object endpoint from meta.
    closes: #16017
    
    (cherry picked from commit c288957939ad534eb968a90a34b92dd3a009ddb3)
---
 airflow/www/static/js/tree.js          | 3 ++-
 airflow/www/templates/airflow/dag.html | 1 +
 2 files changed, 3 insertions(+), 1 deletion(-)

diff --git a/airflow/www/static/js/tree.js b/airflow/www/static/js/tree.js
index 1c443cd..04702a7 100644
--- a/airflow/www/static/js/tree.js
+++ b/airflow/www/static/js/tree.js
@@ -27,6 +27,7 @@ import getMetaValue from './meta_value';
 
 // dagId comes from dag.html
 const dagId = getMetaValue('dag_id');
+const treeDataUrl = getMetaValue('tree_data');
 
 function toDateString(ts) {
   const dt = new Date(ts * 1000);
@@ -409,7 +410,7 @@ document.addEventListener('DOMContentLoaded', () => {
 
   function handleRefresh() {
     $('#loading-dots').css('display', 'inline-block');
-    $.get(`/object/tree_data?dag_id=${dagId}`)
+    $.get(`${treeDataUrl}?dag_id=${dagId}`)
       .done(
         (runs) => {
           const newData = {
diff --git a/airflow/www/templates/airflow/dag.html b/airflow/www/templates/airflow/dag.html
index 94d926a..171b4a9 100644
--- a/airflow/www/templates/airflow/dag.html
+++ b/airflow/www/templates/airflow/dag.html
@@ -35,6 +35,7 @@
   <meta name="external_log_url" content="{{ url_for('Airflow.redirect_to_external_log') }}">
   <meta name="extra_links_url" content="{{ url_for('Airflow.extra_links') }}">
   <meta name="paused_url" content="{{ url_for('Airflow.paused') }}">
+  <meta name="tree_data" content="{{ url_for('Airflow.tree_data') }}">
   {% if show_external_log_redirect is defined %}
     <meta name="show_external_log_redirect" content="{{ show_external_log_redirect }}">
   {% endif %}

[airflow] 18/38: Validate retries value on init for better errors (#16415)


commit c3bc645c1322fff970766633ed906891c213bfcc
Author: Tzu-ping Chung <tp...@astronomer.io>
AuthorDate: Sun Jun 13 08:29:14 2021 +0800

    Validate retries value on init for better errors (#16415)
    
    (cherry picked from commit 15ff2388e8a52348afcc923653f85ce15a3c5f71)
---
 airflow/models/baseoperator.py |  8 ++++++++
 tests/core/test_core.py        | 46 ++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 54 insertions(+)

diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
index e243b5e..f74c5f9 100644
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -563,6 +563,14 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, metaclass=BaseOperatorMeta
         if wait_for_downstream:
             self.depends_on_past = True
 
+        if retries is not None and not isinstance(retries, int):
+            try:
+                parsed_retries = int(retries)
+            except (TypeError, ValueError):
+                raise AirflowException(f"'retries' type must be int, not {type(retries).__name__}")
+            self.log.warning("Implicitly converting 'retries' for %s from %r to int", self, retries)
+            retries = parsed_retries
+
         self.retries = retries
         self.queue = queue
         self.pool = Pool.DEFAULT_POOL_NAME if pool is None else pool
diff --git a/tests/core/test_core.py b/tests/core/test_core.py
index 7149112..78f2676 100644
--- a/tests/core/test_core.py
+++ b/tests/core/test_core.py
@@ -16,6 +16,7 @@
 # specific language governing permissions and limitations
 # under the License.
 
+import logging
 import multiprocessing
 import os
 import signal
@@ -453,3 +454,48 @@ class TestCore(unittest.TestCase):
 
         assert context1['params'] == {'key_1': 'value_1', 'key_2': 'value_2_new', 'key_3': 'value_3'}
         assert context2['params'] == {'key_1': 'value_1', 'key_2': 'value_2_old'}
+
+
+@pytest.fixture()
+def dag():
+    return DAG(TEST_DAG_ID, default_args={'owner': 'airflow', 'start_date': DEFAULT_DATE})
+
+
+def test_operator_retries_invalid(dag):
+    with pytest.raises(AirflowException) as ctx:
+        BashOperator(
+            task_id='test_illegal_args',
+            bash_command='echo success',
+            dag=dag,
+            retries='foo',
+        )
+    assert str(ctx.value) == "'retries' type must be int, not str"
+
+
+def test_operator_retries_coerce(caplog, dag):
+    with caplog.at_level(logging.WARNING):
+        BashOperator(
+            task_id='test_illegal_args',
+            bash_command='echo success',
+            dag=dag,
+            retries='1',
+        )
+    assert caplog.record_tuples == [
+        (
+            "airflow.operators.bash.BashOperator",
+            logging.WARNING,
+            "Implicitly converting 'retries' for <Task(BashOperator): test_illegal_args> from '1' to int",
+        ),
+    ]
+
+
+@pytest.mark.parametrize("retries", [None, 5])
+def test_operator_retries(caplog, dag, retries):
+    with caplog.at_level(logging.WARNING):
+        BashOperator(
+            task_id='test_illegal_args',
+            bash_command='echo success',
+            dag=dag,
+            retries=retries,
+        )
+    assert caplog.records == []
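
A minimal sketch of the resulting behaviour, assuming a `dag` object (with a start_date) is in scope: a numeric string is coerced with a warning, while a non-numeric value now fails at construction time instead of at runtime.

    from airflow.exceptions import AirflowException
    from airflow.operators.bash import BashOperator

    BashOperator(task_id="ok", bash_command="echo 1", retries="1", dag=dag)  # warns, coerced to 1
    try:
        BashOperator(task_id="bad", bash_command="echo 1", retries="foo", dag=dag)
    except AirflowException as err:
        print(err)  # 'retries' type must be int, not str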

[airflow] 22/38: Queue tasks with higher priority and earlier execution_date first. (#15210)


commit 19468d9aae8eb5cb01a6751af8cebad8a9c5bc49
Author: ginevragaudioso <gi...@gmail.com>
AuthorDate: Mon Jun 14 06:34:03 2021 -0500

    Queue tasks with higher priority and earlier execution_date first. (#15210)
    
    Co-authored-by: Ginevra Gaudioso <gg...@vectra.ai>
    Co-authored-by: Ash Berlin-Taylor <as...@firemirror.com>
    (cherry picked from commit 943292b4e0c494f023c86d648289b1f23ccb0ee9)
---
 airflow/jobs/scheduler_job.py    |   1 +
 tests/jobs/test_scheduler_job.py | 159 +++++++++++++++++++++++++++++++++++++++
 2 files changed, 160 insertions(+)

diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index cece87e..7514d92 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -938,6 +938,7 @@ class SchedulerJob(BaseJob):  # pylint: disable=too-many-instance-attributes
             .filter(not_(DM.is_paused))
             .filter(TI.state == State.SCHEDULED)
             .options(selectinload('dag_model'))
+            .order_by(-TI.priority_weight, TI.execution_date)
         )
         starved_pools = [pool_name for pool_name, stats in pools.items() if stats['open'] <= 0]
         if starved_pools:
diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index faf19d9..36bc1b5 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -1177,6 +1177,165 @@ class TestSchedulerJob(unittest.TestCase):
         assert tis[3].key in res_keys
         session.rollback()
 
+    def test_find_executable_task_instances_order_execution_date(self):
+        dag_id_1 = 'SchedulerJobTest.test_find_executable_task_instances_order_execution_date-a'
+        dag_id_2 = 'SchedulerJobTest.test_find_executable_task_instances_order_execution_date-b'
+        task_id = 'task-a'
+        dag_1 = DAG(dag_id=dag_id_1, start_date=DEFAULT_DATE, concurrency=16)
+        dag_2 = DAG(dag_id=dag_id_2, start_date=DEFAULT_DATE, concurrency=16)
+        dag1_task = DummyOperator(dag=dag_1, task_id=task_id)
+        dag2_task = DummyOperator(dag=dag_2, task_id=task_id)
+        dag_1 = SerializedDAG.from_dict(SerializedDAG.to_dict(dag_1))
+        dag_2 = SerializedDAG.from_dict(SerializedDAG.to_dict(dag_2))
+
+        self.scheduler_job = SchedulerJob(subdir=os.devnull)
+        session = settings.Session()
+
+        dag_model_1 = DagModel(
+            dag_id=dag_id_1,
+            is_paused=False,
+            concurrency=dag_1.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model_1)
+        dag_model_2 = DagModel(
+            dag_id=dag_id_2,
+            is_paused=False,
+            concurrency=dag_2.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model_2)
+        dr1 = dag_1.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE + timedelta(hours=1),
+            state=State.RUNNING,
+        )
+        dr2 = dag_2.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE,
+            state=State.RUNNING,
+        )
+
+        tis = [
+            TaskInstance(dag1_task, dr1.execution_date),
+            TaskInstance(dag2_task, dr2.execution_date),
+        ]
+        for ti in tis:
+            ti.state = State.SCHEDULED
+            session.merge(ti)
+        session.flush()
+
+        res = self.scheduler_job._executable_task_instances_to_queued(max_tis=1, session=session)
+        session.flush()
+        assert [ti.key for ti in res] == [tis[1].key]
+        session.rollback()
+
+    def test_find_executable_task_instances_order_priority(self):
+        dag_id_1 = 'SchedulerJobTest.test_find_executable_task_instances_order_priority-a'
+        dag_id_2 = 'SchedulerJobTest.test_find_executable_task_instances_order_priority-b'
+        task_id = 'task-a'
+        dag_1 = DAG(dag_id=dag_id_1, start_date=DEFAULT_DATE, concurrency=16)
+        dag_2 = DAG(dag_id=dag_id_2, start_date=DEFAULT_DATE, concurrency=16)
+        dag1_task = DummyOperator(dag=dag_1, task_id=task_id, priority_weight=1)
+        dag2_task = DummyOperator(dag=dag_2, task_id=task_id, priority_weight=4)
+        dag_1 = SerializedDAG.from_dict(SerializedDAG.to_dict(dag_1))
+        dag_2 = SerializedDAG.from_dict(SerializedDAG.to_dict(dag_2))
+
+        self.scheduler_job = SchedulerJob(subdir=os.devnull)
+        session = settings.Session()
+
+        dag_model_1 = DagModel(
+            dag_id=dag_id_1,
+            is_paused=False,
+            concurrency=dag_1.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model_1)
+        dag_model_2 = DagModel(
+            dag_id=dag_id_2,
+            is_paused=False,
+            concurrency=dag_2.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model_2)
+        dr1 = dag_1.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE,
+            state=State.RUNNING,
+        )
+        dr2 = dag_2.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE,
+            state=State.RUNNING,
+        )
+
+        tis = [
+            TaskInstance(dag1_task, dr1.execution_date),
+            TaskInstance(dag2_task, dr2.execution_date),
+        ]
+        for ti in tis:
+            ti.state = State.SCHEDULED
+            session.merge(ti)
+        session.flush()
+
+        res = self.scheduler_job._executable_task_instances_to_queued(max_tis=1, session=session)
+        session.flush()
+        assert [ti.key for ti in res] == [tis[1].key]
+        session.rollback()
+
+    def test_find_executable_task_instances_order_execution_date_and_priority(self):
+        dag_id_1 = 'SchedulerJobTest.test_find_executable_task_instances_order_execution_date_and_priority-a'
+        dag_id_2 = 'SchedulerJobTest.test_find_executable_task_instances_order_execution_date_and_priority-b'
+        task_id = 'task-a'
+        dag_1 = DAG(dag_id=dag_id_1, start_date=DEFAULT_DATE, concurrency=16)
+        dag_2 = DAG(dag_id=dag_id_2, start_date=DEFAULT_DATE, concurrency=16)
+        dag1_task = DummyOperator(dag=dag_1, task_id=task_id, priority_weight=1)
+        dag2_task = DummyOperator(dag=dag_2, task_id=task_id, priority_weight=4)
+        dag_1 = SerializedDAG.from_dict(SerializedDAG.to_dict(dag_1))
+        dag_2 = SerializedDAG.from_dict(SerializedDAG.to_dict(dag_2))
+
+        self.scheduler_job = SchedulerJob(subdir=os.devnull)
+        session = settings.Session()
+
+        dag_model_1 = DagModel(
+            dag_id=dag_id_1,
+            is_paused=False,
+            concurrency=dag_1.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model_1)
+        dag_model_2 = DagModel(
+            dag_id=dag_id_2,
+            is_paused=False,
+            concurrency=dag_2.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model_2)
+        dr1 = dag_1.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE,
+            state=State.RUNNING,
+        )
+        dr2 = dag_2.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE + timedelta(hours=1),
+            state=State.RUNNING,
+        )
+
+        tis = [
+            TaskInstance(dag1_task, dr1.execution_date),
+            TaskInstance(dag2_task, dr2.execution_date),
+        ]
+        for ti in tis:
+            ti.state = State.SCHEDULED
+            session.merge(ti)
+        session.flush()
+
+        res = self.scheduler_job._executable_task_instances_to_queued(max_tis=1, session=session)
+        session.flush()
+        assert [ti.key for ti in res] == [tis[1].key]
+        session.rollback()
+
     def test_find_executable_task_instances_in_default_pool(self):
         set_default_pool_slots(1)
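
The single added `order_by` drives all three tests above. Its effect, mirrored here as a plain-Python sort for illustration (the dicts are stand-ins for task instances):

    candidates = [
        {"task": "a", "priority_weight": 1, "execution_date": "2021-06-01"},
        {"task": "b", "priority_weight": 4, "execution_date": "2021-06-02"},
    ]
    # higher priority first, then earlier execution_date
    candidates.sort(key=lambda ti: (-ti["priority_weight"], ti["execution_date"]))
    print([ti["task"] for ti in candidates])  # ['b', 'a']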
 

[airflow] 29/38: Backfill: Don't create a DagRun if no tasks match task regex (#16461)


commit 7c094faf957d612f581137fe79fcd005f652fbf4
Author: Dmirty Suvorov <dm...@scribd.com>
AuthorDate: Wed Jun 16 23:07:58 2021 +0300

    Backfill: Don't create a DagRun if no tasks match task regex (#16461)
    
    Backfill should not create a DagRun when no task matches the regex.
    
    closes: #16460
    (cherry picked from commit f2c79b238f4ea3ee801038a6305b925f2f4e753b)
---
 airflow/cli/commands/dag_command.py | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/airflow/cli/commands/dag_command.py b/airflow/cli/commands/dag_command.py
index fe2f329..e739162 100644
--- a/airflow/cli/commands/dag_command.py
+++ b/airflow/cli/commands/dag_command.py
@@ -78,6 +78,10 @@ def dag_backfill(args, dag=None):
         dag = dag.partial_subset(
             task_ids_or_regex=args.task_regex, include_upstream=not args.ignore_dependencies
         )
+        if not dag.task_dict:
+            raise AirflowException(
+                f"There are no tasks that match '{args.task_regex}' regex. Nothing to run, exiting..."
+            )
 
     run_conf = None
     if args.conf:
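
A minimal sketch of the guard, assuming a `dag` object and a regex that matches nothing: `partial_subset` returns a DAG with an empty `task_dict`, which now raises instead of silently creating an empty DagRun.

    sub_dag = dag.partial_subset(task_ids_or_regex="no-such-task", include_upstream=False)
    if not sub_dag.task_dict:
        ...  # AirflowException: There are no tasks that match 'no-such-task' regex. ...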

[airflow] 04/38: Don't die when masking `log.exception` when there is no exception (#16047)


commit 19332cf80b1016ac83536860adee3f7670e24062
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Tue May 25 12:03:29 2021 +0100

    Don't die when masking `log.exception` when there is no exception (#16047)
    
    It is possible that `exc_info` can be set, but contain no exception.
    
    We shouldn't fail in this case, even if the output doesn't make sense as
    shown by the test (the `NoneType: None` line is the exception being
    logged.)
    
    (cherry picked from commit 2f776334e30ce9cc2c6a01b377703914acb7139e)
---
 airflow/utils/log/secrets_masker.py    |  2 +-
 tests/utils/log/test_secrets_masker.py | 15 +++++++++++++++
 2 files changed, 16 insertions(+), 1 deletion(-)

diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
index 42e0e55..73a2aef 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -155,7 +155,7 @@ class SecretsMasker(logging.Filter):
                 if k in self._record_attrs_to_ignore:
                     continue
                 record.__dict__[k] = self.redact(v)
-            if record.exc_info:
+            if record.exc_info and record.exc_info[1] is not None:
                 exc = record.exc_info[1]
                 # I'm not sure if this is a good idea!
                 exc.args = (self.redact(v) for v in exc.args)
diff --git a/tests/utils/log/test_secrets_masker.py b/tests/utils/log/test_secrets_masker.py
index ba88b87..1146bce 100644
--- a/tests/utils/log/test_secrets_masker.py
+++ b/tests/utils/log/test_secrets_masker.py
@@ -97,6 +97,21 @@ class TestSecretsMasker:
             """
         )
 
+    def test_exception_not_raised(self, logger, caplog):
+        """
+        Test that when ``logger.exception`` is called when there is no current exception we still log.
+
+        (This is a "bug" in user code, but we shouldn't die because of it!)
+        """
+        logger.exception("Err")
+
+        assert caplog.text == textwrap.dedent(
+            """\
+            ERROR Err
+            NoneType: None
+            """
+        )
+
     @pytest.mark.xfail(reason="Cannot filter secrets in traceback source")
     def test_exc_tb(self, logger, caplog):
         """

[airflow] 38/38: Add back-compat layer to clear_task_instances (#16582)


commit 46cfeee30c2b579b25de8806250a95068b33d69f
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Tue Jun 22 14:42:42 2021 +0100

    Add back-compat layer to clear_task_instances (#16582)
    
    It is unlikely that anyone is using this function directly, but it is
    easy for us to maintain compatibility, so we should.
    
    (cherry picked from commit 5b0acfef87d609e4d5e11e10e878e41e6ea89302)
---
 airflow/models/taskinstance.py | 15 ++++++++++++++-
 airflow/typing_compat.py       |  3 ++-
 2 files changed, 16 insertions(+), 2 deletions(-)

diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
index ae7eeef..b77f0d7 100644
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -61,6 +61,7 @@ from airflow.sentry import Sentry
 from airflow.stats import Stats
 from airflow.ti_deps.dep_context import DepContext
 from airflow.ti_deps.dependencies_deps import REQUEUEABLE_DEPS, RUNNING_DEPS
+from airflow.typing_compat import Literal
 from airflow.utils import timezone
 from airflow.utils.email import send_email
 from airflow.utils.helpers import is_container
@@ -133,8 +134,9 @@ def set_error_file(error_file: str, error: Union[str, Exception]) -> None:
 def clear_task_instances(
     tis,
     session,
-    dag_run_state: str = State.RUNNING,
+    activate_dag_runs=None,
     dag=None,
+    dag_run_state: Union[str, Literal[False]] = State.RUNNING,
 ):
     """
     Clears a set of task instances, but makes sure the running ones
@@ -145,6 +147,7 @@ def clear_task_instances(
     :param dag_run_state: state to set DagRun to. If set to False, dagrun state will not
         be changed.
     :param dag: DAG object
+    :param activate_dag_runs: Deprecated parameter, do not pass
     """
     job_ids = []
     task_id_by_key = defaultdict(lambda: defaultdict(lambda: defaultdict(set)))
@@ -205,6 +208,16 @@ def clear_task_instances(
         for job in session.query(BaseJob).filter(BaseJob.id.in_(job_ids)).all():  # noqa
             job.state = State.SHUTDOWN
 
+    if activate_dag_runs is not None:
+        warnings.warn(
+            "`activate_dag_runs` parameter to clear_task_instances function is deprecated. "
+            "Please use `dag_run_state`",
+            DeprecationWarning,
+            stacklevel=2,
+        )
+        if not activate_dag_runs:
+            dag_run_state = False
+
     if dag_run_state is not False and tis:
         from airflow.models.dagrun import DagRun  # Avoid circular import
 
diff --git a/airflow/typing_compat.py b/airflow/typing_compat.py
index d98eb7b..a207ef2 100644
--- a/airflow/typing_compat.py
+++ b/airflow/typing_compat.py
@@ -26,12 +26,13 @@ try:
     # python 3.8 we can safely remove this shim import after Airflow drops
     # support for <3.8
     from typing import (  # type: ignore # noqa # pylint: disable=unused-import
+        Literal,
         Protocol,
         TypedDict,
         runtime_checkable,
     )
 except ImportError:
-    from typing_extensions import Protocol, TypedDict, runtime_checkable  # type: ignore # noqa
+    from typing_extensions import Literal, Protocol, TypedDict, runtime_checkable  # type: ignore # noqa
 
 
 # Before Py 3.7, there is no re.Pattern class
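
A sketch of the restored call style, assuming `tis`, `session`, and `dag` exist as in a typical caller: the deprecated flag warns and is mapped onto the new parameter.

    from airflow.models.taskinstance import clear_task_instances

    clear_task_instances(tis, session, activate_dag_runs=False, dag=dag)
    # -> DeprecationWarning; equivalent to dag_run_state=False (dag runs untouched)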

[airflow] 05/38: Ensure that we don't try to mask empty string in logs (#16057)


commit c47171be467fba8a5dee626e7c2e92b41fe1034e
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Tue May 25 19:31:22 2021 +0100

    Ensure that we don't try to mask empty string in logs (#16057)
    
    Although `Connection.password` being empty was guarded against, there
    are other possible cases (such as an extra field) that weren't guarded
    against, which ended up with this in the logs:
    
        WARNING - ***-***-***-*** ***L***o***g***g***i***n***g*** ***e***r***r***o***r*** ***-***-***-***
    
    Oops!
    
    (cherry picked from commit 8814a59a5bf54dd17aef21eefd0900703330c22c)
---
 airflow/utils/log/secrets_masker.py    | 2 ++
 tests/utils/log/test_secrets_masker.py | 2 ++
 2 files changed, 4 insertions(+)

diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
index 73a2aef..6df8d39 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -214,6 +214,8 @@ class SecretsMasker(logging.Filter):
             for k, v in secret.items():
                 self.add_mask(v, k)
         elif isinstance(secret, str):
+            if not secret:
+                return
             pattern = re.escape(secret)
             if pattern not in self.patterns and (not name or should_hide_value_for_key(name)):
                 self.patterns.add(pattern)
diff --git a/tests/utils/log/test_secrets_masker.py b/tests/utils/log/test_secrets_masker.py
index 1146bce..8c88bdd 100644
--- a/tests/utils/log/test_secrets_masker.py
+++ b/tests/utils/log/test_secrets_masker.py
@@ -156,6 +156,8 @@ class TestSecretsMasker:
             # When the "sensitive value" is a dict, don't mask anything
             # (Or should this be mask _everything_ under it ?
             ("api_key", {"other": "innoent"}, set()),
+            (None, {"password": ""}, set()),
+            (None, "", set()),
         ],
     )
     def test_mask_secret(self, name, value, expected_mask):
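
A minimal sketch of the fix in isolation: adding an empty string to the masker is now a no-op, so the replacer regex can no longer match between every character of a log line.

    from airflow.utils.log.secrets_masker import SecretsMasker

    masker = SecretsMasker()
    masker.add_mask("")        # no-op: nothing is added to the patterns
    masker.add_mask("s3cr3t")  # real secrets are still registered as before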

[airflow] 27/38: Correctly handle None returns from Query.scalar() (#16345)


commit 777fd9b3a92b942f268cbb9869d62d2c9d2d9ed1
Author: Tzu-ping Chung <tp...@astronomer.io>
AuthorDate: Wed Jun 16 03:23:32 2021 +0800

    Correctly handle None returns from Query.scalar() (#16345)
    
    This is possible when the query does not return a row, according to
    SQLAlchemy documentation. We can handle them to provide better errors in
    unexpected situations.
    
    Toward #8171, fix #16328.
    
    (cherry picked from commit 147bcecc4902793e0b913dfdad1bd799621971c7)
---
 airflow/models/serialized_dag.py | 6 +++---
 airflow/www/views.py             | 3 ++-
 2 files changed, 5 insertions(+), 4 deletions(-)

diff --git a/airflow/models/serialized_dag.py b/airflow/models/serialized_dag.py
index 4e8ebc4..bba58ad 100644
--- a/airflow/models/serialized_dag.py
+++ b/airflow/models/serialized_dag.py
@@ -275,7 +275,7 @@ class SerializedDagModel(Base):
 
     @classmethod
     @provide_session
-    def get_max_last_updated_datetime(cls, session: Session = None) -> datetime:
+    def get_max_last_updated_datetime(cls, session: Session = None) -> Optional[datetime]:
         """
         Get the maximum date when any DAG was last updated in serialized_dag table
 
@@ -286,7 +286,7 @@ class SerializedDagModel(Base):
 
     @classmethod
     @provide_session
-    def get_latest_version_hash(cls, dag_id: str, session: Session = None) -> str:
+    def get_latest_version_hash(cls, dag_id: str, session: Session = None) -> Optional[str]:
         """
         Get the latest DAG version for a given DAG ID.
 
@@ -294,7 +294,7 @@ class SerializedDagModel(Base):
         :type dag_id: str
         :param session: ORM Session
         :type session: Session
-        :return: DAG Hash
+        :return: DAG Hash, or None if the DAG is not found
         :rtype: str | None
         """
         return session.query(cls.dag_hash).filter(cls.dag_id == dag_id).scalar()
diff --git a/airflow/www/views.py b/airflow/www/views.py
index eadec6c..424892e 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -4023,7 +4023,8 @@ class DagDependenciesView(AirflowBaseView):
         title = "DAG Dependencies"
 
         if timezone.utcnow() > self.last_refresh + self.refresh_interval:
-            if SerializedDagModel.get_max_last_updated_datetime() > self.last_refresh:
+            max_last_updated = SerializedDagModel.get_max_last_updated_datetime()
+            if max_last_updated is None or max_last_updated > self.last_refresh:
                 self._calculate_graph()
             self.last_refresh = timezone.utcnow()
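
The underlying SQLAlchemy behaviour, as a standalone sketch (a `session` is assumed, and the dag_id deliberately matches nothing): `Query.scalar()` yields `None` rather than raising when no row matches.

    value = (
        session.query(SerializedDagModel.dag_hash)
        .filter(SerializedDagModel.dag_id == "missing-dag")
        .scalar()
    )
    print(value)  # None -- hence the new Optional return types and the None check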
 

[airflow] 14/38: Fix tasks in an infinite slots pool were never scheduled (#15247)


commit 1ca495c3d7a34181a2c8f9e743cbef78dc63c490
Author: Benoit Person <be...@gmail.com>
AuthorDate: Tue Jun 22 08:31:04 2021 +0000

    Fix tasks in an infinite slots pool were never scheduled (#15247)
    
    Infinite pools: Make their `total_slots` be `inf` instead of `-1`
    
    (cherry picked from commit 96f764389eded9f1ea908e899b54bf00635ec787)
---
 airflow/models/pool.py           |  6 +++++-
 tests/jobs/test_scheduler_job.py | 35 +++++++++++++++++++++++++++++++++++
 tests/models/test_pool.py        |  4 ++--
 3 files changed, 42 insertions(+), 3 deletions(-)

diff --git a/airflow/models/pool.py b/airflow/models/pool.py
index feade77..3d152ee 100644
--- a/airflow/models/pool.py
+++ b/airflow/models/pool.py
@@ -106,6 +106,8 @@ class Pool(Base):
 
         pool_rows: Iterable[Tuple[str, int]] = query.all()
         for (pool_name, total_slots) in pool_rows:
+            if total_slots == -1:
+                total_slots = float('inf')  # type: ignore
             pools[pool_name] = PoolStats(total=total_slots, running=0, queued=0, open=0)
 
         state_count_by_pool = (
@@ -115,8 +117,10 @@ class Pool(Base):
         ).all()
 
         # calculate queued and running metrics
-        count: int
         for (pool_name, state, count) in state_count_by_pool:
+            # Some databases return decimal.Decimal here.
+            count = int(count)
+
             stats_dict: Optional[PoolStats] = pools.get(pool_name)
             if not stats_dict:
                 continue
diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index 954b395..faf19d9 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -1264,6 +1264,41 @@ class TestSchedulerJob(unittest.TestCase):
         assert 0 == len(res)
         session.rollback()
 
+    def test_infinite_pool(self):
+        dag_id = 'SchedulerJobTest.test_infinite_pool'
+        task_id = 'dummy'
+        dag = DAG(dag_id=dag_id, start_date=DEFAULT_DATE, concurrency=16)
+        task = DummyOperator(dag=dag, task_id=task_id, pool="infinite_pool")
+        dag = SerializedDAG.from_dict(SerializedDAG.to_dict(dag))
+
+        self.scheduler_job = SchedulerJob(subdir=os.devnull)
+        session = settings.Session()
+
+        dag_model = DagModel(
+            dag_id=dag_id,
+            is_paused=False,
+            concurrency=dag.concurrency,
+            has_task_concurrency_limits=False,
+        )
+        session.add(dag_model)
+        dr = dag.create_dagrun(
+            run_type=DagRunType.SCHEDULED,
+            execution_date=DEFAULT_DATE,
+            state=State.RUNNING,
+        )
+
+        ti = TaskInstance(task, dr.execution_date)
+        ti.state = State.SCHEDULED
+        session.merge(ti)
+        infinite_pool = Pool(pool='infinite_pool', slots=-1, description='infinite pool')
+        session.add(infinite_pool)
+        session.commit()
+
+        res = self.scheduler_job._executable_task_instances_to_queued(max_tis=32, session=session)
+        session.flush()
+        assert 1 == len(res)
+        session.rollback()
+
     def test_find_executable_task_instances_none(self):
         dag_id = 'SchedulerJobTest.test_find_executable_task_instances_none'
         task_id_1 = 'dummy'
diff --git a/tests/models/test_pool.py b/tests/models/test_pool.py
index f4c7626..7981e23 100644
--- a/tests/models/test_pool.py
+++ b/tests/models/test_pool.py
@@ -110,10 +110,10 @@ class TestPool(unittest.TestCase):
                 "running": 0,
             },
             "test_pool": {
-                "open": -1,
+                "open": float('inf'),
                 "queued": 1,
                 "running": 1,
-                "total": -1,
+                "total": float('inf'),
             },
         } == pool.slots_stats()
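
The arithmetic behind the test change, as a standalone sketch: with `total` mapped to `float('inf')`, the computed open-slot count stays positive, so the pool is never treated as starved.

    total_slots = float('inf')  # was -1 for "infinite" pools
    running, queued = 1, 1
    open_slots = total_slots - running - queued
    print(open_slots)  # inf -> tasks in this pool can always be queued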
 

[airflow] 13/38: Fix Orphaned tasks stuck in CeleryExecutor as running (#16550)


commit c6313e48a2ea53836b2d6619741534443f08f9aa
Author: Jorrick Sleijster <jo...@gmail.com>
AuthorDate: Tue Jun 22 10:08:00 2021 +0200

    Fix Orphaned tasks stuck in CeleryExecutor as running (#16550)
    
    (cherry picked from commit 90f0088c5752b56177597725cc716f707f2f8456)
---
 airflow/executors/celery_executor.py    | 4 +---
 tests/executors/test_celery_executor.py | 2 ++
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/airflow/executors/celery_executor.py b/airflow/executors/celery_executor.py
index 553639b..567fe58 100644
--- a/airflow/executors/celery_executor.py
+++ b/airflow/executors/celery_executor.py
@@ -369,9 +369,7 @@ class CeleryExecutor(BaseExecutor):
                 "\n\t".join([repr(x) for x in timedout_keys]),
             )
             for key in timedout_keys:
-                self.event_buffer[key] = (State.FAILED, None)
-                del self.tasks[key]
-                del self.adopted_task_timeouts[key]
+                self.change_state(key, State.FAILED)
 
     def debug_dump(self) -> None:
         """Called in response to SIGUSR2 by the scheduler"""
diff --git a/tests/executors/test_celery_executor.py b/tests/executors/test_celery_executor.py
index 19c8a0d..d15ca9a 100644
--- a/tests/executors/test_celery_executor.py
+++ b/tests/executors/test_celery_executor.py
@@ -371,10 +371,12 @@ class TestCeleryExecutor(unittest.TestCase):
             key_1: queued_dttm + executor.task_adoption_timeout,
             key_2: queued_dttm + executor.task_adoption_timeout,
         }
+        executor.running = {key_1, key_2}
         executor.tasks = {key_1: AsyncResult("231"), key_2: AsyncResult("232")}
         executor.sync()
         assert executor.event_buffer == {key_1: (State.FAILED, None), key_2: (State.FAILED, None)}
         assert executor.tasks == {}
+        assert executor.running == set()
         assert executor.adopted_task_timeouts == {}
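
A sketch of why routing through `change_state` matters, reusing the test's `executor` and a task `key` (both assumed): the base method also clears the key from the `running` set, which the old inline code never did.

    from airflow.utils.state import State

    executor.running = {key}
    executor.change_state(key, State.FAILED)
    assert key not in executor.running
    assert executor.event_buffer[key] == (State.FAILED, None)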
 
 

[airflow] 19/38: Clean Markdown with dedent to respect indents (#16414)


commit 4c06aaebde277ce037de0ed81ec9d91a28b798bf
Author: Tzu-ping Chung <tp...@astronomer.io>
AuthorDate: Sun Jun 13 08:30:11 2021 +0800

    Clean Markdown with dedent to respect indents (#16414)
    
    (cherry picked from commit 6f9c0ceeb40947c226d35587097529d04c3e3e59)
---
 airflow/www/utils.py    |  5 ++---
 tests/www/test_utils.py | 36 ++++++++++++++++++++++++++++++++++++
 2 files changed, 38 insertions(+), 3 deletions(-)

diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index b0c93ba..e0ba0db 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -16,6 +16,7 @@
 # specific language governing permissions and limitations
 # under the License.
 import json
+import textwrap
 import time
 from urllib.parse import urlencode
 
@@ -344,9 +345,7 @@ def wrapped_markdown(s, css_class='rich_doc'):
     """Convert a Markdown string to HTML."""
     if s is None:
         return None
-
-    s = '\n'.join(line.lstrip() for line in s.split('\n'))
-
+    s = textwrap.dedent(s)
     return Markup(f'<div class="{css_class}" >' + markdown.markdown(s, extensions=['tables']) + "</div>")
 
 
diff --git a/tests/www/test_utils.py b/tests/www/test_utils.py
index 8f381a5..01c49e1 100644
--- a/tests/www/test_utils.py
+++ b/tests/www/test_utils.py
@@ -218,3 +218,39 @@ class TestWrappedMarkdown(unittest.TestCase):
         )
 
         assert '<div class="rich_doc" ><h1>header</h1>\n<p>1st line\n2nd line</p></div>' == rendered
+
+    def test_wrapped_markdown_with_raw_code_block(self):
+        rendered = wrapped_markdown(
+            """\
+            # Markdown code block
+
+            Inline `code` works well.
+
+                Code block
+                does not
+                respect
+                newlines
+
+            """
+        )
+
+        assert (
+            '<div class="rich_doc" ><h1>Markdown code block</h1>\n'
+            '<p>Inline <code>code</code> works well.</p>\n'
+            '<pre><code>Code block\ndoes not\nrespect\nnewlines\n</code></pre></div>'
+        ) == rendered
+
+    def test_wrapped_markdown_with_nested_list(self):
+        rendered = wrapped_markdown(
+            """
+            ### Docstring with a code block
+
+            - And
+                - A nested list
+            """
+        )
+
+        assert (
+            '<div class="rich_doc" ><h3>Docstring with a code block</h3>\n'
+            '<ul>\n<li>And<ul>\n<li>A nested list</li>\n</ul>\n</li>\n</ul></div>'
+        ) == rendered
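
The difference between the old per-line `lstrip` and `textwrap.dedent` in one sketch: `dedent` removes only the indentation common to all lines, so the extra indent that Markdown needs for code blocks and nested lists survives.

    import textwrap

    raw = "    # header\n\n        indented code block\n"
    # Old behaviour: lstrip() every line, destroying the relative indent
    # Markdown needs for code blocks. dedent removes only the common prefix:
    print(textwrap.dedent(raw))
    # -> "# header\n\n    indented code block\n"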

[airflow] 07/38: Fix apply defaults for task decorator (#16085)


commit 4e431ec87509a19eb6db330434197161e97b860a
Author: Jun <ju...@gmail.com>
AuthorDate: Thu May 27 23:34:03 2021 +0800

    Fix apply defaults for task decorator (#16085)
    
    (cherry picked from commit 9d06ee8019ecbc07d041ccede15d0e322aa797a3)
---
 airflow/decorators/base.py      | 14 ++++++++++++++
 airflow/models/baseoperator.py  |  6 ++++++
 tests/decorators/test_python.py | 16 ++++++++++++++++
 3 files changed, 36 insertions(+)

diff --git a/airflow/decorators/base.py b/airflow/decorators/base.py
index 47fb0d2..3307f05 100644
--- a/airflow/decorators/base.py
+++ b/airflow/decorators/base.py
@@ -162,6 +162,20 @@ class DecoratedOperator(BaseOperator):
             )
         return return_value
 
+    def _hook_apply_defaults(self, *args, **kwargs):
+        if 'python_callable' not in kwargs:
+            return args, kwargs
+
+        python_callable = kwargs['python_callable']
+        default_args = kwargs.get('default_args') or {}
+        op_kwargs = kwargs.get('op_kwargs') or {}
+        f_sig = signature(python_callable)
+        for arg in f_sig.parameters:
+            if arg not in op_kwargs and arg in default_args:
+                op_kwargs[arg] = default_args[arg]
+        kwargs['op_kwargs'] = op_kwargs
+        return args, kwargs
+
 
 T = TypeVar("T", bound=Callable)  # pylint: disable=invalid-name
 
diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
index f6fec77..e243b5e 100644
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -176,6 +176,12 @@ class BaseOperatorMeta(abc.ABCMeta):
             if dag_params:
                 kwargs['params'] = dag_params
 
+            if default_args:
+                kwargs['default_args'] = default_args
+
+            if hasattr(self, '_hook_apply_defaults'):
+                args, kwargs = self._hook_apply_defaults(*args, **kwargs)  # pylint: disable=protected-access
+
             result = func(self, *args, **kwargs)
 
             # Here we set upstream task defined by XComArgs passed to template fields of the operator
diff --git a/tests/decorators/test_python.py b/tests/decorators/test_python.py
index a829863..59849fc 100644
--- a/tests/decorators/test_python.py
+++ b/tests/decorators/test_python.py
@@ -411,6 +411,22 @@ class TestAirflowTaskDecorator(TestPythonBase):
             ret = do_run()
         assert ret.operator.owner == 'airflow'  # pylint: disable=maybe-no-member
 
+        @task_decorator
+        def test_apply_default_raise(unknow):
+            return unknow
+
+        with pytest.raises(TypeError):
+            with self.dag:
+                test_apply_default_raise()  # pylint: disable=no-value-for-parameter
+
+        @task_decorator
+        def test_apply_default(owner):
+            return owner
+
+        with self.dag:
+            ret = test_apply_default()  # pylint: disable=no-value-for-parameter
+        assert 'owner' in ret.operator.op_kwargs
+
     def test_xcom_arg(self):
         """Tests that returned key in XComArg is returned correctly"""
 

[airflow] 15/38: Add `passphrase` and `private_key` to default sensitive field names (#16392)


commit e58a6a9514768a11f9e043e76277a0201308ce63
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Fri Jun 11 19:08:35 2021 +0100

    Add `passphrase` and `private_key` to default sensitive field names (#16392)
    
    (cherry picked from commit 430073132446f7cc9c7d3baef99019be470d2a37)
---
 airflow/utils/log/secrets_masker.py | 12 +++++++-----
 1 file changed, 7 insertions(+), 5 deletions(-)

diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
index b3ccfdb..3177c58 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -47,13 +47,15 @@ log = logging.getLogger(__name__)
 
 DEFAULT_SENSITIVE_FIELDS = frozenset(
     {
-        'password',
-        'secret',
-        'passwd',
-        'authorization',
+        'access_token',
         'api_key',
         'apikey',
-        'access_token',
+        'authorization',
+        'passphrase',
+        'passwd',
+        'password',
+        'private_key',
+        'secret',
     }
 )
 """Names of fields (Connection extra, Variable key name etc.) that are deemed sensitive"""

[airflow] 10/38: Ensure that we don't try to mask empty string in logs (#16057)


commit bff528b938db781a42090cbf09a77bdcebe41b45
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Tue May 25 19:31:22 2021 +0100

    Ensure that we don't try to mask empty string in logs (#16057)
    
    Although `Connection.password` being empty was guarded against, there
    are other possible cases (such as an extra field) that weren't guarded
    against, which ended up with this in the logs:
    
        WARNING - ***-***-***-*** ***L***o***g***g***i***n***g*** ***e***r***r***o***r*** ***-***-***-***
    
    Oops!
    
    (cherry picked from commit 8814a59a5bf54dd17aef21eefd0900703330c22c)

[airflow] 12/38: Don't fail to log if we can't redact something (#16118)


commit 7603ef612c1f7cfdf9b85704895cad8d67c149f1
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Mon Jun 7 09:27:01 2021 +0100

    Don't fail to log if we can't redact something (#16118)
    
    Rather than dying with an exception, catch it and warn about that,
    asking users to report it to us.
    
    Additionally handle the specific case where a file handle/IO object is
    logged -- we definitely don't want to iterate over that!
    
    (cherry picked from commit 57bd6fb2925a7d505a80b83140811b94b363f49c)
---
 airflow/utils/log/secrets_masker.py    | 53 ++++++++++++++++++++++------------
 tests/utils/log/test_secrets_masker.py | 24 +++++++++++++++
 2 files changed, 59 insertions(+), 18 deletions(-)

diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
index 6df8d39..b3ccfdb 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -16,6 +16,7 @@
 # under the License.
 """Mask sensitive information from logs"""
 import collections
+import io
 import logging
 import re
 from typing import TYPE_CHECKING, Iterable, Optional, Set, TypeVar, Union
@@ -40,6 +41,10 @@ if TYPE_CHECKING:
 
     RedactableItem = TypeVar('RedctableItem')
 
+
+log = logging.getLogger(__name__)
+
+
 DEFAULT_SENSITIVE_FIELDS = frozenset(
     {
         'password',
@@ -186,24 +191,36 @@ class SecretsMasker(logging.Filter):
         is redacted.
 
         """
-        if name and should_hide_value_for_key(name):
-            return self._redact_all(item)
-
-        if isinstance(item, dict):
-            return {dict_key: self.redact(subval, dict_key) for dict_key, subval in item.items()}
-        elif isinstance(item, str):
-            if self.replacer:
-                # We can't replace specific values, but the key-based redacting
-                # can still happen, so we can't short-circuit, we need to walk
-                # the structure.
-                return self.replacer.sub('***', item)
-            return item
-        elif isinstance(item, (tuple, set)):
-            # Turn set in to tuple!
-            return tuple(self.redact(subval) for subval in item)
-        elif isinstance(item, Iterable):
-            return list(self.redact(subval) for subval in item)
-        else:
+        try:
+            if name and should_hide_value_for_key(name):
+                return self._redact_all(item)
+
+            if isinstance(item, dict):
+                return {dict_key: self.redact(subval, dict_key) for dict_key, subval in item.items()}
+            elif isinstance(item, str):
+                if self.replacer:
+                    # We can't replace specific values, but the key-based redacting
+                    # can still happen, so we can't short-circuit, we need to walk
+                    # the structure.
+                    return self.replacer.sub('***', item)
+                return item
+            elif isinstance(item, (tuple, set)):
+                # Turn set in to tuple!
+                return tuple(self.redact(subval) for subval in item)
+            elif isinstance(item, io.IOBase):
+                return item
+            elif isinstance(item, Iterable):
+                return list(self.redact(subval) for subval in item)
+            else:
+                return item
+        except Exception as e:  # pylint: disable=broad-except
+            log.warning(
+                "Unable to redact %r, please report this via <https://github.com/apache/airflow/issues>. "
+                "Error was: %s: %s",
+                item,
+                type(e).__name__,
+                str(e),
+            )
             return item
 
     # pylint: enable=too-many-return-statements
diff --git a/tests/utils/log/test_secrets_masker.py b/tests/utils/log/test_secrets_masker.py
index 8c88bdd..24e86c1 100644
--- a/tests/utils/log/test_secrets_masker.py
+++ b/tests/utils/log/test_secrets_masker.py
@@ -72,6 +72,22 @@ class TestSecretsMasker:
 
         assert caplog.text == "INFO Cannot connect to user:***\n"
 
+    def test_non_redactable(self, logger, caplog):
+        class NonReactable:
+            def __iter__(self):
+                raise RuntimeError("force fail")
+
+            def __repr__(self):
+                return "<NonRedactable>"
+
+        logger.info("Logging %s", NonReactable())
+
+        assert caplog.messages == [
+            "Unable to redact <NonRedactable>, please report this via "
+            + "<https://github.com/apache/airflow/issues>. Error was: RuntimeError: force fail",
+            "Logging <NonRedactable>",
+        ]
+
     def test_extra(self, logger, caplog):
         logger.handlers[0].formatter = ShortExcFormatter("%(levelname)s %(message)s %(conn)s")
         logger.info("Cannot connect", extra={'conn': "user:password"})
@@ -202,6 +218,14 @@ class TestSecretsMasker:
 
         assert filt.redact(value, name) == expected
 
+    def test_redact_filehandles(self, caplog):
+        filt = SecretsMasker()
+        with open("/dev/null", "w") as handle:
+            assert filt.redact(handle, None) == handle
+
+        # We shouldn't have logged a warning here
+        assert caplog.messages == []
+
 
 class TestShouldHideValueForKey:
     @pytest.mark.parametrize(
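
A minimal sketch of the two behaviours added here, mirroring the tests: IO objects pass through untouched, and any object whose redaction blows up is returned as-is with a warning instead of killing the log call.

    from airflow.utils.log.secrets_masker import SecretsMasker

    masker = SecretsMasker()
    with open("/dev/null", "w") as fh:
        assert masker.redact(fh, None) is fh  # file handles are not iterated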

[airflow] 09/38: Fill the "job_id" field for `airflow task run` without `--local`/`--raw` for KubeExecutor (#16108)


commit b9e5a2dc46b31e3589529dde5f8e1d3e2041b313
Author: Tao He <si...@gmail.com>
AuthorDate: Thu May 27 20:50:03 2021 +0800

    Fill the "job_id" field for `airflow task run` without `--local`/`--raw` for KubeExecutor (#16108)
    
    (cherry picked from commit cdc9f1a33854254607fa81265a323cf1eed6d6bb)
---
 airflow/cli/commands/task_command.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/airflow/cli/commands/task_command.py b/airflow/cli/commands/task_command.py
index dc38b87..ee34646 100644
--- a/airflow/cli/commands/task_command.py
+++ b/airflow/cli/commands/task_command.py
@@ -88,6 +88,7 @@ def _run_task_by_executor(args, dag, ti):
             print(e)
             raise e
     executor = ExecutorLoader.get_default_executor()
+    executor.job_id = "manual"
     executor.start()
     print("Sending to executor.")
     executor.queue_task_instance(

[airflow] 35/38: Fix Dag Details start date bug (#16206)


commit 8114542a21159bf3943d0966f60d7d583b39de88
Author: Brent Bovenzi <br...@gmail.com>
AuthorDate: Tue Jun 8 10:13:18 2021 -0400

    Fix Dag Details start date bug (#16206)
    
    * Only show Start Date when catchup=True
    
    * add catchup field to details
    
    * add started field for catchup=false dags
    
    (cherry picked from commit ebc03c63af7282c9d826054b17fe7ed50e09fe4e)
---
 airflow/www/templates/airflow/dag_details.html | 15 +++++++++++++--
 1 file changed, 13 insertions(+), 2 deletions(-)

diff --git a/airflow/www/templates/airflow/dag_details.html b/airflow/www/templates/airflow/dag_details.html
index 8d30ef2..e841319 100644
--- a/airflow/www/templates/airflow/dag_details.html
+++ b/airflow/www/templates/airflow/dag_details.html
@@ -49,9 +49,20 @@
       <td>{{ dag.schedule_interval }}</td>
     </tr>
     <tr>
-      <th>Start Date</th>
-      <td>{{ dag.start_date }}</td>
+      <th>Catchup</th>
+      <td>{{ dag.catchup }}</td>
     </tr>
+    {% if dag.catchup %}
+      <tr>
+        <th>Start Date</th>
+        <td class="js-format-date">{{ dag.start_date }}</td>
+      </tr>
+    {% else %}
+      <tr>
+        <th>Started</th>
+        <td>{{ states|length > 0 }}</td>
+      </tr>
+    {% endif %}
     <tr>
       <th>End Date</th>
       <td>{{ dag.end_date }}</td>

[airflow] 06/38: Parse recently modified files even if just parsed (#16075)


commit 734f1dc3f45712d4702e09e0e68c76f5a8327f7b
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Wed May 26 11:29:35 2021 +0100

    Parse recently modified files even if just parsed (#16075)
    
    This commit adds an optimization where recently modified files
    (detected by mtime) are parsed even if they have not yet reached
    `min_file_process_interval`.
    
    This way you can increase `[scheduler] min_file_process_interval` to
    a higher value like `600` when you have a large number of files, to
    avoid unnecessary reparsing when files haven't changed, while still
    making sure that modified files are picked up promptly.
    
    (cherry picked from commit add7490145fabd097d605d85a662dccd02b600de)
---
 airflow/utils/dag_processing.py    |  8 ++++-
 tests/utils/test_dag_processing.py | 60 ++++++++++++++++++++++++++++++++++++++
 2 files changed, 67 insertions(+), 1 deletion(-)

diff --git a/airflow/utils/dag_processing.py b/airflow/utils/dag_processing.py
index 4b85234..676b610 100644
--- a/airflow/utils/dag_processing.py
+++ b/airflow/utils/dag_processing.py
@@ -1054,14 +1054,20 @@ class DagFileProcessorManager(LoggingMixin):  # pylint: disable=too-many-instanc
 
             if is_mtime_mode:
                 files_with_mtime[file_path] = os.path.getmtime(file_path)
+                file_modified_time = timezone.make_aware(datetime.fromtimestamp(files_with_mtime[file_path]))
             else:
                 file_paths.append(file_path)
+                file_modified_time = None
 
-            # Find file paths that were recently processed
+            # Find file paths that were recently processed to exclude them
+            # from being added to file_path_queue
+            # unless they were modified recently and parsing mode is "modified_time"
+            # in which case we don't honor "self._file_process_interval" (min_file_process_interval)
             last_finish_time = self.get_last_finish_time(file_path)
             if (
                 last_finish_time is not None
                 and (now - last_finish_time).total_seconds() < self._file_process_interval
+                and not (is_mtime_mode and file_modified_time and (file_modified_time > last_finish_time))
             ):
                 file_paths_recently_processed.append(file_path)
 
diff --git a/tests/utils/test_dag_processing.py b/tests/utils/test_dag_processing.py
index 78cd988..3242cf3 100644
--- a/tests/utils/test_dag_processing.py
+++ b/tests/utils/test_dag_processing.py
@@ -31,6 +31,7 @@ from unittest import mock
 from unittest.mock import MagicMock, PropertyMock
 
 import pytest
+from freezegun import freeze_time
 
 from airflow.configuration import conf
 from airflow.jobs.local_task_job import LocalTaskJob as LJ
@@ -324,6 +325,65 @@ class TestDagFileProcessorManager(unittest.TestCase):
         manager.prepare_file_path_queue()
         assert manager._file_path_queue == ['file_4.py', 'file_1.py', 'file_3.py', 'file_2.py']
 
+    @conf_vars({("scheduler", "file_parsing_sort_mode"): "modified_time"})
+    @mock.patch("zipfile.is_zipfile", return_value=True)
+    @mock.patch("airflow.utils.file.might_contain_dag", return_value=True)
+    @mock.patch("airflow.utils.file.find_path_from_directory", return_value=True)
+    @mock.patch("airflow.utils.file.os.path.isfile", return_value=True)
+    @mock.patch("airflow.utils.file.os.path.getmtime")
+    def test_recently_modified_file_is_parsed_with_mtime_mode(
+        self, mock_getmtime, mock_isfile, mock_find_path, mock_might_contain_dag, mock_zipfile
+    ):
+        """
+        Test recently updated files are processed even if min_file_process_interval is not reached
+        """
+        freezed_base_time = timezone.datetime(2020, 1, 5, 0, 0, 0)
+        initial_file_1_mtime = (freezed_base_time - timedelta(minutes=5)).timestamp()
+        dag_files = ["file_1.py"]
+        mock_getmtime.side_effect = [initial_file_1_mtime]
+        mock_find_path.return_value = dag_files
+
+        manager = DagFileProcessorManager(
+            dag_directory='directory',
+            max_runs=3,
+            processor_factory=MagicMock().return_value,
+            processor_timeout=timedelta.max,
+            signal_conn=MagicMock(),
+            dag_ids=[],
+            pickle_dags=False,
+            async_mode=True,
+        )
+
+        # let's say the DAG was just parsed 2 seconds before the Freezed time
+        last_finish_time = freezed_base_time - timedelta(seconds=10)
+        manager._file_stats = {
+            "file_1.py": DagFileStat(1, 0, last_finish_time, 1.0, 1),
+        }
+        with freeze_time(freezed_base_time):
+            manager.set_file_paths(dag_files)
+            assert manager._file_path_queue == []
+            # File Path Queue will be empty as the "modified time" < "last finish time"
+            manager.prepare_file_path_queue()
+            assert manager._file_path_queue == []
+
+        # Simulate the DAG modification by using modified_time which is greater
+        # than the last_parse_time but still less than now - min_file_process_interval
+        file_1_new_mtime = freezed_base_time - timedelta(seconds=5)
+        file_1_new_mtime_ts = file_1_new_mtime.timestamp()
+        with freeze_time(freezed_base_time):
+            manager.set_file_paths(dag_files)
+            assert manager._file_path_queue == []
+            # File Path Queue will be empty as the "modified time" < "last finish time"
+            mock_getmtime.side_effect = [file_1_new_mtime_ts]
+            manager.prepare_file_path_queue()
+            # Check that file is added to the queue even though file was just recently parsed
+            assert manager._file_path_queue == ["file_1.py"]
+            assert last_finish_time < file_1_new_mtime
+            assert (
+                manager._file_process_interval
+                > (freezed_base_time - manager.get_last_finish_time("file_1.py")).total_seconds()
+            )
+
     def test_find_zombies(self):
         manager = DagFileProcessorManager(
             dag_directory='directory',
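
Hedged pseudocode of the rule the test above pins down (the helper name and signature
are illustrative, not the scheduler's actual internals): a file is re-queued when its
mtime is newer than its last parse finish time, even if min_file_process_interval has
not yet elapsed; otherwise it waits out the interval.

    from datetime import datetime

    def should_requeue(now: datetime, mtime: datetime,
                       last_finish: datetime, process_interval: float) -> bool:
        # Re-parse immediately if the file changed after the last parse finished...
        if mtime > last_finish:
            return True
        # ...otherwise wait until the configured interval has elapsed.
        return (now - last_finish).total_seconds() >= process_interval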

[airflow] 23/38: Support remote logging in elasticsearch with filebeat 7 (#14625)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 77060cdeeb6b82f4aa7891e9feb0cea677857525
Author: Jed Cunningham <66...@users.noreply.github.com>
AuthorDate: Fri Jun 11 12:32:42 2021 -0600

    Support remote logging in elasticsearch with filebeat 7 (#14625)
    
    Filebeat 7 renamed some fields (offset->log.offset and host->host.name),
    so allow the field names Airflow uses to be configured.
    
    Airflow isn't directly involved with getting the logs _to_
    elasticsearch, so we should allow easy configuration to accommodate
    whatever tools are used in that process.
    
    (cherry picked from commit 5cd0bf733b839951c075c54e808a595ac923c4e8)
---
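A hedged illustration of why these field names need to be configurable (the document
shapes below are assumed from the filebeat field renames described above, and `lookup`
is an illustrative helper, not Airflow code): filebeat 7 nests the fields that
filebeat 6 kept flat, so the handler must be told which, possibly dotted, names to
read. In airflow.cfg this means setting `host_field` and `offset_field` under
`[elasticsearch]`, e.g. to `host.name` and `log.offset`.

    filebeat6_doc = {"message": "...", "host": "worker-1", "offset": 1}
    filebeat7_doc = {"message": "...", "host": {"name": "worker-1"}, "log": {"offset": 1}}

    def lookup(doc: dict, dotted_field: str):
        """Resolve a possibly dotted field name against a JSON-like document."""
        value = doc
        for part in dotted_field.split("."):
            value = value[part]
        return value

    assert lookup(filebeat6_doc, "offset") == lookup(filebeat7_doc, "log.offset") == 1
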
 airflow/config_templates/airflow_local_settings.py |  4 ++
 airflow/config_templates/config.yml                | 14 +++++
 airflow/config_templates/default_airflow.cfg       |  6 +++
 .../providers/elasticsearch/log/es_task_handler.py | 20 ++++---
 .../elasticsearch/log/test_es_task_handler.py      | 61 +++++++++++++++++++++-
 5 files changed, 97 insertions(+), 8 deletions(-)

diff --git a/airflow/config_templates/airflow_local_settings.py b/airflow/config_templates/airflow_local_settings.py
index 3c8ffb8..b3705bf 100644
--- a/airflow/config_templates/airflow_local_settings.py
+++ b/airflow/config_templates/airflow_local_settings.py
@@ -254,6 +254,8 @@ if REMOTE_LOGGING:
         ELASTICSEARCH_WRITE_STDOUT: bool = conf.getboolean('elasticsearch', 'WRITE_STDOUT')
         ELASTICSEARCH_JSON_FORMAT: bool = conf.getboolean('elasticsearch', 'JSON_FORMAT')
         ELASTICSEARCH_JSON_FIELDS: str = conf.get('elasticsearch', 'JSON_FIELDS')
+        ELASTICSEARCH_HOST_FIELD: str = conf.get('elasticsearch', 'HOST_FIELD')
+        ELASTICSEARCH_OFFSET_FIELD: str = conf.get('elasticsearch', 'OFFSET_FIELD')
 
         ELASTIC_REMOTE_HANDLERS: Dict[str, Dict[str, Union[str, bool]]] = {
             'task': {
@@ -268,6 +270,8 @@ if REMOTE_LOGGING:
                 'write_stdout': ELASTICSEARCH_WRITE_STDOUT,
                 'json_format': ELASTICSEARCH_JSON_FORMAT,
                 'json_fields': ELASTICSEARCH_JSON_FIELDS,
+                'host_field': ELASTICSEARCH_HOST_FIELD,
+                'offset_field': ELASTICSEARCH_OFFSET_FIELD,
             },
         }
 
diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index 39d2539..dd2d48f 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -1992,6 +1992,20 @@
       type: string
       example: ~
       default: "asctime, filename, lineno, levelname, message"
+    - name: host_field
+      description: |
+        The field where host name is stored (normally either `host` or `host.name`)
+      version_added: 2.1.1
+      type: string
+      example: ~
+      default: "host"
+    - name: offset_field
+      description: |
+        The field where offset is stored (normally either `offset` or `log.offset`)
+      version_added: 2.1.1
+      type: string
+      example: ~
+      default: "offset"
 - name: elasticsearch_configs
   description: ~
   options:
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index bf033ef..f8e8588 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -989,6 +989,12 @@ json_format = False
 # Log fields to also attach to the json output, if enabled
 json_fields = asctime, filename, lineno, levelname, message
 
+# The field where host name is stored (normally either `host` or `host.name`)
+host_field = host
+
+# The field where offset is stored (normally either `offset` or `log.offset`)
+offset_field = offset
+
 [elasticsearch_configs]
 use_ssl = False
 verify_certs = True
diff --git a/airflow/providers/elasticsearch/log/es_task_handler.py b/airflow/providers/elasticsearch/log/es_task_handler.py
index 44e72bf..16e4d65 100644
--- a/airflow/providers/elasticsearch/log/es_task_handler.py
+++ b/airflow/providers/elasticsearch/log/es_task_handler.py
@@ -20,6 +20,7 @@ import logging
 import sys
 from collections import defaultdict
 from datetime import datetime
+from operator import attrgetter
 from time import time
 from typing import List, Optional, Tuple
 from urllib.parse import quote
@@ -71,6 +72,8 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
         write_stdout: bool,
         json_format: bool,
         json_fields: str,
+        host_field: str = "host",
+        offset_field: str = "offset",
         host: str = "localhost:9200",
         frontend: str = "localhost:5601",
         es_kwargs: Optional[dict] = conf.getsection("elasticsearch_configs"),
@@ -94,6 +97,8 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
         self.write_stdout = write_stdout
         self.json_format = json_format
         self.json_fields = [label.strip() for label in json_fields.split(",")]
+        self.host_field = host_field
+        self.offset_field = offset_field
         self.handler = None
         self.context_set = False
 
@@ -122,11 +127,10 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
         """
         return execution_date.strftime("%Y_%m_%dT%H_%M_%S_%f")
 
-    @staticmethod
-    def _group_logs_by_host(logs):
+    def _group_logs_by_host(self, logs):
         grouped_logs = defaultdict(list)
         for log in logs:
-            key = getattr(log, 'host', 'default_host')
+            key = getattr(log, self.host_field, 'default_host')
             grouped_logs[key].append(log)
 
         # return items sorted by timestamp.
@@ -160,7 +164,7 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
         logs = self.es_read(log_id, offset, metadata)
         logs_by_host = self._group_logs_by_host(logs)
 
-        next_offset = offset if not logs else logs[-1].offset
+        next_offset = offset if not logs else attrgetter(self.offset_field)(logs[-1])
 
         # Ensure a string here. Large offset numbers will get JSON.parsed incorrectly
         # on the client. Sending as a string prevents this issue.
@@ -227,14 +231,16 @@ class ElasticsearchTaskHandler(FileTaskHandler, LoggingMixin):
         :type metadata: dict
         """
         # Offset is the unique key for sorting logs given log_id.
-        search = Search(using=self.client).query('match_phrase', log_id=log_id).sort('offset')
+        search = Search(using=self.client).query('match_phrase', log_id=log_id).sort(self.offset_field)
 
-        search = search.filter('range', offset={'gt': int(offset)})
+        search = search.filter('range', **{self.offset_field: {'gt': int(offset)}})
         max_log_line = search.count()
         if 'download_logs' in metadata and metadata['download_logs'] and 'max_offset' not in metadata:
             try:
                 if max_log_line > 0:
-                    metadata['max_offset'] = search[max_log_line - 1].execute()[-1].offset
+                    metadata['max_offset'] = attrgetter(self.offset_field)(
+                        search[max_log_line - 1].execute()[-1]
+                    )
                 else:
                     metadata['max_offset'] = 0
             except Exception:  # pylint: disable=broad-except
diff --git a/tests/providers/elasticsearch/log/test_es_task_handler.py b/tests/providers/elasticsearch/log/test_es_task_handler.py
index 8a3a7a2..4025722 100644
--- a/tests/providers/elasticsearch/log/test_es_task_handler.py
+++ b/tests/providers/elasticsearch/log/test_es_task_handler.py
@@ -38,7 +38,7 @@ from airflow.utils.timezone import datetime
 from .elasticmock import elasticmock
 
 
-class TestElasticsearchTaskHandler(unittest.TestCase):
+class TestElasticsearchTaskHandler(unittest.TestCase):  # pylint: disable=too-many-instance-attributes
     DAG_ID = 'dag_for_testing_file_task_handler'
     TASK_ID = 'task_for_testing_file_log_handler'
     EXECUTION_DATE = datetime(2016, 1, 1)
@@ -54,6 +54,8 @@ class TestElasticsearchTaskHandler(unittest.TestCase):
         self.write_stdout = False
         self.json_format = False
         self.json_fields = 'asctime,filename,lineno,levelname,message,exc_text'
+        self.host_field = 'host'
+        self.offset_field = 'offset'
         self.es_task_handler = ElasticsearchTaskHandler(
             self.local_log_location,
             self.filename_template,
@@ -62,6 +64,8 @@ class TestElasticsearchTaskHandler(unittest.TestCase):
             self.write_stdout,
             self.json_format,
             self.json_fields,
+            self.host_field,
+            self.offset_field,
         )
 
         self.es = elasticsearch.Elasticsearch(  # pylint: disable=invalid-name
@@ -103,6 +107,8 @@ class TestElasticsearchTaskHandler(unittest.TestCase):
             self.write_stdout,
             self.json_format,
             self.json_fields,
+            self.host_field,
+            self.offset_field,
             es_kwargs=es_conf,
         )
 
@@ -276,6 +282,55 @@ class TestElasticsearchTaskHandler(unittest.TestCase):
         )
         assert "[2020-12-24 19:25:00,962] {taskinstance.py:851} INFO - some random stuff - " == logs[0][0][1]
 
+    def test_read_with_json_format_with_custom_offset_and_host_fields(self):
+        ts = pendulum.now()
+        formatter = logging.Formatter(
+            '[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s - %(exc_text)s'
+        )
+        self.es_task_handler.formatter = formatter
+        self.es_task_handler.json_format = True
+        self.es_task_handler.host_field = "host.name"
+        self.es_task_handler.offset_field = "log.offset"
+
+        self.body = {
+            'message': self.test_message,
+            'log_id': f'{self.DAG_ID}-{self.TASK_ID}-2016_01_01T00_00_00_000000-1',
+            'log': {'offset': 1},
+            'host': {'name': 'somehostname'},
+            'asctime': '2020-12-24 19:25:00,962',
+            'filename': 'taskinstance.py',
+            'lineno': 851,
+            'levelname': 'INFO',
+        }
+        self.es_task_handler.set_context(self.ti)
+        self.es.index(index=self.index_name, doc_type=self.doc_type, body=self.body, id=id)
+
+        logs, _ = self.es_task_handler.read(
+            self.ti, 1, {'offset': 0, 'last_log_timestamp': str(ts), 'end_of_log': False}
+        )
+        assert "[2020-12-24 19:25:00,962] {taskinstance.py:851} INFO - some random stuff - " == logs[0][0][1]
+
+    def test_read_with_custom_offset_and_host_fields(self):
+        ts = pendulum.now()
+        # Delete the existing log entry as it doesn't have the new offset and host fields
+        self.es.delete(index=self.index_name, doc_type=self.doc_type, id=1)
+
+        self.es_task_handler.host_field = "host.name"
+        self.es_task_handler.offset_field = "log.offset"
+
+        self.body = {
+            'message': self.test_message,
+            'log_id': self.LOG_ID,
+            'log': {'offset': 1},
+            'host': {'name': 'somehostname'},
+        }
+        self.es.index(index=self.index_name, doc_type=self.doc_type, body=self.body, id=id)
+
+        logs, _ = self.es_task_handler.read(
+            self.ti, 1, {'offset': 0, 'last_log_timestamp': str(ts), 'end_of_log': False}
+        )
+        assert self.test_message == logs[0][0][1]
+
     def test_close(self):
         formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
         self.es_task_handler.formatter = formatter
@@ -357,6 +412,8 @@ class TestElasticsearchTaskHandler(unittest.TestCase):
             self.write_stdout,
             self.json_format,
             self.json_fields,
+            self.host_field,
+            self.offset_field,
         )
         log_id = self.es_task_handler._render_log_id(self.ti, 1)
         assert expected_log_id == log_id
@@ -382,6 +439,8 @@ class TestElasticsearchTaskHandler(unittest.TestCase):
             self.write_stdout,
             self.json_format,
             self.json_fields,
+            self.host_field,
+            self.offset_field,
             frontend=es_frontend,
         )
         url = es_task_handler.get_external_log_url(self.ti, self.ti.try_number)
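
One detail worth noting from the handler change above: `attrgetter`, unlike plain
`getattr`, resolves dotted paths, which is what lets `offset_field = log.offset`
reach the nested attribute on a hit. A minimal sketch with stand-in objects (not
real elasticsearch-dsl hits):

    from operator import attrgetter
    from types import SimpleNamespace

    hit = SimpleNamespace(log=SimpleNamespace(offset=42))
    assert attrgetter("log.offset")(hit) == 42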

[airflow] 30/38: Switch to built-in data structures in SecretsMasker (#16424)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 4c37aeab97fb1f40a20b912402d2747cd5fc1d5a
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Jun 16 11:29:45 2021 +0200

    Switch to built-in data structures in SecretsMasker (#16424)
    
    Checking for Iterable in SecretsMasker could cause undesirable
    side effects when the object passed as a log parameter is
    iterable but iterating it is not idempotent.

    For example, botocore passes a StreamingBody object to the log,
    and this object is Iterable but can be consumed only once.
    Masking caused the object to be iterated during logging,
    leaving an empty body when the actual results were retrieved
    later.

    This change iterates only objects of type list and recursively
    redacts only dicts/strs/tuples/sets/lists, none of which have
    side effects when accessed.
    
    Fixes: #16148
    (cherry picked from commit d1d02b62e3436dedfe9a2b80cd1e61954639ca4d)
---
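A minimal illustration of the hazard being fixed (plain Python, not botocore): an
iterator is an Iterable but can be consumed only once, so redacting it by iteration
exhausts it before the real consumer reads it.

    body = iter([b"chunk-1", b"chunk-2"])
    consumed = list(body)      # what an Iterable-based masker effectively did
    assert consumed == [b"chunk-1", b"chunk-2"]
    assert list(body) == []    # the downstream reader now sees an empty body
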
 airflow/utils/log/secrets_masker.py    |  8 +++-----
 tests/utils/log/test_secrets_masker.py | 16 ----------------
 2 files changed, 3 insertions(+), 21 deletions(-)

diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
index 3177c58..2fd0d0a 100644
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -16,7 +16,6 @@
 # under the License.
 """Mask sensitive information from logs"""
 import collections
-import io
 import logging
 import re
 from typing import TYPE_CHECKING, Iterable, Optional, Set, TypeVar, Union
@@ -178,7 +177,7 @@ class SecretsMasker(logging.Filter):
         elif isinstance(item, (tuple, set)):
             # Turn set in to tuple!
             return tuple(self._redact_all(subval) for subval in item)
-        elif isinstance(item, Iterable):
+        elif isinstance(item, list):
             return list(self._redact_all(subval) for subval in item)
         else:
             return item
@@ -209,12 +208,11 @@ class SecretsMasker(logging.Filter):
             elif isinstance(item, (tuple, set)):
                 # Turn set in to tuple!
                 return tuple(self.redact(subval) for subval in item)
-            elif isinstance(item, io.IOBase):
-                return item
-            elif isinstance(item, Iterable):
+            elif isinstance(item, list):
                 return list(self.redact(subval) for subval in item)
             else:
                 return item
+        # I think this should never happen, but it does not hurt to leave it just in case
         except Exception as e:  # pylint: disable=broad-except
             log.warning(
                 "Unable to redact %r, please report this via <https://github.com/apache/airflow/issues>. "
diff --git a/tests/utils/log/test_secrets_masker.py b/tests/utils/log/test_secrets_masker.py
index 24e86c1..5d3b404 100644
--- a/tests/utils/log/test_secrets_masker.py
+++ b/tests/utils/log/test_secrets_masker.py
@@ -72,22 +72,6 @@ class TestSecretsMasker:
 
         assert caplog.text == "INFO Cannot connect to user:***\n"
 
-    def test_non_redactable(self, logger, caplog):
-        class NonReactable:
-            def __iter__(self):
-                raise RuntimeError("force fail")
-
-            def __repr__(self):
-                return "<NonRedactable>"
-
-        logger.info("Logging %s", NonReactable())
-
-        assert caplog.messages == [
-            "Unable to redact <NonRedactable>, please report this via "
-            + "<https://github.com/apache/airflow/issues>. Error was: RuntimeError: force fail",
-            "Logging <NonRedactable>",
-        ]
-
     def test_extra(self, logger, caplog):
         logger.handlers[0].formatter = ShortExcFormatter("%(levelname)s %(message)s %(conn)s")
         logger.info("Cannot connect", extra={'conn': "user:password"})

[airflow] 16/38: Fix templated default/example values in config ref docs (#16442)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit acc824f60786069e596fd0e97fa99fcef9f00fa6
Author: Jed Cunningham <66...@users.noreply.github.com>
AuthorDate: Tue Jun 15 10:34:44 2021 -0600

    Fix templated default/example values in config ref docs (#16442)
    
    We should show the actual default/example value in the configuration
    reference docs, not the templated values.
    
    e.g. `{dag_id}` as you get in a generated airflow.cfg, not
    `{{dag_id}}` as stored in the airflow.cfg template.
    
    (cherry picked from commit cc3c13c1f52a0051aea79d90bf8258e7d156d6b7)
---
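A short sketch of the de-templating the docs build now performs (the value below is
hypothetical, chosen only to show the brace collapsing):

    value = "{{dag_id}}/{{task_id}}/{{execution_date}}/{{try_number}}.log"
    rendered = value.replace("{{", "{").replace("}}", "}")
    assert rendered == "{dag_id}/{task_id}/{execution_date}/{try_number}.log"
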
 airflow/configuration.py |  2 +-
 docs/conf.py             | 14 +++++++++++++-
 2 files changed, 14 insertions(+), 2 deletions(-)

diff --git a/airflow/configuration.py b/airflow/configuration.py
index 53e76a6..c3595d7 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -91,7 +91,7 @@ def _default_config_file_path(file_name: str):
     return os.path.join(templates_dir, file_name)
 
 
-def default_config_yaml() -> dict:
+def default_config_yaml() -> List[dict]:
     """
     Read Airflow configs from YAML file
 
diff --git a/docs/conf.py b/docs/conf.py
index 39426d6..e403ffe 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -339,8 +339,20 @@ if PACKAGE_NAME == 'apache-airflow':
     ) in AirflowConfigParser.deprecated_options.items():
         deprecated_options[deprecated_section][deprecated_key] = section, key, since_version
 
+    configs = default_config_yaml()
+
+    # We want the default/example we show in the docs to reflect the value _after_
+    # the config has been templated, not before
+    # e.g. {{dag_id}} in default_config.cfg -> {dag_id} in airflow.cfg, and what we want in docs
+    keys_to_format = ["default", "example"]
+    for conf_section in configs:
+        for option in conf_section["options"]:
+            for key in keys_to_format:
+                if option[key] and "{{" in option[key]:
+                    option[key] = option[key].replace("{{", "{").replace("}}", "}")
+
     jinja_contexts = {
-        'config_ctx': {"configs": default_config_yaml(), "deprecated_options": deprecated_options},
+        'config_ctx': {"configs": configs, "deprecated_options": deprecated_options},
         'quick_start_ctx': {
             'doc_root_url': f'https://airflow.apache.org/docs/apache-airflow/{PACKAGE_VERSION}/'
             if FOR_PRODUCTION

[airflow] 33/38: Fix unsuccessful KubernetesPod final_state call when `is_delete_operator_pod=True` (#15490)

Posted by as...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ash pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e32f22a1f791018fd2573eb66958a11a94414b65
Author: MatthewRBruce <ma...@shopify.com>
AuthorDate: Wed Jun 16 19:16:27 2021 -0400

    Fix unsuccessful KubernetesPod final_state call when `is_delete_operator_pod=True` (#15490)
    
    If a Kubernetes Pod ends in a state other than `SUCCESS` and
    `is_delete_operator_pod` is True, then use the `final_state`
    from the previous `create_new_pod_for_operator` call, since the
    pod is already deleted and the current state can't be re-read.
    
    closes: https://github.com/apache/airflow/issues/15456
    (cherry picked from commit 4c9735ff9b0201758564fcd64166abde318ec8a7)
---
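Hedged sketch of the pattern the fix applies (simplified names, a plain string in
place of Airflow's State enum, and `launcher` standing in for any object with the
patched `monitor_pod`/`delete_pod` interface): capture the last-read pod status from
`monitor_pod` before any deletion, rather than re-reading it afterwards when the pod
may already be gone.

    from typing import Any, Optional, Tuple

    def run_and_cleanup(launcher: Any, pod: Any,
                        delete_pod: bool) -> Tuple[str, Any, Optional[str]]:
        final_state, remote_pod, result = launcher.monitor_pod(pod, get_logs=True)
        if delete_pod:
            launcher.delete_pod(pod)  # the live status can't be re-read after this
        if final_state != "success":
            # report the status captured before deletion
            raise RuntimeError(f"Pod returned a failure: {remote_pod}")
        return final_state, remote_pod, result
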
 .../cncf/kubernetes/operators/kubernetes_pod.py    | 25 +++++++++++-----------
 .../cncf/kubernetes/utils/pod_launcher.py          |  7 +++---
 kubernetes_tests/test_kubernetes_pod_operator.py   | 10 ++++-----
 .../test_kubernetes_pod_operator_backcompat.py     |  6 +++---
 .../kubernetes/operators/test_kubernetes_pod.py    | 11 +++++-----
 5 files changed, 29 insertions(+), 30 deletions(-)

diff --git a/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py b/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py
index 213614c..d6b2eb2 100644
--- a/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py
+++ b/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py
@@ -356,15 +356,14 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
 
             if len(pod_list.items) == 1:
                 try_numbers_match = self._try_numbers_match(context, pod_list.items[0])
-                final_state, result = self.handle_pod_overlap(
+                final_state, remote_pod, result = self.handle_pod_overlap(
                     labels, try_numbers_match, launcher, pod_list.items[0]
                 )
             else:
                 self.log.info("creating pod with labels %s and launcher %s", labels, launcher)
-                final_state, _, result = self.create_new_pod_for_operator(labels, launcher)
+                final_state, remote_pod, result = self.create_new_pod_for_operator(labels, launcher)
             if final_state != State.SUCCESS:
-                status = self.client.read_namespaced_pod(self.pod.metadata.name, self.namespace)
-                raise AirflowException(f'Pod {self.pod.metadata.name} returned a failure: {status}')
+                raise AirflowException(f'Pod {self.pod.metadata.name} returned a failure: {remote_pod}')
             context['task_instance'].xcom_push(key='pod_name', value=self.pod.metadata.name)
             context['task_instance'].xcom_push(key='pod_namespace', value=self.namespace)
             return result
@@ -373,7 +372,7 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
 
     def handle_pod_overlap(
         self, labels: dict, try_numbers_match: bool, launcher: Any, pod: k8s.V1Pod
-    ) -> Tuple[State, Optional[str]]:
+    ) -> Tuple[State, k8s.V1Pod, Optional[str]]:
         """
 
         In cases where the Scheduler restarts while a KubernetesPodOperator task is running,
@@ -398,12 +397,12 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
             log_line += " Will attach to this pod and monitor instead of starting new one"
             self.log.info(log_line)
             self.pod = pod
-            final_state, result = self.monitor_launched_pod(launcher, pod)
+            final_state, remote_pod, result = self.monitor_launched_pod(launcher, pod)
         else:
             log_line += f"creating pod with labels {labels} and launcher {launcher}"
             self.log.info(log_line)
-            final_state, _, result = self.create_new_pod_for_operator(labels, launcher)
-        return final_state, result
+            final_state, remote_pod, result = self.create_new_pod_for_operator(labels, launcher)
+        return final_state, remote_pod, result
 
     @staticmethod
     def _get_pod_identifying_label_string(labels) -> str:
@@ -516,7 +515,7 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
         self.log.debug("Starting pod:\n%s", yaml.safe_dump(self.pod.to_dict()))
         try:
             launcher.start_pod(self.pod, startup_timeout=self.startup_timeout_seconds)
-            final_state, result = launcher.monitor_pod(pod=self.pod, get_logs=self.get_logs)
+            final_state, remote_pod, result = launcher.monitor_pod(pod=self.pod, get_logs=self.get_logs)
         except AirflowException:
             if self.log_events_on_failure:
                 for event in launcher.read_pod_events(self.pod).items:
@@ -526,7 +525,7 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
             if self.is_delete_operator_pod:
                 self.log.debug("Deleting pod for task %s", self.task_id)
                 launcher.delete_pod(self.pod)
-        return final_state, self.pod, result
+        return final_state, remote_pod, result
 
     def patch_already_checked(self, pod: k8s.V1Pod):
         """Add an "already tried annotation to ensure we only retry once"""
@@ -543,7 +542,7 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
         :return:
         """
         try:
-            (final_state, result) = launcher.monitor_pod(pod, get_logs=self.get_logs)
+            (final_state, remote_pod, result) = launcher.monitor_pod(pod, get_logs=self.get_logs)
         finally:
             if self.is_delete_operator_pod:
                 launcher.delete_pod(pod)
@@ -551,9 +550,9 @@ class KubernetesPodOperator(BaseOperator):  # pylint: disable=too-many-instance-
             if self.log_events_on_failure:
                 for event in launcher.read_pod_events(pod).items:
                     self.log.error("Pod Event: %s - %s", event.reason, event.message)
-            self.patch_already_checked(self.pod)
+            self.patch_already_checked(pod)
             raise AirflowException(f'Pod returned a failure: {final_state}')
-        return final_state, result
+        return final_state, remote_pod, result
 
     def on_kill(self) -> None:
         if self.pod:
diff --git a/airflow/providers/cncf/kubernetes/utils/pod_launcher.py b/airflow/providers/cncf/kubernetes/utils/pod_launcher.py
index 741b475..791c77e 100644
--- a/airflow/providers/cncf/kubernetes/utils/pod_launcher.py
+++ b/airflow/providers/cncf/kubernetes/utils/pod_launcher.py
@@ -129,9 +129,9 @@ class PodLauncher(LoggingMixin):
                     raise AirflowException("Pod took too long to start")
                 time.sleep(1)
 
-    def monitor_pod(self, pod: V1Pod, get_logs: bool) -> Tuple[State, Optional[str]]:
+    def monitor_pod(self, pod: V1Pod, get_logs: bool) -> Tuple[State, V1Pod, Optional[str]]:
         """
-        Monitors a pod and returns the final state
+        Monitors a pod and returns the final state, pod and xcom result
 
         :param pod: pod spec that will be monitored
         :param get_logs: whether to read the logs locally
@@ -167,7 +167,8 @@ class PodLauncher(LoggingMixin):
         while self.pod_is_running(pod):
             self.log.info('Pod %s has state %s', pod.metadata.name, State.RUNNING)
             time.sleep(2)
-        return self._task_status(self.read_pod(pod)), result
+        remote_pod = self.read_pod(pod)
+        return self._task_status(remote_pod), remote_pod, result
 
     def parse_log_line(self, line: str) -> Tuple[str, str]:
         """
diff --git a/kubernetes_tests/test_kubernetes_pod_operator.py b/kubernetes_tests/test_kubernetes_pod_operator.py
index 48e77aa..5208e3d 100644
--- a/kubernetes_tests/test_kubernetes_pod_operator.py
+++ b/kubernetes_tests/test_kubernetes_pod_operator.py
@@ -596,7 +596,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             do_xcom_push=False,
         )
         # THEN
-        monitor_mock.return_value = (State.SUCCESS, None)
+        monitor_mock.return_value = (State.SUCCESS, None, None)
         context = create_context(k)
         k.execute(context)
         assert start_mock.call_args[0][0].spec.containers[0].env_from == [
@@ -828,7 +828,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             task_id="task" + self.get_current_task_name(), pod_template_file=path, do_xcom_push=True
         )
 
-        monitor_mock.return_value = (State.SUCCESS, None)
+        monitor_mock.return_value = (State.SUCCESS, None, None)
         context = create_context(k)
         with self.assertLogs(k.log, level=logging.DEBUG) as cm:
             k.execute(context)
@@ -924,7 +924,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             priority_class_name=priority_class_name,
         )
 
-        monitor_mock.return_value = (State.SUCCESS, None)
+        monitor_mock.return_value = (State.SUCCESS, None, None)
         context = create_context(k)
         k.execute(context)
         actual_pod = self.api_client.sanitize_for_serialization(k.pod)
@@ -966,7 +966,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             termination_grace_period=0,
         )
         context = create_context(k)
-        monitor_mock.return_value = (State.SUCCESS, None)
+        monitor_mock.return_value = (State.SUCCESS, None, None)
         k.execute(context)
         name = k.pod.metadata.name
         pod = client.read_namespaced_pod(name=name, namespace=namespace)
@@ -1000,7 +1000,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
         with mock.patch(
             "airflow.providers.cncf.kubernetes.utils.pod_launcher.PodLauncher.monitor_pod"
         ) as monitor_mock:
-            monitor_mock.return_value = (State.SUCCESS, None)
+            monitor_mock.return_value = (State.SUCCESS, None, None)
             k.execute(context)
             name = k.pod.metadata.name
             pod = client.read_namespaced_pod(name=name, namespace=namespace)
diff --git a/kubernetes_tests/test_kubernetes_pod_operator_backcompat.py b/kubernetes_tests/test_kubernetes_pod_operator_backcompat.py
index 5facd47..e2fc6bc 100644
--- a/kubernetes_tests/test_kubernetes_pod_operator_backcompat.py
+++ b/kubernetes_tests/test_kubernetes_pod_operator_backcompat.py
@@ -136,7 +136,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             image_pull_secrets=fake_pull_secrets,
             cluster_context='default',
         )
-        monitor_mock.return_value = (State.SUCCESS, None)
+        monitor_mock.return_value = (State.SUCCESS, None, None)
         context = create_context(k)
         k.execute(context=context)
         assert start_mock.call_args[0][0].spec.image_pull_secrets == [
@@ -468,7 +468,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             configmaps=[configmap],
         )
         # THEN
-        mock_monitor.return_value = (State.SUCCESS, None)
+        mock_monitor.return_value = (State.SUCCESS, None, None)
         context = create_context(k)
         k.execute(context)
         assert mock_start.call_args[0][0].spec.containers[0].env_from == [
@@ -496,7 +496,7 @@ class TestKubernetesPodOperatorSystem(unittest.TestCase):
             do_xcom_push=False,
         )
         # THEN
-        monitor_mock.return_value = (State.SUCCESS, None)
+        monitor_mock.return_value = (State.SUCCESS, None, None)
         context = create_context(k)
         k.execute(context)
         assert start_mock.call_args[0][0].spec.containers[0].env_from == [
diff --git a/tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py b/tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py
index a087fc9..cbd0080 100644
--- a/tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py
+++ b/tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py
@@ -60,7 +60,7 @@ class TestKubernetesPodOperator(unittest.TestCase):
         }
 
     def run_pod(self, operator) -> k8s.V1Pod:
-        self.monitor_mock.return_value = (State.SUCCESS, None)
+        self.monitor_mock.return_value = (State.SUCCESS, None, None)
         context = self.create_context(operator)
         operator.execute(context=context)
         return self.start_mock.call_args[0][0]
@@ -83,7 +83,7 @@ class TestKubernetesPodOperator(unittest.TestCase):
             config_file=file_path,
             cluster_context="default",
         )
-        self.monitor_mock.return_value = (State.SUCCESS, None)
+        self.monitor_mock.return_value = (State.SUCCESS, None, None)
         self.client_mock.list_namespaced_pod.return_value = []
         context = self.create_context(k)
         k.execute(context=context)
@@ -411,8 +411,8 @@ class TestKubernetesPodOperator(unittest.TestCase):
             do_xcom_push=False,
             cluster_context="default",
         )
-        self.monitor_mock.return_value = (State.FAILED, None)
         failed_pod_status = "read_pod_namespaced_result"
+        self.monitor_mock.return_value = (State.FAILED, failed_pod_status, None)
         read_namespaced_pod_mock = self.client_mock.return_value.read_namespaced_pod
         read_namespaced_pod_mock.return_value = failed_pod_status
 
@@ -424,8 +424,7 @@ class TestKubernetesPodOperator(unittest.TestCase):
             str(ctx.value)
             == f"Pod Launching failed: Pod {k.pod.metadata.name} returned a failure: {failed_pod_status}"
         )
-        assert self.client_mock.return_value.read_namespaced_pod.called
-        assert read_namespaced_pod_mock.call_args[0][0] == k.pod.metadata.name
+        assert not self.client_mock.return_value.read_namespaced_pod.called
 
     def test_no_need_to_describe_pod_on_success(self):
         name_base = "test"
@@ -442,7 +441,7 @@ class TestKubernetesPodOperator(unittest.TestCase):
             do_xcom_push=False,
             cluster_context="default",
         )
-        self.monitor_mock.return_value = (State.SUCCESS, None)
+        self.monitor_mock.return_value = (State.SUCCESS, None, None)
 
         context = self.create_context(k)
         k.execute(context=context)