Posted to commits@airflow.apache.org by ka...@apache.org on 2021/04/09 21:12:29 UTC

[airflow] branch v2-0-test updated (9105770 -> 9eb0e13)

This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


 discard 9105770  Bugfix: Task docs are not shown in the Task Instance Detail View (#15191)
 discard 7d192de  Fix mistake and typos in doc/docstrings (#15180)
 discard ce0ca24  Update import path and fix typo in `dag-run.rst` (#15201)
 discard 091fae9  Bugfix: Fix overriding `pod_template_file` in KubernetesExecutor (#15197)
 discard 50a1666  Fix celery executor bug trying to call len on map (#14883)
     new 20ed260  Bugfix: Fix overriding `pod_template_file` in KubernetesExecutor (#15197)
     new 97bad2c  Update import path and fix typo in `dag-run.rst` (#15201)
     new ddf306b  Fix mistake and typos in doc/docstrings (#15180)
     new 9eb0e13  Bugfix: Task docs are not shown in the Task Instance Detail View (#15191)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (9105770)
            \
             N -- N -- N   refs/heads/v2-0-test (9eb0e13)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/executors/celery_executor.py | 3 ---
 1 file changed, 3 deletions(-)

[airflow] 02/04: Update import path and fix typo in `dag-run.rst` (#15201)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 97bad2ccb8893fc25f99dac6e3b4fbc666361344
Author: eladkal <45...@users.noreply.github.com>
AuthorDate: Mon Apr 5 14:46:58 2021 +0300

    Update import path and fix typo in `dag-run.rst` (#15201)
    
    1. fix typo parametrized ->  parameterized
    2. update `from airflow.operators.bash_operator import BashOperator` -> `from airflow.operators.bash import BashOperator`
    
    (cherry picked from commit 4099108f554130cf3f87ba33b9d6084a74e70231)
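
    For quick reference, the corrected documentation example boils down to roughly the
    following sketch (Airflow 2.x import path; the DAG id mirrors the docs, everything
    else here is illustrative):

        from airflow import DAG
        from airflow.operators.bash import BashOperator  # new path; airflow.operators.bash_operator is deprecated
        from airflow.utils.dates import days_ago

        with DAG("example_parameterized_dag", schedule_interval=None, start_date=days_ago(2)) as dag:
            parameterized_task = BashOperator(
                task_id="parameterized_task",
                # Reads the value passed at trigger time, e.g.
                #   airflow dags trigger --conf '{"conf1": "value1"}' example_parameterized_dag
                bash_command="echo value: {{ dag_run.conf['conf1'] }}",
            )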
---
 docs/apache-airflow/dag-run.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/apache-airflow/dag-run.rst b/docs/apache-airflow/dag-run.rst
index 72204f1..dbcf68a 100644
--- a/docs/apache-airflow/dag-run.rst
+++ b/docs/apache-airflow/dag-run.rst
@@ -208,10 +208,10 @@ Example of a parameterized DAG:
 .. code-block:: python
 
     from airflow import DAG
-    from airflow.operators.bash_operator import BashOperator
+    from airflow.operators.bash import BashOperator
     from airflow.utils.dates import days_ago
 
-    dag = DAG("example_parametrized_dag", schedule_interval=None, start_date=days_ago(2))
+    dag = DAG("example_parameterized_dag", schedule_interval=None, start_date=days_ago(2))
 
     parameterized_task = BashOperator(
         task_id='parameterized_task',
@@ -227,7 +227,7 @@ Using CLI
 
 .. code-block:: bash
 
-    airflow dags trigger --conf '{"conf1": "value1"}' example_parametrized_dag
+    airflow dags trigger --conf '{"conf1": "value1"}' example_parameterized_dag
 
 Using UI
 ^^^^^^^^^^

[airflow] 04/04: Bugfix: Task docs are not shown in the Task Instance Detail View (#15191)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 9eb0e13aec346ff24c08e8ea546ec940d53dde23
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Mon Apr 5 03:46:41 2021 +0100

    Bugfix: Task docs are not shown in the Task Instance Detail View (#15191)
    
    closes https://github.com/apache/airflow/issues/15178
    closes https://github.com/apache/airflow/issues/13761
    
    This feature was added in 2015 in https://github.com/apache/airflow/pull/74 and users were expected to set `doc_md` (or `doc_rst` and other `doc_*` attributes) via `task.doc_md` rather than passing it as an arg. However, this did not work with DAG Serialization, as only a selected set of args was allowed to be stored in the serialized version of the DAG.
    
    (cherry picked from commit e86f5ca8fa5ff22c1e1f48addc012919034c672f)
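
    For illustration, a minimal sketch of how task-level docs are attached once this
    fix lands (the DAG and task ids are made up, not part of the patch):

        from datetime import datetime

        from airflow import DAG
        from airflow.operators.bash import BashOperator

        with DAG("docs_demo", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
            # `doc_md` (and `doc`, `doc_rst`, `doc_json`, `doc_yaml`) are now explicit
            # BaseOperator arguments and are included in the serialized DAG, so the
            # webserver can render them on the Task Instance Details page.
            hello = BashOperator(
                task_id="hello",
                bash_command="echo hello",
                doc_md="### Hello task\nThese notes show up in Task Instance Details.",
            )

            # The original attribute-style assignment from 2015 keeps working too.
            hello.doc = "Plain-text notes via the `doc` attribute."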
---
 airflow/example_dags/tutorial.py              |  1 +
 airflow/models/baseoperator.py                | 26 ++++++++++++++++++++++++++
 airflow/serialization/schema.json             |  7 ++++++-
 airflow/www/utils.py                          |  2 +-
 airflow/www/views.py                          |  2 +-
 docs/apache-airflow/concepts.rst              |  6 +++---
 tests/serialization/test_dag_serialization.py |  9 +++++++++
 tests/www/test_utils.py                       |  4 ++--
 8 files changed, 49 insertions(+), 8 deletions(-)

diff --git a/airflow/example_dags/tutorial.py b/airflow/example_dags/tutorial.py
index 518c801..09d6ca3 100644
--- a/airflow/example_dags/tutorial.py
+++ b/airflow/example_dags/tutorial.py
@@ -97,6 +97,7 @@ with DAG(
     You can document your task using the attributes `doc_md` (markdown),
     `doc` (plain text), `doc_rst`, `doc_json`, `doc_yaml` which gets
     rendered in the UI's Task Instance Details page.
+
     ![img](http://montcs.bloomu.edu/~bobmon/Semesters/2012-01/491/import%20soul.png)
     """
     )
diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
index 8bda785..eacea64 100644
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -278,6 +278,21 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, metaclass=BaseOperatorMeta
     :param do_xcom_push: if True, an XCom is pushed containing the Operator's
         result
     :type do_xcom_push: bool
+    :param doc: Add documentation or notes to your Task objects that is visible in
+        Task Instance details View in the Webserver
+    :type doc: str
+    :param doc_md: Add documentation (in Markdown format) or notes to your Task objects
+        that is visible in Task Instance details View in the Webserver
+    :type doc_md: str
+    :param doc_rst: Add documentation (in RST format) or notes to your Task objects
+        that is visible in Task Instance details View in the Webserver
+    :type doc_rst: str
+    :param doc_json: Add documentation (in JSON format) or notes to your Task objects
+        that is visible in Task Instance details View in the Webserver
+    :type doc_json: str
+    :param doc_yaml: Add documentation (in YAML format) or notes to your Task objects
+        that is visible in Task Instance details View in the Webserver
+    :type doc_yaml: str
     """
 
     # For derived classes to define which fields will get jinjaified
@@ -381,6 +396,11 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, metaclass=BaseOperatorMeta
         inlets: Optional[Any] = None,
         outlets: Optional[Any] = None,
         task_group: Optional["TaskGroup"] = None,
+        doc: Optional[str] = None,
+        doc_md: Optional[str] = None,
+        doc_json: Optional[str] = None,
+        doc_yaml: Optional[str] = None,
+        doc_rst: Optional[str] = None,
         **kwargs,
     ):
         from airflow.models.dag import DagContext
@@ -486,6 +506,12 @@ class BaseOperator(Operator, LoggingMixin, TaskMixin, metaclass=BaseOperatorMeta
         self.executor_config = executor_config or {}
         self.do_xcom_push = do_xcom_push
 
+        self.doc_md = doc_md
+        self.doc_json = doc_json
+        self.doc_yaml = doc_yaml
+        self.doc_rst = doc_rst
+        self.doc = doc
+
         # Private attributes
         self._upstream_task_ids: Set[str] = set()
         self._downstream_task_ids: Set[str] = set()
diff --git a/airflow/serialization/schema.json b/airflow/serialization/schema.json
index 0fbe20f..3bc11ee 100644
--- a/airflow/serialization/schema.json
+++ b/airflow/serialization/schema.json
@@ -168,7 +168,12 @@
           "type": "array",
           "items": { "type": "string" },
           "uniqueItems": true
-        }
+        },
+        "doc":  { "type": "string" },
+        "doc_md":  { "type": "string" },
+        "doc_json":  { "type": "string" },
+        "doc_yaml":  { "type": "string" },
+        "doc_rst":  { "type": "string" }
       },
       "additionalProperties": true
     },
diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index afd94c6..ad53436 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -321,7 +321,7 @@ def render(obj, lexer):
     return out
 
 
-def wrapped_markdown(s, css_class=None):
+def wrapped_markdown(s, css_class='rich_doc'):
     """Convert a Markdown string to HTML."""
     if s is None:
         return None
diff --git a/airflow/www/views.py b/airflow/www/views.py
index f0116b3..5f4c8c5 100644
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -1220,7 +1220,7 @@ class Airflow(AirflowBaseView):  # noqa: D101  pylint: disable=too-many-public-m
         # Color coding the special attributes that are code
         special_attrs_rendered = {}
         for attr_name in wwwutils.get_attr_renderer():
-            if hasattr(task, attr_name):
+            if getattr(task, attr_name, None) is not None:
                 source = getattr(task, attr_name)
                 special_attrs_rendered[attr_name] = wwwutils.get_attr_renderer()[attr_name](source)
 
diff --git a/docs/apache-airflow/concepts.rst b/docs/apache-airflow/concepts.rst
index 2637b78..3de060b 100644
--- a/docs/apache-airflow/concepts.rst
+++ b/docs/apache-airflow/concepts.rst
@@ -1394,8 +1394,8 @@ Documentation & Notes
 =====================
 
 It's possible to add documentation or notes to your DAGs & task objects that
-become visible in the web interface ("Graph View" & "Tree View" for DAGs, "Task Details" for
-tasks). There are a set of special task attributes that get rendered as rich
+become visible in the web interface ("Graph View" & "Tree View" for DAGs, "Task Instance Details"
+for tasks). There are a set of special task attributes that get rendered as rich
 content if defined:
 
 ==========  ================
@@ -1430,7 +1430,7 @@ to the related tasks in Airflow.
     """
 
 This content will get rendered as markdown respectively in the "Graph View" and
-"Task Details" pages.
+"Task Instance Details" pages.
 
 .. _jinja-templating:
 
diff --git a/tests/serialization/test_dag_serialization.py b/tests/serialization/test_dag_serialization.py
index 55d2c5a..e447751 100644
--- a/tests/serialization/test_dag_serialization.py
+++ b/tests/serialization/test_dag_serialization.py
@@ -79,6 +79,7 @@ serialized_simple_dag_ground_truth = {
         },
         "is_paused_upon_creation": False,
         "_dag_id": "simple_dag",
+        "doc_md": "### DAG Tutorial Documentation",
         "fileloc": None,
         "tasks": [
             {
@@ -110,6 +111,7 @@ serialized_simple_dag_ground_truth = {
                         }
                     },
                 },
+                "doc_md": "### Task Tutorial Documentation",
             },
             {
                 "task_id": "custom_task",
@@ -170,6 +172,7 @@ def make_simple_dag():
         start_date=datetime(2019, 8, 1),
         is_paused_upon_creation=False,
         access_control={"test_role": {permissions.ACTION_CAN_READ, permissions.ACTION_CAN_EDIT}},
+        doc_md="### DAG Tutorial Documentation",
     ) as dag:
         CustomOperator(task_id='custom_task')
         BashOperator(
@@ -177,6 +180,7 @@ def make_simple_dag():
             bash_command='echo {{ task.task_id }}',
             owner='airflow',
             executor_config={"pod_override": executor_config_pod},
+            doc_md="### Task Tutorial Documentation",
         )
         return {'simple_dag': dag}
 
@@ -853,6 +857,11 @@ class TestStringifiedDAGs(unittest.TestCase):
             '_upstream_task_ids': set(),
             'depends_on_past': False,
             'do_xcom_push': True,
+            'doc': None,
+            'doc_json': None,
+            'doc_md': None,
+            'doc_rst': None,
+            'doc_yaml': None,
             'email': None,
             'email_on_failure': True,
             'email_on_retry': True,
diff --git a/tests/www/test_utils.py b/tests/www/test_utils.py
index 5ced73a..f4e50d9 100644
--- a/tests/www/test_utils.py
+++ b/tests/www/test_utils.py
@@ -240,7 +240,7 @@ class TestWrappedMarkdown(unittest.TestCase):
         )
 
         assert (
-            '<div class="None" ><table>\n<thead>\n<tr>\n<th>Job</th>\n'
+            '<div class="rich_doc" ><table>\n<thead>\n<tr>\n<th>Job</th>\n'
             '<th>Duration</th>\n</tr>\n</thead>\n<tbody>\n<tr>\n<td>ETL'
             '</td>\n<td>14m</td>\n</tr>\n</tbody>\n'
             '</table></div>'
@@ -255,4 +255,4 @@ class TestWrappedMarkdown(unittest.TestCase):
             """
         )
 
-        assert '<div class="None" ><h1>header</h1>\n<p>1st line\n2nd line</p></div>' == rendered
+        assert '<div class="rich_doc" ><h1>header</h1>\n<p>1st line\n2nd line</p></div>' == rendered

[airflow] 01/04: Bugfix: Fix overriding `pod_template_file` in KubernetesExecutor (#15197)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 20ed260c3aee7674488bc3231dea81378d6ddb21
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Mon Apr 5 16:56:00 2021 +0100

    Bugfix: Fix overriding `pod_template_file` in KubernetesExecutor (#15197)
    
    This feature was added in https://github.com/apache/airflow/pull/11784 but
    it was broken because it read `pod_template_override` from `executor_config`
    instead of `pod_template_file`.
    
    closes #14199
    
    (cherry picked from commit 5606137ba32c0daa87d557301d82f7f2bdc0b0a4)
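
    To make the corrected behaviour concrete, a rough sketch modelled on the example
    DAG touched below (the DAG id, callable, and template path are placeholders):

        import os

        from kubernetes.client import models as k8s

        from airflow import DAG
        from airflow.operators.python import PythonOperator
        from airflow.settings import AIRFLOW_HOME
        from airflow.utils.dates import days_ago

        with DAG("k8s_template_demo", schedule_interval=None, start_date=days_ago(1)) as dag:
            task_with_template = PythonOperator(
                task_id="task_with_template",
                python_callable=lambda: print("runs in a pod built from the custom template"),
                executor_config={
                    # The executor now reads this key; before the fix it looked up
                    # "pod_template_override", so a per-task template file never took effect.
                    "pod_template_file": os.path.join(AIRFLOW_HOME, "pod_templates/basic_template.yaml"),
                    # Optional per-task overrides applied on top of that template.
                    "pod_override": k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"})),
                },
            )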
---
 .../example_kubernetes_executor_config.py          |  3 +-
 airflow/executors/kubernetes_executor.py           |  2 +-
 .../basic_template.yaml                            |  4 +-
 docs/apache-airflow/executor/kubernetes.rst        |  2 +-
 .../basic_template.yaml                            | 34 ++++++++
 tests/executors/test_kubernetes_executor.py        | 91 +++++++++++++++++++++-
 6 files changed, 130 insertions(+), 6 deletions(-)

diff --git a/airflow/example_dags/example_kubernetes_executor_config.py b/airflow/example_dags/example_kubernetes_executor_config.py
index cbd69cb..5290dd8 100644
--- a/airflow/example_dags/example_kubernetes_executor_config.py
+++ b/airflow/example_dags/example_kubernetes_executor_config.py
@@ -24,6 +24,7 @@ import os
 from airflow import DAG
 from airflow.example_dags.libs.helper import print_stuff
 from airflow.operators.python import PythonOperator
+from airflow.settings import AIRFLOW_HOME
 from airflow.utils.dates import days_ago
 
 default_args = {
@@ -110,7 +111,7 @@ try:
             task_id="task_with_template",
             python_callable=print_stuff,
             executor_config={
-                "pod_template_file": "/usr/local/airflow/pod_templates/basic_template.yaml",
+                "pod_template_file": os.path.join(AIRFLOW_HOME, "pod_templates/basic_template.yaml"),
                 "pod_override": k8s.V1Pod(metadata=k8s.V1ObjectMeta(labels={"release": "stable"})),
             },
         )
diff --git a/airflow/executors/kubernetes_executor.py b/airflow/executors/kubernetes_executor.py
index 7e3d82b..ec7cbf7 100644
--- a/airflow/executors/kubernetes_executor.py
+++ b/airflow/executors/kubernetes_executor.py
@@ -496,7 +496,7 @@ class KubernetesExecutor(BaseExecutor, LoggingMixin):
             return
 
         if executor_config:
-            pod_template_file = executor_config.get("pod_template_override", None)
+            pod_template_file = executor_config.get("pod_template_file", None)
         else:
             pod_template_file = None
         if not self.task_queue:
diff --git a/airflow/kubernetes_executor_templates/basic_template.yaml b/airflow/kubernetes_executor_templates/basic_template.yaml
index a953867..a6eb83f 100644
--- a/airflow/kubernetes_executor_templates/basic_template.yaml
+++ b/airflow/kubernetes_executor_templates/basic_template.yaml
@@ -69,8 +69,8 @@ spec:
         defaultMode: 420
   restartPolicy: Never
   terminationGracePeriodSeconds: 30
-  serviceAccountName: airflow-worker-serviceaccount
-  serviceAccount: airflow-worker-serviceaccount
+  serviceAccountName: airflow-worker
+  serviceAccount: airflow-worker
   securityContext:
     runAsUser: 50000
     fsGroup: 50000
diff --git a/docs/apache-airflow/executor/kubernetes.rst b/docs/apache-airflow/executor/kubernetes.rst
index 217a29c..61d13f4 100644
--- a/docs/apache-airflow/executor/kubernetes.rst
+++ b/docs/apache-airflow/executor/kubernetes.rst
@@ -125,7 +125,7 @@ name ``base`` and a second container containing your desired sidecar.
     :end-before: [END task_with_sidecar]
 
 You can also create custom ``pod_template_file`` on a per-task basis so that you can recycle the same base values between multiple tasks.
-This will replace the default ``pod_template_file`` named in the airflow.cfg and then override that template using the ``pod_override_spec``.
+This will replace the default ``pod_template_file`` named in the airflow.cfg and then override that template using the ``pod_override``.
 
 Here is an example of a task with both features:
 
diff --git a/tests/executors/kubernetes_executor_template_files/basic_template.yaml b/tests/executors/kubernetes_executor_template_files/basic_template.yaml
new file mode 100644
index 0000000..1fb00f2
--- /dev/null
+++ b/tests/executors/kubernetes_executor_template_files/basic_template.yaml
@@ -0,0 +1,34 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+---
+kind: Pod
+apiVersion: v1
+metadata:
+  name: dummy-name-dont-delete
+  namespace: dummy-name-dont-delete
+  labels:
+    mylabel: foo
+spec:
+  containers:
+    - name: base
+      image: dummy-name-dont-delete
+  securityContext:
+    runAsUser: 50000
+    fsGroup: 50000
+  imagePullSecrets:
+    - name: airflow-registry
+  schedulerName: default-scheduler
diff --git a/tests/executors/test_kubernetes_executor.py b/tests/executors/test_kubernetes_executor.py
index 68b0006..8d3d5b4 100644
--- a/tests/executors/test_kubernetes_executor.py
+++ b/tests/executors/test_kubernetes_executor.py
@@ -15,6 +15,7 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+import pathlib
 import random
 import re
 import string
@@ -22,6 +23,7 @@ import unittest
 from datetime import datetime
 from unittest import mock
 
+import pytest
 from kubernetes.client import models as k8s
 from urllib3 import HTTPResponse
 
@@ -39,7 +41,7 @@ try:
         get_base_pod_from_template,
     )
     from airflow.kubernetes import pod_generator
-    from airflow.kubernetes.pod_generator import PodGenerator
+    from airflow.kubernetes.pod_generator import PodGenerator, datetime_to_label_safe_datestring
     from airflow.utils.state import State
 except ImportError:
     AirflowKubernetesScheduler = None  # type: ignore
@@ -215,6 +217,93 @@ class TestKubernetesExecutor(unittest.TestCase):
 
         assert list(executor.event_buffer.values())[0][1] == "Invalid executor_config passed"
 
+    @pytest.mark.execution_timeout(10)
+    @unittest.skipIf(AirflowKubernetesScheduler is None, 'kubernetes python package is not installed')
+    @mock.patch('airflow.kubernetes.pod_launcher.PodLauncher.run_pod_async')
+    @mock.patch('airflow.executors.kubernetes_executor.get_kube_client')
+    def test_pod_template_file_override_in_executor_config(self, mock_get_kube_client, mock_run_pod_async):
+        current_folder = pathlib.Path(__file__).parent.absolute()
+        template_file = str(
+            (current_folder / "kubernetes_executor_template_files" / "basic_template.yaml").absolute()
+        )
+
+        mock_kube_client = mock.patch('kubernetes.client.CoreV1Api', autospec=True)
+        mock_get_kube_client.return_value = mock_kube_client
+
+        with conf_vars({('kubernetes', 'pod_template_file'): ''}):
+            executor = self.kubernetes_executor
+            executor.start()
+
+            assert executor.event_buffer == {}
+            assert executor.task_queue.empty()
+
+            execution_date = datetime.utcnow()
+
+            executor.execute_async(
+                key=('dag', 'task', execution_date, 1),
+                queue=None,
+                command=['airflow', 'tasks', 'run', 'true', 'some_parameter'],
+                executor_config={
+                    "pod_template_file": template_file,
+                    "pod_override": k8s.V1Pod(
+                        metadata=k8s.V1ObjectMeta(labels={"release": "stable"}),
+                        spec=k8s.V1PodSpec(
+                            containers=[k8s.V1Container(name="base", image="airflow:3.6")],
+                        ),
+                    ),
+                },
+            )
+
+            assert not executor.task_queue.empty()
+            task = executor.task_queue.get_nowait()
+            _, _, expected_executor_config, expected_pod_template_file = task
+
+            # Test that the correct values have been put to queue
+            assert expected_executor_config.metadata.labels == {'release': 'stable'}
+            assert expected_pod_template_file == template_file
+
+            self.kubernetes_executor.kube_scheduler.run_next(task)
+            mock_run_pod_async.assert_called_once_with(
+                k8s.V1Pod(
+                    api_version="v1",
+                    kind="Pod",
+                    metadata=k8s.V1ObjectMeta(
+                        name=mock.ANY,
+                        namespace="default",
+                        annotations={
+                            'dag_id': 'dag',
+                            'execution_date': execution_date.isoformat(),
+                            'task_id': 'task',
+                            'try_number': '1',
+                        },
+                        labels={
+                            'airflow-worker': '5',
+                            'airflow_version': mock.ANY,
+                            'dag_id': 'dag',
+                            'execution_date': datetime_to_label_safe_datestring(execution_date),
+                            'kubernetes_executor': 'True',
+                            'mylabel': 'foo',
+                            'release': 'stable',
+                            'task_id': 'task',
+                            'try_number': '1',
+                        },
+                    ),
+                    spec=k8s.V1PodSpec(
+                        containers=[
+                            k8s.V1Container(
+                                name="base",
+                                image="airflow:3.6",
+                                args=['airflow', 'tasks', 'run', 'true', 'some_parameter'],
+                                env=[k8s.V1EnvVar(name='AIRFLOW_IS_K8S_EXECUTOR_POD', value='True')],
+                            )
+                        ],
+                        image_pull_secrets=[k8s.V1LocalObjectReference(name='airflow-registry')],
+                        scheduler_name='default-scheduler',
+                        security_context=k8s.V1PodSecurityContext(fs_group=50000, run_as_user=50000),
+                    ),
+                )
+            )
+
     @mock.patch('airflow.executors.kubernetes_executor.KubernetesJobWatcher')
     @mock.patch('airflow.executors.kubernetes_executor.get_kube_client')
     def test_change_state_running(self, mock_get_kube_client, mock_kubernetes_job_watcher):

[airflow] 03/04: Fix mistake and typos in doc/docstrings (#15180)

Posted by ka...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v2-0-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ddf306b257d75978f0bd0f1e433342f60d0ceb0f
Author: Xiaodong DENG <xd...@apache.org>
AuthorDate: Sun Apr 4 19:44:03 2021 +0200

    Fix mistake and typos in doc/docstrings (#15180)
    
    - Fix an apparent mistake in doc relating to catchup
    - Fix typo pickable (should be picklable)
    
    (cherry picked from commit 53dafa593fd7ce0be2a48dc9a9e993bb42b6abc5)
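
    As a small illustration of the setting the corrected catchup sentence refers to
    (the DAG id is made up):

        from datetime import datetime

        from airflow import DAG

        # With catchup=False the scheduler only creates a DAG run for the latest
        # interval instead of backfilling every interval since start_date.
        dag = DAG(
            "no_catchup_example",
            start_date=datetime(2021, 1, 1),
            schedule_interval="@daily",
            catchup=False,
        )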
---
 airflow/providers/apache/hive/hooks/hive.py | 2 +-
 airflow/utils/timezone.py                   | 4 ++--
 docs/apache-airflow/dag-run.rst             | 2 +-
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/airflow/providers/apache/hive/hooks/hive.py b/airflow/providers/apache/hive/hooks/hive.py
index ab7b7b7..d261ab2 100644
--- a/airflow/providers/apache/hive/hooks/hive.py
+++ b/airflow/providers/apache/hive/hooks/hive.py
@@ -487,7 +487,7 @@ class HiveMetastoreHook(BaseHook):
 
     def __getstate__(self) -> Dict[str, Any]:
         # This is for pickling to work despite the thrift hive client not
-        # being pickable
+        # being picklable
         state = dict(self.__dict__)
         del state['metastore']
         return state
diff --git a/airflow/utils/timezone.py b/airflow/utils/timezone.py
index d302cbe..09736e5 100644
--- a/airflow/utils/timezone.py
+++ b/airflow/utils/timezone.py
@@ -56,7 +56,7 @@ def utcnow() -> dt.datetime:
     :return:
     """
     # pendulum utcnow() is not used as that sets a TimezoneInfo object
-    # instead of a Timezone. This is not pickable and also creates issues
+    # instead of a Timezone. This is not picklable and also creates issues
     # when using replace()
     result = dt.datetime.utcnow()
     result = result.replace(tzinfo=utc)
@@ -71,7 +71,7 @@ def utc_epoch() -> dt.datetime:
     :return:
     """
     # pendulum utcnow() is not used as that sets a TimezoneInfo object
-    # instead of a Timezone. This is not pickable and also creates issues
+    # instead of a Timezone. This is not picklable and also creates issues
     # when using replace()
     result = dt.datetime(1970, 1, 1)
     result = result.replace(tzinfo=utc)
diff --git a/docs/apache-airflow/dag-run.rst b/docs/apache-airflow/dag-run.rst
index dbcf68a..0752990 100644
--- a/docs/apache-airflow/dag-run.rst
+++ b/docs/apache-airflow/dag-run.rst
@@ -80,7 +80,7 @@ An Airflow DAG with a ``start_date``, possibly an ``end_date``, and a ``sched
 series of intervals which the scheduler turns into individual DAG Runs and executes. The scheduler, by default, will
 kick off a DAG Run for any interval that has not been run since the last execution date (or has been cleared). This concept is called Catchup.
 
-If your DAG is written to handle its catchup (i.e., not limited to the interval, but instead to ``Now`` for instance.),
+If your DAG is not written to handle its catchup (i.e., not limited to the interval, but instead to ``Now`` for instance.),
 then you will want to turn catchup off. This can be done by setting ``catchup = False`` in DAG  or ``catchup_by_default = False``
 in the configuration file. When turned off, the scheduler creates a DAG run only for the latest interval.