Posted to commits@airflow.apache.org by je...@apache.org on 2021/11/30 15:22:35 UTC

[airflow] branch main updated: Capitalize names in docs (#19893)

This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 9a469d8  Capitalize names in docs (#19893)
9a469d8 is described below

commit 9a469d813fc083a7a5f402727e93df3e3f9c9118
Author: Bas Harenslak <Ba...@users.noreply.github.com>
AuthorDate: Tue Nov 30 16:22:03 2021 +0100

    Capitalize names in docs (#19893)
    
    Co-authored-by: Bas Harenslak <ba...@astronomer.io>
---
 docs/apache-airflow-providers-apache-drill/index.rst         |  2 +-
 docs/apache-airflow-providers-apache-druid/index.rst         |  2 +-
 .../connections/kubernetes.rst                               |  2 +-
 docs/apache-airflow-providers-dingding/index.rst             |  2 +-
 docs/apache-airflow-providers-docker/connections/docker.rst  | 10 +++++-----
 docs/apache-airflow-providers-grpc/index.rst                 |  2 +-
 docs/apache-airflow-providers-jenkins/index.rst              |  2 +-
 docs/apache-airflow-providers-mongo/index.rst                |  2 +-
 docs/apache-airflow-providers-mysql/index.rst                |  2 +-
 .../connections/postgres.rst                                 |  2 +-
 .../operators/postgres_operator_howto_guide.rst              | 12 ++++++------
 docs/apache-airflow-providers-segment/index.rst              |  2 +-
 docs/apache-airflow-providers-singularity/index.rst          |  2 +-
 docs/apache-airflow-providers-sqlite/index.rst               |  2 +-
 docs/apache-airflow-providers-ssh/index.rst                  |  2 +-
 docs/apache-airflow-providers-yandex/index.rst               |  2 +-
 docs/apache-airflow/concepts/smart-sensors.rst               |  4 ++--
 docs/apache-airflow/deprecated-rest-api-ref.rst              |  2 +-
 docs/apache-airflow/executor/kubernetes.rst                  |  2 +-
 docs/apache-airflow/extra-packages-ref.rst                   |  2 +-
 docs/apache-airflow/howto/connection.rst                     |  4 ++--
 docs/apache-airflow/howto/custom-operator.rst                |  2 +-
 docs/apache-airflow/howto/define_extra_link.rst              |  2 +-
 docs/apache-airflow/installation/index.rst                   |  2 +-
 docs/apache-airflow/installation/installing-from-pypi.rst    |  4 ++--
 docs/apache-airflow/logging-monitoring/logging-tasks.rst     |  2 +-
 docs/apache-airflow/plugins.rst                              |  2 +-
 docs/apache-airflow/production-deployment.rst                |  2 +-
 docs/apache-airflow/start/docker.rst                         |  2 +-
 docs/apache-airflow/tutorial.rst                             |  6 +++---
 docs/docker-stack/build-arg-ref.rst                          |  4 ++--
 docs/docker-stack/build.rst                                  |  4 ++--
 docs/helm-chart/manage-dags-files.rst                        |  4 ++--
 docs/helm-chart/production-guide.rst                         |  4 ++--
 docs/helm-chart/setting-resources-for-containers.rst         |  6 +++---
 35 files changed, 55 insertions(+), 55 deletions(-)

diff --git a/docs/apache-airflow-providers-apache-drill/index.rst b/docs/apache-airflow-providers-apache-drill/index.rst
index a520dc3..b8cecfc 100644
--- a/docs/apache-airflow-providers-apache-drill/index.rst
+++ b/docs/apache-airflow-providers-apache-drill/index.rst
@@ -69,7 +69,7 @@ are in ``airflow.providers.apache.drill`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-apache-drill``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-apache-druid/index.rst b/docs/apache-airflow-providers-apache-druid/index.rst
index 07b98a0..56bdc81 100644
--- a/docs/apache-airflow-providers-apache-druid/index.rst
+++ b/docs/apache-airflow-providers-apache-druid/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.apache.druid`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-apache-druid``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-cncf-kubernetes/connections/kubernetes.rst b/docs/apache-airflow-providers-cncf-kubernetes/connections/kubernetes.rst
index db83e2c..c0a8539 100644
--- a/docs/apache-airflow-providers-cncf-kubernetes/connections/kubernetes.rst
+++ b/docs/apache-airflow-providers-cncf-kubernetes/connections/kubernetes.rst
@@ -56,7 +56,7 @@ Kube config (JSON format)
   that used to connect to Kubernetes client.
 
 Namespace
-  Default kubernetes namespace for the connection.
+  Default Kubernetes namespace for the connection.
 
 When specifying the connection in environment variable you should specify
 it using URI syntax.
diff --git a/docs/apache-airflow-providers-dingding/index.rst b/docs/apache-airflow-providers-dingding/index.rst
index 0cf8299..86ba73d 100644
--- a/docs/apache-airflow-providers-dingding/index.rst
+++ b/docs/apache-airflow-providers-dingding/index.rst
@@ -69,7 +69,7 @@ are in ``airflow.providers.dingding`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-dingding``
 
 Cross provider package dependencies
diff --git a/docs/apache-airflow-providers-docker/connections/docker.rst b/docs/apache-airflow-providers-docker/connections/docker.rst
index cf56a2b..ce733b4 100644
--- a/docs/apache-airflow-providers-docker/connections/docker.rst
+++ b/docs/apache-airflow-providers-docker/connections/docker.rst
@@ -27,7 +27,7 @@ The Docker connection type enables connection to the Docker registry.
 Authenticating to Docker
 ------------------------
 
-Authenticate to docker by using the login information for docker registry.
+Authenticate to Docker by using the login information for Docker registry.
 More information on `Docker authentication here
 <https://docker-py.readthedocs.io/en/1.2.3/api/>`_.
 
@@ -40,13 +40,13 @@ Configuring the Connection
 --------------------------
 
 Login
-    Specify the docker registry username.
+    Specify the Docker registry username.
 
 Password
-    Specify the docker registry plaintext password.
+    Specify the Docker registry plaintext password.
 
 Host
-    Specify the URL to the docker registry. Ex: ``https://index.docker.io/v1``
+    Specify the URL to the Docker registry. Ex: ``https://index.docker.io/v1``
 
 Port (optional)
     Specify the port if not specified in host.
@@ -56,7 +56,7 @@ Extra
     The following parameters are all optional:
 
     * ``email``: Specify the email used for the registry account.
-    * ``reauth``: Specify whether refresh existing authentication on the docker server. (bool)
+    * ``reauth``: Specify whether refresh existing authentication on the Docker server. (bool)
 
 When specifying the connection in environment variable you should specify
 it using URI syntax.
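
The Docker connection page above ends by noting that a connection passed through an
environment variable must be written in URI syntax. As a minimal illustrative sketch
(the connection id and registry credentials below are placeholders), the ``Connection``
class can generate that URI instead of hand-encoding it:

.. code-block:: python

    import json

    from airflow.models.connection import Connection

    # Placeholder credentials for a Docker registry connection.
    conn = Connection(
        conn_id="docker_default",
        conn_type="docker",
        login="myuser",
        password="s3cret",
        host="https://index.docker.io/v1",
        extra=json.dumps({"email": "myuser@example.com", "reauth": False}),
    )

    # The printed value can be exported as AIRFLOW_CONN_DOCKER_DEFAULT.
    print(conn.get_uri())
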
diff --git a/docs/apache-airflow-providers-grpc/index.rst b/docs/apache-airflow-providers-grpc/index.rst
index 6d19fee..b6e1933 100644
--- a/docs/apache-airflow-providers-grpc/index.rst
+++ b/docs/apache-airflow-providers-grpc/index.rst
@@ -68,7 +68,7 @@ are in ``airflow.providers.grpc`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-grpc``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-jenkins/index.rst b/docs/apache-airflow-providers-jenkins/index.rst
index ab4b000..490f942 100644
--- a/docs/apache-airflow-providers-jenkins/index.rst
+++ b/docs/apache-airflow-providers-jenkins/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.jenkins`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-jenkins``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-mongo/index.rst b/docs/apache-airflow-providers-mongo/index.rst
index 1be18dd..a93a6fa 100644
--- a/docs/apache-airflow-providers-mongo/index.rst
+++ b/docs/apache-airflow-providers-mongo/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.mongo`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-mongo``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-mysql/index.rst b/docs/apache-airflow-providers-mysql/index.rst
index 911500b..c23a0bd 100644
--- a/docs/apache-airflow-providers-mysql/index.rst
+++ b/docs/apache-airflow-providers-mysql/index.rst
@@ -70,7 +70,7 @@ are in ``airflow.providers.mysql`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-mysql``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-postgres/connections/postgres.rst b/docs/apache-airflow-providers-postgres/connections/postgres.rst
index f777c85..542511b 100644
--- a/docs/apache-airflow-providers-postgres/connections/postgres.rst
+++ b/docs/apache-airflow-providers-postgres/connections/postgres.rst
@@ -38,7 +38,7 @@ Password (required)
     Specify the password to connect.
 
 Extra (optional)
-    Specify the extra parameters (as json dictionary) that can be used in postgres
+    Specify the extra parameters (as json dictionary) that can be used in Postgres
     connection. The following parameters out of the standard python parameters
     are supported:
 
diff --git a/docs/apache-airflow-providers-postgres/operators/postgres_operator_howto_guide.rst b/docs/apache-airflow-providers-postgres/operators/postgres_operator_howto_guide.rst
index f035920..b789aa0 100644
--- a/docs/apache-airflow-providers-postgres/operators/postgres_operator_howto_guide.rst
+++ b/docs/apache-airflow-providers-postgres/operators/postgres_operator_howto_guide.rst
@@ -34,8 +34,8 @@ Under the hood, the :class:`~airflow.providers.postgres.operators.postgres.Postg
 Common Database Operations with PostgresOperator
 ------------------------------------------------
 
-To use the postgres operator to carry out SQL request, two parameters are required: ``sql`` and ``postgres_conn_id``.
-These two parameters are eventually fed to the postgres hook object that interacts directly with the postgres database.
+To use the PostgresOperator to carry out SQL request, two parameters are required: ``sql`` and ``postgres_conn_id``.
+These two parameters are eventually fed to the PostgresHook object that interacts directly with the Postgres database.
 
 Creating a Postgres database table
 ----------------------------------
@@ -100,10 +100,10 @@ We can then create a PostgresOperator task that populate the ``pet`` table.
   )
 
 
-Fetching records from your postgres database table
+Fetching records from your Postgres database table
 --------------------------------------------------
 
-Fetching records from your postgres database table can be as simple as:
+Fetching records from your Postgres database table can be as simple as:
 
 .. code-block:: python
 
@@ -171,5 +171,5 @@ Conclusion
 In this how-to guide we explored the Apache Airflow PostgreOperator. Let's quickly highlight the key takeaways.
 In Airflow-2.0, PostgresOperator class now resides in the ``providers`` package. It is best practice to create subdirectory
 called ``sql`` in your ``dags`` directory where you can store your sql files. This will make your code more elegant and more
-maintainable. And finally, we looked at the different ways you can dynamically pass parameters into our postgres operator
-tasks  using ``parameters`` or ``params`` attribute.
+maintainable. And finally, we looked at the different ways you can dynamically pass parameters into our PostgresOperator
+tasks using ``parameters`` or ``params`` attribute.
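
For readers following the quoted guide, a minimal sketch of such a task, using only
the two required parameters plus runtime ``parameters`` (the DAG id, connection id
and table are illustrative):

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.postgres.operators.postgres import PostgresOperator

    with DAG(
        dag_id="postgres_operator_example",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        # The underlying PostgresHook runs this statement against the
        # database referenced by postgres_conn_id.
        get_birth_date = PostgresOperator(
            task_id="get_birth_date",
            postgres_conn_id="postgres_default",
            sql="SELECT * FROM pet WHERE birth_date BETWEEN %(begin_date)s AND %(end_date)s",
            parameters={"begin_date": "2020-01-01", "end_date": "2020-12-31"},
        )
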
diff --git a/docs/apache-airflow-providers-segment/index.rst b/docs/apache-airflow-providers-segment/index.rst
index fec2e68..25be305 100644
--- a/docs/apache-airflow-providers-segment/index.rst
+++ b/docs/apache-airflow-providers-segment/index.rst
@@ -62,7 +62,7 @@ are in ``airflow.providers.segment`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-segment``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-singularity/index.rst b/docs/apache-airflow-providers-singularity/index.rst
index 59a5321..303283c 100644
--- a/docs/apache-airflow-providers-singularity/index.rst
+++ b/docs/apache-airflow-providers-singularity/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.singularity`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-singularity``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-sqlite/index.rst b/docs/apache-airflow-providers-sqlite/index.rst
index 2b3ad2c..8718948 100644
--- a/docs/apache-airflow-providers-sqlite/index.rst
+++ b/docs/apache-airflow-providers-sqlite/index.rst
@@ -75,7 +75,7 @@ are in ``airflow.providers.sqlite`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-sqlite``
 
 .. include:: ../../airflow/providers/sqlite/CHANGELOG.rst
diff --git a/docs/apache-airflow-providers-ssh/index.rst b/docs/apache-airflow-providers-ssh/index.rst
index 9a1401e..3d33d43 100644
--- a/docs/apache-airflow-providers-ssh/index.rst
+++ b/docs/apache-airflow-providers-ssh/index.rst
@@ -68,7 +68,7 @@ are in ``airflow.providers.ssh`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-ssh``
 
 PIP requirements
diff --git a/docs/apache-airflow-providers-yandex/index.rst b/docs/apache-airflow-providers-yandex/index.rst
index 7d7e66e..dc63184 100644
--- a/docs/apache-airflow-providers-yandex/index.rst
+++ b/docs/apache-airflow-providers-yandex/index.rst
@@ -70,7 +70,7 @@ are in ``airflow.providers.yandex`` python package.
 Installation
 ------------
 
-You can install this package on top of an existing airflow 2.1+ installation via
+You can install this package on top of an existing Airflow 2.1+ installation via
 ``pip install apache-airflow-providers-yandex``
 
 PIP requirements
diff --git a/docs/apache-airflow/concepts/smart-sensors.rst b/docs/apache-airflow/concepts/smart-sensors.rst
index e654d91..fe84117 100644
--- a/docs/apache-airflow/concepts/smart-sensors.rst
+++ b/docs/apache-airflow/concepts/smart-sensors.rst
@@ -55,7 +55,7 @@ store poke context at sensor_instance table and then exits with a ‘sensing’
 
 When the smart sensor mode is enabled, a special set of builtin smart sensor DAGs
 (named smart_sensor_group_shard_xxx) is created by the system; These DAGs contain ``SmartSensorOperator``
-task and manage the smart sensor jobs for the airflow cluster. The SmartSensorOperator task can fetch
+task and manage the smart sensor jobs for the Airflow cluster. The SmartSensorOperator task can fetch
 hundreds of ‘sensing’ instances from sensor_instance table and poke on behalf of them in batches.
 Users don’t need to change their existing DAGs.
 
@@ -79,7 +79,7 @@ Add the following settings in the ``airflow.cfg``:
 
 *   ``use_smart_sensor``: This config indicates if the smart sensor is enabled.
 *   ``shards``: This config indicates the number of concurrently running smart sensor jobs for
-    the airflow cluster.
+    the Airflow cluster.
 *   ``sensors_enabled``: This config is a list of sensor class names that will use the smart sensor.
     The users use the same class names (e.g. HivePartitionSensor) in their DAGs and they don’t have
     the control to use smart sensors or not, unless they exclude their tasks explicitly.
diff --git a/docs/apache-airflow/deprecated-rest-api-ref.rst b/docs/apache-airflow/deprecated-rest-api-ref.rst
index 877385b..281e68d 100644
--- a/docs/apache-airflow/deprecated-rest-api-ref.rst
+++ b/docs/apache-airflow/deprecated-rest-api-ref.rst
@@ -38,7 +38,7 @@ Endpoints
 .. http:post:: /api/experimental/dags/<DAG_ID>/dag_runs
 
   Creates a dag_run for a given dag id.
-  Note: If execution_date is not specified in the body, airflow by default creates only one DAG per second for a given DAG_ID.
+  Note: If execution_date is not specified in the body, Airflow by default creates only one DAG per second for a given DAG_ID.
   In order to create multiple DagRun within one second, you should set parameter ``"replace_microseconds"`` to ``"false"`` (boolean as string).
 
   The execution_date must be specified with the format ``YYYY-mm-DDTHH:MM:SS.ssssss``.
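
As an illustrative sketch of the endpoint described above (webserver address and
DAG id are placeholders), a request that supplies ``execution_date`` in the required
format and passes ``replace_microseconds`` as a string:

.. code-block:: python

    import requests

    resp = requests.post(
        "http://localhost:8080/api/experimental/dags/example_dag/dag_runs",
        json={
            "execution_date": "2021-11-30T15:22:35.000000",
            "replace_microseconds": "false",  # boolean passed as a string
        },
    )
    resp.raise_for_status()
    print(resp.json())
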
diff --git a/docs/apache-airflow/executor/kubernetes.rst b/docs/apache-airflow/executor/kubernetes.rst
index ab60791..40db784 100644
--- a/docs/apache-airflow/executor/kubernetes.rst
+++ b/docs/apache-airflow/executor/kubernetes.rst
@@ -93,7 +93,7 @@ With these requirements in mind, here are some examples of basic ``pod_template_
 
 .. note::
 
-    The examples below should work when using default airflow configuration values. However, many custom
+    The examples below should work when using default Airflow configuration values. However, many custom
     configuration values need to be explicitly passed to the pod via this template too. This includes,
     but is not limited to, sql configuration, required Airflow connections, dag folder path and
     logging settings. See :doc:`../configurations-ref` for details.
diff --git a/docs/apache-airflow/extra-packages-ref.rst b/docs/apache-airflow/extra-packages-ref.rst
index 8698cf8..b761e21 100644
--- a/docs/apache-airflow/extra-packages-ref.rst
+++ b/docs/apache-airflow/extra-packages-ref.rst
@@ -46,7 +46,7 @@ python dependencies for the provided package.
 +---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
 | cgroups             | ``pip install 'apache-airflow[cgroups]'``           | Needed To use CgroupTaskRunner                                             |
 +---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
-| cncf.kubernetes     | ``pip install 'apache-airflow[cncf.kubernetes]'``   | Kubernetes Executor (also installs the kubernetes provider package)        |
+| cncf.kubernetes     | ``pip install 'apache-airflow[cncf.kubernetes]'``   | Kubernetes Executor (also installs the Kubernetes provider package)        |
 +---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
 | dask                | ``pip install 'apache-airflow[dask]'``              | DaskExecutor                                                               |
 +---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
diff --git a/docs/apache-airflow/howto/connection.rst b/docs/apache-airflow/howto/connection.rst
index 99dd065..c77d25d 100644
--- a/docs/apache-airflow/howto/connection.rst
+++ b/docs/apache-airflow/howto/connection.rst
@@ -204,10 +204,10 @@ If storing the environment variable in something like ``~/.bashrc``, add as foll
 
     export AIRFLOW_CONN_MY_PROD_DATABASE='my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'
 
-Using docker .env
+Using Docker .env
 ^^^^^^^^^^^^^^^^^
 
-If using with a docker ``.env`` file, you may need to remove the single quotes.
+If using with a Docker ``.env`` file, you may need to remove the single quotes.
 
 .. code-block::
 
diff --git a/docs/apache-airflow/howto/custom-operator.rst b/docs/apache-airflow/howto/custom-operator.rst
index e4ca522..46431d1 100644
--- a/docs/apache-airflow/howto/custom-operator.rst
+++ b/docs/apache-airflow/howto/custom-operator.rst
@@ -32,7 +32,7 @@ There are two methods that you need to override in a derived class:
   You can specify the ``default_args`` in the dag file. See :ref:`Default args <concepts:default-arguments>` for more details.
 
 * Execute - The code to execute when the runner calls the operator. The method contains the
-  airflow context as a parameter that can be used to read config values.
+  Airflow context as a parameter that can be used to read config values.
 
 .. note::
 
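
To make the ``execute`` step above concrete, a minimal sketch of a derived operator
(the class name and the values it reads from the context are only examples):

.. code-block:: python

    from airflow.models.baseoperator import BaseOperator


    class HelloOperator(BaseOperator):
        def __init__(self, name: str, **kwargs) -> None:
            super().__init__(**kwargs)
            self.name = name

        def execute(self, context):
            # The Airflow context passed in here exposes runtime values,
            # such as the logical date string ``ds``.
            message = f"Hello {self.name}, run date is {context['ds']}"
            print(message)
            return message
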
diff --git a/docs/apache-airflow/howto/define_extra_link.rst b/docs/apache-airflow/howto/define_extra_link.rst
index 631bf7e..527312b 100644
--- a/docs/apache-airflow/howto/define_extra_link.rst
+++ b/docs/apache-airflow/howto/define_extra_link.rst
@@ -61,7 +61,7 @@ The following code shows how to add extra links to an operator via Plugins:
 .. note:: Operator Extra Links should be registered via Airflow Plugins or custom Airflow Provider to work.
 
 You can also add a global operator extra link that will be available to
-all the operators through an airflow plugin or through airflow providers. You can learn more about it in the
+all the operators through an Airflow plugin or through Airflow providers. You can learn more about it in the
 :ref:`plugin example <plugin-example>` and in :doc:`apache-airflow-providers:index`.
 
 You can see all the extra links available via community-managed providers in
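
A rough sketch of the plugin-based registration referred to above; the link name,
URL and plugin name are placeholders, and the full how-to in this file remains the
authoritative example:

.. code-block:: python

    from airflow.models.baseoperator import BaseOperatorLink
    from airflow.plugins_manager import AirflowPlugin


    class ExampleDocsLink(BaseOperatorLink):
        name = "Example Docs"

        def get_link(self, operator, dttm=None):
            # Placeholder URL; a real link would normally be built from
            # the operator or the task instance.
            return "https://airflow.apache.org/docs/"


    class ExampleLinkPlugin(AirflowPlugin):
        name = "example_link_plugin"
        # Registering here attaches the link to every operator.
        global_operator_extra_links = [ExampleDocsLink()]
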
diff --git a/docs/apache-airflow/installation/index.rst b/docs/apache-airflow/installation/index.rst
index 9c3bebf..bbd3424 100644
--- a/docs/apache-airflow/installation/index.rst
+++ b/docs/apache-airflow/installation/index.rst
@@ -162,7 +162,7 @@ and official constraint files- same that are used for installing Airflow from Py
 
 * Users who are familiar with Containers and Docker stack and understand how to build their own container images.
 * Users who understand how to install providers and dependencies from PyPI with constraints if they want to extend or customize the image.
-* Users who know how to create deployments using Docker by linking together multiple docker containers and maintaining such deployments.
+* Users who know how to create deployments using Docker by linking together multiple Docker containers and maintaining such deployments.
 
 **What are you expected to handle**
 
diff --git a/docs/apache-airflow/installation/installing-from-pypi.rst b/docs/apache-airflow/installation/installing-from-pypi.rst
index 95d0552..6528698 100644
--- a/docs/apache-airflow/installation/installing-from-pypi.rst
+++ b/docs/apache-airflow/installation/installing-from-pypi.rst
@@ -89,8 +89,8 @@ In order to simplify the installation, we have prepared examples of how to upgra
 Installing Airflow with extras and providers
 ============================================
 
-If you need to install extra dependencies of airflow, you can use the script below to make an installation
-a one-liner (the example below installs postgres and google provider, as well as ``async`` extra.
+If you need to install extra dependencies of Airflow, you can use the script below to make an installation
+a one-liner (the example below installs Postgres and Google providers, as well as ``async`` extra).
 
 .. code-block:: bash
     :substitutions:
diff --git a/docs/apache-airflow/logging-monitoring/logging-tasks.rst b/docs/apache-airflow/logging-monitoring/logging-tasks.rst
index 043f8f7..0c1fd79 100644
--- a/docs/apache-airflow/logging-monitoring/logging-tasks.rst
+++ b/docs/apache-airflow/logging-monitoring/logging-tasks.rst
@@ -120,7 +120,7 @@ Some external systems require specific configuration in Airflow for redirection
 Serving logs from workers
 -------------------------
 
-Most task handlers send logs upon completion of a task. In order to view logs in real time, airflow automatically starts an http server to serve the logs in the following cases:
+Most task handlers send logs upon completion of a task. In order to view logs in real time, Airflow automatically starts an http server to serve the logs in the following cases:
 
 - If ``SchedulerExecutor`` or ``LocalExecutor`` is used, then when ``airflow scheduler`` is running.
 - If ``CeleryExecutor`` is used, then when ``airflow worker`` is running.
diff --git a/docs/apache-airflow/plugins.rst b/docs/apache-airflow/plugins.rst
index 3543aef..59fbe9a 100644
--- a/docs/apache-airflow/plugins.rst
+++ b/docs/apache-airflow/plugins.rst
@@ -300,7 +300,7 @@ Plugins as Python packages
 --------------------------
 
 It is possible to load plugins via `setuptools entrypoint <https://packaging.python.org/guides/creating-and-discovering-plugins/#using-package-metadata>`_ mechanism. To do this link
-your plugin using an entrypoint in your package. If the package is installed, airflow
+your plugin using an entrypoint in your package. If the package is installed, Airflow
 will automatically load the registered plugins from the entrypoint list.
 
 .. note::
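
As a sketch of the entrypoint mechanism mentioned above (the package, module and
plugin class names are made up), a ``setup.py`` that registers a plugin under the
``airflow.plugins`` entrypoint group:

.. code-block:: python

    from setuptools import setup

    setup(
        name="my-airflow-plugin",
        version="0.1.0",
        packages=["my_package"],
        entry_points={
            # Airflow discovers plugins registered under this group name.
            "airflow.plugins": [
                "my_plugin = my_package.my_plugin:MyAirflowPlugin",
            ],
        },
    )
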
diff --git a/docs/apache-airflow/production-deployment.rst b/docs/apache-airflow/production-deployment.rst
index 66b7370..c207604 100644
--- a/docs/apache-airflow/production-deployment.rst
+++ b/docs/apache-airflow/production-deployment.rst
@@ -141,7 +141,7 @@ is capable of retrieving the authentication token.
 
 The best practice to implement proper security mechanism in this case is to make sure that worker
 workloads have no access to the Keytab but only have access to the periodically refreshed, temporary
-authentication tokens. This can be achieved in docker environment by running the ``airflow kerberos``
+authentication tokens. This can be achieved in Docker environment by running the ``airflow kerberos``
 command and the worker command in separate containers - where only the ``airflow kerberos`` token has
 access to the Keytab file (preferably configured as secret resource). Those two containers should share
 a volume where the temporary token should be written by the ``airflow kerberos`` and read by the workers.
diff --git a/docs/apache-airflow/start/docker.rst b/docs/apache-airflow/start/docker.rst
index 023f669..377ef99 100644
--- a/docs/apache-airflow/start/docker.rst
+++ b/docs/apache-airflow/start/docker.rst
@@ -324,7 +324,7 @@ runtime user id which is unknown at the time of building the image.
     functionality - only added confusion - so it has been removed.
 
 
-Those additional variables are useful in case you are trying out/testing Airflow installation via docker compose.
+Those additional variables are useful in case you are trying out/testing Airflow installation via Docker Compose.
 They are not intended to be used in production, but they make the environment faster to bootstrap for first time
 users with the most common customizations.
 
diff --git a/docs/apache-airflow/tutorial.rst b/docs/apache-airflow/tutorial.rst
index babb8d6..965a126 100644
--- a/docs/apache-airflow/tutorial.rst
+++ b/docs/apache-airflow/tutorial.rst
@@ -374,11 +374,11 @@ Lets look at another example; we need to get some data from a file which is host
 
 Initial setup
 ''''''''''''''''''''
-We need to have docker and postgres installed.
+We need to have Docker and Postgres installed.
 We will be using this `docker file <https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#docker-compose-yaml>`_
 Follow the instructions properly to set up Airflow.
 
-Create a Employee table in postgres using this:
+Create a Employee table in Postgres using this:
 
 .. code-block:: sql
 
@@ -400,7 +400,7 @@ Create a Employee table in postgres using this:
       "Leave" INTEGER
   );
 
-We also need to add a connection to postgres. Go to the UI and click "Admin" >> "Connections". Specify the following for each field:
+We also need to add a connection to Postgres. Go to the UI and click "Admin" >> "Connections". Specify the following for each field:
 
 - Conn id: LOCAL
 - Conn Type: postgres
diff --git a/docs/docker-stack/build-arg-ref.rst b/docs/docker-stack/build-arg-ref.rst
index f142b37..7671e7b 100644
--- a/docs/docker-stack/build-arg-ref.rst
+++ b/docs/docker-stack/build-arg-ref.rst
@@ -211,7 +211,7 @@ You can see some examples of those in:
 | ``AIRFLOW_CONSTRAINTS_LOCATION``         |                                          | If not empty, it will override the       |
 |                                          |                                          | source of the constraints with the       |
 |                                          |                                          | specified URL or file. Note that the     |
-|                                          |                                          | file has to be in docker context so      |
+|                                          |                                          | file has to be in Docker context so      |
 |                                          |                                          | it's best to place such file in          |
 |                                          |                                          | one of the folders included in           |
 |                                          |                                          | ``.dockerignore`` file.                  |
@@ -246,7 +246,7 @@ When image is build from PIP, by default pre-caching of PIP dependencies is used
 builds during development. When pre-cached PIP dependencies are used and ``setup.py`` or ``setup.cfg`` changes, the
 PIP dependencies are already pre-installed, thus resulting in much faster image rebuild. This is purely an optimization
 of time needed to build the images and should be disabled if you want to install Airflow from
-docker context files.
+Docker context files.
 
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | Build argument                           | Default value                            | Description                              |
diff --git a/docs/docker-stack/build.rst b/docs/docker-stack/build.rst
index a09f879..0d83635 100644
--- a/docs/docker-stack/build.rst
+++ b/docs/docker-stack/build.rst
@@ -81,8 +81,8 @@ In the simplest case building your image consists of those steps:
 
 4) Once you build the image locally you have usually several options to make them available for your deployment:
 
-* For ``docker-compose`` deployment, that's all you need. The image is stored in docker engine cache
-  and docker compose will use it from there.
+* For ``docker-compose`` deployment, that's all you need. The image is stored in Docker engine cache
+  and Docker Compose will use it from there.
 
 * For some - development targeted - Kubernetes deployments you can load the images directly to
   Kubernetes clusters. Clusters such as ``kind`` or ``minikube`` have dedicated ``load`` method to load the
diff --git a/docs/helm-chart/manage-dags-files.rst b/docs/helm-chart/manage-dags-files.rst
index 37ec903..be6030c 100644
--- a/docs/helm-chart/manage-dags-files.rst
+++ b/docs/helm-chart/manage-dags-files.rst
@@ -24,7 +24,7 @@ When you create new or modify existing DAG files, it is necessary to deploy them
 Bake DAGs in Docker image
 -------------------------
 
-The recommended way to update your DAGs with this chart is to build a new docker image with the latest DAG code:
+The recommended way to update your DAGs with this chart is to build a new Docker image with the latest DAG code:
 
 .. code-block:: bash
 
@@ -37,7 +37,7 @@ The recommended way to update your DAGs with this chart is to build a new docker
 
 .. note::
 
-   In airflow images prior to version 2.0.2, there was a bug that required you to use
+   In Airflow images prior to version 2.0.2, there was a bug that required you to use
    a bit longer Dockerfile, to make sure the image remains OpenShift-compatible (i.e dag
    has root group similarly as other files). In 2.0.2 this has been fixed.
 
diff --git a/docs/helm-chart/production-guide.rst b/docs/helm-chart/production-guide.rst
index d2f1435..f9ffb95 100644
--- a/docs/helm-chart/production-guide.rst
+++ b/docs/helm-chart/production-guide.rst
@@ -98,14 +98,14 @@ Now add the secret to your values file:
 
     webserverSecretKey: <secret_key>
 
-Alternatively, create a kubernetes Secret and use ``webserverSecretKeySecretName``:
+Alternatively, create a Kubernetes Secret and use ``webserverSecretKeySecretName``:
 
 .. code-block:: yaml
 
     webserverSecretKeySecretName: my-webserver-secret
     # where the random key is under `webserver-secret-key` in the k8s Secret
 
-Example to create a kubernetes Secret from ``kubectl``:
+Example to create a Kubernetes Secret from ``kubectl``:
 
 .. code-block:: bash
 
diff --git a/docs/helm-chart/setting-resources-for-containers.rst b/docs/helm-chart/setting-resources-for-containers.rst
index ec97985..3322527 100644
--- a/docs/helm-chart/setting-resources-for-containers.rst
+++ b/docs/helm-chart/setting-resources-for-containers.rst
@@ -18,14 +18,14 @@
 Setting resources for containers
 --------------------------------
 
-It is possible to set `resources <https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/>`__ for the Containers managed by the chart. You can define different resources for various airflow k8s Containers. By default the resources are not set.
+It is possible to set `resources <https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/>`__ for the Containers managed by the chart. You can define different resources for various Airflow k8s Containers. By default the resources are not set.
 
 .. note::
     The k8s scheduler can use resources to decide which node to place the Pod on. Since a Pod resource request/limit is the sum of the resource requests/limits for each Container in the Pod, it is advised to specify resources for each Container in the Pod.
 
 Possible Containers where resources can be configured include:
 
-* Main airflow Containers and their sidecars. You can add the resources for these Containers through the following parameters:
+* Main Airflow Containers and their sidecars. You can add the resources for these Containers through the following parameters:
 
    * ``workers.resources``
    * ``workers.logGroomerSidecar.resources``
@@ -37,7 +37,7 @@ Possible Containers where resources can be configured include:
    * ``flower.resources``
    * ``triggerer.resources``
 
-* Containers used for airflow k8s jobs or cron jobs. You can add the resources for these Containers through the following parameters:
+* Containers used for Airflow k8s jobs or cron jobs. You can add the resources for these Containers through the following parameters:
 
    * ``cleanup.resources``
    * ``createUserJob.resources``