Posted to commits@airflow.apache.org by ka...@apache.org on 2020/11/22 08:31:03 UTC

[airflow] branch master updated: Move providers docs to separate package + Spell-check in a common job with docs-build (#12527)

This is an automated email from the ASF dual-hosted git repository.

kamilbregula pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/master by this push:
     new ef4af21  Move providers docs to separate package + Spell-check in a common job with docs-build (#12527)
ef4af21 is described below

commit ef4af2135171c6e451f1407ea1a280ea875f2175
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Sun Nov 22 09:29:51 2020 +0100

    Move providers docs to separate package + Spell-check in a common job with docs-build (#12527)
---
 .github/workflows/ci.yml                           |  23 +-
 CHANGELOG.txt                                      |  22 +-
 airflow/providers/snowflake/provider.yaml          |   2 +-
 airflow/providers/yandex/provider.yaml             |   6 +-
 .../index.rst}                                     |  11 +
 .../operators-and-hooks-ref/apache.rst             |  42 ++++
 .../operators-and-hooks-ref/aws.rst                |  43 ++++
 .../operators-and-hooks-ref/azure.rst              |  42 ++++
 .../operators-and-hooks-ref/google.rst             |  83 +++++++
 .../operators-and-hooks-ref/index.rst              |  26 ++
 .../operators-and-hooks-ref/protocol.rst           |  38 +++
 .../operators-and-hooks-ref/services.rst           |  38 +++
 .../operators-and-hooks-ref/software.rst           |  38 +++
 .../packages-ref.rst}                              |   2 +-
 docs/build_docs.py                                 |  51 +++-
 docs/concepts.rst                                  |   4 +-
 docs/conf.py                                       |  31 ++-
 docs/exts/airflow_intersphinx.py                   |  17 ++
 .../exts/docs_build/dev_index_template.html.jinja2 |   2 +-
 docs/howto/connection/azure.rst                    |   2 +-
 docs/howto/initialize-database.rst                 |   4 +-
 docs/howto/operator/amazon/aws/emr.rst             |   2 +-
 docs/index.rst                                     |   2 -
 docs/installation.rst                              |   8 +-
 docs/logging-monitoring/logging-tasks.rst          |   2 +-
 docs/operators-and-hooks-ref.rst                   | 263 +--------------------
 docs/rest-api-ref.rst                              |   4 +-
 docs/tutorial.rst                                  |   2 +-
 28 files changed, 479 insertions(+), 331 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index e151040..0ac0434 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -300,7 +300,7 @@ jobs:
           VERBOSE: false
 
   docs:
-    timeout-minutes: 30
+    timeout-minutes: 45
     name: "Build docs"
     runs-on: ubuntu-20.04
     needs: [build-info, ci-images]
@@ -311,7 +311,7 @@ jobs:
       - name: "Prepare CI image ${{env.PYTHON_MAJOR_MINOR_VERSION}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
         run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
       - name: "Build docs"
-        run: ./scripts/ci/docs/ci_docs.sh --docs-only
+        run: ./scripts/ci/docs/ci_docs.sh
       - name: "Upload documentation"
         uses: actions/upload-artifact@v2
         if: always() && github.event_name == 'pull_request'
@@ -333,22 +333,6 @@ jobs:
           github.event_name == 'push'
         run: aws s3 sync ./files/documentation s3://apache-airflow-docs
 
-  docs-spell-check:
-    timeout-minutes: 30
-    name: "Spell check docs"
-    runs-on: ubuntu-20.04
-    needs: [build-info, ci-images]
-    env:
-      PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}}
-    if: needs.build-info.outputs.docs-build == 'true'
-    steps:
-      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v2
-      - name: "Prepare CI image ${{env.PYTHON_MAJOR_MINOR_VERSION}}:${{ env.GITHUB_REGISTRY_PULL_IMAGE_TAG }}"
-        run: ./scripts/ci/images/ci_prepare_ci_image_on_ci.sh
-      - name: "Spell check docs"
-        run: ./scripts/ci/docs/ci_docs.sh --spellcheck-only
-
   prepare-backport-provider-packages:
     timeout-minutes: 30
     name: "Backport packages"
@@ -839,7 +823,6 @@ jobs:
       - prepare-provider-packages
       - prod-images
       - docs
-      - docs-spell-check
     if: >
       (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test') &&
       github.event_name != 'schedule'
@@ -878,7 +861,6 @@ jobs:
       - prepare-backport-provider-packages
       - ci-images
       - docs
-      - docs-spell-check
     if: >
       (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test' ) &&
       github.event_name != 'schedule'
@@ -979,7 +961,6 @@ jobs:
     runs-on: ubuntu-20.04
     needs:
       - docs
-      - docs-spell-check
       - static-checks
       - static-checks-pylint
       - tests-sqlite
diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index eb63c9f..a4e6137 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -946,7 +946,7 @@ Doc-only changes
 - [AIRFLOW-XXX] Fix typo - AWS DynamoDB Hook (#6319)
 - [AIRFLOW-XXX] Fix Documentation for adding extra Operator Links (#6301)
 - [AIRFLOW-XXX] Add section on task lifecycle & correct casing in docs (#4681)
-- [AIRFLOW-XXX] Make it clear that 1.10.5 wasn't accidentally omitted from UPDATING.md (#6240)
+- [AIRFLOW-XXX] Make it clear that 1.10.5 was not accidentally omitted from UPDATING.md (#6240)
 - [AIRFLOW-XXX] Improve format in code-block directives (#6242)
 - [AIRFLOW-XXX] Format Sendgrid docs (#6245)
 - [AIRFLOW-XXX] Update to new logo (#6066)
@@ -1090,7 +1090,7 @@ New Features
 - [AIRFLOW-4521] Pause dag also pause its subdags (#5283)
 - [AIRFLOW-4738] Enforce exampleinclude for example DAGs (#5375)
 - [AIRFLOW-4326] Airflow AWS SQS Operator (#5110)
-- [AIRFLOW-3729] Support "DownwardAPI" in env variables for KubernetesPodOperator (#4554)
+- [AIRFLOW-3729] Support `DownwardAPI` in env variables for KubernetesPodOperator (#4554)
 - [AIRFLOW-4585] Implement Kubernetes Pod Mutation Hook (#5359)
 - [AIRFLOW-161] New redirect route and extra links (#5059)
 - [AIRFLOW-4420] Backfill respects task_concurrency (#5221)
@@ -1272,7 +1272,7 @@ Bug fixes
 - [AIRFLOW-4455] dag_details broken for subdags in RBAC UI (#5234)
 - [AIRFLOW-2955] Fix kubernetes pod operator to set requests and limits on task pods (#4551)
 - [AIRFLOW-4459] Fix wrong DAG count in /home page when DAG count is zero (#5235)
-- [AIRFLOW-3876] AttributeError: module 'distutils' has no attribute 'util'
+- [AIRFLOW-3876] AttributeError: module `distutils` has no attribute 'util'
 - [AIRFLOW-4146] Fix CgroupTaskRunner errors (#5224)
 - [AIRFLOW-4524] Fix bug with "Ignore \*" toggles in RBAC mode (#5378)
 - [AIRFLOW-4765] Fix DataProcPigOperator execute method (#5426)
@@ -1544,7 +1544,7 @@ Improvement
 Bug fixes
 """""""""
 
-- [AIRFLOW-4248] Fix 'FileExistsError' makedirs race in file_processor_handler (#5047)
+- [AIRFLOW-4248] Fix `FileExistsError` makedirs race in file_processor_handler (#5047)
 - [AIRFLOW-4240] State-changing actions should be POST requests (#5039)
 - [AIRFLOW-4246] Flask-Oauthlib needs downstream dependencies pinning due to breaking changes (#5045)
 - [AIRFLOW-3887] Downgrade dagre-d3 to 0.4.18 (#4713)
@@ -2782,7 +2782,7 @@ Airflow 1.10.0, 2018-08-03
 - [AIRFLOW-1799] Fix logging line which raises errors
 - [AIRFLOW-1102] Upgrade Gunicorn >=19.4.0
 - [AIRFLOW-1756] Fix S3TaskHandler to work with Boto3-based S3Hook
-- [AIRFLOW-1797] S3Hook.load_string didn't work on Python3
+- [AIRFLOW-1797] S3Hook.load_string did not work on Python3
 - [AIRFLOW-646] Add docutils to setup_requires
 - [AIRFLOW-1792] Missing intervals DruidOperator
 - [AIRFLOW-1789][AIRFLOW-1712] Log SSHOperator stderr to log.warning
@@ -2927,7 +2927,7 @@ Airflow 1.9.0, 2018-01-02
 - [AIRFLOW-1799] Fix logging line which raises errors
 - [AIRFLOW-1102] Upgrade Gunicorn >=19.4.0
 - [AIRFLOW-1756] Fix S3TaskHandler to work with Boto3-based S3Hook
-- [AIRFLOW-1797] S3Hook.load_string didn't work on Python3
+- [AIRFLOW-1797] S3Hook.load_string did not work on Python3
 - [AIRFLOW-1792] Missing intervals DruidOperator
 - [AIRFLOW-1789][AIRFLOW-1712] Log SSHOperator stderr to log.warning
 - [AIRFLOW-1669] Fix Docker and pin Moto to 1.1.19
@@ -3262,7 +3262,7 @@ Airflow 1.9.0, 2018-01-02
 - [AIRFLOW-1045] Make log level configurable via airflow.cfg
 - [AIRFLOW-1047] Sanitize strings passed to Markup
 - [AIRFLOW-1040] Fix some small typos in comments and docstrings
-- [AIRFLOW-1017] get_task_instance shouldn't throw exception when no TI
+- [AIRFLOW-1017] get_task_instance should not throw exception when no TI
 - [AIRFLOW-1006] Add config_templates to MANIFEST
 - [AIRFLOW-999] Add support for Redis database
 - [AIRFLOW-1009] Remove SQLOperator from Concepts page
@@ -3307,7 +3307,7 @@ Airflow 1.9.0, 2018-01-02
 - [AIRFLOW-937] Improve performance of task_stats
 - [AIRFLOW-933] use ast.literal_eval rather eval because ast.literal_eval does not execute input.
 - [AIRFLOW-925] Revert airflow.hooks change that cherry-pick picked
-- [AIRFLOW-919] Running tasks with no start date shouldn't break a DAGs UI
+- [AIRFLOW-919] Running tasks with no start date should not break a DAGs UI
 - [AIRFLOW-802][AIRFLOW-1] Add spark-submit operator/hook
 - [AIRFLOW-725] Use keyring to store credentials for JIRA
 - [AIRFLOW-916] Remove deprecated readfp function
@@ -3415,7 +3415,7 @@ Airflow 1.8.1, 2017-05-09
 - [AIRFLOW-1017] get_task_instance should return None instead of throw an exception for non-existent TIs
 - [AIRFLOW-1011] Fix bug in BackfillJob._execute() for SubDAGs
 - [AIRFLOW-1004] `airflow webserver -D` runs in foreground
-- [AIRFLOW-1001] Landing Time shows "unsupported operand type(s) for -: 'datetime.datetime' and 'NoneType'" on example_subdag_operator
+- [AIRFLOW-1001] Landing Time shows "unsupported operand type(s) for -: 'datetime.datetime' and `NoneType`" on example_subdag_operator
 - [AIRFLOW-1000] Rebrand to Apache Airflow instead of Airflow
 - [AIRFLOW-989] Clear Task Regression
 - [AIRFLOW-974] airflow.util.file mkdir has a race condition
@@ -3456,7 +3456,7 @@ Airflow 1.8.0, 2017-03-12
 - [AIRFLOW-937] Improve performance of task_stats
 - [AIRFLOW-933] use ast.literal_eval rather eval because ast.literal_eval does not execute input.
 - [AIRFLOW-925] Revert airflow.hooks change that cherry-pick picked
-- [AIRFLOW-919] Running tasks with no start date shouldn't break a DAGs UI
+- [AIRFLOW-919] Running tasks with no start date should not break a DAGs UI
 - [AIRFLOW-802] Add spark-submit operator/hook
 - [AIRFLOW-897] Prevent dagruns from failing with unfinished tasks
 - [AIRFLOW-861] make pickle_info endpoint be login_required
@@ -3765,7 +3765,7 @@ Airflow 1.7.2
 - [AIRFLOW-262] Simplify commands in MANIFEST.in
 - [AIRFLOW-31] Add zope dependency
 - [AIRFLOW-6] Remove dependency on Highcharts
-- [AIRFLOW-234] make task that aren't `running` self-terminate
+- [AIRFLOW-234] make task that are not `running` self-terminate
 - [AIRFLOW-256] Fix test_scheduler_reschedule heartrate
 - Add Python 3 compatibility fix
 - [AIRFLOW-31] Use standard imports for hooks/operators
diff --git a/airflow/providers/snowflake/provider.yaml b/airflow/providers/snowflake/provider.yaml
index 021a7c5..b306145 100644
--- a/airflow/providers/snowflake/provider.yaml
+++ b/airflow/providers/snowflake/provider.yaml
@@ -29,7 +29,7 @@ integrations:
     external-doc-url: https://snowflake.com/
     how-to-guide:
       - /docs/howto/operator/snowflake.rst
-    tags: [software]
+    tags: [service]
 
 operators:
   - integration-name: Snowflake
diff --git a/airflow/providers/yandex/provider.yaml b/airflow/providers/yandex/provider.yaml
index 884f936..0ee9ba6 100644
--- a/airflow/providers/yandex/provider.yaml
+++ b/airflow/providers/yandex/provider.yaml
@@ -27,13 +27,13 @@ versions:
 integrations:
   - integration-name: Yandex.Cloud
     external-doc-url: https://cloud.yandex.com/
-    tags: [yandex]
+    tags: [service]
 
   - integration-name: Yandex.Cloud Dataproc
     external-doc-url: https://cloud.yandex.com/dataproc
     how-to-guide:
       - /docs/howto/operator/yandexcloud.rst
-    tags: [yandex]
+    tags: [service]
 
 operators:
   - integration-name: Yandex.Cloud Dataproc
@@ -44,6 +44,6 @@ hooks:
   - integration-name: Yandex.Cloud
     python-modules:
       - airflow.providers.yandex.hooks.yandex
-  - integration-name: Yandex.Cloud
+  - integration-name: Yandex.Cloud Dataproc
     python-modules:
       - airflow.providers.yandex.hooks.yandexcloud_dataproc
diff --git a/docs/provider-packages.rst b/docs/apache-airflow-providers/index.rst
similarity index 98%
rename from docs/provider-packages.rst
rename to docs/apache-airflow-providers/index.rst
index 7b405ed..0e431b7 100644
--- a/docs/provider-packages.rst
+++ b/docs/apache-airflow-providers/index.rst
@@ -1,3 +1,4 @@
+
  .. Licensed to the Apache Software Foundation (ASF) under one
     or more contributor license agreements.  See the NOTICE file
     distributed with this work for additional information
@@ -118,3 +119,13 @@ A. It depends on the scope of customization. There is no need to upgrade the pro
    Generally speaking, with Airflow 2 we are following the `Semver <https://semver.org/>`_  approach where
    we will introduce backwards-incompatible changes in Major releases, so all your modifications (as long
    as you have not used internal Airflow classes) should work for All Airflow 2.* versions.
+
+
+Content
+-------
+
+.. toctree::
+    :maxdepth: 1
+
+    Packages <packages-ref>
+    Operators and hooks <operators-and-hooks-ref/index>
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/apache.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/apache.rst
new file mode 100644
index 0000000..b53d90a
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/apache.rst
@@ -0,0 +1,42 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+ASF: Apache Software Foundation
+===============================
+
+Airflow supports various software created by the `Apache Software Foundation <https://www.apache.org/foundation/>`__.
+
+Software operators and hooks
+----------------------------
+
+These integrations allow you to perform various operations within software developed by the Apache Software
+Foundation.
+
+.. operators-hooks-ref::
+   :tags: apache
+   :header-separator: "
+
+
+Transfer operators and hooks
+----------------------------
+
+These integrations allow you to copy data from/to software developed by the Apache Software
+Foundation.
+
+.. transfers-ref::
+   :tags: apache
+   :header-separator: "
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/aws.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/aws.rst
new file mode 100644
index 0000000..c551473
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/aws.rst
@@ -0,0 +1,43 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+.. _AWS:
+
+AWS: Amazon Web Services
+------------------------
+
+Airflow has support for `Amazon Web Services <https://aws.amazon.com/>`__.
+
+All hooks are based on :mod:`airflow.providers.amazon.aws.hooks.base_aws`.
+
+Service operators and hooks
+'''''''''''''''''''''''''''
+
+These integrations allow you to perform various operations within Amazon Web Services.
+
+.. operators-hooks-ref::
+   :tags: aws
+   :header-separator: "
+
+Transfer operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to copy data from/to Amazon Web Services.
+
+.. transfers-ref::
+   :tags: aws
+   :header-separator: "
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/azure.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/azure.rst
new file mode 100644
index 0000000..b2eb937
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/azure.rst
@@ -0,0 +1,42 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Azure: Microsoft Azure
+----------------------
+
+Airflow has limited support for `Microsoft Azure <https://azure.microsoft.com/>`__.
+
+Some hooks are based on :mod:`airflow.providers.microsoft.azure.hooks.base_azure`
+which authenticate Azure's Python SDK Clients.
+
+Service operators and hooks
+'''''''''''''''''''''''''''
+
+These integrations allow you to perform various operations within Microsoft Azure.
+
+.. operators-hooks-ref::
+   :tags: azure
+   :header-separator: "
+
+Transfer operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to copy data from/to Microsoft Azure.
+
+.. transfers-ref::
+   :tags: azure
+   :header-separator: "
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/google.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/google.rst
new file mode 100644
index 0000000..b3435fe
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/google.rst
@@ -0,0 +1,83 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+.. _Google:
+
+Google
+------
+
+Airflow has support for the `Google service <https://developer.google.com/>`__.
+
+All hooks are based on :class:`airflow.providers.google.common.hooks.base_google.GoogleBaseHook`. Some integrations
+also use :mod:`airflow.providers.google.common.hooks.discovery_api`.
+
+See the :doc:`Google Cloud connection type <apache-airflow-providers-google:connections/gcp>` documentation to
+configure connections to Google services.
+
+.. _GCP:
+
+Google Cloud
+''''''''''''
+
+Airflow has extensive support for the `Google Cloud <https://cloud.google.com/>`__.
+
+.. note::
+    You can learn how to use Google Cloud integrations by analyzing the
+    `source code of the Google Cloud example DAGs
+    <https://github.com/apache/airflow/tree/master/airflow/providers/google/cloud/example_dags/>`_
+
+
+Service operators and hooks
+"""""""""""""""""""""""""""
+
+These integrations allow you to perform various operations within Google Cloud.
+
+.. operators-hooks-ref::
+   :tags: gcp
+   :header-separator: !
+
+
+Transfer operators and hooks
+""""""""""""""""""""""""""""
+
+These integrations allow you to copy data from/to Google Cloud.
+
+.. transfers-ref::
+   :tags: gcp
+   :header-separator: !
+
+
+Google Marketing Platform
+'''''''''''''''''''''''''
+
+.. note::
+    You can learn how to use Google Marketing Platform integrations by analyzing the
+    `source code <https://github.com/apache/airflow/tree/master/airflow/providers/google/marketing_platform/example_dags/>`_
+    of the example DAGs.
+
+
+.. operators-hooks-ref::
+   :tags: gmp
+   :header-separator: !
+
+
+Other Google operators and hooks
+''''''''''''''''''''''''''''''''
+
+.. operators-hooks-ref::
+   :tags: google
+   :header-separator: !
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/index.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/index.rst
new file mode 100644
index 0000000..2ec3571
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/index.rst
@@ -0,0 +1,26 @@
+
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Operators and Hooks Reference
+=============================
+
+.. toctree::
+    :maxdepth: 2
+    :glob:
+
+    *
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/protocol.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/protocol.rst
new file mode 100644
index 0000000..650fdf3
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/protocol.rst
@@ -0,0 +1,38 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Protocol integrations
+---------------------
+
+Protocol operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to perform various operations within various services using standardized
+communication protocols or interfaces.
+
+.. operators-hooks-ref::
+   :tags: protocol
+   :header-separator: "
+
+Transfer operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to copy data.
+
+.. transfers-ref::
+   :tags: protocol
+   :header-separator: "
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/services.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/services.rst
new file mode 100644
index 0000000..ac46230
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/services.rst
@@ -0,0 +1,38 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Services
+--------
+
+Service operators and hooks
+'''''''''''''''''''''''''''
+
+These integrations allow you to perform various operations within various services.
+
+.. operators-hooks-ref::
+   :tags: service
+   :header-separator: "
+
+
+Transfer operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to copy data from/to various services.
+
+.. transfers-ref::
+   :tags: service
+   :header-separator: "
diff --git a/docs/apache-airflow-providers/operators-and-hooks-ref/software.rst b/docs/apache-airflow-providers/operators-and-hooks-ref/software.rst
new file mode 100644
index 0000000..5b7b240
--- /dev/null
+++ b/docs/apache-airflow-providers/operators-and-hooks-ref/software.rst
@@ -0,0 +1,38 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Software integrations
+---------------------
+
+Software operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to perform various operations using various software.
+
+.. operators-hooks-ref::
+   :tags: software
+   :header-separator: "
+
+
+Transfer operators and hooks
+''''''''''''''''''''''''''''
+
+These integrations allow you to copy data.
+
+.. transfers-ref::
+   :tags: software
+   :header-separator: "
diff --git a/docs/provider-packages-ref.rst b/docs/apache-airflow-providers/packages-ref.rst
similarity index 95%
rename from docs/provider-packages-ref.rst
rename to docs/apache-airflow-providers/packages-ref.rst
index 767fd5a..b818acd 100644
--- a/docs/provider-packages-ref.rst
+++ b/docs/apache-airflow-providers/packages-ref.rst
@@ -18,7 +18,7 @@
 Providers packages reference
 ''''''''''''''''''''''''''''
 
-Here's the list of the :doc:`provider packages <provider-packages>` and what they enable:
+Here's the list of the provider packages and what they enable:
 
 
 .. contents:: :local:
diff --git a/docs/build_docs.py b/docs/build_docs.py
index 63325c7..f04c6fa 100755
--- a/docs/build_docs.py
+++ b/docs/build_docs.py
@@ -73,7 +73,12 @@ class AirflowDocsBuilder:
 
     @property
     def _out_dir(self) -> str:
-        return f"{DOCS_DIR}/_build/docs/{self.package_name}/latest"
+        if self.package_name == 'apache-airflow-providers':
+            # Disable versioning. This documentation does not describe any released product, so we can
+            # update it as needed, e.g. with each new release of the provider packages.
+            return f"{DOCS_DIR}/_build/docs/{self.package_name}"
+        else:
+            return f"{DOCS_DIR}/_build/docs/{self.package_name}/latest"
 
     @property
     def _src_dir(self) -> str:
@@ -82,7 +87,9 @@ class AirflowDocsBuilder:
         #  to /airflow/ to keep the directory structure more maintainable.
         if self.package_name == 'apache-airflow':
             return DOCS_DIR
-        elif self.package_name.startswith('apache-airflow-providers'):
+        elif self.package_name.startswith('apache-airflow-providers-') or (
+            self.package_name == 'apache-airflow-providers'
+        ):
             return f"{DOCS_DIR}/{self.package_name}"
         else:
             raise Exception(F"Unsupported package: {self.package_name}")
@@ -186,14 +193,14 @@ class AirflowDocsBuilder:
 def get_available_packages():
     """Get list of all available packages to build."""
     provider_package_names = [provider['package-name'] for provider in ALL_PROVIDER_YAMLS]
-    return ["apache-airflow", *provider_package_names]
+    return ["apache-airflow", "apache-airflow-providers", *provider_package_names]
 
 
 def _get_parser():
     available_packages_list = " * " + "\n * ".join(get_available_packages())
     parser = argparse.ArgumentParser(
         description='Builds documentation and runs spell checking',
-        epilog=f"List of supported packages:\n{available_packages_list}" "",
+        epilog=f"List of supported documentation packages:\n{available_packages_list}" "",
     )
     parser.formatter_class = argparse.RawTextHelpFormatter
     parser.add_argument(
@@ -201,8 +208,9 @@ def _get_parser():
     )
     parser.add_argument(
         "--package-filter",
+        action="append",
         help=(
-            "Filter specifying for which packages the documentation is to be built. Wildcaard is supported."
+            "Filter specifying for which packages the documentation is to be built. Wildcards are supported."
         ),
     )
     parser.add_argument('--docs-only', dest='docs_only', action='store_true', help='Only build documentation')
@@ -219,6 +227,7 @@ def build_docs_for_packages(
     all_build_errors: Dict[str, List[DocBuildError]] = defaultdict(list)
     all_spelling_errors: Dict[str, List[SpellingError]] = defaultdict(list)
     for package_name in current_packages:
+        print("#" * 20, package_name, "#" * 20)
         builder = AirflowDocsBuilder(package_name=package_name)
         builder.clean_files()
         if not docs_only:
@@ -281,13 +290,14 @@ def main():
     docs_only = args.docs_only
     spellcheck_only = args.spellcheck_only
     disable_checks = args.disable_checks
-    package_filter = args.package_filter
+    package_filters = args.package_filter
 
-    print("Current package filter: ", package_filter)
+    print("Current package filters: ", package_filters)
     current_packages = (
-        fnmatch.filter(available_packages, package_filter) if package_filter else available_packages
+        [p for p in available_packages if any(fnmatch.fnmatch(p, f) for f in package_filters)]
+        if package_filters
+        else available_packages
     )
-
     print(f"Documentation will be built for {len(current_packages)} package(s): {current_packages}")
 
     all_build_errors: Dict[Optional[str], List[DocBuildError]] = {}
@@ -301,6 +311,29 @@ def main():
         all_build_errors.update(package_build_errors)
     if package_spelling_errors:
         all_spelling_errors.update(package_spelling_errors)
+    to_retry_packages = [
+        package_name
+        for package_name, errors in package_build_errors.items()
+        if any(
+            'failed to reach any of the inventories with the following issues' in e.message for e in errors
+        )
+    ]
+    if to_retry_packages:
+        for package_name in to_retry_packages:
+            if package_name in all_build_errors:
+                del all_build_errors[package_name]
+            if package_name in all_spelling_errors:
+                del all_spelling_errors[package_name]
+
+        package_build_errors, package_spelling_errors = build_docs_for_packages(
+            current_packages=to_retry_packages,
+            docs_only=docs_only,
+            spellcheck_only=spellcheck_only,
+        )
+        if package_build_errors:
+            all_build_errors.update(package_build_errors)
+        if package_spelling_errors:
+            all_spelling_errors.update(package_spelling_errors)
 
     if not disable_checks:
         general_errors = []
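
For orientation on the docs/build_docs.py changes above: --package-filter becomes repeatable
(action="append"), and the selection in main() now keeps a package when it matches any of the
given glob patterns. A minimal, self-contained sketch of that matching logic (the helper name
select_packages is invented here for illustration):

    import fnmatch
    from typing import List, Optional

    def select_packages(available: List[str], filters: Optional[List[str]]) -> List[str]:
        """Keep a package if it matches any of the repeatable --package-filter globs."""
        if not filters:
            return available
        return [pkg for pkg in available if any(fnmatch.fnmatch(pkg, flt) for flt in filters)]

    packages = ["apache-airflow", "apache-airflow-providers", "apache-airflow-providers-google"]
    print(select_packages(packages, ["apache-airflow", "apache-airflow-providers-google"]))
    # ['apache-airflow', 'apache-airflow-providers-google']

The same main() also gains a one-shot retry: any package whose build errors mention unreachable
intersphinx inventories has its recorded errors dropped and is rebuilt once before the final
error report.
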
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 27e6dea..d6815ba 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -40,7 +40,7 @@ For example, a simple DAG could consist of three tasks: A, B, and C. It could
 say that A has to run successfully before B can run, but C can run anytime. It
 could say that task A times out after 5 minutes, and B can be restarted up to 5
 times in case it fails. It might also say that the workflow will run every night
-at 10pm, but shouldn't start until a certain date.
+at 10pm, but should not start until a certain date.
 
 In this way, a DAG describes *how* you want to carry out your workflow; but
 notice that we haven't said anything about *what* we actually want to do! A, B,
@@ -805,7 +805,7 @@ methods.
 
 It is also possible to override the ``orm_deserialize_value`` method which is used for deserialization when
 recreating ORM XCom object. This happens every time we query the XCom table, for example when we want to populate
-XCom list view in webserver. If your XCom backend performs expensive operations, or has large values that aren't
+XCom list view in webserver. If your XCom backend performs expensive operations, or has large values that are not
 useful to show in such a view, override this method to provide an alternative representation. By default Airflow will
 use ``BaseXCom.orm_deserialize_value`` method which returns the value stored in Airflow database.
 
diff --git a/docs/conf.py b/docs/conf.py
index 8d63875..4e48c46 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -34,7 +34,7 @@
 import glob
 import os
 import sys
-from typing import List
+from typing import List, Optional
 
 import yaml
 
@@ -55,10 +55,11 @@ ROOT_DIR = os.path.abspath(os.path.join(CONF_DIR, os.pardir))
 
 # By default (e.g. on RTD), build docs for `airflow` package
 PACKAGE_NAME = os.environ.get('AIRFLOW_PACKAGE_NAME', 'apache-airflow')
+PACKAGE_DIR: Optional[str]
 if PACKAGE_NAME == 'apache-airflow':
     PACKAGE_DIR = os.path.join(ROOT_DIR, 'airflow')
     PACKAGE_VERSION = airflow.__version__
-else:
+elif PACKAGE_NAME.startswith('apache-airflow-providers-'):
     from provider_yaml_utils import load_package_data  # pylint: disable=no-name-in-module
 
     ALL_PROVIDER_YAMLS = load_package_data()
@@ -72,9 +73,13 @@ else:
         raise Exception(f"Could not find provider.yaml file for package: {PACKAGE_NAME}")
     PACKAGE_DIR = CURRENT_PROVIDER['package-dir']
     PACKAGE_VERSION = 'master'
+else:
+    PACKAGE_DIR = None
+    PACKAGE_VERSION = 'master'
 # Adds to environment variables for easy access from other plugins like airflow_internsphinx.
 os.environ['AIRFLOW_PACKAGE_NAME'] = PACKAGE_NAME
-os.environ['AIRFLOW_PACKAGE_DIR'] = PACKAGE_DIR
+if PACKAGE_DIR:
+    os.environ['AIRFLOW_PACKAGE_DIR'] = PACKAGE_DIR
 os.environ['AIRFLOW_PACKAGE_VERSION'] = PACKAGE_VERSION
 
 
@@ -107,7 +112,6 @@ extensions = [
     'sphinx.ext.viewcode',
     'sphinxarg.ext',
     'sphinx.ext.intersphinx',
-    'autoapi.extension',
     'exampleinclude',
     'docroles',
     'removemarktransform',
@@ -122,15 +126,21 @@ if PACKAGE_NAME == 'apache-airflow':
             'sphinx.ext.graphviz',
             'sphinxcontrib.httpdomain',
             'sphinxcontrib.httpdomain',
-            'providers_packages_ref',
-            'operators_and_hooks_ref',
             # First, generate redoc
             'sphinxcontrib.redoc',
             # Second, update redoc script
             "sphinx_script_update",
         ]
     )
-
+if PACKAGE_NAME == "apache-airflow-providers":
+    extensions.extend(
+        [
+            'operators_and_hooks_ref',
+            'providers_packages_ref',
+        ]
+    )
+else:
+    extensions.append('autoapi.extension')
 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
 exclude_patterns: List[str]
@@ -149,12 +159,15 @@ if PACKAGE_NAME == 'apache-airflow':
         'howto/operator/google/_partials',
         'howto/operator/microsoft/_partials',
         'apache-airflow-providers-*/',
+        'apache-airflow-providers',
         'README.rst',
     ] + glob.glob('apache-airflow-providers-*')
-else:
+elif PACKAGE_NAME.startswith('apache-airflow-providers-'):
     exclude_patterns = [
         '/_partials/',
     ]
+else:
+    exclude_patterns = []
 
 
 def _get_rst_filepath_from_path(filepath: str):
@@ -289,7 +302,7 @@ if airflow_theme_is_available:
 # Jinja context
 if PACKAGE_NAME == 'apache-airflow':
     jinja_contexts = {'config_ctx': {"configs": default_config_yaml()}}
-else:
+elif PACKAGE_NAME.startswith('apache-airflow-providers-'):
 
     def _load_config():
         templates_dir = os.path.join(PACKAGE_DIR, 'config_templates')
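
The conf.py hunks above reduce to a three-way dispatch on which documentation package is being
built. Condensed into a sketch (extension lists abbreviated, not the full file):

    import os

    PACKAGE_NAME = os.environ.get('AIRFLOW_PACKAGE_NAME', 'apache-airflow')

    extensions = ['sphinx.ext.intersphinx']  # shared extensions, abbreviated here
    if PACKAGE_NAME == "apache-airflow-providers":
        # The new umbrella package renders the generated operator/hook and
        # package reference tables instead of running API autogeneration.
        extensions.extend(['operators_and_hooks_ref', 'providers_packages_ref'])
    else:
        # Core apache-airflow and the per-provider packages keep autoapi.
        extensions.append('autoapi.extension')
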
diff --git a/docs/exts/airflow_intersphinx.py b/docs/exts/airflow_intersphinx.py
index 0fb2523..b42253d 100644
--- a/docs/exts/airflow_intersphinx.py
+++ b/docs/exts/airflow_intersphinx.py
@@ -91,6 +91,23 @@ def _generate_provider_intersphinx_mapping():
             ),
         )
 
+    if os.environ.get('AIRFLOW_PACKAGE_NAME') != 'apache-airflow-providers':
+        airflow_mapping['apache-airflow-providers'] = (
+            # base URI
+            '/docs/apache-airflow-providers/',
+            # Index locations list
+            # If passed None, this will try to fetch the index from `[base_url]/objects.inv`
+            # If we pass a path containing `://` then we will try to index from the given address.
+            # Otherwise, it will try to read the local file
+            #
+            # In this case, the local index will be read. If unsuccessful, the remote index
+            # will be fetched.
+            (
+                f'{DOCS_DIR}/_build/docs/apache-airflow-providers/objects.inv',
+                f'{S3_DOC_URL}/docs/apache-airflow-providers/objects.inv',
+            ),
+        )
+
     return airflow_mapping
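
The entry added above follows Sphinx's intersphinx convention of a (base_uri, inventory_locations)
pair, where the locations are tried in order. A sketch of the resulting shape (the concrete local
path and S3 URL here are illustrative assumptions, not values from the commit):

    intersphinx_mapping = {
        'apache-airflow-providers': (
            # Base URI that resolved cross-references will point at.
            '/docs/apache-airflow-providers/',
            # Inventory locations, tried in order: the locally built
            # objects.inv first, then the published copy as a fallback.
            (
                '/opt/airflow/docs/_build/docs/apache-airflow-providers/objects.inv',  # assumed local path
                'https://apache-airflow-docs.s3.amazonaws.com/docs/apache-airflow-providers/objects.inv',  # assumed URL
            ),
        ),
    }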
 
 
diff --git a/docs/exts/docs_build/dev_index_template.html.jinja2 b/docs/exts/docs_build/dev_index_template.html.jinja2
index e8c7f0d..9bf04da 100644
--- a/docs/exts/docs_build/dev_index_template.html.jinja2
+++ b/docs/exts/docs_build/dev_index_template.html.jinja2
@@ -38,7 +38,7 @@
   </div>
   <div class="row">
     <div class="col-md-12">
-      <h2>Providers packages</h2>
+      <h2><a href="/docs/apache-airflow-providers/index.html">Providers packages</a></h2>
       <p>
         Providers packages include integrations with third party integrations. They are updated independently of the Apache Airflow core.
       </p>
diff --git a/docs/howto/connection/azure.rst b/docs/howto/connection/azure.rst
index ba107a2..a76c75c 100644
--- a/docs/howto/connection/azure.rst
+++ b/docs/howto/connection/azure.rst
@@ -22,7 +22,7 @@
 Microsoft Azure Connection
 ==========================
 
-The Microsoft Azure connection type enables the :ref:`Azure Integrations <Azure>`.
+The Microsoft Azure connection type enables the Azure Integrations.
 
 Authenticating to Azure
 -----------------------
diff --git a/docs/howto/initialize-database.rst b/docs/howto/initialize-database.rst
index 3197e08..2b8d309 100644
--- a/docs/howto/initialize-database.rst
+++ b/docs/howto/initialize-database.rst
@@ -81,7 +81,7 @@ in the Postgres documentation to learn more.
 Configure Airflow's database connection string
 ----------------------------------------------
 
-Once you've setup your database to host Airflow, you'll need to alter the
+Once you have set up your database to host Airflow, you'll need to alter the
 SqlAlchemy connection string located in ``sql_alchemy_conn`` option in ``[core]`` section in your configuration file
 ``$AIRFLOW_HOME/airflow.cfg``.
 
@@ -90,7 +90,7 @@ You can also define connection URI using ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` env
 Configure a worker that supports parallelism
 --------------------------------------------
 
-You should then also change the "executor" option in the ``[core]`` option to use "LocalExecutor", an executor that can parallelize task instances locally.
+You should then also change the ``executor`` option in the ``[core]`` option to use ``LocalExecutor``, an executor that can parallelize task instances locally.
 
 Initialize the database
 -----------------------
diff --git a/docs/howto/operator/amazon/aws/emr.rst b/docs/howto/operator/amazon/aws/emr.rst
index a5362d8..87d0836 100644
--- a/docs/howto/operator/amazon/aws/emr.rst
+++ b/docs/howto/operator/amazon/aws/emr.rst
@@ -99,7 +99,7 @@ JobFlow configuration
 """""""""""""""""""""
 
 The configuration is similar to the previous example, except that we set ``'KeepJobFlowAliveWhenNoSteps': True`` because we will terminate the cluster manually.
-Also, we wouldn't specify ``Steps`` in the config when creating the cluster.
+Also, we would not specify ``Steps`` in the config when creating the cluster.
 
 Defining tasks
 """"""""""""""
diff --git a/docs/index.rst b/docs/index.rst
index 46678b3..2b1c4cf 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -78,7 +78,6 @@ Content
     license
     start
     installation
-    provider-packages
     tutorial
     tutorial_taskflow_api
     howto/index
@@ -117,4 +116,3 @@ Content
     Stable REST API <stable-rest-api-ref>
     Configurations <configurations-ref>
     Extra packages <extra-packages-ref>
-    Provider packages <provider-packages-ref>
diff --git a/docs/installation.rst b/docs/installation.rst
index beb606b..725aef1 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -78,7 +78,7 @@ the main Airflow installation.
           you have to install provider packages manually. As of Airflow 2.0.0b2 the corresponding
           provider packages are installed together with the extras.
 
-Read more about it in the `Provider Packages <#provider-packages>`_ section.
+Read more about it in the :ref:`Provider Packages <installation:provider_packages>` section.
 
 Requirements
 ''''''''''''
@@ -127,6 +127,8 @@ these extra dependencies.
 
 For the list of the subpackages and what they enable, see: :doc:`extra-packages-ref`.
 
+.. _installation:provider_packages:
+
 Provider packages
 '''''''''''''''''
 
@@ -134,9 +136,9 @@ Unlike Apache Airflow 1.10, the Airflow 2.0 is delivered in multiple, separate,
 The core of Airflow scheduling system is delivered as ``apache-airflow`` package and there are around
 60 providers packages which can be installed separately as so called "Airflow Provider packages".
 The default Airflow installation doesn't have many integrations and you have to install them yourself.
-For more information, see: :doc:`provider-packages`
+For more information, see: :doc:`apache-airflow-providers:index`
 
-For the list of the provider packages and what they enable, see: :doc:`provider-packages-ref`.
+For the list of the provider packages and what they enable, see: :doc:`apache-airflow-providers:packages-ref`.
 
 Initializing Airflow Database
 '''''''''''''''''''''''''''''
diff --git a/docs/logging-monitoring/logging-tasks.rst b/docs/logging-monitoring/logging-tasks.rst
index aaec284..c5457b8 100644
--- a/docs/logging-monitoring/logging-tasks.rst
+++ b/docs/logging-monitoring/logging-tasks.rst
@@ -193,7 +193,7 @@ Follow the steps below to enable Azure Blob Storage logging:
         remote_log_conn_id = <name of the Azure Blob Storage connection>
 
 #. Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
-#. Verify that logs are showing up for newly executed tasks in the bucket you've defined.
+#. Verify that logs are showing up for newly executed tasks in the bucket you have defined.
 
 .. _write-logs-elasticsearch:
 
diff --git a/docs/operators-and-hooks-ref.rst b/docs/operators-and-hooks-ref.rst
index d735ee7..fc10238 100644
--- a/docs/operators-and-hooks-ref.rst
+++ b/docs/operators-and-hooks-ref.rst
@@ -18,14 +18,9 @@
 Operators and Hooks Reference
 =============================
 
-.. contents:: Content
-  :local:
-  :depth: 1
-
-.. _fundamentals:
-
-Fundamentals
-------------
+Here's the list of the operators and hooks which are available in this release in the ``apache-airflow`` package.
+Airflow has many more integrations available for separate installation as provider packages. For details, see:
+:doc:`apache-airflow-providers:operators-and-hooks-ref/index`.
 
 **Base:**
 
@@ -133,255 +128,3 @@ Fundamentals
 
    * - :mod:`airflow.hooks.filesystem`
      -
-
-
-.. _Apache:
-
-ASF: Apache Software Foundation
--------------------------------
-
-Airflow supports various software created by `Apache Software Foundation <https://www.apache.org/foundation/>`__.
-
-Software operators and hooks
-''''''''''''''''''''''''''''
-These integrations allow you to perform various operations within software developed by Apache Software
-Foundation.
-
-.. operators-hooks-ref::
-   :tags: apache
-   :header-separator: "
-
-
-Transfer operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to copy data from/to software developed by Apache Software
-Foundation.
-
-.. transfers-ref::
-   :tags: apache
-   :header-separator: "
-
-.. _Azure:
-
-Azure: Microsoft Azure
-----------------------
-
-Airflow has limited support for `Microsoft Azure <https://azure.microsoft.com/>`__.
-
-Some hooks are based on :mod:`airflow.providers.microsoft.azure.hooks.base_azure`
-which authenticate Azure's Python SDK Clients.
-
-Service operators and hooks
-'''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations within the Microsoft Azure.
-
-.. operators-hooks-ref::
-   :tags: azure
-   :header-separator: "
-
-Transfer operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to copy data from/to Microsoft Azure.
-
-.. transfers-ref::
-   :tags: azure
-   :header-separator: "
-
-
-.. _AWS:
-
-AWS: Amazon Web Services
-------------------------
-
-Airflow has support for `Amazon Web Services <https://aws.amazon.com/>`__.
-
-All hooks are based on :mod:`airflow.providers.amazon.aws.hooks.base_aws`.
-
-Service operators and hooks
-'''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations within the Amazon Web Services.
-
-.. operators-hooks-ref::
-   :tags: aws
-   :header-separator: "
-
-Transfer operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to copy data from/to Amazon Web Services.
-
-.. transfers-ref::
-   :tags: aws
-   :header-separator: "
-
-.. _Google:
-
-Google
-------
-
-Airflow has support for the `Google service <https://developer.google.com/>`__.
-
-All hooks are based on :class:`airflow.providers.google.common.hooks.base_google.GoogleBaseHook`. Some integration
-also use :mod:`airflow.providers.google.common.hooks.discovery_api`.
-
-See the :doc:`Google Cloud connection type <apache-airflow-providers-google:connections/gcp>` documentation to
-configure connections to Google services.
-
-.. _GCP:
-
-Google Cloud
-''''''''''''
-
-Airflow has extensive support for the `Google Cloud <https://cloud.google.com/>`__.
-
-.. note::
-    You can learn how to use Google Cloud integrations by analyzing the
-    `source code of the Google Cloud example DAGs
-    <https://github.com/apache/airflow/tree/master/airflow/providers/google/cloud/example_dags/>`_
-
-
-Service operators and hooks
-"""""""""""""""""""""""""""
-
-These integrations allow you to perform various operations within the Google Cloud.
-
-.. operators-hooks-ref::
-   :tags: gcp
-   :header-separator: !
-
-
-Transfer operators and hooks
-""""""""""""""""""""""""""""
-
-These integrations allow you to copy data from/to Google Cloud.
-
-.. transfers-ref::
-   :tags: gcp
-   :header-separator: !
-
-
-Google Marketing Platform
-'''''''''''''''''''''''''
-
-.. note::
-    You can learn how to use Google Marketing Platform integrations by analyzing the
-    `source code <https://github.com/apache/airflow/tree/master/airflow/providers/google/marketing_platform/example_dags/>`_
-    of the example DAGs.
-
-
-.. operators-hooks-ref::
-   :tags: gmp
-   :header-separator: !
-
-
-Other Google operators and hooks
-''''''''''''''''''''''''''''''''
-
-.. operators-hooks-ref::
-   :tags: google
-   :header-separator: !
-
-
-.. _yc_service:
-
-Yandex.Cloud
-------------
-
-Airflow has a limited support for the `Yandex.Cloud <https://cloud.yandex.com/>`__.
-
-See the :doc:`Yandex.Cloud connection type <howto/connection/yandexcloud>` documentation to
-configure connections to Yandex.Cloud.
-
-All hooks are based on :class:`airflow.providers.yandex.hooks.yandex.YandexCloudBaseHook`.
-
-.. note::
-    You can learn how to use Yandex.Cloud integrations by analyzing the
-    `example DAG <https://github.com/apache/airflow/blob/master/airflow/providers/yandex/example_dags/example_yandexcloud_dataproc.py>`_
-
-Service operators and hooks
-'''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations within the Yandex.Cloud.
-
-
-.. operators-hooks-ref::
-   :tags: yandex
-   :header-separator: "
-
-.. _service:
-
-Service integrations
---------------------
-
-Service operators and hooks
-'''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations within various services.
-
-.. operators-hooks-ref::
-   :tags: service
-   :header-separator: "
-
-
-Transfer operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations within various services.
-
-.. transfers-ref::
-   :tags: service
-   :header-separator: "
-
-
-.. _software:
-
-Software integrations
----------------------
-
-Software operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations using various software.
-
-.. operators-hooks-ref::
-   :tags: software
-   :header-separator: "
-
-
-Transfer operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to copy data.
-
-.. transfers-ref::
-   :tags: software
-   :header-separator: "
-
-
-.. _protocol:
-
-Protocol integrations
----------------------
-
-Protocol operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to perform various operations within various services using standardized
-communication protocols or interface.
-
-.. operators-hooks-ref::
-   :tags: protocol
-   :header-separator: "
-
-Transfer operators and hooks
-''''''''''''''''''''''''''''
-
-These integrations allow you to copy data.
-
-.. transfers-ref::
-   :tags: protocol
-   :header-separator: "
diff --git a/docs/rest-api-ref.rst b/docs/rest-api-ref.rst
index 7696b62..412152e 100644
--- a/docs/rest-api-ref.rst
+++ b/docs/rest-api-ref.rst
@@ -64,7 +64,7 @@ Endpoints
 
 .. http:get:: /api/experimental/dags/<string:dag_id>/dag_runs/<string:execution_date>
 
-  Returns a JSON with a dag_run's public instance variables. The format for the <string:execution_date> is expected to be "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15".
+  Returns a JSON with a dag_run's public instance variables. The format for the ``<string:execution_date>`` is expected to be ``YYYY-mm-DDTHH:MM:SS``, for example: ``"2016-11-16T11:34:15"``.
 
 
 .. http:get:: /api/experimental/test
@@ -79,7 +79,7 @@ Endpoints
 
 .. http:get:: /api/experimental/dags/<DAG_ID>/dag_runs/<string:execution_date>/tasks/<TASK_ID>
 
-  Returns a JSON with a task instance's public instance variables. The format for the <string:execution_date> is expected to be "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15".
+  Returns a JSON with a task instance's public instance variables. The format for the ``<string:execution_date>`` is expected to be ``YYYY-mm-DDTHH:MM:SS``, for example: ``"2016-11-16T11:34:15"``.
 
 
 .. http:get:: /api/experimental/dags/<DAG_ID>/paused/<string:paused>
diff --git a/docs/tutorial.rst b/docs/tutorial.rst
index b9b350a..d5e8523 100644
--- a/docs/tutorial.rst
+++ b/docs/tutorial.rst
@@ -352,7 +352,7 @@ which are used to populate the run schedule with task instances from this dag.
 
 What's Next?
 -------------
-That's it, you've written, tested and backfilled your very first Airflow
+That's it, you have written, tested and backfilled your very first Airflow
 pipeline. Merging your code into a code repository that has a master scheduler
 running against it should get it to get triggered and run every day.