Posted to commits@airflow.apache.org by po...@apache.org on 2021/09/17 16:11:41 UTC

[airflow] branch v2-1-test updated (614858f -> 7c3ffcc)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


    from 614858f  Update Changelog 2.1.4rc2
     new 3b796a1  Improves installing from sources pages for all components (#18251)
     new 7c3ffcc  Refactor installation pages (#18282)

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 dev/README_RELEASE_AIRFLOW.md                      |   9 +-
 dev/README_RELEASE_HELM_CHART.md                   | 602 +++++++++++++++++++++
 dev/README_RELEASE_PROVIDER_PACKAGES.md            |  10 +-
 docs/README.rst                                    |   2 +-
 .../operators/_partials/prerequisite_tasks.rst     |   2 +-
 .../operators/_partials/prerequisite_tasks.rst     |   2 +-
 .../operators/_partials/prerequisite_tasks.rst     |   2 +-
 docs/apache-airflow-providers/index.rst            |   9 +-
 .../installing-from-pypi.rst                       |  45 ++
 .../installing-from-sources.rst                    |   5 +
 docs/apache-airflow/extra-packages-ref.rst         |   2 +-
 docs/apache-airflow/index.rst                      |  11 +-
 docs/apache-airflow/installation.rst               | 336 ------------
 docs/apache-airflow/installation/dependencies.rst  |  92 ++++
 docs/apache-airflow/installation/index.rst         | 317 +++++++++++
 .../installation/installing-from-pypi.rst          | 207 +++++++
 .../{ => installation}/installing-from-sources.rst |   3 +-
 .../installation/prerequisites.rst}                |  32 +-
 .../installation/setting-up-the-database.rst}      |  41 +-
 .../installation/supported-versions.rst            |  67 +++
 docs/apache-airflow/redirects.txt                  |   5 +
 .../index.rst}                                     | 322 ++++++-----
 .../{ => upgrading-from-1-10}/upgrade-check.rst    |   0
 docs/build_docs.py                                 |  55 +-
 docs/conf.py                                       |   3 +-
 docs/docker-stack/build.rst                        |   3 +-
 docs/exts/extra_files_with_substitutions.py        |   8 +-
 docs/helm-chart/index.rst                          |  26 +-
 28 files changed, 1594 insertions(+), 624 deletions(-)
 create mode 100644 dev/README_RELEASE_HELM_CHART.md
 create mode 100644 docs/apache-airflow-providers/installing-from-pypi.rst
 delete mode 100644 docs/apache-airflow/installation.rst
 create mode 100644 docs/apache-airflow/installation/dependencies.rst
 create mode 100644 docs/apache-airflow/installation/index.rst
 create mode 100644 docs/apache-airflow/installation/installing-from-pypi.rst
 rename docs/apache-airflow/{ => installation}/installing-from-sources.rst (98%)
 copy docs/{apache-airflow-providers/core-extensions/auth-backends.rst => apache-airflow/installation/prerequisites.rst} (52%)
 copy docs/{apache-airflow-providers-slack/connections/slack.rst => apache-airflow/installation/setting-up-the-database.rst} (52%)
 create mode 100644 docs/apache-airflow/installation/supported-versions.rst
 rename docs/apache-airflow/{upgrading-to-2.rst => upgrading-from-1-10/index.rst} (85%)
 rename docs/apache-airflow/{ => upgrading-from-1-10}/upgrade-check.rst (100%)

[airflow] 02/02: Refactor installation pages (#18282)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7c3ffccb4709317d292b79b4fcda9064ce4e7885
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Fri Sep 17 17:55:27 2021 +0200

    Refactor installation pages (#18282)
    
    This PR splits and improves the installation-related documentation
    for Airflow. The "installation" page had become overloaded
    with everything-but-the-kitchen-sink and it became rather
    difficult to navigate and link to relevant sections.

    Also, there was no single page giving an overview of the available
    installation methods, the cases where each installation method works
    best, what is involved in following each method in terms of
    maintenance, and what expectations users should have about what the
    Apache Airflow Community provides.
    
    The PR leaves the installation page as essentially a summary of
    all installation methods, with all of the above explained, plus
    links to detailed pages covering prerequisites, dependencies,
    database setup and supported versions.
    
    (cherry picked from commit 4308a8c364d410ea8c32d2af7cc8ca3261054696)
---
 dev/README_RELEASE_AIRFLOW.md                      |   9 +-
 dev/README_RELEASE_HELM_CHART.md                   | 602 +++++++++++++++++++++
 dev/README_RELEASE_PROVIDER_PACKAGES.md            |  10 +-
 docs/README.rst                                    |   2 +-
 .../operators/_partials/prerequisite_tasks.rst     |   2 +-
 .../operators/_partials/prerequisite_tasks.rst     |   2 +-
 .../operators/_partials/prerequisite_tasks.rst     |   2 +-
 docs/apache-airflow-providers/index.rst            |   9 +-
 .../installing-from-pypi.rst                       |  45 ++
 .../installing-from-sources.rst                    |   5 +
 docs/apache-airflow/extra-packages-ref.rst         |   2 +-
 docs/apache-airflow/index.rst                      |  11 +-
 docs/apache-airflow/installation.rst               | 336 ------------
 docs/apache-airflow/installation/dependencies.rst  |  92 ++++
 docs/apache-airflow/installation/index.rst         | 317 +++++++++++
 .../installation/installing-from-pypi.rst          | 207 +++++++
 .../{ => installation}/installing-from-sources.rst |   3 +-
 .../installation/prerequisites.rst}                |  26 +-
 .../installation/setting-up-the-database.rst}      |  23 +-
 .../installation/supported-versions.rst            |  67 +++
 docs/apache-airflow/redirects.txt                  |   5 +
 .../index.rst}                                     | 322 ++++++-----
 .../{ => upgrading-from-1-10}/upgrade-check.rst    |   0
 docs/build_docs.py                                 |  55 +-
 docs/conf.py                                       |   3 +-
 docs/docker-stack/build.rst                        |   3 +-
 docs/exts/extra_files_with_substitutions.py        |   8 +-
 docs/helm-chart/index.rst                          |  32 +-
 28 files changed, 1592 insertions(+), 608 deletions(-)

diff --git a/dev/README_RELEASE_AIRFLOW.md b/dev/README_RELEASE_AIRFLOW.md
index 247835a..69d3735 100644
--- a/dev/README_RELEASE_AIRFLOW.md
+++ b/dev/README_RELEASE_AIRFLOW.md
@@ -703,14 +703,9 @@ Dear Airflow community,
 
 I'm happy to announce that Airflow ${VERSION} was just released.
 
-The source release, as well as the binary "sdist" release, are available
-here:
+The released sources and packages can be downloaded via https://airflow.apache.org/installation/installing-from-sources.html
 
-https://dist.apache.org/repos/dist/release/airflow/${VERSION}/
-
-We also made this version available on PyPI for convenience (`pip install apache-airflow`):
-
-https://pypi.python.org/pypi/apache-airflow
+Other installation methods are described in https://airflow.apache.org/installation/
 
 The documentation is available on:
 https://airflow.apache.org/
diff --git a/dev/README_RELEASE_HELM_CHART.md b/dev/README_RELEASE_HELM_CHART.md
new file mode 100644
index 0000000..89f09a5
--- /dev/null
+++ b/dev/README_RELEASE_HELM_CHART.md
@@ -0,0 +1,602 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+<!-- START doctoc generated TOC please keep comment here to allow auto update -->
+<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
+**Table of contents**
+
+- [Prepare the Apache Airflow Helm Chart Release Candidate](#prepare-the-apache-airflow-helm-chart-release-candidate)
+  - [Pre-requisites](#pre-requisites)
+  - [Build RC artifacts](#build-rc-artifacts)
+  - [Prepare Vote email on the Apache Airflow release candidate](#prepare-vote-email-on-the-apache-airflow-release-candidate)
+- [Verify the release candidate by PMCs](#verify-the-release-candidate-by-pmcs)
+  - [SVN check](#svn-check)
+  - [Licence check](#licence-check)
+  - [Signature check](#signature-check)
+  - [SHA512 sum check](#sha512-sum-check)
+- [Verify release candidates by Contributors](#verify-release-candidates-by-contributors)
+- [Publish the final release](#publish-the-final-release)
+  - [Summarize the voting for the release](#summarize-the-voting-for-the-release)
+  - [Publish release to SVN](#publish-release-to-svn)
+  - [Publish documentation](#publish-documentation)
+  - [Update `index.yaml` to Airflow Website](#update-indexyaml-to-airflow-website)
+  - [Notify developers of release](#notify-developers-of-release)
+  - [Update Announcements page](#update-announcements-page)
+  - [Remove old releases](#remove-old-releases)
+
+<!-- END doctoc generated TOC please keep comment here to allow auto update -->
+
+You can find the prerequisites to release Apache Airflow in [README.md](README.md). This document
+details the steps for releasing Helm Chart.
+
+# Prepare the Apache Airflow Helm Chart Release Candidate
+
+## Pre-requisites
+
+- Helm version >= 3.5.4
+
+
+## Build RC artifacts
+
+The Release Candidate artifacts we vote upon should be the exact ones that become the final
+release, without any modification other than renaming – i.e. the contents of the files must be
+the same between the voted release candidate and the final release.
+Because of this, the version in the built artifacts that will become the
+official Apache releases must not include the rcN suffix.
+
+- Set environment variables
+
+    ```shell
+    # Set Version
+    export VERSION=1.0.1rc1
+    export VERSION_WITHOUT_RC=${VERSION%rc?}
+
+    # Set AIRFLOW_REPO_ROOT to the path of your git repo
+    export AIRFLOW_REPO_ROOT=$(pwd)
+
+    # Example after cloning
+    git clone https://github.com/apache/airflow.git airflow
+    cd airflow
+    export AIRFLOW_REPO_ROOT=$(pwd)
+    ```
+
+- We currently release Helm Chart from `main` branch:
+
+    ```shell
+    git checkout main
+    ```
+
+- Clean the checkout: any untracked or changed files would otherwise end up in the artifacts built in the steps below
+
+    ```shell
+    git clean -fxd
+    ```
+
+- Update the Helm Chart version in `Chart.yaml`, for example: `version: 1.0.0` (without
+  the RC suffix). If the default version of Airflow differs from the current `appVersion`, update `appVersion` too.
+
+- Commit the version change.
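+
+    For example, a minimal sketch (the exact commit message is up to you):
+
+    ```shell
+    git add chart/Chart.yaml
+    git commit -m "Update Helm Chart version to ${VERSION_WITHOUT_RC}"
+    ```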
+
+- Tag your release
+
+    ```shell
+    git tag -s helm-chart/${VERSION}
+    ```
+
+- Tarball the repo
+
+    ```shell
+    git archive --format=tar.gz helm-chart/${VERSION} --prefix=airflow-chart-${VERSION_WITHOUT_RC}/ \
+        -o airflow-chart-${VERSION_WITHOUT_RC}-source.tar.gz chart .rat-excludes
+    ```
+
+- Generate chart binary
+
+    NOTE: Make sure your checkout is clean at this stage - any untracked or changed files will otherwise be included
+     in the file produced.
+
+    Replace the key email with your own email address, and adjust the keyring path if it is different.
+    (If you have not generated a key yet, generate it by following instructions on
+    http://www.apache.org/dev/openpgp.html#key-gen-generate-key)
+
+    ```shell
+    helm package chart --dependency-update --sign --key "kaxilnaik@apache.org" \
+        --keyring ~/.gnupg/secring.gpg
+    ```
+
+    Warning: GnuPG v2 stores your secret keyring in the new kbx format at the default
+    location `~/.gnupg/pubring.kbx`. If that is the case, use the following command to convert your
+    keyring to the legacy gpg format and run the above command again:
+
+    ```shell
+    gpg --export-secret-keys > ~/.gnupg/secring.gpg
+    ```
+
+    This should also generate a provenance file (for example: `airflow-1.0.0.tgz.prov`) as described in
+    https://helm.sh/docs/topics/provenance/, which can be used to verify the integrity of the Helm chart.
+
+    Verify the signed chart:
+
+    ```shell
+    helm verify airflow-${VERSION_WITHOUT_RC}.tgz --keyring ~/.gnupg/secring.gpg
+    ```
+
+    Example Output:
+
+    ```shell
+    $ helm verify airflow-${VERSION_WITHOUT_RC}.tgz --keyring ~/.gnupg/secring.gpg
+    Signed by: Kaxil Naik <ka...@apache.org>
+    Signed by: Kaxil Naik <ka...@gmail.com>
+    Using Key With Fingerprint: CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F
+    Chart Hash Verified: sha256:6185e54735e136d7d30d329cd16555a3a6c951be876aca8deac2022ab0568e53
+    ```
+
+
+- Generate SHA512/ASC
+
+    ```shell
+    ${AIRFLOW_REPO_ROOT}/dev/sign.sh airflow-chart-${VERSION_WITHOUT_RC}-source.tar.gz
+    ${AIRFLOW_REPO_ROOT}/dev/sign.sh airflow-${VERSION_WITHOUT_RC}.tgz
+    ```
+
+- Move the artifacts to the ASF dev dist repo, generate the convenience `index.yaml` and publish them
+
+  ```shell
+  # First clone the repo
+  svn checkout https://dist.apache.org/repos/dist/dev/airflow airflow-dev
+
+  # Create new folder for the release
+  cd airflow-dev/helm-chart
+  svn mkdir ${VERSION}
+
+  # Move the artifacts to svn folder
+  mv ${AIRFLOW_REPO_ROOT}/airflow-${VERSION_WITHOUT_RC}.tgz* ${VERSION}/
+  mv ${AIRFLOW_REPO_ROOT}/airflow-chart-${VERSION_WITHOUT_RC}-source.tar.gz* ${VERSION}/
+  cd ${VERSION}
+
+  ###### Generate index.yaml file - Start
+  # Download the latest index.yaml on Airflow Website
+  curl https://airflow.apache.org/index.yaml --output index.yaml
+
+  # Replace the URLs from "https://downloads.apache.org" to "https://archive.apache.org"
+  # as the downloads.apache.org only contains latest releases.
+  sed -i 's|https://downloads.apache.org/airflow/helm-chart/|https://archive.apache.org/dist/airflow/helm-chart/|' index.yaml
+
+  # Generate / Merge the new version with existing index.yaml
+  helm repo index --merge ./index.yaml . --url "https://dist.apache.org/repos/dist/dev/airflow/helm-chart/${VERSION}"
+
+  ###### Generate index.yaml file - End
+
+  # Commit the artifacts
+  svn add *
+  svn commit -m "Add artifacts for Helm Chart ${VERSION}"
+  ```
+
+- Remove old Helm Chart versions from the dev repo
+
+  ```shell
+  cd ..
+  export PREVIOUS_VERSION=1.0.0rc1
+  svn rm ${PREVIOUS_VERSION}
+  svn commit -m "Remove old Helm Chart release: ${PREVIOUS_VERSION}"
+  ```
+
+- Push Tag for the release candidate
+
+  ```shell
+  cd ${AIRFLOW_REPO_ROOT}
+  git push origin helm-chart/${VERSION}
+  ```
+
+## Prepare Vote email on the Apache Airflow release candidate
+
+- Send out a vote to the dev@airflow.apache.org mailing list:
+
+Subject:
+
+```
+[VOTE] Release Apache Airflow Helm Chart 1.0.1 based on 1.0.1rc1
+```
+
+Body:
+
+```
+Hello Apache Airflow Community,
+
+This is a call for the vote to release Helm Chart version 1.0.1.
+
+The release candidate is available at:
+https://dist.apache.org/repos/dist/dev/airflow/helm-chart/1.0.1rc1/
+
+airflow-chart-1.0.1-source.tar.gz - is the "main source release" that comes with INSTALL instructions.
+airflow-1.0.1.tgz - is the binary Helm Chart release.
+
+Public keys are available at: https://www.apache.org/dist/airflow/KEYS
+
+For convenience "index.yaml" has been uploaded (though excluded from voting), so you can also run the below commands.
+
+helm repo add apache-airflow-dev https://dist.apache.org/repos/dist/dev/airflow/helm-chart/1.0.1rc1/
+helm repo update
+helm install airflow apache-airflow-dev/airflow
+
+airflow-1.0.1.tgz.prov - is also uploaded for verifying Chart Integrity, though not strictly required for releasing the artifact based on ASF Guidelines.
+
+$ helm verify airflow-1.0.1.tgz --keyring  ~/.gnupg/secring.gpg
+Signed by: Kaxil Naik <ka...@apache.org>
+Signed by: Kaxil Naik <ka...@gmail.com>
+Using Key With Fingerprint: CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F
+Chart Hash Verified: sha256:6cd3f13fc93d60424a771a1a8a4121c4439f7b6b48fab946436da0ab70d5a507
+
+The vote will be open for at least 72 hours (2021-05-19 01:30 UTC) or until the necessary number of votes are reached.
+
+https://www.timeanddate.com/countdown/to?iso=20210519T0230&p0=136&font=cursive
+
+Please vote accordingly:
+
+[ ] +1 approve
+[ ] +0 no opinion
+[ ] -1 disapprove with the reason
+
+Only votes from PMC members are binding, but members of the community are
+encouraged to test the release and vote with "(non-binding)".
+
+For license checks, the .rat-excludes file is included, so you can run the following to verify licenses (just update $PATH_TO_RAT):
+
+tar -xvf airflow-chart-1.0.1-source.tar.gz
+cd airflow-chart-1.0.1
+java -jar $PATH_TO_RAT/apache-rat-0.13/apache-rat-0.13.jar chart -E .rat-excludes
+
+Please note that the version number excludes the `rcX` string, so it's now
+simply 1.0.1. This will allow us to rename the artifact without modifying
+the artifact checksums when we actually release.
+
+Thanks,
+<your name>
+```
+
+
+# Verify the release candidate by PMCs
+
+The PMC members should verify the releases in order to make sure the release follows the
+[Apache Legal Release Policy](http://www.apache.org/legal/release-policy.html).
+
+At least 3 (+1) votes should be recorded in accordance with
+[Votes on Package Releases](https://www.apache.org/foundation/voting.html#ReleaseVotes).
+
+The legal checks include:
+
+* checking if the packages are present in the right dist folder on svn
+* verifying if all the sources have correct licences
+* verifying if the release manager signed the releases with the right key
+* verifying if all the checksums are valid for the release
+
+## SVN check
+
+The files should be present in the sub-folder of
+[Airflow dist](https://dist.apache.org/repos/dist/dev/airflow/)
+
+The following files should be present (7 files):
+
+* `airflow-chart-${VERSION_WITHOUT_RC}-source.tar.gz` + .asc + .sha512
+* `airflow-${VERSION_WITHOUT_RC}.tgz` + .asc + .sha512
+* `airflow-${VERSION_WITHOUT_RC}.tgz.prov`
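+
+You can also list the uploaded files directly with `svn` - a quick sketch (using the folder created earlier):
+
+```shell
+svn ls https://dist.apache.org/repos/dist/dev/airflow/helm-chart/${VERSION}/
+```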
+
+As a PMC you should be able to clone the SVN repository:
+
+```shell
+svn co https://dist.apache.org/repos/dist/dev/airflow
+```
+
+Or update it if you already checked it out:
+
+```shell
+svn update .
+```
+
+## Licence check
+
+This can be done with the Apache RAT tool.
+
+* Download the latest binary release (`-bin.tar.gz`) from https://creadur.apache.org/rat/download_rat.cgi
+* Unpack it to a folder (the jar is inside)
+* Enter the folder with the Chart source code and run the check (point to the place where you extracted the .jar)
+
+```shell
+java -jar $PATH_TO_RAT/apache-rat-0.13/apache-rat-0.13.jar chart -E .rat-excludes
+```
+
+where `.rat-excludes` is the file in the root of Chart source code.
+
+## Signature check
+
+Make sure you have imported the key of the person who signed the release into your GPG keyring. You can find the valid keys in
+[KEYS](https://dist.apache.org/repos/dist/release/airflow/KEYS).
+
+You can import the whole KEYS file:
+
+```shell
+gpg --import KEYS
+```
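+
+If you do not have the KEYS file locally yet, you can download it first, for example:
+
+```shell
+curl https://dist.apache.org/repos/dist/release/airflow/KEYS -o KEYS
+```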
+
+You can also import the keys individually from a keyserver. The below one uses Kaxil's key and
+retrieves it from the default GPG keyserver
+[OpenPGP.org](https://keys.openpgp.org):
+
+```shell
+gpg --receive-keys 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+```
+
+You should choose to import the key when asked.
+
+Note that, being the default, the OpenPGP keyserver tends to be overloaded and might respond with
+errors or timeouts. Many of the release managers have also uploaded their keys to the
+[GNUPG.net](https://keys.gnupg.net) keyserver, and you can retrieve keys from there.
+
+```shell
+gpg --keyserver keys.gnupg.net --receive-keys 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+```
+
+Once you have the keys, the signatures can be verified by running this:
+
+```shell
+for i in *.asc
+do
+   echo "Checking $i"; gpg --verify $i
+done
+```
+
+This should produce results similar to the output below. The "Good signature from ..." message
+indicates that the signatures are correct. Do not worry about the "not certified with a trusted
+signature" warning; most of the certificates used by release managers are self-signed, which is why
+you get this warning. Having imported the key from the keyserver in the previous step, or via its ID
+from the [KEYS](https://dist.apache.org/repos/dist/release/airflow/KEYS) page, you already know that
+this is a valid key.
+
+```
+Checking airflow-1.0.0.tgz.asc
+gpg: assuming signed data in 'airflow-1.0.0.tgz'
+gpg: Signature made Sun 16 May 01:25:24 2021 BST
+gpg:                using RSA key CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F
+gpg:                issuer "kaxilnaik@apache.org"
+gpg: Good signature from "Kaxil Naik <ka...@apache.org>" [unknown]
+gpg:                 aka "Kaxil Naik <ka...@gmail.com>" [unknown]
+gpg: WARNING: The key's User ID is not certified with a trusted signature!
+gpg:          There is no indication that the signature belongs to the owner.
+Primary key fingerprint: CDE1 5C6E 4D3A 8EC4 ECF4  BA4B 6674 E08A D7DE 406F
+Checking airflow-chart-1.0.0-source.tar.gz.asc
+gpg: assuming signed data in 'airflow-chart-1.0.0-source.tar.gz'
+gpg: Signature made Sun 16 May 02:24:09 2021 BST
+gpg:                using RSA key CDE15C6E4D3A8EC4ECF4BA4B6674E08AD7DE406F
+gpg:                issuer "kaxilnaik@apache.org"
+gpg: Good signature from "Kaxil Naik <ka...@apache.org>" [unknown]
+gpg:                 aka "Kaxil Naik <ka...@gmail.com>" [unknown]
+gpg: WARNING: The key's User ID is not certified with a trusted signature!
+gpg:          There is no indication that the signature belongs to the owner.
+Primary key fingerprint: CDE1 5C6E 4D3A 8EC4 ECF4  BA4B 6674 E08A D7DE 406F
+```
+
+## SHA512 sum check
+
+Run this:
+
+```shell
+for i in *.sha512
+do
+    echo "Checking $i"; shasum -a 512 `basename $i .sha512 ` | diff - $i
+done
+```
+
+You should get output similar to:
+
+```
+Checking airflow-1.0.0.tgz.sha512
+Checking airflow-chart-1.0.0-source.tar.gz.sha512
+```
+
+# Verify release candidates by Contributors
+
+Contributors can run the commands below to test the Helm Chart:
+
+```shell
+helm repo add apache-airflow-dev https://dist.apache.org/repos/dist/dev/airflow/helm-chart/1.0.1rc1/
+helm repo update
+helm install airflow apache-airflow-dev/airflow
+```
+
+You can then perform any other verification you need to check that the chart works as you expect,
+for example by upgrading it or by installing it while overriding the defaults in `values.yaml`.
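+
+For example, a sketch of such a verification (the value override below is illustrative only):
+
+```shell
+helm upgrade airflow apache-airflow-dev/airflow --set executor=KubernetesExecutor
+```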
+
+# Publish the final release
+
+## Summarize the voting for the release
+
+Once the vote has passed, you will need to send the vote result to dev@airflow.apache.org:
+
+Subject:
+
+```
+[RESULT][VOTE] Release Apache Airflow Helm Chart 1.0.1 based on 1.0.1rc1
+```
+
+Message:
+
+```
+Hello all,
+
+The vote to release Apache Airflow Helm Chart version 1.0.1 based on 1.0.1rc1 is now closed.
+
+The vote PASSED with 4 binding "+1", 4 non-binding "+1" and 0 "-1" votes:
+
+"+1" Binding votes:
+
+  - Kaxil Naik
+  - Jarek Potiuk
+  - Ash Berlin-Taylor
+  - Xiaodong Deng
+
+"+1" Non-Binding votes:
+
+  - Jed Cunningham
+  - Ephraim Anierobi
+  - Dennis Akpenyi
+  - Ian Stanton
+
+Vote thread:
+https://lists.apache.org/thread.html/r865f041e491a2a7a52e17784abf0d0f2e35c3bac5ae8a05927285558%40%3Cdev.airflow.apache.org%3E
+
+I'll continue with the release process and the release announcement will follow shortly.
+
+Thanks,
+<your name>
+```
+
+## Publish release to SVN
+
+You need to migrate the RC artifacts that passed to this repository:
+https://dist.apache.org/repos/dist/release/airflow/helm-chart/
+(The migration should include renaming the files so that they no longer have the RC number in their filenames.)
+
+The best way of doing this is to svn cp between the two repos (this avoids having to upload
+the binaries again, and gives a clearer history in the svn commit logs):
+
+```shell
+# First clone the repo
+export RC=1.0.1rc1
+export VERSION=${RC/rc?/}
+svn checkout https://dist.apache.org/repos/dist/release/airflow airflow-release
+
+# Create new folder for the release
+cd airflow-release/helm-chart
+svn mkdir ${VERSION}
+cd ${VERSION}
+
+# Copy the artifacts to the svn folder & commit (index.yaml must not be released, so remove it after copying)
+for f in ../../../airflow-dev/helm-chart/$RC/*; do svn cp "$f" "$(basename "$f")"; done
+svn rm index.yaml
+svn commit -m "Release Airflow Helm Chart ${VERSION} from ${RC}"
+
+```
+
+Verify that the packages appear in [Airflow Helm Chart](https://dist.apache.org/repos/dist/release/airflow/helm-chart/).
+
+## Publish documentation
+
+In our case, documentation for released versions is published in a separate repository -
+[`apache/airflow-site`](https://github.com/apache/airflow-site), but the documentation source code and
+build tools are available in the `apache/airflow` repository, so you have to coordinate
+between the two repositories to be able to build the documentation.
+
+- First, clone the airflow-site repository and set the environment variable ``AIRFLOW_SITE_DIRECTORY``.
+
+    ```shell
+    git clone https://github.com/apache/airflow-site.git airflow-site
+    cd airflow-site
+    export AIRFLOW_SITE_DIRECTORY="$(pwd)"
+    ```
+
+- Then you can go to the directory and build the necessary documentation packages
+
+    ```shell
+    cd "${AIRFLOW_REPO_ROOT}"
+    ./breeze build-docs -- --package-filter helm-chart --for-production
+    ```
+
+- Now you can preview the documentation.
+
+    ```shell
+    ./docs/start_doc_server.sh
+    ```
+
+- Copy the documentation to the ``airflow-site`` repository.
+
+    ```shell
+    ./docs/publish_docs.py --package-filter helm-chart
+    cd "${AIRFLOW_SITE_DIRECTORY}"
+    git commit -m "Add documentation for Apache Airflow Helm Chart ${VERSION}"
+    git push
+    ```
+
+## Update `index.yaml` to Airflow Website
+
+We upload `index.yaml` to the Airflow website to allow: `helm repo add https://airflow.apache.org`.
+
+  ```shell
+  cd "${AIRFLOW_SITE_DIRECTORY}"
+  curl https://dist.apache.org/repos/dist/dev/airflow/helm-chart/${RC}/index.yaml -o index.yaml
+  sed -i "s|https://dist.apache.org/repos/dist/dev/airflow/helm-chart/$RC|https://downloads.apache.org/airflow/helm-chart/$VERSION|" index.yaml
+
+  git commit -m "Add index.yaml for Apache Airflow Helm Chart ${VERSION}"
+  git push
+  ```
+
+## Notify developers of release
+
+- Notify users@airflow.apache.org (cc'ing dev@airflow.apache.org and announce@apache.org) that
+the artifacts have been published:
+
+Subject:
+
+```shell
+cat <<EOF
+[ANNOUNCE] Apache Airflow Helm Chart version ${VERSION} Released
+EOF
+```
+
+Body:
+
+```shell
+cat <<EOF
+Dear Airflow community,
+
+I am pleased to announce that we have released Apache Airflow Helm chart $VERSION 🎉 🎊
+
+The source release, as well as the "binary" Helm Chart release, are available:
+
+📦   Official Sources: https://airflow.apache.org/helm-chart/installing-helm-chart-from-sources.html
+📦   ArtifactHub: https://artifacthub.io/packages/helm/apache-airflow/airflow
+📚   Docs: https://airflow.apache.org/docs/helm-chart/$VERSION/
+
+Thanks to all the contributors who made this possible.
+
+Cheers,
+<your name>
+EOF
+```
+
+## Update Announcements page
+
+Update "Announcements" page at the [Official Airflow website](https://airflow.apache.org/announcements/)
+
+## Remove old releases
+
+We should keep the old version around a little longer than a day, or at least until the updated
+``index.yaml`` is published. This is to avoid errors for users who haven't run ``helm repo update``.
+
+It is probably ok if we leave the last 2 versions in the release svn repo too.
+
+```shell
+# http://www.apache.org/legal/release-policy.html#when-to-archive
+cd airflow-release/helm-chart
+export PREVIOUS_VERSION=1.0.0
+svn rm ${PREVIOUS_VERSION}
+svn commit -m "Remove old Helm Chart release: ${PREVIOUS_VERSION}"
+```
diff --git a/dev/README_RELEASE_PROVIDER_PACKAGES.md b/dev/README_RELEASE_PROVIDER_PACKAGES.md
index dcb729a..c9eef59 100644
--- a/dev/README_RELEASE_PROVIDER_PACKAGES.md
+++ b/dev/README_RELEASE_PROVIDER_PACKAGES.md
@@ -805,15 +805,11 @@ I'm happy to announce that new versions of Airflow Providers packages were just
 
 The source release, as well as the binary releases, are available here:
 
-https://dist.apache.org/repos/dist/release/airflow/providers/
+https://airflow.apache.org/docs/apache-airflow-providers/installing-from-sources
 
-We also made those versions available on PyPi for convenience ('pip install apache-airflow-providers-*'):
+You can install the providers via PyPI: https://airflow.apache.org/docs/apache-airflow-providers/installing-from-pypi
 
-https://pypi.org/search/?q=apache-airflow-providers
-
-The documentation is available at https://airflow.apache.org/docs/ and linked from the PyPI packages:
-
-<PASTE TWINE UPLOAD LINKS HERE. SORT THEM BEFORE!>
+The documentation is available at https://airflow.apache.org/docs/ and linked from the PyPI packages.
 
 Cheers,
 <your name>
diff --git a/docs/README.rst b/docs/README.rst
index abb9a1c..949935f 100644
--- a/docs/README.rst
+++ b/docs/README.rst
@@ -114,7 +114,7 @@ Role ``:class:`` works well with references between packages. If you want to use
 
 .. code-block:: rst
 
-    :doc:`apache-airflow:installation`
+    :doc:`apache-airflow:installation/index`
     :ref:`apache-airflow-providers-google:write-logs-stackdriver`
 
 If you still feel confused then you can view more possible roles for our documentation:
diff --git a/docs/apache-airflow-providers-amazon/operators/_partials/prerequisite_tasks.rst b/docs/apache-airflow-providers-amazon/operators/_partials/prerequisite_tasks.rst
index ba99fd8..39ed4d9 100644
--- a/docs/apache-airflow-providers-amazon/operators/_partials/prerequisite_tasks.rst
+++ b/docs/apache-airflow-providers-amazon/operators/_partials/prerequisite_tasks.rst
@@ -26,7 +26,7 @@ To use these operators, you must do a few things:
 
       pip install 'apache-airflow[amazon]'
 
-    Detailed information is available :doc:`apache-airflow:installation`
+    Detailed information is available :doc:`apache-airflow:installation/index`
 
   * :doc:`Setup Connection </connections/aws>`.
 
diff --git a/docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst b/docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst
index a509ecd..c36e386 100644
--- a/docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst
+++ b/docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst
@@ -28,6 +28,6 @@ To use these operators, you must do a few things:
 
       pip install 'apache-airflow[google]'
 
-    Detailed information is available for :doc:`Installation <apache-airflow:installation>`.
+    Detailed information is available for :doc:`Installation <apache-airflow:installation/index>`.
 
   * :doc:`Setup a Google Cloud Connection </connections/gcp>`.
diff --git a/docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst b/docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst
index 02cb178..16be143 100644
--- a/docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst
+++ b/docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst
@@ -26,7 +26,7 @@ To use these operators, you must do a few things:
 
       pip install 'apache-airflow[azure]'
 
-    Detailed information is available :doc:`apache-airflow:installation`
+    Detailed information is available :doc:`apache-airflow:installation/index`
 
   * :doc:`Setup Connection </connections/azure>`.
 
diff --git a/docs/apache-airflow-providers/index.rst b/docs/apache-airflow-providers/index.rst
index 0b0b527..b13a7b7 100644
--- a/docs/apache-airflow-providers/index.rst
+++ b/docs/apache-airflow-providers/index.rst
@@ -143,7 +143,7 @@ package.
 
 Each community provider has corresponding extra which can be used when installing airflow to install the
 provider together with ``Apache Airflow`` - for example you can install airflow with those extras:
-``apache-airflow[google,amazon]`` (with correct constraints -see :doc:`apache-airflow:installation`) and you
+``apache-airflow[google,amazon]`` (with correct constraints -see :doc:`apache-airflow:installation/index`) and you
 will install the appropriate versions of the ``apache-airflow-providers-amazon`` and
 ``apache-airflow-providers-google`` packages together with ``Apache Airflow``.
 
@@ -417,10 +417,5 @@ specific Airflow versions.
     Operators and hooks <operators-and-hooks-ref/index>
     Core Extensions <core-extensions/index>
     Update community providers <howto/create-update-providers>
-
-.. toctree::
-    :maxdepth: 2
-    :hidden:
-    :caption: Resources
-
     Installing from sources <installing-from-sources>
+    Installing from PyPI <installing-from-pypi>
diff --git a/docs/apache-airflow-providers/installing-from-pypi.rst b/docs/apache-airflow-providers/installing-from-pypi.rst
new file mode 100644
index 0000000..c94b0ab
--- /dev/null
+++ b/docs/apache-airflow-providers/installing-from-pypi.rst
@@ -0,0 +1,45 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Installation from PyPI
+----------------------
+
+.. contents:: :local:
+
+
+This page describes installations using the ``apache-airflow-providers`` package `published in
+PyPI <https://pypi.org/search/?q=apache-airflow-providers>`__.
+
+Installation tools
+''''''''''''''''''
+
+Only ``pip`` installation is currently officially supported.
+
+While there are some successes with using other tools like `poetry <https://python-poetry.org/>`_ or
+`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the same workflow as
+``pip`` - especially when it comes to constraint vs. requirements management.
+Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If you wish to install Airflow
+using those tools, you should use the constraint files and convert them to the appropriate
+format and workflow that your tool requires.
+
+A typical command to install a provider from PyPI looks like the one below (use the right Airflow version and Python version in the constraints URL):
+
+.. code-block::
+
+    pip install "apache-airflow-providers-celery" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.6.txt"
+
+This is just one example; see :doc:`apache-airflow:installation/installing-from-pypi` for more examples, including how to upgrade the providers.
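+
+For instance, a sketch of upgrading an already installed provider to its latest version using the
+``constraints-main`` constraints (the provider name and the derived Python version below are illustrative):
+
+.. code-block::
+
+    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
+    pip install --upgrade "apache-airflow-providers-celery" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-${PYTHON_VERSION}.txt"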
diff --git a/docs/apache-airflow-providers/installing-from-sources.rst b/docs/apache-airflow-providers/installing-from-sources.rst
index 00e2a2d..4b4990e 100644
--- a/docs/apache-airflow-providers/installing-from-sources.rst
+++ b/docs/apache-airflow-providers/installing-from-sources.rst
@@ -36,6 +36,11 @@ Released packages
          </ul>
 
 
+    You can also install ``Apache Airflow Providers`` - like most Python packages - via :doc:`PyPI <installing-from-pypi>`.
+    You can choose a different version of Airflow by selecting a different version from the drop-down at
+    the top-left of the page.
+
+
 Release integrity
 '''''''''''''''''
 
diff --git a/docs/apache-airflow/extra-packages-ref.rst b/docs/apache-airflow/extra-packages-ref.rst
index 2d4769e..3da2af8 100644
--- a/docs/apache-airflow/extra-packages-ref.rst
+++ b/docs/apache-airflow/extra-packages-ref.rst
@@ -18,7 +18,7 @@
 Reference for package extras
 ''''''''''''''''''''''''''''
 
-Here's the list of all the :ref:`extra dependencies <installation:airflow_extra_dependencies>`.
+Here's the list of all the extra dependencies of Apache Airflow.
 
 The entries with ``*`` in the ``Preinstalled`` column indicate that those extras (providers) are always
 pre-installed when Airflow is installed.
diff --git a/docs/apache-airflow/index.rst b/docs/apache-airflow/index.rst
index 62787ad..3d9ea36 100644
--- a/docs/apache-airflow/index.rst
+++ b/docs/apache-airflow/index.rst
@@ -77,9 +77,8 @@ unit of work and continuity.
     project
     license
     start/index
-    installation
-    upgrading-to-2
-    upgrade-check
+    installation/index
+    upgrading-from-1-10/index
     tutorial
     tutorial_taskflow_api
     howto/index
@@ -117,9 +116,3 @@ unit of work and continuity.
     Configurations <configurations-ref>
     Extra packages <extra-packages-ref>
     Database Migrations <migrations-ref>
-
-.. toctree::
-    :hidden:
-    :caption: Resources
-
-    Installing from sources <installing-from-sources>
diff --git a/docs/apache-airflow/installation.rst b/docs/apache-airflow/installation.rst
deleted file mode 100644
index 00e0baf..0000000
--- a/docs/apache-airflow/installation.rst
+++ /dev/null
@@ -1,336 +0,0 @@
- .. Licensed to the Apache Software Foundation (ASF) under one
-    or more contributor license agreements.  See the NOTICE file
-    distributed with this work for additional information
-    regarding copyright ownership.  The ASF licenses this file
-    to you under the Apache License, Version 2.0 (the
-    "License"); you may not use this file except in compliance
-    with the License.  You may obtain a copy of the License at
-
- ..   http://www.apache.org/licenses/LICENSE-2.0
-
- .. Unless required by applicable law or agreed to in writing,
-    software distributed under the License is distributed on an
-    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-    KIND, either express or implied.  See the License for the
-    specific language governing permissions and limitations
-    under the License.
-
-
-Installation
-------------
-
-.. contents:: :local:
-
-This page describes installations using the ``apache-airflow`` package `published in
-PyPI <https://pypi.org/project/apache-airflow/>`__, but some information may be useful during
-installation with other tools as well.
-
-.. note::
-
-    Airflow is also distributed as a Docker image (OCI Image). Consider using it to guarantee that software will always run the same no matter where it is deployed. For more information, see: :doc:`docker-stack:index`.
-
-Prerequisites
-'''''''''''''
-
-Airflow is tested with:
-
-* Python: 3.6, 3.7, 3.8, 3.9
-
-* Databases:
-
-  * PostgreSQL:  9.6, 10, 11, 12, 13
-  * MySQL: 5.7, 8
-  * SQLite: 3.15.0+
-
-* Kubernetes: 1.18.15 1.19.7 1.20.2
-
-**Note:** MySQL 5.x versions are unable to or have limitations with
-running multiple schedulers -- please see: :doc:`/concepts/scheduler`. MariaDB is not tested/recommended.
-
-**Note:** SQLite is used in Airflow tests. Do not use it in production. We recommend
-using the latest stable version of SQLite for local development.
-
-Starting with Airflow 2.1.2, Airflow is tested with Python 3.6, 3.7, 3.8, and 3.9.
-
-Installation tools
-''''''''''''''''''
-
-Only ``pip`` installation is currently officially supported.
-
-While there are some successes with using other tools like `poetry <https://python-poetry.org/>`_ or
-`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the same workflow as
-``pip`` - especially when it comes to constraint vs. requirements management.
-Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If you wish to install airflow
-using those tools you should use the :ref:`constraint files <installation:constraints>`  and convert them to appropriate
-format and workflow that your tool requires.
-
-.. _installation:airflow_extra_dependencies:
-
-Airflow extra dependencies
-''''''''''''''''''''''''''
-
-The ``apache-airflow`` PyPI basic package only installs what's needed to get started.
-Additional packages can be installed depending on what will be useful in your
-environment. For instance, if you don't need connectivity with Postgres,
-you won't have to go through the trouble of installing the ``postgres-devel``
-yum package, or whatever equivalent applies on the distribution you are using.
-
-Most of the extra dependencies are linked to a corresponding provider package. For example "amazon" extra
-has a corresponding ``apache-airflow-providers-amazon`` provider package to be installed. When you install
-Airflow with such extras, the necessary provider packages are installed automatically (latest versions from
-PyPI for those packages). However you can freely upgrade and install provider packages independently from
-the main Airflow installation.
-
-For the list of the extras and what they enable, see: :doc:`extra-packages-ref`.
-
-.. _installation:provider_packages:
-
-Provider packages
-'''''''''''''''''
-
-Unlike Apache Airflow 1.10, the Airflow 2.0 is delivered in multiple, separate, but connected packages.
-The core of Airflow scheduling system is delivered as ``apache-airflow`` package and there are around
-60 provider packages which can be installed separately as so called ``Airflow Provider packages``.
-The default Airflow installation doesn't have many integrations and you have to install them yourself.
-
-You can even develop and install your own providers for Airflow. For more information,
-see: :doc:`apache-airflow-providers:index`
-
-For the list of the provider packages and what they enable, see: :doc:`apache-airflow-providers:packages-ref`.
-
-Differences between extras and providers
-''''''''''''''''''''''''''''''''''''''''
-
-Just to prevent confusion of extras versus provider packages: Extras and providers are different things,
-though many extras are leading to installing providers.
-
-Extras are standard Python setuptools feature that allows to add additional set of dependencies as
-optional features to "core" Apache Airflow. One of the type of such optional features are providers
-packages, but not all optional features of Apache Airflow have corresponding providers.
-
-We are using the ``extras`` setuptools features to also install provider packages.
-Most of the extras are also linked (same name) with provider packages - for example adding ``[google]``
-extra also adds ``apache-airflow-providers-google`` as dependency. However there are some extras that do
-not install providers (examples ``github_enterprise``, ``kerberos``, ``async`` - they add some extra
-dependencies which are needed for those ``extra`` features of Airflow mentioned. The three examples
-above add respectively github enterprise oauth authentication, kerberos integration or
-asynchronous workers for gunicorn. None of those have providers, they are just extending Apache Airflow
-"core" package with new functionalities.
-
-System dependencies
-'''''''''''''''''''
-
-You need certain system level requirements in order to install Airflow. Those are requirements that are known
-to be needed for Linux system (Tested on Ubuntu Buster LTS) :
-
-.. code-block:: bash
-   :substitutions:
-
-   sudo apt-get install -y --no-install-recommends \
-           freetds-bin \
-           krb5-user \
-           ldap-utils \
-           libffi6 \
-           libsasl2-2 \
-           libsasl2-modules \
-           libssl1.1 \
-           locales  \
-           lsb-release \
-           sasl2-bin \
-           sqlite3 \
-           unixodbc
-
-You also need database client packages (Postgres or MySQL) if you want to use those databases.
-
-.. _installation:constraints:
-
-Constraints files
-'''''''''''''''''
-
-Airflow installation might be sometimes tricky because Airflow is a bit of both a library and application.
-Libraries usually keep their dependencies open and applications usually pin them, but we should do neither
-and both at the same time. We decided to keep our dependencies as open as possible
-(in ``setup.cfg`` and ``setup.py``) so users can install different
-version of libraries if needed. This means that from time to time plain ``pip install apache-airflow`` will
-not work or will produce unusable Airflow installation.
-
-In order to have repeatable installation, starting from **Airflow 1.10.10** and updated in
-**Airflow 1.10.13** we also keep a set of "known-to-be-working" constraint files in the
-``constraints-main``, ``constraints-2-0`` orphan branches and then we create tag
-for each released version e.g. :subst-code:`constraints-|version|`. This way, when we keep a tested and working set of dependencies.
-
-Those "known-to-be-working" constraints are per major/minor Python version. You can use them as constraint
-files when installing Airflow from PyPI. Note that you have to specify correct Airflow version
-and Python versions in the URL.
-
-You can create the URL to the file substituting the variables in the template below.
-
-.. code-block::
-
-  https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt
-
-where:
-
-- ``AIRFLOW_VERSION`` - Airflow version (e.g. :subst-code:`|version|`) or ``main``, ``2-0``, for latest development version
-- ``PYTHON_VERSION`` Python version e.g. ``3.8``, ``3.7``
-
-There is also a no-providers constraint file, which contains just constraints required to install Airflow core. This allows
-to install and upgrade airflow separately and independently from providers.
-
-You can create the URL to the file substituting the variables in the template below.
-
-.. code-block::
-
-  https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-no-providers-${PYTHON_VERSION}.txt
-
-Installation script
-'''''''''''''''''''
-
-In order to simplify the installation, we have prepared examples that will select
-`the constraints file <installation:constraints>`__ compatible with your Python version.
-
-**Installing Airflow with extras and providers**
-
-If you need to install :ref:`extra dependencies of airflow <installation:airflow_extra_dependencies>`,
-you can use the script below to make an installation a one-liner (the example below installs
-postgres and google provider, as well as ``async`` extra.
-
-.. code-block:: bash
-    :substitutions:
-
-    AIRFLOW_VERSION=|version|
-    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
-    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    pip install "apache-airflow[async,postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
-
-Note, that it will install the versions of providers that were available at the moment this version of Airflow
-has been prepared. You need to follow next steps if you want to upgrade provider packages in case they were
-released afterwards.
-
-
-**Upgrading Airflow with providers**
-
-You can also upgrade airflow together with extras (providers available at the time of the release of Airflow
-being installed.
-
-
-.. code-block:: bash
-    :substitutions:
-
-    AIRFLOW_VERSION=|version|
-    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
-    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
-    pip install --upgrade "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
-
-**Installation and upgrading of Airflow providers separately**
-
-You can manually install all the providers you need. You can continue using the "providers" constraint files
-but the 'versioned' airflow constraints installs only the versions of providers that were available in PyPI at
-the time of preparing of the airflow version. However, usually you can use "main" version of the providers
-to install latest version of providers. Usually the providers work with most versions of Airflow, if there
-will be any incompatibilities, it will be captured as package dependencies.
-
-.. code-block:: bash
-
-    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
-    # For example: 3.6
-    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-${PYTHON_VERSION}.txt"
-    pip install "apache-airflow-providers-google" --constraint "${CONSTRAINT_URL}"
-
-You can also upgrade the providers to latest versions (you need to use main version of constraints for that):
-
-.. code-block:: bash
-
-    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
-    # For example: 3.6
-    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-${PYTHON_VERSION}.txt"
-    pip install "apache-airflow-providers-google" --upgrade --constraint "${CONSTRAINT_URL}"
-
-
-**Installation and upgrade of Airflow core:**
-
-If you don't want to install any extra providers, initially you can use the command set below.
-
-.. code-block:: bash
-    :substitutions:
-
-    AIRFLOW_VERSION=|version|
-    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
-    # For example: 3.6
-    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-no-providers-${PYTHON_VERSION}.txt"
-    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-|version|/constraints-no-providers-3.6.txt
-    pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
-
-
-Support for Python and Kubernetes versions
-''''''''''''''''''''''''''''''''''''''''''
-
-As of Airflow 2.0 we agreed to certain rules we follow for Python and Kubernetes support.
-They are based on the official release schedule of Python and Kubernetes, nicely summarized in the
-`Python Developer's Guide <https://devguide.python.org/#status-of-python-branches>`_ and
-`Kubernetes version skew policy <https://kubernetes.io/docs/setup/release/version-skew-policy>`_.
-
-1. We drop support for Python and Kubernetes versions when they reach EOL. We drop support for those
-   EOL versions in main right after EOL date, and it is effectively removed when we release the
-   first new MINOR (Or MAJOR if there is no new MINOR version) of Airflow
-   For example for Python 3.6 it means that we drop support in main right after 23.12.2021, and the first
-   MAJOR or MINOR version of Airflow released after will not have it.
-
-2. The "oldest" supported version of Python/Kubernetes is the default one. "Default" is only meaningful
-   in terms of "smoke tests" in CI PRs which are run using this default version and default reference
-   image available in DockerHub. Currently ``apache/airflow:latest`` and ``apache/airflow:2.0.2`` images
-   are both Python 3.6 images, however the first MINOR/MAJOR release of Airflow release after 23.12.2021 will
-   become Python 3.7 images.
-
-3. We support a new version of Python/Kubernetes in main after they are officially released, as soon as we
-   make them work in our CI pipeline (which might not be immediate due to dependencies catching up with
-   new versions of Python mostly) we release a new images/support in Airflow based on the working CI setup.
-
-Set up a database
-'''''''''''''''''
-
-Airflow requires a database. If you're just experimenting and learning Airflow, you can stick with the
-default SQLite option. If you don't want to use SQLite, then take a look at
-:doc:`howto/set-up-database` to setup a different database.
-
-
-Troubleshooting
-'''''''''''''''
-
-This section describes how to troubleshoot installation issues.
-
-Airflow command is not recognized
-"""""""""""""""""""""""""""""""""
-
-If the ``airflow`` command is not getting recognized (can happen on Windows when using WSL), then
-ensure that ``~/.local/bin`` is in your ``PATH`` environment variable, and add it in if necessary:
-
-.. code-block:: bash
-
-    PATH=$PATH:~/.local/bin
-
-You can also start airflow with ``python -m airflow``
-
-``Symbol not found: _Py_GetArgcArgv``
-"""""""""""""""""""""""""""""""""""""
-
-If you see ``Symbol not found: _Py_GetArgcArgv`` while starting or importing Airflow, this may mean that you are using an incompatible version of Python.
-For a homebrew installed version of Python, this is generally caused by using Python in ``/usr/local/opt/bin`` rather than the Frameworks installation (e.g. for ``python 3.7``: ``/usr/local/opt/python@3.7/Frameworks/Python.framework/Versions/3.7``).
-
-The crux of the issue is that a library Airflow depends on, ``setproctitle``, uses a non-public Python API
-which is not available from the standard installation ``/usr/local/opt/`` (which symlinks to a path under ``/usr/local/Cellar``).
-
-An easy fix is just to ensure you use a version of Python that has a dylib of the Python library available. For example:
-
-.. code-block:: bash
-
-  # Note: these instructions are for python3.7 but can be loosely modified for other versions
-  brew install python@3.7
-  virtualenv -p /usr/local/opt/python@3.7/Frameworks/Python.framework/Versions/3.7/bin/python3 .toy-venv
-  source .toy-venv/bin/activate
-  pip install apache-airflow
-  python
-  >>> import setproctitle
-  # Success!
-
-Alternatively, you can download and install Python directly from the `Python website <https://www.python.org/>`__.
diff --git a/docs/apache-airflow/installation/dependencies.rst b/docs/apache-airflow/installation/dependencies.rst
new file mode 100644
index 0000000..f545d63
--- /dev/null
+++ b/docs/apache-airflow/installation/dependencies.rst
@@ -0,0 +1,92 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Dependencies
+------------
+
+Airflow extra dependencies
+''''''''''''''''''''''''''
+
+The ``apache-airflow`` PyPI basic package only installs what's needed to get started.
+Additional packages can be installed depending on what will be useful in your
+environment. For instance, if you don't need connectivity with Postgres,
+you won't have to go through the trouble of installing the ``postgres-devel``
+yum package, or whatever equivalent applies on the distribution you are using.
+
+Most of the extra dependencies are linked to a corresponding provider package. For example "amazon" extra
+has a corresponding ``apache-airflow-providers-amazon`` provider package to be installed. When you install
+Airflow with such extras, the necessary provider packages are installed automatically (latest versions from
+PyPI for those packages). However you can freely upgrade and install provider packages independently from
+the main Airflow installation.
+
+For the list of the extras and what they enable, see: :doc:`/extra-packages-ref`.
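+
+For illustration, a minimal sketch (the Airflow and Python versions are examples - substitute your own)
+of installing Airflow together with the ``amazon`` extra, which also pulls in the
+``apache-airflow-providers-amazon`` provider package:
+
+.. code-block:: bash
+
+   pip install "apache-airflow[amazon]==2.1.4" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.6.txt"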
+
+Provider packages
+'''''''''''''''''
+
+Unlike Apache Airflow 1.10, Airflow 2.0 is delivered in multiple, separate, but connected packages.
+The core of the Airflow scheduling system is delivered as the ``apache-airflow`` package and there are around
+60 provider packages which can be installed separately as so-called ``Airflow Provider packages``.
+The default Airflow installation doesn't have many integrations; you have to install them yourself.
+
+You can even develop and install your own providers for Airflow. For more information,
+see: :doc:`apache-airflow-providers:index`
+
+For the list of the provider packages and what they enable, see: :doc:`apache-airflow-providers:packages-ref`.
+
+Differences between extras and providers
+''''''''''''''''''''''''''''''''''''''''
+
+To prevent confusion between extras and provider packages: extras and providers are different things,
+though installing many of the extras leads to installing providers.
+
+Extras are a standard Python setuptools feature that allows adding additional sets of dependencies as
+optional features to "core" Apache Airflow. One type of such optional feature is provider
+packages, but not all optional features of Apache Airflow have corresponding providers.
+
+We are using the ``extras`` setuptools feature to also install provider packages.
+Most of the extras are also linked (by the same name) with provider packages - for example adding the ``[google]``
+extra also adds ``apache-airflow-providers-google`` as a dependency. However, there are some extras that do
+not install providers (for example ``github_enterprise``, ``kerberos`` or ``async``) - they only add extra
+dependencies needed for those optional features of Airflow. The three examples
+above add, respectively, GitHub Enterprise OAuth authentication, Kerberos integration and
+asynchronous workers for gunicorn. None of those have providers; they just extend the Apache Airflow
+"core" package with new functionality.
+
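+To make the distinction concrete - a sketch using the ``google`` and ``async`` extras as examples
+(constraints omitted here for brevity; see :doc:`/installation/installing-from-pypi` for the full commands):
+
+.. code-block:: bash
+
+   # The "google" extra pulls in the apache-airflow-providers-google provider package as well
+   pip install "apache-airflow[google]"
+
+   # The "async" extra only adds extra dependencies (async workers for gunicorn) - no provider package
+   pip install "apache-airflow[async]"
+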
+System dependencies
+'''''''''''''''''''
+
+You need to meet certain system-level requirements in order to install Airflow. The packages below are known
+to be needed on Linux systems (tested on Ubuntu Buster LTS):
+
+.. code-block:: bash
+
+   sudo apt-get install -y --no-install-recommends \
+           freetds-bin \
+           krb5-user \
+           ldap-utils \
+           libffi6 \
+           libsasl2-2 \
+           libsasl2-modules \
+           libssl1.1 \
+           locales  \
+           lsb-release \
+           sasl2-bin \
+           sqlite3 \
+           unixodbc
+
+You also need database client packages (Postgres or MySQL) if you want to use those databases.
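+
+For example - a sketch for Debian/Ubuntu based systems; package names differ on other
+distributions - the client packages could be installed with:
+
+.. code-block:: bash
+
+   # Example client packages for Debian/Ubuntu; adjust the names for your distribution
+   sudo apt-get install -y --no-install-recommends \
+           postgresql-client \
+           default-mysql-client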
diff --git a/docs/apache-airflow/installation/index.rst b/docs/apache-airflow/installation/index.rst
new file mode 100644
index 0000000..d5ffff4
--- /dev/null
+++ b/docs/apache-airflow/installation/index.rst
@@ -0,0 +1,317 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Installation
+------------
+
+.. contents:: :local:
+
+.. toctree::
+    :maxdepth: 1
+    :caption: Installation
+    :hidden:
+
+    Prerequisites <prerequisites>
+    Dependencies <dependencies>
+    Supported versions <supported-versions>
+    Installing from sources <installing-from-sources>
+    Installing from PyPI <installing-from-pypi>
+    Setting up the database <setting-up-the-database>
+
+This page describes the installation options that you might use when considering how to install Airflow.
+Airflow consists of many components, often distributed among many physical or virtual machines, therefore
+installation of Airflow might be quite complex, depending on the options you choose.
+
+You should also check out the :doc:`Prerequisites <prerequisites>` that must be fulfilled when installing Airflow,
+as well as the :doc:`Supported versions <supported-versions>` to learn about the policies for supporting
+Airflow, Python and Kubernetes.
+
+Airflow requires additional :doc:`Dependencies <dependencies>` to be installed - which can be done
+via extras and providers.
+
+When you install Airflow, you need to :doc:`set up the database <setting-up-the-database>`, which must
+also be kept updated when Airflow is upgraded.
+
+.. warning::
+
+  As of June 2021, Airflow 1.10 is end-of-life and is not going to receive any fixes, not even critical
+  security fixes. Follow :doc:`/upgrading-from-1-10/index` to learn
+  how to upgrade from the end-of-life 1.10 to Airflow 2.
+
+Using released sources
+''''''''''''''''''''''
+
+More details: :doc:`installing-from-sources`
+
+**When this option works best**
+
+* This option is best if you expect to build all your software from sources.
+* Apache Airflow is one of the projects that belong to the `Apache Software Foundation <https://www.apache.org/>`__ .
+  It is a requirement for all ASF projects that they can be installed using official sources released via `Official Apache Mirrors <http://ws.apache.org/mirrors.cgi/>`__ .
+* This is the best choice if you have a strong need to `verify the integrity and provenance of the software <https://www.apache.org/dyn/closer.cgi#verify>`__
+
+**Intended users**
+
+* Users who are familiar with installing and building software from sources and are conscious about integrity and provenance
+  of the software they use down to the lowest level possible.
+
+**What are you expected to handle**
+
+* You are expected to build and install Airflow and its components on your own.
+* You should develop and handle the deployment for all components of Airflow.
+* You are responsible for setting up the database, creating and managing the database schema with ``airflow db`` commands,
+  automated startup and recovery, maintenance, cleanup and upgrades of Airflow and the Airflow Providers.
+
+**What Apache Airflow Community provides for that method**
+
+* You have `instructions <https://github.com/apache/airflow/blob/main/INSTALL>`__ on how to build the software, but due to the various environments
+  and tools you might want to use, you should expect that there will be problems specific to your deployment and environment
+  that you will have to diagnose and solve.
+
+**Where to ask for help**
+
+* The ``#development`` Slack channel for questions about building the software.
+
+* The ``#troubleshooting`` Slack channel is for quick general troubleshooting questions. Use
+  `GitHub discussions <https://github.com/apache/airflow/discussions>`__ if you are looking for a longer discussion and have more information to share.
+
+* If you can provide a description of a reproducible problem with the Airflow software, you can open an issue at `GitHub issues <https://github.com/apache/airflow/issues>`_
+
+Using PyPI
+'''''''''''
+
+More details:  :doc:`/installation/installing-from-pypi`
+
+**When this option works best**
+
+* This installation method is useful when you are not familiar with Containers and Docker and want to install
+  Apache Airflow on physical or virtual machines, and you are used to installing and running software using a custom
+  deployment mechanism.
+
+* The only officially supported mechanism of installation is via ``pip`` using constraint mechanisms. The constraint
+  files are managed by Apache Airflow release managers to make sure that you can repeatably install Airflow from PyPI with all Providers and
+  required dependencies.
+
+* In case of a PyPI installation you can also verify the integrity and provenance of the packages
+  downloaded from PyPI as described on the installation page, but the software you download from PyPI is pre-built
+  for you so that you can install it without building it from sources.
+
+**Intended users**
+
+* Users who are familiar with installing and configuring Python applications, managing Python environments,
+  dependencies and running software with their custom deployment mechanisms.
+
+**What are you expected to handle**
+
+* You are expected to install Airflow - all components of it - on your own.
+* You should develop and handle the deployment for all components of Airflow.
+* You are responsible for setting up the database, creating and managing the database schema with ``airflow db`` commands,
+  automated startup and recovery, maintenance, cleanup and upgrades of Airflow and Airflow Providers.
+
+**What Apache Airflow Community provides for that method**
+
+* You have :doc:`/installation/installing-from-pypi`
+  on how to install the software, but due to the various environments and tools you might want to use, you should
+  expect that there will be problems specific to your deployment and environment that you will have to
+  diagnose and solve.
+* You have :doc:`/start/local` where you can see an example of a Quick Start for running Airflow
+  locally, which you can use to start Airflow quickly for local testing and development.
+  However, this is just an inspiration. Do not expect this Quick Start to be ready for a production installation;
+  you need to build your own production-ready deployment with this approach.
+
+**Where to ask for help**
+
+* The ``#troubleshooting`` channel on Airflow Slack for quick general
+  troubleshooting questions. Use `GitHub discussions <https://github.com/apache/airflow/discussions>`__
+  if you are looking for a longer discussion and have more information to share.
+* If you can provide a description of a reproducible problem with the Airflow software, you can open
+  an issue at `GitHub issues <https://github.com/apache/airflow/issues>`__
+
+
+Using Production Docker Images
+''''''''''''''''''''''''''''''
+
+More details: :doc:`docker-stack:index`
+
+**When this option works best**
+
+This installation method is useful when you are familiar with the Container/Docker stack. It provides the capability of
+running Airflow components in isolation from other software running on the same physical or virtual machines, with easy
+maintenance of dependencies.
+
+The images are built by Apache Airflow release managers and they use officially released packages from PyPI
+and official constraint files - the same ones that are used for installing Airflow from PyPI.
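+
+As a minimal sketch of getting a reference image (the tag below is only an example - pick the
+Airflow version you need):
+
+.. code-block:: bash
+
+    # Pull a reference image from Docker Hub and check the Airflow version baked into it
+    docker pull apache/airflow:2.1.4
+    docker run --rm apache/airflow:2.1.4 airflow version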
+
+**Intended users**
+
+* Users who are familiar with Containers and Docker stack and understand how to build their own container images.
+* Users who understand how to install providers and dependencies from PyPI with constraints if they want to extend or customize the image.
+* Users who know how to create deployments using Docker by linking together multiple Docker containers and maintaining such deployments.
+
+**What are you expected to handle**
+
+* You are expected to be able to customize or extend Container/Docker images if you want to
+  add extra dependencies. You are expected to put together a deployment built of several containers
+  (for example using docker-compose) and to make sure that they are linked together.
+* You are responsible for setting up the database, creating and managing the database schema with ``airflow db`` commands,
+  automated startup and recovery, maintenance, cleanup and upgrades of Airflow and the Airflow Providers.
+* You are responsible for managing your own customizations and extensions for your custom dependencies.
+  With the Official Airflow Docker Images, upgrades of Airflow and Airflow Providers which
+  are part of the reference image are handled by the community - you need to make sure to pick up
+  those changes when released, by upgrading the base image. However, you are responsible for creating a
+  pipeline to build your own custom images with your own added dependencies and Providers, and you need to
+  repeat the customization and image-building step each time a new version of the Airflow image is released.
+* You should choose the right deployment mechanism. There are a number of available options for
+  deploying containers. You can use your own custom mechanism, custom Kubernetes deployments,
+  custom Docker Compose, custom Helm charts etc., and you should choose it based on your experience
+  and expectations.
+
+**What Apache Airflow Community provides for that method**
+
+* You have instructions: :doc:`docker-stack:build` on how to build and customize your image.
+* You have :doc:`/start/docker` where you can see an example of Quick Start which
+  you can use to start Airflow quickly for local testing and development. However, this is just an inspiration.
+  Do not expect to use this ``docker-compose.yml`` file for a production installation; you need to get familiar
+  with Docker Compose and its capabilities and build your own production-ready deployment with it if
+  you choose Docker Compose for your deployment.
+* The Docker Image is managed by the same people who build Airflow, and they are committed to keeping
+  it updated whenever new features and capabilities of Airflow are released.
+
+**Where to ask for help**
+
+* For quick questions about the Official Docker Image there is the ``#production-docker-image`` channel in Airflow Slack.
+* The ``#troubleshooting`` channel on Airflow Slack for quick general
+  troubleshooting questions. Use `GitHub discussions <https://github.com/apache/airflow/discussions>`__
+  if you are looking for a longer discussion and have more information to share.
+* If you can provide a description of a reproducible problem with the Airflow software, you can open
+  an issue at `GitHub issues <https://github.com/apache/airflow/issues>`__
+
+Using Official Airflow Helm Chart
+'''''''''''''''''''''''''''''''''
+
+More details: :doc:`helm-chart:index`
+
+**When this option works best**
+
+* This installation method is useful when you are not only familiar with Container/Docker stack but also when you
+  use Kubernetes and want to install and maintain Airflow using the community-managed Kubernetes installation
+  mechanism via Helm chart.
+* It not only provides the capability of running Airflow components in isolation from other software
+  running on the same physical or virtual machines and of managing dependencies, but it also makes it
+  easier to maintain, configure and upgrade Airflow in a way that is standardized and will be maintained
+  by the community.
+* The Chart uses the Official Airflow Production Docker Images to run Airflow (see the installation sketch after this list).
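+
+A minimal installation sketch, using the chart repository and release naming from the Helm Chart
+documentation linked above (adjust the namespace and configuration values to your environment):
+
+.. code-block:: bash
+
+    # Add the official chart repository and install the chart into a dedicated namespace
+    helm repo add apache-airflow https://airflow.apache.org
+    helm repo update
+    helm upgrade --install airflow apache-airflow/airflow \
+        --namespace airflow --create-namespace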
+
+**Intended users**
+
+* Users who are familiar with Containers and Docker stack and understand how to build their own container images.
+* Users who understand how to install providers and dependencies from PyPI with constraints if they want to extend or customize the image.
+* Users who manage their infrastructure using Kubernetes and manage their applications on Kubernetes using Helm Charts.
+
+**What are you expected to handle**
+
+* You are expected to be able to customize or extend Container/Docker images if you want to
+  add extra dependencies. You are expected to put together a deployment built of several containers
+  (for example using Docker Compose) and to make sure that they are linked together.
+* You are responsible for setting up the database.
+* The Helm Chart manages your database schema, automates startup, recovery and restarts of the
+  components of the application and linking them together, so you do not have to worry about that.
+* You are responsible for managing your own customizations and extensions for your custom dependencies.
+  With the Official Airflow Docker Images, upgrades of Airflow and Airflow Providers which
+  are part of the reference image are handled by the community - you need to make sure to pick up
+  those changes when released, by upgrading the base image. However, you are responsible for creating a
+  pipeline to build your own custom images with your own added dependencies and Providers, and you need to
+  repeat the customization and image-building step each time a new version of the Airflow image is released.
+
+**What Apache Airflow Community provides for that method**
+
+* You have instructions: :doc:`docker-stack:build` on how to build and customize your image.
+* You have :doc:`helm-chart:index` - full documentation on how to configure and install the Helm Chart.
+* The Helm Chart is managed by the same people who build Airflow, and they are committed to keeping
+  it updated whenever new features and capabilities of Airflow are released.
+
+**Where to ask for help**
+
+* For quick questions about the Official Docker Image there is the ``#production-docker-image`` channel in Airflow Slack.
+* For quick questions about the official Helm Chart there is the ``#helm-chart-official`` channel in Slack.
+* The ``#troubleshooting`` channel on Airflow Slack for quick general
+  troubleshooting questions. Use `GitHub discussions <https://github.com/apache/airflow/discussions>`__
+  if you are looking for a longer discussion and have more information to share.
+* If you can provide a description of a reproducible problem with the Airflow software, you can open
+  an issue at `GitHub issues <https://github.com/apache/airflow/issues>`__
+
+
+Using Managed Airflow Services
+''''''''''''''''''''''''''''''
+
+Follow the  `Ecosystem <https://airflow.apache.org/ecosystem/>`__ page to find all Managed Services for Airflow.
+
+**When this option works best**
+
+* When you prefer to have someone else manage Airflow installation for you, there are Managed Airflow Services
+  that you can use.
+
+**Intended users**
+
+* Users who prefer to get Airflow managed for them and want to pay for it.
+
+**What are you expected to handle**
+
+* The Managed Services usually provide everything you need to run Airflow. Please refer to documentation of
+  the Managed Services for details.
+
+**What Apache Airflow Community provides for that method**
+
+* The Airflow Community does not provide any specific documentation for managed services.
+  Please refer to the documentation of the Managed Services for details.
+
+**Where to ask for help**
+
+* Your first choice should be the support that is provided by the Managed Services. There are a few
+  channels in the Apache Airflow Slack that are dedicated to different groups of users, and if you have
+  come to the conclusion that the question is more related to Airflow than to the managed service,
+  you can use those channels.
+
+Using 3rd-party images, charts, deployments
+'''''''''''''''''''''''''''''''''''''''''''
+
+Follow the  `Ecosystem <https://airflow.apache.org/ecosystem/>`__ page to find all 3rd-party deployment options.
+
+**When this option works best**
+
+* Those installation methods are useful in case none of the official methods mentioned before work for you,
+  or you have historically used those. However, whenever you consider any change, it is recommended that
+  you switch to one of the methods that are officially supported by the Apache Airflow
+  Community or Managed Services.
+
+**Intended users**
+
+* Users who historically used other installation methods or find the official methods not sufficient for other reasons.
+
+**What are you expected to handle**
+
+* Depends on what the 3rd-party provides. Look at the documentation of the 3rd-party.
+
+**What Apache Airflow Community provides for that method**
+
+* The Airflow Community does not provide any specific documentation for 3rd-party methods.
+  Please refer to the documentation of the 3rd-party deployment you use for details.
+
+**Where to ask for help**
+
+* Depends on what the 3rd-party provides. Look at the documentation of the 3rd-party deployment you use.
diff --git a/docs/apache-airflow/installation/installing-from-pypi.rst b/docs/apache-airflow/installation/installing-from-pypi.rst
new file mode 100644
index 0000000..b6c1eab
--- /dev/null
+++ b/docs/apache-airflow/installation/installing-from-pypi.rst
@@ -0,0 +1,207 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Installation from PyPI
+----------------------
+
+.. contents:: :local:
+
+
+This page describes installations using the ``apache-airflow`` package `published in
+PyPI <https://pypi.org/project/apache-airflow/>`__.
+
+Installation tools
+''''''''''''''''''
+
+Only ``pip`` installation is currently officially supported.
+
+While there are some successes with using other tools like `poetry <https://python-poetry.org/>`_ or
+`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the same workflow as
+``pip`` - especially when it comes to constraint vs. requirements management.
+Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If you wish to install Airflow
+using those tools, you should use the constraints and convert them to the appropriate
+format and workflow that your tool requires.
+
+A typical command to install Airflow from PyPI looks like the one below:
+
+.. code-block::
+
+    pip install "apache-airflow[celery]==|version|" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-|version|/constraints-3.6.txt"
+
+This is just an example; see the following sections for more explanation.
+
+.. _installation:constraints:
+
+Constraints files
+'''''''''''''''''
+
+Airflow installation can sometimes be tricky because Airflow is a bit of both a library and an application.
+Libraries usually keep their dependencies open and applications usually pin them, but we should do neither
+and both at the same time. We decided to keep our dependencies as open as possible
+(in ``setup.cfg`` and ``setup.py``) so users can install different
+versions of libraries if needed. This means that from time to time a plain ``pip install apache-airflow`` will
+not work or will produce an unusable Airflow installation.
+
+In order to have a repeatable installation, we also keep a set of "known-to-be-working" constraint files in the
+``constraints-main``, ``constraints-2-0``, ``constraints-2-1`` etc. orphan branches, and then we create a tag
+for each released version, e.g. :subst-code:`constraints-|version|`. This way, we keep a tested and working set of dependencies.
+
+Those "known-to-be-working" constraints are per major/minor Python version. You can use them as constraint
+files when installing Airflow from PyPI. Note that you have to specify the correct Airflow version
+and Python version in the URL.
+
+You can create the URL to the file substituting the variables in the template below.
+
+.. code-block::
+
+  https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt
+
+where:
+
+- ``AIRFLOW_VERSION`` - Airflow version (e.g. :subst-code:`|version|`), or ``main``, ``2-0`` etc. for the latest development version
+- ``PYTHON_VERSION`` - Python version, e.g. ``3.8``, ``3.7``
+
+There is also a no-providers constraint file, which contains just the constraints required to install the Airflow core. This allows
+you to install and upgrade Airflow separately and independently from the providers.
+
+You can create the URL to the file substituting the variables in the template below.
+
+.. code-block::
+
+  https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-no-providers-${PYTHON_VERSION}.txt
+
+Installation and upgrade scenarios
+''''''''''''''''''''''''''''''''''
+
+In order to simplify the installation, we have prepared examples of how to upgrade Airflow and providers.
+
+Installing Airflow with extras and providers
+============================================
+
+If you need to install extra dependencies of Airflow, you can use the script below to make the installation
+a one-liner (the example below installs the ``postgres`` and ``google`` providers, as well as the ``async`` extra).
+
+.. code-block:: bash
+    :substitutions:
+
+    AIRFLOW_VERSION=|version|
+    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
+    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
+    pip install "apache-airflow[async,postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
+
+Note that this will install the versions of providers that were available at the moment this version of Airflow
+was prepared. You need to follow the next steps if you want to upgrade provider packages that were
+released afterwards.
+
+
+Upgrading Airflow with providers
+================================
+
+You can upgrade Airflow together with its extras (the providers available at the time of the release of the
+Airflow version being installed).
+
+
+.. code-block:: bash
+    :substitutions:
+
+    AIRFLOW_VERSION=|version|
+    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
+    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
+    pip install --upgrade "apache-airflow[postgres,google]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
+
+Installation and upgrading of Airflow providers separately
+==========================================================
+
+You can manually install all the providers you need. You can continue using the "providers" constraint files,
+but the 'versioned' Airflow constraints install only the versions of providers that were available in PyPI at
+the time the Airflow version was prepared. However, you can usually use the "main" version of the constraints
+to install the latest versions of the providers. The providers usually work with most versions of Airflow; if there
+are any incompatibilities, they are captured as package dependencies.
+
+.. code-block:: bash
+
+    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
+    # For example: 3.6
+    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-${PYTHON_VERSION}.txt"
+    pip install "apache-airflow-providers-google" --constraint "${CONSTRAINT_URL}"
+
+You can also upgrade the providers to the latest versions (you need to use the ``main`` version of the constraints for that):
+
+.. code-block:: bash
+
+    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
+    # For example: 3.6
+    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-${PYTHON_VERSION}.txt"
+    pip install "apache-airflow-providers-google" --upgrade --constraint "${CONSTRAINT_URL}"
+
+
+Installation and upgrade of Airflow core
+========================================
+
+If you don't want to install any extra providers, you can initially use the set of commands below.
+
+.. code-block:: bash
+    :substitutions:
+
+    AIRFLOW_VERSION=|version|
+    PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
+    # For example: 3.6
+    CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-no-providers-${PYTHON_VERSION}.txt"
+    # For example: https://raw.githubusercontent.com/apache/airflow/constraints-|version|/constraints-no-providers-3.6.txt
+    pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
+
+
+Troubleshooting
+'''''''''''''''
+
+This section describes how to troubleshoot issues when installing from PyPI.
+
+Airflow command is not recognized
+=================================
+
+If the ``airflow`` command is not getting recognized (can happen on Windows when using WSL), then
+ensure that ``~/.local/bin`` is in your ``PATH`` environment variable, and add it in if necessary:
+
+.. code-block:: bash
+
+    PATH=$PATH:~/.local/bin
+
+You can also start Airflow with ``python -m airflow``.
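+
+To make the ``PATH`` change permanent - a sketch assuming a Bash shell; adapt the startup file to
+the shell you use:
+
+.. code-block:: bash
+
+    # Persist the PATH change for future shell sessions (bash assumed)
+    echo 'export PATH="$PATH:$HOME/.local/bin"' >> ~/.bashrc
+    source ~/.bashrc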
+
+Symbol not found: ``_Py_GetArgcArgv``
+=====================================
+
+If you see ``Symbol not found: _Py_GetArgcArgv`` while starting or importing Airflow, this may mean that you are using an incompatible version of Python.
+For a homebrew installed version of Python, this is generally caused by using Python in ``/usr/local/opt/bin`` rather than the Frameworks installation (e.g. for ``python 3.7``: ``/usr/local/opt/python@3.7/Frameworks/Python.framework/Versions/3.7``).
+
+The crux of the issue is that a library Airflow depends on, ``setproctitle``, uses a non-public Python API
+which is not available from the standard installation ``/usr/local/opt/`` (which symlinks to a path under ``/usr/local/Cellar``).
+
+An easy fix is just to ensure you use a version of Python that has a dylib of the Python library available. For example:
+
+.. code-block:: bash
+
+  # Note: these instructions are for python3.7 but can be loosely modified for other versions
+  brew install python@3.7
+  virtualenv -p /usr/local/opt/python@3.7/Frameworks/Python.framework/Versions/3.7/bin/python3 .toy-venv
+  source .toy-venv/bin/activate
+  pip install apache-airflow
+  python
+  >>> import setproctitle
+  # Success!
+
+Alternatively, you can download and install Python directly from the `Python website <https://www.python.org/>`__.
diff --git a/docs/apache-airflow/installing-from-sources.rst b/docs/apache-airflow/installation/installing-from-sources.rst
similarity index 98%
rename from docs/apache-airflow/installing-from-sources.rst
rename to docs/apache-airflow/installation/installing-from-sources.rst
index a730078..fad080c 100644
--- a/docs/apache-airflow/installing-from-sources.rst
+++ b/docs/apache-airflow/installation/installing-from-sources.rst
@@ -28,8 +28,7 @@ Released packages
 
     This page describes downloading and verifying ``Apache Airflow`` version
     ``{{ airflow_version }}`` using officially released packages.
-    You can also install ``Apache Airflow`` - as most Python packages - via
-    `PyPI <https://pypi.org/project/apache-airflow/{{ airflow_version }}>`__ .
+    You can also install ``Apache Airflow`` - like most Python packages - via :doc:`PyPI <installing-from-pypi>`.
     You can choose different version of Airflow by selecting different version from the drop-down at
     the top-left of the page.
 
diff --git a/docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst b/docs/apache-airflow/installation/prerequisites.rst
similarity index 51%
copy from docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst
copy to docs/apache-airflow/installation/prerequisites.rst
index 02cb178..4f55d9e 100644
--- a/docs/apache-airflow-providers-microsoft-azure/operators/_partials/prerequisite_tasks.rst
+++ b/docs/apache-airflow/installation/prerequisites.rst
@@ -15,20 +15,28 @@
     specific language governing permissions and limitations
     under the License.
 
+Prerequisites
+-------------
 
+Airflow is tested with:
 
-To use these operators, you must do a few things:
+* Python: 3.6, 3.7, 3.8, 3.9
 
-  * Create necessary resources using `AZURE PORTAL`_ or `AZURE CLI`_.
-  * Install API libraries via **pip**.
+* Databases:
 
-    .. code-block:: bash
+  * PostgreSQL:  9.6, 10, 11, 12, 13
+  * MySQL: 5.7, 8
+  * SQLite: 3.15.0+
 
-      pip install 'apache-airflow[azure]'
+* Kubernetes: 1.18.15, 1.19.7, 1.20.2
 
-    Detailed information is available :doc:`apache-airflow:installation`
+**Note:** MySQL 5.x versions are unable to run multiple schedulers, or have limitations with
+doing so -- please see: :doc:`/concepts/scheduler`. MariaDB is not tested or recommended.
 
-  * :doc:`Setup Connection </connections/azure>`.
+**Note:** SQLite is used in Airflow tests. Do not use it in production. We recommend
+using the latest stable version of SQLite for local development.
 
-.. _AZURE PORTAL: https://portal.azure.com
-.. _AZURE CLI: https://docs.microsoft.com/en-us/cli/azure/
+Starting with Airflow 2.1.2, Airflow is tested with Python 3.6, 3.7, 3.8, and 3.9.
+
+The minimum amount of memory we recommend for running Airflow is 4GB, but the actual requirements depend
+heavily on the deployment options you choose.
diff --git a/docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst b/docs/apache-airflow/installation/setting-up-the-database.rst
similarity index 50%
copy from docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst
copy to docs/apache-airflow/installation/setting-up-the-database.rst
index a509ecd..9ac64ea 100644
--- a/docs/apache-airflow-providers-google/operators/_partials/prerequisite_tasks.rst
+++ b/docs/apache-airflow/installation/setting-up-the-database.rst
@@ -15,19 +15,18 @@
     specific language governing permissions and limitations
     under the License.
 
+Setting up the database
+-----------------------
 
+Airflow requires a database. If you're just experimenting and learning Airflow, you can stick with the
+default SQLite option. If you don't want to use SQLite, then take a look at
+:doc:`/howto/set-up-database` to set up a different database.
 
-To use these operators, you must do a few things:
+Usually, you need to run ``airflow db upgrade`` in order to create the database schema that Airflow can use.
 
-  * Select or create a Cloud Platform project using the `Cloud Console <https://console.cloud.google.com/project>`__.
-  * Enable billing for your project, as described in the `Google Cloud documentation <https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project>`__.
-  * Enable the API, as described in the `Cloud Console documentation <https://cloud.google.com/apis/docs/enable-disable-apis>`__.
-  * Install API libraries via **pip**.
+Similarly, upgrading Airflow usually requires an extra step of upgrading the database. This is done
+with the ``airflow db upgrade`` CLI command. You should make sure that Airflow components are
+not running while the upgrade is being executed.
 
-    .. code-block:: bash
-
-      pip install 'apache-airflow[google]'
-
-    Detailed information is available for :doc:`Installation <apache-airflow:installation>`.
-
-  * :doc:`Setup a Google Cloud Connection </connections/gcp>`.
+In some deployments, such as :doc:`helm-chart:index`, both initializing the database and running the database migrations
+are executed automatically when Airflow is upgraded.
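+
+As a sketch of the manual flow (stop the Airflow components first, as noted above):
+
+.. code-block:: bash
+
+    # Initialize the schema on a fresh installation
+    airflow db init
+    # Apply schema migrations after upgrading the Airflow packages
+    airflow db upgrade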
diff --git a/docs/apache-airflow/installation/supported-versions.rst b/docs/apache-airflow/installation/supported-versions.rst
new file mode 100644
index 0000000..2e1ee97
--- /dev/null
+++ b/docs/apache-airflow/installation/supported-versions.rst
@@ -0,0 +1,67 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Supported versions
+------------------
+
+Version Life Cycle
+``````````````````
+
+Apache Airflow version life cycle:
+
++---------+-----------------+---------------+-----------------+----------------+
+| Version | State           | First Release | Limited Support | EOL/Terminated |
++---------+-----------------+---------------+-----------------+----------------+
+| 2       | Supported       | Dec 17, 2020  | Dec 2021        | TBD            |
++---------+-----------------+---------------+-----------------+----------------+
+| 1.10    | Limited Support | Aug 27, 2018  | Dec 17, 2020    | June 2021      |
++---------+-----------------+---------------+-----------------+----------------+
+| 1.9     | EOL             | Jan 03, 2018  | Aug 27, 2018    | Aug 2018       |
++---------+-----------------+---------------+-----------------+----------------+
+| 1.8     | EOL             | Mar 19, 2017  | Jan 03, 2018    | Jan 2018       |
++---------+-----------------+---------------+-----------------+----------------+
+| 1.7     | EOL             | Mar 28, 2016  | Mar 19, 2017    | Mar 2017       |
++---------+-----------------+---------------+-----------------+----------------+
+
+Limited support versions will be supported with security and critical bug fixes only.
+EOL versions will not get any fixes or support.
+We **highly** recommend installing the latest Airflow release, which has richer features.
+
+
+Support for Python and Kubernetes versions
+``````````````````````````````````````````
+
+As of Airflow 2.0 we agreed to certain rules we follow for Python and Kubernetes support.
+They are based on the official release schedule of Python and Kubernetes, nicely summarized in the
+`Python Developer's Guide <https://devguide.python.org/#status-of-python-branches>`_ and
+`Kubernetes version skew policy <https://kubernetes.io/docs/setup/release/version-skew-policy>`_.
+
+1. We drop support for Python and Kubernetes versions when they reach EOL. We drop support for those
+   EOL versions in main right after the EOL date, and it is effectively removed when we release the
+   first new MINOR (or MAJOR, if there is no new MINOR version) version of Airflow.
+   For example, for Python 3.6 this means that we drop support in main right after 23.12.2021, and the first
+   MAJOR or MINOR version of Airflow released afterwards will not have it.
+
+2. The "oldest" supported version of Python/Kubernetes is the default one. "Default" is only meaningful
+   in terms of the "smoke tests" in CI PRs, which are run using this default version and the default reference
+   image available in DockerHub. Currently the ``apache/airflow:latest`` and ``apache/airflow:2.0.2`` images
+   are both Python 3.6 images; however, the first MINOR/MAJOR release of Airflow after 23.12.2021 will
+   use Python 3.7 images.
+
+3. We support a new version of Python/Kubernetes in main after it is officially released; as soon as we
+   make it work in our CI pipeline (which might not be immediate, mostly due to dependencies catching up with
+   new versions of Python), we release new images and support for it in Airflow based on the working CI setup.
diff --git a/docs/apache-airflow/redirects.txt b/docs/apache-airflow/redirects.txt
index acfd694..5e641fc 100644
--- a/docs/apache-airflow/redirects.txt
+++ b/docs/apache-airflow/redirects.txt
@@ -49,3 +49,8 @@ rest-api-ref.rst deprecated-rest-api-ref.rst
 concepts.rst concepts/index.rst
 smart-sensor.rst concepts/smart-sensors.rst
 scheduler.rst concepts/scheduler.rst
+
+# Installation
+installation.rst installation/index.rst
+upgrade-check.rst installation/upgrade-check.rst
+upgrading-to-2.rst upgrading-from-1-10/index.rst
diff --git a/docs/apache-airflow/upgrading-to-2.rst b/docs/apache-airflow/upgrading-from-1-10/index.rst
similarity index 85%
rename from docs/apache-airflow/upgrading-to-2.rst
rename to docs/apache-airflow/upgrading-from-1-10/index.rst
index 96293ae..7f4e3d9 100644
--- a/docs/apache-airflow/upgrading-to-2.rst
+++ b/docs/apache-airflow/upgrading-from-1-10/index.rst
@@ -16,19 +16,25 @@
    under the License.
 
 
-Upgrading to Airflow 2.0+
--------------------------
+Upgrading from 1.10 to 2
+------------------------
+
+.. toctree::
+    :hidden:
+
+    upgrade-check
 
 .. contents:: :local:
 
-Apache Airflow 2.0 is a major release and the purpose of this document is to assist
-users to migrate from Airflow 1.10.x to Airflow 2.0.
+Apache Airflow 2 is a major release and the purpose of this document is to assist
+users in migrating from Airflow 1.10.x to Airflow 2.
 
-Step 1: Upgrade to Python 3
-'''''''''''''''''''''''''''
+Step 1: Switch to Python 3
+''''''''''''''''''''''''''
 
-Airflow 1.10 will be the last release series to support Python 2. Airflow 2.0.0
-requires Python 3.6+ and has been tested with Python versions 3.6, 3.7 and 3.8, but does not yet support Python 3.9.
+Airflow 1.10 was the last release series to support Python 2. Airflow 2.0.0
+requires Python 3.6+ and has been tested with Python versions 3.6, 3.7 and 3.8.
+Python 3.9 support was added starting with Airflow 2.1.2.
 
 If you have a specific task that still requires Python 2 then you can use the :class:`~airflow.operators.python.PythonVirtualenvOperator` or the ``KubernetesPodOperator`` for this.
 
@@ -37,17 +43,17 @@ For a list of breaking changes between Python 2 and Python 3, please refer to th
 from the CouchBaseDB team.
 
 
-Step 2: Upgrade to Airflow 1.10.15 (a.k.a our "bridge" release)
-'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
+Step 2: Upgrade to 1.10.15
+''''''''''''''''''''''''''
 
-To minimize friction for users upgrading from Airflow 1.10 to Airflow 2.0 and beyond, Airflow 1.10.15 "a bridge release" has
-been created. This is intended to be the final 1.10 feature release. Airflow 1.10.15 includes support for various features
+To minimize friction for users upgrading from Airflow 1.10 to Airflow 2.0 and beyond, Airflow 1.10.15, a.k.a. the "bridge release", has
+been created. This is the final 1.10 feature release. Airflow 1.10.15 includes support for various features
 that have been backported from Airflow 2.0 to make it easy for users to test their Airflow
 environment before upgrading to Airflow 2.0.
 
 We strongly recommend that all users upgrading to Airflow 2.0, first
 upgrade to Airflow 1.10.15 and test their Airflow deployment and only then upgrade to Airflow 2.0.
-The Airflow 1.10.x release tree will be supported for six months from Airflow 2.0 release date.
+Airflow 1.10.x reached end of life on 17 June 2021. No new Airflow 1.x versions will be released.
 
 Features in 1.10.15 include:
 
@@ -70,8 +76,8 @@ simply run the following command:
 Once you have performed this step, simply write out the file path to this file in the ``pod_template_file`` config of the ``kubernetes``
 section of your ``airflow.cfg``
 
-Step 3: Install and run the Upgrade check scripts
-'''''''''''''''''''''''''''''''''''''''''''''''''
+Step 3: Run the Upgrade check scripts
+'''''''''''''''''''''''''''''''''''''
 
 After upgrading to Airflow 1.10.15, we recommend that you install the "upgrade check" scripts. These scripts will read through your ``airflow.cfg`` and all of your DAGs and will give a detailed report of all changes required before upgrading. We are testing this script diligently, and our goal is that any Airflow setup that can pass these tests will be able to upgrade to 2.0 without any issues.
 
@@ -88,8 +94,8 @@ Once this is installed, please run the upgrade check script.
 More details about this process are here :ref:`Upgrade Check Scripts<upgrade-check>`.
 
 
-Step 4: Import Operators from Backport Providers
-''''''''''''''''''''''''''''''''''''''''''''''''
+Step 4: Switch to Backport Providers
+''''''''''''''''''''''''''''''''''''
 
 Now that you are set up in Airflow 1.10.15 with a Python 3.6+ environment, you are ready to start porting your DAGs to Airflow 2.0 compliance!
 
@@ -146,14 +152,14 @@ The behavior can be reverted when instantiating a DAG.
 
     import jinja2
 
-    dag = DAG('simple_dag', template_undefined=jinja2.Undefined)
+    dag = DAG("simple_dag", template_undefined=jinja2.Undefined)
 
 Alternatively, it is also possible to override each Jinja Template variable on an individual basis
 by using the ``| default`` Jinja filter as shown below.
 
 .. code-block:: python
 
-    {{ a | default(1) }}
+    {{a | default(1)}}
 
 
 
@@ -174,23 +180,18 @@ Whereas previously a user would import each individual class to build the pod as
     from airflow.kubernetes.volume_mount import VolumeMount
 
 
-    volume_config = {
-        'persistentVolumeClaim': {
-            'claimName': 'test-volume'
-        }
-    }
-    volume = Volume(name='test-volume', configs=volume_config)
-    volume_mount = VolumeMount('test-volume',
-                               mount_path='/root/mount_file',
-                               sub_path=None,
-                               read_only=True)
+    volume_config = {"persistentVolumeClaim": {"claimName": "test-volume"}}
+    volume = Volume(name="test-volume", configs=volume_config)
+    volume_mount = VolumeMount(
+        "test-volume", mount_path="/root/mount_file", sub_path=None, read_only=True
+    )
 
-    port = Port('http', 80)
-    secret_file = Secret('volume', '/etc/sql_conn', 'airflow-secrets', 'sql_alchemy_conn')
-    secret_env = Secret('env', 'SQL_CONN', 'airflow-secrets', 'sql_alchemy_conn')
+    port = Port("http", 80)
+    secret_file = Secret("volume", "/etc/sql_conn", "airflow-secrets", "sql_alchemy_conn")
+    secret_env = Secret("env", "SQL_CONN", "airflow-secrets", "sql_alchemy_conn")
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo", "10"],
@@ -219,23 +220,25 @@ Now the user can use the ``kubernetes.client.models`` class as a single point of
     from airflow.kubernetes.secret import Secret
 
 
-    configmaps = ['test-configmap-1', 'test-configmap-2']
+    configmaps = ["test-configmap-1", "test-configmap-2"]
 
     volume = k8s.V1Volume(
-        name='test-volume',
-        persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(claim_name='test-volume'),
+        name="test-volume",
+        persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(
+            claim_name="test-volume"
+        ),
     )
 
-    port = k8s.V1ContainerPort(name='http', container_port=80)
-    secret_file = Secret('volume', '/etc/sql_conn', 'airflow-secrets', 'sql_alchemy_conn')
-    secret_env = Secret('env', 'SQL_CONN', 'airflow-secrets', 'sql_alchemy_conn')
-    secret_all_keys = Secret('env', None, 'airflow-secrets-2')
+    port = k8s.V1ContainerPort(name="http", container_port=80)
+    secret_file = Secret("volume", "/etc/sql_conn", "airflow-secrets", "sql_alchemy_conn")
+    secret_env = Secret("env", "SQL_CONN", "airflow-secrets", "sql_alchemy_conn")
+    secret_all_keys = Secret("env", None, "airflow-secrets-2")
     volume_mount = k8s.V1VolumeMount(
-        name='test-volume', mount_path='/root/mount_file', sub_path=None, read_only=True
+        name="test-volume", mount_path="/root/mount_file", sub_path=None, read_only=True
     )
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo", "10"],
@@ -247,7 +250,8 @@ Now the user can use the ``kubernetes.client.models`` class as a single point of
         name="airflow-test-pod",
         task_id="task",
         is_delete_operator_pod=True,
-        hostnetwork=False)
+        hostnetwork=False,
+    )
 
 
 We decided to keep the Secret class as users seem to really like that it simplifies the complexity of mounting
@@ -297,7 +301,7 @@ When DAGs are initialized with the ``access_control`` variable set, any usage of
     as it is the only default UI.
 
     If you previously used non-RBAC UI, you have to switch to the new RBAC-UI and create users to be able
-    to access Airflow's webserver. For more details on CLI to create users see :doc:`cli-and-env-variables-ref`
+    to access Airflow's webserver. For more details on CLI to create users see :doc:`/cli-and-env-variables-ref`
 
 Please note that custom auth backends will need re-writing to target new FAB based UI.
 
@@ -338,7 +342,7 @@ the only supported UI.
     to use randomly generated secret keys instead of an insecure default and may break existing
     deployments that rely on the default.
 
-The ``flask-ouathlib`` has been replaced with ``authlib`` because ``flask-outhlib`` has
+The ``flask-oauthlib`` has been replaced with ``authlib`` because ``flask-oauthlib`` has
 been deprecated in favor of ``authlib``.
 The Old and New provider configuration keys that have changed are as follows
 
@@ -445,9 +449,9 @@ While in the deprecated version a user would mount a volume using the following
                         "mountPath": "/foo/",
                         "name": "example-kubernetes-test-volume",
                     },
-                ]
+                ],
             }
-        }
+        },
     )
 
 In the new model a user can accomplish the same thing using the following code under the ``pod_override`` key:
@@ -459,38 +463,37 @@ In the new model a user can accomplish the same thing using the following code u
     second_task = PythonOperator(
         task_id="four_task",
         python_callable=test_volume_mount,
-        executor_config={"pod_override": k8s.V1Pod(
-            spec=k8s.V1PodSpec(
-                containers=[
-                    k8s.V1Container(
-                        name="base",
-                        volume_mounts=[
-                            k8s.V1VolumeMount(
-                                mount_path="/foo/",
-                                name="example-kubernetes-test-volume"
-                            )
-                        ]
-                    )
-                ],
-                volumes=[
-                    k8s.V1Volume(
-                        name="example-kubernetes-test-volume",
-                        host_path=k8s.V1HostPathVolumeSource(
-                            path="/tmp/"
+        executor_config={
+            "pod_override": k8s.V1Pod(
+                spec=k8s.V1PodSpec(
+                    containers=[
+                        k8s.V1Container(
+                            name="base",
+                            volume_mounts=[
+                                k8s.V1VolumeMount(
+                                    mount_path="/foo/",
+                                    name="example-kubernetes-test-volume",
+                                )
+                            ],
                         )
-                    )
-                ]
+                    ],
+                    volumes=[
+                        k8s.V1Volume(
+                            name="example-kubernetes-test-volume",
+                            host_path=k8s.V1HostPathVolumeSource(path="/tmp/"),
+                        )
+                    ],
+                )
             )
-        )
-        }
+        },
     )
 
 For Airflow 2.0, the traditional ``executor_config`` will continue operation with a deprecation warning,
 but will be removed in a future version.
 
 
-Step 7: Upgrade to Airflow 2.0
-'''''''''''''''''''''''''''''''
+Step 7: Upgrade to Airflow 2
+''''''''''''''''''''''''''''
 
 After running the upgrade checks as described above, installing the backported providers, modifying
 the DAGs to be compatible, and updating the configuration settings, you should be ready to upgrade to Airflow 2.0.
@@ -531,13 +534,6 @@ At this point, just follow the standard Airflow version upgrade process:
 
 * Restart Airflow Scheduler, Webserver, and Workers
 
-
-Frequently Asked Questions on Upgrade
-'''''''''''''''''''''''''''''''''''''
-* Q. Why doesn't the list of connection types show up in the Airflow UI after I upgrade to 2.0?
-  * A. It is because Airflow 2.0 does not ship with the provider packages. The connection type list in the Airflow UI is based on the providers you have installed with Airflow 2.0. Please note that these will only show up once you install the provider and restart Airflow. You can read more about providers at :doc:`apache-airflow-providers:index`.
-
-
 Appendix
 ''''''''
 
@@ -551,9 +547,10 @@ Before:
 .. code-block:: python
 
     from airflow.kubernetes.pod import Port
-    port = Port('http', 80)
+
+    port = Port("http", 80)
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -566,9 +563,10 @@ After:
 .. code-block:: python
 
     from kubernetes.client import models as k8s
-    port = k8s.V1ContainerPort(name='http', container_port=80)
+
+    port = k8s.V1ContainerPort(name="http", container_port=80)
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -583,12 +581,12 @@ Before:
 .. code-block:: python
 
     from airflow.kubernetes.volume_mount import VolumeMount
-    volume_mount = VolumeMount('test-volume',
-                               mount_path='/root/mount_file',
-                               sub_path=None,
-                               read_only=True)
+
+    volume_mount = VolumeMount(
+        "test-volume", mount_path="/root/mount_file", sub_path=None, read_only=True
+    )
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -601,11 +599,12 @@ After:
 .. code-block:: python
 
     from kubernetes.client import models as k8s
+
     volume_mount = k8s.V1VolumeMount(
-        name='test-volume', mount_path='/root/mount_file', sub_path=None, read_only=True
+        name="test-volume", mount_path="/root/mount_file", sub_path=None, read_only=True
     )
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -622,14 +621,10 @@ Before:
 
     from airflow.kubernetes.volume import Volume
 
-    volume_config = {
-        'persistentVolumeClaim': {
-            'claimName': 'test-volume'
-    }
-    }
-    volume = Volume(name='test-volume', configs=volume_config)
+    volume_config = {"persistentVolumeClaim": {"claimName": "test-volume"}}
+    volume = Volume(name="test-volume", configs=volume_config)
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -642,12 +637,15 @@ After:
 .. code-block:: python
 
     from kubernetes.client import models as k8s
+
     volume = k8s.V1Volume(
-        name='test-volume',
-        persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(claim_name='test-volume'),
+        name="test-volume",
+        persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(
+            claim_name="test-volume"
+        ),
     )
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -662,7 +660,7 @@ Before:
 .. code-block:: python
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -677,17 +675,12 @@ After:
     from kubernetes.client import models as k8s
 
     env_vars = [
-        k8s.V1EnvVar(
-            name="ENV1",
-            value="val1"
-        ),
-        k8s.V1EnvVar(
-            name="ENV2",
-            value="val2"
-        )]
+        k8s.V1EnvVar(name="ENV1", value="val1"),
+        k8s.V1EnvVar(name="ENV2", value="val2"),
+    ]
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -707,7 +700,7 @@ Before:
     from airflow.kubernetes.pod_runtime_info_env import PodRuntimeInfoEnv
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -725,15 +718,13 @@ After:
         k8s.V1EnvVar(
             name="ENV3",
             value_from=k8s.V1EnvVarSource(
-                field_ref=k8s.V1ObjectFieldSelector(
-                    field_path="status.podIP"
-                )
-            )
+                field_ref=k8s.V1ObjectFieldSelector(field_path="status.podIP")
+            ),
         )
     ]
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -750,12 +741,12 @@ Before:
 .. code-block:: python
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
-        configmaps=['test-configmap'],
-        task_id="task"
+        configmaps=["test-configmap"],
+        task_id="task",
     )
 
 
@@ -765,20 +756,18 @@ After:
 
     from kubernetes.client import models as k8s
 
-    configmap ="test-configmap"
-    env_from = [k8s.V1EnvFromSource(
-                    config_map_ref=k8s.V1ConfigMapEnvSource(
-                        name=configmap
-                    )
-                )]
+    configmap = "test-configmap"
+    env_from = [
+        k8s.V1EnvFromSource(config_map_ref=k8s.V1ConfigMapEnvSource(name=configmap))
+    ]
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
         env_from=env_from,
-        task_id="task"
+        task_id="task",
     )
 
 
@@ -789,15 +778,15 @@ Before:
 .. code-block:: python
 
     resources = {
-        'limit_cpu': 0.25,
-        'limit_memory': '64Mi',
-        'limit_ephemeral_storage': '2Gi',
-        'request_cpu': '250m',
-        'request_memory': '64Mi',
-        'request_ephemeral_storage': '1Gi',
+        "limit_cpu": 0.25,
+        "limit_memory": "64Mi",
+        "limit_ephemeral_storage": "2Gi",
+        "request_cpu": "250m",
+        "request_memory": "64Mi",
+        "request_ephemeral_storage": "1Gi",
     }
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -815,21 +804,17 @@ After:
 
     from kubernetes.client import models as k8s
 
-    resources=k8s.V1ResourceRequirements(
-        requests={
-            'memory': '64Mi',
-            'cpu': '250m',
-            'ephemeral-storage': '1Gi'
-        },
+    resources = k8s.V1ResourceRequirements(
+        requests={"memory": "64Mi", "cpu": "250m", "ephemeral-storage": "1Gi"},
         limits={
-            'memory': '64Mi',
-            'cpu': 0.25,
-            'nvidia.com/gpu': None,
-            'ephemeral-storage': '2Gi'
-        }
+            "memory": "64Mi",
+            "cpu": 0.25,
+            "nvidia.com/gpu": None,
+            "ephemeral-storage": "2Gi",
+        },
     )
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
@@ -849,14 +834,14 @@ Before:
 .. code-block:: python
 
     k = KubernetesPodOperator(
-        namespace='default',
+        namespace="default",
         image="ubuntu:16.04",
         cmds=["bash", "-cx"],
         arguments=["echo 10"],
         name="test",
         task_id="task",
         image_pull_secrets="fake-secret",
-        cluster_context='default'
+        cluster_context="default",
     )
 
 After:
@@ -864,9 +849,9 @@ After:
 .. code-block:: python
 
     quay_k8s = KubernetesPodOperator(
-        namespace='default',
-        image='quay.io/apache/bash',
-        image_pull_secrets=[k8s.V1LocalObjectReference('testquay')],
+        namespace="default",
+        image="quay.io/apache/bash",
+        image_pull_secrets=[k8s.V1LocalObjectReference("testquay")],
         cmds=["bash", "-cx"],
         name="airflow-private-image-pod",
         task_id="task-two",
@@ -1136,19 +1121,19 @@ non-RBAC UI (``flask-admin`` based UI), update it to use ``flask_appbuilder_view
 
 
     class TestView(BaseView):
-        @expose('/')
+        @expose("/")
         def test(self):
             # in this example, put your test_plugin/test.html template at airflow/plugins/templates/test_plugin/test.html
             return self.render("test_plugin/test.html", content="Hello galaxy!")
 
+
     v = TestView(category="Test Plugin", name="Test View")
 
     ml = MenuLink(
-        category='Test Plugin',
-        name='Test Menu Link',
-        url='https://airflow.apache.org/'
+        category="Test Plugin", name="Test Menu Link", url="https://airflow.apache.org/"
     )
 
+
     class AirflowTestPlugin(AirflowPlugin):
         admin_views = [v]
         menu_links = [ml]
@@ -1161,6 +1146,7 @@ non-RBAC UI (``flask-admin`` based UI), update it to use ``flask_appbuilder_view
     from airflow.plugins_manager import AirflowPlugin
     from flask_appbuilder import expose, BaseView as AppBuilderBaseView
 
+
     class TestAppBuilderBaseView(AppBuilderBaseView):
         default_view = "test"
 
@@ -1168,16 +1154,21 @@ non-RBAC UI (``flask-admin`` based UI), update it to use ``flask_appbuilder_view
         def test(self):
             return self.render_template("test_plugin/test.html", content="Hello galaxy!")
 
+
     v_appbuilder_view = TestAppBuilderBaseView()
-    v_appbuilder_package = {"name": "Test View",
-                            "category": "Test Plugin",
-                            "view": v_appbuilder_view}
+    v_appbuilder_package = {
+        "name": "Test View",
+        "category": "Test Plugin",
+        "view": v_appbuilder_view,
+    }
 
     # Creating a flask appbuilder Menu Item
-    appbuilder_mitem = {"name": "Google",
-                        "category": "Search",
-                        "category_icon": "fa-th",
-                        "href": "https://www.google.com"}
+    appbuilder_mitem = {
+        "name": "Google",
+        "category": "Search",
+        "category_icon": "fa-th",
+        "href": "https://www.google.com",
+    }
 
 
     # Defining the plugin class
@@ -1199,14 +1190,9 @@ depending on the development packages then you should use ``devel_all``.
 Support for Airflow 1.10.x releases
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-The Airflow 1.10.x release tree will be supported for **six months** from Airflow 2.0 release date.
-Specifically, only "critical fixes" defined as fixes
-to bugs that take down Production systems, will be backported to 1.10.x core for
-**six months** after Airflow 2.0.0 is released.
+Airflow 1.10.x reached end of life on 17 June 2021. No new Airflow 1.x versions will be released.
 
-In addition, Backport providers within
-1.10.x, will be supported for critical fixes for **three months** from Airflow 2.0.0
-release date.
+Support of Backport providers ended on 17 March 2021. No new versions of backport providers will be released.
 
 We plan to take a strict Semantic Versioning approach to our versioning and release process. This
 means that we do not plan to make any backwards-incompatible changes in the 2.* releases. Any
diff --git a/docs/apache-airflow/upgrade-check.rst b/docs/apache-airflow/upgrading-from-1-10/upgrade-check.rst
similarity index 100%
rename from docs/apache-airflow/upgrade-check.rst
rename to docs/apache-airflow/upgrading-from-1-10/upgrade-check.rst
diff --git a/docs/build_docs.py b/docs/build_docs.py
index 0f52211..6edd713 100755
--- a/docs/build_docs.py
+++ b/docs/build_docs.py
@@ -96,6 +96,12 @@ def _get_parser():
         '--disable-checks', dest='disable_checks', action='store_true', help='Disables extra checks'
     )
     parser.add_argument(
+        '--one-pass-only',
+        dest='one_pass_only',
+        action='store_true',
+        help='Do not attempt multiple builds on error',
+    )
+    parser.add_argument(
         "--package-filter",
         action="append",
         help=(
@@ -485,31 +491,32 @@ def main():
     if package_spelling_errors:
         all_spelling_errors.update(package_spelling_errors)
 
-    # Build documentation for some packages again if it can help them.
-    package_build_errors, package_spelling_errors = retry_building_docs_if_needed(
-        all_build_errors,
-        all_spelling_errors,
-        args,
-        docs_only,
-        for_production,
-        jobs,
-        package_build_errors,
-        package_spelling_errors,
-        spellcheck_only,
-    )
+    if not args.one_pass_only:
+        # Build documentation for some packages again if it can help them.
+        package_build_errors, package_spelling_errors = retry_building_docs_if_needed(
+            all_build_errors,
+            all_spelling_errors,
+            args,
+            docs_only,
+            for_production,
+            jobs,
+            package_build_errors,
+            package_spelling_errors,
+            spellcheck_only,
+        )
 
-    # And try again in case one change spans across three-level dependencies
-    retry_building_docs_if_needed(
-        all_build_errors,
-        all_spelling_errors,
-        args,
-        docs_only,
-        for_production,
-        jobs,
-        package_build_errors,
-        package_spelling_errors,
-        spellcheck_only,
-    )
+        # And try again in case one change spans across three-level dependencies
+        retry_building_docs_if_needed(
+            all_build_errors,
+            all_spelling_errors,
+            args,
+            docs_only,
+            for_production,
+            jobs,
+            package_build_errors,
+            package_spelling_errors,
+            spellcheck_only,
+        )
 
     if not disable_checks:
         general_errors = lint_checks.run_all_check()
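
A minimal usage sketch for the ``--one-pass-only`` flag added above, assuming the script is run from the repository root (the package filter value is only an illustration):

.. code-block:: bash

    # Build the docs for one package and skip the automatic retry passes
    python docs/build_docs.py --package-filter apache-airflow --one-pass-only
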
diff --git a/docs/conf.py b/docs/conf.py
index c3521ae..128714e 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -264,7 +264,8 @@ if PACKAGE_NAME == 'apache-airflow':
     ]
     # Replace "|version|" in links
     manual_substitutions_in_generated_html = [
-        "installation.html",
+        "installation/installing-from-pypi.html",
+        "installation/installing-from-sources.html",
     ]
 
 if PACKAGE_NAME == 'docker-stack':
diff --git a/docs/docker-stack/build.rst b/docs/docker-stack/build.rst
index f508b72..d8dbb36 100644
--- a/docs/docker-stack/build.rst
+++ b/docs/docker-stack/build.rst
@@ -391,8 +391,7 @@ You can also download any version of Airflow constraints and adapt it with your
 constraints and manually set your own versions of dependencies in your own constraints and use the version
 of constraints that you manually prepared.
 
-You can read more about constraints in the documentation of the
-`Installation <http://airflow.apache.org/docs/apache-airflow/stable/installation.html#constraints-files>`_
+You can read more about constraints in :doc:`apache-airflow:installation/installing-from-pypi`
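
A hedged sketch of the workflow described above, assuming Airflow 2.1.4 on Python 3.8 (the versions and the local file name are illustrative only):

.. code-block:: bash

    # Download the published constraints, adjust pins as needed, then install against them
    curl -L -o my-constraints.txt \
        "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.4/constraints-3.8.txt"
    pip install "apache-airflow==2.1.4" --constraint my-constraints.txt
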
 
 Note that if you place ``requirements.txt`` in the ``docker-context-files`` folder, it will be
 used to install all requirements declared there. It is recommended that the file
diff --git a/docs/exts/extra_files_with_substitutions.py b/docs/exts/extra_files_with_substitutions.py
index 6c2fa08..bc08e64 100644
--- a/docs/exts/extra_files_with_substitutions.py
+++ b/docs/exts/extra_files_with_substitutions.py
@@ -32,11 +32,13 @@ def copy_docker_compose(app, exception):
                 for line in file:
                     output_file.write(line.replace('|version|', app.config.version))
 
-    # Replace `|version|` in the installation.html that requires manual substitutions (in links)
+    # Replace `|version|` in the installation files that require manual substitutions (in links)
     for path in app.config.manual_substitutions_in_generated_html:
-        with open(os.path.join(app.outdir, os.path.basename(path))) as input_file:
+        with open(os.path.join(app.outdir, os.path.dirname(path), os.path.basename(path))) as input_file:
             content = input_file.readlines()
-        with open(os.path.join(app.outdir, os.path.basename(path)), "wt") as output_file:
+        with open(
+            os.path.join(app.outdir, os.path.dirname(path), os.path.basename(path)), "wt"
+        ) as output_file:
             for line in content:
                 output_file.write(line.replace('|version|', app.config.version))
 
diff --git a/docs/helm-chart/index.rst b/docs/helm-chart/index.rst
index 11a3edb..6910a73 100644
--- a/docs/helm-chart/index.rst
+++ b/docs/helm-chart/index.rst
@@ -27,29 +27,26 @@ Helm Chart for Apache Airflow
     Home <self>
     quick-start
     airflow-configuration
+    adding-connections-and-variables
     manage-dags-files
     manage-logs
     keda
     using-additional-containers
-    production-guide
-
-.. toctree::
-    :hidden:
-    :caption: References
-
-    Parameters <parameters-ref>
+    Installing from sources<installing-helm-chart-from-sources>
 
 .. toctree::
     :hidden:
-    :caption: Resources
+    :caption: Guides
 
-    Installing from sources<installing-helm-chart-from-sources>
+    production-guide
 
 .. toctree::
     :hidden:
-    :caption: Resources
+    :caption: References
 
-    Installing from sources<installing-helm-chart-from-sources>
+    Parameters <parameters-ref>
+    changelog
+    Updating <updating>
 
 
 This chart will bootstrap an `Airflow <https://airflow.apache.org>`__
@@ -60,7 +57,7 @@ Requirements
 ------------
 
 -  Kubernetes 1.14+ cluster
--  Helm 2.11+ or Helm 3.0+
+-  Helm 3.0+
 -  PV provisioner support in the underlying infrastructure (optionally)
 
 Features
@@ -84,13 +81,13 @@ Features
 Installing the Chart
 --------------------
 
-To install this repository from source (using helm 3)
+To install this chart using helm 3, run the following commands:
 
 .. code-block:: bash
 
     kubectl create namespace airflow
-    helm dep update
-    helm install airflow . --namespace airflow
+    helm repo add apache-airflow https://airflow.apache.org
+    helm install airflow apache-airflow/airflow --namespace airflow
 
 The command deploys Airflow on the Kubernetes cluster in the default configuration. The :doc:`parameters-ref`
 section lists the parameters that can be configured during installation.
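
As an optional sanity check after installing (release name and namespace follow the example above):

.. code-block:: bash

    # Confirm the Helm release exists and its pods are starting
    helm ls --namespace airflow
    kubectl get pods --namespace airflow
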
@@ -105,7 +102,10 @@ To upgrade the chart with the release name ``airflow``:
 
 .. code-block:: bash
 
-    helm upgrade airflow . --namespace airflow
+    helm upgrade airflow apache-airflow/airflow --namespace airflow
+
+.. note::
+  To upgrade to a new version of the chart, run ``helm repo update`` first.
 
 Uninstalling the Chart
 ----------------------

[airflow] 01/02: Improves installing from sources pages for all components (#18251)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3b796a11e53438e2e45e795718265e5f1b13c060
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Wed Sep 15 08:32:06 2021 +0200

    Improves installing from sources pages for all components (#18251)
    
    * Shorter menu sections for installation page
    * Added "installing from sources" for Helm Chart
    * Added Providers summary page for all provider packages
    * Added scripts to verify PyPI packages with gpg/sha
    
    (cherry picked from commit 67fddbf644eb6b11810addaf17d1be29c1bc39f6)
---
 docs/helm-chart/index.rst | 6 ++++++
 1 file changed, 6 insertions(+)
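
The commit message above mentions scripts to verify PyPI packages with gpg/sha; a minimal manual sketch of such a check (the file names are hypothetical) could be:

.. code-block:: bash

    # Verify the detached signature, then compare the SHA512 checksum by hand
    gpg --verify apache-airflow-2.1.4.tar.gz.asc apache-airflow-2.1.4.tar.gz
    shasum -a 512 apache-airflow-2.1.4.tar.gz
    cat apache-airflow-2.1.4.tar.gz.sha512
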

diff --git a/docs/helm-chart/index.rst b/docs/helm-chart/index.rst
index d43a70d..11a3edb 100644
--- a/docs/helm-chart/index.rst
+++ b/docs/helm-chart/index.rst
@@ -45,6 +45,12 @@ Helm Chart for Apache Airflow
 
     Installing from sources<installing-helm-chart-from-sources>
 
+.. toctree::
+    :hidden:
+    :caption: Resources
+
+    Installing from sources<installing-helm-chart-from-sources>
+
 
 This chart will bootstrap an `Airflow <https://airflow.apache.org>`__
 deployment on a `Kubernetes <http://kubernetes.io>`__ cluster using the