Posted to commits@airflow.apache.org by po...@apache.org on 2020/08/02 07:35:59 UTC
[airflow] branch master updated: Combine entries in UPDATING.md file (#10102)
This is an automated email from the ASF dual-hosted git repository.
potiuk pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/master by this push:
new 5fe7da9 Combine entries in UPDATING.md file (#10102)
5fe7da9 is described below
commit 5fe7da928ab40193ed27bd0cfaea2ef4534cb7c2
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Sun Aug 2 09:35:19 2020 +0200
Combine entries in UPDATING.md file (#10102)
---
UPDATING.md | 72 +++++++++++++++++++++++++++++--------------------------------
1 file changed, 34 insertions(+), 38 deletions(-)
diff --git a/UPDATING.md b/UPDATING.md
index eb18d99..823b2d7 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -1545,25 +1545,40 @@ We standardised the Extras names and synchronized providers package names with t
We deprecated a number of extras in 2.0.
-| Deprecated extras | New extras |
-|-------------------|------------------|
-| atlas | apache.atlas |
-| aws | amazon |
-| azure | microsoft.azure |
-| cassandra | apache.cassandra |
-| druid | apache.druid |
-| gcp | google |
-| gcp_api | google |
-| hdfs | apache.hdfs |
-| hive | apache.hive |
-| kubernetes | cncf.kubernetes |
-| mssql | microsoft.mssql |
-| pinot | apache.pinot |
-| webhdfs | apache.webhdfs |
-| winrm | apache.winrm |
-
-For example instead of `pip install apache-airflow[atlas]` you should use
-`pip install apache-airflow[apache.atlas]` .
+| Deprecated extras | New extras |
+|---------------------------|------------------|
+| atlas | apache.atlas |
+| aws | amazon |
+| azure | microsoft.azure |
+| azure_blob_storage | microsoft.azure |
+| azure_data_lake | microsoft.azure |
+| azure_cosmos | microsoft.azure |
+| azure_container_instances | microsoft.azure |
+| cassandra | apache.cassandra |
+| druid | apache.druid |
+| gcp | google |
+| gcp_api | google |
+| hdfs | apache.hdfs |
+| hive | apache.hive |
+| kubernetes | cncf.kubernetes |
+| mssql | microsoft.mssql |
+| pinot | apache.pinot |
+| webhdfs | apache.webhdfs |
+| winrm | apache.winrm |
+
+For example:
+
+If you want to install integration for Apache Atlas, then instead of `pip install apache-airflow[atlas]`
+you should use `pip install apache-airflow[apache.atlas]`.
+
+If you want to install integration for Microsoft Azure, then instead of
+```
+pip install 'apache-airflow[azure_blob_storage,azure_data_lake,azure_cosmos,azure_container_instances]'
+```
+you should execute `pip install 'apache-airflow[microsoft.azure]'`.
+
+If you want to install integration for Amazon Web Services, then instead of
+`pip install 'apache-airflow[s3,emr]'`, you should execute `pip install 'apache-airflow[amazon]'`.
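+
+If you want to install integration for Google Cloud Platform, then instead of
+`pip install 'apache-airflow[gcp_api]'`, you should execute `pip install 'apache-airflow[google]'`.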
The deprecated extras will be removed in 2.1:
@@ -1581,25 +1596,6 @@ plugins =
airflow.mypy.plugin.decorators
```
-#### Renamed "extra" requirements for cloud providers
-
-Subpackages for specific services have been combined into one variant for
-each cloud provider. The name of the subpackage for the Google Cloud Platform
-has changed to follow style.
-
-If you want to install integration for Microsoft Azure, then instead of
-```
-pip install 'apache-airflow[azure_blob_storage,azure_data_lake,azure_cosmos,azure_container_instances]'
-```
-you should execute `pip install 'apache-airflow[azure]'`
-
-If you want to install integration for Amazon Web Services, then instead of
-`pip install 'apache-airflow[s3,emr]'`, you should execute `pip install 'apache-airflow[aws]'`
-
-If you want to install integration for Google Cloud Platform, then instead of
-`pip install 'apache-airflow[gcp_api]'`, you should execute `pip install 'apache-airflow[gcp]'`.
-The old way will work until the release of Airflow 2.1.
-
#### Simplify the response payload of endpoints /dag_stats and /task_stats
The response of endpoints `/dag_stats` and `/task_stats` helps the UI fetch brief statistics about DAGs and Tasks. The format was like