You are viewing a plain text version of this content. The canonical link for it is here.
Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/12/07 00:44:44 UTC

[GitHub] [airflow] vikramkoka opened a new pull request #12872: Added documentation for Aiflow Upgrade Check

vikramkoka opened a new pull request #12872:
URL: https://github.com/apache/airflow/pull/12872


   Added a documentation file for the Airflow Upgrade Check scripts. This is still
   fairly early, but wanted a starting point to detail how to get the python
   package, run the script, understand and apply the results.
   
   <!--
   Thank you for contributing! Please make sure that your code changes
   are covered with tests. And in case of new features or big changes
   remember to adjust the documentation.
   
   Feel free to ping committers for the review!
   
   In case of existing issue, reference it using one of the following:
   
   closes: #ISSUE
   related: #ISSUE
   
   How to write a good git commit message:
   http://chris.beams.io/posts/git-commit/
   -->
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #12872: Added documentation for Aiflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r537325604



##########
File path: docs/apache-airflow/upgrade-check.rst
##########
@@ -0,0 +1,145 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one

Review comment:
       I think this should be moved to a separate documentation package so that this documentation can be updated independently of Airflow. Simply put, you need to create a new directory in / docs / and then update the /docs/conf.py and /docs/build_docs.py files so that the package can be built.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] vikramkoka commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
vikramkoka commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538000496



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,161 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed. This is because as of
+Airflow 2.0, the fernet_key cannot be left empty, but needs to have a defined value. Examining the
+problematic airflow.cfg and searching for the fernet_key entries would show the following:
+
+.. code-block:: bash
+    fernet_key =
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different
+in Airflow 2.0. Examining the DAG file would probably show the following:
+
+.. code-block:: bash
+    from airflow.operators.python_operator import PythonOperator
+
+We will discuss how to fix these and make them compatible with Airflow 2.0 in the next
+section.
+
+
+Applying the Recommendations
+''''''''''''''''''''''''''''
+
+In most cases, the Recommendations result section of the Upgrade check contains
+enough information to make the change.
+
+For the first problem identified above with respect to the fernet_key, the solution is
+to enter a valid value in the Airflow Configuration file airflow.cfg for the fernet_key.
+
+For the second problem, looking at the source of the DAG file and changing the import
+statement for the Python Operator to be as follows will make this DAG work in Airflow 1.10.14
+as well as make it compatible for Airflow 2.0.
+
+.. code-block:: python
+    from airflow.operators import PythonOperator

Review comment:
       Actually, this does seem to work in 1.10.14rc1




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] kaxil commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
kaxil commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r537945229



##########
File path: docs/apache-airflow/upgrade-check.rst
##########
@@ -0,0 +1,145 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed.
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different
+in Airflow 2.0. We will discuss how in the next section.

Review comment:
       ```suggestion
   Understanding what is being checked
   '''''''''''''''''''''''''''''''''''
   
   The Upgrade Check checks the configuration data from airflow.cfg, the metadata from the Airflow
   database, as well as the DAGs which have been set up in the current Airflow environment.
   
   Using the above results as an example, there are two specific problems which have
   been identified.
   
   The first problem is identified in the configuration file airflow.cfg where the current configuration
   option for the fernet_key is no longer acceptable and needs to be changed.
   
   The second problem was identified in one of the DAGs. In this case, this import
   statement for the PythonOperator needs to be changed, since the location is different
   in Airflow 2.0. We will discuss how in the next section.
   ```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538215678



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,163 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:

Review comment:
       ```suggestion
       1.  Using ``airflow.operators.python_operator.PythonOperator`` will be replaced by ``airflow.operators.python.PythonOperator``. Affected file:
   ```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #12872: Added documentation for Aiflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r537644446



##########
File path: docs/apache-airflow/upgrade-check.rst
##########
@@ -0,0 +1,145 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one

Review comment:
       Documentation for airflow upgrade-check, we can publish from the master page, if we do not generate any pages automatically.
   
   Docs about cross-reference syntax: https://github.com/apache/airflow/tree/master/docs#cross-referencing-syntax
   
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] vikramkoka commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
vikramkoka commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538002937



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,161 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed. This is because as of
+Airflow 2.0, the fernet_key cannot be left empty, but needs to have a defined value. Examining the
+problematic airflow.cfg and searching for the fernet_key entries would show the following:
+
+.. code-block:: bash
+    fernet_key =
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different
+in Airflow 2.0. Examining the DAG file would probably show the following:
+
+.. code-block:: bash
+    from airflow.operators.python_operator import PythonOperator
+
+We will discuss how to fix these and make them compatible with Airflow 2.0 in the next
+section.
+
+
+Applying the Recommendations
+''''''''''''''''''''''''''''
+
+In most cases, the Recommendations result section of the Upgrade check contains
+enough information to make the change.
+
+For the first problem identified above with respect to the fernet_key, the solution is
+to enter a valid value in the Airflow Configuration file airflow.cfg for the fernet_key.
+
+For the second problem, looking at the source of the DAG file and changing the import
+statement for the Python Operator to be as follows will make this DAG work in Airflow 1.10.14
+as well as make it compatible for Airflow 2.0.
+
+.. code-block:: python
+    from airflow.operators import PythonOperator

Review comment:
       I will retest with 1.10.14rc4 and 2.0beta3 in just a bit to make sure. 




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] github-actions[bot] commented on pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#issuecomment-740376126


   The PR is likely ready to be merged. No tests are needed as no important environment files, nor python files were modified by it. However, committers might decide that full test matrix is needed and add the 'full tests needed' label. Then you should rebase it to the latest master or amend the last commit of the PR, and push it with --force-with-lease.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] ashb commented on a change in pull request #12872: Added documentation for Aiflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
ashb commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r537641780



##########
File path: docs/apache-airflow/upgrade-check.rst
##########
@@ -0,0 +1,145 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one

Review comment:
       If we do this how can we link to it this from a (yet-to-be-created) `updating-to-2-0.rst` in `docs/apache-airflow/`?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] kaxil commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
kaxil commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r537993892



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,161 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed. This is because as of
+Airflow 2.0, the fernet_key cannot be left empty, but needs to have a defined value. Examining the
+problematic airflow.cfg and searching for the fernet_key entries would show the following:
+
+.. code-block:: bash
+    fernet_key =
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different
+in Airflow 2.0. Examining the DAG file would probably show the following:
+
+.. code-block:: bash
+    from airflow.operators.python_operator import PythonOperator
+
+We will discuss how to fix these and make them compatible with Airflow 2.0 in the next
+section.
+
+
+Applying the Recommendations
+''''''''''''''''''''''''''''
+
+In most cases, the Recommendations result section of the Upgrade check contains
+enough information to make the change.
+
+For the first problem identified above with respect to the fernet_key, the solution is
+to enter a valid value in the Airflow Configuration file airflow.cfg for the fernet_key.
+
+For the second problem, looking at the source of the DAG file and changing the import
+statement for the Python Operator to be as follows will make this DAG work in Airflow 1.10.14
+as well as make it compatible for Airflow 2.0.
+
+.. code-block:: python
+    from airflow.operators import PythonOperator

Review comment:
       This does not work with 1.10.14 does it @vikramkoka ?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] vikramkoka commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
vikramkoka commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538045722



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,161 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed. This is because as of
+Airflow 2.0, the fernet_key cannot be left empty, but needs to have a defined value. Examining the
+problematic airflow.cfg and searching for the fernet_key entries would show the following:
+
+.. code-block:: bash
+    fernet_key =
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different
+in Airflow 2.0. Examining the DAG file would probably show the following:
+
+.. code-block:: bash
+    from airflow.operators.python_operator import PythonOperator
+
+We will discuss how to fix these and make them compatible with Airflow 2.0 in the next
+section.
+
+
+Applying the Recommendations
+''''''''''''''''''''''''''''
+
+In most cases, the Recommendations result section of the Upgrade check contains
+enough information to make the change.
+
+For the first problem identified above with respect to the fernet_key, the solution is
+to enter a valid value in the Airflow Configuration file airflow.cfg for the fernet_key.
+
+For the second problem, looking at the source of the DAG file and changing the import
+statement for the Python Operator to be as follows will make this DAG work in Airflow 1.10.14
+as well as make it compatible for Airflow 2.0.
+
+.. code-block:: python
+    from airflow.operators import PythonOperator

Review comment:
       Drats! This does work in 1.10.14rc4, but it does NOT work in 2.0.0b3. 
   I will fix it in the doc, but not right now, I don't know of a way where it works in both. 




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] kaxil merged pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
kaxil merged pull request #12872:
URL: https://github.com/apache/airflow/pull/12872


   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] ashb commented on a change in pull request #12872: Added documentation for Aiflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
ashb commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r537642468



##########
File path: docs/apache-airflow/upgrade-check.rst
##########
@@ -0,0 +1,145 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one

Review comment:
       Oh -- but also, maybe slight complication: the upgrade/check code doesn't exist on master, just on v1-10-* branches.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538215887



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,163 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed. This is because as of
+Airflow 2.0, the fernet_key cannot be left empty, but needs to have a defined value. Examining the
+problematic airflow.cfg and searching for the fernet_key entries would show the following:
+
+.. code-block:: bash
+
+    fernet_key =
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different

Review comment:
       ```suggestion
   statement for the ``PythonOperator`` needs to be changed, since the location is different
   ```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538159816



##########
File path: docs/upgprade-check/upgrade-check.rst
##########
@@ -0,0 +1,161 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+Upgrade Check Script
+--------------------
+
+.. contents:: :local:
+
+Getting the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+Apache Airflow is published as ``apache-airflow`` package in PyPI. The Upgrade Check Script is part of a
+separate Python Package, since it is separate from the core Apache Airflow package and is only needed for
+a period of time and specifically only for upgrading from Airflow 1.10 releases to Airflow 2.0.
+
+While there has been a lot of work put into making this upgrade as easy as possible, there are some
+changes which are compatible between Airflow 1.10 and Airflow 2.0. In order to make this as simple to
+navigate, we recommend that people first upgrade to the latest release in the 1.10 series (at the
+time of writing: 1.10.14) and then to download this package and run the script as detailed below.
+
+
+.. note::
+
+   On November 2020, new version of PIP (20.3) has been released with a new, 2020 resolver. This resolver
+   does not yet work with Apache Airflow and might leads to errors in installation - depends on your choice
+   of extras. In order to install Airflow you need to either downgrade pip to version 20.2.4
+   ``pip upgrade --pip==20.2.4`` or, in case you use Pip 20.3, you need to add option
+   ``--use-deprecated legacy-resolver`` to your pip install command.
+
+.. code-block:: bash
+
+    pip install apache-airflow-upgrade-check
+
+This will install the latest version of the Airflow Upgrade check package.
+
+
+Running the Airflow Upgrade Check Package
+'''''''''''''''''''''''''''''''''''''''''
+
+.. code-block:: bash
+
+    airflow upgrade_check
+
+This will print out to the screen a number of actions that should be taken before upgrading the Airflow
+release to 2.0.0 or higher.
+
+The exit code of the command will be 0 (success) if no problems are found in running the command, or 1 if
+problems were encountered in running the check.
+
+A sample output as a result of a successful run of the upgrade check is shown below.
+
+.. code-block:: bash
+
+    ============================================== STATUS ============================================
+
+    Check for latest versions of apache-airflow and checker..................................SUCCESS
+    Remove airflow.AirflowMacroPlugin class..................................................SUCCESS
+    Chain between DAG and operator not allowed...............................................SUCCESS
+    Connection.conn_id is not unique.........................................................SUCCESS
+    Connection.conn_type is not nullable.....................................................SUCCESS
+    Fernet is enabled by default.............................................................FAIL
+    GCP service account key deprecation......................................................SUCCESS
+    Changes in import paths of hooks, operators, sensors and others..........................FAIL
+    Users must delete deprecated configs for KubernetesExecutor..............................FAIL
+    Legacy UI is deprecated by default.......................................................SUCCESS
+    Logging configuration has been moved to new section......................................FAIL
+    Removal of Mesos Executor................................................................SUCCESS
+    Users must set a kubernetes.pod_template_file value......................................FAIL
+    SendGrid email uses old airflow.contrib module...........................................SUCCESS
+    Changes in import path of remote task handlers...........................................SUCCESS
+    Jinja Template Variables cannot be undefined.............................................FAIL
+    Found 7 problems.
+
+    ========================================== RECOMMENDATIONS ========================================
+
+    Fernet is enabled by default
+    ----------------------------
+    The fernet mechanism is enabled by default to increase the security of the default installation.
+
+    Problems:
+
+    1.  fernet_key in airflow.cfg must be explicitly set empty as fernet mechanism is enabledby default. This means that the apache-airflow[crypto] extra-packages are always installed.However, this requires that your operating system has libffi-dev installed.
+
+    Changes in import paths of hooks, operators, sensors and others
+    ---------------------------------------------------------------
+    Many hooks, operators and other classes has been renamed and moved. Those changes were part of unifying names and imports paths as described in AIP-21.
+    The contrib folder has been replaced by providers directory and packages:
+    https://github.com/apache/airflow#backport-packages
+
+    Problems:
+
+    1.  Using airflow.operators.python_operator.PythonOperator will be replaced by airflow.operators.python.PythonOperator. Affected file:
+
+
+The following sections describe what is being done and how to apply the recommendations shown above.
+Please note that the above results shown are only a partial set, where only the first
+two of the seven problems identified are shown in the section above. In reality,
+all the problems are shown on the screen.
+
+
+Understanding what is being checked
+'''''''''''''''''''''''''''''''''''
+
+The Upgrade Check checks the configuration data from airflow.cfg, the meta data from the Airflow
+database, as well as the DAGs which have been set up in the current Airflow environment.
+
+Using the above results as an example, there are two specific problems which have
+been identified.
+
+The first problem is identified in the configuration file airflow.cfg where the current configuration
+option for the fernet_key is no longer acceptable and needs to be changed. This is because as of
+Airflow 2.0, the fernet_key cannot be left empty, but needs to have a defined value. Examining the
+problematic airflow.cfg and searching for the fernet_key entries would show the following:
+
+.. code-block:: bash
+    fernet_key =
+
+The second problem was identified in one of the DAGs. In this case, this import
+statement for the PythonOperator needs to be changed, since the location is different
+in Airflow 2.0. Examining the DAG file would probably show the following:
+
+.. code-block:: bash
+    from airflow.operators.python_operator import PythonOperator
+
+We will discuss how to fix these and make them compatible with Airflow 2.0 in the next
+section.
+
+
+Applying the Recommendations
+''''''''''''''''''''''''''''
+
+In most cases, the Recommendations result section of the Upgrade check contains
+enough information to make the change.
+
+For the first problem identified above with respect to the fernet_key, the solution is
+to enter a valid value in the Airflow Configuration file airflow.cfg for the fernet_key.
+
+For the second problem, looking at the source of the DAG file and changing the import
+statement for the Python Operator to be as follows will make this DAG work in Airflow 1.10.14
+as well as make it compatible for Airflow 2.0.
+
+.. code-block:: python
+    from airflow.operators import PythonOperator

Review comment:
       See: https://github.com/apache/airflow/commit/0e5eee83b14b2a57345370b14e91404d518f0bf4#diff-4169e001fd63ad284011b12a578f74ae09ebeda1c5318256c0d7a97633c738c8




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] mik-laj commented on a change in pull request #12872: Added documentation for Airflow Upgrade Check

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #12872:
URL: https://github.com/apache/airflow/pull/12872#discussion_r538216112



##########
File path: docs/spelling_wordlist.txt
##########
@@ -1094,6 +1094,8 @@ pypa
 pypi
 pytest
 pythonpath
+python_operator
+PythonOperator

Review comment:
       ```suggestion
   ```
   See: https://github.com/apache/airflow/pull/12872/files#r538215678




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org