Posted to commits@airflow.apache.org by ka...@apache.org on 2021/02/08 20:15:51 UTC

[airflow] 01/01: Fix broken ``airflow upgrade_check`` command

This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch fix-broken-version-check
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 5e35707de3fdd2d55797852a56767e36a1de9ed3
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Mon Feb 8 20:08:39 2021 +0000

    Fix broken ``airflow upgrade_check`` command
    
    https://github.com/apache/airflow/pull/13392 broke the ability
    to run ``airflow upgrade_check`` command.
    
    When ``airflow upgrade_check`` is run, the Airflow CLI calls the
    ``airflow.upgrade.checker.register_arguments`` function.
    
    This function registers all arguments and sets the default function to run.
    
    Since we don't want to handle this logic in core ``airflow``, we use a
    workaround: everything is executed through the ``run`` function, so the
    argument-checking logic should live there.
    
    https://github.com/apache/airflow/pull/13392 removed ``subparser.set_defaults(func=run)``.
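The removed ``set_defaults(func=run)`` call matters because of how argparse dispatches subcommands: the parser stores the handler as a default attribute, and the CLI entry point invokes whatever ``args.func`` resolves to. A minimal sketch of that pattern (not Airflow's actual CLI code; the ``--list`` flag mirrors the real one, the rest is illustrative):

```python
# Sketch of the argparse dispatch pattern described above: a subcommand
# registers its arguments and its default handler via set_defaults().
import argparse

def run(args):
    # Hypothetical handler: dispatch on the registered flag.
    if args.list:
        return "listing checks"
    return "running checks"

def register_arguments(subparser):
    subparser.add_argument("--list", action="store_true",
                           help="List the upgrade checks")
    # Without this line, args.func falls back to whatever the parent
    # parser set as a default -- which is exactly the bug being fixed.
    subparser.set_defaults(func=run)

parser = argparse.ArgumentParser(prog="airflow")
subparsers = parser.add_subparsers()
register_arguments(subparsers.add_parser("upgrade_check"))

args = parser.parse_args(["upgrade_check", "--list"])
print(args.func(args))  # prints "listing checks"
```

With ``set_defaults`` in place, parsing ``upgrade_check`` binds ``args.func`` to ``run``, so the entry point calls the checker instead of the fallback that prints the install message.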
    
    Because of this, the command currently behaves as follows:
    
    **Before**
    
    ❯ airflow upgrade_check --list
    
    Please install apache-airflow-upgrade-check distribution from PyPI to perform upgrade checks
    
    Because the default function is not set in ``checker.py``, the one set in
    https://github.com/apache/airflow/blob/v1-10-stable/airflow/bin/cli.py#L4248
    is used instead, and hence it just prints `Please install
    apache-airflow-upgrade-check distribution from PyPI to perform upgrade checks`.
    
    **After the fix in this PR**:
    
    ❯ airflow upgrade_check --list
    
    Upgrade Checks:
    - VersionCheckRule: Check for latest versions of apache-airflow and checker
    - AirflowMacroPluginRemovedRule: Remove airflow.AirflowMacroPlugin class
    - BaseOperatorMetaclassRule: Ensure users are not using custom metaclasses in custom operators
    - ChainBetweenDAGAndOperatorNotAllowedRule: Chain between DAG and operator not allowed.
    - ConnTypeIsNotNullableRule: Connection.conn_type is not nullable
    - CustomExecutorsRequireFullPathRule: Custom Executors now require full path
    - DatabaseVersionCheckRule: Check versions of PostgreSQL, MySQL, and SQLite to ease upgrade to Airflow 2.0
    - DbApiRule: Hooks that run DB functions must inherit from DBApiHook
    - FernetEnabledRule: Fernet is enabled by default
    - GCPServiceAccountKeyRule: GCP service account key deprecation
    - HostnameCallable: Unify hostname_callable option in core section
    - ImportChangesRule: Changes in import paths of hooks, operators, sensors and others
    - LegacyUIDeprecated: Legacy UI is deprecated by default
    - LoggingConfigurationRule: Logging configuration has been moved to new section
    - MesosExecutorRemovedRule: Removal of Mesos Executor
    - NoAdditionalArgsInOperatorsRule: No additional argument allowed in BaseOperator.
    - PodTemplateFileRule: Users must set a kubernetes.pod_template_file value
    - SendGridEmailerMovedRule: SendGrid email uses old airflow.contrib module
    - SparkJDBCOperatorConnIdRule: Check Spark JDBC Operator default connection name
    - TaskHandlersMovedRule: Changes in import path of remote task handlers
    - UniqueConnIdRule: Connection.conn_id is not unique
---
 airflow/upgrade/checker.py | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/airflow/upgrade/checker.py b/airflow/upgrade/checker.py
index 974417f..889b0f4 100644
--- a/airflow/upgrade/checker.py
+++ b/airflow/upgrade/checker.py
@@ -69,12 +69,17 @@ def register_arguments(subparser):
         help="List the upgrade checks and their class names",
         action="store_true",
     )
+    subparser.set_defaults(func=run)
 
 
 def run(args):
     from airflow.upgrade.formatters import ConsoleFormatter, JSONFormatter
     from airflow.upgrade.config import UpgradeConfig
 
+    if args.list:
+        list_checks()
+        return
+
     if args.save:
         filename = args.save
         if not filename.lower().endswith(".json"):