Posted to commits@airflow.apache.org by "Rohit S S (Jira)" <ji...@apache.org> on 2020/03/06 18:40:00 UTC

[jira] [Comment Edited] (AIRFLOW-6960) Airflow Celery worker : command returned non-zero exit status 2

    [ https://issues.apache.org/jira/browse/AIRFLOW-6960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17053690#comment-17053690 ] 

Rohit S S edited comment on AIRFLOW-6960 at 3/6/20, 6:39 PM:
-------------------------------------------------------------

* Could you please provide detailed steps to reproduce the bug?
 * It doesn't look like a bug, because everything works fine when running under ./breeze.
 * Detailed reproduction steps would definitely help me track down the issue.
 * Also, what do you mean by running the worker separately? (Please be a little more specific; a sketch of what I am assuming follows below.)
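For reference, this is a minimal sketch of what I am assuming "separately" means here — every command and version below is a guess about your setup, not a verified reproduction:

{code:bash}
# Assumed setup: scheduler/webserver running from the apache/airflow master-ci
# image, while the worker was pip-installed directly on a separate host:
pip install 'apache-airflow[celery]==1.10.9'   # assumed worker install
airflow worker                                 # 1.10.x command that starts a Celery worker node
{code}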



> Airflow Celery worker : command returned non-zero exit status 2
> ---------------------------------------------------------------
>
>                 Key: AIRFLOW-6960
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-6960
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: celery
>    Affects Versions: 2.0.0, 1.10.9
>            Reporter: Uragalage Thilanka Mahesh Perera
>            Assignee: Rohit S S
>            Priority: Blocker
>
> I am getting the error below and have been trying to fix it for hours without any luck. The following logs are from the Airflow Celery worker.
> {code:java}
> airflow command error: argument subcommand: invalid choice: 'tasks' (choose from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 'next_execution', 'rotate_fernet_key'), see help above.
> 
> usage: airflow [-h]
>                {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>                ...
> 
> positional arguments:
>   {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
>                         sub-command help
>     backfill            Run subsections of a DAG for a specified date range. If reset_dag_run option is used, backfill will first prompt users whether airflow should clear all the previous dag_run and task_instances within the backfill date range. If rerun_failed_tasks is used, backfill will auto re-run the previous failed task instances within the backfill date range.
>     list_dag_runs       List dag runs given a DAG id. If state option is given, it will onlysearch for all the dagruns with the given state. If no_backfill option is given, it will filter outall backfill dagruns for given dag id.
>     list_tasks          List the tasks within a DAG
>     clear               Clear a set of task instance, as if they never ran
>     pause               Pause a DAG
>     unpause             Resume a paused DAG
>     trigger_dag         Trigger a DAG run
>     delete_dag          Delete all DB records related to the specified DAG
>     show_dag            Displays DAG's tasks with their dependencies
>     pool                CRUD operations on pools
>     variables           CRUD operations on variables
>     kerberos            Start a kerberos ticket renewer
>     render              Render a task instance's template(s)
>     run                 Run a single task instance
>     initdb              Initialize the metadata database
>     list_dags           List all the DAGs
>     dag_state           Get the status of a dag run
>     task_failed_deps    Returns the unmet dependencies for a task instance from the perspective of the scheduler. In other words, why a task instance doesn't get scheduled and then queued by the scheduler, and then run by an executor).
>     task_state          Get the status of a task instance
>     serve_logs          Serve logs generate by worker
>     test                Test a task instance. This will run a task without checking for dependencies or recording its state in the database.
>     webserver           Start a Airflow webserver instance
>     resetdb             Burn down and rebuild the metadata database
>     upgradedb           Upgrade the metadata database to latest version
>     checkdb             Check if the database can be reached.
>     shell               Runs a shell to access the database
>     scheduler           Start a scheduler instance
>     worker              Start a Celery worker node
>     flower              Start a Celery Flower
>     version             Show the version
>     connections         List/Add/Delete connections
>     create_user         Create an account for the Web UI (FAB-based)
>     delete_user         Delete an account for the Web UI
>     list_users          List accounts for the Web UI
>     sync_perm           Update permissions for existing roles and DAGs.
>     next_execution      Get the next execution datetime of a DAG.
>     rotate_fernet_key   Rotate all encrypted connection credentials and variables; see https://airflow.readthedocs.io/en/stable/howto/secure-connections.html#rotating-encryption-keys.
> 
> optional arguments:
>   -h, --help            show this help message and exit
> 
> airflow command error: argument subcommand: invalid choice: 'tasks' (choose from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 'next_execution', 'rotate_fernet_key'), see help above.
> 
> [2020-03-01 00:11:41,941: ERROR/ForkPoolWorker-8] execute_command encountered a CalledProcessError
> Traceback (most recent call last):
>   File "/opt/rh/rh-python36/root/usr/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 69, in execute_command
>     close_fds=True, env=env)
>   File "/opt/rh/rh-python36/root/usr/lib64/python3.6/subprocess.py", line 311, in check_call
>     raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command '['airflow', 'tasks', 'run', 'airflow_worker_check_pipeline', 'dev_couchbase_backup', '2020-02-29T14:47:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/airflow/dags/project1/airflow_worker_check_pipeline.py']' returned non-zero exit status 2.
> [2020-03-01 00:11:41,941: ERROR/ForkPoolWorker-8] None
> [2020-03-01 00:11:41,996: ERROR/ForkPoolWorker-8] Task airflow.executors.celery_executor.execute_command[0e0c3d02-bdb3-4d16-a863-cbb3bb7a7137] raised unexpected: AirflowException('Celery command failed',)
> Traceback (most recent call last):
>   File "/opt/rh/rh-python36/root/usr/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 69, in execute_command
>     close_fds=True, env=env)
>   File "/opt/rh/rh-python36/root/usr/lib64/python3.6/subprocess.py", line 311, in check_call
>     raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command '['airflow', 'tasks', 'run', 'airflow_worker_check_pipeline', 'dev_couchbase_backup', '2020-02-29T14:47:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/airflow/dags/project1/airflow_worker_check_pipeline.py']' returned non-zero exit status 2.
> 
> During handling of the above exception, another exception occurred:
> {code}
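> The failed command uses the Airflow 2.0 grouped CLI, while 'tasks' is missing from the 1.10.9 worker's list of valid subcommands — both are visible in the log above. As a minimal comparison (DAG/task names taken from the failed command):
> {code:bash}
> # What the v2.0.0.dev0 scheduler dispatches (rejected by the 1.10.9 worker):
> airflow tasks run airflow_worker_check_pipeline dev_couchbase_backup 2020-02-29T14:47:00+00:00 --local
> # The flat 1.10.x form the worker's CLI would accept:
> airflow run airflow_worker_check_pipeline dev_couchbase_backup 2020-02-29T14:47:00+00:00 --local
> {code}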
> *_Airflow Scheduler & Master versions: v2.0.0.dev0, Docker platform (image: apache/airflow master-ci)_*
> *_Airflow Worker version: v1.10.9 (manual install, non-Docker platform)_*
> I suspect that this could be due to a version mismatch, so I tried to update the Airflow worker version, but unfortunately I could not find a matching release:
> {code:java}
> ERROR: Could not find a version that satisfies the requirement apache-airflow[celery]=={v2.0.0} (from versions: 1.10.9-bin, 1.8.1, 1.8.2rc1, 1.8.2, 1.9.0, 1.10.0, 1.10.1b1, 1.10.1rc2, 1.10.1, 1.10.2b2, 1.10.2rc1, 1.10.2rc2, 1.10.2rc3, 1.10.2, 1.10.3b1, 1.10.3b2, 1.10.3rc1, 1.10.3rc2, 1.10.3, 1.10.4b2, 1.10.4rc1, 1.10.4rc2, 1.10.4rc3, 1.10.4rc4, 1.10.4rc5, 1.10.4, 1.10.5rc1, 1.10.5, 1.10.6rc1, 1.10.6rc2, 1.10.6, 1.10.7rc1, 1.10.7rc2, 1.10.7rc3, 1.10.7, 1.10.8rc1, 1.10.8, 1.10.9rc1, 1.10.9)
> ERROR: No matching distribution found for apache-airflow[celery]=={v2.0.0}
> {code}
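> The resolver output above lists nothing on PyPI newer than 1.10.9, so pinning the worker to a 2.0 pre-release cannot succeed. As a hedged sketch, the only pip-only way to align versions would be to pin the worker to the same released version as the rest of the deployment:
> {code:bash}
> # Pin the worker to a version that actually exists on PyPI
> # (see the "from versions:" list in the error above):
> pip install 'apache-airflow[celery]==1.10.9'
> {code}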



--
This message was sent by Atlassian Jira
(v8.3.4#803005)