Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/06/04 04:53:18 UTC

[GitHub] [airflow] timgriffiths commented on issue #13542: Task stuck in "scheduled" or "queued" state, pool has all slots queued, nothing is executing

timgriffiths commented on issue #13542:
URL: https://github.com/apache/airflow/issues/13542#issuecomment-854356862


   Environment:
   
   Airflow 2.1.0
   Docker : apache/airflow:2.1.0-python3.8
   Executor : Celery
   
   So I have also hit this today. The way I was able to replicate it:
   
   -> have a .py file that generates more than one DAG
   -> then have a user break one of the DAGs generated by that file
   -> all the other, still-valid DAGs will be triggered by the scheduler but fail on the worker, and it looks like the exception is never passed back to the scheduler, so the tasks permanently stay in a "queued" state (a minimal sketch of the pattern is below)
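   
   To illustrate, here is a minimal, hypothetical sketch of that pattern (placeholder dag_ids and operator, not the actual mm-dynamic-dags.py): one invalid dag_id aborts the whole module import, so even the valid DAGs in the file never make it into the worker's DagBag:
   
   ```python
   # Hypothetical sketch of the dynamic-DAG pattern (placeholder names),
   # not the actual mm-dynamic-dags.py.
   from datetime import datetime
   
   from airflow import DAG
   from airflow.operators.dummy import DummyOperator
   
   # "test-test+abc" fails validate_key(): dag_ids may only contain
   # alphanumeric characters, dashes, dots and underscores.
   dag_ids = ["myworkingjob", "test-test+abc"]
   
   def create_dag(dag_id):
       dag = DAG(
           dag_id=dag_id,  # raises AirflowException for the invalid id
           start_date=datetime(2021, 6, 1),
           schedule_interval=None,
       )
       DummyOperator(task_id="taskname-abc", dag=dag)
       return dag
   
   for dag_id in dag_ids:
       # The exception aborts the import mid-file, so even "myworkingjob"
       # never gets registered on the worker, which then fails with
       # "dag_id could not be found".
       globals()[dag_id] = create_dag(dag_id)
   ```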
   
   ```
   [2021-06-04 01:11:22,927: INFO/ForkPoolWorker-40] Executing command in Celery: ['airflow', 'tasks', 'run', 'myworkingjob', 'taskname-abc', '2021-06-04T01:11:21.775097+00:00', '--local', '--pool', 'pool', '--subdir', '/opt/airflow/dags/mm-dynamic-dags.py']
   [2021-06-04 01:11:22,940: INFO/ForkPoolWorker-34] Filling up the DagBag from /opt/airflow/dags/mm-dynamic-dags.py
   [2021-06-04 01:11:27,993: ERROR/ForkPoolWorker-39] Failed to import: /opt/airflow/dags/mm-dynamic-dags.py
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 317, in _load_modules_from_file
       loader.exec_module(new_module)
     File "<frozen importlib._bootstrap_external>", line 848, in exec_module
     File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
     File "/opt/airflow/dags/mm-dynamic-dags.py", line 187, in <module>
       create_dag("{}/{}".format(global_dags_base_url, dag_file))
     File "/opt/airflow/dags/mm-dynamic-dags.py", line 123, in create_dag
       dag = DAG(
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dag.py", line 277, in __init__
       validate_key(dag_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/helpers.py", line 44, in validate_key
       raise AirflowException(
   airflow.exceptions.AirflowException: The key (test-test+abc) has to be made of alphanumeric characters, dashes, dots and underscores exclusively
   [2021-06-04 01:11:27,996: ERROR/ForkPoolWorker-39] Failed to execute task dag_id could not be found: myworkingjob. Either the dag did not exist or it failed to parse..
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 116, in _execute_in_fork
       args.func(args)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 219, in task_run
       dag = get_dag(args.subdir, args.dag_id)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 191, in get_dag
       raise AirflowException(
   airflow.exceptions.AirflowException: dag_id could not be found: myworkingjob. Either the dag did not exist or it failed to parse.
   [2021-06-04 01:11:28,017: ERROR/ForkPoolWorker-39] Task airflow.executors.celery_executor.execute_command[66453177-4042-4c6a-882b-bd0fcdbda0d8] raised unexpected: AirflowException('Celery command failed on host: airflow-mm-worker-5464db666b-f8mlm')
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/trace.py", line 412, in trace_task
       R = retval = fun(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/trace.py", line 704, in __protected_call__
       return self.run(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 87, in execute_command
       _execute_in_fork(command_to_exec)
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 98, in _execute_in_fork
       raise AirflowException('Celery command failed on host: ' + get_hostname())
   airflow.exceptions.AirflowException: Celery command failed on host: airflow-mm-worker-5464db666b-f8mlm
   [2021-06-04 01:11:28,091: ERROR/ForkPoolWorker-37] Failed to import: /opt/airflow/dags/mm-dynamic-dags.py
   ```
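   
   As a side note, you can surface the same worker-side import failure without going through Celery by filling a DagBag from the file directly; a minimal sketch (the path is the one from the logs above):
   
   ```python
   # Sketch: reproduce the worker's "Failed to import" outside Celery.
   from airflow.models import DagBag
   
   bag = DagBag(dag_folder="/opt/airflow/dags/mm-dynamic-dags.py")
   
   # import_errors maps file path -> traceback text; it is non-empty here
   # because the invalid dag_id aborts the whole module import.
   for path, err in bag.import_errors.items():
       print(path, err, sep="\n")
   ```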
   
   

