Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/09/01 09:45:56 UTC

[GitHub] [airflow] david30907d commented on issue #15978: DAG getting stuck in "running" state indefinitely

david30907d commented on issue #15978:
URL: https://github.com/apache/airflow/issues/15978#issuecomment-909184633


   > Just updated to 2.1.2, still facing the issue. My dags are getting stuck into running state. When I checked scheduler logs I have this kind of line : `Not executing <TaskInstance: xxxxxx 2021-08-09 14:30:00+00:00 [scheduled]> since the number of tasks running or queued from DAG xxxxxx is >= to the DAG's task concurrency limit of 16`
   > It seems the Celery worker doesn't pick up any tasks, as the log file is not updated while this is happening
   
   @hafid-d I bumped into the same issue a few days ago.
   
   It has been working well for me since I replaced `PythonOperator` with `SubDagOperator`. Are you using `SubDagOperator` as well?
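   For what it's worth, the quoted log line describes a simple gating rule: a scheduled task instance is held back while the number of running or queued tasks in its DAG has reached the DAG's task concurrency limit (16 by default, from `[core] dag_concurrency`). A minimal sketch of that rule (plain Python for illustration, not actual Airflow scheduler code):

   ```python
   # Sketch of the per-DAG concurrency check behind the message
   # "Not executing <TaskInstance: ...> since the number of tasks running
   # or queued from DAG ... is >= to the DAG's task concurrency limit of 16".
   def can_schedule(running: int, queued: int, concurrency_limit: int = 16) -> bool:
       """Return True if another task instance in this DAG may start."""
       return (running + queued) < concurrency_limit

   # 15 occupied slots: one more task may start.
   print(can_schedule(running=10, queued=5))   # True
   # 16 occupied slots: further tasks stay in "scheduled" state.
   print(can_schedule(running=12, queued=4))   # False
   ```

   So if tasks get stuck in queued (e.g. a worker stops picking them up), the slots are never released and the whole DAG appears frozen at the limit.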


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org