Posted to commits@airflow.apache.org by "Andrew Loughran (JIRA)" <ji...@apache.org> on 2017/10/20 17:18:00 UTC

[jira] [Created] (AIRFLOW-1742) Cannot work out why queued tasks aren't being processed.

Andrew Loughran created AIRFLOW-1742:
----------------------------------------

             Summary: Cannot work out why queued tasks aren't being processed.
                 Key: AIRFLOW-1742
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1742
             Project: Apache Airflow
          Issue Type: Wish
            Reporter: Andrew Loughran


I have run this with both the LocalExecutor and the CeleryExecutor against a remote Postgres DB.

On the initial run, the DAG completed successfully, but on subsequent runs the execution of the tasks halts.  The UI reports that the tasks are queued, but the executor never picks them up to process them.  This has resulted in stalled DAGs, and ultimately the only way forward was an `airflow resetdb` to get a fresh state, a restart of all the airflow services (scheduler, worker, webserver & flower), and a flush of the redis queue.
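
For reference, the recovery steps above amount to roughly the following (the exact way the services are restarted depends on how they are supervised, and `redis-cli flushall` assumes the default local Redis broker):

    # wipe the airflow metadata DB back to a fresh state
    airflow resetdb

    # restart the airflow services, e.g. in separate terminals or under a supervisor
    airflow scheduler
    airflow worker
    airflow webserver
    airflow flower

    # flush the redis queue used as the Celery broker
    redis-cli flushall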

I'd like to help troubleshoot what could be causing this issue, but I will probably need assistance with how to set up the logs and replicate the problem with debugging fully turned on.  Once that's done, I'm hoping the fix will be more obvious.
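
As a first step, I plan to bump the log level in airflow.cfg, something along these lines (assuming the `logging_level` option under `[core]`; the option name may differ between versions):

    [core]
    # emit DEBUG-level output from the scheduler and workers
    logging_level = DEBUG

and then run the scheduler in the foreground (`airflow scheduler`) so its output is easy to capture and attach here.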

I'm opening this ticket as I will try to start the work myself and post the logs as attachments, but I will need support from other developers as and when I've got the logs here.

Thanks,

Andy


