Posted to commits@airflow.apache.org by "Nadeem Ahmed Nazeer (JIRA)" <ji...@apache.org> on 2016/08/04 19:19:20 UTC

[jira] [Updated] (AIRFLOW-396) DAG status still running when all its tasks are complete

     [ https://issues.apache.org/jira/browse/AIRFLOW-396?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nadeem Ahmed Nazeer updated AIRFLOW-396:
----------------------------------------
    Attachment: DagRuns.png

> DAG status still running when all its tasks are complete
> --------------------------------------------------------
>
>                 Key: AIRFLOW-396
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-396
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun, scheduler
>    Affects Versions: Airflow 1.7.0
>            Reporter: Nadeem Ahmed Nazeer
>            Assignee: Siddharth Anand
>            Priority: Minor
>         Attachments: DagRuns.png
>
>
> Hello,
> I am facing a situation with Airflow where it doesn't flag a DAG run as success even though all of the tasks in that DAG are complete.
> I have a BranchPythonOperator which forks into either running all downstream tasks or just a single task (a DummyOperator as an endpoint), depending on whether there are files to be processed for that cycle.
> I see that for the DAG runs that go to the dummy operator, the DAG status always shows running even though the run is complete. I can't figure out what is stopping the scheduler from marking these DAG runs as success. Since they remain in the running state, the scheduler re-checks their status on every cycle, which is unnecessary.
> Please advise.
> Thanks,
> Nadeem



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)