Posted to dev@airflow.apache.org by Nadeem Ahmed Nazeer <na...@neon-lab.com> on 2016/08/01 21:29:07 UTC

DAG status still running when all its tasks are complete

Hello,

I am facing a situation with Airflow where it doesn't flag a DAG as
success even though all of the tasks in that DAG are complete.

I have a BranchPythonOperator which forks into either running all downstream
tasks or running just a single task (a dummy operator as an endpoint),
depending on whether any files exist to be processed for that cycle.

I see that for the DAGs that go to the dummy operator, the status of the
DAG always shows running even though it is complete. I can't figure out what
is stopping the scheduler from marking this DAG as success. Since it stays in
the running state, the scheduler re-checks the status of this DAG on every
cycle, which is unnecessary.

Please advise.

Thanks,
Nadeem
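
For reference, the branching decision described above can be sketched as a
plain callable of the kind a BranchPythonOperator would invoke (the task ids
process_files and no_files_endpoint, and the input directory, are hypothetical;
the actual DAG is not shown in this thread):

```python
import glob
import os
import tempfile

def choose_branch(input_dir):
    """Return the task_id the BranchPythonOperator should follow:
    the processing branch if any files exist for this cycle,
    otherwise a no-op endpoint (a DummyOperator in the DAG)."""
    if glob.glob(os.path.join(input_dir, "*")):
        return "process_files"      # downstream processing tasks
    return "no_files_endpoint"      # dummy endpoint; other branch is skipped

# Quick check: an empty directory routes to the endpoint,
# a directory with a file routes to the processing branch.
with tempfile.TemporaryDirectory() as d:
    empty_choice = choose_branch(d)
    open(os.path.join(d, "part-0000"), "w").close()
    nonempty_choice = choose_branch(d)

print(empty_choice, nonempty_choice)
```

When the endpoint branch is taken, the tasks on the other branch end up in
the skipped state, which is the path where the DAG run reportedly never
transitions out of running.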

Re: DAG status still running when all its tasks are complete

Posted by Nadeem Ahmed Nazeer <na...@neon-lab.com>.
Hi Siddharth,

AIRFLOW-396 <https://issues.apache.org/jira/browse/AIRFLOW-396> has been
assigned to you with requested information. Thanks for your help.

Please let me know if any further information is required.

Thanks,
Nadeem


Re: DAG status still running when all its tasks are complete

Posted by siddharth anand <sa...@apache.org>.
Hi Nadeem,
Can you open a JIRA, attach a DAG which I can run to reproduce your issue,
and assign the JIRA to me?
-s


Re: DAG status still running when all its tasks are complete

Posted by Nadeem Ahmed Nazeer <na...@neon-lab.com>.
Could someone please shed some light on this DAG status?

My Airflow version is 1.7.0. This is the only version whose scheduler works
for me. On any version above this, the scheduler gets stuck without a trace
and won't schedule anything.

Thanks,
Nadeem
