Posted to dev@airflow.apache.org by ra...@gmail.com on 2018/05/21 12:13:18 UTC

Dags getting failed after 24 hours

Hi All,
We have a long-running DAG which is expected to take around 48 hours. But we are observing that it gets killed by the Airflow scheduler after ~24 hrs. We are not setting any DAG/task execution timeout explicitly.
Is there any default timeout value that gets used? We are using LocalExecutor mode.
We checked in the Airflow code, but the execution timeout values seem to be set to 'None'.
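
For context, the DAG definition is roughly like the sketch below (the identifiers and command are placeholders, not our actual code); note that we do not pass any timeout anywhere:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Placeholder DAG: no dagrun_timeout is passed to the DAG
dag = DAG(
    dag_id="long_running_job",
    start_date=datetime(2018, 5, 1),
    schedule_interval="@daily",
)

# Placeholder task: no execution_timeout is passed to the operator
long_task = BashOperator(
    task_id="long_task",
    bash_command="./run_long_job.sh",  # stands in for our real ~48 hour job
    dag=dag,
)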

Thanks,
Raman Gupta

Re: Dags getting failed after 24 hours

Posted by Maxime Beauchemin <ma...@gmail.com>.
Even though it's possible to set an `execution_timeout` on any task and/or
a `dagrun_timeout` on DAG runs, by default these are all set to None (unless
you're somehow setting the DAG's default parameters in some other way).
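
If you did want to cap them explicitly, they would be passed roughly like this (a minimal sketch with arbitrary example values, just to show where the knobs live):

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id="long_running_job",
    start_date=datetime(2018, 5, 1),
    schedule_interval="@daily",
    dagrun_timeout=timedelta(hours=72),  # limit on the whole DAG run
)

long_task = BashOperator(
    task_id="long_task",
    bash_command="./run_long_job.sh",
    execution_timeout=timedelta(hours=60),  # limit on this single task
    dag=dag,
)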

Maybe you have some OS-level policies on long-running processes in your
environment? Anything in the logs? A SIGKILL?

Max

On Mon, May 21, 2018 at 5:13 AM ramandumcs@gmail.com <ra...@gmail.com>
wrote:

> Hi All,
> We have a long-running DAG which is expected to take around 48 hours. But
> we are observing that it gets killed by the Airflow scheduler after ~24
> hrs. We are not setting any DAG/task execution timeout explicitly.
> Is there any default timeout value that gets used? We are using
> LocalExecutor mode.
> We checked in the Airflow code, but the execution timeout values seem to
> be set to 'None'.
>
> Thanks,
> Raman Gupta
>