Posted to commits@airflow.apache.org by "Siddharth Anand (JIRA)" <ji...@apache.org> on 2017/12/06 00:48:32 UTC
[jira] [Assigned] (AIRFLOW-1886) Failed jobs are not being counted towards max_active_runs_per_dag
[ https://issues.apache.org/jira/browse/AIRFLOW-1886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Siddharth Anand reassigned AIRFLOW-1886:
----------------------------------------
Assignee: Oleg Yamin
> Failed jobs are not being counted towards max_active_runs_per_dag
> -----------------------------------------------------------------
>
> Key: AIRFLOW-1886
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1886
> Project: Apache Airflow
> Issue Type: Bug
> Components: DagRun
> Affects Versions: 1.8.1
> Reporter: Oleg Yamin
> Assignee: Oleg Yamin
>
> Currently, I have set max_active_runs_per_dag = 2 in airflow.cfg, but when a DAG run aborts, the scheduler keeps submitting the next DAG run in the queue without counting the incomplete run that is already there. I am using 1.8.1, but I see that jobs.py in the latest version still does not address this issue.
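> The behavior described above can be illustrated with a minimal sketch. This is NOT the actual Airflow jobs.py logic; it is a hypothetical model (the `DagRun` dataclass, `can_start_new_run`, and `count_failed` flag are all invented for illustration) showing why a cap that counts only runs in the "running" state lets new runs start even while an aborted run's work is still incomplete:

```python
# Hypothetical sketch of a max_active_runs_per_dag gate; not Airflow source code.
from dataclasses import dataclass

MAX_ACTIVE_RUNS_PER_DAG = 2  # mirrors the airflow.cfg setting in the report

@dataclass
class DagRun:
    run_id: str
    state: str  # "running", "failed", or "success"

def can_start_new_run(existing_runs, count_failed=False):
    """Return True if a new DAG run may be scheduled.

    With count_failed=False (the behavior the reporter observes), a run
    that aborted is no longer counted, so the scheduler keeps launching
    new runs even though the failed run's work may be incomplete.
    """
    counted_states = {"running"}
    if count_failed:
        counted_states.add("failed")
    active = sum(1 for run in existing_runs if run.state in counted_states)
    return active < MAX_ACTIVE_RUNS_PER_DAG

runs = [DagRun("r1", "running"), DagRun("r2", "failed")]
print(can_start_new_run(runs))                     # True: failed run ignored
print(can_start_new_run(runs, count_failed=True))  # False: failed run counted
```

> Under this model, the reported bug corresponds to the `count_failed=False` branch: only one run is counted as active, so a third run is admitted despite two runs effectively occupying the queue.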
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)