Posted to commits@airflow.apache.org by "Ivan Vergiliev (JIRA)" <ji...@apache.org> on 2017/09/05 13:09:00 UTC

[jira] [Commented] (AIRFLOW-1515) Airflow 1.8.1 tasks not being marked as upstream_failed when one of the parents fails

    [ https://issues.apache.org/jira/browse/AIRFLOW-1515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16153612#comment-16153612 ] 

Ivan Vergiliev commented on AIRFLOW-1515:
-----------------------------------------

I'm seeing this as well on 1.8.0. As far as I can tell, this is related to the `flag_upstream_failed` property of `DepContext` and the logic around it in `airflow/ti_deps/deps/trigger_rule_dep.py:_evaluate_trigger_rule`. If I'm reading this correctly, when `flag_upstream_failed` is set to False, the task's state is never set to `UPSTREAM_FAILED`, so the task stays in a "something is blocking this task from being scheduled" state.
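
For reference, here is a simplified paraphrase (my reading of the 1.8.x source, not a verbatim copy) of the relevant branch in `_evaluate_trigger_rule`. The key point is that the call that moves the task to `UPSTREAM_FAILED` only happens inside the `if flag_upstream_failed:` block:

```python
# Simplified paraphrase of _evaluate_trigger_rule from
# airflow/ti_deps/deps/trigger_rule_dep.py (Airflow 1.8.x) -- not the
# verbatim source, names abbreviated.
from airflow.utils.state import State
from airflow.utils.trigger_rule import TriggerRule

def _evaluate_trigger_rule(self, ti, successes, skipped, failed,
                           upstream_failed, done, flag_upstream_failed,
                           session):
    trigger_rule = ti.task.trigger_rule

    # All state mutation is gated on flag_upstream_failed. When the
    # caller builds a DepContext with flag_upstream_failed=False, this
    # whole block is skipped, so the TaskInstance is never moved to
    # UPSTREAM_FAILED -- it just keeps failing the dependency check.
    if flag_upstream_failed:
        if trigger_rule == TriggerRule.ALL_SUCCESS:
            if upstream_failed or failed:
                ti.set_state(State.UPSTREAM_FAILED, session)
            elif skipped:
                ti.set_state(State.SKIPPED, session)
        # ... analogous handling for the other trigger rules ...
```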

> Airflow 1.8.1 tasks not being marked as upstream_failed when one of the parents fails
> -------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-1515
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1515
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: core, DAG, DagRun
>    Affects Versions: 1.8.1
>            Reporter: Jose Sanchez
>         Attachments: airflow_bug.png, rofl_test.py
>
>
> The trigger rule "all_done" is not working when a task's parents are marked as State.NONE instead of State.UPSTREAM_FAILED. I am attaching a very small DAG as an example, along with a picture of the run, where the last task should have been executed before the DAG was marked as failed.
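
The rofl_test.py attachment is not inlined in this plain-text view. A minimal DAG of the shape described above (a hypothetical reconstruction, not the actual attachment) would be: a task that always fails, a child of it, and a final task with trigger_rule='all_done' that should still run once all of its parents are done:

```python
# Hypothetical repro -- a guess at the shape of rofl_test.py, not the
# actual attachment. Uses Airflow 1.8-era import paths.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id='upstream_failed_repro',
    start_date=datetime(2017, 1, 1),
    schedule_interval=None,
)

# Always fails, so 'middle' should be marked UPSTREAM_FAILED.
failing = BashOperator(task_id='failing', bash_command='exit 1', dag=dag)

# With the bug, this stays in State.NONE instead of UPSTREAM_FAILED...
middle = BashOperator(task_id='middle', bash_command='echo middle', dag=dag)

# ...so this task never sees all of its parents as "done" and never
# runs, even though trigger_rule='all_done' says it should.
final = BashOperator(
    task_id='final',
    bash_command='echo final',
    trigger_rule='all_done',
    dag=dag,
)

failing >> middle >> final
```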


