Posted to commits@airflow.apache.org by "Nikolay Petrachkov (JIRA)" <ji...@apache.org> on 2017/09/22 07:36:00 UTC
[jira] [Updated] (AIRFLOW-1630) Tasks do not end up in state UPSTREAM_FAILED consistently
[ https://issues.apache.org/jira/browse/AIRFLOW-1630?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Nikolay Petrachkov updated AIRFLOW-1630:
----------------------------------------
Environment: Ubuntu 16.04, PostgreSQL 9.6.5, Python 2.7.12 (was: Ubuntu 16.04)
> Tasks do not end up in state UPSTREAM_FAILED consistently
> ---------------------------------------------------------
>
> Key: AIRFLOW-1630
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1630
> Project: Apache Airflow
> Issue Type: Bug
> Affects Versions: 1.8.2
> Environment: Ubuntu 16.04, PostgreSQL 9.6.5, Python 2.7.12
> Reporter: Nikolay Petrachkov
> Attachments: Screen Shot 2017-09-21 at 17.36.37.png
>
>
> Given a simple DAG with two tasks: a BashOperator and a DummyOperator.
> The BashOperator runs the command "exit 1" and is upstream of the DummyOperator.
> When we run this DAG, we expect the BashOperator to fail and the DummyOperator to end up in state UPSTREAM_FAILED.
> Actual result: the BashOperator is in state FAILED, but the DummyOperator remains in state None.
> Code:
> {code:python}
> from datetime import datetime
>
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from airflow.operators.dummy_operator import DummyOperator
>
> default_args = {
>     'owner': 'airflow',
>     'start_date': datetime(2017, 9, 20),
>     'retries': 0
> }
>
> dag = DAG(
>     'delivery-failed',
>     default_args=default_args,
>     schedule_interval=None
> )
>
> failed_bash = "exit 1"
>
> bash_task = BashOperator(
>     task_id='bash-task',
>     bash_command=failed_bash,
>     dag=dag
> )
>
> end_task = DummyOperator(
>     task_id='end',
>     dag=dag
> )
>
> bash_task >> end_task
> {code}
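For context (not part of the original report): under Airflow's default trigger rule, all_success, the scheduler is expected to mark a downstream task UPSTREAM_FAILED as soon as any upstream task has failed. A minimal standalone sketch of that dependency evaluation, in plain Python with no Airflow imports; the state constants and the `resolve_state` helper are illustrative assumptions, not Airflow's actual internals:

```python
# Sketch of the all_success trigger rule's expected behavior.
# resolve_state is a hypothetical helper, for illustration only.

FAILED = 'failed'
SUCCESS = 'success'
UPSTREAM_FAILED = 'upstream_failed'
NONE = None  # task instance not yet scheduled

def resolve_state(upstream_states):
    """Decide a downstream task's state under the all_success rule."""
    if any(s in (FAILED, UPSTREAM_FAILED) for s in upstream_states):
        # Any failed upstream should propagate as UPSTREAM_FAILED.
        return UPSTREAM_FAILED
    # Otherwise the task either becomes runnable (all upstreams
    # succeeded) or keeps waiting; in both cases its state stays None.
    return NONE

# In the reported DAG, bash-task fails, so the 'end' task should
# resolve to UPSTREAM_FAILED rather than staying in state None.
print(resolve_state([FAILED]))  # expected: upstream_failed
```

Since the report sets retries=0, the BashOperator goes straight to FAILED on its first try, so per this rule the DummyOperator should immediately become UPSTREAM_FAILED instead of remaining in state None.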
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)