Posted to dev@airflow.apache.org by Sachin <pa...@gmail.com> on 2017/07/12 05:02:04 UTC

dag_run marked as failed even when there are tasks with status UP_FOR_RETRY

Hi,

I am using Airflow 1.7.1.3.

I have a DAG containing a BashOperator whose bash_command is 'dat' (an
invalid command), so the task is guaranteed to fail.

When I trigger the DAG with "airflow trigger_dag testretry1", it runs the
task, which fails with:

ERROR - Bash command failed


I expect it to attempt the task a second time, since retries and
retry_delay are defined. Instead, after the first attempt the dag_run is
marked as failed and the task never runs a second time. The task instance
status is up_for_retry at this point.

As soon as I change the dag_run state back to "running" (the UI has an
option to change the status of a dag_run), the task runs a second time and
fails, as expected.
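
For reference, the same state flip can presumably be made against the
metadata database directly instead of through the UI. A minimal sketch,
assuming the ORM layout of Airflow 1.7.x (DagRun model, settings.Session):

from airflow import settings
from airflow.models import DagRun

session = settings.Session()
# Grab the most recent run of the test DAG.
run = (session.query(DagRun)
       .filter(DagRun.dag_id == 'testretry1')
       .order_by(DagRun.execution_date.desc())
       .first())
run.state = 'running'  # same effect as changing the state in the UI
session.commit()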

Expected behavior:
Since I have retries=1 and retry_delay=2 minutes, I expect the task
instance to run and fail the first time (task instance
status=up_for_retry, dag_run status=running), then run again after 2
minutes and fail (task instance status=failed, dag_run status=failed).

Actual behavior:
The task instance runs and fails the first time (task instance
status=up_for_retry, dag_run status=failed), and the task instance never
runs a second time.
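
These states can also be verified against the metadata database. A minimal
sketch of the query, again assuming the 1.7.x models:

from airflow import settings
from airflow.models import DagRun, TaskInstance

session = settings.Session()
# Most recent task instance of the failing task.
ti = (session.query(TaskInstance)
      .filter(TaskInstance.dag_id == 'testretry1',
              TaskInstance.task_id == 'print_date')
      .order_by(TaskInstance.execution_date.desc())
      .first())
# Most recent run of the DAG.
dr = (session.query(DagRun)
      .filter(DagRun.dag_id == 'testretry1')
      .order_by(DagRun.execution_date.desc())
      .first())
print(ti.state, dr.state)  # observed: up_for_retry, failed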


Here is my DAG definition:

from airflow.models import DAG
from airflow.operators import BashOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'Sachin Parmar',
    'depends_on_past': False,
    'start_date': datetime.now(),  # dynamic start date
    'email': ['abc@xyz.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,                         # retry once ...
    'retry_delay': timedelta(minutes=2),  # ... after a 2-minute delay
}

dag = DAG('testretry1', default_args=default_args, schedule_interval=None)

task1 = BashOperator(
    task_id='print_date',
    bash_command='dat',  # intentionally invalid command, always fails
    dag=dag)


Is anything wrong with the DAG or with my understanding? Please correct me.

Thanks,
Sachin Parmar