Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/09/16 08:03:15 UTC

[GitHub] [airflow] kosteev commented on issue #18229: When I set wait_for_downstream=True, the task will not wait, and a failure will be reported

kosteev commented on issue #18229:
URL: https://github.com/apache/airflow/issues/18229#issuecomment-920679992


   I can reproduce it in Airflow 2.1.2.
   
   The code of the DAG:
   ```python
   import datetime

   from airflow import DAG
   from airflow.operators.bash import BashOperator

   default_args = {'wait_for_downstream': True, 'provide_context': False}
   command = 'sleep 1'
   
   with DAG(
       'test_downstream',
       start_date=datetime.datetime.now() - datetime.timedelta(hours=1),
       schedule_interval='*/1 * * * *',
       default_args=default_args,
       max_active_runs=100, dagrun_timeout=datetime.timedelta(minutes=60),
       catchup=True) as dag:
   
     start_task = BashOperator(
         task_id='start_task', bash_command=command, retries=3,
         trigger_rule='all_success', dag=dag)
   
     end_task = BashOperator(
         task_id='end_task', bash_command=command, retries=3,
         trigger_rule='all_success', dag=dag)
   
     for target_date in range(0, 2):
       bash_cmd_j000 = BashOperator(
           task_id='bash_cmd_j000_' + str(target_date),
           bash_command=command, retries=3,
           trigger_rule='all_success', dag=dag)
       bash_cmd_j001 = BashOperator(
           task_id='bash_cmd_j001_' + str(target_date),
           bash_command=command, retries=3,
           trigger_rule='all_success', dag=dag)
       bash_cmd_j002 = BashOperator(
           task_id='bash_cmd_j002_' + str(target_date),
           bash_command=command, retries=3,
           trigger_rule='all_success', dag=dag)
       bash_cmd_j003 = BashOperator(
           task_id='bash_cmd_j003_' + str(target_date),
           bash_command=command, retries=3,
           trigger_rule='all_success', dag=dag)
   
       start_task >> bash_cmd_j000 >> bash_cmd_j001 >> bash_cmd_j002 >> bash_cmd_j003 >> end_task
   ```
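
   As a rough back-of-the-envelope (a sketch, assuming the scheduler catches up immediately; the numbers follow only from the DAG arguments above): with a start_date one hour in the past, a per-minute schedule_interval and catchup=True, roughly 60 dag runs are created up front, and max_active_runs=100 allows all of them to be active concurrently, which is exactly what exercises the wait_for_downstream dependency between consecutive runs.
   ```python
   import datetime

   # Back-of-the-envelope count of catchup runs for the DAG above:
   # start_date is one hour ago, the schedule fires every minute, and
   # catchup=True, so roughly 60 runs are created at once; max_active_runs=100
   # lets all of them run at the same time.
   start_date = datetime.datetime.now() - datetime.timedelta(hours=1)
   schedule = datetime.timedelta(minutes=1)
   backfill_runs = (datetime.datetime.now() - start_date) // schedule
   print(backfill_runs)  # ~60, well under max_active_runs=100
   ```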
   
   After some time, some tasks are scheduled and then marked as failed (or up_for_retry) with the following log message:
   ```
   [2021-09-16 07:56:46,290] {taskinstance.py:887} INFO - Dependencies not met for <TaskInstance: test_downstream.bash_cmd_j001_0 2021-09-16T06:58:00+00:00 [queued]>, dependency 'Previous Dagrun State' FAILED: The tasks downstream of the previous task instance <TaskInstance: test_downstream.bash_cmd_j001_0 2021-09-16 06:57:00+00:00 [success]> haven't completed (and wait_for_downstream is True).
   [2021-09-16 07:56:46,560] {local_task_job.py:96} INFO - Task is not able to be run
   ```
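
   For reference, the documented meaning of wait_for_downstream is that an instance of task X waits for the tasks immediately downstream of the previous instance of task X to finish successfully before it runs. Below is a minimal, illustrative sketch of that gating rule applied to the chain above; it is not Airflow's internal dependency check, and prev_run_states is a hypothetical mapping of task_id to state for the preceding run.
   ```python
   # Illustrative sketch of the wait_for_downstream rule; NOT Airflow's internal
   # dependency check. prev_run_states is a hypothetical mapping of task_id to
   # state for the dag run immediately preceding the one being scheduled.
   from typing import Dict, List

   def may_run(task_id: str, direct_downstream: List[str],
               prev_run_states: Dict[str, str]) -> bool:
       """Task X of run N may start only once X *and* the tasks directly
       downstream of X in run N-1 have finished successfully."""
       if prev_run_states.get(task_id) != "success":
           return False
       return all(prev_run_states.get(t) == "success" for t in direct_downstream)

   # bash_cmd_j001_0 of the 06:58 run is gated on bash_cmd_j001_0 *and* its
   # direct downstream bash_cmd_j002_0 from the 06:57 run, which matches the
   # "Dependencies not met" log line above.
   prev = {"bash_cmd_j001_0": "success", "bash_cmd_j002_0": "running"}
   print(may_run("bash_cmd_j001_0", ["bash_cmd_j002_0"], prev))  # -> False
   ```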


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org