Posted to dev@airflow.apache.org by "Ryabchuk, Pavlo" <ex...@here.com> on 2016/05/26 13:23:45 UTC

Correct way to stop execution of a triggered DAG(DagRun)

Hi,

I am wondering what is the most correct way to stop execution of a triggered DAG (DagRun), with all its subDAGs and tasks, when using the CeleryExecutor? It's not that important to mark all the tasks specifically, just to make sure the Celery/RabbitMQ queue is empty for the next DagRun.
My concrete case: if one of a subset of tasks fails, I need to stop execution of the whole DAG (DagRun). The workers run on EC2 instances, which means I will shut down the instances (with the Airflow CeleryExecutor workers) as well.
So far I plan to use on_failure_callback to stop the instances and do this cleanup.
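A minimal sketch of what I have in mind (not a working Airflow snippet — the DagRun class below is a stand-in for Airflow's model, and shut_down_worker_instances is a hypothetical placeholder for the EC2 cleanup; real attribute and method names may differ):

```python
# Sketch: an on_failure_callback that fails the whole DagRun and then
# runs the instance cleanup. Airflow passes the callback a context dict;
# here we assume it contains the DagRun under the "dag_run" key.

class DagRun:
    """Stand-in for Airflow's DagRun model (illustrative only)."""
    def __init__(self):
        self.state = "running"

def shut_down_worker_instances():
    """Placeholder for the EC2 cleanup, e.g. stopping the instances
    that run the Celery workers."""
    print("stopping worker instances")

def stop_dag_run_on_failure(context):
    """Mark the DagRun failed so no further tasks are queued,
    then perform the cleanup."""
    context["dag_run"].state = "failed"
    shut_down_worker_instances()
```

The idea is to attach stop_dag_run_on_failure as the on_failure_callback of the tasks in question, so a single failure fails the run and empties the queue before the instances go down.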

Also, regarding the trigger_rule functionality: what about extending it to support multiple rules? (e.g. all_success|one_failed)

Best,
Paul