Posted to dev@airflow.apache.org by Chengzhi Zhao <w....@gmail.com> on 2018/03/11 00:51:56 UTC

Question on running dag with new dag definition

Hey there,

I have a question about Airflow DAG deployment. We hit an issue where, if a
DAG is currently running and a new version of its Python file is deployed
(our CI/CD process replaces the original file), the downstream tasks of the
in-flight run change to match the new file.

I went through the Airflow jobs and dag_processing code, and it looks like
Airflow aggressively re-parses the DAG definition, which would explain why
the downstream tasks change in my case. Please advise whether my
understanding is correct:

https://github.com/apache/incubator-airflow/blob/master/airflow/jobs.py#L1617
https://github.com/apache/incubator-airflow/blob/master/airflow/utils/dag_processing.py#L535

If so, is there any way or option in Airflow to skip DAG file processing
while a DAG is running? I would also love to learn how other people usually
deploy their DAGs, and whether anyone has faced this issue before.
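(For context, one workaround I have seen discussed, not an Airflow built-in, just a deployment convention, is to embed a version in the dag_id so that a redeployed file registers as a new DAG rather than mutating the one that is running. A minimal sketch in plain Python; the `make_versioned_dag_id` helper is hypothetical:)

```python
import hashlib

def make_versioned_dag_id(base_id: str, dag_source: str) -> str:
    """Derive a dag_id that changes whenever the DAG definition changes.

    Embedding a short content hash in the dag_id means a redeployed file
    defines a *new* DAG, so in-flight runs of the old version keep their
    original task definitions instead of picking up the new ones.
    """
    digest = hashlib.sha256(dag_source.encode("utf-8")).hexdigest()[:8]
    return f"{base_id}_v{digest}"

# Inside an Airflow DAG file this would be used roughly as:
#   dag = DAG(dag_id=make_versioned_dag_id("my_pipeline",
#                                          open(__file__).read()), ...)
old_id = make_versioned_dag_id("my_pipeline", "task_a >> task_b")
new_id = make_versioned_dag_id("my_pipeline", "task_a >> task_c")
print(old_id != new_id)  # different definitions yield different dag_ids
```

The trade-off is that each deploy leaves the old dag_id behind in the metadata database, so old versions need to be cleaned up or paused separately.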

Thanks,
Chengzhi