Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/09/02 14:12:04 UTC

[GitHub] [airflow] potiuk commented on issue #17988: DAGBAG_IMPORT_TIMEOUT does not work

potiuk commented on issue #17988:
URL: https://github.com/apache/airflow/issues/17988#issuecomment-911729268


   Did you restart your scheduler after the change? And did you change it as a "global" setting or a "worker"-level one?
   
   I believe you should repeat your getfloat() check from the `scheduler`, not from the "worker" - it is very likely that your scheduler still uses the old value.
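   
   For example (a quick sketch of mine, not something from the original report, and assuming the default layout where the option lives under `[core]`), you can check what the scheduler actually resolves for this setting by running the following in the scheduler's environment:
   
   ```python
   # Minimal sketch: run this inside the *scheduler* environment (e.g. the
   # scheduler container), not the worker, to see the value the scheduler uses.
   from airflow.configuration import conf
   
   # dagbag_import_timeout lives in the [core] section of airflow.cfg
   print(conf.getfloat("core", "dagbag_import_timeout"))
   ```
   
   In Airflow 2.x the CLI equivalent is `airflow config get-value core dagbag_import_timeout`, again executed where the scheduler runs.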
   
   However, I have a side comment, and a very important one: you will have a LOT of problems if your DAGs take that long to import.
   
   An import taking > 30 seconds is probably an indication of a bigger problem. It most likely means that you do not follow the best practices for top-level Python code: https://airflow.apache.org/docs/apache-airflow/stable/best-practices.html#top-level-python-code . I think you should rather focus on making sure that you do not do any heavy lifting while importing your DAGs, and move all the heavy processing either to before the DAG is scheduled (generating some metadata) or inside the "execute" methods of your operators (or Jinja templating).
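   
   To illustrate (a hedged sketch of the pattern described in the best-practices doc, with hypothetical names such as `expensive_lookup`), the heavy work should live inside the task callable rather than at module level:
   
   ```python
   # Sketch of moving heavy work out of top-level code. Anything at module
   # level runs on every DAG-file parse and counts against dagbag_import_timeout.
   import pendulum
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   
   # BAD (illustrative): a slow call at module level is executed on every parse:
   #   rows = expensive_lookup()
   
   def expensive_lookup():
       # hypothetical placeholder for a slow DB/API call
       return []
   
   def run_heavy_work():
       # GOOD: the heavy lifting happens only when the task executes,
       # not when the scheduler parses the file.
       return len(expensive_lookup())
   
   with DAG(
       dag_id="top_level_code_example",
       start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
       schedule_interval=None,
       catchup=False,
   ) as dag:
       PythonOperator(task_id="heavy_task", python_callable=run_heavy_work)
   ```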
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org