Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/07/13 23:18:41 UTC

[GitHub] [airflow] pmlewis commented on issue #14896: Error: 'daemonic processes are not allowed to have children', after upgrading to airflow:2.0.1

pmlewis commented on issue #14896:
URL: https://github.com/apache/airflow/issues/14896#issuecomment-879467171


   I had a similar issue using CeleryExecutor and multiprocessing. This may be a side effect of Airflow 2 forking the worker by default to execute tasks rather than spawning a new Python interpreter, which seems to be one of the things that changed from Airflow 1 in CeleryExecutor and LocalExecutor. There's a config option that controls this behavior. I got around the issue by adding
   
   execute_tasks_new_python_interpreter = True
   
   to my airflow.cfg under [core], and without using PYTHONOPTIMIZE. It looks like both CeleryExecutor and LocalExecutor use this value to decide whether to fork or to spawn a new interpreter, so it may work for LocalExecutor too. A quick way to sanity-check the failure mode outside of Airflow is sketched below.
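   
   For anyone trying to confirm this is the same failure mode: the error message comes straight from Python's multiprocessing module, which refuses to let a daemon process start children of its own. Here is a minimal sketch that reproduces it (task_callable is just an illustrative name, and daemon=True stands in for the daemonized worker processes that Celery's prefork pool uses):
   
       import multiprocessing
       
       def task_callable():
           # Creating a Pool starts child worker processes. When the current
           # process is itself a daemon, multiprocessing raises:
           #   AssertionError: daemonic processes are not allowed to have children
           with multiprocessing.Pool(2) as pool:
               print(pool.map(str, range(4)))
       
       if __name__ == "__main__":
           # daemon=True simulates running inside a daemonized worker process.
           worker = multiprocessing.Process(target=task_callable, daemon=True)
           worker.start()
           worker.join()
   
   As I understand it, with execute_tasks_new_python_interpreter = True the task runs in a freshly spawned interpreter rather than a fork of the daemonized worker, so the child-process restriction no longer applies. If you configure Airflow through environment variables instead of airflow.cfg, the equivalent should be AIRFLOW__CORE__EXECUTE_TASKS_NEW_PYTHON_INTERPRETER=True.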


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org