Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/08/11 14:05:40 UTC

[GitHub] [airflow] JonnyWaffles commented on issue #14896: Error: 'daemonic processes are not allowed to have children', after upgrading to airflow:2.0.1

JonnyWaffles commented on issue #14896:
URL: https://github.com/apache/airflow/issues/14896#issuecomment-1212036202

   Hi team, apologies for adding to a closed thread. I encountered the same problem with the `CeleryExecutor` when a third-party package (Okera, in my case) attempts to use multiprocessing:
   
   >   File "/app/.local/lib/python3.9/site-packages/okera/concurrency.py", line 16, in __init__
   >     self.manager = multiprocessing.Manager()
   > ...
   
   >   File "/usr/lib/python3.9/multiprocessing/process.py", line 118, in start
   >     assert not _current_process._config.get('daemon'), \
   > AssertionError: daemonic processes are not allowed to have children
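   
   For reference, the assertion itself is raised by CPython, not by Airflow or Okera: `BaseProcess.start()` refuses to start a child from any process whose `daemon` flag is set. A minimal standalone reproduction, with no Airflow or Celery involved:
   
   ```python
   import multiprocessing


   def child():
       pass


   def daemon_body():
       # This function runs inside a process started with daemon=True, so
       # starting a child here trips the same assertion as in the traceback:
       # AssertionError: daemonic processes are not allowed to have children
       multiprocessing.Process(target=child).start()


   if __name__ == "__main__":
       p = multiprocessing.Process(target=daemon_body, daemon=True)
       p.start()
       p.join()
   ```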
   
   I am using Airflow 2.3.3 and Celery 5.2.7 running on the default `airflow celery worker` entrypoint. Where in the stack is the daemonic subprocess created? When I comb through the code, I see that by default the workers execute tasks in a fork. Are the forked Celery workers daemons by default? Is there a way to resolve the issue without monkey patching a third-party library's use of multiprocessing, per @potiuk's suggestion above? I just want to understand the problem better before mucking around. Thanks for any insight you can provide!
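   
   In case it helps anyone else searching: if monkey patching is unavoidable, I assume it would amount to clearing the daemon flag that the assertion checks. An untested sketch (note that `_config` is a CPython-internal detail of `multiprocessing`, so this may break across Python versions):
   
   ```python
   import multiprocessing

   # Untested: clear the daemon flag on the current worker process *before*
   # the third-party library (here, Okera's multiprocessing.Manager() call)
   # tries to start children. After this, the assert in BaseProcess.start()
   # no longer fires.
   multiprocessing.current_process()._config['daemon'] = False
   ```
   
   The cleaner route, if one exists, would presumably be running the worker with a non-forking Celery pool (`--pool=solo` or `--pool=threads`), since the error only arises inside daemonic forked children.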


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org