Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/08/30 15:59:57 UTC

[GitHub] [airflow] abhaysjuneja commented on issue #14896: Error: 'daemonic processes are not allowed to have children', after upgradeing to airflow:2.0.1

abhaysjuneja commented on issue #14896:
URL: https://github.com/apache/airflow/issues/14896#issuecomment-908463824


   > @pmlewis Because `execute_tasks_new_python_interpreter = True` will create a new Python interpreter for each task, it has performance implications -- a few seconds/ms at least.
   
   I tried this with LocalExecutor, but I still get this warning: `UserWarning: Multiprocessing-backed parallel loops cannot be nested, setting n_jobs=1`. With joblib, it effectively forces n_jobs=1 and serializes the work, rendering the parallelism useless. I'll try this with CeleryExecutor next.
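   For context, the error in the issue title comes from a restriction in Python's `multiprocessing` itself, not from Airflow or joblib: a daemonic process (which is what LocalExecutor's task-runner workers are) is not allowed to spawn child processes. A minimal, standalone sketch (the function names `child` and `try_spawn` are illustrative, not Airflow code) reproduces the message:

   ```python
   import multiprocessing as mp

   def child():
       pass

   def try_spawn(q):
       # Runs inside a daemonic process; attempting to start a child
       # process here trips CPython's daemon check in Process.start().
       try:
           p = mp.Process(target=child)
           p.start()
           p.join()
           q.put("ok")
       except AssertionError as e:
           q.put(str(e))

   if __name__ == "__main__":
       q = mp.Queue()
       d = mp.Process(target=try_spawn, args=(q,), daemon=True)
       d.start()
       d.join()
       result = q.get()
       print(result)  # the familiar "daemonic processes are not allowed to have children"
   ```

   This is also why joblib falls back to `n_jobs=1` inside such a worker: it detects it cannot nest multiprocessing-backed parallelism and degrades to sequential execution rather than crash.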


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org