Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/04/08 20:27:18 UTC

[GitHub] [airflow] potiuk commented on issue #15286: use PythonVirtualenvOperator with a prebuilt env

potiuk commented on issue #15286:
URL: https://github.com/apache/airflow/issues/15286#issuecomment-816158124


   Just one comment - this is fine if you can make sure all your - distributed - venvs are present on all the workers (which might be tricky if you want to update them), and you have to somehow link the "task" definition (which expects a certain venv with certain requirement versions) with the "deployment" (i.e. the worker definition). Any kind of "upgrade" to such an env might be tricky. The "local" installation pattern had the advantage that you always got the requirements in exactly the versions you declared in the task definition (via the requirements specification).
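   For reference, a minimal sketch of that "local" pattern with the Airflow 2.x import path - the DAG id, package name and version pin here are just placeholders, the point is only that the requirements travel with the task definition:

   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.operators.python import PythonVirtualenvOperator


   def callable_in_venv():
       # This runs inside a freshly created venv containing the requirements
       # pinned on the task below, so the versions always match the declaration.
       import colorama  # noqa: F401 - placeholder dependency
       print("running with exactly the versions declared on the task")


   with DAG(
       dag_id="virtualenv_local_pattern",  # placeholder DAG id
       start_date=datetime(2021, 1, 1),
       schedule_interval=None,
       catchup=False,
   ) as dag:
       PythonVirtualenvOperator(
           task_id="virtualenv_task",
           python_callable=callable_in_venv,
           requirements=["colorama==0.4.4"],  # placeholder pin - lives in the DAG, not on the worker
           system_site_packages=False,
       )
   ```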
   
   I think a better solution would be to add a caching mechanism to the task and modify the PythonVirtualenvOperator to use it. However, this might be tricky to get right when you have multiple tasks of the same type running on the same worker in a Celery deployment.
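   To make the idea concrete, here is a rough, hypothetical sketch of such a cache - none of this is existing Airflow API, and the cache location is an assumption. The venv directory is keyed by a hash of the pinned requirements; the comment in the middle marks exactly the spot where concurrent identical tasks on one Celery worker could race, which is the tricky part mentioned above (a real implementation would need a file lock there).

   ```python
   # Hypothetical sketch only - not part of Airflow. Cache venvs keyed by a hash
   # of the pinned requirements so repeated tasks on the same worker can reuse them.
   import hashlib
   import subprocess
   import sys
   from pathlib import Path
   from typing import List

   CACHE_ROOT = Path("/tmp/airflow-venv-cache")  # assumed cache location on the worker


   def get_or_create_cached_venv(requirements: List[str]) -> Path:
       # Key the cache entry by the sorted requirement pins.
       key = hashlib.sha256("\n".join(sorted(requirements)).encode()).hexdigest()[:16]
       venv_dir = CACHE_ROOT / key
       python_bin = venv_dir / "bin" / "python"
       if not python_bin.exists():
           # NOTE: several tasks with the same requirements could race here;
           # a proper lock around creation would be needed on a busy worker.
           venv_dir.parent.mkdir(parents=True, exist_ok=True)
           subprocess.run([sys.executable, "-m", "venv", str(venv_dir)], check=True)
           subprocess.run(
               [str(venv_dir / "bin" / "pip"), "install", *requirements], check=True
           )
       return venv_dir


   # Example: build the venv once, then reuse it for later tasks with the same pins.
   venv_path = get_or_create_cached_venv(["colorama==0.4.4"])  # placeholder pin
   print(f"cached venv at {venv_path}")
   ```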


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org