Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2023/01/03 14:45:08 UTC

[GitHub] [airflow] mharrisb1 commented on issue #24730: Google CloudRun job operator

mharrisb1 commented on issue #24730:
URL: https://github.com/apache/airflow/issues/24730#issuecomment-1369849278

   @VinceLegendre all great thoughts.
   
   I think most of my plugin becomes obsolete once someone gets https://github.com/googleapis/python-run to build correctly alongside the rest of the Google Cloud providers code (https://pypi.org/project/apache-airflow-providers-google/). The only blocker is resolving the protobuf version conflict between the two (see https://github.com/googleapis/python-run/issues/70). The Google team will not solve this on their side, so someone will need to solve it in the Google providers code.
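   
   (For anyone picking this up, a quick way to see the problem locally, assuming nothing beyond the two published packages, is to try resolving them into one environment and importing both; the exact protobuf bounds are in the linked issue, I'm not restating them here.)
   
   ```python
   # Illustrative sanity check only: if the protobuf pins conflict, pip will
   # refuse to install both packages into the same environment, or one of
   # these imports will fail at runtime.
   import google.protobuf
   import google.cloud.run_v2  # from google-cloud-run (googleapis/python-run)
   from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
   
   print("protobuf in this environment:", google.protobuf.__version__)
   ```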
   
   The official python-run library is definitely preferred over my custom client. Then, yes, the goal would be to take the same approach as the other Google Cloud providers and, exactly as you pointed out, extend `GoogleBaseHook` for auth, etc., along the lines of the sketch below.
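   
   Just to make the shape concrete, something like this minimal sketch (assuming the official google-cloud-run client, `google.cloud.run_v2`; the class and method names here are placeholders, not a final provider API):
   
   ```python
   from google.cloud import run_v2
   from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
   
   
   class CloudRunJobHook(GoogleBaseHook):
       """Hypothetical hook: GoogleBaseHook handles the Airflow connection and auth."""
   
       def get_conn(self) -> run_v2.JobsClient:
           # Reuse the credentials resolved by GoogleBaseHook (ADC, key file,
           # or impersonation, depending on the connection settings).
           return run_v2.JobsClient(credentials=self.get_credentials())
   
       def run_job(self, project_id: str, region: str, job_name: str):
           name = f"projects/{project_id}/locations/{region}/jobs/{job_name}"
           operation = self.get_conn().run_job(name=name)
           return operation.result()  # block until the execution finishes
   ```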
   
   Once the protobuf issue is resolved, it should be a straightforward path to implement operators for all CRUD and execution options (a rough operator sketch follows below). I think sensors, custom links, etc. are nice to have and could be introduced in subsequent versions if someone doesn't want to implement it all at once. I would, however, consider all the CRUD operators part of the completion requirements, since that allows full control over the resource lifecycle.
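   
   Equally rough, an execution operator on top of that hypothetical hook could look something like this (all names and parameters are placeholders):
   
   ```python
   from airflow.models import BaseOperator
   
   
   class CloudRunExecuteJobOperator(BaseOperator):
       """Hypothetical operator: trigger an existing Cloud Run job and wait for it."""
   
       template_fields = ("project_id", "region", "job_name")
   
       def __init__(self, *, project_id, region, job_name,
                    gcp_conn_id="google_cloud_default", **kwargs):
           super().__init__(**kwargs)
           self.project_id = project_id
           self.region = region
           self.job_name = job_name
           self.gcp_conn_id = gcp_conn_id
   
       def execute(self, context):
           # CloudRunJobHook is the hypothetical hook sketched above.
           hook = CloudRunJobHook(gcp_conn_id=self.gcp_conn_id)
           execution = hook.run_job(self.project_id, self.region, self.job_name)
           self.log.info("Cloud Run execution finished: %s", execution.name)
           return execution.name
   ```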


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org