Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/05/11 10:43:58 UTC

[GitHub] [airflow] dinigo commented on issue #8804: Add ability for all operators to interact with storages of AWS/GCP/AZURE

dinigo commented on issue #8804:
URL: https://github.com/apache/airflow/issues/8804#issuecomment-626625086


   You are suggesting to implement something like what we already have with [GenericTransfer](https://github.com/apache/airflow/blob/master/airflow/operators/generic_transfer.py). I've thought of this too. We would need a common API for getting and sending files, preferably a `file-like object`, so transfers are streamed and don't take up local storage.
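   
   A minimal sketch of what that common API could look like, assuming a hypothetical `FileTransferHook` interface with `open_read`/`open_write` methods (those names and the `stream_transfer` helper are my own illustration, nothing that exists in Airflow today):
   ```
   import shutil
   from abc import ABC, abstractmethod
   from typing import IO


   class FileTransferHook(ABC):
       """Hypothetical common interface that every storage hook would implement."""

       @abstractmethod
       def open_read(self, path: str) -> IO[bytes]:
           """Return a readable file-like object for the given remote path."""

       @abstractmethod
       def open_write(self, path: str) -> IO[bytes]:
           """Return a writable file-like object for the given remote path."""


   def stream_transfer(source_hook: FileTransferHook, source_file: str,
                       dest_hook: FileTransferHook, dest_file: str,
                       chunk_size: int = 16 * 1024 * 1024) -> None:
       """Copy between two storages chunk by chunk, never buffering the whole file locally."""
       with source_hook.open_read(source_file) as src, dest_hook.open_write(dest_file) as dst:
           shutil.copyfileobj(src, dst, length=chunk_size)
   ```
   With such an interface in place, a generic transfer operator only needs to know which two hooks to wire together.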
   
   I would rather have an `XtoYOperator` where I can configure the hooks for each side, source and destination, such as:
   ```
   gcs_to_s3 = GenericFileTransfer(
     task_id='gcs_to_s3',  # every Airflow operator needs a task_id
     source_hook=GCSHook('gcs-conn-id'),
     dest_hook=S3Hook('s3-conn-id'),
     source_file='gs://my-gcs-bucket/my-file.csv',
     dest_file='my-s3-bucket/my-file.csv'
   )
   ```


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org