Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/08/06 09:05:10 UTC

[GitHub] [airflow] mik-laj commented on pull request #14415: SnowflakeToS3Operator

mik-laj commented on pull request #14415:
URL: https://github.com/apache/airflow/pull/14415#issuecomment-894118493


   > And I'd say it's not really a transfer. It's just executing SQL. It's purely an interaction with Snowflake, and Snowflake is handling the "transferring" internally. E.g. we're not pulling the rows out through the connector into Airflow, writing them out, and uploading to S3 -- that would be more of a transfer.
   
   I agree. This is an operator for a specific operation in Snowflake, not a typical transfer operator. For this to be a transfer operator, all credentials should be managed by Airflow to ensure a unified experience; in this case, the AWS credentials are managed by Snowflake. We had a similar case in the Google provider: to move data from one GCS bucket to another, you can use the https://github.com/apache/airflow/blob/1bd3a5c68c88cf3840073d6276460a108f864187/airflow/providers/google/cloud/transfers/gcs_to_gcs.py#L29 operator or use a Google-managed service -- https://github.com/apache/airflow/blob/1bd3a5c68c88cf3840073d6276460a108f864187/airflow/providers/google/cloud/operators/cloud_storage_transfer_service.py#L166
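
   To make the distinction concrete, here is a minimal sketch of what such an operator boils down to (the class and parameter names are illustrative, not the actual code from this PR): it renders a single COPY INTO statement and hands it to the Snowflake connection, and Snowflake itself writes the files to S3 -- no rows ever pass through the Airflow worker.

   ```python
   # Illustrative sketch only -- not the implementation from PR #14415.
   from airflow.models import BaseOperator
   from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


   class SnowflakeToS3Sketch(BaseOperator):
       def __init__(self, *, table, s3_path,
                    snowflake_conn_id="snowflake_default", **kwargs):
           super().__init__(**kwargs)
           self.table = table
           self.s3_path = s3_path
           self.snowflake_conn_id = snowflake_conn_id

       def execute(self, context):
           # The operator's whole job is one SQL round trip; the unload
           # to S3 happens entirely inside Snowflake. A real statement
           # would also need a STORAGE_INTEGRATION or CREDENTIALS clause,
           # which Snowflake (not Airflow) evaluates.
           sql = f"COPY INTO '{self.s3_path}' FROM {self.table}"
           SnowflakeHook(snowflake_conn_id=self.snowflake_conn_id).run(sql)
   ```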
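
   For comparison, the two Google options above look roughly like this in a DAG (arguments trimmed to the essentials; bucket names are made up, and the transfer-service body would in practice also need a project id and schedule):

   ```python
   # Option 1: Airflow moves the data itself and owns both sets of credentials.
   from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

   copy_in_airflow = GCSToGCSOperator(
       task_id="copy_in_airflow",
       source_bucket="example-src-bucket",
       source_object="data/*.csv",
       destination_bucket="example-dst-bucket",
   )

   # Option 2: Airflow only submits a job; the Storage Transfer Service
   # moves the data with credentials configured on the Google side.
   from airflow.providers.google.cloud.operators.cloud_storage_transfer_service import (
       CloudDataTransferServiceCreateJobOperator,
   )

   copy_in_google = CloudDataTransferServiceCreateJobOperator(
       task_id="copy_in_google",
       body={
           "description": "example transfer",
           "status": "ENABLED",
           "transferSpec": {
               "gcsDataSource": {"bucketName": "example-src-bucket"},
               "gcsDataSink": {"bucketName": "example-dst-bucket"},
           },
       },
   )
   ```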

