Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2023/01/05 18:03:00 UTC

[GitHub] [airflow] Taragolis commented on issue #28743: s3_hook.copy_object() doesn't allow copying files > 5Gb

Taragolis commented on issue #28743:
URL: https://github.com/apache/airflow/issues/28743#issuecomment-1372552399

   @set92 some info in addition to what @eladkal wrote
   
   * The `copy_object` method of the S3 client is a thin wrapper around the AWS `CopyObject` API call, so it inherits that API's 5 GB limit on a single copy (the same limit is documented in boto3 and the AWS API reference: https://docs.aws.amazon.com/AmazonS3/latest/API/API_CopyObject.html)
   * The `copy` method of the S3 client is one of the managed transfer helpers built on [s3transfer](https://github.com/boto/s3transfer); it works differently (it can split the work into a multipart copy), so it is not a drop-in replacement for `copy_object` (see the sketch after this list)
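
   To make the difference concrete, here is a minimal boto3-only sketch (the bucket and key names are made up for illustration): `copy_object` is a single `CopyObject` API call and fails for sources over 5 GB, while `copy` is the s3transfer-managed helper that can transparently switch to a multipart copy:

   ```python
   import boto3

   s3 = boto3.client("s3")
   # Hypothetical bucket/key names, not taken from the issue
   source = {"Bucket": "src-bucket", "Key": "big-file.parquet"}

   # Plain API wrapper: rejects sources larger than 5 GB
   s3.copy_object(CopySource=source, Bucket="dst-bucket", Key="big-file.parquet")

   # Managed helper (s3transfer): uses multipart copy for large objects
   s3.copy(CopySource=source, Bucket="dst-bucket", Key="big-file.parquet")
   ```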
   
   Based on your exception, you are using PythonOperator (the TaskFlow `@task` decorator), so you can use all the capabilities of the AWS hooks, including (but not limited to) creating a boto3 S3 client from an Airflow connection. As a result, you can call any client method described in the boto3 documentation, for example:
   
   ```python
   from airflow.decorators import task
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook


   @task
   def awesome_s3_copy(**kwargs):
       # Build a boto3 S3 client from the Airflow connection
       hook = S3Hook(aws_conn_id="awesome-conn-id", region_name="us-east-1")  # other hook kwargs omitted
       s3_client = hook.conn
       # Managed (s3transfer) copy: handles objects larger than 5 GB
       s3_client.copy(...)  # fill in CopySource, Bucket, Key; see the example below
   ```
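
   For completeness, a sketch of what that `copy(...)` call could look like, continuing the snippet above (bucket/key names and the tuning values are illustrative, not from the issue); the optional `Config` argument accepts a `boto3.s3.transfer.TransferConfig` if you want to tune the multipart behaviour:

   ```python
   from boto3.s3.transfer import TransferConfig

   # Illustrative tuning; the defaults already handle objects > 5 GB
   config = TransferConfig(multipart_threshold=64 * 1024 * 1024, max_concurrency=8)

   s3_client.copy(
       CopySource={"Bucket": "src-bucket", "Key": "big-file.parquet"},
       Bucket="dst-bucket",
       Key="big-file.parquet",
       Config=config,
   )
   ```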

