Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2023/01/05 13:21:23 UTC

[GitHub] [airflow] eladkal commented on issue #28743: s3_hook.copy_object() doesn't allow copying files > 5Gb

eladkal commented on issue #28743:
URL: https://github.com/apache/airflow/issues/28743#issuecomment-1372209403

   You linked to `copy` rather than to `copy_object`.
   boto3 explains the limitation when you check the right function :)
   https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.copy_object
   
   ```
   Note
   
   You can store individual objects of up to 5 TB in Amazon S3.
   You create a copy of your object up to 5 GB in size in a single atomic action using this API.
   However, to copy an object greater than 5 GB, you must use the multipart upload Upload Part - Copy (UploadPartCopy) API.
   For more information, see [Copy Object Using the REST Multipart Upload API](https://docs.aws.amazon.com/AmazonS3/latest/dev/CopyingObjctsUsingRESTMPUapi.html).
   ```
   
   
   So this is not a bug; it is simply how boto3's `copy_object` works.
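
   For context, boto3's managed `S3.Client.copy` method works around this limit by switching to multipart copy (`UploadPartCopy`) automatically for large objects. If you drove `upload_part_copy` yourself, each part would need an inclusive `CopySourceRange` header. A minimal sketch of computing those ranges (the helper name `copy_source_ranges` is illustrative, not a boto3 or Airflow API):

   ```python
   def copy_source_ranges(object_size, part_size=5 * 1024**3):
       """Return (part_number, CopySourceRange) pairs for UploadPartCopy.

       S3 byte ranges are inclusive, e.g. "bytes=0-5368709119".
       Each part may be at most 5 GiB, which is also the default here.
       """
       ranges = []
       part_number = 1
       offset = 0
       while offset < object_size:
           # Last byte of this part, clamped to the end of the object.
           last = min(offset + part_size, object_size) - 1
           ranges.append((part_number, f"bytes={offset}-{last}"))
           offset = last + 1
           part_number += 1
       return ranges

   # Example: a 12 GiB object needs three UploadPartCopy calls.
   GiB = 1024**3
   for part_number, source_range in copy_source_ranges(12 * GiB):
       print(part_number, source_range)
   ```

   In practice you would wrap these ranges between `create_multipart_upload` and `complete_multipart_upload` calls, or simply use the managed `copy()` method, which handles the part splitting for you.
   
   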


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org