Posted to user@thrift.apache.org by Fabio Ronca <hi...@gmail.com> on 2023/05/08 14:53:29 UTC

Transfer large data

Hi all

I have a large amount of request data that needs to be submitted to a remote server API (something like machine learning feature data).
Should I use Thrift or gRPC (with protobuf behind it) to implement this for better performance, security, etc.?

Best regards

Re: Transfer large data

Posted by Christian <ch...@gmail.com>.
Thrift and gRPC excel at building APIs where requests and responses are parsed
as protobuf (or Thrift) structured objects that fit in memory.

For transferring large blobs, you are better off using HTTP, FTP, BitTorrent, etc.
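
One common pattern along those lines is to keep the Thrift call itself small and
move the bulk data out of band over HTTP (or similar). A minimal IDL sketch of
that idea; the service, struct, and field names here are hypothetical, not
something from this thread:

  // features.thrift -- the RPC carries only a reference to the blob,
  // which the client has already uploaded over HTTP.
  struct FeatureSubmission {
    1: required string datasetId   // caller-chosen identifier for the data set
    2: required string blobUrl     // location of the uploaded feature blob
    3: optional string sha256      // optional integrity check for the blob
  }

  service FeatureService {
    // Small, structured control-plane call; the heavy payload never
    // passes through the Thrift transport.
    bool submitFeatures(1: FeatureSubmission submission)
  }

The same split works with gRPC/protobuf: upload the blob over HTTP, then pass
its location and checksum in the RPC message.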
