Posted to users@kafka.apache.org by Wouter Bosland <wb...@gmail.com> on 2014/05/13 10:40:35 UTC

100 MB messages

Hello everyone,

Can Kafka be used for binary large objects of 100 MB ?

Or should I use a different solution to store these files like MongoDB and
maybe send the location of these files in MongoDB over Kafka?



Thanks in advance,

Wouter

Re: 100 MB messages

Posted by Jay Kreps <ja...@gmail.com>.
It can, but it will not perform very well. Kafka fully instantiates
messages in memory (basically as a byte[]), so if you send a 100 MB message
the server will do a 100 MB allocation to hold that data before writing it
to disk.
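
For reference, the broker and consumer also cap message size (the defaults at the time were around 1 MB), so large messages would be rejected outright unless limits like the following were raised. This is a hedged sketch, not a recommendation; the values are illustrative:

```properties
# Broker: maximum message size the broker will accept, and the maximum
# a replica fetcher will pull (must be at least as large).
message.max.bytes=110000000
replica.fetch.max.bytes=110000000
# Old high-level consumer: maximum message size a fetch will return.
fetch.message.max.bytes=110000000
```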

I think MongoDB does have blob support so passing a pointer via Kafka as
you describe may be a better solution.
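
A minimal sketch of that claim-check pattern, with hypothetical stand-ins: a dict plays the external blob store (e.g. MongoDB/GridFS) and a list plays the Kafka topic; real code would use an actual producer and consumer.

```python
# Claim-check pattern: store the large blob out of band and send only a
# small pointer message through the log.
import json
import uuid

blob_store = {}   # stands in for MongoDB / GridFS
topic = []        # stands in for a Kafka topic


def store_blob(data: bytes) -> str:
    """Write the blob to the external store and return its id."""
    blob_id = str(uuid.uuid4())
    blob_store[blob_id] = data
    return blob_id


def send_pointer(blob_id: str, size: int) -> None:
    """Publish a small pointer message instead of the 100 MB payload."""
    topic.append(json.dumps({"blob_id": blob_id, "size": size}))


def consume() -> bytes:
    """Read the pointer message and fetch the blob out of band."""
    msg = json.loads(topic.pop(0))
    return blob_store[msg["blob_id"]]


payload = b"x" * 1024  # pretend this is 100 MB
blob_id = store_blob(payload)
send_pointer(blob_id, len(payload))
assert consume() == payload
```

The message that actually transits Kafka stays a few bytes of JSON, so the broker never has to buffer the blob itself.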

-Jay



Re: 100 MB messages

Posted by Joe Stein <jo...@stealth.ly>.
Or store the files in HDFS and use Kafka for the file event, yup. Processing on the files can now be done in Hadoop without the MapReduce overhead using Apache Tez (or something that uses Tez, like Pig).


/*******************************************
 Joe Stein
 Founder, Principal Consultant
 Big Data Open Source Security LLC
 http://www.stealth.ly
 Twitter: @allthingshadoop
********************************************/

