Posted to users@kafka.apache.org by Johan Lundahl <jo...@gmail.com> on 2014/03/13 19:03:02 UTC

Broker plugins

Hi,

I have a use case for which it would be useful to have pluggable processing
functions in the broker.

We have some data containing sensitive information which is legally
acceptable to transmit over the internal network to the Kafka brokers and
to keep in volatile memory, but not to flush to disk unconcealed/unencrypted.
The application servers' resources are too scarce and critical to handle
this processing, so we must do it elsewhere.

To cope with this, I'm looking for a way to plug a "concealer" somewhere
near KafkaApis.handleProducerRequest, before anything has been flushed to
disk, but I imagine that other people might come up with other areas where
plugging in custom functions would be interesting as well. My case might be
relatively specific, but has the general idea of user plugins in different
areas of the broker ever been discussed?

Re: Broker plugins

Posted by Johan Lundahl <jo...@gmail.com>.
Thanks for the response,

Indeed, the encryption/concealment should ideally be done on the producer
side, but it's just not feasible in some of our applications, so a middle
layer would be needed. So far our thoughts have been around using Flume
interceptors, but that means introducing another "moving piece"...


On Thu, Mar 13, 2014 at 9:34 PM, Benjamin Black <b...@b3k.us> wrote:

> Or introduce an app layer between the producers and kafka that does the
> processing without changes/load to the producers.
>
>
> On Thu, Mar 13, 2014 at 1:18 PM, Neha Narkhede <neha.narkhede@gmail.com
> >wrote:
>
> > In general, the preference has been to avoid having user code run on the
> > brokers since that just opens a can of worms where the broker logic get's
> > complicated trying to deal with errors that the user code can throw. The
> > suggestion is to push any user specific processing to the client side. In
> > this case, you can imagine a producer that encrypts sensitive data before
> > sending it to a topic on the broker.
> >
> > Thanks,
> > Neha
> >
> >
> > On Thu, Mar 13, 2014 at 11:03 AM, Johan Lundahl <johan.lundahl@gmail.com
> > >wrote:
> >
> > > Hi,
> > >
> > > I have a use case for which it would be useful with pluggable
> processing
> > > functions in the broker.
> > >
> > > We have some data containing sensitive information which is legally ok
> to
> > > transmit over the internal network to the Kafka brokers and keep in
> > > volatile memory but not to flush to disk unconcealed/unencrypted. The
> > > application server resources are too scarce and critical to handle this
> > > processing so we must do it elsewhere.
> > >
> > > To cope with this, I'm looking for a way to plug a "concealer"
> somewhere
> > > near KafkaApis.handleProducerRequest before anything has been flushed
> to
> > > disk but I imagine that other people might come up with ideas where
> > > plugging in custom functions would be interesting as well. My case
> might
> > be
> > > relatively specific but has the general idea of user plugins in
> different
> > > areas of the broker ever been discussed?
> > >
> >
>


Re: Broker plugins

Posted by Benjamin Black <b...@b3k.us>.
Or introduce an app layer between the producers and Kafka that does the
processing without changes to, or load on, the producers.


On Thu, Mar 13, 2014 at 1:18 PM, Neha Narkhede <ne...@gmail.com> wrote:

> In general, the preference has been to avoid having user code run on the
> brokers since that just opens a can of worms where the broker logic get's
> complicated trying to deal with errors that the user code can throw. The
> suggestion is to push any user specific processing to the client side. In
> this case, you can imagine a producer that encrypts sensitive data before
> sending it to a topic on the broker.
>
> Thanks,
> Neha
>
>
> On Thu, Mar 13, 2014 at 11:03 AM, Johan Lundahl <johan.lundahl@gmail.com
> >wrote:
>
> > Hi,
> >
> > I have a use case for which it would be useful with pluggable processing
> > functions in the broker.
> >
> > We have some data containing sensitive information which is legally ok to
> > transmit over the internal network to the Kafka brokers and keep in
> > volatile memory but not to flush to disk unconcealed/unencrypted. The
> > application server resources are too scarce and critical to handle this
> > processing so we must do it elsewhere.
> >
> > To cope with this, I'm looking for a way to plug a "concealer" somewhere
> > near KafkaApis.handleProducerRequest before anything has been flushed to
> > disk but I imagine that other people might come up with ideas where
> > plugging in custom functions would be interesting as well. My case might
> be
> > relatively specific but has the general idea of user plugins in different
> > areas of the broker ever been discussed?
> >
>
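Benjamin's middle layer can be sketched as a small relay that conceals each
record in flight. This is only a conceptual sketch: the queues stand in for
the ingest side and the Kafka producer, and the digit-masking rule is just a
placeholder for real concealment; none of the names below are part of any
Kafka API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of an app layer between the producers and Kafka: records arrive
// on one queue, are concealed, and are forwarded on another. In a real
// service the second queue would be replaced by a KafkaProducer, and the
// relay loop would run on its own threads.
public class ConcealingRelay {

    // Placeholder concealment rule: redact every digit.
    static String conceal(String record) {
        return record.replaceAll("\\d", "*");
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> fromProducers = new ArrayBlockingQueue<>(16);
        BlockingQueue<String> toKafka = new ArrayBlockingQueue<>(16);

        // A producer hands the relay a record containing sensitive data.
        fromProducers.put("card=4111111111111111 user=alice");

        // Relay step: take, conceal, forward.
        toKafka.put(conceal(fromProducers.take()));

        System.out.println(toKafka.take());
        // prints: card=**************** user=alice
    }
}
```

The point of the sketch is that the producers stay untouched: only the relay
knows about concealment, and the broker never sees plaintext.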

Re: Broker plugins

Posted by Neha Narkhede <ne...@gmail.com>.
In general, the preference has been to avoid having user code run on the
brokers, since that just opens a can of worms where the broker logic gets
complicated trying to deal with errors that the user code can throw. The
suggestion is to push any user-specific processing to the client side. In
this case, you can imagine a producer that encrypts sensitive data before
sending it to a topic on the broker.

Thanks,
Neha


On Thu, Mar 13, 2014 at 11:03 AM, Johan Lundahl <jo...@gmail.com> wrote:

> Hi,
>
> I have a use case for which it would be useful with pluggable processing
> functions in the broker.
>
> We have some data containing sensitive information which is legally ok to
> transmit over the internal network to the Kafka brokers and keep in
> volatile memory but not to flush to disk unconcealed/unencrypted. The
> application server resources are too scarce and critical to handle this
> processing so we must do it elsewhere.
>
> To cope with this, I'm looking for a way to plug a "concealer" somewhere
> near KafkaApis.handleProducerRequest before anything has been flushed to
> disk but I imagine that other people might come up with ideas where
> plugging in custom functions would be interesting as well. My case might be
> relatively specific but has the general idea of user plugins in different
> areas of the broker ever been discussed?
>
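The producer-side approach Neha describes can be sketched with plain JDK
crypto: encrypt each record value with AES-GCM before handing it to the
producer, so the broker only ever sees (and flushes) ciphertext. Wiring this
into a Kafka producer or Serializer is omitted here, and the class and method
names are illustrative, not any Kafka API.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.ByteBuffer;
import java.security.SecureRandom;

// Sketch of producer-side concealment: seal() is what a producer would
// apply to the record value before sending; open() is what a trusted
// consumer would apply after receiving.
public class EnvelopeEncryptor {
    private static final int IV_LEN = 12;   // 96-bit nonce, recommended for GCM
    private static final int TAG_LEN = 128; // authentication tag length in bits

    private final SecretKey key;
    private final SecureRandom random = new SecureRandom();

    public EnvelopeEncryptor(SecretKey key) { this.key = key; }

    // Prepend the random IV to the ciphertext so decryption is self-contained.
    public byte[] seal(byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_LEN];
        random.nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_LEN, iv));
        byte[] ct = c.doFinal(plaintext);
        return ByteBuffer.allocate(IV_LEN + ct.length).put(iv).put(ct).array();
    }

    public byte[] open(byte[] sealed) throws Exception {
        ByteBuffer buf = ByteBuffer.wrap(sealed);
        byte[] iv = new byte[IV_LEN];
        buf.get(iv);
        byte[] ct = new byte[buf.remaining()];
        buf.get(ct);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(TAG_LEN, iv));
        return c.doFinal(ct);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        EnvelopeEncryptor enc = new EnvelopeEncryptor(kg.generateKey());

        byte[] sealed = enc.seal("sensitive payload".getBytes("UTF-8"));
        // sealed is what would be sent to the topic; only a key holder
        // can recover the plaintext.
        System.out.println(new String(enc.open(sealed), "UTF-8"));
        // prints: sensitive payload
    }
}
```

This keeps the broker entirely out of the trust boundary, at the cost of
doing the crypto on the (already loaded) application servers, which is
exactly the constraint the original post raises.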
