Posted to users@kafka.apache.org by abdelali elmantagui <ab...@gmail.com> on 2022/06/13 15:50:41 UTC

Kafka consumer filtering

Hi All,

I started learning Kafka a couple of weeks ago, and my goal is to optimize an existing architecture that uses Kafka in its components.
The problem is that many microservices produce messages/events to a Kafka topic, and on the other side other microservices consume these messages/events. Each consuming microservice has to consume all the messages and then filter out the ones it is interested in, which creates a problem of huge memory usage because of the huge number of objects created in memory after deserialization of these messages.

I'm asking for any concept or solution that can help in this situation.

Kind Regards,
Abdelali

+-----------------+      +---------------+      +------------------+
|  microservices  |----->|  Kafka topic  |----->|  microservices   |
+-----------------+      +---------------+      +------------------+


Re: Kafka consumer filtering

Posted by Jamie <ja...@aol.co.uk.INVALID>.
Hi abdelali,
If you can't get your producers to send the different types of events to different topics (or you don't want to), you could use Kafka Streams to filter the data in the topic into new topics that are subsets of the data.

I have also seen Apache Spark used for similar filtering.
Thanks,
Jamie 
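
The Streams approach amounts to one small service that reads the shared topic once and routes each event to a per-type subtopic (in Kafka Streams, roughly a `filter()` or branch on a `KStream` followed by `to()`); each microservice then subscribes only to the subtopic it cares about. A minimal language-neutral sketch of the routing logic in Python, where the topic names and the `type` field are assumptions for illustration:

```python
# Sketch of the fan-out filter service: read every event once, forward it
# to a per-type subtopic, and let each microservice subscribe only to its
# own subtopic. Topic names and the "type" field are assumptions.
ROUTES = {
    "order": "events-orders",
    "payment": "events-payments",
}

def route(event):
    """Return the target subtopic for an event, or None to drop it."""
    return ROUTES.get(event.get("type"))

def fan_out(events, produce):
    """Forward each event to its subtopic via produce(topic, event)."""
    for event in events:
        topic = route(event)
        if topic is not None:
            produce(topic, event)
```

With this in place, a consumer only deserializes events from its own subtopic instead of scanning the full stream; the cost is one extra always-on service and the duplicated storage of the subset topics.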





Re: Kafka consumer filtering

Posted by Anders Engström <ep...@gmail.com>.
Hi -

Depending on the rules for how you filter/drop incoming messages (and on the mechanics of the library you use to consume them), it might be possible to filter out messages based on message headers. That way you would not need to deserialize the message key/value before deciding whether a message should be dropped.

/Anders
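
Concretely, this means the drop decision reads only the record headers, and the value is deserialized only for matches. A minimal Python sketch, assuming producers attach an `event-type` header (the header name and type values are assumptions; the list of `(key, bytes)` pairs mirrors the header shape that Kafka client libraries expose on each record):

```python
import json

# Assumed header name and wanted types -- producers would need to set
# this header; neither is specified in the original thread.
EVENT_TYPE_HEADER = "event-type"
WANTED_TYPES = {b"order-created", b"order-cancelled"}

def should_process(headers):
    """Decide from headers alone, before any value deserialization.

    `headers` is a list of (key, bytes) pairs, as Kafka clients
    expose record headers."""
    for key, value in headers or []:
        if key == EVENT_TYPE_HEADER:
            return value in WANTED_TYPES
    return False  # no type header: drop rather than pay the deserialization cost

def handle(record_headers, raw_value):
    """Deserialize and return the event only if the headers match."""
    if not should_process(record_headers):
        return None  # dropped cheaply: json.loads never runs
    return json.loads(raw_value)
```

The raw value bytes are still fetched over the network, but the expensive step the original post describes (building deserialized objects for every message) is skipped for non-matching events.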
