Posted to users@kafka.apache.org by "M. Manna" <ma...@gmail.com> on 2017/06/25 18:23:42 UTC

[Implementation-suggestion] for Stream/Connect/Plain ProducerConsumer API

Hello,

This question is more a request for suggestions: I am already using the
plain API (Producer/Consumer) and am exploring whether the Streams or
Connect API could solve my problem.

I need to perform an ad hoc read from a different server, and this is not
event-driven. For example:

1) User Logs in

2) User wants to check the order history.

3) Orders for last 3 months are in the main database (NoSQL or SQL)

4) Orders older than 3 months are stored in a different database (NoSQL or
SQL).

5) When the user clicks the "Older than 3 months" link - this is the step
I am trying to solve.

Without using Connect/Streams, I am thinking of a Producer-Consumer
wrapper call with a reasonable poll period. In summary:

1) I publish a request message to the server where the data is.

2) I create a consumer group, subscribe to the same topic, and poll for
the data (with a reasonable period).

3) I process based on whether data was received or not.

The consumer will keep polling for a certain number of milliseconds, and
after that I will act based on whether any data arrived.
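To make the idea concrete, here is a minimal sketch of that wrapper logic. The class and method names are my own invention, and a BlockingQueue stands in for the real reply topic so the example is self-contained and runnable without a broker; with the actual clients, publishRequest would be a KafkaProducer.send() and the inner poll would be KafkaConsumer.poll(Duration.ofMillis(...)) on a consumer subscribed to the reply topic.

```java
import java.util.Optional;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class OrderHistoryPoller {
    // Stand-in for the reply topic; in the real version this would be a
    // KafkaConsumer subscribed to the topic the remote side produces to.
    private final BlockingQueue<String> responseTopic = new LinkedBlockingQueue<>();

    // Stand-in for KafkaProducer.send(new ProducerRecord<>(requestTopic, key, value)).
    public void publishRequest(String request) {
        // Here the remote side would consume the request, query the
        // archive database, and produce the result back; we fake it.
        responseTopic.offer("orders-for:" + request);
    }

    // Poll with a deadline: keep polling until data arrives or the
    // overall timeout expires, then report data / no data.
    public Optional<String> pollForReply(long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            // Equivalent of consumer.poll(Duration.ofMillis(200))
            String record = responseTopic.poll(200, TimeUnit.MILLISECONDS);
            if (record != null) {
                return Optional.of(record);
            }
        }
        return Optional.empty(); // no data within the poll period
    }

    public static void main(String[] args) throws InterruptedException {
        OrderHistoryPoller poller = new OrderHistoryPoller();
        poller.publishRequest("user-42");
        Optional<String> reply = poller.pollForReply(1000);
        System.out.println(reply.orElse("no data"));
    }
}
```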

I understand that Kafka Streams is more about "aggregation" of various data
and Kafka Connect is more about data sourcing and sinking. But can the
Streams or Connect API be used for my particular problem? I am not sure
whether either can help me.

Also, I acknowledge that the Kafka Connect API requires me to provide
database addresses (if reading from/writing to a SQL/NoSQL db). I was
hoping to avoid this, but I understand that good tools have good
requirements, and for good reasons :)
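For reference, a Connect-based approach to the archive database would look
roughly like the fragment below. This is only a sketch assuming the
Confluent JDBC source connector is installed; the connection URL, table,
and column names are placeholders I made up for illustration.

```properties
# Hypothetical source connector config for the "older than 3 months" DB
name=archived-orders-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:postgresql://archive-db:5432/orders
mode=timestamp
timestamp.column.name=order_date
table.whitelist=archived_orders
topic.prefix=archive-
```

Note that this streams table changes into a topic continuously, which is
why I suspect it does not naturally fit an on-demand, click-driven read.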

Kindest Regards,