Posted to users@kafka.apache.org by BYEONG-GI KIM <bg...@bluedigm.com> on 2017/03/09 08:59:58 UTC

Kafka Streams question

Hello.

I'm a newcomer who has started learning one of the new Kafka features,
Kafka Streams.

As far as I know, the simplest usage of Kafka Streams is to do something
like parsing: forwarding incoming data from one topic to another topic,
with a few changes along the way.

So... Here is what I'd want to do:

1. Produce simple messages, like 1, 2, 3, 4, 5, ..., from a producer
2. Let a Kafka Streams application consume the messages and change them
to [1], [2], [3], ...
3. Consume the changed messages with a consumer

I've read the documentation,
https://kafka.apache.org/0102/javadoc/index.html?org/apache/kafka/connect,
but it's unclear to me how to implement it.

In particular, I could not understand the
line, builder.stream("my-input-topic").mapValues(value ->
value.length().toString()).to("my-output-topic"). Could someone explain it
and show how to implement what I've mentioned?
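
For reference, here is my rough understanding of how that line would sit
inside a complete program, pieced together from the javadoc (a sketch only;
I have not run it, and the application id, broker address, and topic names
are placeholders). Note that value.length().toString() from the docs does
not compile for a Java String value, so I have written
Integer.toString(value.length()) instead:

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class LengthExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder configuration; adjust to your cluster.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "length-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass().getName());

        KStreamBuilder builder = new KStreamBuilder();
        // Read each record from the input topic, replace its value with the
        // string form of that value's length, and write the result to the
        // output topic.
        builder.<String, String>stream("my-input-topic")
               .mapValues(value -> Integer.toString(value.length()))
               .to("my-output-topic");

        KafkaStreams streams = new KafkaStreams(builder, props);
        streams.start();
    }
}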

Thanks in advance.

Best regards

KIM

Re: Kafka Streams question

Posted by BYEONG-GI KIM <bg...@bluedigm.com>.
Thank you very much for the reply.

I'll try to implement it.

Best regards

KIM

2017-03-14 17:07 GMT+09:00 Michael Noll <mi...@confluent.io>:

> Yes, of course.  You can also re-use any existing JSON and/or YAML library
> to help you with that.
>
> Also, in general, an application that uses the Kafka Streams API/library is
> a normal, standard Java application -- you can of course also use any other
> Java/Scala/... library for the application's processing needs.
>
> -Michael
>
>
>
> On Tue, Mar 14, 2017 at 9:00 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:
>
> > Dear Michael Noll,
> >
> > I have a question: is it possible to convert JSON format to YAML format
> > using Kafka Streams?
> >
> > Best Regards
> >
> > KIM
> >
> > 2017-03-10 11:36 GMT+09:00 BYEONG-GI KIM <bg...@bluedigm.com>:
> >
> > > Thank you very much for the information!
> > >
> > >
> > > 2017-03-09 19:40 GMT+09:00 Michael Noll <mi...@confluent.io>:
> > >
> > >> There's actually a demo application that demonstrates the simplest
> > >> use case for Kafka's Streams API: to read data from an input topic
> > >> and then write that data as-is to an output topic.
> > >>
> > >> https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/test/java/io/confluent/examples/streams/PassThroughIntegrationTest.java
> > >>
> > >> The code above is for Confluent 3.2 and Apache Kafka 0.10.2.
> > >>
> > >> The demo shows how to (1) write a message from a producer to the input
> > >> topic, (2) use a Kafka Streams app to process that data and write the
> > >> results back to Kafka, and (3) validate the results with a consumer
> > >> that reads from the output topic.
> > >>
> > >> The GitHub project above includes many more such examples, see
> > >> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams.
> > >> Again, this is for Confluent 3.2 and Kafka 0.10.2.  There is a version
> > >> compatibility matrix that explains which branches you need to use for
> > >> older versions of Confluent/Kafka as well as for the very latest
> > >> development version (aka Kafka's trunk):
> > >> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams#version-compatibility
> > >>
> > >> Hope this helps!
> > >> Michael
> > >>
> > >>
> > >>
> > >>
> > >> On Thu, Mar 9, 2017 at 9:59 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:
> > >>
> > >> > Hello.
> > >> >
> > >> > I'm a newcomer who has started learning one of the new Kafka
> > >> > features, Kafka Streams.
> > >> >
> > >> > As far as I know, the simplest usage of Kafka Streams is to do
> > >> > something like parsing: forwarding incoming data from one topic to
> > >> > another topic, with a few changes along the way.
> > >> >
> > >> > So... Here is what I'd want to do:
> > >> >
> > >> > 1. Produce simple messages, like 1, 2, 3, 4, 5, ..., from a producer
> > >> > 2. Let a Kafka Streams application consume the messages and change
> > >> > them to [1], [2], [3], ...
> > >> > 3. Consume the changed messages with a consumer
> > >> >
> > >> > I've read the documentation,
> > >> > https://kafka.apache.org/0102/javadoc/index.html?org/apache/kafka/connect,
> > >> > but it's unclear to me how to implement it.
> > >> >
> > >> > In particular, I could not understand the
> > >> > line, builder.stream("my-input-topic").mapValues(value ->
> > >> > value.length().toString()).to("my-output-topic"). Could someone
> > >> > explain it and show how to implement what I've mentioned?
> > >> >
> > >> > Thanks in advance.
> > >> >
> > >> > Best regards
> > >> >
> > >> > KIM
> > >> >
> > >>
> > >
> > >
> >
>
>
>
> --
> *Michael G. Noll*
> Product Manager | Confluent
> +1 650 453 5860 | @miguno <https://twitter.com/miguno>
> Follow us: Twitter <https://twitter.com/ConfluentInc> | Blog
> <http://www.confluent.io/blog>
>

Re: Kafka Streams question

Posted by Michael Noll <mi...@confluent.io>.
Yes, of course.  You can also re-use any existing JSON and/or YAML library
to help you with that.

Also, in general, an application that uses the Kafka Streams API/library is
a normal, standard Java application -- you can of course also use any other
Java/Scala/... library for the application's processing needs.
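
For example, here is a minimal sketch of such a conversion done inside
mapValues. I'm using Jackson's jackson-dataformat-yaml module purely for
illustration (that choice, the topic names, and the config values are
assumptions; any JSON/YAML library would work the same way):

import java.util.Properties;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class JsonToYamlExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "json-to-yaml");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass().getName());

        final ObjectMapper jsonMapper = new ObjectMapper();  // reads JSON
        final YAMLMapper yamlMapper = new YAMLMapper();      // writes YAML

        KStreamBuilder builder = new KStreamBuilder();
        builder.<String, String>stream("json-input-topic")
               .mapValues(value -> {
                   try {
                       // Parse the JSON value into a tree, then re-serialize
                       // the same tree as YAML.
                       JsonNode tree = jsonMapper.readTree(value);
                       return yamlMapper.writeValueAsString(tree);
                   } catch (Exception e) {
                       // A real application would handle malformed records
                       // more gracefully than this.
                       throw new RuntimeException(e);
                   }
               })
               .to("yaml-output-topic");

        new KafkaStreams(builder, props).start();
    }
}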

-Michael



On Tue, Mar 14, 2017 at 9:00 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:

> Dear Michael Noll,
>
> I have a question: is it possible to convert JSON format to YAML format
> using Kafka Streams?
>
> Best Regards
>
> KIM
>
> 2017-03-10 11:36 GMT+09:00 BYEONG-GI KIM <bg...@bluedigm.com>:
>
> > Thank you very much for the information!
> >
> >
> > 2017-03-09 19:40 GMT+09:00 Michael Noll <mi...@confluent.io>:
> >
> >> There's actually a demo application that demonstrates the simplest
> >> use case for Kafka's Streams API: to read data from an input topic
> >> and then write that data as-is to an output topic.
> >>
> >> https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/test/java/io/confluent/examples/streams/PassThroughIntegrationTest.java
> >>
> >> The code above is for Confluent 3.2 and Apache Kafka 0.10.2.
> >>
> >> The demo shows how to (1) write a message from a producer to the input
> >> topic, (2) use a Kafka Streams app to process that data and write the
> >> results back to Kafka, and (3) validate the results with a consumer
> >> that reads from the output topic.
> >>
> >> The GitHub project above includes many more such examples, see
> >> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams.
> >> Again, this is for Confluent 3.2 and Kafka 0.10.2.  There is a version
> >> compatibility matrix that explains which branches you need to use for
> >> older versions of Confluent/Kafka as well as for the very latest
> >> development version (aka Kafka's trunk):
> >> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams#version-compatibility
> >>
> >> Hope this helps!
> >> Michael
> >>
> >>
> >>
> >>
> >> On Thu, Mar 9, 2017 at 9:59 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:
> >>
> >> > Hello.
> >> >
> >> > I'm a newcomer who has started learning one of the new Kafka
> >> > features, Kafka Streams.
> >> >
> >> > As far as I know, the simplest usage of Kafka Streams is to do
> >> > something like parsing: forwarding incoming data from one topic to
> >> > another topic, with a few changes along the way.
> >> >
> >> > So... Here is what I'd want to do:
> >> >
> >> > 1. Produce simple messages, like 1, 2, 3, 4, 5, ..., from a producer
> >> > 2. Let a Kafka Streams application consume the messages and change
> >> > them to [1], [2], [3], ...
> >> > 3. Consume the changed messages with a consumer
> >> >
> >> > I've read the documentation,
> >> > https://kafka.apache.org/0102/javadoc/index.html?org/apache/kafka/connect,
> >> > but it's unclear to me how to implement it.
> >> >
> >> > In particular, I could not understand the
> >> > line, builder.stream("my-input-topic").mapValues(value ->
> >> > value.length().toString()).to("my-output-topic"). Could someone
> >> > explain it and show how to implement what I've mentioned?
> >> >
> >> > Thanks in advance.
> >> >
> >> > Best regards
> >> >
> >> > KIM
> >> >
> >>
> >
> >
>



-- 
*Michael G. Noll*
Product Manager | Confluent
+1 650 453 5860 | @miguno <https://twitter.com/miguno>
Follow us: Twitter <https://twitter.com/ConfluentInc> | Blog
<http://www.confluent.io/blog>

Re: Kafka Streams question

Posted by BYEONG-GI KIM <bg...@bluedigm.com>.
Dear Michael Noll,

I have a question: is it possible to convert JSON format to YAML format
using Kafka Streams?

Best Regards

KIM

2017-03-10 11:36 GMT+09:00 BYEONG-GI KIM <bg...@bluedigm.com>:

> Thank you very much for the information!
>
>
> 2017-03-09 19:40 GMT+09:00 Michael Noll <mi...@confluent.io>:
>
>> There's actually a demo application that demonstrates the simplest use
>> case
>> for Kafka's Streams API:  to read data from an input topic and then write
>> that data as-is to an output topic.
>>
>> https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/test/java/io/confluent/examples/streams/PassThroughIntegrationTest.java
>>
>> The code above is for Confluent 3.2 and Apache Kafka 0.10.2.
>>
>> The demo shows how to (1) write a message from a producer to the input
>> topic, (2) use a Kafka Streams app to process that data and write the
>> results back to Kafka, and (3) validate the results with a consumer that
>> reads from the output topic.
>>
>> The GitHub project above includes many more such examples, see
>> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams.
>> Again, this is for Confluent 3.2 and Kafka 0.10.2.  There is a version
>> compatibility matrix that explains which branches you need to use for
>> older versions of Confluent/Kafka as well as for the very latest
>> development version (aka Kafka's trunk):
>> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams#version-compatibility
>>
>> Hope this helps!
>> Michael
>>
>>
>>
>>
>> On Thu, Mar 9, 2017 at 9:59 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:
>>
>> > Hello.
>> >
>> > I'm a newcomer who has started learning one of the new Kafka
>> > features, Kafka Streams.
>> >
>> > As far as I know, the simplest usage of Kafka Streams is to do
>> > something like parsing: forwarding incoming data from one topic to
>> > another topic, with a few changes along the way.
>> >
>> > So... Here is what I'd want to do:
>> >
>> > 1. Produce simple messages, like 1, 2, 3, 4, 5, ..., from a producer
>> > 2. Let a Kafka Streams application consume the messages and change
>> > them to [1], [2], [3], ...
>> > 3. Consume the changed messages with a consumer
>> >
>> > I've read the documentation,
>> > https://kafka.apache.org/0102/javadoc/index.html?org/apache/kafka/connect,
>> > but it's unclear to me how to implement it.
>> >
>> > In particular, I could not understand the
>> > line, builder.stream("my-input-topic").mapValues(value ->
>> > value.length().toString()).to("my-output-topic"). Could someone
>> > explain it and show how to implement what I've mentioned?
>> >
>> > Thanks in advance.
>> >
>> > Best regards
>> >
>> > KIM
>> >
>>
>
>

Re: Kafka Streams question

Posted by BYEONG-GI KIM <bg...@bluedigm.com>.
Thank you very much for the information!


2017-03-09 19:40 GMT+09:00 Michael Noll <mi...@confluent.io>:

> There's actually a demo application that demonstrates the simplest use case
> for Kafka's Streams API:  to read data from an input topic and then write
> that data as-is to an output topic.
>
> https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/test/java/io/confluent/examples/streams/PassThroughIntegrationTest.java
>
> The code above is for Confluent 3.2 and Apache Kafka 0.10.2.
>
> The demo shows how to (1) write a message from a producer to the input
> topic, (2) use a Kafka Streams app to process that data and write the
> results back to Kafka, and (3) validate the results with a consumer that
> reads from the output topic.
>
> The GitHub project above includes many more such examples, see
> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams.  Again,
> this is for Confluent 3.2 and Kafka 0.10.2.  There is a version
> compatibility matrix that explains which branches you need to use for older
> versions of Confluent/Kafka as well as for the very latest development
> version (aka Kafka's trunk):
> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams#version-compatibility
>
> Hope this helps!
> Michael
>
>
>
>
> On Thu, Mar 9, 2017 at 9:59 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:
>
> > Hello.
> >
> > I'm a newcomer who has started learning one of the new Kafka
> > features, Kafka Streams.
> >
> > As far as I know, the simplest usage of Kafka Streams is to do
> > something like parsing: forwarding incoming data from one topic to
> > another topic, with a few changes along the way.
> >
> > So... Here is what I'd want to do:
> >
> > 1. Produce simple messages, like 1, 2, 3, 4, 5, ..., from a producer
> > 2. Let a Kafka Streams application consume the messages and change
> > them to [1], [2], [3], ...
> > 3. Consume the changed messages with a consumer
> >
> > I've read the documentation,
> > https://kafka.apache.org/0102/javadoc/index.html?org/apache/kafka/connect,
> > but it's unclear to me how to implement it.
> >
> > In particular, I could not understand the
> > line, builder.stream("my-input-topic").mapValues(value ->
> > value.length().toString()).to("my-output-topic"). Could someone
> > explain it and show how to implement what I've mentioned?
> >
> > Thanks in advance.
> >
> > Best regards
> >
> > KIM
> >
>

Re: Kafka Streams question

Posted by Michael Noll <mi...@confluent.io>.
There's actually a demo application that demonstrates the simplest use case
for Kafka's Streams API:  to read data from an input topic and then write
that data as-is to an output topic.

https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/test/java/io/confluent/examples/streams/PassThroughIntegrationTest.java

The code above is for Confluent 3.2 and Apache Kafka 0.10.2.

The demo shows how to (1) write a message from a producer to the input
topic, (2) use a Kafka Streams app to process that data and write the
results back to Kafka, and (3) validate the results with a consumer that
reads from the output topic.

The GitHub project above includes many more such examples, see
https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams.  Again,
this is for Confluent 3.2 and Kafka 0.10.2.  There is a version
compatibility matrix that explains which branches you need to use for older
versions of Confluent/Kafka as well as for the very latest development
version (aka Kafka's trunk):
https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams#version-compatibility
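
As for the specific transformation you described (turning 1, 2, 3, ... into
[1], [2], [3], ...), the heart of the Streams application is a one-line
mapValues. A minimal sketch, assuming String keys and values and placeholder
topic names (the configuration and startup boilerplate is the same as in
the pass-through demo above):

KStreamBuilder builder = new KStreamBuilder();
// Wrap each incoming value in brackets: "1" -> "[1]", "2" -> "[2]", ...
builder.<String, String>stream("numbers-input")
       .mapValues(value -> "[" + value + "]")
       .to("numbers-output");

To try it end to end, you can feed and read messages with the console
clients that ship with Kafka (topic names are again placeholders):

bin/kafka-console-producer.sh --broker-list localhost:9092 \
    --topic numbers-input
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic numbers-output --from-beginning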

Hope this helps!
Michael




On Thu, Mar 9, 2017 at 9:59 AM, BYEONG-GI KIM <bg...@bluedigm.com> wrote:

> Hello.
>
> I'm a newcomer who has started learning one of the new Kafka features,
> Kafka Streams.
>
> As far as I know, the simplest usage of Kafka Streams is to do something
> like parsing: forwarding incoming data from one topic to another topic,
> with a few changes along the way.
>
> So... Here is what I'd want to do:
>
> 1. Produce simple messages, like 1, 2, 3, 4, 5, ..., from a producer
> 2. Let a Kafka Streams application consume the messages and change them
> to [1], [2], [3], ...
> 3. Consume the changed messages with a consumer
>
> I've read the documentation,
> https://kafka.apache.org/0102/javadoc/index.html?org/apache/kafka/connect,
> but it's unclear to me how to implement it.
>
> In particular, I could not understand the
> line, builder.stream("my-input-topic").mapValues(value ->
> value.length().toString()).to("my-output-topic"). Could someone explain it
> and show how to implement what I've mentioned?
>
> Thanks in advance.
>
> Best regards
>
> KIM
>