Posted to users@kafka.apache.org by Sa Li <sa...@gmail.com> on 2014/12/04 21:58:13 UTC

kafka consumer to write into DB

Hello, all

I have never developed a Kafka consumer before. I want to build an advanced
Kafka consumer in Java that consumes data and continuously writes it into a
PostgreSQL DB. I am thinking of keeping a map in memory, accumulating a
predefined number of messages there, and then writing them to the DB in a
batch. Is there an API or sample code that would let me do this?


thanks


-- 

Alec Li
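
For reference, a minimal sketch of the batching pattern asked about above. It
uses the modern Java KafkaConsumer API (which postdates this 2014 thread)
together with JDBC batch inserts; the broker address, topic name, table, and
credentials below are placeholder assumptions, not values from the thread.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PostgresBatchConsumer {
    private static final int BATCH_SIZE = 500; // flush threshold; tune for your load

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "pg-writer");
        props.put("enable.auto.commit", "false"); // ack manually, once per batch
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/mydb", "user", "pass")) { // placeholders
            db.setAutoCommit(false); // commit each batch atomically
            consumer.subscribe(List.of("events")); // hypothetical topic name
            PreparedStatement insert =
                    db.prepareStatement("INSERT INTO events (payload) VALUES (?)");
            int buffered = 0;
            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> r : records) {
                    insert.setString(1, r.value());
                    insert.addBatch();
                    buffered++;
                }
                if (buffered >= BATCH_SIZE) {
                    insert.executeBatch(); // one round trip for the whole batch
                    db.commit();           // rows are now durable in Postgres
                    consumer.commitSync(); // only now ack the offsets to Kafka
                    buffered = 0;
                }
            }
        }
    }
}

Disabling auto-commit and calling commitSync() only after the DB transaction
commits means a crash mid-batch replays the uncommitted messages rather than
losing them, i.e. at-least-once delivery.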

Re: kafka consumer to write into DB

Posted by Krishna Raj <re...@gmail.com>.
Hi Sa,

I created a bulk consumer that consumes, processes, and posts messages to Elasticsearch.

There is a config option for the message consumption batch size, and you can
modify the code to do whatever you need with the consumed messages.

https://github.com/reachkrishnaraj/kafka-elasticsearch-standalone-consumer

Thanks,
Kr

Re: kafka consumer to write into DB

Posted by Neha Narkhede <ne...@confluent.io>.
Not that I know of.

-- 
Thanks,
Neha

Re: kafka consumer to write into DB

Posted by Sa Li <sa...@gmail.com>.
Thanks, Neha. Is there a Java version of a batch consumer?

thanks

-- 

Alec Li

Re: kafka consumer to write into DB

Posted by Scott Clasen <sc...@heroku.com>.
If you are using Scala/Akka, this will handle the batching and acks for you.

https://github.com/sclasen/akka-kafka#akkabatchconsumer

Re: kafka consumer to write into DB

Posted by Sa Li <sa...@gmail.com>.
Thank you very much for the reply, Neha. I have a question about the
consumer: I consume the data from Kafka and write it into the DB, and of
course I want to build a hash map in memory, load the data into it, and
bulk-copy to the DB instead of inserting into the DB row by row. Does that
mean I need to ack each message as it is loaded into memory?

thanks

-- 

Alec Li
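
On the ack question above: with manual offset management you do not ack
message by message. You buffer, flush the batch to the DB, and only then
commit the offsets of what was actually persisted. A sketch of that commit
step, assuming the modern Java KafkaConsumer API and a batch list accumulated
as in the earlier example (names here are illustrative, not from the thread):

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public final class BatchAck {
    // Call after the batch insert has committed in Postgres: acknowledge
    // exactly the records that were written, one commit per batch.
    public static void ackBatch(KafkaConsumer<String, String> consumer,
                                List<ConsumerRecord<String, String>> batch) {
        Map<TopicPartition, OffsetAndMetadata> offsets = new HashMap<>();
        for (ConsumerRecord<String, String> r : batch) {
            // Commit offset + 1: the position of the NEXT record to read on restart.
            offsets.put(new TopicPartition(r.topic(), r.partition()),
                        new OffsetAndMetadata(r.offset() + 1));
        }
        consumer.commitSync(offsets); // one ack per flushed batch, not per message
    }
}

If the process dies between the DB commit and commitSync, the batch is
replayed on restart, so the insert should be idempotent (for example, an
upsert keyed on a message ID) if duplicates matter.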

Re: kafka consumer to write into DB

Posted by Neha Narkhede <ne...@confluent.io>.
This is specific to Pentaho but may be useful -
https://github.com/RuckusWirelessIL/pentaho-kafka-consumer

-- 
Thanks,
Neha