Posted to users@kafka.apache.org by vishnu murali <vi...@gmail.com> on 2020/05/07 05:47:36 UTC

JDBC Sink Connector

Hey Guys,

I am working on the JDBC Sink Connector to take data from a Kafka topic to MySQL.

I have two questions.

I am using plain Apache Kafka 2.5, not the Confluent version.

1) For inserting data, we need to include the schema along with every record. How can I avoid this? I want to send only the data.

2) Sometimes I need to update an existing record instead of inserting a new one. How can I achieve this?
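
To show what I mean: with the JsonConverter and schemas enabled, every record I produce has to carry its own schema alongside the payload, roughly like this (the field names are just an example):

    {
      "schema": {
        "type": "struct",
        "name": "customer",
        "optional": false,
        "fields": [
          { "field": "id",   "type": "int32",  "optional": false },
          { "field": "name", "type": "string", "optional": true }
        ]
      },
      "payload": { "id": 1, "name": "abc" }
    }

I would like to send just the "payload" part.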

Re: JDBC Sink Connector

Posted by Robin Moffatt <ro...@confluent.io>.
Schema Registry and its serde libraries are part of Confluent Platform, licensed under the Confluent Community Licence (https://www.confluent.io/confluent-community-license-faq/).




-- 

Robin Moffatt | Senior Developer Advocate | robin@confluent.io | @rmoff



Re: JDBC Sink Connector

Posted by vishnu murali <vi...@gmail.com>.
Thank you so much, Robin.

It helped me a lot to define the sink connector with upsert mode; that was very helpful.

For the schema-related question I still don't have a proper understanding.

Because I am using plain Apache Kafka, I don't know whether Schema Registry, ksql, and the Avro serializers are present in Apache Kafka (2.5).

I suppose Schema Registry and the ksql services come with the Confluent version of Kafka.


Re: JDBC Sink Connector

Posted by Robin Moffatt <ro...@confluent.io>.
If you don't want to send the schema each time, then serialise your data using Avro (or Protobuf), and the schema is held in the Schema Registry. See https://www.youtube.com/watch?v=b-3qN_tlYR4&t=981s
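
In the connector (or worker) config that means pointing the value converter at Avro and the Schema Registry, roughly like this (the registry URL is just a placeholder):

    value.converter=io.confluent.connect.avro.AvroConverter
    value.converter.schema.registry.url=http://localhost:8081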

If you want to update a record instead of inserting a new one, you can use upsert mode. See https://www.youtube.com/watch?v=b-3qN_tlYR4&t=627s
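
In the JDBC sink config that is roughly the following, assuming here that the record key carries the primary key and the column is called id:

    insert.mode=upsert
    pk.mode=record_key
    pk.fields=id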


-- 

Robin Moffatt | Senior Developer Advocate | robin@confluent.io | @rmoff



Re: JDBC Sink Connector

Posted by Liam Clarke-Hutchinson <li...@adscale.co.nz>.
Hi Vishnu,

I wrote an implementation of org.apache.kafka.connect.storage.Converter, included it on the Kafka Connect worker classpath, and set it as the value.converter property, to provide the schema that the JDBC sink needs.

That approach may work for 1).
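
A minimal sketch of the shape of such a converter (not my actual implementation; the schema fields and the JSON parsing are placeholders you'd replace with your own):

    import java.nio.charset.StandardCharsets;
    import java.util.Map;

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaAndValue;
    import org.apache.kafka.connect.data.SchemaBuilder;
    import org.apache.kafka.connect.data.Struct;
    import org.apache.kafka.connect.storage.Converter;

    // Attaches a fixed, hard-coded Connect schema to plain JSON records so the
    // JDBC sink still gets the schema it needs.
    public class FixedSchemaConverter implements Converter {

        private static final Schema VALUE_SCHEMA = SchemaBuilder.struct()
                .name("customer")
                .field("id", Schema.INT32_SCHEMA)
                .field("name", Schema.OPTIONAL_STRING_SCHEMA)
                .build();

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // Nothing to configure in this sketch.
        }

        @Override
        public byte[] fromConnectData(String topic, Schema schema, Object value) {
            // Only needed when writing data back out; a sink-side converter can throw.
            throw new UnsupportedOperationException("serialisation not implemented");
        }

        @Override
        public SchemaAndValue toConnectData(String topic, byte[] value) {
            if (value == null) {
                return SchemaAndValue.NULL;
            }
            String json = new String(value, StandardCharsets.UTF_8);
            // Parse the JSON with your preferred library (Jackson, Gson, ...) and
            // build a Struct that matches the hard-coded schema.
            Struct struct = new Struct(VALUE_SCHEMA)
                    .put("id", parseId(json))      // placeholder helper
                    .put("name", parseName(json)); // placeholder helper
            return new SchemaAndValue(VALUE_SCHEMA, struct);
        }

        private Integer parseId(String json) { return 0; }   // placeholder
        private String parseName(String json) { return ""; } // placeholder
    }

Build that into a jar, put it on the worker classpath, and set value.converter to its fully qualified class name.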

For 2), Kafka Connect can use upsert if your DB supports it, based on the PK you configure. But I've found in the past that it's not possible to reference values already in the DB: if key X already had count = 5 in the DB and the JDBC sink receives a record with key X and count = 10, it will overwrite instead of accumulating, so after the update the count in the DB will be 10, not 15.
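
For MySQL the upsert the connector generates is roughly of this shape (table and column names illustrative), which is why the value is replaced rather than accumulated:

    INSERT INTO totals (id, cnt) VALUES (?, ?)
      ON DUPLICATE KEY UPDATE cnt = VALUES(cnt);
    -- accumulating would need cnt = cnt + VALUES(cnt), which the connector doesn't generate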


Kind regards,

Liam Clarke-Hutchinson