Posted to users@kafka.apache.org by vishnu murali <vi...@gmail.com> on 2020/05/08 13:59:24 UTC

JDBC SINK SCHEMA

Hey Guys,

I am using Apache Kafka 2.5, not Confluent Platform.

I am trying to send data from a topic to a database using the JDBC sink connector.

We need to send that data with the appropriate schema as well.

I am not using the Confluent version of Kafka.

Can anyone explain how I can do this?

Re: JDBC SINK SCHEMA

Posted by Robin Moffatt <ro...@confluent.io>.
Schema Registry is available as part of the Confluent Platform download
(https://www.confluent.io/download/); install it per
https://docs.confluent.io/current/schema-registry/installation/index.html.
The difference is that you run just the Schema Registry part of the stack
and leave the other components as they are. When you configure Schema
Registry, you point it at your existing Apache Kafka cluster (
https://docs.confluent.io/current/schema-registry/installation/config.html#schemaregistry-config
)
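
For reference, a minimal schema-registry.properties for this setup might
look like the following. This is just a sketch, not a production config;
the broker address is a placeholder for your own cluster:

    # Where the Schema Registry REST API listens.
    listeners=http://0.0.0.0:8081

    # Point Schema Registry at your existing Apache Kafka cluster
    # (placeholder address -- substitute your own brokers).
    kafkastore.bootstrap.servers=PLAINTEXT://your-kafka-broker:9092

    # Internal topic Schema Registry uses to store schemas.
    kafkastore.topic=_schemas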


-- 

Robin Moffatt | Senior Developer Advocate | robin@confluent.io | @rmoff



Re: JDBC SINK SCHEMA

Posted by vishnu murali <vi...@gmail.com>.
Hi Robin

Is it possible to integrate Apache Kafka with the Confluent Schema
Registry like you said?

I don't know how to do this; could you give me a reference?


Re: JDBC SINK SCHEMA

Posted by Robin Moffatt <ro...@confluent.io>.
You can keep using Apache Kafka exactly as you are now, and just deploy
Schema Registry alongside it.


-- 

Robin Moffatt | Senior Developer Advocate | robin@confluent.io | @rmoff



Re: JDBC SINK SCHEMA

Posted by Chris Toomey <ct...@gmail.com>.
You have to either 1) use one of the Confluent serializers
<https://docs.confluent.io/current/schema-registry/serdes-develop/index.html#>
when you publish to the topic, so that the schema (or a reference to it) is
included, or 2) write and use a custom converter
<https://kafka.apache.org/25/javadoc/org/apache/kafka/connect/storage/Converter.html>
that knows about the data schema, takes the Kafka record value, and
converts it into a Kafka Connect record (by implementing the toConnectData
converter method), which is what the sink connectors are driven from. Rough
sketches of both options follow below.
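
As a sketch of option 1, a producer using the Confluent Avro serializer
could look like this. The broker address, topic name, and registry URL are
placeholders, and it assumes the kafka-avro-serializer dependency is on the
classpath:

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // KafkaAvroSerializer registers the schema with Schema Registry and
        // embeds a reference to it in every record it serializes.
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");  // placeholder

        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"int\"},"
                + "{\"name\":\"name\",\"type\":\"string\"}]}");

        GenericRecord user = new GenericData.Record(schema);
        user.put("id", 1);
        user.put("name", "vishnu");

        try (KafkaProducer<String, GenericRecord> producer =
                new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "1", user));
        }
    }
}

On the Connect side, the JDBC sink would then be configured with
value.converter=io.confluent.connect.avro.AvroConverter and
value.converter.schema.registry.url pointing at the registry, so the
connector can recover the schema when it reads the records back.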

See https://docs.confluent.io/current/connect/concepts.html#converters
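
For option 2, here is a bare-bones sketch of a custom converter. The class,
package-free name, and wire format are all hypothetical for illustration:
it assumes each record value is a UTF-8 "id,name" pair and applies a fixed,
hard-coded schema so the sink can map fields to table columns:

import java.nio.charset.StandardCharsets;
import java.util.Map;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.errors.DataException;
import org.apache.kafka.connect.storage.Converter;

public class CsvUserConverter implements Converter {

    // Fixed schema this converter attaches to every record.
    private static final Schema USER_SCHEMA = SchemaBuilder.struct()
            .name("User")
            .field("id", Schema.INT32_SCHEMA)
            .field("name", Schema.STRING_SCHEMA)
            .build();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // No configuration needed for this sketch.
    }

    @Override
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
        // Only needed for source connectors; sink connectors drive toConnectData.
        throw new DataException("fromConnectData not implemented in this sketch");
    }

    @Override
    public SchemaAndValue toConnectData(String topic, byte[] value) {
        // Parse the assumed "id,name" payload into a Struct with the schema.
        String[] parts = new String(value, StandardCharsets.UTF_8).split(",", 2);
        Struct struct = new Struct(USER_SCHEMA)
                .put("id", Integer.parseInt(parts[0]))
                .put("name", parts[1]);
        return new SchemaAndValue(USER_SCHEMA, struct);
    }
}

You would then set value.converter to this class's fully qualified name in
the Connect worker or connector configuration.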

Chris


