Posted to users@kafka.apache.org by vishnu murali <vi...@gmail.com> on 2020/07/02 12:54:05 UTC

Problem in reading From JDBC SOURCE

Hi Guys,

I am having a problem reading from MySQL with the JDBC source connector:
the decimal columns arrive as strings like the ones below.
Does anyone know the reason and how to solve it?

"a": "Aote",

  "b": "AmrU",

  "c": "AceM",

  "d": "Aote",


Instead of

"a": 0.002,

  "b": 0.465,

  "c": 0.545,

  "d": 0.100


Here is my configuration:


{
    "name": "sample",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://localhost:3306/sample",
        "connection.user": "xxxx",
        "connection.password": "xxx",
        "topic.prefix": "dample-",
        "poll.interval.ms": 3600000,
        "table.whitelist": "sample",
        "schemas.enable": "false",
        "mode": "bulk",
        "value.converter.schemas.enable": "false",
        "numeric.mapping": "best_fit",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "transforms": "createKey,extractInt",
        "transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
        "transforms.createKey.fields": "ID",
        "transforms.extractInt.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
        "transforms.extractInt.field": "ID"
    }
}

Re: Problem in reading From JDBC SOURCE

Posted by vishnu murali <vi...@gmail.com>.
Hi,

Below is the script I used to create the table in MySQL:


CREATE TABLE `sample` (
  `id` varchar(45) NOT NULL,
  `a` decimal(10,3) DEFAULT NULL,
  `b` decimal(10,3) DEFAULT NULL,
  `c` decimal(10,3) DEFAULT NULL,
  `d` decimal(10,3) DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;



Table data:

id: 1
a:  0.002
b:  2.250
c:  0.789
d:  0.558
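For reference, a plausible explanation for the string values (this is an assumption, not something confirmed in the thread): Kafka Connect models DECIMAL columns with its Decimal logical type, and with `value.converter.schemas.enable=false` the JsonConverter emits such values as the base64-encoded big-endian two's-complement bytes of the unscaled integer. A minimal Python sketch of decoding such a string, assuming scale 3 to match the decimal(10,3) columns (the sample value "Ag==" is illustrative, not taken from the actual topic):

```python
import base64
from decimal import Decimal

def decode_connect_decimal(b64_value: str, scale: int) -> Decimal:
    """Decode a Connect Decimal that JsonConverter emitted as base64 text.

    The decoded bytes are the big-endian two's-complement unscaled value;
    the scale comes from the field schema (here: 3, matching decimal(10,3)).
    """
    raw = base64.b64decode(b64_value)
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# "Ag==" is base64 for the single byte 0x02, i.e. unscaled value 2:
print(decode_connect_decimal("Ag==", 3))  # -> 0.002
```

If this is the cause, the base64 strings are not garbage; they are the exact decimal values in the Connect Decimal wire encoding. Newer Connect versions (Kafka 2.4+) can also make the JsonConverter emit decimals as plain numbers via `value.converter.decimal.format=NUMERIC`.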

On Thu, Jul 2, 2020, 19:50 Ricardo Ferreira <ri...@riferrei.com> wrote:

> Vishnu,
>
> I think it is hard to troubleshoot things without the proper context. In
> your case, could you please share an example of the rows contained in
> the table `sample`, as well as its DDL?
>
> -- Ricardo

Re: Problem in reading From JDBC SOURCE

Posted by Ricardo Ferreira <ri...@riferrei.com>.
Vishnu,

I think it is hard to troubleshoot things without the proper context. In
your case, could you please share an example of the rows contained in
the table `sample`, as well as its DDL?

-- Ricardo

On 7/2/20 9:29 AM, vishnu murali wrote:
> I went through that documentation.
>
> It describes that DECIMAL is not supported in MySQL.
>
> There is also no example for MySQL; is there any other sample that uses MySQL?

Re: Problem in reading From JDBC SOURCE

Posted by vishnu murali <vi...@gmail.com>.
I went through that documentation.

It describes that DECIMAL is not supported in MySQL.

There is also no example for MySQL; is there any other sample that uses MySQL?



On Thu, Jul 2, 2020, 18:49 Robin Moffatt <ro...@confluent.io> wrote:

> Check out this article where it covers decimal handling:
>
> https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/#bytes-decimals-numerics
>
>
> --
>
> Robin Moffatt | Senior Developer Advocate | robin@confluent.io | @rmoff

Re: Problem in reading From JDBC SOURCE

Posted by Robin Moffatt <ro...@confluent.io>.
Check out this article, which covers decimal handling:
https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/#bytes-decimals-numerics


-- 

Robin Moffatt | Senior Developer Advocate | robin@confluent.io | @rmoff
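A workaround often used in this situation (an assumption on my part; the thread itself does not confirm it) is to bypass the Decimal logical type altogether by reading with a custom query that casts the DECIMAL columns to DOUBLE. A sketch of what that could look like for the connector config above; the query text is illustrative:

```json
{
  "name": "sample",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/sample",
    "connection.user": "xxxx",
    "connection.password": "xxx",
    "topic.prefix": "dample-",
    "mode": "bulk",
    "query": "SELECT `id`, CAST(`a` AS DOUBLE) AS `a`, CAST(`b` AS DOUBLE) AS `b`, CAST(`c` AS DOUBLE) AS `c`, CAST(`d` AS DOUBLE) AS `d` FROM `sample`",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```

Note that `table.whitelist` must be dropped when `query` is set, as the two options are mutually exclusive, and that `CAST(... AS DOUBLE)` requires MySQL 8.0.17 or later (on older versions, multiplying by 1.0, e.g. `a * 1.0 AS a`, has a similar effect). Casting to DOUBLE trades exact decimal precision for readable JSON numbers.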

