Posted to jira@kafka.apache.org by "Eric C Abis (JIRA)" <ji...@apache.org> on 2018/11/29 17:58:00 UTC

[jira] [Updated] (KAFKA-7688) Allow byte array class for Decimal Logical Types to fix Debezium Issues

     [ https://issues.apache.org/jira/browse/KAFKA-7688?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eric C Abis updated KAFKA-7688:
-------------------------------
    Description: 
Decimal Logical Type fields are failing in Kafka Connect sink tasks with this error:
{code:java}
Invalid Java object for schema type BYTES: class [B for field: "null"{code}
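For context, here is a minimal sketch that reproduces the failing check (the class name, field name, and values are just for illustration):
{code:java}
import java.math.BigDecimal;
import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class DecimalValidationRepro {
    public static void main(String[] args) {
        // Decimal.schema(scale) builds a BYTES schema tagged with the
        // Decimal logical name.
        Schema valueSchema = SchemaBuilder.struct()
                .field("amount", Decimal.schema(2))
                .build();

        Struct record = new Struct(valueSchema);

        // A BigDecimal passes ConnectSchema's validation...
        record.put("amount", new BigDecimal("12.34"));

        // ...but the raw unscaled two's-complement bytes do not; this throws
        // DataException: Invalid Java object for schema type BYTES:
        // class [B for field: "amount"
        record.put("amount", new BigDecimal("12.34").unscaledValue().toByteArray());
    }
}
{code}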
There is a related issue for this problem in the Confluent Schema Registry tracker: [https://github.com/confluentinc/schema-registry/issues/833]

I've created a fix for this issue, and I've tested and verified it in our CF4 cluster here at Shutterstock.

Ultimately the issue boils down to the fact that in Avro, Decimal Logical Types store default values as Base64-encoded byte arrays and record values as BigInteger byte arrays. I'd like to submit a PR that changes the SCHEMA_TYPE_CLASSES hash map in org.apache.kafka.connect.data.ConnectSchema to allow byte arrays for Decimal fields.
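
To make the shape of the change concrete, here is a rough sketch (a self-contained mock, not the actual patch; in the 1.1.x source the decimal entry sits in ConnectSchema's logical-type class map, which is populated alongside SCHEMA_TYPE_CLASSES):
{code:java}
import java.math.BigDecimal;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Self-contained mock of the class mapping in
// org.apache.kafka.connect.data.ConnectSchema.
public class DecimalMappingSketch {
    // Decimal.LOGICAL_NAME in the real code
    static final String DECIMAL_LOGICAL_NAME = "org.apache.kafka.connect.data.Decimal";

    static final Map<String, List<Class<?>>> LOGICAL_TYPE_CLASSES = new HashMap<>();

    static {
        // Today only BigDecimal passes validation for Decimal fields:
        //   LOGICAL_TYPE_CLASSES.put(DECIMAL_LOGICAL_NAME,
        //           Collections.singletonList(BigDecimal.class));

        // Proposed: also accept the unscaled-BigInteger byte array form.
        LOGICAL_TYPE_CLASSES.put(DECIMAL_LOGICAL_NAME,
                Arrays.asList(BigDecimal.class, byte[].class));
    }
}
{code}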

Separately, I have a similar change in [io.confluent.connect.avro.AvroData|https://github.com/TheGreatAbyss/schema-registry/pull/1/files#diff-ac149179f9760319ccc772695cb21364] that I will submit a PR for as well.
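
For reference, Connect's Decimal class already ships helpers that map between the two representations, so a converter-side workaround (distinct from the fix proposed above) would be to normalize incoming bytes to BigDecimal before validation. A quick sketch of the round trip:
{code:java}
import java.math.BigDecimal;
import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;

public class DecimalRoundTrip {
    public static void main(String[] args) {
        Schema schema = Decimal.schema(2);

        // fromLogical: BigDecimal -> unscaled two's-complement bytes
        byte[] wire = Decimal.fromLogical(schema, new BigDecimal("12.34"));

        // toLogical: bytes -> BigDecimal, re-applying the schema's scale
        BigDecimal logical = Decimal.toLogical(schema, wire);

        System.out.println(logical); // prints 12.34
    }
}
{code}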

I reached out to [users@kafka.apache.org|mailto:users@kafka.apache.org] to ask for GitHub permissions, but if there is somewhere else I should reach out, please let me know.

My GitHub user is TheGreatAbyss

Thank You!

Eric


> Allow byte array class for Decimal Logical Types to fix Debezium Issues
> -----------------------------------------------------------------------
>
>                 Key: KAFKA-7688
>                 URL: https://issues.apache.org/jira/browse/KAFKA-7688
>             Project: Kafka
>          Issue Type: Bug
>          Components: KafkaConnect
>    Affects Versions: 1.1.1
>            Reporter: Eric C Abis
>            Priority: Blocker
>             Fix For: 1.1.1



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)