Posted to user@avro.apache.org by Cary Young <ca...@gmail.com> on 2020/01/09 00:10:59 UTC
How best to handle JSON decoding for nullable fields
Hi,
Avro 1.9.1, using Java 8
I happened across the earlier thread ("More idiomatic JSON encoding
for unions") today while searching for a solution to exactly that
issue. In my case the failure is:
org.apache.avro.AvroTypeException: Expected start-union. Got VALUE_NUMBER_INT
I'm trying to read JSON off a Kafka topic and write it out as Avro.
I have an Avro schema that matches the structure of my data (it was
generated from my application's Java model objects), but the data
itself (serialized Java model objects) is not Avro-structured JSON.
So far this works, except for nullable fields that happen to be
non-null in practice, because of how Avro's JSON encoding represents
union types.
I am only using union types to handle nullability; all of my unions
are of null and the actual type of the field.
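To make the mismatch concrete, here is a minimal illustration (the
field name "count" and its ["null", "int"] schema are assumed for the
example, not taken from my actual data) contrasting the plain JSON my
serializer produces with the union-wrapped form Avro's JsonDecoder
expects:

```java
public class UnionEncodingDemo {
    // Plain JSON as a typical Java JSON serializer emits it:
    static final String PLAIN = "{\"count\": 5}";

    // What Avro's JSON encoding expects for a ["null", "int"] union:
    // the non-null value is wrapped in an object keyed by the branch
    // name, which is why decoding PLAIN fails with
    // "Expected start-union. Got VALUE_NUMBER_INT".
    static final String AVRO_JSON = "{\"count\": {\"int\": 5}}";

    // A null value is the one case where the two forms coincide:
    static final String NULL_CASE = "{\"count\": null}";

    public static void main(String[] args) {
        System.out.println("plain:     " + PLAIN);
        System.out.println("avro-json: " + AVRO_JSON);
        System.out.println("null case: " + NULL_CASE);
    }
}
```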
Is there a recommended way to proceed? Is there an easy way to
convert my data to Avro-compatible JSON, given that I have the
schema?
Or would it be better to investigate using a modified Decoder, such as
the one Zoltan Farkas has worked on in this fork:
https://github.com/zolyfarkas/avro ?
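For what it's worth, one shape a pre-processing step might take, as a
sketch: walk the JSON tree (modeled here as plain Maps) and wrap each
non-null value of a nullable field in the branch-name object before
handing it to Avro. Everything here is hypothetical, not an Avro API;
in real code the branchNames lookup would be derived from the Avro
Schema rather than hard-coded:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class AvroJsonShim {
    // Hypothetical helper: rewrite a flat JSON object so that
    // nullable-union fields use Avro's union-wrapped JSON form.
    // branchNames maps field name -> non-null branch name ("int",
    // "string", ...) for fields whose schema is ["null", <type>].
    static Map<String, Object> wrapNullableFields(
            Map<String, Object> record, Map<String, String> branchNames) {
        Map<String, Object> out = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : record.entrySet()) {
            String branch = branchNames.get(e.getKey());
            Object value = e.getValue();
            if (branch != null && value != null) {
                // 5 -> {"int": 5}; nulls pass through unchanged.
                out.put(e.getKey(), Collections.singletonMap(branch, value));
            } else {
                out.put(e.getKey(), value);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> plain = new LinkedHashMap<>();
        plain.put("id", 5);
        plain.put("nickname", null);
        Map<String, String> branches = new LinkedHashMap<>();
        branches.put("id", "int");
        branches.put("nickname", "string");
        System.out.println(wrapNullableFields(plain, branches));
        // prints {id={int=5}, nickname=null}
    }
}
```

A real version would need to recurse into nested records, arrays, and
maps, which is roughly the complexity a modified Decoder would absorb
instead.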
Any pointers on this would be super helpful.
Thank you!
-Cary