Posted to dev@avro.apache.org by "donald cestnik (JIRA)" <ji...@apache.org> on 2019/02/28 20:13:00 UTC
[jira] [Commented] (AVRO-2137) avro JsonDecoding additional field in array type
[ https://issues.apache.org/jira/browse/AVRO-2137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16780874#comment-16780874 ]
donald cestnik commented on AVRO-2137:
--------------------------------------
I have also encountered this issue; it is a showstopper for our use of Avro until it is resolved.
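As a possible interim workaround until the decoder is fixed, one can strip unknown keys from the JSON tree before handing it to the JsonDecoder, mirroring the discard behaviour this issue asks for. The sketch below (in Java, matching the affected component) is illustrative only: it is not an Avro API, and plain `Map`/`List` collections stand in for a real JSON tree and for the Avro `Schema`; the class name `StripUnknown` and the simplified `allowed` schema model are hypothetical.

```java
import java.util.*;

// Sketch: recursively drop keys that the (simplified) schema does not
// know, descending into nested objects and arrays. A real implementation
// would walk a Jackson JsonNode alongside the Avro Schema instead.
public class StripUnknown {

    // `allowed` maps each known field name to the allowed-field map of
    // its children; a null value marks a leaf field such as a string.
    @SuppressWarnings("unchecked")
    public static Object strip(Object json, Map<String, Object> allowed) {
        if (json instanceof Map) {
            Map<String, Object> out = new LinkedHashMap<>();
            for (Map.Entry<String, Object> e : ((Map<String, Object>) json).entrySet()) {
                if (!allowed.containsKey(e.getKey())) {
                    continue; // unknown field: discard it
                }
                Object childSchema = allowed.get(e.getKey());
                Map<String, Object> child = childSchema instanceof Map
                        ? (Map<String, Object>) childSchema
                        : Collections.emptyMap();
                out.put(e.getKey(), strip(e.getValue(), child));
            }
            return out;
        }
        if (json instanceof List) {
            List<Object> out = new ArrayList<>();
            for (Object item : (List<Object>) json) {
                out.add(strip(item, allowed)); // items share the array's element schema
            }
            return out;
        }
        return json; // primitives pass through untouched
    }
}
```

For the failing scenario-3 input this would remove `country` (and the root-level `lastname`) before the JsonDecoder ever sees them, so decoding proceeds as in scenarios 1 and 2.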
> avro JsonDecoding additional field in array type
> ------------------------------------------------
>
> Key: AVRO-2137
> URL: https://issues.apache.org/jira/browse/AVRO-2137
> Project: Apache Avro
> Issue Type: Bug
> Components: java
> Affects Versions: 1.8.1
> Reporter: Arun sethia
> Priority: Major
>
> I have the following Avro schema:
> {code:json}
> {
>   "type": "record",
>   "name": "test",
>   "namespace": "test.name",
>   "fields": [
>     {
>       "name": "items",
>       "type": {
>         "type": "array",
>         "items": {
>           "type": "record",
>           "name": "items",
>           "fields": [
>             { "name": "name", "type": "string" },
>             { "name": "state", "type": "string" }
>           ]
>         }
>       }
>     },
>     { "name": "firstname", "type": "string" }
>   ]
> }
> {code}
> When I use the JSON decoder and the Avro binary encoder to encode JSON data (Scala code):
> {code:scala}
> import java.io.ByteArrayOutputStream
> import org.apache.avro.generic.{GenericDatumReader, GenericDatumWriter, GenericRecord}
> import org.apache.avro.io.{DecoderFactory, EncoderFactory, JsonDecoder}
>
> val writer = new GenericDatumWriter[GenericRecord](schema)
> val reader = new GenericDatumReader[GenericRecord](schema)
> val baos = new ByteArrayOutputStream
> val decoder: JsonDecoder = DecoderFactory.get.jsonDecoder(schema, json)
> val encoder = EncoderFactory.get.binaryEncoder(baos, null)
> val datum = reader.read(null, decoder)
> writer.write(datum, encoder)
> encoder.flush()
> val avroByteArray = baos.toByteArray
> {code}
> *scenario1:* when I pass the following JSON, it encodes fine:
> {code:json}
> {
>   "items": [
>     { "name": "dallas", "state": "TX" }
>   ],
>   "firstname": "arun"
> }
> {code}
> *scenario2:* when I pass an additional attribute at the root level of the JSON (lastname), it still encodes fine:
> {code:json}
> {
>   "items": [
>     { "name": "dallas", "state": "TX" }
>   ],
>   "firstname": "fname",
>   "lastname": "lname"
> }
> {code}
> *scenario3:* when I add an additional attribute to a record inside the array (country), it throws the following exception:
> {code}
> org.apache.avro.AvroTypeException: Expected record-end. Got FIELD_NAME
>     at org.apache.avro.io.JsonDecoder.error(JsonDecoder.java:698)
> {code}
> for the input:
> {code:json}
> {
>   "items": [
>     { "name": "dallas", "state": "TX", "country": "USA" }
>   ],
>   "firstname": "fname",
>   "lastname": "lname"
> }
> {code}
> If there is an additional element inside an array type, decoding should work the same way as it does for a plain record: the decoder should simply discard the unknown fields and decode the JSON data.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)