Posted to users@kafka.apache.org by Bart Vercammen <ba...@cloutrix.com> on 2018/10/15 13:38:06 UTC

[Kafka Consumer] deserializer (documentation) mismatch?

Hi,

I found a mismatch between the documentation in
the org.apache.kafka.common.serialization.Deserializer and the
implementation in KafkaConsumer.

The Deserializer documentation says: *"serialized bytes; may be null;
implementations are recommended to handle null by returning a value or null
rather than throwing an exception"*
but the KafkaConsumer never actually passes 'null' to the deserializer.

From 'parseRecord' in 'org.apache.kafka.clients.consumer.internals.Fetcher':
{{
    K key = keyBytes == null ? null :
        this.keyDeserializer.deserialize(partition.topic(), headers, keyByteArray);
    ByteBuffer valueBytes = record.value();
    byte[] valueByteArray = valueBytes == null ? null : Utils.toArray(valueBytes);
    V value = valueBytes == null ? null :
        this.valueDeserializer.deserialize(partition.topic(), headers, valueByteArray);
}}

I stumbled upon this discrepancy while trying to pass a valid object from
the deserializer to the application when a 'delete' (a null-value tombstone)
was received on a log-compacted topic.
So the question I have is this: is the documentation in the Deserializer
wrong, or is the implementation in the Fetcher wrong?
To me it seems more plausible to have 'null' processed by the deserializer
than to have the Fetcher short-circuit on 'null' values ...
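For illustration, this is the kind of deserializer the documented contract
would allow (just a sketch: the class name and the tombstone marker are made
up, and the method mirrors the shape of
org.apache.kafka.common.serialization.Deserializer rather than importing it;
with the current Fetcher behaviour the null branch is simply never reached):

```java
import java.nio.charset.StandardCharsets;

// Sketch of a value deserializer that maps a null payload (a tombstone on a
// log-compacted topic) to an explicit marker object, as the Deserializer
// javadoc suggests implementations may do. With the current Fetcher
// short-circuit, deserialize() is never invoked with null data, so the
// application only ever sees a plain null value instead of the marker.
class TombstoneAwareDeserializer {
    static final String TOMBSTONE = "<deleted>";

    // Mirrors Deserializer<String>.deserialize(String topic, byte[] data)
    public String deserialize(String topic, byte[] data) {
        if (data == null) {
            return TOMBSTONE; // surface the delete to the application
        }
        return new String(data, StandardCharsets.UTF_8);
    }
}
```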

Any thoughts?
Greets,
Bart