Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/06/04 22:43:19 UTC

[GitHub] [beam] damccorm opened a new issue, #21332: Coder information lost in Kafka Read

damccorm opened a new issue, #21332:
URL: https://github.com/apache/beam/issues/21332

   When upgrading from 2.29.0 to 2.36.0 our Kafka Read transform broke.
   While debugging, I saw that information of the key and value coders is lost after [this statement in KafkaIO](https://github.com/apache/beam/blob/v2.36.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaIO.java#L1459):
   ```java
   return output.apply(readTransform).setCoder(KafkaRecordCoder.of(keyCoder, valueCoder));
   ```
   
   The issue seems to be that `output.apply` already checks for the presence of key and value coders and, when they are missing, tries to infer them with the help of [LocalDeserializerProvider](https://github.com/apache/beam/blob/v2.36.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/LocalDeserializerProvider.java), so execution never reaches the `setCoder` call.
   In our case this inference fails because we implement the `Deserializer` interface in a superclass of the instance passed as the deserializer (I assume it would also fail afterwards, since we do not register our coders with the registry).
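   The superclass situation can be sketched without any Beam or Kafka dependencies. The snippet below is a minimal, self-contained illustration (all names are hypothetical, and `Deserializer` here is a stand-in for the Kafka interface, not the real one): an inference routine that only inspects interfaces declared *directly* on the given class resolves the type parameter for the superclass but not for the concrete subclass that would be handed to `KafkaIO`. This is similar in spirit to, not a copy of, what the actual coder inference does.

   ```java
   import java.lang.reflect.ParameterizedType;
   import java.lang.reflect.Type;

   public class InferenceSketch {
       // Stand-in for org.apache.kafka.common.serialization.Deserializer.
       interface Deserializer<T> { T deserialize(byte[] data); }

       // The superclass is where the interface is actually implemented...
       abstract static class BaseDeserializer implements Deserializer<String> {
           public String deserialize(byte[] data) { return new String(data); }
       }

       // ...while the class passed as the deserializer declares no interfaces itself.
       static class MyDeserializer extends BaseDeserializer {}

       // Naive inference that only looks at interfaces declared directly on `cls`.
       static Type inferDirect(Class<?> cls) {
           for (Type t : cls.getGenericInterfaces()) {
               if (t instanceof ParameterizedType
                       && ((ParameterizedType) t).getRawType() == Deserializer.class) {
                   // Found Deserializer<T>: report T.
                   return ((ParameterizedType) t).getActualTypeArguments()[0];
               }
           }
           return null; // inference fails: no directly declared Deserializer<T>
       }

       public static void main(String[] args) {
           // Resolves to String for the superclass, but fails for the subclass.
           System.out.println(inferDirect(BaseDeserializer.class));
           System.out.println(inferDirect(MyDeserializer.class));
       }
   }
   ```

   If I recall correctly, `KafkaIO.Read` also exposes `withKeyDeserializerAndCoder`/`withValueDeserializerAndCoder`, which supply the coders explicitly and would sidestep inference entirely, though that does not fix the lost-coder behavior described above.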
   
   This was not yet broken in 2.29.0, so all versions after that could be affected.
   
   Imported from Jira [BEAM-13924](https://issues.apache.org/jira/browse/BEAM-13924). Original Jira may contain additional context.
   Reported by: jgrabber.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org