Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2019/01/20 19:44:42 UTC

[GitHub] alexeyt820 commented on a change in pull request #6615: [FLINK-8354] [flink-connectors] Add ability to access and provider Kafka headers

URL: https://github.com/apache/flink/pull/6615#discussion_r249292744
 
 

 ##########
 File path: flink-connectors/flink-connector-kafka-base/src/main/java/org/apache/flink/streaming/util/serialization/KeyedDeserializationSchema.java
 ##########
 @@ -32,6 +34,49 @@
  */
 @PublicEvolving
 public interface KeyedDeserializationSchema<T> extends Serializable, ResultTypeQueryable<T> {
 +	/**
 +	 * Kafka record to be deserialized.
 +	 * A record consists of a key/value pair, topic name, partition offset, headers, and a timestamp (if available).
 +	 */
+	interface Record {
 
 Review comment:
   I already answered this before: we had to keep KeyedDeserializationSchema.Record because KeyedDeserializationSchema is used by legacy connectors; e.g., Kafka 0.8 doesn't have _ConsumerRecord_ at all.
   Regarding the timestamp: yes, we will need to expand the interface, and I planned to address that in a separate PR on top of this one.
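
  To illustrate the shape of the abstraction being discussed, here is a minimal sketch of a `Record`-style interface covering the fields named in the javadoc (key/value pair, topic, partition offset, headers, optional timestamp). All method names and the header representation are illustrative assumptions, not the PR's actual signatures:

  ```java
  import java.util.AbstractMap;
  import java.util.Collections;
  import java.util.Map;

  public class RecordSketch {

      /**
       * Hypothetical sketch of the record abstraction discussed above.
       * Decouples deserialization from any specific Kafka client version,
       * since e.g. Kafka 0.8 has no ConsumerRecord class.
       */
      interface Record {
          byte[] key();
          byte[] value();
          String topic();
          int partition();
          long offset();
          // Headers modeled here as name -> raw bytes; the real PR may differ.
          Iterable<Map.Entry<String, byte[]>> headers();
          // Nullable: older brokers (e.g. Kafka 0.8) provide no timestamp.
          Long timestamp();
      }

      // Tiny fixed-value implementation, just to show how a consumer
      // of the interface would read the fields.
      static Record sample() {
          return new Record() {
              public byte[] key() { return "k".getBytes(); }
              public byte[] value() { return "v".getBytes(); }
              public String topic() { return "demo-topic"; }
              public int partition() { return 0; }
              public long offset() { return 42L; }
              public Iterable<Map.Entry<String, byte[]>> headers() {
                  return Collections.singletonList(
                      new AbstractMap.SimpleEntry<>("trace-id", "abc".getBytes()));
              }
              public Long timestamp() { return null; } // no timestamp available
          };
      }
  }
  ```

  A legacy-connector implementation would wrap its own message type behind this interface, so the deserialization schema never touches version-specific client classes.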

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services