Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2019/01/20 05:59:32 UTC

[GitHub] stevenzwu commented on a change in pull request #6615: [FLINK-8354] [flink-connectors] Add ability to access and provider Kafka headers

URL: https://github.com/apache/flink/pull/6615#discussion_r249265860
 
 

 ##########
 File path: flink-connectors/flink-connector-kafka-base/src/main/java/org/apache/flink/streaming/util/serialization/KeyedDeserializationSchema.java
 ##########
 @@ -32,6 +34,49 @@
  */
 @PublicEvolving
 public interface KeyedDeserializationSchema<T> extends Serializable, ResultTypeQueryable<T> {
+	/**
+	 * Kafka record to be deserialized.
+	 * A record consists of a key/value pair, the topic name, partition, offset, headers, and a timestamp (if available).
+	 */
+	interface Record {
 
 Review comment:
   I am wondering if we should just use `ConsumerRecord` from Kafka directly. This `KeyedDeserializationSchema` is for the Kafka consumer only, so there isn't much benefit in defining another Flink interface.
   
   Just to give a counterpoint: I am also interested in the `timestamp` field, and now we would need to expand this interface to expose it.
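
   To make the alternative concrete, here is a rough, self-contained sketch of what passing the Kafka record straight to the schema could look like. `SimpleRecord` and `RecordDeserializationSchema` are hypothetical stand-ins, not code from this PR; the actual suggestion would use Kafka's `org.apache.kafka.clients.consumer.ConsumerRecord<byte[], byte[]>` directly, the stub exists only so the sketch compiles on its own:

```java
import java.nio.charset.StandardCharsets;

public class Sketch {

    // Minimal stand-in for Kafka's ConsumerRecord<byte[], byte[]>. The real
    // class already exposes key, value, topic, partition, offset, timestamp,
    // and headers, so no new Flink interface would be needed.
    static final class SimpleRecord {
        final byte[] key;
        final byte[] value;
        final String topic;
        final int partition;
        final long offset;
        final long timestamp;

        SimpleRecord(byte[] key, byte[] value, String topic,
                     int partition, long offset, long timestamp) {
            this.key = key;
            this.value = value;
            this.topic = topic;
            this.partition = partition;
            this.offset = offset;
            this.timestamp = timestamp;
        }
    }

    // The schema then needs only one method. New fields such as `timestamp`
    // come for free because they live on the record, not on the interface.
    interface RecordDeserializationSchema<T> {
        T deserialize(SimpleRecord record);
    }

    public static void main(String[] args) {
        RecordDeserializationSchema<String> schema =
            rec -> new String(rec.value, StandardCharsets.UTF_8) + "@" + rec.timestamp;

        SimpleRecord rec = new SimpleRecord(
            null, "payload".getBytes(StandardCharsets.UTF_8),
            "topic-a", 0, 42L, 1000L);

        // Prints "payload@1000": the deserializer reads both the value
        // bytes and the timestamp without any interface change.
        System.out.println(schema.deserialize(rec));
    }
}
```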

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services