Posted to issues@flink.apache.org by "Allen Wang (JIRA)" <ji...@apache.org> on 2019/01/14 21:31:00 UTC

[jira] [Created] (FLINK-11319) Allow usage of custom implementation of Kafka Producer and Consumer in source and sink

Allen Wang created FLINK-11319:
----------------------------------

             Summary: Allow usage of custom implementation of Kafka Producer and Consumer in source and sink
                 Key: FLINK-11319
                 URL: https://issues.apache.org/jira/browse/FLINK-11319
             Project: Flink
          Issue Type: Improvement
          Components: Kafka Connector
            Reporter: Allen Wang


We use our own implementation of Kafka producer and consumer in our cloud environment for better integration with our infrastructure. The {{Consumer}} and {{Producer}} interfaces are properly implemented, but the implementation does not extend {{KafkaConsumer}} or {{KafkaProducer}}. Instead, it wraps and decorates the instance of the default Kafka implementation.
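The wrap-and-decorate approach described above might look like the following minimal sketch. The {{Producer}} interface here is a local stand-in (reduced to two methods) for {{org.apache.kafka.clients.producer.Producer}} so the example is self-contained, and the {{MeteredProducer}} name and its counting hook are hypothetical illustrations of an infrastructure integration:

{code:java}
import java.util.Properties;

public class DecoratorSketch {

    // Stand-in for org.apache.kafka.clients.producer.Producer.
    interface Producer<K, V> {
        void send(K key, V value);
        void close();
    }

    // Stand-in for the default KafkaProducer implementation.
    static class DefaultProducer<K, V> implements Producer<K, V> {
        DefaultProducer(Properties properties) {}
        @Override public void send(K key, V value) { /* hand off to the broker */ }
        @Override public void close() {}
    }

    // Decorator: implements the Producer interface but wraps the default
    // instance rather than extending KafkaProducer.
    static class MeteredProducer<K, V> implements Producer<K, V> {
        private final Producer<K, V> delegate;
        private long sendCount = 0;

        MeteredProducer(Producer<K, V> delegate) {
            this.delegate = delegate;
        }

        @Override public void send(K key, V value) {
            sendCount++;                 // hypothetical infrastructure hook
            delegate.send(key, value);   // defer to the wrapped producer
        }

        @Override public void close() {
            delegate.close();
        }

        long sendCount() {
            return sendCount;
        }
    }

    public static void main(String[] args) {
        MeteredProducer<String, String> producer =
                new MeteredProducer<>(new DefaultProducer<>(new Properties()));
        producer.send("key", "value");
        System.out.println(producer.sendCount()); // prints 1
        producer.close();
    }
}
{code}

Because the decorator only depends on the {{Producer}} interface, any code that is likewise written against the interface can accept it transparently.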

I propose the following changes to make it easy to hook up our own implementation with Flink. 
 * Refer to the {{Consumer}} and {{Producer}} interfaces, not {{KafkaConsumer}} or {{KafkaProducer}}, in {{FlinkKafkaInternalProducer}} and {{KafkaConsumerThread}}
 * Add {{ConsumerBuilder}} and {{ProducerBuilder}} interfaces with the following definitions
{code:java}
// ProducerBuilder
Producer<K, V> build(Properties properties);

// ConsumerBuilder
Consumer<byte[], byte[]> build(Properties properties);
{code}

 * Add new constructors to {{FlinkKafkaProducer}} and {{FlinkKafkaConsumer}} that accept a {{ProducerBuilder}} or {{ConsumerBuilder}}, respectively.
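Taken together, the proposal could be sketched as below. The local {{Producer}}/{{Consumer}} interfaces stand in for the real {{org.apache.kafka.clients}} types so the example compiles on its own, and the builder-accepting constructor shown is the *proposed* API, not one that exists in Flink today:

{code:java}
import java.util.Properties;

public class BuilderSketch {

    // Stand-ins for org.apache.kafka.clients Producer/Consumer.
    interface Producer<K, V> { void close(); }
    interface Consumer<K, V> { void close(); }

    // Proposed: the caller supplies a builder instead of Flink
    // instantiating KafkaProducer/KafkaConsumer directly.
    @FunctionalInterface
    interface ProducerBuilder<K, V> {
        Producer<K, V> build(Properties properties);
    }

    @FunctionalInterface
    interface ConsumerBuilder {
        Consumer<byte[], byte[]> build(Properties properties);
    }

    public static void main(String[] args) {
        // A custom producer (here anonymous, e.g. a decorating
        // implementation) wired in through the builder.
        ProducerBuilder<String, String> builder =
                properties -> new Producer<String, String>() {
                    @Override public void close() {}
                };

        // The sink would call builder.build(properties) internally
        // wherever it currently constructs a KafkaProducer.
        Producer<String, String> producer = builder.build(new Properties());
        System.out.println(producer != null); // prints true
        producer.close();
    }
}
{code}

Since each builder has a single abstract method, user code could pass a lambda or method reference straight into the proposed constructor.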



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)