Posted to issues@flink.apache.org by tzulitai <gi...@git.apache.org> on 2016/07/14 06:34:24 UTC

[GitHub] flink issue #2231: [FLINK-4035] Bump Kafka producer in Kafka sink to Kafka 0...

Github user tzulitai commented on the issue:

    https://github.com/apache/flink/pull/2231
  
    Hi @radekg , thank you for opening a PR for this!
    From a first look, it seems there aren't many changes between the code of `flink-connector-kafka-0.9` and this PR. Also, from the original discussion / comments in the JIRA, the Kafka API doesn't seem to have changed between 0.9 and 0.10, so it might be possible to let the Kafka 0.9 connector use the 0.10 client simply by putting the Kafka 0.10 dependency into the user pom.
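    To illustrate what I mean, the user pom could exclude the connector's transitive `kafka-clients` dependency and pin the 0.10 client explicitly. This is only a sketch; the artifact names and version numbers below are assumptions and would need to match the user's actual Flink/Scala versions:

    ```xml
    <!-- Sketch: force the Kafka 0.10 client onto the 0.9 connector.
         Versions and the Scala suffix (_2.10) are illustrative only. -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.9_2.10</artifactId>
        <version>1.1.0</version>
        <exclusions>
            <!-- Drop the 0.9.x client the connector pulls in transitively -->
            <exclusion>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka-clients</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <!-- Pin the 0.10 client explicitly -->
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.10.0.0</version>
    </dependency>
    ```

    Whether this actually works of course depends on the 0.10 client remaining binary-compatible with the APIs the 0.9 connector uses.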
    
    May I ask whether you have already tried this approach? Also,
    > At The Weather Company we bumped into a problem while trying to use Flink with Kafka 0.10.x.
    What was the problem? If you can describe it, that would help us decide how to proceed with this :) Another contributor was also trying this out; I'll ask for his feedback on this in the JIRA as well.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---