Posted to users@kafka.apache.org by Phillip Mann <pm...@trulia.com> on 2017/03/21 20:57:44 UTC
Are there Connector artifacts in Confluent or any other Maven repository?
I am trying to migrate from StreamX (https://github.com/qubole/streamx) to the official Confluent S3 connector (https://github.com/confluentinc/kafka-connect-storage-cloud). Part of my implementation of Kafka Connect requires a custom partitioner. This partitioner originally extended the Partitioner interface defined here (https://github.com/confluentinc/kafka-connect-hdfs/blob/master/src/main/java/io/confluent/connect/hdfs/partitioner/Partitioner.java). That was possible because I would build StreamX and add it to my company's artifact repository. However, before I fork a bunch of different Confluent projects and add them to my company's repository, I would like to know whether I can import Confluent projects such as the HDFS connector and the S3 connector through Maven so that I can use code from these projects. If this isn't possible, why doesn't Confluent publish these artifacts to the Confluent repository? Thanks for your help!
Phillip
Re: Are there Connector artifacts in Confluent or any other Maven repository?
Posted by Ewen Cheslack-Postava <ew...@confluent.io>.
Yes, these get published to Confluent's Maven repository. Follow the
instructions here
http://docs.confluent.io/current/installation.html#installation-maven to
add the Confluent Maven repository to your project, then add a
dependency on the connector (e.g. for that partitioner you need
io.confluent:kafka-connect-hdfs). Be sure to add it as a provided
dependency so you don't actually get an extra copy of the connector and its
dependencies.
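
Concretely, the pom.xml changes described above would look roughly like the following sketch. The repository URL and the version number are illustrative assumptions, not taken from this thread; check the linked Confluent installation instructions for the current values:

```xml
<!-- Confluent's Maven repository (URL per Confluent's installation docs;
     verify against the current instructions). -->
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>

<dependencies>
  <!-- provided scope: compile against the connector's classes (e.g. the
       Partitioner interface) without bundling a second copy of the
       connector and its dependencies into your artifact. -->
  <dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-connect-hdfs</artifactId>
    <version>3.2.0</version> <!-- example version; use the release you run -->
    <scope>provided</scope>
  </dependency>
</dependencies>
```

With provided scope the connector's classes are on the compile classpath only; at runtime your custom partitioner jar is dropped into the Connect worker's plugin path, where the connector itself supplies those classes.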
-Ewen