Posted to users@kafka.apache.org by Pulkit Manchanda <pu...@gmail.com> on 2019/03/15 14:52:34 UTC
Kafka - Connect for logs processing
Hi All,
I am building a data pipeline to send logs from one data source to
another node.
I am using Kafka Connect in standalone mode for this integration.
Everything works fine, but the problem is that on Day 1 the log file is
renamed to log_Day0 and a new log file, log_Day1, is created.
Kafka Connect doesn't process the new log file.
Looking for a solution. Any help is appreciated.
Thanks
Pulkit
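For context, a typical standalone file-source setup looks like the sketch below (file path and topic name are made up, and it assumes the stock FileStreamSource connector that ships with Kafka). Because the connector tails a single, fixed path, a daily rename-and-recreate rotation scheme leaves it pointed at the old file.

```properties
# connect-file-source.properties -- illustrative sketch only
# (hypothetical path and topic; assumes the stock FileStreamSource)
name=local-file-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
# FileStreamSource follows exactly this one configured path, so when
# the current file is renamed to log_Day0 and a fresh log_Day1 is
# created under a different name, the connector never opens it.
file=/var/log/app/log_Day0
topic=app-logs
```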
Re: Kafka - Connect for logs processing
Posted by Pulkit Manchanda <pu...@gmail.com>.
Hi Hans,
Thanks for the quick response.
I will look into it.
Thanks
Pulkit
On Fri, Mar 15, 2019 at 11:39 AM Hans Jespersen <ha...@confluent.io> wrote:
> Take a look at kafka-connect-spooldir and see if it meets your needs.
>
> https://www.confluent.io/connector/kafka-connect-spooldir/
>
> This connector can monitor a directory and pick up any new files that are
> created. Great for picking up batch files, parsing them, and publishing
> each line as if it were being produced in real time.
>
> -hans
>
Re: Kafka - Connect for logs processing
Posted by Hans Jespersen <ha...@confluent.io>.
Take a look at kafka-connect-spooldir and see if it meets your needs.
https://www.confluent.io/connector/kafka-connect-spooldir/
This connector can monitor a directory and pick up any new files that are created. Great for picking up batch files, parsing them, and publishing each line as if it were being produced in real time.
-hans
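A directory-watching setup along the lines Hans describes might look like the sketch below (connector class and property names follow the jcustenborder kafka-connect-spooldir project; the paths, topic, and file pattern are made up for illustration, and the plugin must be installed on the worker's plugin.path):

```properties
# spooldir-source.properties -- hedged sketch, not a tested config
name=logs-spooldir-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirLineDelimitedSourceConnector
topic=app-logs
# Watch this directory for any new file whose name matches the pattern,
# so rotated files like log_Day0, log_Day1, ... are picked up as they appear
input.path=/var/log/app/incoming
input.file.pattern=^log_Day\d+$
# Successfully processed files are moved to finished.path;
# files that fail to parse are moved to error.path
finished.path=/var/log/app/finished
error.path=/var/log/app/error
```

Since the connector moves files out of input.path once they are processed, the rotation scheme from the original question (rename the old file, create a new one) maps naturally onto dropping each day's closed-out file into the watched directory.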