Posted to users@kafka.apache.org by Yogesh BG <yo...@gmail.com> on 2016/04/20 06:11:24 UTC

Kafka support needed

Hi



I have one scenario, described below, and I want to know whether it is currently
supported. If not, is there any workaround using existing Kafka features?



I have a Kafka producer that currently does not have a connection to the broker.
I want it to send the buffered messages to the Kafka broker once the connection
becomes available.

Meanwhile, I should be able to delete messages from the producer buffer once the
buffer reaches a certain size, or after the connection has been unavailable for
some number of days.

-- 
Yogesh..BG
Senior Software engineer
Sling Media Pvt. Ltd.
PSS Plaza, #6,
Wind Tunnel Road.
Murghesh Palya,
Banglore - 560 017
Contact no: 7760922118

Re: Kafka support needed

Posted by Yogesh BG <yo...@gmail.com>.
Exception in thread "main" java.util.concurrent.ExecutionException:
org.apache.kafka.common.errors.TimeoutException: Batch Expired
    at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.valueOrError(FutureRecordMetadata.java:56)
    at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:43)

What could be the cause?

import java.util.List;
import java.util.Properties;
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.PartitionInfo;

Properties properties = new Properties();
properties.put("bootstrap.servers", IPPORT);
properties.put("acks", "all");
properties.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
properties.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");

KafkaProducer<String, String> producer = new KafkaProducer<>(properties);
// partitionsFor() blocks while it fetches metadata for the topic.
List<PartitionInfo> partitionsFor = producer.partitionsFor("elnetty-items");
while (true) {
    // send() only enqueues the record; get() blocks until the broker acks it
    // or the batch expires with the TimeoutException shown above.
    Future<RecordMetadata> send = producer.send(
            new ProducerRecord<String, String>("elnetty-items", "event1", "value1"));
    RecordMetadata recordMetadata = send.get();
}
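
My guess is that the batch expired because no broker was reachable within
request.timeout.ms, so the record never left the producer's in-memory buffer.
Below is a minimal sketch of what I could do instead: send with a callback so a
failed batch is logged rather than the loop blocking on get(). The settings shown
are the defaults and only illustrative.

import java.util.Properties;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

Properties props = new Properties();
props.put("bootstrap.servers", IPPORT);
props.put("acks", "all");
props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
// Illustrative values (these are the defaults): buffer.memory bounds how many
// bytes can pile up while the broker is unreachable, and request.timeout.ms
// bounds how long a batch waits before it is expired as above.
props.put("buffer.memory", "33554432");
props.put("request.timeout.ms", "30000");

KafkaProducer<String, String> producer = new KafkaProducer<>(props);
producer.send(
        new ProducerRecord<String, String>("elnetty-items", "event1", "value1"),
        new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (exception != null) {
                    // Broker unreachable / batch expired: log it and decide
                    // whether to retry or drop the record.
                    System.err.println("send failed: " + exception);
                }
            }
        });
producer.close();

Note that this buffer is memory only, so anything still buffered is lost if the
process restarts.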


-- 
Yogesh..BG
Senior Software engineer
Sling Media Pvt. Ltd.
PSS Plaza, #6,
Wind Tunnel Road.
Murghesh Palya,
Banglore - 560 017
Contact no: 7760922118

Re: Kafka support needed

Posted by Joe Stein <jo...@elodina.net>.
If you use Go, you can use https://github.com/sclasen/event-shuttle, which is
a nice choice in some cases because of its small footprint; it uses BoltDB,
which is similar to LevelDB, i.e. an embedded key/value store.

NiFi is cool too: https://nifi.apache.org/

So is Bruce: https://github.com/ifwe/bruce

Those are more out of the box, and you can consider them reliable because
people use them in production.


Re: Kafka support needed

Posted by Sunny Shah <su...@gmail.com>.
Hi Yogesh,

You can even use SQLite or LevelDB to buffer the data on the client.
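
A rough sketch of that idea, assuming the org.xerial sqlite-jdbc driver is on
the classpath; the table name, topic, and Kafka producer setup are only
illustrative. Every event goes into a local SQLite table, and whenever the
broker is reachable you drain the table into Kafka and delete only the rows
the broker acknowledged.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SqliteBuffer {

    private final Connection db;

    public SqliteBuffer(String path) throws Exception {
        // Needs the org.xerial sqlite-jdbc driver on the classpath.
        db = DriverManager.getConnection("jdbc:sqlite:" + path);
        try (Statement s = db.createStatement()) {
            s.execute("CREATE TABLE IF NOT EXISTS buffer ("
                    + "id INTEGER PRIMARY KEY AUTOINCREMENT, k TEXT, v TEXT)");
        }
    }

    // Call this for every event, whether or not the broker is reachable.
    public void append(String key, String value) throws Exception {
        try (PreparedStatement ps =
                db.prepareStatement("INSERT INTO buffer (k, v) VALUES (?, ?)")) {
            ps.setString(1, key);
            ps.setString(2, value);
            ps.executeUpdate();
        }
    }

    // Call this periodically: it pushes buffered rows to Kafka and deletes a
    // row only after the broker has acknowledged it.
    public void drain(KafkaProducer<String, String> producer, String topic)
            throws Exception {
        // Read the pending rows first so no ResultSet is open while deleting.
        List<Object[]> pending = new ArrayList<>();
        try (Statement s = db.createStatement();
             ResultSet rs = s.executeQuery("SELECT id, k, v FROM buffer ORDER BY id")) {
            while (rs.next()) {
                pending.add(new Object[] {
                        rs.getLong("id"), rs.getString("k"), rs.getString("v") });
            }
        }
        for (Object[] row : pending) {
            // get() blocks until the remote broker acks the record.
            producer.send(new ProducerRecord<>(topic,
                    (String) row[1], (String) row[2])).get();
            try (PreparedStatement del =
                    db.prepareStatement("DELETE FROM buffer WHERE id = ?")) {
                del.setLong(1, (Long) row[0]);
                del.executeUpdate();
            }
        }
    }
}

You can also bound the buffer by deleting rows older than some age or beyond a
size cap with a similar DELETE statement, which would cover your requirement of
dropping messages after some size or number of days.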

Thanks,
Sunny

Re: Kafka support needed

Posted by Yogesh BG <yo...@gmail.com>.
Thank you for the reply.

I am running the producer on a very resource-constrained device (an IoT hub),
so I doubt I can accommodate a local broker.



-- 
Yogesh..BG
Senior Software engineer
Sling Media Pvt. Ltd.
PSS Plaza, #6,
Wind Tunnel Road.
Murghesh Palya,
Banglore - 560 017
Contact no: 7760922118

Re: Kafka support needed

Posted by Sunny Shah <su...@gmail.com>.
Hi Yogesh,

No, Kafka does not provide this functionality out of the box, though you can
engineer it yourself by running a local Kafka setup:

   1. Always write data to the local Kafka first.
   2. When the connection to the remote broker is available, read the data
   from the local Kafka and forward it to the remote broker (see the sketch
   below).

If you don't want to engineer this system yourself, you can use Apache NiFi;
it is meant for reliable edge-node data ingestion.
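
A rough sketch of step 2, assuming the 0.9 Java clients; the topic name and
broker addresses are only placeholders. It consumes from the local broker and
re-publishes each record to the remote one, essentially a tiny MirrorMaker:

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties consumerProps = new Properties();
consumerProps.put("bootstrap.servers", "localhost:9092");      // local buffering broker
consumerProps.put("group.id", "edge-forwarder");
consumerProps.put("key.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");
consumerProps.put("value.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");

Properties producerProps = new Properties();
producerProps.put("bootstrap.servers", "remote-broker:9092");  // remote cluster
producerProps.put("acks", "all");
producerProps.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
producerProps.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
consumer.subscribe(Collections.singletonList("elnetty-items"));

while (true) {
    // poll() returns whatever has accumulated locally; the consumer group
    // offset records how far the forwarder got, so it resumes after restarts.
    ConsumerRecords<String, String> records = consumer.poll(1000);
    for (ConsumerRecord<String, String> record : records) {
        producer.send(new ProducerRecord<>("elnetty-items", record.key(), record.value()));
    }
    // flush() blocks until the remote broker has acknowledged the batch.
    producer.flush();
}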

Thanks,
 Sunny


Re: Kafka support needed

Posted by to...@borked.ca.
Rsyslog (8.15+) now supports producing to Kafka, and it doesn't require Java (which can be a bonus). Rsyslog can use a disk buffer, and when it can connect to Kafka it will start streaming logs until the connection drops. That's a pretty simple config, and there are lots of examples online.

T

Sent from my BlackBerry 10 smartphone