Posted to users@camel.apache.org by Nathan Jones <na...@ncjones.com> on 2014/12/30 12:41:05 UTC

Rate limit producer to match consumer

We are trying to improve the responsiveness of some bulk message 
processes such that a large batch does not flood a queue and prevent 
subsequent smaller batches getting through in a timely fashion. For 
example, a job to import millions of records from CSV may take an hour 
but a smaller job to import just one thousand records should be able to 
begin processing immediately in parallel instead of being sent to the 
back of a very long queue.

This seems to me like the sort of thing Camel should be good at, but I 
have so far not been able to see how this could be achieved. The idea 
we have in mind is to have a queue with limited size that will block 
when it is full so that the rate of queuing from a large batch would be 
limited to consumer capacity. Subsequent batches would have equal 
opportunity to get the next message on to the queue.
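[Editor's sketch] The blocking behaviour described above is exactly what a bounded queue from java.util.concurrent provides out of the box, independent of Camel: put() blocks once the queue is at capacity, so a fast producer is throttled to consumer speed. A minimal, self-contained illustration (the capacity, message count, and sleep are made up for the demo):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Capacity 10: put() blocks once 10 messages are waiting,
        // which limits the producer to the consumer's drain rate.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    String msg = queue.take();  // blocks when empty
                    Thread.sleep(5);            // simulate slow processing
                    if (msg.equals("EOF")) return;
                }
            } catch (InterruptedException ignored) { }
        });
        consumer.start();

        // Producer: 100 messages against a capacity-10 queue.
        // put() blocks whenever the consumer falls behind.
        for (int i = 0; i < 100; i++) {
            queue.put("record-" + i);
        }
        queue.put("EOF");
        consumer.join();
        System.out.println("done, queue size = " + queue.size());
    }
}
```

A second producer thread calling put() on the same queue would compete on equal terms for each free slot, which is the "equal opportunity" property described above.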

At first I thought the Camel maxInflightExchanges property could be used 
for this but I don't think it has this effect. Is there a way a Camel 
route can inspect the size of the target queue to decide whether to 
suspend or resume?
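[Editor's sketch] Within Camel itself, the in-memory seda: component comes close to the bounded blocking queue described above: its size option caps the queue and blockWhenFull=true makes the producing route block (rather than throw) once it is full. A sketch assuming Camel 2.x; the file path, bean name, and limits are illustrative:

```java
import org.apache.camel.builder.RouteBuilder;

public class BoundedSedaRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Producer side: sending to a full seda queue blocks the caller,
        // so a huge batch is rate-limited to what the consumers can drain.
        from("file:inbox/bulk")
            .split().tokenize("\n").streaming()
            .to("seda:records?size=1000&blockWhenFull=true");

        // Consumer side: a small, fixed pool drains the queue.
        from("seda:records?concurrentConsumers=5")
            .to("bean:recordImporter");
    }
}
```

Note that seda queues are in-memory and per-JVM, so this only helps when producer and consumer run in the same CamelContext; across JVMs a broker-side mechanism would be needed.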

Perhaps a message broker can help solve this with either blocking queues 
or virtual aggregate queues but I haven't found these in RabbitMQ or 
ActiveMQ.

Does anyone have any advice on a way to solve this problem with Camel or 
otherwise?

  - Nathan

Re: Rate limit producer to match consumer

Posted by Taariq Levack <ta...@gmail.com>.
I just fired up a sample app based on the file-size CBR I mentioned. I
hope I didn't misunderstand or oversimplify, so take a look if you
haven't solved it yet.
https://github.com/levackt/samples/tree/master/file-size-cbr

Also take a look at the route throttling example which shows off the
policies nicely.
http://camel.apache.org/route-throttling-example.html
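[Editor's sketch] For reference, the mechanism that example demonstrates is a route policy that suspends a route's consumer when too many exchanges are in flight and resumes it as they drain. A sketch from memory using Camel 2.x's ThrottlingInflightRoutePolicy (the linked example has the authoritative version; the endpoint and limits here are illustrative):

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.ThrottlingInflightRoutePolicy;

public class ThrottledRoute extends RouteBuilder {
    @Override
    public void configure() {
        ThrottlingInflightRoutePolicy policy = new ThrottlingInflightRoutePolicy();
        policy.setMaxInflightExchanges(20);  // suspend consumer above 20 in flight
        policy.setResumePercentOfMax(70);    // resume once back under 70% of the max

        from("activemq:queue:bulk")
            .routePolicy(policy)
            .to("bean:recordImporter");
    }
}
```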

Taariq



On Tue, Dec 30, 2014 at 8:32 PM, Taariq Levack <ta...@gmail.com> wrote:

> How about a content based router that checks the file size and forwards
> large messages to one endpoint and smaller ones to another?
> Both are throttled or limited appropriately, and forward to your existing
> endpoint where the work is done.
>
> That last endpoint should have more consumers than either of the others.
> You can get more creative with the general idea, like dynamically setting
> the maximumRequestsPerPeriod[1] higher for the small file endpoint when
> there are no large files to process.
> [1] http://camel.apache.org/throttler.html
>
> Taariq
>
> > On 30 Dec 2014, at 13:41, Nathan Jones <na...@ncjones.com> wrote:
> >
> > We are trying to improve the responsiveness of some bulk message
> processes such that a large batch does not flood a queue and prevent
> subsequent smaller batches getting through in a timely fashion. For
> example, a job to import millions of records from CSV may take an hour but
> a smaller job to import just one thousand records should be able to begin
> processing immediately in parallel instead of being sent to the back of a
> very long queue.
> >
> > This seems to me like the sort of thing Camel should be good at but I
> have so far not been able to see how this could be achieved. The idea we
> have in mind is to have a queue with limited size that will block when it
> is full so that the rate of queuing from a large batch would be limited to
> consumer capacity. Subsequent batches would have equal opportunity to get
> the next message on to the queue.
> >
> > At first I thought the Camel maxInflightExchanges property could be used
> for this but I don't think it has this effect. Is there a way a Camel route
> can inspect the size of the target queue to decide whether to suspend or
> resume?
> >
> > Perhaps a message broker can help solve this with either blocking queues
> or virtual aggregate queues but I haven't found these in RabbitMQ or
> ActiveMQ.
> >
> > Does anyone have any advice on a way to solve this problem with Camel or
> otherwise?
> >
> > - Nathan
>

Re: Rate limit producer to match consumer

Posted by Taariq Levack <ta...@gmail.com>.
How about a content based router that checks the file size and forwards large messages to one endpoint and smaller ones to another?
Both are throttled or limited appropriately, and forward to your existing endpoint where the work is done.

That last endpoint should have more consumers than either of the others.
You can get more creative with the general idea, like dynamically setting the maximumRequestsPerPeriod[1] higher for the small file endpoint when there are no large files to process.
[1] http://camel.apache.org/throttler.html
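[Editor's sketch] The suggestion above could look something like this in the Java DSL, assuming Camel 2.x. The file component sets the CamelFileLength header, which the content based router can branch on; the 1 MB threshold, throttle rates, and endpoint names are all made up for illustration:

```java
import org.apache.camel.builder.RouteBuilder;

public class FileSizeCbrRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Route on file size: large files one way, small files another.
        from("file:inbox")
            .choice()
                .when(header("CamelFileLength").isGreaterThan(1024 * 1024))
                    .to("seda:largeFiles")
                .otherwise()
                    .to("seda:smallFiles");

        // Large files are throttled hard; small ones get more headroom.
        from("seda:largeFiles")
            .throttle(5).timePeriodMillis(1000)
            .to("seda:work");

        from("seda:smallFiles")
            .throttle(50).timePeriodMillis(1000)
            .to("seda:work");

        // The shared endpoint has more consumers than either feeder route.
        from("seda:work?concurrentConsumers=10")
            .to("bean:recordImporter");
    }
}
```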

Taariq

> On 30 Dec 2014, at 13:41, Nathan Jones <na...@ncjones.com> wrote:
> 
> We are trying to improve the responsiveness of some bulk message processes such that a large batch does not flood a queue and prevent subsequent smaller batches getting through in a timely fashion. For example, a job to import millions of records from CSV may take an hour but a smaller job to import just one thousand records should be able to begin processing immediately in parallel instead of being sent to the back of a very long queue.
> 
> This seems to me like the sort of thing Camel should be good at but I have so far not been able to see how this could be achieved. The idea we have in mind is to have a queue with limited size that will block when it is full so that the rate of queuing from a large batch would be limited to consumer capacity. Subsequent batches would have equal opportunity to get the next message on to the queue.
> 
> At first I thought the Camel maxInflightExchanges property could be used for this but I don't think it has this effect. Is there a way a Camel route can inspect the size of the target queue to decide whether to suspend or resume?
> 
> Perhaps a message broker can help solve this with either blocking queues or virtual aggregate queues but I haven't found these in RabbitMQ or ActiveMQ.
> 
> Does anyone have any advice on a way to solve this problem with Camel or otherwise?
> 
> - Nathan