Posted to users@camel.apache.org by Debraj Manna <su...@gmail.com> on 2016/10/01 14:43:10 UTC

Limit Concurrent Access

Hi

I have seen the Throttler <http://camel.apache.org/throttler.html> in Camel.
Is there anything available in Camel that restricts the number of concurrent
accesses, something along the lines of the Guava RateLimiter mentioned here
<https://github.com/google/guava/blob/master/guava/src/com/google/common/util/concurrent/RateLimiter.java#L41>
?

The question below is more generic, but I thought I would ask here in case
anyone can share some thoughts:

I have observed that most REST APIs do rate limiting on requests rather
than restricting the number of concurrent requests. Is there any specific
reason?
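
For context, the difference can be made concrete: Guava's RateLimiter hands
out permits per unit of time, while a concurrency limiter hands out a fixed
number of in-flight permits that are only returned when a call finishes. Here
is a minimal fail-fast sketch in plain java.util.concurrent; this is
illustrative JDK code, not an existing Camel feature, and the class and
method names are made up:

```java
import java.util.concurrent.Semaphore;

// Minimal sketch of a concurrency limiter: at most maxConcurrent callers
// may be "inside" at once; further callers are rejected immediately.
// Plain JDK code, not a Camel component.
class ConcurrencyLimiter {
    private final Semaphore permits;

    ConcurrencyLimiter(int maxConcurrent) {
        this.permits = new Semaphore(maxConcurrent);
    }

    // Try to enter; returns false (fail fast) when the limit is reached.
    boolean tryEnter() {
        return permits.tryAcquire();
    }

    // Must be called exactly once per successful tryEnter(), e.g. in a
    // finally block, so the permit goes back when the call finishes.
    void exit() {
        permits.release();
    }
}
```

Unlike RateLimiter's permits-per-second, a permit here is returned only when
the call completes, so the limit tracks in-flight work rather than arrival
rate.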

Re: Limit Concurrent Access

Posted by Brad Johnson <br...@mediadriver.com>.
@Debraj,

Until you can accurately characterize the request/response that users are
sending as input it's hard to give good advice on this.  If they are
sending a bunch of data and just expecting an acknowledgement back that
you've received it and are processing it then there are fairly simple ways
of dealing with it.  On the other hand, if they are sending you data and
you have to make all the DB calls, REST calls, and number crunching and then
send the results back it's a bit trickier but not impossible.

Are you sure that it is OK to just bounce your client's request?  That
means they'll have to have retry and/or error handling on their side.
That's usually undesirable.  But if you want to fail fast like that then you
should do it on the incoming endpoint by limiting the max requests/threads
as shown in the Jetty documentation.  I'll assume for now that you are using
Jetty as that's probably the most common.

Jetty also has "continuations" that you may want to look into.
Essentially a continuation doesn't hang the input thread; it permits you to
put the data on a queue for asynchronous processing (SEDA or persistent JMS,
depending on the data type) and then wait for the response from your
asynchronous processing before sending the response back to the user.

You need to characterize the problem better.  Think of a request to Amazon
for an order that involves charging a credit card, database lookups,
inventory reservations, shipping information, etc.  You don't sit and
wait for Amazon to process all that.  They send a quick response back
thanking you for your order, later send an email telling you your order
has been successfully processed, and later still send an email that your
order has shipped.  I'm not saying that exact flow is right for your
business case, but the point is that they send a very fast response
essentially acknowledging they've received your order and are processing
it.
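
The hand-off Brad describes (don't hang the input thread; queue the data and
respond when the asynchronous work finishes) can be sketched with plain JDK
executors. This is an illustration of the pattern only, not Jetty's
continuation API; the class and names are hypothetical:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch of the acknowledge-then-process pattern: receive() returns
// immediately while the real work runs on a background worker (standing
// in for a SEDA or JMS consumer); the future completes when processing
// is done and can be used to send the response back to the caller.
class AckThenProcess {
    private final ExecutorService workers = Executors.newFixedThreadPool(4);

    // Returns right away; the future completes when processing finishes.
    CompletableFuture<String> receive(String payload) {
        CompletableFuture<String> result = new CompletableFuture<>();
        workers.submit(() -> {
            // ...DB calls, REST calls, number crunching would go here...
            result.complete("processed:" + payload);
        });
        return result;
    }

    void shutdown() {
        workers.shutdown();
    }
}
```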

Brad

On Sun, Oct 2, 2016 at 11:32 PM, yogu13 <yo...@gmail.com> wrote:


Re: Limit Concurrent Access

Posted by yogu13 <yo...@gmail.com>.
I think the Throttler can be used in your case, unless you see an issue with
using it.

Another approach is to use a Route Policy
<http://camel.apache.org/routepolicy.html>.

Regards,
-Yogesh



--
View this message in context: http://camel.465427.n5.nabble.com/Limit-Concurrent-Access-tp5788278p5788304.html
Sent from the Camel - Users mailing list archive at Nabble.com.

Re: Limit Concurrent Access

Posted by Brad Johnson <br...@mediadriver.com>.
@Vitalii,

I'm not sure.  It sounds like he wants to receive a call and then process
it but I'm not positive.  That's why I was asking for a clearer
definition.  Here's one of the initial statements.

"In our use case we are doing a lot of crunching, DB and external REST
service calls. There is a limit on the external REST service calls we can
make. I can restrict the calls to external services using a thread
pool. *But I was thinking if it is possible to limit when receiving the
request, so that we can fail fast rather than limit while making the
external call. If a request crosses the limit, sending an error to the
caller is fine.*"

On Mon, Oct 3, 2016 at 10:32 PM, Vitalii Tymchyshyn <vi...@tym.im> wrote:


Re: Limit Concurrent Access

Posted by yogu13 <yo...@gmail.com>.
Hello Vitalii,

In the past I have successfully extended the Throttler to limit concurrent
requests, and that in a clustered environment across nodes. The scenario it
was used in was limiting our clients to a certain number of concurrent
requests they could send. The number of concurrent requests was based on the
licenses purchased by the client.


Regards,
-Yogesh



--
View this message in context: http://camel.465427.n5.nabble.com/Limit-Concurrent-Access-tp5788278p5788336.html
Sent from the Camel - Users mailing list archive at Nabble.com.

Re: Limit Concurrent Access

Posted by Vitalii Tymchyshyn <vi...@tym.im>.
I am not sure that Debraj was talking about incoming calls. And I was also
looking for a way to limit the number of concurrent exchanges being sent to
a given endpoint.
In the async scenario even a thread pool can't help, because one can make an
unlimited number of exchanges with one thread.
And the Throttler does not account for the number of concurrent requests, so
it can't be used to limit the concurrency level. I am actually thinking of
extending the Throttler.
To be specific, my use case is batch processing where I need to make some
web service calls with Netty. Currently, without the limitation, it can open
up to a few hundred concurrent sockets, which unnecessarily overloads the
server. I'd like to set a limit of e.g. 20 concurrent calls with the others
waiting (similar to a database connection pool).
The Netty4 component has limits to set, but it starts to fail when the limit
is reached instead of waiting. It would be very useful to have a generic
module to help in such cases.
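
The waiting behavior Vitalii describes (a fixed cap on concurrent outbound
calls, with further callers blocking like a database connection pool) can be
sketched with a blocking semaphore. Because the permit is released from the
completion callback rather than when the submitting thread returns, it also
works for async clients such as Netty, where one thread can start many
exchanges. Illustrative JDK code, not an existing Camel or Netty4 option:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Semaphore;
import java.util.function.Supplier;

// Sketch: cap the number of in-flight async calls. The submitter blocks
// when the cap is reached (others wait, like a connection pool); the
// permit is freed when the call completes, not when the submitting
// thread returns, which is what makes this work for async clients.
class OutboundGate {
    private final Semaphore inFlight;

    OutboundGate(int maxConcurrent) {
        this.inFlight = new Semaphore(maxConcurrent);
    }

    <T> CompletableFuture<T> call(Supplier<CompletableFuture<T>> client) {
        inFlight.acquireUninterruptibly();       // wait for a free slot
        CompletableFuture<T> f = client.get();   // start the async call
        f.whenComplete((v, e) -> inFlight.release()); // free slot on completion
        return f;
    }
}
```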

Best regards, Vitalii Tymchyshyn


On Sun, Oct 2, 2016 at 11:27, Brad Johnson <br...@mediadriver.com>
wrote:


Re: Limit Concurrent Access

Posted by Brad Johnson <br...@mediadriver.com>.
Ah, so you aren't really concerned about the incoming calls, per se; it's
the number of outgoing calls.  And to limit that you want to limit the
incoming calls?  Are the incoming calls sending data that can be processed
asynchronously, or are they returning chunks of data to the caller?

On Sat, Oct 1, 2016 at 2:45 PM, Debraj Manna <su...@gmail.com>
wrote:


Re: Limit Concurrent Access

Posted by Debraj Manna <su...@gmail.com>.
Thanks Brad for replying.

In our use case we are doing a lot of crunching, DB and external REST
service calls. There is a limit on the external REST service calls we can
make. I can restrict the calls to external services using a thread
pool. But I was thinking if it is possible to limit when receiving the
request, so that we can fail fast rather than limit while making the
external call. If a request crosses the limit, sending an error to the
caller is fine.
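
The fail-fast behavior described above can be sketched with a bounded
ThreadPoolExecutor: once the worker threads and the queue are full, further
submissions are rejected immediately, and the rejection can be translated
into an error for the caller (e.g. HTTP 503). Plain JDK code with
illustrative, made-up limits:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch of fail-fast admission control at the receiving side: at most
// 2 requests run concurrently and 1 may wait; anything beyond that is
// bounced immediately (AbortPolicy throws RejectedExecutionException,
// which an endpoint could translate into an error response).
class FailFastReceiver {
    final ThreadPoolExecutor pool = new ThreadPoolExecutor(
            2, 2,                      // fixed pool of 2 worker threads
            0L, TimeUnit.MILLISECONDS,
            new ArrayBlockingQueue<>(1),            // at most 1 queued request
            new ThreadPoolExecutor.AbortPolicy());  // reject the rest

    // Returns true if the request was admitted, false if it was bounced.
    boolean submit(Runnable request) {
        try {
            pool.execute(request);
            return true;
        } catch (RejectedExecutionException bounced) {
            return false;
        }
    }
}
```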



On 10/1/16, Brad Johnson <br...@mediadriver.com> wrote:

Re: Limit Concurrent Access

Posted by Brad Johnson <br...@mediadriver.com>.
The first question I'd have is "are you sure you have a problem with the
number of incoming requests?"  One of the biggest problems I find in the
field is premature optimization. If you have a fairly good characterization
of the problem, the number of requests anticipated, the length of time to
process the incoming request, etc. you can set up JMeter to stress test
your application.  That will let you change configuration options in Camel
and see if the response is more in line with what you are expecting.

What exactly are you trying to accomplish by limiting concurrent requests?
What do you want to happen if there are too many requests? Are these
request/responses where you are sending data back after some lengthy
operations, or are you mostly receiving data to be processed and then
sending an "OK" response back?  In the case of the latter you can put the
incoming data on a SEDA queue and immediately return an "OK".  Is it that
the incoming request is resulting in a lot of number crunching, database
calls, or other operations that take too long, and the number of requests
is bogging things down before sending a response back to the user?

Camel has a wide range of components that can provide RESTful APIs.  They
are all going to be a little different in their behavior.  For example, the
Netty component is going to use NIO under the covers to handle incoming
data.
http://camel.apache.org/rest-dsl.html

If you use Jetty you can look at the min and max settings on the thread
pool. Jetty also has continuations, which free up the incoming request
threads and use a callback mechanism to send the response back when the
work is finished.
http://camel.apache.org/jetty.html

But really, a bit more detail and code about the use case and what it is
you're trying to do would be helpful.  Do you want to send an error to the
client if there are too many incoming requests? Why is the number of
concurrent requests a concern?  Is the incoming data arriving in large
chunks that are gobbling up memory, is the processing expensive, or is it
something else?

On Sat, Oct 1, 2016 at 9:43 AM, Debraj Manna <su...@gmail.com>
wrote:
