Posted to dev@jmeter.apache.org by "vlsi (via GitHub)" <gi...@apache.org> on 2023/06/04 07:45:55 UTC

[GitHub] [jmeter] vlsi opened a new issue, #5966: Open Model Thread Group: support concurrency limiting

vlsi opened a new issue, #5966:
URL: https://github.com/apache/jmeter/issues/5966

   ### Use case
   
   Currently OMTG might create an unlimited number of threads if the application processes requests more slowly than JMeter spawns them.
   JMeter might then run out of memory, since every `JMeterThread` uses its own clone of the test plan. We should report the error and fail gracefully rather than crash with `OutOfMemoryError`.
   
   Another use case is to test the system with a limited concurrency level.
   For instance, the system might have a hard limit on the request concurrency (e.g. 10 concurrent requests maximum), and the users might want to test various load levels.
   
   
   ### Possible solution
   
   a) Implement a fixed concurrency limit for OMTG
   b) Implement a variable concurrency limit, expressed like the current `rate(...)` function in the schedule string: `concurrency(10) rate(10/sec) random_arrivals(10 min)`
   
   Open questions
   
   - [ ] What should OMTG do when the concurrency limit is reached? Should it discard the launch immediately, or wait for some time for a thread to become available? (A sketch of both options follows below.)
   - [ ] How do we report that certain launches were delayed due to the concurrency limit?
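   
   As a purely hypothetical illustration of the first open question (the class, method, and parameter names below are made up and do not reflect any existing JMeter code), both behaviours can be expressed with a plain `java.util.concurrent.Semaphore`:
   
   ```java
   import java.util.concurrent.Semaphore;
   import java.util.concurrent.TimeUnit;
   
   // Sketch only: models "discard immediately" (maxWaitMillis = 0) vs
   // "wait for a thread to become available" (maxWaitMillis > 0).
   public class ConcurrencyLimitSketch {
       private final Semaphore permits;
       private final long maxWaitMillis;
   
       public ConcurrencyLimitSketch(int maxConcurrency, long maxWaitMillis) {
           this.permits = new Semaphore(maxConcurrency);
           this.maxWaitMillis = maxWaitMillis;
       }
   
       /** Returns false when the launch had to be dropped because the limit was reached. */
       public boolean tryLaunch(Runnable iteration) throws InterruptedException {
           if (!permits.tryAcquire(maxWaitMillis, TimeUnit.MILLISECONDS)) {
               // This is where a delayed/dropped launch would have to be reported (second open question).
               return false;
           }
           try {
               iteration.run();
           } finally {
               permits.release();
           }
           return true;
       }
   }
   ```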
   
   
   ### Possible workarounds
   
   _No response_
   
   ### JMeter Version
   
   5.5
   
   ### Java Version
   
   _No response_
   
   ### OS Version
   
   _No response_


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscribe@jmeter.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [jmeter] vipergreat commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vipergreat (via GitHub)" <gi...@apache.org>.
vipergreat commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1631751992

   @vlsi The Runtime Controller is working for me in this case. Thank you very much!


[GitHub] [jmeter] vlsi commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vlsi (via GitHub)" <gi...@apache.org>.
vlsi commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630518577

   >Actually we are comparing it with the bzm - Free Form Arrival Thread Group, but I think OMTG is easier to configure.
   
   Glad to hear that.
   
   Have you tried wrapping the test with `Runtime Controller` so it stops the thread if it takes too long?
   https://jmeter.apache.org/usermanual/component_reference.html#Runtime_Controller
   
   It won't really "retry the steps in case of timeout reached", however, it would probably enable you to configure an overall timeout for several steps.
   
   


[GitHub] [jmeter] vlsi commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vlsi (via GitHub)" <gi...@apache.org>.
vlsi commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630172337

   >When they wait too long, they will kill the mobile app and log in again.
   >We cannot limit them. It's not realistic.
   
   You are right that limiting threads is not realistic for systems like `google.com`. However, if the limit is reasonably high, it might be enough to reach the target workload, and at the same time low enough to keep JMeter from failing with `OutOfMemoryError`.
   
   ---
   
   >We cannot limit them. It's not realistic
   
   There are cases where a thread limit is expected for a production system.
   Imagine an internal application with only 100 users. New users won't appear automatically, and the existing ones will not open 20 browser tabs and try to complete their work in all 20 tabs simultaneously.
   So if we set JMeter's limit to 100 and the test shows "not enough threads to reach the desired workload", it effectively means we need to optimize the software or recruit more workers.
   
   ---
   
   "thread lifetime" sounds like an interesting idea.
   The current "thread lifetime" for OMTG is "single execution". In other words, if OMTG is configured for 100 requests, each of them uses its own `JMeterThread` with its own variables and cookies.
   
   However, it looks like "thread lifetime" alone is not enough.
   Imagine the following: `thread_lifetime(20 min) rate(1/min) random_arrivals(1 day)`.
   Suppose JMeter creates the first request, and it completes within seconds. Which thread should be used for the second request? Should JMeter reuse the existing one (it is still available for ~20 minutes) or should it create a new thread?
   It seems we would need `thread_reuse_percentage(42%)` so JMeter would know to reuse one of the existing threads with 42% probability (which one?) and to create a new one with 58% probability.
   
   So the scenario becomes `thread_lifetime(20 min) thread_reuse_percentage(42%) rate(1/min) random_arrivals(1 hour)`.
   An overall concurrency limit would fit nicely there as a precautionary setting to prevent OOM: `thread_lifetime(20 min) thread_reuse_percentage(42%) max_concurrency(100) rate(1/min) random_arrivals(1 day)`.
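   
   To make the per-arrival decision concrete, here is a small, purely illustrative Java sketch; every identifier in it (including `threadReuseProbability` and `maxConcurrency`) is hypothetical and does not correspond to existing OMTG code:
   
   ```java
   import java.util.ArrayDeque;
   import java.util.Deque;
   import java.util.Random;
   
   // Sketch only: how thread_lifetime / thread_reuse_percentage / max_concurrency
   // could interact when the scheduler serves the next arrival.
   public class ThreadReuseSketch {
       private final Deque<Object> idleThreads = new ArrayDeque<>(); // stands in for idle JMeterThreads
       private final Random random = new Random();
       private final double threadReuseProbability; // e.g. 0.42 for thread_reuse_percentage(42%)
       private final int maxConcurrency;            // e.g. max_concurrency(100)
       private int liveThreads;                     // threads created and not yet expired
   
       public ThreadReuseSketch(double threadReuseProbability, int maxConcurrency) {
           this.threadReuseProbability = threadReuseProbability;
           this.maxConcurrency = maxConcurrency;
       }
   
       public synchronized String nextArrival() {
           // With the configured probability, reuse an idle thread if one exists.
           if (!idleThreads.isEmpty() && random.nextDouble() < threadReuseProbability) {
               idleThreads.pop(); // which idle thread to pick is itself an open question
               return "reuse an existing thread (keeps its variables and cookies)";
           }
           // Otherwise create a new thread, unless that would exceed max_concurrency.
           if (liveThreads >= maxConcurrency) {
               return "delay or drop the launch (the OutOfMemoryError precaution)";
           }
           liveThreads++;
           return "create a new JMeterThread with fresh variables and cookies";
       }
   
       public synchronized void threadFinishedIteration(Object thread) {
           idleThreads.push(thread); // stays alive until its thread_lifetime(20 min) elapses
       }
   
       public synchronized void threadExpired() {
           liveThreads--; // called when a thread's lifetime has elapsed
       }
   }
   ```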
   


[GitHub] [jmeter] vipergreat commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vipergreat (via GitHub)" <gi...@apache.org>.
vipergreat commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630415634

   Hi Vlad
   
   I have a use case to share. I'm reproducing a production issue with JMeter, and I think the Open Model Thread Group is the answer.
   Actually we are comparing it with the bzm - Free Form Arrival Thread Group, but I think OMTG is easier to configure.
   We have the following APIs:
   
   1. /api/welcome
   2. /api/login <--- this API was the pain point
   3. /api/verify-input
   4. /api/submit-input <--- we will simulate a response time from 1 sec to 30 sec
   5. /api/logout
   
   We know the customer arrival rate from the load balancer. It's around 10k/min (counting only the first API - welcome).
   On that day, API no. 4 was slow, taking from 1 second up to 30-40 seconds.
   We found that API no. 2 became the pain point, while the other APIs were still fine.
   
   So my hypothesis is that when customers wait too long on API no. 4, they kill the app and try to log in and input their data again and again, so the arrival rate increases to 20-25k/min.
   So the thread lifetime would represent a customer who can't wait any longer and kills the app. The lifetime might be configured to only 1-2 minutes per thread.
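   
   As a rough illustration (the exact durations are just an assumption, and the syntax is the same `rate(...)`/`random_arrivals(...)` form shown earlier in this issue), the normal period could be modelled with a schedule like `rate(10000/min) random_arrivals(30 min)`, and the degraded period with `rate(25000/min) random_arrivals(30 min)` combined with a thread lifetime of 1-2 minutes to represent customers killing the app.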


Re: [I] Open Model Thread Group: support concurrency limiter [jmeter]

Posted by "mkaeufler (via GitHub)" <gi...@apache.org>.
mkaeufler commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1787138439

   I would also highly appreciate a thread limit.
   
   We're testing our production load with thousands of relatively heavy requests. If we reach the point where the system can't keep up, the OMTG will spawn more threads to push the configured throughput, which makes everything worse by putting even more load on an already overloaded system.
   
   I've seen test runs where we had over 60000 threads beating on our already defeated system.
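   
   For a rough sense of scale (the per-thread figure is only an assumption, since it depends on the test plan), if each `JMeterThread` needs even 300 KB for its clone of the test plan plus its stack, 60,000 threads already require about 18 GB, far beyond a typical JMeter heap.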
   


[GitHub] [jmeter] vipergreat commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vipergreat (via GitHub)" <gi...@apache.org>.
vipergreat commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630020860

   Hi
   
   Could we have a thread lifetime option like the one in the standard JMeter Thread Group?
   Because if we specify a concurrency limit, it is not realistic compared to the production issue.
   Imagine the production server being slow. Customers wait too long.
   When they wait too long, they will kill the mobile app and log in again.
   We cannot limit them. It's not realistic.


Re: [I] Open Model Thread Group: support concurrency limiter [jmeter]

Posted by "Akaoni (via GitHub)" <gi...@apache.org>.
Akaoni commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1847960491

   +1 for concurrency limiting.


[GitHub] [jmeter] vipergreat commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vipergreat (via GitHub)" <gi...@apache.org>.
vipergreat commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630520891

   > > Actually we are comparing it with the bzm - Free Form Arrival Thread Group, but I think OMTG is easier to configure.
   > 
   > Glad to hear that.
   > 
   > Have you tried wrapping the test with `Runtime Controller` so it stops the thread if it takes too long? https://jmeter.apache.org/usermanual/component_reference.html#Runtime_Controller
   > 
   > It won't really "retry the steps in case of timeout reached", however, it would probably enable you to configure an overall timeout for several steps.
   
   Interesting! I will try this. Thanks!


[GitHub] [jmeter] vipergreat commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "vipergreat (via GitHub)" <gi...@apache.org>.
vipergreat commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630517615

   > Hi, I don't know if it can help you, but in the HTTP Request sampler you can configure a timeout
   
   Hi
   The default timeout is 30 sec, and this value is acceptable in production.
   But I want to solve the problem of so many threads getting stuck when the API responds slowly, which will cause an `OutOfMemoryError`.


[GitHub] [jmeter] ra0077 commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "ra0077 (via GitHub)" <gi...@apache.org>.
ra0077 commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1583263878

   Hi
   
   I think it's a good idea to have a `concurrency(10)` variable in addition to the others.
   How do you avoid the crash with `OutOfMemoryError` if the concurrency is too high?
   
   About reporting:
   If the reporting is via a Backend Listener, send an event (I don't know if that is always possible).
   If the reporting is via the console log, print a message.
   If the reporting is via the HTML report, it will be more complex. But maybe add another file, in addition to the CSV file, to hold the metadata, and put that metadata in a section at the top of the HTML report.
   
   
   
   


[GitHub] [jmeter] ra0077 commented on issue #5966: Open Model Thread Group: support concurrency limiter

Posted by "ra0077 (via GitHub)" <gi...@apache.org>.
ra0077 commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1630481430

   Hi
   I don't know if it can help you, but in the HTTP Request sampler you can configure a timeout


Re: [I] Open Model Thread Group: support concurrency limiter [jmeter]

Posted by "JJena (via GitHub)" <gi...@apache.org>.
JJena commented on issue #5966:
URL: https://github.com/apache/jmeter/issues/5966#issuecomment-1764768339

   We always run into JMeter heap issues because the open model tries to spawn ~2000 threads, while a Constant Throughput Timer with a regular thread group can achieve the same TPS with fewer than 10 threads. If you could add a limit on the maximum thread count, it would be such a relief.
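   
   For context, the number of in-flight threads needed to sustain a given throughput follows Little's Law: concurrency ≈ arrival rate × response time. With illustrative numbers, 20 requests/second at an average response time of 0.4 s needs only about 8 concurrent threads; if the open model ends up holding ~2000 threads at a comparable rate, each request must be in flight for on the order of 100 seconds, i.e. the system under test is saturated.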

