I have a scenario with three requests, injecting 10 users per second:
constantUsersPerSec(10) during(5 minutes)
With the three one-second pauses, I would expect 10 requests per second. However, I still get 30 RPS, and I wonder what is wrong with my implementation.
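For context, here is a minimal sketch of the kind of scenario I mean. The request names, paths, and base URL are made up for illustration; only the injection profile and pauses match my setup:

```scala
import scala.concurrent.duration._
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ThreeRequestSimulation extends Simulation {

  // Three requests, each followed by a one-second pause
  // (endpoints are hypothetical stand-ins)
  val scn = scenario("Three requests with pauses")
    .exec(http("request 1").get("/first"))
    .pause(1.second)
    .exec(http("request 2").get("/second"))
    .pause(1.second)
    .exec(http("request 3").get("/third"))
    .pause(1.second)

  setUp(
    scn.inject(constantUsersPerSec(10) during (5.minutes))
  ).protocols(http.baseUrl("http://example.com"))
}
```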
10 users are starting every second.
Those 10 users are doing one request during the first second, and then pausing.
During the second second, those 10 users are doing their second request, and then pausing.
In the meantime, the next set of 10 users is starting and doing their first requests.
So during the second second, you are seeing a total of 20 requests.
During the third second, the first set of 10 are doing their third request,
the second set of 10 are doing their second request,
and another set of 10 are starting up, and doing their first.
This repeats for all 300 seconds.
Total simulation duration is approximately 302 seconds.
See why your average throughput is 30 requests per second?
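The per-second breakdown above is just arithmetic: once the first two seconds have passed, three cohorts of 10 users overlap at any given second, each making one request. A worked sketch of the numbers:

```scala
// Steady state: 3 overlapping cohorts of 10 users,
// each user making 1 request per second
val usersPerSec = 10
val requestsPerUser = 3
val steadyStateRps = usersPerSec * requestsPerUser // 30 requests/second

// Over the whole run: 300 seconds of injection,
// each injected user eventually makes 3 requests
val totalRequests = 300 * usersPerSec * requestsPerUser // 9000 requests
// 9000 requests over roughly 300 seconds ≈ 30 RPS on average
```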
That took a while to get my head around, but it now makes perfect sense.
I wonder what others do when the requirement is 100 requests per second but the scenario has only 2 requests. Is the idiom just to divide 100 by 2 and run with a constant rate of 50 users per second?
One option is to throttle. Another is to reduce your constantUsersPerSec. The third is to question your requirements…
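For the throttling option, Gatling's throttle DSL caps the effective request rate regardless of how many users are injected. A sketch, assuming `scn` and `httpProtocol` are your existing scenario and protocol definitions, and with arbitrary ramp/hold durations:

```scala
setUp(
  scn.inject(constantUsersPerSec(50) during (5.minutes))
).throttle(
  reachRps(100) in (10.seconds), // ramp up to the 100 RPS target
  holdFor(5.minutes)             // then hold it for the rest of the run
).protocols(httpProtocol)
```

Note that throttling only caps throughput; it does not generate load by itself, so the injection profile still has to produce at least the target rate.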