Hi,
I do hope this is a stupid question, but I couldn’t figure out how to do it easily.
I have a system that should sustain, let’s say, 1000 active users. My users come to the site, browse 2-3 pages and then leave.
I don’t want to think in terms of RPS; I want to think in terms of users.
The only thing I can do so far is experiment with rampUsersPerSec until I find the rate that gives the right number of active users.
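For example, something along these lines (the arrival rate is just a guess I keep adjusting by hand, and the base URL, page paths and class name are placeholders):

import scala.concurrent.duration._
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ActiveUsersGuessSimulation extends Simulation {

  val httpConf = http.baseURL("http://example.com")

  // One visit: a user browses 2-3 pages and then leaves
  val scn = scenario("Browse")
    .exec(http("page 1").get("/"))
    .pause(5)
    .exec(http("page 2").get("/some-page"))

  setUp(
    scn.inject(
      // Guessed rate: roughly, active users ≈ arrival rate x visit duration,
      // so I keep tweaking it until the reports show ~1000 concurrent users
      rampUsersPerSec(1) to 33 during (15 minutes),
      constantUsersPerSec(33) during (15 minutes)
    )
  ).protocols(httpConf)
}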
What I would like is:
setUp(scn.inject(constantUsersPerSec(100) during (30 minutes))).throttle(
  reachActiveUsers(1000) in (15 minutes),
  holdFor(15 minutes)
)
This seems like a classic scenario. Is it possible to do something equivalent?
Thanks,
Henri
Hi, Henri!
Have you found the answer?
I have exactly the same issue.
I can use Apache Bench like this:
$ ab -k -c 1000 -n 1000000 "http://10.20.30.40/id1/id2?q1=v1"
Requests per second: 5424.66 [#/sec] (mean)
Time per request: 184.343 [ms] (mean)
Time per request: 0.184 [ms] (mean, across all concurrent requests)
But Apache Bench is not “smart enough” to parameterize IDs in the requests.
So I wanted to use Gatling, but I can’t find a way to measure RPS under a given user load. It seems to be designed only to measure site performance under a given RPS load.
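For the parameterization part, something like this is what I have in mind — the feeder file name and column names are made up, and this only covers injecting the IDs, not the load itself:

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ParameterizedIdsSimulation extends Simulation {

  val httpConf = http.baseURL("http://10.20.30.40")

  // ids.csv would have a header line "id1,id2,v1" and one row per value combination
  val ids = csv("ids.csv").circular

  // Each user takes the next row from the feeder and Gatling's EL fills in the URL
  val scn = scenario("Parameterized request")
    .feed(ids)
    .exec(http("request").get("/${id1}/${id2}?q1=${v1}"))

  setUp(
    scn.inject(atOnceUsers(1000)) // same concurrency as ab -c 1000
  ).protocols(httpConf)
}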
I’m trying different ways of doing that.
I was thinking that my code below would achieve that goal (50 users sending 80 requests each), but it only sends 50 requests.
package commonCollection

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ScsPerformanceSimulation extends Simulation {

  val scsUsers = 50
  val scsReqPerUser = 80

  val scs = exec(http("Test").get("/"))

  val httpConf = http
    .baseURL("http://example.com")

  val scn = scenario("Performance Test")
    .repeat(scsReqPerUser) {
      exec(scs)
    }

  setUp(
    scn.inject(atOnceUsers(scsUsers))
  ).protocols(httpConf)
}
I'm using Gatling 2.2.5.
What am I not doing right? How can I have X users each sending Y requests, without any relation to the duration?
OK, I figured it out…
It looks like the repeated GET requests were being served from each virtual user’s local cache, so only the first request per user was actually sent. I had to disable caching:
val httpConf = http
  .baseURL("http://example.com")
  .disableCaching
Now, I’m getting 4000 requests.
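Putting it all together, the working simulation looks like this:

package commonCollection

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class ScsPerformanceSimulation extends Simulation {

  val scsUsers = 50
  val scsReqPerUser = 80

  // Caching disabled so every repeated GET actually goes to the server
  val httpConf = http
    .baseURL("http://example.com")
    .disableCaching

  val scs = exec(http("Test").get("/"))

  val scn = scenario("Performance Test")
    .repeat(scsReqPerUser) {
      exec(scs)
    }

  // 50 users x 80 requests each = 4000 requests in total
  setUp(
    scn.inject(atOnceUsers(scsUsers))
  ).protocols(httpConf)
}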