I have a single long-running task that I am trying to time with Gatling. I am currently doing this by:
- Kicking off the task through an HTTP PUT request
- Using a Gatling "group" function to poll the task until it returns a JSON response indicating the task is finished.
The sample code below shows what I am trying to do:
exec(http("Start job")                          // 1) "exec" to kick off the task
  .put("putURL")
  .body(StringBody("{json.body}"))
  .check(status.is(202)))
.group("Want to get cumulative time of this group job in the Gatling report") { // 2) poll until the response JSON indicates the task has been completed
  asLongAs(session => session("jobCompleted").asOption[String].getOrElse("") != "SuccessValue") {
    pause(5).exec(
      http("Get task status")
        .get("taskURI")
        .check(status.is(200))
        .check(jsonPath("$.jobCompleted").saveAs("jobCompleted"))
    )
  }
}
When I run this code, I have found that the "group" function only aggregates the response time of each individual http("Get task status") request. For example, if this code runs for 15 seconds, the output I see is:
5 seconds in: exec(http("Get task status")) = 100ms [failed request]
10 seconds in: exec(http("Get task status")) = 95ms [failed request]
15 seconds in: exec(http("Get task status")) = 105ms [successful request]
In the Gatling report, the group I have specified shows one request with a cumulative time of 300ms, rather than 15 seconds, because it does not include any of the 5-second pauses.
I have read the docs and understand that this is the intended behavior: http://gatling.io/docs/2.2.2/general/timings.html
So my question is: is there an easy way in Gatling to get the "cumulative" time of a long-running task, as I am trying to do above?
From what I can tell, my best option is to parse the "simulation.log" file to get the begin and end times of this group. Is there a better way to do this? If there is another function I can use entirely (other than "group"), I am open to doing so.
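For reference, the post-processing I have in mind would be something like the sketch below: it scans simulation.log for GROUP records matching my group name and computes exit timestamp minus entry timestamp as the wall-clock duration. The column layout (record type in the 3rd tab-separated field, entry/exit epoch millis in the 5th/6th) is an assumption based on what my Gatling 2.2 log looks like, and the file path is a placeholder — anyone using this would need to verify it against their own simulation.log:

```scala
import scala.io.Source

object GroupTiming {
  // ASSUMED record layout (verify against your own simulation.log, it varies by version):
  //   scenario TAB userId TAB GROUP TAB groupName TAB entryTs TAB exitTs TAB cumulatedResponseTime TAB status
  def groupDurations(lines: Iterator[String], groupName: String): List[Long] =
    lines.map(_.split("\t"))
      .filter(f => f.length >= 6 && f(2) == "GROUP" && f(3) == groupName)
      .map(f => f(5).toLong - f(4).toLong) // exit - entry = wall-clock millis, pauses included
      .toList

  def main(args: Array[String]): Unit = {
    // "simulation.log" is a placeholder path to the results file of one run
    val lines = Source.fromFile("simulation.log").getLines()
    groupDurations(lines, "Want to get cumulative time of this group job in the Gatling report")
      .foreach(d => println(s"group took ${d}ms wall-clock"))
  }
}
```

With the 15-second example above, this would report roughly 15000ms for the group instead of the 300ms of summed response times.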
Thanks!
Bert