Is there a way to calculate start and end time in simulation?

Hi Everyone,

So in our app we use OpenTracing, and I want to send a request (span) carrying the total duration of a Gatling group so that I can visualise the data in Grafana based on this span tag.

The main thing I am trying to achieve is to send one final request, once all the requests in the group have completed, carrying the total duration Gatling took from the first request to the last. To do this, for now I manually capture the start and end times as epoch milliseconds, subtract them, and send the result with the last request.
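Roughly, my current setup looks like this (the group name, endpoints, base URL, and the final span request are simplified placeholders):

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class TracedGroupSimulation extends Simulation {

  val httpProtocol = http.baseUrl("http://example.com") // placeholder base URL

  val scn = scenario("Traced group")
    // capture the start time just before the group begins
    .exec(session => session.set("startTs", System.currentTimeMillis()))
    .group("checkout") {
      exec(http("request_1").get("/step1"))
        .exec(http("request_2").get("/step2"))
    }
    // capture the end time and compute the total duration in ms
    .exec { session =>
      val durationMs = System.currentTimeMillis() - session("startTs").as[Long]
      session.set("spanDurationMs", durationMs)
    }
    // send the computed duration with the final "span" request
    .exec(
      http("span")
        .post("/traces")
        .header("X-Span-Duration", "#{spanDurationMs}")
    )

  setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}
```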

But my calculated value is around 200-300 ms over the final value reported in Gatling's HTML report. For example, if the difference between my captured start and end times is 3000 ms, the group duration shown in the report is only 2800 ms.

So I have two questions:

  1. Is there a way I can capture a group's total duration directly (programmatically) from the session or another Gatling variable?
  2. If Gatling doesn't provide this information at run time, could anyone suggest a better way to manually capture the start and end times and subtract them to find the total duration correctly? (It should equal Gatling's total duration, without the extra 200-300 ms.)

Your help is much appreciated, thank you in advance.

Hello Raghvendra,

I understand the issue you're running into here, and it can be a bit tedious. With Gatling Enterprise we actually have a Grafana data source integration that uses our API and can automatically push all of your data for a test run to Grafana, including the run time. If you're interested in giving that a shot, we can set you up with a trial.

With OSS, however, as far as I know it has to be done manually, the way you're currently doing it.

All the best,
Pete

Thank you for your quick response @GatlingPete. I certainly understand that Gatling Enterprise would help me here, but unfortunately we'll have to stick to OSS.
And just to clarify: as you stated, what I am doing manually is the way to go. I just wanted help understanding why there is a difference of 200-300 ms, i.e. why manually capturing the start and end times results in that extra time. Is there a technical concept in play that you could help me understand? Maybe what I am doing is not how Gatling captures start and end times? Any explanation of the time difference would be really helpful :slight_smile:

I suppose you have separate blocks to save the start and end timestamps. The duration shouldn't be different from (max of end) minus (min of start).

Also, if capturing the duration post-run from the simulation.log instead of live works for you, then try doing that.

In our case we want the response time summary in a particular format, with status-wise aggregation and complete transaction names, hence we use a shell function to parse the generated simulation.log file.
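As a very rough illustration of the same idea in Scala (the text simulation.log layout varies between Gatling versions, and newer versions use a binary format, so the GROUP column positions and the file path below are assumptions you'd need to verify against your own log):

```scala
import scala.io.Source

// Sketch: derive a group's wall-clock duration from a text-format simulation.log.
// ASSUMPTION: GROUP records are tab-separated, with cols(3) = start timestamp
// and cols(4) = end timestamp in epoch milliseconds; verify against your log.
object GroupDurationFromLog {
  def main(args: Array[String]): Unit = {
    val source = Source.fromFile("results/mysim/simulation.log") // placeholder path
    try {
      val timestamps = source
        .getLines()
        .filter(_.startsWith("GROUP"))
        .map(_.split("\t"))
        .collect { case cols if cols.length >= 5 =>
          (cols(3).toLong, cols(4).toLong) // (start, end), assumed positions
        }
        .toList

      if (timestamps.nonEmpty) {
        // total duration = (max of end) minus (min of start)
        val durationMs = timestamps.map(_._2).max - timestamps.map(_._1).min
        println(s"Group duration: $durationMs ms")
      }
    } finally source.close()
  }
}
```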
