Use Gatling for integration testing

Hi everybody,

I found nothing on this subject, so I'm creating a post! Sorry if this is a duplicate.

We started to use Gatling to test our Java project. We defined some simulations and we’d like to know if there is a way to use those simulations as integration tests.
There are already integration tests in our project, so we can launch our webapp in memory during JUnit tests.

What we want from Gatling is to run a simulation inside a JUnit test and assert that it ran well.
Can you advise us on how to proceed? We'd be happy to contribute this feature to the project.

In any case, Gatling is an awesome tool! We're enjoying discovering it, thanks a lot!

Hi Thomas,

Actually, the best way to run Gatling integration tests would be to use the upcoming Jenkins plugin.
Do you use Jenkins?

2012/10/2 Thomas Hilaire <>

Hello Stéphane,

thanks for your quick reply.

Yes, we use Jenkins, and using the upcoming plugin would be a good way to achieve integration tests.
But the best thing would be to be able to run those tests before pushing to Jenkins — with Eclipse and Maven in our case.
Do you think that is workable? We can work on making it possible!

Actually, there have already been some discussions about this:

The main problem is that you have to define "it ran well". This is a user-dependent definition, meaning it could mean "no failed requests", "average below a given value", "95th percentile below a given value"…

This implies that:

  • We'd have to add an API for specifying success criteria
  • We'd have to add something for matching those criteria against the consolidated results generated in the reports
  • We'd have to make Gatling return the result (a boolean, a message?), which is currently not the case

I think the best way would be to have a wrapper around Gatling.
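The bullet points above could be sketched as a tiny API. Everything here is hypothetical — RunStats, the predicate helpers, and the sample numbers are illustrations for discussion, not existing Gatling classes:

```java
import java.util.function.Predicate;

public class CriteriaDemo {

    // Consolidated results, as a reports module might expose them (hypothetical).
    static class RunStats {
        final int failedRequests;
        final double meanMs;
        final double percentile95Ms;
        RunStats(int failedRequests, double meanMs, double percentile95Ms) {
            this.failedRequests = failedRequests;
            this.meanMs = meanMs;
            this.percentile95Ms = percentile95Ms;
        }
    }

    // Each user composes their own definition of "it ran well".
    static Predicate<RunStats> noFailedRequests() { return s -> s.failedRequests == 0; }
    static Predicate<RunStats> meanBelow(double ms) { return s -> s.meanMs < ms; }
    static Predicate<RunStats> p95Below(double ms)  { return s -> s.percentile95Ms < ms; }

    public static void main(String[] args) {
        RunStats stats = new RunStats(0, 120.0, 480.0);
        Predicate<RunStats> ranWell =
                noFailedRequests().and(meanBelow(200)).and(p95Below(500));
        System.out.println(ranWell.test(stats)); // true for these sample numbers
    }
}
```

The wrapper would then just evaluate the composed predicate against the consolidated stats and turn the boolean into a test failure.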

As you can see, there’s a bit of work here… :wink:

BTW, do you know that you can already launch Gatling from your IDE (even though you don’t have the success criteria)? See the maven archetype.

2012/10/2 Thomas Hilaire <>

Today we build our scenarios manually with an HTTP library to simulate the client.
We’d like to replace the client part by Gatling, without any performance aspect.

Concretely, what we want to do during a JUnit test is:

  1. Launch our webapp in memory, with our mocks injected into it
  2. Run a Gatling simulation to emulate the client
  3. Verify the calls to our mocks to assert the test passed

…and we already know how to do steps 1 and 3!
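As a rough, self-contained illustration of those three steps using only the JDK's built-in HttpServer: the plain HttpURLConnection request in step 2 is a stand-in for the Gatling simulation, and the call counter is a stand-in for real mock verification — none of this is Gatling-specific:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.util.concurrent.atomic.AtomicInteger;

public class InMemoryIntegrationTest {

    // Stand-in for a mocked collaborator: records how often it was called.
    static AtomicInteger mockCalls = new AtomicInteger();

    public static void main(String[] args) throws Exception {
        // Step 1: launch the "webapp" in memory, with the mock wired in.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/api", exchange -> {
            mockCalls.incrementAndGet();
            byte[] body = "ok".getBytes("UTF-8");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        int port = server.getAddress().getPort();

        // Step 2: run the client; in the real setup this would be the Gatling simulation.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://localhost:" + port + "/api").openConnection();
        int status = conn.getResponseCode();
        conn.disconnect();
        server.stop(0);

        // Step 3: verify the mock was called as expected.
        if (status != 200 || mockCalls.get() != 1) {
            throw new AssertionError("unexpected: status=" + status + " calls=" + mockCalls);
        }
        System.out.println("status=" + status + " calls=" + mockCalls.get());
    }
}
```

Replacing step 2 with a Gatling launch is exactly the missing piece discussed below.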

By "it ran well", I meant that the simulation runner lets us know whether a fatal error was triggered on the client side.
This step is optional since we'll verify our mocks, but it would be best!

So we're only looking for a simple "Engine" that only knows how to run a simulation, without any "success criteria", "reports" or "result" aspects.
The com.excilys.ebi.gatling.core.runner.Runner looks tied to the reports aspect; do you think we could easily implement our own and discard all the reports and implicit error-handling concerns?


That’s actually quite easy (honestly)!

First, there's an option named -nr (no reports) for skipping report generation:

The Gatling entry point is the main method (the usual Java one that takes a String array parameter) of the companion object.

In Java, you can access it from$.MODULE$

So, if you could write something like:

String[] parameters = new String[] {"-nr", "-sf", "pathOfYourSimulations", …};
$.MODULE$.main(parameters);

Please have a look at the documentation above to find out the other options you might have to specify.
I recommend you have a look at the maven archetype:

Hope it helps!
It would be very kind if you could send feedback once you’re done.



Great!
Would you be interested if we developed something to let exceptions bubble up to the caller?
For example, a special exception like "FatalClientException" that would not be handled by Gatling.
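A minimal sketch of that idea — FatalClientException is purely hypothetical, not an existing Gatling class. Ordinary client failures are caught (as a stress tool must do), while the fatal one, being uncaught, bubbles up to the caller:

```java
public class FatalClientDemo {

    // Hypothetical exception that the runner would deliberately NOT trap.
    static class FatalClientException extends RuntimeException {
        FatalClientException(String message) { super(message); }
    }

    // Stand-in for a simulation step: ordinary request failures are swallowed
    // and recorded as KO, but a fatal error escapes to the caller.
    static String runStep(boolean fatal) {
        try {
            if (fatal) throw new FatalClientException("client broke the scenario");
            throw new java.net.ConnectException("Connection refused"); // a "normal" failure
        } catch (java.io.IOException e) {
            return "trapped: " + e.getMessage(); // run continues
        }
        // FatalClientException is unchecked and uncaught here: it propagates.
    }

    public static void main(String[] args) {
        System.out.println(runStep(false));
        try {
            runStep(true);
        } catch (FatalClientException e) {
            System.out.println("propagated: " + e.getMessage());
        }
    }
}
```

A JUnit wrapper around the engine could then simply let such an exception fail the test.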

I haven't planned when I'll do this task, but I'll be sure to give feedback.

Thanks for your help,

Actually, when I said that Gatling doesn't return anything to the caller, I meant as an output value.
Exceptions that happen in the Runner, like compilation errors, do get sent up the stack.

Exceptions that do get trapped (and always will be) are the ones that happen during the simulation, like a Connection Refused or a Connection Timeout: from a stress-test perspective, these are perfectly acceptable situations.

What we might do someday is introduce a success-criteria API and move the JSON file parsing from the Jenkins plugin into the core.


