My team is currently using JMeter quite heavily, but we are outgrowing it, and I am evaluating Gatling. Our JMeter scripts are mainly functional tests of the APIs (one user, one loop) that are executed via a Maven test target.
I was wondering if anyone here could provide some anecdotes for using Gatling in a functional testing manner. Has anyone wrapped their scenarios with something like JUnit so that suites of test scenarios can be run on the continuous integration builds?
Ideally, these same scenarios would later be converted into load tests by passing in the number of users via JAVA_OPTS, and could be executed directly (not via JUnit).
There are actually two ways of having integration tests with Gatling:
- Integrate Gatling with Jenkins (if you have one of course)
- Integrate Gatling with a test framework such as JUnit
Regarding 1: there’s already a Jenkins plugin. It’s functional, but we haven’t released it yet because we might first refactor things for reasons I’ll explain below.
You can find the code here:
Regarding 2: Gatling currently doesn’t return a result or throw an exception and doesn’t provide an API for setting acceptance criteria. An embryo of such an API has been introduced in the Jenkins plugin, but we’re considering moving it into Gatling core.
If you’re in a hurry, I’d advise you to go with the Jenkins plugin if you can.
2012/10/19 Stephen Ash <firstname.lastname@example.org>
Thanks for the pointer to the Jenkins plugin. I’ll try using that and see how far I get.
Feel free to ask if you have any questions.
2012/10/22 Stephen Ash <email@example.com>
Any progress on this?
It would be nice to run Gatling inside JUnit or some Scala test framework and be able to assert against various success criteria.
What about these?
Could they integrate with the above, so that when Gatling fails those assertions, the unit/integration test fails as well?
The API Stéphane was talking about is the Assertions API, and if you’re running Gatling with the Maven plugin, the build automatically fails in case of failed assertions.
You can therefore use Gatling for functional testing: just set up the Maven plugin in your build, add some assertions to your simulation, and you’re good to go.
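For reference, a minimal sketch of what the Maven plugin setup might look like in a pom.xml. The coordinates, goal, and phase shown here are assumptions based on the Gatling 1.x era of this thread and may differ in your version, so check them against the plugin’s documentation:

```xml
<!-- Sketch only: groupId, artifactId, phase and goal names are assumptions;
     verify against the gatling-maven-plugin documentation for your version -->
<plugin>
  <groupId>com.excilys.ebi.gatling</groupId>
  <artifactId>gatling-maven-plugin</artifactId>
  <version><!-- your Gatling version --></version>
  <executions>
    <execution>
      <phase>test</phase>
      <goals>
        <goal>execute</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn test` runs the simulations, and failed assertions fail the build.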
Ah, thanks for that.
No, I’m currently running Gatling inside Eclipse via the Maven archetype, i.e. Engine.
Is there any way of getting at those assertion errors from Engine or similar? I see that the last thing it does is run Gatling.fromMap(Map[String, Any]), returning an Int.
Where are the actual failed assertions stored?
Then I could call Gatling inside one of my tests, throwing some assertion containing the Gatling failed assertions.
The integer returned by fromMap is the exit code, and there are currently 3 exit codes:
- 0 => everything went well: no incorrect command line arguments, all assertions are OK
- 1 => parsing of command line arguments failed
- 2 => there were some failed assertions
So, basically, you’re interested in simulations exiting with exit code 2.
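Given those exit codes, one way to bridge into JUnit (as asked earlier in the thread) is to map the Int returned by Gatling.fromMap to a test failure. The sketch below shows only that mapping; `assertSimulationPassed` is a hypothetical helper name, and the call to Gatling itself is elided:

```java
// Sketch: map Gatling's exit code to a JUnit-style failure.
// "assertSimulationPassed" is a hypothetical helper; in a real test you
// would pass it the int returned by Gatling.fromMap(...).
public class GatlingExitCodes {
    public static void assertSimulationPassed(int exitCode) {
        switch (exitCode) {
            case 0: // everything went well: arguments parsed, assertions OK
                return;
            case 1:
                throw new IllegalArgumentException(
                    "Gatling: parsing of command line arguments failed");
            case 2:
                throw new AssertionError(
                    "Gatling: there were some failed assertions");
            default:
                throw new IllegalStateException(
                    "Gatling: unexpected exit code " + exitCode);
        }
    }

    public static void main(String[] args) {
        assertSimulationPassed(0); // passes silently
        try {
            assertSimulationPassed(2);
        } catch (AssertionError expected) {
            System.out.println("exit code 2 -> " + expected.getMessage());
        }
    }
}
```

A thrown AssertionError is enough to make a wrapping JUnit test (and thus the CI build) go red.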
Assertions are stored in the simulation, and are accessible using the assertions method (which returns a Seq[Assertion]).
However, with the current implementation of assertion checking, the result is not stored anywhere: an Assertion only stores the assertion itself and the message to display.
At the end of the simulation, Gatling checks the assertions against simulations.log’s data, prints the result on the standard output, and only gets back a boolean telling whether all the assertions are OK or not.
To sum up, you can’t currently get individual results for each assertion, only a global result.
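To illustrate that limitation, here is a rough sketch (illustrative names only, not Gatling’s actual internals) of how per-assertion results get printed as they are checked while only a single global boolean survives:

```java
import java.util.List;

// Illustrative sketch, not Gatling's internals: checking prints each
// assertion's result, then reduces everything to one global boolean.
public class AssertionCheck {
    // message to display + whether the assertion held against the logged data
    static final class Checked {
        final String message;
        final boolean ok;
        Checked(String message, boolean ok) { this.message = message; this.ok = ok; }
    }

    static boolean checkAll(List<Checked> results) {
        boolean allOk = true;
        for (Checked c : results) {
            System.out.println(c.message + ": " + (c.ok ? "OK" : "KO"));
            allOk = allOk && c.ok; // individual results are discarded here
        }
        return allOk; // only this global result is reported back
    }

    public static void main(String[] args) {
        boolean ok = checkAll(List.of(
            new Checked("global response time < 1000 ms", true),
            new Checked("failed requests count == 0", false)));
        System.out.println("global result: " + ok);
    }
}
```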