Fail build on failed check

Hi,

Is it possible to fail the Maven build (I use gatling-maven-plugin 1.1.5) when a request fails during test execution?
Example: `[…].check(status.is(200))` and the server responds with a 500 code.

Like the Maven test phase when a JUnit test fails.

Thanks,
Cyril.

Hi Cyril,

I’ve not used the maven plugin yet, but there seems to be a ‘failOnError’ configuration option.

Maybe setting this to true (or false) can achieve what you want to do :)
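For reference, this is roughly what that would look like in the POM. A sketch only: it assumes the option is spelled `failOnError` as mentioned above, and the `groupId`/`version` match the 1.1.5 plugin being used.

```xml
<plugin>
  <groupId>com.excilys.ebi.gatling</groupId>
  <artifactId>gatling-maven-plugin</artifactId>
  <version>1.1.5</version>
  <configuration>
    <!-- assumption: make the plugin fail the build on error -->
    <failOnError>true</failOnError>
  </configuration>
</plugin>
```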

Cheers,
Romain

I’ve just had a look at the code and I’m pretty sure failOnError doesn’t fail the build when a request fails, but only when the engine fails to start or crashes.
Anyway, performance tests are similar to integration tests, not unit tests, so the build shouldn’t fail.
That’s exactly why the maven-failsafe-plugin was developed.
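For comparison, this is the standard maven-failsafe-plugin setup: the `integration-test` goal runs the tests without immediately breaking the build, and the `verify` goal fails the build afterwards, once teardown has had a chance to run.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- integration-test records failures; verify fails the build
             later, after post-integration-test teardown has run -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```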

Cheers,

Steph

2012/5/10 Romain Sertelon <bluepyth@gmail.com>

OK, thanks Stéphane and Romain for your answers.
I understand the philosophy, so let me rephrase my question (it was badly worded):

How can a stress test regression or failure be detected automatically in continuous integration, without checking the Gatling reports manually?

Describe “fail”: failing checks, worse response times?

What we intend to develop is database storage for simulation results, with the ability to compare results, but I don’t think we’ll be able to work on this before the end of 2012.

Steph

2012/5/10 Cyril <cyril.couturier@novapost.fr>

Yes: failing checks, worse response times, or any other element about which, as a human report reader, I could say:
“Oh no, this stress test detected an anomaly; my application fails to serve as it should. This criterion is important for the business and it is not met; maybe a new commit broke something.”

Examples in the context of stress testing: the max duration of the simulation is over a threshold, a particular response code is incorrect, too many failed requests…
Criteria so important that, if not met, the stress test can be considered unsuccessful.
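In the meantime, one way to approximate this is a post-run script that scans Gatling’s simulation.log and exits non-zero so CI fails the job. This is a hypothetical sketch: the tab-separated layout and the OK/KO status field are assumptions about the 1.x log format, and the threshold values are purely illustrative.

```python
import sys


def count_results(log_path):
    """Count OK/KO request records in a Gatling simulation.log.

    Assumption: each request line is tab-separated and contains an
    'OK' or 'KO' status field (as in the Gatling 1.x log format).
    """
    ok = ko = 0
    with open(log_path) as log:
        for line in log:
            fields = line.rstrip("\n").split("\t")
            if "KO" in fields:
                ko += 1
            elif "OK" in fields:
                ok += 1
    return ok, ko


def check_thresholds(ok, ko, max_failed_ratio=0.01):
    """Return True when the run satisfies the (illustrative) criteria."""
    total = ok + ko
    if total == 0:
        return False  # no requests at all is suspicious
    return ko / float(total) <= max_failed_ratio


if __name__ == "__main__" and len(sys.argv) > 1:
    ok, ko = count_results(sys.argv[1])
    if not check_thresholds(ok, ko):
        print("Stress test failed: %d KO out of %d requests" % (ko, ok + ko))
        sys.exit(1)  # non-zero exit fails the CI build
```

Run after the Gatling Maven goal, e.g. `python check_run.py target/gatling/results/run-xxx/simulation.log`; max duration or per-request-code criteria could be added to `check_thresholds` the same way.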

The check API is very good and easy to use, but (I think) it is a microscopic view. Is it possible to use checks at the simulation level?

As you realized, the meaning of “my test has an anomaly” is very complex and variable, so we’d need a real API for people to define it.

Statistics are only computed by the charts API once the simulation is finished, so this has nothing to do with the simulation configuration.
I haven’t figured out where to place such an API yet.

2012/5/10 Cyril <cyril.couturier@novapost.fr>