soak test


I would like to run the soak test over the weekend. Is there any known limit on the size of the simulation.log? I am concerned the file will get too large and I may lose the results at the end. I am using Gatling version 2.1.6. Any suggestions on what I can do as preparation to minimise the risk of that happening are welcome.


Actually, I don’t have an answer for this, but I would like to second the request. In particular, is there a way to reduce the logging interval for the simulation.log? I fear the file will become too large to parse.

The size of the file depends on the number of requests you’ll be logging, so it really depends on your use case.
You can easily extrapolate from your standard usage.

Then, yes, if the file is humongous, you’ll need a big heap.
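Concretely, something like the fragment below, though the exact mechanism is an assumption to check against your own install: some versions of bin/gatling.sh hardcode JAVA_OPTS inside the script, in which case you edit that line rather than exporting a variable, and the sizes shown are hypothetical.

```shell
# Hypothetical heap sizes -- scale -Xmx to your log.
# Check bin/gatling.sh: some versions define JAVA_OPTS internally,
# in which case edit the script's own line instead of exporting.
JAVA_OPTS="-Xms2g -Xmx8g"
export JAVA_OPTS
# Then regenerate the report from the existing log in reports-only mode:
#   bin/gatling.sh -ro <your-results-folder>
```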

Thank you. I was hoping that, just as you can define sampling intervals in tools like vmstat or iostat, you could do the same for the simulation log so that it remains manageable.


Most likely, the size of the simulation.log doesn’t matter until you try to generate results. Then, if you create a big enough heap, you should be okay. If that doesn’t work, you can try intelligently dividing up the simulation log and generating reports on subsets of it. Stéphane, is there anything one would have to be careful about when taking a subset of a simulation log for reporting purposes?
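To sketch what I mean by dividing it up, something like the script below. Two assumptions to verify against your own file and Gatling version: that the first line of simulation.log is a header record the report parser expects at the top of every log, and that a chunk of CHUNK_LINES lines fits in your heap.

```shell
#!/bin/sh
set -e

# Build a tiny synthetic log so the sketch runs as-is;
# point LOG at your real simulation.log instead.
printf 'RUN\theader-record\n' > simulation.log
for i in 1 2 3 4 5 6; do
  printf 'REQUEST\trecord-%s\tOK\n' "$i" >> simulation.log
done

LOG=simulation.log
CHUNK_LINES=2   # tune to what your heap can handle (millions, in practice)

# Keep the header line, split the body, then re-attach the header to
# each piece so every chunk is a self-contained log.
head -n 1 "$LOG" > header.tmp
tail -n +2 "$LOG" | split -l "$CHUNK_LINES" - chunk_
for c in chunk_*; do
  mkdir -p "results/$c"
  cat header.tmp "$c" > "results/$c/simulation.log"
  rm "$c"
done
rm header.tmp

# Then generate one report per chunk in reports-only mode, e.g.:
#   bin/gatling.sh -ro results/chunk_aa
ls results
```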

The load that I am generating results in a 493 MB simulation log in 7 minutes. If I continue to generate the same load, that roughly means 4.2 GB per hour, so a 10-hour soak test will produce a file of roughly 42 GB. Splitting it will require quite a bit of resources in itself, but what heap size would be sufficient to parse it?

Another enhancement request that comes to mind is the ability to write and parse a compressed simulation log. I think this feature would come in handy, since I gzip each simulation.log to archive it anyway.
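For the archive side, this is the workflow I use today, sketched with a synthetic log so it runs as-is: compress only after the run and the report are done, and decompress again before handing the log back to the report generator, since it reads uncompressed files.

```shell
#!/bin/sh
set -e

# Synthetic stand-in for a real simulation.log so the sketch runs as-is.
for i in $(seq 1 1000); do
  printf 'REQUEST\tuser-%s\trequest_1\tOK\t1432\n' "$i"
done > simulation.log
cp simulation.log simulation.log.orig

# Archive: tab-separated logs are highly repetitive and usually
# compress very well.
gzip -9 simulation.log                 # replaces it with simulation.log.gz

# Restore before generating a report, since the standard report
# module only reads uncompressed files.
gunzip simulation.log.gz               # simulation.log is back
cmp simulation.log simulation.log.orig && echo "round-trip OK"
```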


Gatling’s standard report module won’t be able to parse such a file, sorry.
FYI, we’re working on a commercial product on top of Gatling for which such amounts of data will be a piece of cake.