My problem is very simple to describe, but I can’t find the cause.
I have big issues with a website. To start simple, I’ve built a linear injection profile (adding more users at each step) with a scenario that does:
exec(login)
  .exec(Cart)
  .repeat(4000000) {
    exec(addProduct)        // AJAX request in the cart
      .pause(1)
      .exec(removeProduct)  // AJAX request in the cart
      .pause(1)
  }
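For reference, the injection side looks something like the sketch below (inside the Simulation class, with scn being the scenario above). The step count and durations are placeholders, not my real values, and the incrementConcurrentUsers DSL is Gatling 3 syntax:

import scala.concurrent.duration._
import io.gatling.core.Predef._

// Sketch of the stepped injection profile (placeholder numbers):
// start with 5 concurrent users and add 5 more at each of 20 steps,
// holding each level for 2 minutes.
setUp(
  scn.inject(
    incrementConcurrentUsers(5)
      .times(20)
      .eachLevelLasting(2.minutes)
      .startingFrom(5)
  )
)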
In terms of requests per second, I expected the result to be very linear, but it is not.
From step 1 to step 9 it’s chaos; from step 10 to step 20 it’s linear and smooth, as expected.
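For what it’s worth, I know the fixed pause(1) couples each user’s request rate to the response times (one loop iteration takes the two response times plus 2 seconds of pauses). A variant using pace would pin each iteration to a fixed duration instead; this is just a sketch to illustrate, not what I’m actually running:

import scala.concurrent.duration._
import io.gatling.core.Predef._

// Sketch: pace pins each iteration to 2 seconds, sleeping only for
// whatever time is left once both requests have completed.
exec(login)
  .exec(Cart)
  .repeat(4000000) {
    pace(2.seconds)
      .exec(addProduct)
      .exec(removeProduct)
  }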
If someone has an explanation, I would be very glad to hear it.
Thank you!!
My 2 cents: you’re running in a shared environment where someone else (batch jobs, database copies, etc.) was eating all the CPU or network resources until 2AM.
Hi Stéphane.
I don’t know if there’s a misunderstanding: I’m only talking about the requests generated by Gatling, without any consideration of the web servers being targeted.
If you mean the Gatling virtual machine, yes, it’s in a shared environment, but:
- I’ve just checked all my Grafana counters for the VM: ‘steal’ CPU stays at zero, there’s plenty of free RAM, and there’s no saturation anywhere.
- If Gatling was being slowed down at 11pm but not at 2am, why would it be able to make more requests at 11pm than at 2am?
Thanks again for your time.