Gatling errors

Hello,

I’m getting a lot of such errors for my scenario/simulation. I need some help understanding each of these errors and the direction to take to find their root cause. Are they caused by my system’s capacity (would injecting this load from a bigger machine solve this?) or by the way my application is handling these requests?

Context:
Gatling version: 3.5.1
I’m triggering my test from my local machine to run on application (not on my local).
My system configuration is:

Screenshot 2021-03-11 at 12.59.42 PM.png

Let me know if you need any more information from my end.

It’s a network issue.
It could be either your target application or your network.


I’m sorry, I’m very new to this type of testing.

Could you help and explain in a little more detail, so that I can take that as a reference and start a conversation with my team to solve this? Thanks in advance for all your help.

Hi Stéphane,

Good morning.

Here is a summary of the information I collated from your other answers:

  • j.i.IOException: Premature close

    • your SUT can’t process requests in time
    • your load injector machine is not beefy enough (CPU, bandwidth). We are working on injecting the load from a heavier machine.
    • It most likely means your application can’t withstand such load and kills connections even though requests are in flight.
  • j.n.c.ClosedChannelException

    • The system forcefully kills connections just after they’ve been opened.
  • i.n.c.ConnectTimeoutException: connection timed out: api-{env}/34.102.184.150:443

    • Increasing the timeout on the Gatling side to 8 minutes brings the error count down a bit, but under heavy load the error still persists.

Our application is hosted on GCP. Could you help me understand where (which section, generally speaking) I can find these errors in GCP? There is no info in the backend/application logs, as the request doesn’t reach the application itself.

Any other debugging idea is highly appreciated. Looking forward to your response.

j.i.IOException: Premature close

Premature close means the remote peer closed the connection while the client was trying to write to it. This can happen legitimately on keep-alive timeout, as the network is not instantaneous.

  • when getting a premature close on a pooled keep-alive connection, modern versions of Gatling will automatically retry and open a new connection, so you won’t see an error
  • when getting a premature close on a new, freshly opened connection, you’ll see this failure => it’s indeed a failure of your SUT that crashes connections (see the sketch after this list)
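As a way to narrow this down, here is a minimal sketch (the base URL, endpoint and injection rate are placeholders, not taken from your setup) that sends a Connection: close header so every request opens a fresh connection. If the premature closes still occur with this profile, they come from new connections being crashed by the SUT rather than from keep-alive reuse:

import scala.concurrent.duration._

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class PrematureCloseIsolationSimulation extends Simulation {

  // Placeholder target: replace with your own base URL and endpoint.
  // Sending "Connection: close" forces a fresh connection per request,
  // so any remaining "Premature close" errors cannot be blamed on keep-alive reuse.
  val httpProtocol = http
    .baseUrl("https://api.example.com")
    .connectionHeader("close")

  val scn = scenario("Premature close isolation")
    .exec(http("myReq").get("/health"))

  setUp(
    scn.inject(constantUsersPerSec(50).during(2.minutes))
  ).protocols(httpProtocol)
}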

j.n.c.ClosedChannelException

I don’t think you should still see those with modern versions of Gatling.

i.n.c.ConnectTimeoutException: connection timed out: api-{env}/34.102.184.150:443

Your SUT is saturated or loses requests. Increasing the client timeout won’t help; your SUT (app, network, load balancers…) can’t withstand this load or is buggy.
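If it helps, here is a rough sketch (placeholder URL and numbers, to be adjusted to your own scenario) of an open-model ramp that increases the arrival rate gradually; the point in the report where ConnectTimeoutException starts appearing gives you a rough idea of where your SUT saturates:

import scala.concurrent.duration._

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class CapacityRampSimulation extends Simulation {

  // Placeholder base URL and request: replace with your own.
  val httpProtocol = http.baseUrl("https://api.example.com")

  val scn = scenario("Capacity ramp")
    .exec(http("myReq").get("/health"))

  // Ramp the arrival rate slowly so the report shows at which
  // requests-per-second the connect timeouts start to pile up.
  setUp(
    scn.inject(rampUsersPerSec(1).to(200).during(20.minutes))
  ).protocols(httpProtocol)
}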

Thanks a lot, Stephane. Your explanation helped clear up a few more things.

I’ll be looking into the load-balancers first to see if I find something there.

Hi Ankit,

You can also configure your load generators to be more performant:

https://wiki.eclipse.org/Jetty/Howto/High_Load#Load_Generation_for_Load_Testing

No, really, Premature close has nothing to do with load injector performance.

Then, we also have such tunings documented on our side: https://gatling.io/docs/current/general/operations/

Dear Stephane, I faced an issue while running a test to find the maximum performance. Above a certain level of RPS I encounter errors such as “14:21:03.294 [DEBUG] i.g.h.e.r.DefaultStatsProcessor - Request ‘myReq’ failed for user 24409: j.n.c.ClosedChannelException”. As you wrote before, it shouldn’t appear in new versions of Gatling, but I’m using the latest one (3.7.4 + openjdk:17-jdk-alpine). Could you please give some fresh advice on this situation?

Tuesday, March 16, 2021 at 18:30:44 UTC+3, Stéphane Landelle:

Hi,

The only way for us to investigate this kind of issue is for you to provide a Short, Self-Contained, Correct (Compilable) Example (see http://sscce.org/).
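For reference, a reproducer can be as small as something like the skeleton below (the URL, request and injection profile are placeholders): trim it down to the smallest setup that still triggers the ClosedChannelException, and share it together with the Gatling version and JVM/OS details.

import scala.concurrent.duration._

import io.gatling.core.Predef._
import io.gatling.http.Predef._

// Skeleton of a minimal, self-contained reproducer (placeholder values only).
class ReproducerSimulation extends Simulation {

  val httpProtocol = http.baseUrl("https://api.example.com")

  val scn = scenario("Reproducer")
    .exec(http("myReq").get("/endpoint"))

  setUp(
    scn.inject(constantUsersPerSec(100).during(5.minutes))
  ).protocols(httpProtocol)
}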
