Close connection on each iteration

Does Gatling have the ability to close the connection on each iteration (like in LoadRunner)?
We want to emulate the real behavior of our client.


Don’t try to think the LoadRunner way. Learn Gatling.

With Gatling, you don’t want to iterate and recycle users; it doesn’t make sense. You want to start a fresh one.
Each virtual user has its own connection pool that gets closed when the user terminates.

Thanks Stephane!
I’m not trying to recycle users :slight_smile:
I need to achieve the following simple scenario:

  1. Open a connection
  2. Send 1 HTTP request
  3. Close the connection
  4. Do this in iterations with many concurrent connections
  5. In total I need about 6000 requests per second
     Actually, I need the request rate to equal the connection create/close rate.
     What is the way to achieve this with Gatling?
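For what it’s worth, the steps above can be sketched in the Gatling 3 Scala DSL roughly like this (the base URL, names, and durations are placeholders, not anything from this thread; the idea is that each virtual user runs exactly once, so the arrival rate equals the connection open/close rate):

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

// Sketch: one request per connection. Each virtual user has its own
// connection pool, sends a single request, then terminates, which
// closes its pool. The target URL is hypothetical.
class OneRequestPerConnectionSimulation extends Simulation {

  val httpProtocol = http.baseUrl("http://example.com")

  val scn = scenario("one-shot user")
    .exec(
      http("single request")
        .get("/")
        .header("Connection", "close") // signal that we won't reuse the connection
    )

  // 6000 fresh users per second => 6000 requests per second, and,
  // with one request per user, the same rate of opened/closed connections.
  setUp(
    scn.inject(constantUsersPerSec(6000).during(60.seconds))
  ).protocols(httpProtocol)
}
```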


Standard HTTP behavior:

HTTP/1.1 defines the “close” connection option for the sender to signal that the connection will be closed after completion of the response. For example,

       Connection: close

You are right! This header tells the server that the client intends to close the connection.
But it does NOT actually close the connection!!!
The connection should be closed by my client explicitly…


in either the request or the response header fields indicates that the connection SHOULD NOT be considered “persistent” (section 8.1) after the current request/response is complete…

Connection: close notifies the peer that the emitting side is going to close the connection.
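For illustration, such a one-shot exchange looks roughly like this on the wire (hypothetical host and body; the header only announces the close, the actual TCP close is a separate action by the side that sent it once the exchange completes):

```http
GET / HTTP/1.1
Host: example.com
Connection: close

HTTP/1.1 200 OK
Content-Length: 2
Connection: close

ok
```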

Yes, of course. The peer (client) notifies the peer (server) that it is going to close the connection.
But actually it does not. As far as I verified, Gatling doesn’t close the connection even when this header is set.


Ah ah, that’s indeed a stupid bug:

Thanks for reporting!

Wow! You are the best!
When can we expect the fix? We would very much appreciate it, because we have already been stuck on this issue for about 2 weeks…


We plan on releasing Gatling 3.2.0 mid-July along with FrontLine 1.8.0.
We might release a 3.1.3 before that if we hit a blocker for our customers, but there’s no such plan atm.
If you can’t wait, you’ll have to build from sources.