.shareConnections explained

Hi,
I run a scenario that consists of two simple consecutive REST calls and takes (with one real-world user) approx. 4 seconds (I use no pauses or pacing).

Without .shareConnections I get:

```
[WARN ] i.g.h.a.AsyncHandlerActor - Request 'getService' failed: java.util.concurrent.TimeoutException: Request timed out to myapp.site.com/vv.zz.cc.xxx:443 of 60000 ms
```

But with .shareConnections set I do not get the WARN.

What does .shareConnections mean (explained in basic terms), and is it unrealistic to use this setting when doing simple REST API perf testing?

My load model:

```scala
setUp(scn.inject(rampUsersPerSec(1) to (20) during (2 minutes))).maxDuration(5 minutes).protocols(httpConf)
```

Also, I do not get a trend in my requests/responses per second using this load model. Is that because the scenario is so short?

shareConnections means that you have one single global HTTP connection pool instead of one pool per virtual user.
The former is more realistic if you have just a few real-world users but still designed your Simulation with tons of virtual users. Think of server-side programs calling APIs, such as booking engines or price comparators.
The latter is more realistic if you have tons of real-world users that each manage their own connections, typically browsers.
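
A minimal sketch of the two protocol configurations (Gatling Scala DSL, assuming Gatling 3.x where the method is `baseUrl`; the host comes from the log above and the rest is hypothetical):

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

// Default: every virtual user gets its own connection pool,
// like a fleet of browsers or phones each opening their own connections.
val perUserConnections = http
  .baseUrl("https://myapp.site.com") // hypothetical base URL

// With .shareConnections: all virtual users draw from one global pool,
// like a single server-side client reusing a handful of connections.
val sharedConnections = http
  .baseUrl("https://myapp.site.com")
  .shareConnections
```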

Those request timeouts typically mean application resource starvation, such as database connections.

OK, I have an app (iPhone and Android), but the test I am running goes against its REST APIs.
The peak hour is estimated to have 1500 unique users.
Should using .shareConnections then be the preferred way to go here?

No.

So skip .shareConnections then?

Yep. You want to simulate a big bunch of phones. That's different from a few servers of a price-comparator app calling remote APIs over a bunch of concurrent connections.
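
For reference, a sketch of what that could look like with your injection profile, keeping the default per-virtual-user connection pools (the base URL and endpoint paths are hypothetical, not from your simulation):

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

// No .shareConnections: each virtual user behaves like a separate phone
// with its own connections.
val httpConf = http.baseUrl("https://myapp.site.com") // hypothetical base URL

val scn = scenario("Two REST calls")
  .exec(http("getService").get("/service"))  // hypothetical paths
  .exec(http("getDetails").get("/details"))

setUp(
  scn.inject(rampUsersPerSec(1) to 20 during (2.minutes))
).maxDuration(5.minutes)
 .protocols(httpConf)
```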