Is there a means to expose a large data object to many clients without storing it on every session?
As an example, a web application caches 10MB of data in the user’s session cache to reduce network chatter.
This is straightforward to emulate: simply hit the same endpoint and store the deserialized object on the user's session.
However, this creates a problem when simulating large user counts: the duplicated data grows with the number of sessions rather than with the load those users generate, so Gatling's ability to simulate users becomes capped by the memory needed to hold 10MB per session.
So, I'm curious whether there is a way to retrieve this data once, before launching the simulation, and expose it to each user, e.g. via a shared reference or a function stored on the session. Otherwise, the per-session memory footprint severely limits Gatling's ability to test at scale.
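To make the idea concrete, here is a minimal plain-Scala sketch (not Gatling-specific; `SharedData` and `bigObject` are hypothetical names, and the byte array stands in for the deserialized 10MB payload). The object is loaded once before any virtual user starts, and each session stores only a reference to it, so memory cost stays at one copy regardless of user count:

```scala
// Loaded exactly once, before any virtual user runs
// (in a real test this would deserialize the payload from the service).
object SharedData {
  val bigObject: Array[Byte] = Array.fill(10 * 1024 * 1024)(0: Byte)
}

// Each simulated user's "session" keeps only a reference to the shared value.
val sessions = (1 to 10000).map(id =>
  Map[String, Any]("userId" -> id, "bigData" -> SharedData.bigObject)
)

// All sessions point at the same underlying object: one copy in memory, not 10000.
assert(sessions.forall(s => s("bigData").asInstanceOf[AnyRef] eq SharedData.bigObject))
println(s"sessions: ${sessions.size}")
```

In Gatling itself I imagine this could be wired in with something along the lines of `exec(session => session.set("bigData", SharedData.bigObject))`, since `Session.set` stores a reference rather than copying the value, but I'd like confirmation that this is the intended pattern.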
I'm aware that removing the call to this service from the user simulations increases the variance between the test and reality, but I can test that call separately (validate the response and discard it), so it's trivial to exercise at high scale as well.