Gatling Version used: 2.0.0-M3a
I am using a feeder to run a scenario as follows
val csvData = csv("input-test.csv").queue

val scn = scenario("Events Test").feed(csvData) // ... rest of the chain omitted

setUp(
  scn.inject(
    ramp(10 users) over (5 seconds),
    constantRate(300 usersPerSec) during (1 minute)
  )
)
If you need to see the full code, it is similar to the example here:
Now when input-test.csv grows (in number of lines), I get an out-of-memory error. I assume Gatling is loading all of the lines in the CSV file into memory at once.
Is there a way to chunk the input, so that for each execution I can take the next set of input values from the file?
In this particular case, each execution needs 10 lines for the 10 ramp-up users plus 300 usersPerSec * 60 s = 18,000 lines, for a total of 18,010 lines in input-test.csv per execution.
Since I have specified the scenario to repeat 10 times, I need 18,010 * 10 = 180,100 lines in the input file.
Is there a way for Gatling to read only the 18,010 lines needed for each run, rather than all 180,100 lines at once, which causes the out-of-memory error?
The objective is to load data from the feeder incrementally rather than all at once, to avoid memory issues. I still want to use the realistic data in the input file while generating a constant load of 300-500 usersPerSec.
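To illustrate what I mean by "incrementally": since a Gatling feeder is (as far as I understand) just an Iterator[Map[String, T]], I was wondering whether something like the following sketch would work. The lazyCsvFeeder helper below is my own hypothetical name, not a Gatling API; it streams lines from the file lazily with scala.io.Source instead of loading everything up front the way csv(...) appears to.

```scala
import scala.io.Source

// Hypothetical helper (not part of Gatling): build a feeder that reads
// CSV rows lazily. Because Iterator is lazy, each line is only read from
// the file when a virtual user actually consumes a record, so the whole
// 180,100-line file never has to sit in memory at once.
def lazyCsvFeeder(path: String): Iterator[Map[String, String]] = {
  val lines  = Source.fromFile(path).getLines()
  // First line is assumed to be the CSV header, as with Gatling's csv()
  val header = lines.next().split(",").map(_.trim)
  // Zip each subsequent row with the header to produce the session map
  lines.map(line => header.zip(line.split(",").map(_.trim)).toMap)
}
```

If custom Iterator-based feeders are supported in 2.0.0-M3a, I imagine it would be plugged in as scenario("Events Test").feed(lazyCsvFeeder("input-test.csv")), but I am not sure whether the DSL accepts this directly or whether the feeder is still buffered internally.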