Best practice for a test with a big CSV feeder


I want to inject many documents into my database through a web service. I've got a big CSV file, 1 GB (it contains only one column of strings), and when I try to import this file I get an OutOfMemory error.

I'm using the default feeder: val feeder = csv("doc15.csv").queue

What is the best practice to run this test:

→ create my own feeder? (can we read the file line by line, for example?)

→ split it into multiple CSV files and load them one by one?

→ not use a CSV feeder and use another kind of feeder?

Thanks in advance,


Gatling's built-in implementation loads everything into memory so it doesn't have to read from the file system at runtime.
There’s a pending feature request for what you want:
Contribs welcome :slight_smile:
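In the meantime, option 1 from the question is doable by hand: in Gatling a feeder is just an `Iterator[Map[String, T]]`, so a plain Scala iterator over the file's lines reads the CSV lazily instead of loading all 1 GB up front. A minimal sketch, assuming a one-column file with no header (the names `lineByLineFeeder` and the `"doc"` column are illustrative, not Gatling API):

```scala
import scala.io.Source

// Custom feeder sketch: yields one Map per CSV line, reading lazily.
// Gatling accepts any Iterator[Map[String, T]] as a feeder, so this
// avoids keeping the whole file in memory like the built-in csv() does.
def lineByLineFeeder(path: String, column: String): Iterator[Map[String, String]] =
  Source.fromFile(path).getLines().map(line => Map(column -> line.trim))
```

It could then be plugged into a scenario the usual way, e.g. `scenario("import").feed(lineByLineFeeder("doc15.csv", "doc"))`. Note the underlying `Source` is never closed here; for a real run you'd want to close it when the simulation ends.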