Feeder which transforms JSON lines into JSON arrays of configurable size

Hello,

I have a fairly big file composed of one JSON object per line. I am struggling to build a feeder that feeds N lines at a time, transformed into a JSON array of N objects, as the payload for the HTTP REST API I am trying to load test.

Could someone provide an example of how to do this read/transform step, ideally without reading the entire file into memory first?

Thank you in advance,

David

Hello,

I’m not sure I understand what you want to achieve.

A feeder is a means of centralizing data that each user picks from (into their session).

What I understand is that you want to create a body for a request based on some data.

I don’t get the link between your use case and a feeder.

My guess:

  • Have a file src/resources/myData.json containing an array of possible data.
  • Have a function that reads N lines (random or not, depending on your concern) from classpath:myData.json and formats them as you need (see the sketch below).
  • Use this function in your Gatling script to generate the body for your request.
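
For example, a rough sketch in the Scala DSL. The file name, endpoint, batch size and base URL are all placeholders, and note that this variant reads the whole file into memory up front:

    import scala.io.Source
    import scala.util.Random

    import io.gatling.core.Predef._
    import io.gatling.http.Predef._

    class RandomBatchSimulation extends Simulation {

      // Load every line of the classpath resource once (placeholder name).
      // Note: this keeps the whole file in memory.
      val lines: Vector[String] =
        Source.fromResource("myData.json").getLines().toVector

      // Pick n random lines and join them into a JSON array string.
      def randomBatch(n: Int): String =
        Random.shuffle(lines).take(n).mkString("[", ",", "]")

      val scn = scenario("random batches")
        .exec(
          http("post batch")
            .post("/api/items") // placeholder endpoint
            .body(StringBody(_ => randomBatch(50))) // 50 objects per request
            .asJson
        )

      setUp(scn.inject(atOnceUsers(1)))
        .protocols(http.baseUrl("http://localhost:8080")) // placeholder base URL
    }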

I don’t know if this will be useful in your use case.

Cheers!

Hi Sébastien,

Thank you for your reply. You are suggesting that I not use the feeder part of the Gatling DSL, but rather construct the POST bodies using something like processRequestBody?

As for my use case: I just want to iterate over the file, taking N lines at a time, wrap those lines in a JSON array, and post that as the payload to my API until the file is exhausted. I am looking to simulate a server-to-server scenario where I have one client with pooled connections; that’s about it.
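
To make it concrete, the kind of thing I have in mind is below, as a rough Scala sketch (file name, endpoint, session key, batch size and base URL are all placeholders). As far as I can tell, a custom feeder can be any Iterator[Map[String, Any]], so the file can be streamed lazily instead of loaded up front:

    import scala.io.Source

    import io.gatling.core.Predef._
    import io.gatling.http.Predef._

    class BatchedLinesSimulation extends Simulation {

      // A feeder is just an Iterator[Map[String, Any]]: stream the file lazily,
      // with grouped(n) pulling at most n lines per batch on demand.
      def batchFeeder(resource: String, n: Int): Iterator[Map[String, Any]] =
        Source.fromResource(resource)
          .getLines()
          .grouped(n)
          .map(batch => Map("payload" -> batch.mkString("[", ",", "]")))

      val scn = scenario("server-to-server")
        .forever(
          // When the iterator runs dry, Gatling stops the run: file exhausted.
          feed(batchFeeder("myData.json", 100)) // placeholder file, 100 lines per batch
            .exec(
              http("post batch")
                .post("/api/items") // placeholder endpoint
                .body(StringBody("#{payload}")) // "${payload}" on pre-3.7 versions
                .asJson
            )
        )

      setUp(scn.inject(atOnceUsers(1)))
        .protocols(
          http
            .baseUrl("http://localhost:8080") // placeholder base URL
            .shareConnections // one shared connection pool, server-to-server style
        )
    }

If I read the docs right, shareConnections on the HTTP protocol is what models the single client with pooled connections.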

Thanks,

David