Producer/consumer feeders between two executions

Hi. I have two executions. I have a custom feeder which provides the “create data” input values for the first execution, and I need to reuse those consumed values (as “update data”) in the second execution. Is there a simple way to do such a thing, or will I have to use producer/consumer blocking queues through a custom feeder?

I do this by saving the data into a session variable. Of course, the first execution and second execution have to be within the same scenario (i.e. both will use the same session object).
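
For reference, here is a minimal sketch of that same-session approach in the Gatling 2 DSL. The base URL, endpoints, the "id" JSON field and the "thingId" attribute name are all made up for the example:

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class CreateThenUpdateSimulation extends Simulation {

  val httpConf = http.baseURL("http://localhost:8080") // stand-in base URL

  val scn = scenario("create then update")
    .exec(
      http("create")
        .post("/things")
        .body(StringBody("""{"name":"load-test"}""")).asJSON
        .check(jsonPath("$.id").saveAs("thingId")) // save the created id into the session
    )
    .exec(
      http("update")
        .put("/things/${thingId}") // read it back from the same virtual user's session
        .body(StringBody("""{"name":"load-test-updated"}""")).asJSON
    )

  setUp(scn.inject(atOnceUsers(10))).protocols(httpConf)
}
```

Note that this only works when the same virtual user performs both the create and the update.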

What do you mean by first/second execution? Different steps inside the same simulation run, or distinct simulations?
Do you need a given virtual user to reuse in the second execution the data it got in the first one, or does the data get mixed (any user can pick any data)?

I guess your data is generated dynamically, but can’t you dump it to disk? Is it too big to fit in memory?

I’m using Gatling 2. Actually, the first and second executions run in the same scenario. They need to run in a random fashion (which I guess is the default behavior of Gatling?)

Any virtual user can pick any data element. It’s just that “create” should happen before “update” for every data element, which also means that only data that has been processed in the first execution can be used in the second execution.

The input data is generated from a large (too large to hold in memory) CSV file. It seems like a producer-consumer problem.
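
For context, the custom create feeder streams the file lazily, something like this (simplified: the file name is a stand-in, and it assumes plain comma-separated values without quoting):

```scala
import scala.io.Source

// getLines() is lazy, so the file never has to fit in memory
val lines = Source.fromFile("create-data.csv").getLines()
val header = lines.next().split(",")

// a Gatling custom feeder is just an Iterator[Map[String, Any]]
val createFeeder: Iterator[Map[String, Any]] =
  lines.map(line => (header zip line.split(",")).toMap)
```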

Nope, random isn't the default feeder strategy; queue is.
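
To spell it out, the difference looks like this (the file name is just an example):

```scala
import io.gatling.core.Predef._

val inOrder  = csv("create-data.csv")        // queue strategy (the default): records served in order
val shuffled = csv("create-data.csv").random // random strategy has to be asked for explicitly
```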

You'll need a queue of some sort. The amount of memory it takes will depend on whether you can consume at the same time as you produce.
Have you considered using Redis?
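
Something along these lines, for example with Jedis (the key name and connection details are made up, and a single Jedis instance isn't thread-safe, so a real setup would use a JedisPool):

```scala
import redis.clients.jedis.Jedis

val jedis = new Jedis("localhost", 6379)

// producer side: publish each id after a successful "create"
def publishCreated(id: String): Unit = jedis.sadd("created-ids", id)

// consumer side: SPOP removes and returns a random member of the set, so any
// virtual user can pick any already-created id; collect skips the nulls
// returned while the set is still empty
val updateFeeder: Iterator[Map[String, Any]] =
  Iterator.continually(Option(jedis.spop("created-ids")))
    .collect { case Some(id) => Map("updateId" -> id) }
```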

Got it. I'll give it a try with a suitable in-memory collection before considering Redis.

If you offer and poll at the same time, a concurrent queue (e.g. ConcurrentLinkedQueue) might suffice. You won't get a random pop though, which you would with Redis: http://redis.io/commands/spop
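
A rough sketch of that in-memory variant, assuming the create response id was saved in the session under a hypothetical "thingId" key:

```scala
import java.util.concurrent.ConcurrentLinkedQueue

import io.gatling.core.Predef._

val createdIds = new ConcurrentLinkedQueue[String]()

// producer side: chain this right after the "create" request
val publishCreatedId = exec { session =>
  createdIds.offer(session("thingId").as[String])
  session
}

// consumer side: collect skips empty polls, so the feeder busy-spins until an
// id is available; ConcurrentLinkedQueue is FIFO, so there is no random pick
val updateFeeder: Iterator[Map[String, Any]] =
  Iterator.continually(Option(createdIds.poll()))
    .collect { case Some(id) => Map("updateId" -> id) }
```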