However, I would like each user to process all the data in a feed. So far, I’ve been unsuccessful.
Even something like the following (creating a feeder with 3 elements at the lowest point I could find):
will eventually get me a “next on empty iterator” error for my second user, once the 3 records in my feed are consumed.
Is it correct to think that this sort of behavior is not supported?
I guess an alternative would be to use a circular feeder, but I don’t see how that would guarantee that each user uses all the elements in the feeder, since the users would be accessing the same feeder concurrently (I think).
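For what it’s worth, this is roughly what I had in mind (just a sketch, assuming a file "ids.csv" with an "id" column and a placeholder request):

val idsFeeder = csv("ids.csv").circular

val scn = scenario("circular attempt")
  .repeat(3) {
    feed(idsFeeder) // shared circular feeder: never runs dry, but records are interleaved across users
      .exec(http("get id").get("/items/${id}")) // placeholder request
  }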
We’re considering having non-shared feeders for M5 (feeders have an impact on clustering, so we might have to change things there).
What you want is quite easy to implement actually.
For example:
val data = csv("ids.csv").data
repeat(data.length, "i") { // repeat data.length times and use "i" as the counter name
  exec { session =>
    val i = session("i").as[Int] // get the current counter value
    session.setAll(data(i)) // inject the i-th record's columns into the session
  }
  .exec…
}
This way, all the users will loop over the data and execute in the same order (the one defined in the file).
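For example, wired into a scenario it could look roughly like this (a sketch only; the scenario name, the request and the "id" column are placeholders):

val data = csv("ids.csv").data
val scn = scenario("replay whole file")
  .repeat(data.length, "i") {
    exec { session =>
      val i = session("i").as[Int]
      session.setAll(data(i)) // inject the i-th record's columns into the session
    }
    .exec(http("get id").get("/items/${id}")) // placeholder request using the injected "id"
  }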
Depending on what you exactly want, you could have different strategies.
For example, if you want the order to be random and the data set is not too big, you could do something like this:
val data = csv("ids.csv").data
val indices = data.indices.toVector
.exec(session => session.set("indices", util.Random.shuffle(indices))) // store a different shuffled sequence of indices in each session
.foreach("${indices}", "i") {
  exec { session =>
    val i = session("i").as[Int] // get the current (shuffled) index
    session.setAll(data(i))
  }
  .exec…
}
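Since the shuffle happens per session, each virtual user gets its own random permutation of the indices, but still visits every record exactly once.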