Feeder iteration issue

Hello,

I’m trying to iterate over a large number of URLs, with the SAME user calling a couple of them in a row.
The first call is fine, but every subsequent call from the same user retrieves the same URL as the first call.

Any hint on what I am doing wrong?

Below is the structure of my calls:

class test extends Simulation {

[…]

val scn = scenario("Simple search")
  .during(1200 seconds) {

    feed(ssv("LR.csv").circular)
      .exec(http("LR 1")
        .get("${LR_URL}")
        .headers(headers_All)
        .check(status.is(200)))
      .pause(200 milliseconds, 400 milliseconds)

      .feed(ssv("LR.csv").circular)
      .exec(http("LR 2")
        .get("${LR_URL}")
        .headers(headers_All)
        .check(status.is(200)))
      .pause(200 milliseconds, 400 milliseconds)

      .feed(ssv("LR.csv").circular)
      .exec(http("LR 3")
        .get("${LR_URL}")
        .headers(headers_All)
        .check(status.is(200)))
      .pause(200 milliseconds, 400 milliseconds)
  }

setUp(scn.inject(atOnce(1 user), ramp(30 users) over (120 seconds))).protocols(httpProtocol)
}

Thanks

Olivier

As the documentation states, feeders are shared: https://github.com/excilys/gatling/wiki/Feeders#what

What happens here is that when you run several concurrent users, they compete for the records in the feeder.
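
To picture it, a circular feeder behaves roughly like a single iterator shared by all virtual users, so concurrent feed calls interleave. A simplified sketch (not Gatling’s actual implementation):

// Simplified model: one shared, circular record source
val records = IndexedSeq(Map("LR_URL" -> "/a"), Map("LR_URL" -> "/b"), Map("LR_URL" -> "/c"))
val sharedFeeder: Iterator[Map[String, Any]] = Iterator.continually(records).flatten

// With two users running concurrently, next() calls interleave:
// user 1 -> /a, user 2 -> /b, user 1 -> /c, user 2 -> /a, ...
// so no single user sees the file’s records in sequence.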

With Gatling 2, you can pull several records at the same time: https://github.com/excilys/gatling/wiki/Gatling%202#http-misc
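
For example, with your LR_URL column, pulling 3 records in a single feed would look something like this (a sketch, assuming the feed(feeder, n) overload described there, which suffixes attribute names with the record index):

.feed(ssv("LR.csv").circular, 3)  // pull 3 records in one call
.exec(http("LR 1")
  .get("${LR_URL1}")              // LR_URL from record 1
  .headers(headers_All)
  .check(status.is(200)))
.pause(200 milliseconds, 400 milliseconds)
.exec(http("LR 2")
  .get("${LR_URL2}")              // LR_URL from record 2
  .headers(headers_All)
  .check(status.is(200)))
.pause(200 milliseconds, 400 milliseconds)
.exec(http("LR 3")
  .get("${LR_URL3}")              // LR_URL from record 3
  .headers(headers_All)
  .check(status.is(200)))
.pause(200 milliseconds, 400 milliseconds)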

Since the feeders are shared across the users, why don’t you do this instead:

.feed(ssv("LR.csv").circular)
.exec(http("LR 1")
  .get("${LR1_URL}")
  .headers(headers_All)
  .check(status.is(200)))
.pause(200 milliseconds, 400 milliseconds)

.exec(http("LR 2")
  .get("${LR2_URL}")
  .headers(headers_All)
  .check(status.is(200)))
.pause(200 milliseconds, 400 milliseconds)

.exec(http("LR 3")
  .get("${LR3_URL}")
  .headers(headers_All)
  .check(status.is(200)))
.pause(200 milliseconds, 400 milliseconds)

Then, your CSV file will have three columns: LR1_URL, LR2_URL and LR3_URL.
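
For illustration, a hypothetical LR.csv could then look like this (ssv means semicolon-separated; the URLs here are made up):

LR1_URL;LR2_URL;LR3_URL
/search?q=alpha;/search?q=beta;/search?q=gamma
/search?q=delta;/search?q=epsilon;/search?q=zeta

Each user then pulls one whole record per iteration and gets all three URLs from the same line.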

cheers
Nicolas

Thanks for this hint, I used the Gatling 2 trick of pulling n records at a time.
Nevertheless, I still don’t completely understand what happened.
I get that multiple users compete for the records, but it still seems strange that the first record of each iteration for each user is always “a new one” while the subsequent calls always hit “already used ones”. I’m quite sure of that because my system uses caching: the first calls always take around 300 ms, while the second, third and fourth always take around 20 ms. If the competition were the explanation, that would mean that each user always manages to get a first URL that no one has fetched before, but is unlucky enough that all of his further calls land among already used ones.

The three-column advice is helpful too.
Thanks.

OK, got it, I didn’t read your simulation properly.

You’re creating 3 different feeder instances!
All of them read the same file, but still!

val feeder = ssv("LR.csv").circular

.feed(feeder)
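
Putting it together, declaring the feeder once and reusing the same instance would make your scenario look something like this (same file, headers and timings as in your post):

val feeder = ssv("LR.csv").circular  // one instance, shared by all feed calls

val scn = scenario("Simple search")
  .during(1200 seconds) {
    feed(feeder)
      .exec(http("LR 1").get("${LR_URL}").headers(headers_All).check(status.is(200)))
      .pause(200 milliseconds, 400 milliseconds)
      .feed(feeder)  // same instance as above, not a new one
      .exec(http("LR 2").get("${LR_URL}").headers(headers_All).check(status.is(200)))
      .pause(200 milliseconds, 400 milliseconds)
      .feed(feeder)
      .exec(http("LR 3").get("${LR_URL}").headers(headers_All).check(status.is(200)))
      .pause(200 milliseconds, 400 milliseconds)
  }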