Process several CSV lines at the same time with a feed

Hi,

Is it possible to process several CSV lines at the same time with a feed?

For example, I need to create an XML file from a template and send it via HTTP POST. For each request, my XML must contain data from 5 lines of my CSV data file.

Currently I do:

feed(csv(dataSource).circular)
  .exec(
    http("myRequest")
      .post(URI)
      // loading template from ssp file
      .fileBody("myRequest", Map(
        "name" -> "${name}",
        "address" -> "${address}"
      )).asXML
      .headers(commonHeaders)
      .check(status.is(200))
  )

When I do this, I can process only 1 line of the CSV, and my XML looks like:

$name $address

What can I do if I want something like this, reading CSV entries 5 by 5?

$name1 $address1 $name2 $address2 .. .. .. .. .. ..

Thanks

Loic

Hi Loïc,

Your problem is a bit more complex than just polling 5 by 5, as you want the attribute names not to clash.

Feeder is an alias for Iterator[Map[String, _]], so you can do something like this:

val feeder = tsv("fileName").queue // explicitly call queue to turn the underlying Array into an Iterator

val scn = scenario("foo")
  .feed(feeder.map(suffixMapKey(1)))
  .feed(feeder.map(suffixMapKey(2)))
  .feed(feeder.map(suffixMapKey(3)))
  .feed(feeder.map(suffixMapKey(4)))
  .feed(feeder.map(suffixMapKey(5)))
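suffixMapKey isn't defined in the snippet above; here's a minimal sketch of what it could look like, based on the key-renaming helper that appears later in this thread (the curried shape is an assumption, so it can be passed straight to map):

  // hypothetical helper: appends a numeric suffix to every key of a record,
  // so "name" becomes "name1", "name2", etc. and attribute names don't clash
  def suffixMapKey(suffix: Int)(record: Map[String, Any]): Map[String, Any] =
    record.map { case (key, value) => (key + suffix) -> value }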

However, it won’t guarantee that the 5 records being polled are sequential, as multiple concurrent users might poll at the same time.

I’ll try to think of a better solution.

Cheers,

Stéphane

Thanks, I will try this!
Non-sequential reads are not a problem for me.

Regards
Loïc

Sorry, I answered too quickly: I'm not sure I understand what I have to put in suffixMapKey to make this work.
In addition, I have to use the circular strategy instead of queue; is that possible?

Thanks

Loic

Here’s a better solution (just copy this directly into your simulation):

type Record = Map[String, Any] // define an alias so there's less boilerplate

def chunkAndMerge(records: List[Record], chunkSize: Int): List[Record] = {

  // rename the keys of a record by appending a numeric suffix
  def suffixMapKey(record: Record, suffix: Int): Record =
    record.map { case (key, value) => (key + suffix) -> value }

  // recursively split a list into chunks of a given size (exceeding records are discarded)
  def chunk(records: List[Record], chunks: List[List[Record]]): List[List[Record]] =
    if (records.size < chunkSize) chunks
    else chunk(records.drop(chunkSize), records.take(chunkSize) :: chunks)

  // merge a list of records into a single one, suffixing each record's keys
  // with its 1-based position in the chunk (name1, name2, ...)
  def merge(records: List[Record]): Record = records
    .zipWithIndex
    .map { case (record, index) => suffixMapKey(record, index + 1) }
    .foldLeft(Map.empty[String, Any]) { (merged, record) => merged ++ record }

  // chunk accumulates the chunks in reverse order, so restore the original order before merging
  chunk(records, Nil).reverse.map(merge)
}

val feeder = chunkAndMerge(csv("foo").toList, 5).toArray.circular // FIXME toArray shouldn't be necessary, that's Gatling's fault, will fix

You then just have to use it as a regular feeder.
Is this clear to you?
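For reference, a sketch of how the merged feeder could plug into the request from the first post (the suffixed attributes name1/address1 through name5/address5 are the keys chunkAndMerge produces):

  feed(feeder)
    .exec(
      http("myRequest")
        .post(URI)
        .fileBody("myRequest", Map(
          "name1" -> "${name1}", "address1" -> "${address1}",
          "name2" -> "${name2}", "address2" -> "${address2}",
          "name3" -> "${name3}", "address3" -> "${address3}",
          "name4" -> "${name4}", "address4" -> "${address4}",
          "name5" -> "${name5}", "address5" -> "${address5}"
        )).asXML
        .headers(commonHeaders)
        .check(status.is(200))
    )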

This is working great!!

I just needed to define this:

type Record = Map[String, String]

and to change

.foldLeft(Map.empty[String, Any]) { (merged, record) =>
to
.foldLeft(Map.empty[String, String]) { (merged, record) =>

Thanks a lot for your help!

Ah, sorry, maybe that's something that has changed in 2.0.0-SNAPSHOT/master.

You're welcome

I think this method could be useful for a lot of other people. What about adding the possibility to parse CSV in multiline mode to the Gatling code?

Yep, just have to think of a clean API. :wink:

https://github.com/excilys/gatling/issues/904

Great :slight_smile:

Hi,

I need to read 1000 CSV records, run as many users as I wish, and have that data parsed and put in the body of an HTTP request.
Right now, with my solution, I am able to read only 1 random record.

Any help is appreciated.

Hello, I tried to read a user CSV file to get userIds 5 by 5, for an HTTP GET: get("/user/1234;1235;1236;1237;1238/"). I used chunkAndMerge(csv("foo").toList, 5).toArray.circular, but it fails with an error: "value toList is not a member of io.gatling.core.feeder.RecordSeqFeederBuilder[String]".
e.g. csv file:

email,password,userid
****@eadtest.ea.com,123,1234
****@eadtest.ea.com,123,1235
****@eadtest.ea.com,123,1236
****@eadtest.ea.com,123,1237
****@eadtest.ea.com,123,1238
****@eadtest.ea.com,123,1239
****@eadtest.ea.com,123,12310
****@eadtest.ea.com,123,12311

and more

On Friday, January 11, 2013 at 7:08:30 PM UTC+8, Stéphane Landelle wrote:

This chunkAndMerge doesn't come from Gatling; it's custom code on your side, so I can't help.
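That said, the compile error above comes from calling toList on the feeder builder rather than on its records. Assuming the Gatling version in use exposes the underlying records on RecordSeqFeederBuilder (worth checking against the API of the exact release), something like this might work:

  // assumed: RecordSeqFeederBuilder has a "records" field (an IndexedSeq of records)
  // that can be converted to the List that chunkAndMerge expects
  val feeder = chunkAndMerge(csv("foo").records.toList, 5).toArray.circular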