Sorry, I answered too quickly. I'm not sure I understand what I have to put in suffixMapKey to make this work.
In addition, I have to use the circular strategy instead of queue; is that possible?
def suffixMapKey(record: Record, suffix: Int): Record =
  record.map { case (key, value) => (key + suffix) -> value }

// wrap everything in chunkAndMerge so chunkSize is in scope for chunk
def chunkAndMerge(records: List[Record], chunkSize: Int): List[Record] = {

  // recursively split a list into chunks of a given size (exceeding records are discarded)
  def chunk(records: List[Record], chunks: List[List[Record]]): List[List[Record]] =
    if (records.size < chunkSize) chunks
    else chunk(records.drop(chunkSize), records.take(chunkSize) :: chunks)

  // merge a list of records into a single one, suffixing each key with the record's index
  def merge(records: List[Record]): Record = records
    .zipWithIndex
    .map { case (record, index) => suffixMapKey(record, index) }
    .foldLeft(Map.empty[String, Any]) { (merged, record) =>
      merged ++ record
    }

  chunk(records, Nil).reverse.map(merge)
}
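For what it's worth, here is a self-contained sketch of how the snippet behaves, with Record spelled out as the Map[String, Any] it aliases in Gatling and made-up sample values:

```scala
object ChunkAndMergeDemo {
  type Record = Map[String, Any]

  def suffixMapKey(record: Record, suffix: Int): Record =
    record.map { case (key, value) => (key + suffix) -> value }

  def chunkAndMerge(records: List[Record], chunkSize: Int): List[Record] = {
    // split into fixed-size chunks, dropping the trailing partial chunk
    def chunk(records: List[Record], chunks: List[List[Record]]): List[List[Record]] =
      if (records.size < chunkSize) chunks
      else chunk(records.drop(chunkSize), records.take(chunkSize) :: chunks)

    // suffix each record's keys with its index, then union the maps
    def merge(records: List[Record]): Record = records
      .zipWithIndex
      .map { case (record, index) => suffixMapKey(record, index) }
      .foldLeft(Map.empty[String, Any])(_ ++ _)

    chunk(records, Nil).reverse.map(merge)
  }

  // five single-key records merged two by two; the fifth is discarded
  val sample: List[Record] = (1 to 5).map(i => Map("userid" -> i.toString): Record).toList
}
```

Running chunkAndMerge(sample, 2) yields List(Map("userid0" -> "1", "userid1" -> "2"), Map("userid0" -> "3", "userid1" -> "4")): keys no longer collide, so each merged record feeds one iteration with all the chunk's values.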
val feeder = chunkAndMerge(csv("foo").toList, 5).toArray.circular // FIXME toArray shouldn't be necessary, that's Gatling's fault, will fix
You then just have to use it as a regular feeder.
Is this clear to you?
I need to read 1000 CSV records, hit as many users as I wish, and have that data parsed and put in the body of an HTTP request.
Right now, with my solution, I am only able to read one random record.
Hello, I tried to read user CSV files to get userIds 5 by 5, for an HTTP GET: get("/user/1234;1235;1236;1237;1238/"). I used chunkAndMerge(csv("foo").toList, 5).toArray.circular, but it fails to compile with: "value toList is not a member of io.gatling.core.feeder.RecordSeqFeederBuilder[String]".
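On the compile error: a RecordSeqFeederBuilder is not itself a collection, so it has no toList; in Gatling 2 the builder exposes its underlying records via a records member instead, which is worth checking against your version. Once you have the merged record, building that GET path is plain string work. A hypothetical sketch, assuming chunkSize 5 and the userid0…userid4 key names produced by the index suffixing (sample values made up):

```scala
object UserPathDemo {
  // a merged record as chunkAndMerge would produce it for one chunk of five users
  val merged: Map[String, Any] = Map(
    "userid0" -> "1234", "userid1" -> "1235", "userid2" -> "1236",
    "userid3" -> "1237", "userid4" -> "1238"
  )

  // join the suffixed userid values with ';' into the request path
  def userPath(record: Map[String, Any], size: Int): String =
    (0 until size).map(i => record("userid" + i)).mkString("/user/", ";", "/")
}
```

In the simulation itself you would not build the string by hand: with the feeder in place, Gatling's EL gives you the equivalent as get("/user/${userid0};${userid1};${userid2};${userid3};${userid4}/").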
e.g. csv file:
email,password,userid
****@eadtest.ea.com,123,1234