How to work around GitHub's 100MB file size limit in Gatling?

I have a 400MB CSV file that I use as a feeder.
When I compress it and store it in GitHub, the archive is still over 100MB, which GitHub does not allow.
I tried other compression methods to get it below 100MB, but none of them worked.
Next I split the file into two zip files of about 50MB each and used BatchableFeederBuilder, but there is no provision to concatenate/append the two feeders so that I can load them into the scenario.
Next I tried using a Seq[Map] to load the records into memory. It worked locally to some extent, but when running through the Enterprise version it goes out of memory, which I understand, since it loads all the records into memory, which is too much.
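For reference, that in-memory attempt looked roughly like the sketch below (the file name is a placeholder); readRecords eagerly materializes every row, which is why a 400MB file exhausts the heap:

import io.gatling.core.Predef._

// sketch of the failed in-memory approach; the file name is a placeholder
// readRecords eagerly loads every CSV row onto the heap
val allRecords: Seq[Map[String, Any]] = csv("data/agents-perf.csv").readRecords
// an in-memory Array of Maps can itself be used as a circular feeder
val inMemoryFeeder = allRecords.toArray.circular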

Is there any solution for this?
I want to split the file and use BatchableFeederBuilder to join the two feeders.

import io.gatling.core.Predef._
import io.gatling.core.feeder.BatchableFeederBuilder

def GATestFeeder(): BatchableFeederBuilder[String] = {
  val agentPerfZipFiles: List[String] = List(
    "data/agents-perf-1.zip",
    "data/agents-perf-2.zip"
  )

  var records: BatchableFeederBuilder[String] = null
  for (fileName <- agentPerfZipFiles) {
    val csvData: BatchableFeederBuilder[String] = csv(fileName).unzip.batch.circular
    if (csvData != null) {
      // each pass overwrites records, so only the last file's feeder is kept
      records = csvData
    }
  }
  records
}

I want to use an else branch to append the records, so that the records variable contains the records from both feeders. Is that possible, or has anyone tried it?

No, it’s not possible to merge feeders.

But there are plenty of other solutions, e.g.:

  • use Git LFS, where the file size limit is 2GB.
  • download the file from an external file server, e.g. AWS S3 or GitHub Releases (see the sketch below).
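For the second option, something like the sketch below fetches the file once at start-up instead of committing it to the repository. The URL and paths here are just placeholders:

import java.net.URI
import java.nio.file.{Files, Paths, StandardCopyOption}

object FeederDownload {
  // placeholder GitHub Releases asset URL
  private val url = "https://github.com/your-org/your-repo/releases/download/v1.0/agents-perf.zip"

  // download the zip once and return its local path for use with csv(...).unzip
  def fetch(): String = {
    val target = Paths.get("data/agents-perf.zip")
    if (!Files.exists(target)) {
      Files.createDirectories(target.getParent)
      val in = new URI(url).toURL.openStream()
      try Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING)
      finally in.close()
    }
    target.toString
  }
}

You would then build the feeder from the downloaded file, e.g. csv(FeederDownload.fetch()).unzip.batch.circular (assuming the path resolves for your Gatling version).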

Thanks for the reply. LFS was also not working last time we checked. We do have the S3 option, but we are keeping it as a last resort due to cost issues and other factors.

I tried the code below and it's working, but I don't know if it is correct.
I am using the Feeder[Any] type, combining the two batchable feeders, and using the result as the feeder in my scenario, as shown below. It works, but I am still doubtful whether it will handle a heavy load. Can you please help me figure out whether I am doing this the wrong way? Should I use only a batchable feeder for large data sets, and not Feeder[Any]?

import io.gatling.core.Predef._
import io.gatling.core.feeder.{BatchableFeederBuilder, Feeder}

def GAFeeder(): Feeder[Any] = {
  val records1: BatchableFeederBuilder[String] = csv("data/agents-perf-1.zip").unzip.batch.circular
  val records2: BatchableFeederBuilder[String] = csv("data/agents-perf-2.zip").unzip.batch.circular
  // build both feeders and chain the resulting iterators
  val records: Feeder[Any] = records1.batch() ++ records2.batch()
  records
}
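For context, this is roughly how I plug it into the scenario (the scenario name, request, and agentId column are placeholders):

import io.gatling.core.Predef._
import io.gatling.http.Predef._

// placeholder scenario showing the combined feeder in use;
// feed() takes the Feeder[Any] returned by GAFeeder() directly
val scn = scenario("GAScenario")
  .feed(GAFeeder())
  .exec(http("agent request").get("/agents/#{agentId}"))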

I tried the code below and it's working, but I don't know if it is correct.

As I wrote above:

No, it’s not possible to merge feeders.

LFS was also not working last time we checked.

No reason it wouldn't work. But you do have some extra installation steps to perform on the Gatling Enterprise server, as explained in the GitHub documentation.

You also have the option of using GitHub Releases to host your file, as I also wrote in my first message.