Testing large file downloads.

I am new to Scala and Gatling. I have gone over the examples and an introduction to Scala. I am trying to determine whether Gatling can be used for tests involving large file downloads. Our current tests hash files (~50 to 200 MiB) as they are downloaded. The file is not saved, but the hash is validated. This will not work if the HTTP request in the Gatling test buffers the entire response before the user can access it; we need to access the response in chunks, as a stream.

Rather than spending a week learning Scala and Gatling, I wanted to find out whether this is possible. If not, I'll try The Grinder. The current load test we use is hand-coded C# that is not run under any tool; I want to move toward an open source tool.

Hi,

We hadn’t thought of this use case, so no, it’s not doable with the current API.
AsyncHttpClient's AsyncHandler provides a callback that hands you a byte array for each chunk as it arrives, so that's probably something that would suit your needs.
If you're not in a hurry and can wait for the next version, we can give it some thought.

Cheers,

Stephane
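As a rough illustration of the kind of per-chunk hashing such an AsyncHandler callback would make possible, here is a minimal self-contained Scala sketch. It uses only the JDK's MessageDigest; the chunk source is simulated rather than a real HTTP response, and the AsyncHandler wiring itself is omitted:

```scala
import java.security.MessageDigest

// Hypothetical per-chunk handler in the spirit of AsyncHandler's
// onBodyPartReceived: each incoming byte-array chunk updates the digest,
// and nothing is retained beyond the current chunk.
object ChunkedHashSketch {
  def hashChunks(chunks: Iterator[Array[Byte]]): String = {
    val md = MessageDigest.getInstance("MD5")
    chunks.foreach(c => md.update(c)) // update per chunk, then discard it
    md.digest().map("%02x".format(_)).mkString
  }

  def main(args: Array[String]): Unit = {
    // Simulate a download arriving in 4 KiB chunks.
    val body    = Array.fill[Byte](10 * 4096)(42)
    val chunked = body.grouped(4096)
    println(hashChunks(chunked))
  }
}
```

Memory use stays bounded by the chunk size, which is the point of hashing on the fly instead of buffering the whole body.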

2012/6/8 Aris Green <aris.green@gmail.com>

Hi there,

Just to let you know that I’ve pushed checksum checks support in master code, with md5 and sha1 built-ins:
https://github.com/excilys/gatling/issues/553

I need to perform further tests, but this should be out in 1.2.2 to be released by the end of the week.

Cheers,

Stephane

2012/6/8 Stéphane Landelle <slandelle@excilys.com>

Awesome! I was just thinking about how to implement this feature, and there it is. I plan to use it.

Stephen

Just to let you know: Gatling 1.2.2 is out, so you can use those new checks now. Just go grab it!

2012/6/13 Stephen Kuenzli <stephen.kuenzli@qualimente.com>

I appreciate all the feedback, and sorry I have not gotten back sooner. I went ahead and started using The Grinder, but I realize that once I understand Scala and Gatling better, Gatling may have more to offer. Essentially, what I needed to do was download a lot of files, hash them as they download, and not save them to disk. I checked the hash values against known-correct values to test for success. So the logic is this: download files from several processes and threads in chunks, updating a hash with each chunk but not saving the chunk. This keeps the whole file from being buffered in memory. I am sure it is possible in Scala; it is just that I already know Python, so Jython was not too hard. In Scala, I would need to do the download and hashing in a loop.

Many thanks,
Aris
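For what it's worth, the chunked download-and-hash loop described above can be sketched in plain Scala using only the JDK. This is a sketch, not Gatling or Grinder code; the in-memory stream below is a stand-in for a real HTTP response stream (for example, one obtained from `url.openStream()`):

```scala
import java.io.{ByteArrayInputStream, InputStream}
import java.security.{DigestInputStream, MessageDigest}

// Read the stream in fixed-size chunks; DigestInputStream updates the MD5
// as bytes pass through, and the chunks themselves are thrown away,
// so the whole file is never held in memory.
object StreamHashSketch {
  def md5OfStream(in: InputStream, chunkSize: Int = 8192): String = {
    val digest = MessageDigest.getInstance("MD5")
    val din    = new DigestInputStream(in, digest)
    val buf    = new Array[Byte](chunkSize)
    try {
      while (din.read(buf) != -1) () // hashing happens as a side effect of reading
    } finally din.close()
    digest.digest().map("%02x".format(_)).mkString
  }

  def main(args: Array[String]): Unit = {
    // Stand-in for an HTTP response body; a real test would open a URL stream here.
    val fake = new ByteArrayInputStream("hello gatling".getBytes("UTF-8"))
    println(md5OfStream(fake))
  }
}
```

Comparing the returned hex string against a known-correct value gives the pass/fail check without ever writing the file to disk.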

Hello Aris,

As I explained it, Gatling now supports checksum checks out of the box:
https://github.com/excilys/gatling/wiki/Checks#wiki-checksum

For example, you can write something like:

http("Downloading file").get("http://my-host/my-path").check(md5.is("0xA59E79AB53EEF2883D72B8F8398C9AC3"))

Of course, the checksum is updated incrementally as each file part is received; it is not computed against a fully buffered body.
Generally speaking, Gatling discards file parts unless you need them for something else, for example checking the whole body with a regex.

So in your case, as you only want to compute the checksum, file parts will indeed be discarded.

Cheers,

Stéphane

2012/6/27 Aris Green <aris.green@gmail.com>

Thanks, that is what I would have needed. I guess I did not see that earlier, though I did see the note about not loading the whole response body into memory.

Hello Aris,

I hope you will have the time to try Gatling someday.
Anyway, this feature was worth asking for and other users are glad to have it, so thank you.

Regards,

Stéphane

2012/6/28 Aris Green <aris.green@gmail.com>

Hello Stéphane,

I am using Gatling to download large files, up to 10 GB, but I noticed I am only able to download files up to about 500 MB; anything over 1 GB gives an error. I am using an sbt project and have allocated 12 GB of memory:

val javaMemoryFlags = List("-Xmx12g")
javaOptions in GatlingKeys.Gatling := GatlingKeys.overrideDefaultJavaOptions(javaMemoryFlags: _*),


Below is the call. As part of the test I need to download the file to disk, but it fails for files larger than about 500 MB. Can you please suggest anything? I am able to upload huge files.