I am wondering how I can get the overall page load time for our website using Gatling 2. I have a pretty simple/basic script, where I am trying a simple GET for the homepage. Our homepage has a lot of CSS, JS & dynamically generated content. I have done some baselining using WebPageTest, but the user limit there is 10 and we want to test with a much higher number. I have attached a screenshot of the results and I can’t tell which number I should be looking at — these numbers do not match WebPageTest. Please let me know.
If you wrap your requests in a group, the report will show the cumulative response time for everything in that group, which is the closest Gatling gets to an overall HTTP time for the page.
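A minimal sketch of what that looks like in Gatling 2’s Scala DSL — the simulation name, group name, URL and user count are all placeholders, not taken from your script:

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class HomepageSimulation extends Simulation {

  // Placeholder base URL — substitute your site.
  val httpConf = http.baseURL("http://example.com")

  val scn = scenario("Homepage load")
    // Everything inside the group is reported under one
    // "Homepage" entry with its cumulative response time.
    .group("Homepage") {
      exec(http("GET /").get("/"))
    }

  setUp(scn.inject(atOnceUsers(10))).protocols(httpConf)
}
```

In the generated report you then look at the group’s row rather than the individual requests.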
Gatling is not a browser; it only measures what happens on the wire, so it doesn’t run your JavaScript, apply your CSS styles, update your DOM, redraw and paint, etc.
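What you can do is have Gatling at least fetch the page’s static resources (CSS, JS, images) the way a browser would, using `inferHtmlResources()` on the HTTP protocol. A sketch, with a placeholder URL — this still measures only network time, not rendering:

```scala
// inferHtmlResources() makes Gatling parse the returned HTML and
// fetch the embedded resources it finds, in parallel, like a browser.
val httpConf = http
  .baseURL("http://example.com") // placeholder
  .inferHtmlResources()
```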
The number you should be looking at is the 99th percentile response time.
That number needs to stay below 1 second for the complete set of HTTP requests involved in a single user click, if you don’t want users to experience the site as sluggish.
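To make the statistic concrete, here is a toy nearest-rank percentile computation over made-up response times in milliseconds — the point being that a single slow outlier in 100 samples is exactly what the 99th percentile is designed to surface (or, here, just barely miss):

```scala
// Nearest-rank percentile over a sorted sequence of samples.
def percentile(sorted: Seq[Double], p: Double): Double = {
  val idx = math.ceil(p / 100.0 * sorted.size).toInt - 1
  sorted(math.max(idx, 0))
}

// 99 fast responses and one slow outlier (fabricated data).
val times = Seq.fill(99)(200.0) ++ Seq(1500.0)
val p99   = percentile(times.sorted, 99)
println(p99) // prints 200.0 — the outlier sits just past the 99th rank
```

If your report’s 99th percentile is over 1 second, at least 1 in 100 requests is that slow, and a page made of many requests will hit one of them often.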
Unless those measurements are listing nanoseconds instead of milliseconds (I doubt it), those results indicate the site you’re testing has a bit of a problem.
Not only is the 99th percentile exceeding that 1 second limit in a variety of cases, it’s doing so for static content such as .jpg files. That means the total response time for the site loading those images will be higher. Potentially much higher.
That being said, I would apply a healthy dose of scepticism until you know that the load you’re putting on there is realistic. Got any graphs for that test?