Problem with the Gatling stress tool when running for a long period of time using a CSV feeder

I am running the following Gatling scenario for more than 20 minutes:

val httpConf = httpConfig.baseURL("some url")
val csvFeeder = csv("title.csv").circular

// create exact match scenario
val scenario1 = scenario("some sample scenario")
  .during(1200) {
    feed(csvFeeder)
      .exec(
        http("Search query")
          .get("${csvValue}")
          .check(status.is(200)))
  }

setUp(scenario1.users(100).ramp(5).protocolConfig(httpConf))

When I run the above scenario for more than 20 minutes, Gatling gets
stuck and stops firing requests.

I've just run a similar scenario with 1.5.0 and it went smoothly.
Did you get any exception in the console?

I observed that the heap was full, so I extracted a heap dump using
VisualVM. I found that the Map created by the CSV feeder for each request
stays in memory and is never garbage collected.

Nope, built-in CSV feeders are not GC-ed: the whole file content is
loaded into memory for faster access. The only one whose records could
ever be GC-ed is the default queue one, but how could the circular one
you set up be?
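
To make the difference concrete, here is a rough sketch of the two strategies (illustrative only, not the actual Gatling implementation): a queue feeder hands each record out once, so a consumed record can become unreachable, while a circular feeder wraps around forever and therefore has to keep every record reachable.

import scala.collection.mutable

// queue strategy (illustrative): each record is served exactly once;
// once dequeued, nothing references it any more, so it is GC-eligible
def queueFeeder(records: Seq[Map[String, String]]): Iterator[Map[String, String]] = {
  val queue = mutable.Queue(records: _*)
  new Iterator[Map[String, String]] {
    def hasNext: Boolean = queue.nonEmpty
    def next(): Map[String, String] = queue.dequeue()
  }
}

// circular strategy (illustrative): wraps around indefinitely, so the
// whole record set must stay in memory for the lifetime of the test
def circularFeeder(records: IndexedSeq[Map[String, String]]): Iterator[Map[String, String]] =
  Iterator.continually(records).flatMap(_.iterator)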

I am using Gatling version 1.4.6 with the maximum heap size set to 1 GB.
Please suggest a workaround for the CSV feeder, or correct me if my code
is wrong.

No, memory is not an issue there: the only thing loaded in memory is the
feeder, and it's loaded at startup.
The only thing that would explain your problem is messages being lost.
Could you upgrade to 1.5.0, please?

When I use random instead of circular, there are no memory issues with version 1.4.6. Maybe when I use circular it fills up the whole heap. Anyway, my test case needs random rather than circular.
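
For reference, the only change needed was the feeder strategy (both exist in the Gatling 1.x DSL):

val csvFeeder = csv("title.csv").random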

OMG, you’re absolutely right! We’ve done something very stupid actually.

Will fix ASAP
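
For the curious, one classic Scala mistake that produces exactly this symptom (a rough illustration, not necessarily the exact code we shipped) is building the circular iterator on a memoizing Stream: as long as a reference to the head of the stream is retained, every record ever served stays reachable, so the heap grows with each request.

val records: IndexedSeq[Map[String, String]] =
  Vector(Map("csvValue" -> "/a"), Map("csvValue" -> "/b"))

// leaky (illustrative): the `leaky` val pins the head of the Stream,
// and Streams memoize, so every cell ever produced stays reachable
val leaky: Stream[Map[String, String]] =
  Stream.continually(records.toStream).flatten
val leakyIterator = leaky.iterator

// safe: a plain Iterator wraps around without remembering past elements
val safeIterator: Iterator[Map[String, String]] =
  Iterator.continually(records).flatMap(_.iterator)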

Fixed, thanks a lot for reporting!
https://github.com/excilys/gatling/issues/1139