How to fetch data from a Gatling report and put it into a CSV?

Is there any way in Gatling open source to fetch the data from the Gatling report and put it into a CSV file?

I tried to write a Python script that parses the index.html report generated by Gatling once the test execution completes. The problem is that the data in index.html is filled in by JavaScript, so it is not present in the HTML itself. I used bs4 to scrape it, but that didn't work.

Can anyone please help me out?

My code:

import csv
import io
import zipfile

from bs4 import BeautifulSoup


def read_zip_file(zip_file_path, zip_file_path2, html_file):
    # The outer zip contains another zip, which in turn contains index.html
    with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
        with, 'r') as zip_ref2:
            inner_zip_data =
            inner_zip = zipfile.ZipFile(io.BytesIO(inner_zip_data))
            with as f:
                html_content ='utf-8')
                soup = BeautifulSoup(html_content, "html.parser")
                containers = soup.find_all(id="container_statistics_body")
                td_element = soup.select_one('td.value.ok.col-3')
                # with open("table.csv", "w", newline="") as csvfile:
                #     writer = csv.writer(csvfile)
                #     writer.writerow(columns)

zip_file_path = ""
zip_file_path2 = ""
html_file = 'aggregated-results/aggregated/index.html'


You can try to parse simulation.log, but keep in mind that this file is internal and can change without notice.
A better way is to send the metrics over the Graphite protocol and process them however you want: Gatling - Realtime Monitoring

Is this something we can do in open source?
What I want is that once the test execution completes, all the data shown on index.html is extracted to a CSV file.

I know you can, why do you ask? :slight_smile:
