Each test run generates a report that provides statistics about the test run.

Set the time frame to display in the report

The time frame for which report results are displayed is shown above the report.

You can change the time range using one of these options:

  • Drag, stretch, or shrink the time slider.
  • Change the start and end time counters.

All metrics affected by the time selection are displayed in green.


Add notes to a report section

To add your comments to any section of the report, open the Add notes dialog box from the bottom of the section.

After you have added comments, you can share the report, including the comments, with other people. For details, see Share report results below.


Share report results

Customize which report sections are included in your shared results by selecting or deselecting report sections from the navigation pane.

  • Email the report results to another user by selecting Email report from the (More options) menu.
  • Export the report results to a .docx file by selecting Export to Word from the (More options) menu.

  • Export the report results to a .pdf file by selecting Export to PDF from the (More options) menu.

    1. In the menu bar, open the Notification area.

    2. The notification entry for the export shows whether the export is still in progress or if it has completed.

    3. If it has completed, click the Download button in the notification entry to download the .pdf file.

      Note: The .pdf file is available for download for one week. After this time the export is marked as expired.

    4. Optionally, you can view more details about the export by clicking the arrow to the right of the notification entry.

    Note: You can export a maximum of three .pdf and .csv files simultaneously.


Add a custom snapshot of a dashboard widget to the report

Save a snapshot of a widget for a specific filter, split, and time period, and display it in the report for the run.

Add a snapshot from the Dashboard

  1. Navigate to Results > Dashboard for a specific run.
  2. Zoom to a specific time frame in the test run.
  3. Configure what data is displayed in a widget.
  4. Save the snapshot from the widget header.

In this example, the LG alerts widget is first added to the dashboard, and then added to the report as a custom snapshot.

Preview the snapshot in the report

  1. Navigate to Results > Report for a specific run.
  2. In the navigation pane, select Custom Snapshots.


Note the following limits:

  • You can add a maximum of 5 snapshots per report.
  • If your test has more than 1000 transactions (a large-scale test), a snapshot cannot contain more than 200 data rows.
  • If your test has fewer than 1000 transactions, a snapshot cannot contain more than 600 data rows.
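The row limits above can be summarized in a small helper. This is an illustrative sketch of the rule as stated, not a product API; the behavior at exactly 1000 transactions is not specified in the documentation.

```python
def snapshot_row_limit(transaction_count: int) -> int:
    """Maximum data rows allowed in one custom snapshot, per the limits above."""
    # More than 1000 transactions is treated as a large-scale test.
    if transaction_count > 1000:
        return 200
    return 600
```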

Configure a snapshot for the report

Action How to
Rename a custom snapshot
  1. Double-click the snapshot name.
  2. Edit the text.
Exclude a snapshot from the report Deselect the custom snapshot in the navigation pane.
Edit the custom snapshot description
  1. Double-click the custom snapshot description.
  2. Edit the text.
Remove a custom snapshot Click the X in the top-right corner of a custom snapshot in the report viewer.


Generate and download logs

To generate and download error logs for your load tests, do the following:

Action How to
Enable logs
  1. Navigate to the Load Test page.
  2. Select a test.
  3. Select the General tab.
  4. Select Enable logs.
Configure your script to generate logs
  1. In VuGen, select VuGen > Runtime Settings > Logs > Log Level and the associated options.

    In TruClient, select Run-time settings > Log > Log level and the associated options.

  2. Set the log level to Standard.
Download the log
  1. Navigate to the Results page.
  2. Click the button on the run you want to analyze.
  3. Click the (More options) menu and select Download logs.
  Note: Logs are supported only for load tests run in the cloud. The maximum log file size is 110 MB.
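As a quick sanity check after downloading, the 110 MB size limit can be verified locally. This is a minimal sketch; it assumes binary megabytes, and the file path in the usage comment is a placeholder.

```python
import os

MAX_LOG_BYTES = 110 * 1024 ** 2  # 110 MB limit, assuming binary megabytes

def within_log_limit(size_bytes: int) -> bool:
    """Return True if a log of this size is within the 110 MB limit."""
    return size_bytes <= MAX_LOG_BYTES

# Example usage with a downloaded file (placeholder path):
# print(within_log_limit(os.path.getsize("downloaded_logs.zip")))
```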


Stream script errors to Splunk

You can stream script errors to a Splunk Cloud system during a test run. To stream script errors to Splunk, do the following:

Action How to
Configure your Splunk account

In your Splunk account, configure the following for the HTTP Event Collector.

  1. Create a new token
    1. Enter a name for the HTTP Event Collector.
    2. Set the Source type to _json.

    When the new token is created, note the token value that is displayed (you will need this when configuring StormRunner Load).

  2. Enable the token

    Note the HTTP port number that is displayed (you will need this when configuring StormRunner Load).

For details on configuring the HTTP Event Collector, refer to the Splunk documentation.
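Before entering the details in StormRunner Load, you can sanity-check the token and port by posting a test event to the HTTP Event Collector. The sketch below only builds the request; the host, port, and token values are placeholders, and the standard HEC event endpoint path is assumed.

```python
import json

def build_hec_request(host: str, port: int, token: str, event: dict):
    """Build the URL, headers, and JSON body for a Splunk HEC event post."""
    url = f"https://{host}:{port}/services/collector/event"
    headers = {
        "Authorization": f"Splunk {token}",  # token value noted during HEC setup
        "Content-Type": "application/json",
    }
    # Source type _json matches the token configuration described above.
    body = json.dumps({"event": event, "sourcetype": "_json"})
    return url, headers, body

# Placeholder values -- replace with your Splunk Cloud host, port, and token:
url, headers, body = build_hec_request(
    "example.splunkcloud.com", 8088, "REPLACE-WITH-TOKEN",
    {"message": "HEC connectivity check"},
)
```

Sending this request over HTTPS (for example, with urllib.request) typically returns a JSON acknowledgment from Splunk when the token and port are correct.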

Configure your Splunk account details in StormRunner Load

In StormRunner Load:

  1. Navigate to Menu bar > Your user name > Splunk account.
  2. In the dialog box that opens, configure:
    • Your Splunk Cloud URL that was sent to you by Splunk when you created your Splunk instance.
    • The HTTP Event Collector port number.
    • If your account is a managed Splunk Cloud account (rather than a self-service account), select the Managed Splunk check box.
    • The HTTP Event Collector token.

  3. Click Apply.
Enable script error streaming for a load test
  1. Navigate to the Load Test page.
  2. Select a test.
  3. Select the General tab.
  4. Under Enable logs, select the Stream script errors to Splunk check box.

Notes and Limitations

  • Streaming script errors to Splunk is enabled only for tests run in the cloud.
  • You can stream script errors to a Splunk Cloud account only.
  • Only one Splunk account can be configured for a StormRunner Load tenant.
  • During a load test run, only the first 500,000 script errors are sent to the Splunk account.


Learn about report results sections

The following table shows the various sections available in load test reports.

Section Description
Report summary The report summary provides an overview of the test run, including duration, status, and important performance statistics such as average throughput.
  Note: For custom widgets, the average value may differ due to a difference in granularity and the number of data points.
Scripts The scripts overview section lists all the scripts that were included in the test definition, and their configuration.
Mobile Devices The mobile devices overview section lists all the devices and device details that were included in the test definition.
Scripts Distribution The Scripts Distribution section displays how the test Vusers were distributed geographically during the test run.
Transactions The Transactions section provides detailed statistics on each transaction in your test. The list of transactions can be sorted by any column.
  If you selected General Settings > Group Transactions when you configured the load test, measurements for the transaction groups are also displayed.
Hits per second The number of hits (HTTP requests) to the Web server per second.
Throughput The amount of data received from the server every second.
Total Passed Transactions The number of transactions that passed during the load test run.
Total Failed Transactions The number of transactions that failed during the load test run.
Transaction response time [Average] The average amount of time it took transactions to complete.
Errors Summary A list of errors encountered by Vusers during the load test run.
SLA Results The SLA Results section provides detailed statistics for each SLA in your test.
Additional Vusers This section provides details for each time there was a change in load during the test run.
Custom Snapshots This section includes all the custom snapshots added from the dashboard.
Monitors This section provides an overview of statistics collected for each monitor, either server or application, during the test run.

Notes and Limitations

Reports support up to 2000 transactions. If you have more than 2000 transactions, you may encounter performance issues in the Reports UI or in the Export to PDF or Word functionality.
