Each test run generates a report that provides statistics about the test run.
Set the time frame to display in the report
The time frame for which report results are displayed is shown above the report.
You can change the time range using one of these options:
- Drag, stretch, or shrink the time slider.
- Change the start and end time counters.
All metrics affected by the time selection are displayed in green.
Add notes to a report section
To add your comments to any section of the report, select at the bottom of the section to open the Add notes dialog box.
After you have added comments, you can share the report, including the comments, with other people. For details, see Share report results.
Customize which report sections are included in your shared results by selecting or deselecting report sections from the navigation pane.
- Email the report results to another user by selecting Email report from the (More options) menu.
- Export the report results to a .docx file by selecting Export to Word from the (More options) menu.
- Export the report results to a .pdf file by selecting Export to PDF from the (More options) menu.
In the menu bar, click to open the Notification area.
The notification entry for the export shows whether the export is still in progress or if it has completed.
If it has completed, click the Download button in the notification entry to download the .pdf file.
Note: The .pdf file is available for download for one week. After this time the export is marked as expired.
Optionally, you can view more details about the export by clicking the arrow to the right of the notification entry.
Note: You can export a maximum of three .pdf and .csv files simultaneously.
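The one-week download window described above can be sketched as a simple expiry check. This is an illustrative model only; the function name and timestamp handling are assumptions, not part of the product.

```python
from datetime import datetime, timedelta

# Documented retention: an exported file can be downloaded for one week,
# after which the export is marked as expired.
EXPORT_TTL = timedelta(weeks=1)

def is_export_expired(exported_at, now):
    """Return True once the export's one-week download window has passed."""
    return now - exported_at > EXPORT_TTL

done = datetime(2024, 1, 1)
print(is_export_expired(done, datetime(2024, 1, 5)))  # False
print(is_export_expired(done, datetime(2024, 1, 9)))  # True
```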
Add a custom snapshot of a dashboard widget to the report
Save a snapshot of a widget for a specific filter, split and time period and display it in the report for the run.
Add a snapshot from the Dashboard
- Navigate to Results > Dashboard for a specific run.
- Zoom to a specific time frame in the test run.
- Configure what data is displayed in a widget.
- Click in the widget header to save the snapshot.
In this example, the Errors widget is customized, added to the dashboard, and then added as a custom report.
- Navigate to Results > Report for a specific run.
- In the navigation pane, select Custom Snapshots.
- Add a maximum of 5 snapshots per report.
- If your test has more than 1000 transactions (a large-scale test), a snapshot cannot contain more than 200 data rows.
- If your test has fewer than 1000 transactions, a snapshot cannot contain more than 600 data rows.
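The two limits above can be expressed as a single rule. A minimal sketch, assuming the 600-row limit applies at exactly 1000 transactions (the documentation does not state the boundary case):

```python
def max_snapshot_rows(transaction_count):
    """Data-row cap per custom snapshot, per the documented limits.

    More than 1000 transactions (a large-scale test) caps a snapshot at
    200 rows; otherwise the cap is 600 rows. The behavior at exactly
    1000 transactions is an assumption.
    """
    return 200 if transaction_count > 1000 else 600

print(max_snapshot_rows(1500))  # 200
print(max_snapshot_rows(800))   # 600
```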
Configure a snapshot for the report
| Task | Steps |
|------|-------|
| Rename a custom snapshot | |
| Exclude a snapshot from the report | Deselect the custom snapshot in the navigation pane. |
| Edit the custom snapshot description | |
| Remove a custom snapshot | Click the X in the top right corner of a custom snapshot in the report viewer. |
To generate and download error logs for your load tests, do the following:

| Task | Steps |
|------|-------|
| Configure your script to generate logs | |
| Download the log | |
- Logs are only supported for load tests run on the cloud.
- A log file cannot exceed 110 MB.
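A downloaded log can be checked against the documented size cap before processing. A minimal sketch; the function name is ours, and whether the limit is binary (MiB) or decimal megabytes is an assumption:

```python
# Documented cap: a log file cannot exceed 110 MB (binary MiB assumed here).
MAX_LOG_BYTES = 110 * 1024 * 1024

def within_log_limit(size_bytes):
    """Return True if a log of `size_bytes` is within the documented cap."""
    return size_bytes <= MAX_LOG_BYTES

print(within_log_limit(50 * 1024 * 1024))   # True
print(within_log_limit(120 * 1024 * 1024))  # False
```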
You can stream script errors to a Splunk Cloud system during a test run. To stream script errors to Splunk, do the following:

| Step | Details |
|------|---------|
| Configure your Splunk account | In your Splunk account, configure the following for the HTTP Event Collector. For details on configuring the HTTP Event Collector, refer to the Splunk documentation. |
| Configure your Splunk account details in StormRunner Load | In StormRunner Load: |
| Enable script error streaming for a load test | |
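For context on what the HTTP Event Collector expects, the sketch below builds a HEC request. The endpoint path and `Authorization: Splunk <token>` header format follow Splunk's HEC documentation; the host, token, and event field names are placeholders, and this is not StormRunner Load's actual payload.

```python
import json

def build_hec_event(token, error_message, source="stormrunner-load"):
    """Build the URL, headers, and body for a Splunk HTTP Event Collector call.

    The endpoint and auth header format come from Splunk's HEC docs; the
    host placeholder and event structure are illustrative assumptions.
    """
    url = "https://<your-splunk-host>:8088/services/collector/event"
    headers = {"Authorization": f"Splunk {token}"}
    body = json.dumps({"event": {"error": error_message}, "source": source})
    return url, headers, body

url, headers, body = build_hec_event(
    "00000000-0000-0000-0000-000000000000", "Transaction failed"
)
print(headers["Authorization"].startswith("Splunk "))  # True
```

In a real integration the body would be POSTed to the URL with an HTTP client; that step is omitted here since the host and token are placeholders.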
Notes and Limitations
- Streaming script errors to Splunk is enabled only for tests run in the cloud.
- You can stream script errors to a Splunk Cloud account only.
- Only one Splunk account can be configured for a StormRunner Load tenant.
- During a load test run, only the first 500,000 script errors are sent to the Splunk account.
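The 500,000-error cap noted above amounts to truncating the error stream. A minimal sketch of that behavior, with names of our own choosing:

```python
# Documented limit: only the first 500,000 script errors per run are streamed.
MAX_STREAMED_ERRORS = 500_000

def stream_errors(errors, cap=MAX_STREAMED_ERRORS):
    """Yield at most `cap` errors, mirroring the documented per-run limit."""
    for i, err in enumerate(errors):
        if i >= cap:
            break
        yield err

sent = list(stream_errors(range(10), cap=3))
print(sent)  # [0, 1, 2]
```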
You can configure StormRunner Load so that you can open a defect in ALM Octane directly from a StormRunner Load report.
Configure ALM Octane in StormRunner Load
To configure StormRunner Load to work with ALM Octane:
Navigate to Menu bar > Your user name > ALM Octane account.
In the dialog box that opens, enter the URL to the home page of your Octane account.
Create a defect from StormRunner Load
To open a defect in ALM Octane directly from a StormRunner Load report:
In the report, click the (More options) menu and select Open defect (ALM Octane).
- In the dialog box that opens, enter the following details for the defect:
Learn about report results sections
The following table shows the various sections available in load test reports.
The report summary provides an overview of the test run including duration, status, and important performance statistics such as average throughput.
Note: For custom widgets, the average value may be different due to a difference in granularity and the number of data points.
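The granularity effect mentioned in the note can be seen numerically: averaging coarse, unequally weighted buckets gives a different result than averaging the raw samples. An illustrative example with made-up numbers:

```python
def mean(xs):
    return sum(xs) / len(xs)

# Made-up raw per-second samples.
raw = [10, 10, 10, 10, 90, 90]

# The same data downsampled into two unequal buckets, as a coarser
# widget granularity might show it.
buckets = [mean(raw[:4]), mean(raw[4:])]  # [10.0, 90.0]

print(round(mean(raw), 2))  # 36.67 -- average over all raw samples
print(mean(buckets))        # 50.0  -- averaging the bucket averages
```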
The scripts overview section lists all the scripts that were included in the test definition, and their configuration.
| Section | Description |
|---------|-------------|
| Mobile Devices | The mobile devices overview section lists all the devices and device details that were included in the test definition. |
| Scripts Distribution | The Scripts Distribution section displays how the test Vusers were distributed geographically during the test run. |
The Transactions section provides detailed statistics on each transaction in your test. The list of transactions can be sorted by any column.
If you selected General Settings > Group Transactions when you configured the load test, measurements for the transaction groups are also displayed.
Transactions section columns
The following data is displayed for transactions:
| Column | Description |
|--------|-------------|
| Hits per second | The number of hits (HTTP requests) to the Web server per second. |
| | The amount of data received from the server every second. |
| Total Passed Transactions | The number of transactions that passed during the load test run. |
| Total Failed Transactions | The number of transactions that failed during the load test run. |
| Transaction response time [Average] | The average amount of time it took the defined percentile of transactions to complete. |
A list of errors encountered by Vusers during the load test run.
Errors Summary columns
The following data is displayed for errors:
The SLA Result section provides detailed statistics for each SLA in your test.
SLA Result columns
The following data is displayed for SLAs:
This section provides details for each time there was a change in load during the running of a test.
Additional Vusers columns
The following data is displayed for additional Vusers:
|Custom Snapshots||This section includes all the custom snapshots added from the dashboard.|
This section provides an overview of statistics collected for each monitor, either server or application, during the test run.
The following data is displayed for monitors:
Notes and Limitations
Reports support up to 2000 transactions. If you have more than 2000 transactions, you may encounter performance issues in the Reports UI or in the Export to PDF or Word functionality.