Each test run generates a report that provides statistics about the test run. This topic describes LoadRunner Cloud's former report. For the new report interface, see Report.
Where do I find it?
- Navigate to the Load Tests tab, select a test, and click to open the Runs page.
- Navigate to the Results page.
- For a selected test run, click the vertical ellipsis (more options) and select Report.
- Switch off the New Dashboard & Report toggle.
Note: If the New Dashboard & Report toggle is switched on, the new report is displayed. For details on the new report, see Report.
Set the time frame to display in the report
The time frame for which report results are displayed is shown above the report.
You can change the time range using one of these options:
- Drag, stretch, or shrink the time slider.
- Change the start and end time counters.
All metrics affected by the time selection are displayed in green.
When viewing a report, you can add notes to the test run as a whole, and to individual report sections.
To add a note to the test run:
Click the Note icon under the list of the report sections on the left. In the dialog box that opens, enter your note and save it.
To add a note to a specific report section:
Click the Add notes icon at the bottom of the section. In the dialog box that opens, enter your note and save it.
After you have added notes to report sections, you can share the report, including the notes, with other people. For details, see Share report results.
For details on exporting report results in different formats and for sending report results by email, see Run results actions.
Add a custom snapshot of a dashboard widget to the report
Save a snapshot of a widget for a specific filter, split, and time period, and display it in the report for the run.
Add a snapshot from the Dashboard
- Navigate to Results > Dashboard for a specific run.
- Zoom to a specific time frame in the test run.
- Configure what data is displayed in a widget.
- Click in the widget header to save the snapshot.
- Navigate to Results > Report for a specific run.
- In the navigation pane, select Custom Snapshots.
- You can add a maximum of five snapshots per report.
- If your test has more than 1,000 transactions (a large-scale test), a snapshot cannot contain more than 200 data rows.
- If your test has fewer than 1,000 transactions, a snapshot cannot contain more than 600 data rows.
Configure a snapshot for the report
|Rename a custom snapshot||
|Exclude a snapshot from the report||Deselect the custom snapshot in the navigation pane.|
|Edit the custom snapshot description||
|Remove a custom snapshot||Click the X in the top right corner of a custom snapshot in the report viewer.|
The following table shows the various sections available in load test reports.
The report summary provides an overview of the test run including duration, status, and important performance statistics such as average throughput.
Note: For custom widgets, the average value may be different due to a difference in granularity and the number of data points.
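The difference the note describes comes from how values are aggregated. A minimal generic sketch (the sample values and the five-second bucket size are hypothetical, not LoadRunner Cloud internals):

```python
# Hypothetical raw samples: (timestamp_seconds, value) pairs.
# The intervals below contain unequal numbers of samples on purpose.
samples = [(0, 10), (1, 10), (2, 10), (3, 10),  # 4 samples in interval [0, 5)
           (5, 50)]                              # 1 sample in interval [5, 10)

# Average over all raw data points (fine granularity).
raw_avg = sum(v for _, v in samples) / len(samples)

# Average of per-interval averages (5-second granularity), as a
# coarser widget might compute it.
buckets = {}
for t, v in samples:
    buckets.setdefault(t // 5, []).append(v)
interval_avgs = [sum(vs) / len(vs) for vs in buckets.values()]
coarse_avg = sum(interval_avgs) / len(interval_avgs)

print(raw_avg)     # 18.0 -> each sample weighted equally
print(coarse_avg)  # 30.0 -> each interval weighted equally
```

Averaging pre-aggregated per-interval averages weights every interval equally, while averaging the raw samples weights every data point equally, so the two means diverge whenever intervals contain different numbers of points.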
The scripts overview section lists all the scripts that were included in the test definition, and their configuration.
|Mobile Devices||The mobile devices overview section lists all the devices and device details that were included in the test definition.|
|Scripts Distribution||The Scripts Distribution section displays how the test Vusers were distributed geographically during the test run.|
The Transactions section provides detailed statistics on each transaction in your test. The list of transactions can be sorted by any column.
If you selected General Settings > Group Transactions when you configured the load test, measurements for the transaction groups are also displayed.
Transactions section columns
The following data is displayed for transactions:
|Hits per second||The number of hits (HTTP requests) to the Web server per second.|
|Throughput||The amount of data received from the server every second.|
|Total Passed Transactions||The number of transactions that passed during the load test run.|
|Total Failed Transactions||The number of transactions that failed during the load test run.|
|Transaction response time [Average]||The average amount of time it took transactions to complete.|
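The per-transaction columns can be sketched with a few lines of generic code. The sample data, the `transaction_stats` helper, and its field names are illustrative assumptions, not LoadRunner Cloud's implementation:

```python
# Hypothetical transaction results: (name, duration_seconds, passed).
results = [
    ("login",    1.2, True),
    ("login",    1.8, True),
    ("login",    4.0, False),
    ("checkout", 2.5, True),
]

def transaction_stats(name, rows):
    """Derive the report-style columns for a single transaction name."""
    durations = [d for n, d, _ in rows if n == name]
    passed = sum(1 for n, _, ok in rows if n == name and ok)
    failed = sum(1 for n, _, ok in rows if n == name and not ok)
    return {
        "total_passed": passed,
        "total_failed": failed,
        "avg_response_time": sum(durations) / len(durations),
    }

print(transaction_stats("login", results))
# total_passed: 2, total_failed: 1, avg_response_time: ~2.33
```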
A list of errors encountered by Vusers during the load test run.
Errors Summary columns
The following data is displayed for errors:
|On-premises Load Generator Scripts||The table lists which scripts run on which on-premises load generators.|
|On-premises Load Generator Script Errors||The table lists the top three errors for each script that occurred on on-premises load generators during the test run. Note: This section is only available for scripts run on on-premises load generators version 3.3 or later.|
The SLA Result section provides detailed statistics for each SLA in your test.
SLA Result columns
The following data is displayed for SLAs:
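As an illustration of how a percentile-based SLA status could be evaluated, here is a hedged sketch. The nearest-rank percentile method, the 90th-percentile target, and the threshold value are assumptions for the example, not LoadRunner Cloud's documented algorithm:

```python
# Hypothetical SLA check: pass if the 90th percentile of transaction
# response times is at or below a target threshold.
def percentile(values, p):
    """Nearest-rank percentile: the smallest value with at least
    p percent of the data at or below it."""
    ordered = sorted(values)
    rank = max(1, -(-len(ordered) * p // 100))  # ceil(n * p / 100)
    return ordered[rank - 1]

def sla_result(durations, p, threshold):
    """Return a report-style SLA row: measured value, target, status."""
    measured = percentile(durations, p)
    return {
        "measured": measured,
        "threshold": threshold,
        "status": "Passed" if measured <= threshold else "Failed",
    }

durations = [0.8, 1.1, 1.3, 1.5, 2.0, 2.2, 2.4, 2.9, 3.1, 6.0]
print(sla_result(durations, 90, threshold=4.0))
# measured: 3.1, threshold: 4.0, status: Passed
```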
This section provides details for each time the load changed during a test run.
Additional Vusers columns
The following data is displayed for additional Vusers:
|Custom Snapshots||This section includes all the custom snapshots added from the dashboard.|
This section provides an overview of statistics collected for each monitor, either server or application, during the test run.
You can also include Windows on-premises load generators in the monitors displayed in this section. To enable this feature, open a support ticket.
The following data is displayed for monitors:
The table lists the pause and resume schedule actions that occurred during the test run. For information about pausing a test, see Pause scheduling.
Pause Scheduling columns
The following data is displayed for pause scheduling:
Notes and limitations
Reports support up to 5,000 transactions. If you have more than 5,000 transactions, you may encounter performance issues in the Reports UI or in the Export to PDF or Word functionality.