Manage test runs and results

This section describes how to view and manage test results and related actions for test runs.

Collate run data

When you run a test, by default all the run data is stored locally on each load generator. After the performance test run, the results must be gathered into one location so that they can be processed to generate analysis data. This process is known as data collation.

You can set LoadRunner Enterprise to collate the run data automatically, either during runtime or as soon as the run is complete (see Configure collate options). Alternatively, you can collate the run data manually after the run (see Post Run Action). This way, you can save and close a test, and collate the data at a later stage.

In addition to collating the results data, LoadRunner Enterprise can collate data from Network Virtualization and from the log files. After the data is successfully collated, these files are deleted from the load generators from which they were gathered.

After LoadRunner Enterprise collates the results, you can analyze the results using different tools. For details, see View and analyze test results.
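
If you automate runs outside the user interface, collation can also be triggered in scripts. The following Python sketch is illustrative only: the authentication-point URL follows the documented LoadRunner Enterprise REST pattern, but the collate action path, server name, run ID, and credentials are assumptions to verify against the REST API reference for your LRE version.

    # Illustrative sketch: triggering collation for a finished run via REST.
    # The authentication-point URL follows the documented LRE pattern; the
    # "collate" action path below is an ASSUMPTION -- verify it in the REST
    # API reference for your LRE version.
    import requests

    SERVER, DOMAIN, PROJECT = "https://lre-server", "DEFAULT", "MyProject"  # hypothetical
    RUN_ID = 1042                                                           # hypothetical

    session = requests.Session()
    session.get(f"{SERVER}/LoadTest/rest/authentication-point/authenticate",
                auth=("user", "password")).raise_for_status()

    # Hypothetical collate action on the run resource.
    resp = session.post(f"{SERVER}/LoadTest/rest/domains/{DOMAIN}"
                        f"/projects/{PROJECT}/Runs/{RUN_ID}/collate")
    resp.raise_for_status()
    print("Collate request accepted:", resp.status_code)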

Back to top

View test run details

To view runs for a LoadRunner Enterprise test:

  1. In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts / Runs.

  2. Select a performance test in the test management tree, and open the Runs tab.

If the test is running, click the ID link to display the test in the Run Dashboard. For details, see View running tests. If the test run has finished, click the ID link to open the test in the offline view.

    To filter, sort, or group runs displayed in the runs grid, see Personalize your display.

    The Runs grid displays the following information (to read the same fields programmatically, see the sketch at the end of this section):

    ID: The result file ID. If the test has finished, click the ID link to display the results in the Results page. For details, see View finished test run results offline.
    Name: The name of the test run. If the test is running, click the name link to display the test in the Run Dashboard. For details, see View running tests.
    State: The current state of the selected test run. If the test is running, click the state link to display the test in the Run Dashboard. For details, see View running tests.
    SLA Status: The SLA status of the run:
      • Passed. Indicates a passed SLA status.
      • Failed. Indicates a failed SLA status.
      • N/A. Indicates that there is no data about the SLA status.
      • Not Completed. Indicates that the collating results process is not finished.
    Test Name: The name of the test. Click the Test Name link to display the test in the Performance Test Designer. For details, see Performance Test Designer.
    Start/End Date: The start and end date and time of the test run.
    Last Modified: Enables you to filter the runs grid by the last modified date. Click the drop-down arrow to apply a filter.
    Test ID: The test run ID.
    Test Instance ID: The ID of the test instance linked to the timeslot.
    Tester: The name of the user running the test.
    Duration: The time, in minutes, that the test took to run.
    Test Set Name: The name of the test set.
    Test Set ID: The test set ID.
    Timeslot: The ID of the test run timeslot. Click the Timeslot link to open the test in the Timeslots page. For details, see View or edit a reserved timeslot.
    Controller: The name of the Controller machine used in the test run.
    Load Generators: The names of the load generator machines used in the test run.
    Vusers Involved: The number of Vusers that were initialized at least once during the run.
    Vusers Max: The maximum number of concurrently running Vusers during the run.
    Vusers Avg: The average number of concurrently running Vusers during the run.
    Errors: The total number of errors during the test run.
    TX Passed: The total number of passed transactions during the test run.
    TX Failed: The total number of failed transactions during the test run.
    Hits/Seconds: The total number of hits per second during the test run.
    VUFDs Amount: The number of VUFDs involved in the run.
    Analysis Template: Indicates whether Analysis Templates were used in the test run.
    Using VUFDs: Indicates whether VUFDs were used in the test run.
    Comments: Comments added to the test run.
    Jenkins Job Name: The name of the Jenkins job, if you run tests as part of a Jenkins continuous integration process.
    Operating System: The operating system of the machine that runs the job.
    Test Description: A description of the test.
  3. To display the above details about a run, select a row in the runs grid, and click the Details tab.

    To add comments to a test run, click the Comments button, and enter text as required.

    Note: Comments cannot contain any of the following characters: | = !

  4. To display event logs for a selected test run, select a row in the runs grid, and click the Event Log tab. For details, see View test run event logs.

  5. To download (or upload) result files for a selected test run, select a row in the runs grid, and click the Results tab. For details, see Download and upload result files.
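
The fields shown in the Runs grid can also be read programmatically. A minimal Python sketch follows: the Runs resource path matches the documented LoadRunner Enterprise REST pattern, but the server, credentials, and JSON field names are assumptions to verify against the REST API reference for your version.

    # Illustrative sketch: listing run details (the fields shown in the
    # Runs grid) via REST. Verify the resource path and response field
    # names against the REST API reference for your LRE version.
    import requests

    SERVER, DOMAIN, PROJECT = "https://lre-server", "DEFAULT", "MyProject"  # hypothetical

    session = requests.Session()
    session.get(f"{SERVER}/LoadTest/rest/authentication-point/authenticate",
                auth=("user", "password")).raise_for_status()

    runs = session.get(
        f"{SERVER}/LoadTest/rest/domains/{DOMAIN}/projects/{PROJECT}/Runs",
        headers={"Accept": "application/json"})
    runs.raise_for_status()

    for run in runs.json().get("Runs", []):   # response shape is an assumption
        print(run.get("ID"), run.get("RunState"), run.get("Duration"))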

Back to top

Manage test results

You can manage results and related actions for test runs.

To access:

  • In the top banner, click the module name or the dropdown arrow and select Test Management > Runs. The Runs grid opens.

  • Select Test Management > Tests & Scripts. In the test management tree, select a performance test and click the Runs tab.

You can perform the following actions from the Runs toolbar or context menu:

View results

For a test run in the Finished state, click the View Results button to see the results for the selected test run.

View HTML summary report

View an HTML version of the Analysis Summary report, which analyzes the data collected during the performance test run (available for test runs in the Finished state only).

For a test run in the Finished state, click the HTML Report button.

Analyze results

For a test run in the Before Creating Analysis Data state, right-click the run, and select Analyze. LoadRunner Enterprise analyzes the results for the selected test run and deletes the temporary results from the load generators and the Controller.

Collate results

For a test run in the Before Collating Results state, right-click the run, and select Collate. LoadRunner Enterprise collates the results for the test run.

Note:

  • If a test is running on the Controller that performs the collating, LoadRunner Enterprise issues an alert. Click OK to proceed with collating the results, or Cancel to defer the action until later.

  • If the Collate process encounters an error, the Collate Errors dialog box displays details of the error. To analyze the partially collated results, select Analyze partially collated results. Note that analyzing partially collated results is an irreversible operation.

Recover results

For a test in the Run Failure or Failed Collating Results state, right-click the run, and select Recover results to recover and collate the results of a failed test run.

Note: You can collate results only up to the point where the test failed.

Select runs from a specific test set

To display run results for a test set, click the Test Sets drop-down arrow, and select a test set.

Note: The test set menu is only available when you select the root folder ('Subject') in the Test Management tree.

Rename a run

To rename a run, right-click a run, select Rename, and enter a new name for the run.

Note: Not available when the test is in an active state (Initializing, Running, or Stopping).

Go to timeslot

To view timeslot details for the test run, right-click a run, and select Go to Timeslot.

Recalculate SLA

Right-click a run, and select Recalculate SLA. In the Calculate SLA dialog box, select an option for recalculating the SLA:

  • Calculate SLA. Recalculates the SLA according to the defined information.

  • Calculate SLA for whole run duration. Calculates the SLA over the whole test run.

  • Calculate SLA for part of run duration. Calculates the SLA over part of the test run. Enter the desired Start Time and End Time over which to calculate the SLA.

Note: This feature is available only if SLAs were defined during the performance test design phase, and only for run instances in the Finished state. For details about SLAs, see Define service level agreements (SLAs). For a sketch of how whole-run and partial-run SLA calculation differ, see the example at the end of this list.

Add run to trend report

Adds the selected test run to a trend report.

  1. Right-click a run, and select Add to Trend Report.

  2. In the Add to Trend Report dialog box, select a trend report, and click Add. The test run is added to the trend report (or added to the publish request queue if no data processors are available).

Note: Available only for run instances in the Finished state.

For details, see Trend reports.

Export run data to Excel

Click the Export to Excel button to export the run data to Excel.

Delete runs

Select the test runs you want to delete, and click the Delete button. The selected test runs, together with their results, are deleted from the project, the Admin Runs list, and all trend reports in which they appear.

Note:

  • Not available when the test is in an active state (Initializing, Running, or Stopping).

  • Runs are deleted from the project's repository after 7 days. For details, see Deleting entities from a project.

Display latest run data

Click the Refresh button to display the most up-to-date information in the runs grid.

Filter, sort, or group the run data

For details, see Personalize your display.
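
To make the whole-run versus partial-run options in Recalculate SLA concrete, here is a minimal, illustrative Python sketch. The sample data, threshold, and sla_status function are hypothetical and not part of the product; LoadRunner Enterprise performs this calculation internally from the collated results.

    # Illustrative only: how a whole-run and a partial-run SLA evaluation
    # differ. Sample data and threshold are hypothetical.
    def sla_status(samples, threshold, start=None, end=None):
        """samples: list of (elapsed_seconds, response_time) tuples."""
        window = [rt for t, rt in samples
                  if (start is None or t >= start) and (end is None or t <= end)]
        if not window:
            return "N/A"
        avg = sum(window) / len(window)
        return "Passed" if avg <= threshold else "Failed"

    samples = [(10, 0.8), (60, 1.4), (120, 2.9), (180, 1.1)]
    print(sla_status(samples, threshold=1.5))                    # whole run: Failed
    print(sla_status(samples, threshold=1.5, start=0, end=90))   # partial run: Passed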

Back to top

View results in the dashboard

To view run results:

  1.  In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts.

  2. Select a performance test in the test management tree, and open the Runs tab.

  3. Select a test run, and click the View Results button.

  4. Click the Dashboard tab. The test run dashboard opens in the offline results page, displaying performance measurements and results for resources that were monitored during the test run. For user interface details, see View results offline.

Back to top

Download and upload result files

To download or upload result files for a selected test run:

  1.  In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts.

  2. Select a performance test in the test management tree, and open the Runs tab.

  3. Select a test run, and click the View Results button.

  4. In the Results tab, choose the type of result data you want to view:

    Output Log

    Select the SqliteDb.db.zip file (previously output.mdb.zip in LRE 2023 and earlier releases). This is the database created by LoadRunner Enterprise, which stores all the output messages reported during the performance test run. It is useful for troubleshooting errors.

    Output Log (Vuser Execution)

    Select the VuserLog.zip file. Contains the Vuser run log. You can change the log settings in the Runtime Settings dialog box in the Performance Test Designer. For details, see Configure runtime settings.

    Raw Results

    Select the RawResults_<#>.zip file. Contains result information in its raw format. Can be used to generate the full results using the Analysis tool. For details, see LoadRunner Analysis.

    Rich Report

    Select the HighLevelReport_<#>.xls file. Contains a high-level report of the run results in an Excel file.

    HTML Report

    Select the Reports.zip file. Contains an HTML version of the Analysis Summary report.

    Note: If you delete the Reports.zip file, the HTML Report link is not available in the results page. For details, see View results offline.

    Analyzed Results

    Select the Results_<#>.zip file. Contains the Analysis Summary report, which analyzes the data collected during the performance test run.

    User files

    Select the Userfile.<file type> file. Files attached by users to provide additional information about the test run.

  5. Select the results file you want to download, and click the Download button to download the file.

  6. To upload attachments that provide additional information about the currently selected results file, select the results file, and click the Upload button. All uploaded files are displayed as a USER FILE type.
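
Result files can also be fetched in scripts instead of through the Results tab. In the minimal Python sketch below, the Results resource and the .../data download path follow the documented LoadRunner Enterprise REST pattern, but the server, credentials, run ID, and JSON response shape are assumptions to verify for your LRE version.

    # Illustrative sketch: downloading a result file (here Reports.zip, the
    # HTML report archive) via REST. Verify the Results resource and the
    # ".../data" download path for your LRE version.
    import requests

    SERVER, DOMAIN, PROJECT = "https://lre-server", "DEFAULT", "MyProject"  # hypothetical
    RUN_ID = 1042                                                           # hypothetical

    session = requests.Session()
    session.get(f"{SERVER}/LoadTest/rest/authentication-point/authenticate",
                auth=("user", "password")).raise_for_status()

    base = f"{SERVER}/LoadTest/rest/domains/{DOMAIN}/projects/{PROJECT}/Runs/{RUN_ID}"
    results = session.get(f"{base}/Results", headers={"Accept": "application/json"})
    results.raise_for_status()

    for item in results.json().get("RunResults", []):   # response shape assumed
        if item.get("Name") == "Reports.zip":
            data = session.get(f"{base}/Results/{item['ID']}/data")
            data.raise_for_status()
            with open("Reports.zip", "wb") as f:
                f.write(data.content)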

Back to top

View test run event logs

To display event logs for a selected test run:

  1.  In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts.

  2. Select a performance test in the test management tree, and open the Runs tab.

  3. Select a test run, and click the View Results button .

  4. Click the Event Log tab. The Event Log grid displays the following information:

    ID: The event ID.
    Event Type: An indication of the event's severity, from most to least severe: error, warning, or info (for a sketch that orders entries this way, see the example after these steps).
    Create date: The date and time the event was logged.
    Event: The name of the event.
    Description: A description of the event.
    Responsible: The user or automated system process responsible for the event.

  5. To export the event log data to Excel, click the Export to XLS button.
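
If you post-process event logs in scripts, the severity ranking described above maps naturally to a sort key. The following Python sketch is illustrative; the sample entries are hypothetical, and real entries come from the Event Log tab or its XLS export.

    # Illustrative only: ordering event-log entries by severity (error >
    # warning > info), then by creation time. Sample entries are hypothetical.
    SEVERITY = {"error": 0, "warning": 1, "info": 2}

    events = [
        {"id": 3, "type": "info",    "created": "2024-05-01 10:02", "event": "Run started"},
        {"id": 7, "type": "error",   "created": "2024-05-01 10:15", "event": "Vuser failed"},
        {"id": 5, "type": "warning", "created": "2024-05-01 10:09", "event": "High CPU"},
    ]

    for e in sorted(events, key=lambda e: (SEVERITY[e["type"]], e["created"])):
        print(e["id"], e["type"], e["created"], e["event"])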

Back to top

View changes made to a test run

To display a list of changes made to the currently selected test run:

  1. In the top banner, click the module name or the dropdown arrow and select Test Management > Runs. The Runs grid opens.

    Note: You can also access the Audit tab from Test Management by choosing a test in the test management tree and clicking the Runs tab. For details, see View or edit tests and test runs.

  2. Click the ID link for the test run that you want to view.

  3. Select the Audit tab. The Audit log opens, displaying the list of operations performed on the test run.

    Field Name: The field modified during the change.
    Action Type: The type of change made to the field (for example, Update or Insert).
    Old Value: The value before the New Value (two changes before the current value).
    New Value: The previous field value (one change before the current value).
    User Name: The login name of the user who made the change to the entity.
    Modified Date: The date and time of the operation, in chronological (or reverse chronological) order.
  4. To filter the information being displayed in the grid, click the Filter button and select a filter. For more details, see Personalize your display.

  5. To export the Audit log data to Excel, click the Export to XLS button.

Note: Administrators can display a list of changes made to other entity types, such as projects, users, hosts, and roles from LoadRunner Enterprise Administration. For details, see Audit entities and user connections.

Back to top

Perform quick trending analysis

You can perform quick trending analysis on test runs from the Test Runs and Test Management pages. You can view a summary of failed transactions, and then drill down and compare against previous runs to help identify trends and resolve issues.

  1. Prerequisites

    To view trend preview data for a test run, you must configure an external analysis server (InfluxDB) and assign the project to the server. For details, see Configure external InfluxDB analysis servers.

  2. Run a test.

    For details, see Run Test.

  3. Open Quick Trending.

    Note: The Quick Trending tab is only available if an external analysis database server has been defined for the project. For details, see Manage analysis servers.

    From the Test Runs page

    1. In the top banner, click the module name or the dropdown arrow and select Test Management > Runs. The Runs grid opens.

    2. Click the ID link for the test run that you want to view. The test runs dashboard opens.

    3. Click the Quick Trending tab.

    From the Test Management page
    1.  In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts.

    2. Select a test in the test management tree, and click the Test Details tab. You can see a summary of failed transactions in the Failed Transactions pane.

      Note: The pane shows "No data to display" if an external analysis database server has not been defined for the project.

    3. Click the Run Details link, and then click the Quick Trending tab.

  4. The Quick Trending page opens, displaying the trending transactions grid and a graphical representation of the top trending transactions that failed.

    • Total TRs represents the total number of transactions in the run.

    • Failed represents the total number of failed transactions (a transaction can fail multiple times).

  5. Select the filters to sort the information being displayed in the grid:

    Transactions

    Click the Transactions dropdown, and select the number of failed transactions you want to display (Top 5, Top 10, or Top 15).

    By Attribute

    Click the By Attribute dropdown, and select the attribute by which to sort transactions in the grid (in descending order):

    • Failed. Sorts transactions by the number of times the transaction failed.

    • Success Rate. Sorts transactions by their success rate (percentage).

    • Failed Rate. Sorts transactions by their failure rate (percentage).

    • TR Total Duration. Sorts transactions by their total duration time, in seconds.

    In addition to the information above, the grid also displays the following (for a sketch of how these values are computed, see the example after these steps):

    aTRT: The average transaction response time of each transaction, in seconds, during the test run.
    STD: The standard deviation of each transaction's response time during the test run.
    Count: The number of times the transaction participated in the test run.
  6. Analyze failed transactions.

    Click a transaction in the Trending Transactions grid to drill down on that transaction. Details of the failed transaction are displayed. Use this pane to help identify and analyze trends related to the run.

    1. Click the Trending dropdown, and select the number of runs that immediately preceded this run to use as a comparison (Last 3, 5, or 10 runs).

    2. The Trending tab displays a graphical representation of the failed transaction runs. Click the By Attribute dropdown, and select the attribute by which to filter the transaction against preceding runs.

    3. Use the Summary tab to compare data patterns and establish trends for the selected transaction runs.
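
The aTRT and STD values shown in the grid can be reproduced from raw response-time samples if you query the run data yourself. The Python sketch below uses hypothetical sample data; the actual InfluxDB measurement and field names to query are version-specific and not shown here.

    # Illustrative only: computing aTRT (average transaction response time),
    # STD (standard deviation), and Count from raw response-time samples.
    # Sample data is hypothetical; in practice it comes from the run data
    # stored on the external analysis server (InfluxDB).
    from statistics import mean, pstdev

    samples = {  # transaction name -> response times in seconds
        "login":    [1.2, 1.4, 1.3, 2.0],
        "checkout": [3.1, 2.8, 3.4],
    }

    for name, times in samples.items():
        print(f"{name}: aTRT={mean(times):.2f}s  STD={pstdev(times):.2f}  Count={len(times)}")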

Back to top

See also: