Manage test runs and results

This section describes how to view and manage test results and related actions for test runs.

Collate run data

When you run a test, all the run data is stored locally on each load generator by default. After the performance test finishes, the results must be gathered into one location before analysis data can be generated. This process is known as data collation.

You can set LoadRunner Enterprise to collate the run data automatically, either during runtime or as soon as the run is complete (see Collate tab). Alternatively, you can collate the run data manually after the run (see Post-Run Action). This way, you can save and close a test, and collate the data at a later stage.

In addition to collating the results data, LoadRunner Enterprise can collate data from Network Virtualization, and from the log files. After successfully collating the data, these files are deleted from the load generators from which they were gathered.

After LoadRunner Enterprise collates the results, you can analyze the results using various tools. For details, see View and analyze test results.
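
The collation flow described above can be sketched conceptually; this is not LoadRunner Enterprise code, just an illustration of gathering per-load-generator result fragments into one place and then clearing the local copies:

```python
# Conceptual sketch of data collation: each load generator holds its run
# data locally; collation copies everything into one collection, then the
# local copies are cleared (as LRE deletes them after a successful collate).
def collate(load_generators):
    """load_generators: dict of generator name -> list of result records."""
    collated = []
    for name, records in load_generators.items():
        collated.extend({"source": name, **r} for r in records)
    # After a successful gather, the per-generator data is cleared.
    for name in load_generators:
        load_generators[name] = []
    return collated

lgs = {
    "lg1": [{"tx": "login", "ms": 120}],
    "lg2": [{"tx": "login", "ms": 95}, {"tx": "search", "ms": 210}],
}
results = collate(lgs)  # all records in one place, tagged by source
```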

Back to top

View test run details

To view runs for a LoadRunner Enterprise test:

  1. From the LoadRunner Enterprise navigation toolbar, click and select Test Management/Test Runs (under Testing).

  2. Select a performance test in the test management tree, and open the Runs tab.

    If the test is running, click the ID link to display the test in the View the Performance Test Run page (online screen). If the test run has finished, click the ID link to open the test in the offline view.

    To filter, sort, or group runs displayed in the runs grid, see Personalize your display.

    The Runs grid displays the following information:

    UI Element: Description

    ID: The result file ID. If the test has finished, click the ID link to display the results in the Results page. For details, see View test results offline.
    Name: The name of the test run. If the test is running, click the Name link to display the test in the View the Performance Test Run page (online screen).
    State: Displays the current state of the selected test run. If the test is running, click the State link to display the test in the View the Performance Test Run page (online screen).
    SLA Status: One of the following:
      • Passed. Indicates a passed SLA status.
      • Failed. Indicates a failed SLA status.
      • N/A. Indicates that there is no data about the SLA status.
      • Not Completed. Indicates that the collating results process is not finished.
    Test Name: The name of the test. Click the Test Name link to display the test in the Performance Test Designer. For details, see Performance Test Designer.
    Start/End Date: The start and end date and time of the test run.
    Last Modified: Enables you to filter the information displayed in the Test Runs grid by the last modified date. Click the drop-down arrow to apply a filter.
    Test ID: The test run ID.
    Test Instance ID: The ID of the test instance linked to the timeslot.
    Tester: The name of the user running the test.
    Duration: The time, in minutes, that the test took to run.
    Test Set Name: The name of the test set.
    Test Set ID: The test set ID.
    Timeslot: The ID of the test run timeslot. Click the Timeslot link to open the test in the Timeslots page. For details, see View or edit a reserved timeslot.
    Controller: The name of the Controller machine used in the test run.
    Load Generators: The names of the load generator machines used in the test run.
    Vusers Involved: The number of Vusers that were initialized at least once during the run.
    Vusers Max: The maximum number of concurrently running Vusers during the run.
    Vusers Avg: The average number of concurrently running Vusers during the run.
    Errors: The total number of errors during the test run.
    TX Passed: The total number of passed transactions during the test run.
    TX Failed: The total number of failed transactions during the test run.
    Hits/Seconds: The total number of hits per second during the test run.
    Vuds Amount: The number of VUDs involved in the action.
    Analysis Template: Indicates whether analysis templates were used in the test run.
    Using Vuds: Indicates whether VUDs were used in the test run.
    Comments: Comments added to the test run.
    Jenkins Job Name: The name of the Jenkins job, if you are running tests as part of your Jenkins continuous integration process.
    Operating System: The operating system of the machine that runs the job.
    Test Description: A description of the test.
  3. To display the above details about a run and to add comments, select a row in the runs grid, and click the Details tab in the right pane.

  4. To display event logs for a selected test run, select a row in the runs grid, and click the Event Log tab in the right pane. For details, see View test run event logs.

  5. To download (or upload) result files for a selected test run, select a row in the runs grid, and click the Results tab in the right pane. For details, see Download and upload result files.
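
The three Vuser counters in the grid above (Vusers Involved, Vusers Max, Vusers Avg) are simple aggregates; a minimal sketch with made-up sample data (not an LRE API) shows how they differ:

```python
# Per-second samples of concurrently running Vusers during a hypothetical run.
samples = [0, 5, 12, 20, 20, 18, 9, 3]

vusers_max = max(samples)                 # peak concurrency
vusers_avg = sum(samples) / len(samples)  # average concurrency

# "Vusers Involved" counts Vusers initialized at least once, so it comes
# from per-Vuser init events rather than from the concurrency samples.
init_events = ["v1", "v2", "v1", "v3"]    # Vuser IDs seen initializing
vusers_involved = len(set(init_events))
```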

Back to top

Manage test results

You can manage results and related actions for test runs.

To access:

  • From the LoadRunner Enterprise navigation toolbar, click and select Test Runs (under Testing). Select a performance test in the test management tree, and click the Runs tab.

  • Select Test Management (under Testing), choose a test in the test management tree, and click the Runs tab.

You can perform the following actions from the Runs toolbar or context menu:

Action How to
View results

For a test run in the Finished state, click View Results to see the results for the selected test run.

Analyze results

For a test run in the Before Creating Analysis Data state, right-click the run, and select Analyze. LoadRunner Enterprise analyzes the results for the selected test run and deletes the temporary results from the load generators and the Controller.

Collate results

For a test run in the Before Collating Results state, right-click the run, and select Collate. LoadRunner Enterprise collates the results for the test run.

Note:

  • If a test is running on the Controller that will perform the collating, LoadRunner Enterprise issues an alert. Click OK to proceed with collating the results, or Cancel to defer the action until later.

  • Should the Collate process encounter an error, the Collate Errors dialog box appears, displaying details of the error. To analyze the partially collated results, select Analyze partially collated results. Note that analyzing partially collated results is an irreversible operation.

Recover results

For a test in the Run Failure or Failed Collating Results state, right-click the run, and select Recover results to recover and collate the results of a failed test run.

You also need to use this option when running elastic (dockerized) load generators with the Use External Storage option enabled in the Orchestrator. For details, see Retain run results after a performance test ends.

Note: Results can be collated only up to the point where the test failed.

Select runs from a specific test set

To display run results for a particular test set, click the Test Sets drop-down arrow, and select a test set.

Rename a run

To rename a run, right-click a run, select Rename, and enter a new name for the run.

Note: Not available when the test is in an active state (Initializing, Running, or Stopping).

Go to timeslot

To view timeslot details for the test run, right-click a run, and select Go to Timeslot.

Recalculate SLA

Right-click a run, and select Recalculate SLA. In the Calculate SLA dialog box, select an option for recalculating the SLA:

  • Calculate SLA. Recalculates the SLA according to the defined information.

  • Calculate SLA for whole run duration. Calculates the SLA over the whole test run.

  • Calculate SLA for part of run duration. Calculates the SLA over part of the test run. Enter the desired Start Time and End Time over which to calculate the SLA.

Note: This feature is available only if SLAs were defined during the performance test design phase, and only for run instances in the Finished state. For details about SLAs, see Define service level agreements (SLAs).

Add run to trend report

Adds the selected test run to a trend report.

  1. Right-click a run, and select Add to Trend Report.

  2. In the Trending dialog box, select a trend report, and click Add. The test run is added to the trend report (or added to the publish request queue if no data processors are available).

Note: Available only for run instances in the Finished state.

For details, see Create trend reports.

Export run data to Excel

Click Export to Excel to export the run data to Excel.

Delete runs

Select the test runs you want to delete, and click Delete.

Note: Not available when the test is in an active state (Initializing, Running, or Stopping).

Display latest run data

Click Refresh to display the most up-to-date information in the runs grid.

Filter, sort, or group the run data

For details, see Personalize your display.
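
The Recalculate SLA options above (whole run versus part of the run duration) amount to evaluating the SLA rule over a time window. A simplified, illustrative sketch follows; the average-response-time threshold rule and the status strings are assumptions for the example, not LRE's internal logic:

```python
def sla_status(samples, threshold_ms, start=None, end=None):
    """Evaluate a simple average-response-time SLA over a time window.

    samples: list of (elapsed_seconds, response_ms) measurements.
    start/end: optional window bounds in seconds; None means unbounded.
    """
    window = [ms for t, ms in samples
              if (start is None or t >= start) and (end is None or t <= end)]
    if not window:
        return "N/A"  # no data in the window
    avg = sum(window) / len(window)
    return "Passed" if avg <= threshold_ms else "Failed"

samples = [(0, 100), (30, 150), (60, 900), (90, 120)]
whole = sla_status(samples, threshold_ms=300)            # whole run: "Failed"
partial = sla_status(samples, threshold_ms=300, end=30)  # first 30 s: "Passed"
```

A slow spike at 60 seconds fails the SLA over the whole run, while restricting the window to the first 30 seconds passes it, which is exactly the distinction the whole-run and part-of-run options expose.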

Back to top

View results in the dashboard

To view run results:

  1. From the LoadRunner Enterprise navigation toolbar, click and select Test Management (under Testing).

  2. Select a performance test in the test management tree, and open the Runs tab.

  3. Select a test run, and click View Results.

  4. Click the Dashboard tab. The Test Runs Dashboard opens, and displays performance measurements and results for resources that were monitored in the test run in the offline results page. For user interface details, see View test results.

Back to top

Download and upload result files

To download or upload result files for a selected test run:

  1. From the LoadRunner Enterprise navigation toolbar, click and select Test Management (under Testing).

  2. Select a performance test in the test management tree, and open the Runs tab.

  3. Select a test run, and click View Results.

  4. Click the Results tab, and choose the type of result data you want to view:

    Type: Description

    Output Log: Select the output.mdb.zip file. This database, created by LoadRunner Enterprise, stores all the output messages reported during the performance test run. It is useful for troubleshooting errors.
    Output Log (Vuser Execution): Select the VuserLog.zip file. Contains the Vuser execution log. You can change the log settings in the Runtime Settings dialog box in the Performance Test Designer. For details, see Configure runtime settings.
    Raw Results: Select the RawResults_<#>.zip file. Contains result information in its raw format. Can be used to generate the full results using the Analysis tool. For details, see Use LoadRunner Analysis to analyze performance test results.
    Rich Report: Select the HighLevelReport_<#>.xls file. Contains a high-level report of the run results in an Excel file.
    HTML Report: Select the Reports.zip file. Contains an HTML version of the Analysis Summary report.
    Analyzed Results: Select the Results_<#>.zip file. Contains the Analysis Summary report, which analyzes the data collected during the performance test run.
    User files: Select the Userfile.<file type> file. Files attached by users to provide additional information about the test run.

  5. Select the results file you want to download, and click Download to download the file.

  6. To upload attachments that provide additional information about the currently selected results file, select the results file you want to upload, and click Upload. All uploaded files are displayed as a USER FILE type.

  7. You can delete files with type USER FILE. Select the user files you want to delete, and click Delete.
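
The downloaded archives listed above (VuserLog.zip, Reports.zip, and so on) are ordinary ZIP files, so once saved locally they can be inspected with standard tooling. A sketch using Python's standard library; the member names here are illustrative, not the real archive layout:

```python
import io
import zipfile

def list_result_files(zip_bytes):
    """Return the member names of a downloaded result archive."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return zf.namelist()

# Build a stand-in archive to demonstrate (a real one comes from Download).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("summary.html", "<html>...</html>")
    zf.writestr("output.txt", "run output")

names = list_result_files(buf.getvalue())
```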

Back to top

View test run event logs

To display event logs for a selected test run:

  1. From the LoadRunner Enterprise navigation toolbar, click and select Test Management (under Testing).

  2. Select a performance test in the test management tree, and open the Runs tab.

  3. Select a test run, and click View Results.

  4. Click the Event Log tab. The Event Log grid displays the following information:

    UI Element: Description

    ID: The event ID.
    Event Type: An indication of the event's severity. From most to least severe: error, warning, and info.
    Create date: The date and time the event was logged.
    Event: The name of the event.
    Description: A description of the event.
    Responsible: The user or automated system process responsible for the event.

  5. To export the event log data to Excel, click Export to XLS.
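
Exported event log data can be post-processed using the severity ranking that the Event Type column describes; a small illustrative sketch (the event records are made up):

```python
SEVERITY = {"error": 0, "warning": 1, "info": 2}  # most to least severe

events = [
    {"id": 3, "type": "info",    "event": "Run started"},
    {"id": 7, "type": "error",   "event": "Load generator disconnected"},
    {"id": 9, "type": "warning", "event": "High CPU on Controller"},
]

# Most severe first; ties keep log order because Python's sort is stable.
by_severity = sorted(events, key=lambda e: SEVERITY[e["type"]])
errors_only = [e for e in events if e["type"] == "error"]
```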

Back to top

View changes made to a test run

To display a list of changes made to the currently selected test run:

  1. From the LoadRunner Enterprise navigation toolbar, click and select Test Runs (under Testing). The Runs grid opens.

    Note: You can also see the Audit tab from Test Management, by choosing a test in the test management tree, and clicking the Runs tab. For details, see Create and manage tests.

  2. Click the ID link for the test run that you want to view.

  3. Select the Audit tab. The Audit log opens, displaying the list of operations performed on the test run.

    UI Element: Description
    Field Name: The field modified during the change.
    Action Type: The type of change made to the field (for example, Update or Insert).
    Old Value: The field value before the change was made.
    New Value: The field value after the change was made.
    User Name: The login name of the user who made the change to the entity.
    Modified Date: The date and time the change was made. Entries can be sorted in chronological or reverse chronological order.
  4. To filter the information being displayed in the grid, click the Filter button and select a filter. For more details, see Personalize your display.

  5. To export the Audit log data to Excel, click Export to XLS.

Note: Administrators can display a list of changes made to other entity types, such as projects, users, hosts, and roles from LoadRunner Enterprise Administration. For details, see Audit entities and user connections.

Back to top

Perform quick trending analysis on test runs

Supported in versions: LoadRunner Enterprise SP3 and later

You can perform quick trending analysis on test runs from the Test Runs and Test Management page. You can view a summary of failed transactions, and then drill down and compare against previous runs to help identify trends and resolve issues.

  1. Prerequisites

    To view trend preview data for a test run, you must configure an external analysis server (InfluxDB) and assign the project to the server. For details, see Configure external analysis servers.

  2. Run a test.

    For details, see Run Test.

  3. Open Quick Trending.

    From the Test Runs page

    1. From the LoadRunner Enterprise navigation toolbar, click and select Test Runs (under Testing). The Runs grid opens.

    2. Click the ID link for the test run that you want to view.

    3. Click the Quick Trending tab.

      Note: The tab is unavailable if an external analysis database server has not been defined for the project.

    From the Test Management page
    1. From the LoadRunner Enterprise navigation toolbar, click and select Test Management (under Testing).

    2. Select a test in the test management tree, and click the Test Details tab. You can see a summary of failed transactions in the Failed Transactions pane.

      Note: The pane shows "No data to display" if an external analysis database server has not been defined for the project.

    3. Click the Quick Trending link.

  4. The Quick Trending page opens, displaying the trending transactions grid and a graphical representation of the top trending transactions that failed.

    • Total TRs represents the total number of transactions in the run.

    • Failed represents the total number of failed transactions (a transaction can fail multiple times).

  5. Select the filters to sort the information being displayed in the grid.

    • Click the Transactions dropdown, and select the number of failed transactions you want to display (Top 5, Top 10, or Top 15).

    • Click the By Attribute dropdown, and select the attribute by which to sort transactions in the grid (in descending order):

      Failed: Sorts transactions by the number of times the transaction failed.
      Success Rate: Sorts transactions by their success rate (percentage).
      Failed Rate: Sorts transactions by their failure rate (percentage).
      TR Total Duration: Sorts transactions by their total duration time, in seconds.

    In addition to the information above, the grid also displays the following information:

    UI Element: Description
    aTRT: The average transaction response time of each transaction, in seconds, during the test run.
    STD: The standard deviation of each transaction during the test run.
    Count: The number of times the transaction was executed during the test run.
  6. Analyze failed transactions.

    Click a transaction in the Trending Transactions grid to drill down on that transaction. Details of the failed transaction are displayed in the right pane. Use this pane to help identify and analyze trends related to the run.

    1. Click the Trending dropdown, and select the number of runs that immediately preceded this run to use as a comparison (Last 3, 5, or 10 runs).

    2. The Trending tab displays a graphical representation of the failed transaction runs. Click the By Attribute dropdown, and select the attribute by which to filter the transaction against preceding runs.

    3. Use the Summary tab to compare data patterns and establish trends for the selected transaction runs.
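
The Failed Rate and Success Rate attributes described above follow directly from per-transaction counts; a sketch of computing them and selecting the top trending transactions (the transaction names and field names are illustrative):

```python
transactions = [
    {"name": "login",    "count": 200, "failed": 30},
    {"name": "search",   "count": 500, "failed": 10},
    {"name": "checkout", "count": 100, "failed": 25},
]

# Failure rate and success rate as percentages of the transaction count.
for tx in transactions:
    tx["failed_rate"] = 100.0 * tx["failed"] / tx["count"]
    tx["success_rate"] = 100.0 - tx["failed_rate"]

# Top trending transactions, sorted by failure rate in descending order.
top = sorted(transactions, key=lambda t: t["failed_rate"], reverse=True)[:2]
```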

Back to top

See also: