Runtime monitor graphs

This section describes the Runtime monitor graphs that are available during a test run.

Runtime monitor graphs overview

The Runtime monitor graphs provide information about the status of the Vusers participating in the performance test, as well as the number and types of errors that the Vusers generate. In addition, the Runtime monitor provides the User-Defined Data Points graph, which displays the real-time values of user-defined points in a Vuser script.

Runtime monitor graphs are enabled by default, and they automatically begin monitoring Vusers at the start of a performance test.

Running Vusers graph

The monitor's Running Vusers graph provides information about the status of the Vusers running in the current performance test on all load generator machines. The graph shows the number of running Vusers, while the information in the legend indicates the number of Vusers in each state.

The Name field of each Vuser displays the status of the Vuser. The following list describes each Vuser status:

Running: The total number of Vusers currently running on all load generators.
Ready: The number of Vusers that completed the initialization section of the script and are ready to run.
Finished: The number of Vusers that have finished running. This includes both Vusers that passed and Vusers that failed.
Error: The number of Vusers whose execution generated an error.

User-Defined Data Points graph

The User-Defined Data Points graph displays the real-time values of user-defined data points. You define a data point in your Vuser script by inserting an lr_user_data_point function at the appropriate place (user_data_point for GUI Vusers and lr.user_data_point for Java Vusers).

Action1()
{
    lr_think_time(1);                        /* pause for one second */
    lr_user_data_point("data_point_1", 1);   /* record a value of 1 */
    lr_user_data_point("data_point_2", 2);   /* record a value of 2 */
    return 0;
}

For Vuser protocols that support graphical script representation, such as Web and Oracle NCA, you insert a data point as a user-defined step. Data point information is gathered each time the script executes the function or step.
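In practice, a data point usually reports a value computed during the run rather than a constant. The following is a minimal sketch, assuming a Web (HTTP/HTML) Vuser script; the URL and the data point name are illustrative only. It uses lr_start_timer and lr_end_timer to time a page download and reports the elapsed time as a data point:

Action()
{
    merc_timer_handle_t timer;   /* handle for a custom timer */
    double elapsed;

    /* Time a page download. The URL is a placeholder. */
    timer = lr_start_timer();
    web_url("home",
        "URL=http://example.com/",
        LAST);
    elapsed = lr_end_timer(timer);   /* elapsed time, in seconds */

    /* The value appears in the User-Defined Data Points graph
       under the name "page_load_seconds" (an illustrative name). */
    lr_user_data_point("page_load_seconds", elapsed);

    return 0;
}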

By default, LoadRunner Enterprise displays all of the data points in a single graph. The legend provides information about each data point. If desired, you can hide specific data points using the legend below the graphs.

You can also view data points offline, after the completion of the performance test. For details, see the LoadRunner Professional Help Center.

Error Statistics graph

The monitor's Error Statistics graph provides details about the number of errors that occur during each second of the test run. The errors are grouped by error source; for example, the location in the script or the load generator name.

Operations graph

The Operations graph shows performance counters for virtual service operations on all the SV servers used in the scenario. Only services that are part of the scenario are shown.

The Operations on <server> graph shows the measurement (y-axis) as a function of the elapsed time in the performance test (x-axis).

The graph displays the following measurements:

Average Response Time: The average response time of the virtual service, in milliseconds.
Hit Rate: The number of requests per second to the virtual service operation.
Throughput: The data sent and received by the virtual service operation, measured in megabytes.
Transactions Per Second: The average value for all services deployed on the Service Virtualization Server.

Services graph

The Services graph displays information about the virtual services used during a test run.

The Services on <server> graph shows the measurement (y-axis) as a function of the elapsed time in the performance test (x-axis).

The graph displays the following measurements:

Average Response Time: The average response time of the virtual service, in milliseconds.
Data Simulation Accuracy: The accuracy of the data model emulation on the virtual service, displayed as a percentage. Accuracy is compared to the recorded behavior of the corresponding actual service, if available.
Hit Rate: The number of requests per second to the virtual service.
Performance Simulation Accuracy: The accuracy of the performance model emulation on the virtual service, displayed as a percentage. Accuracy is compared to the recorded behavior of the corresponding actual service, if available.
Throughput: The data sent and received on the virtual service, measured in megabytes per second.
Transactions Per Second: The average value for all services deployed on the Service Virtualization Server.

Vusers with Errors graph

The Vusers with Errors graph provides details about the number of Vusers that generate errors during test execution. The errors are grouped by error source.
