Define a load test

A load test is a test designed to measure an application's behavior under normal and peak conditions. You add and configure scripts, monitors, and SLAs to a load test definition.

Get started

From the Load Tests page, you can select a project or create and edit a test.

Action How to

Select a project

Select a project from the dropdown in the masthead.

Create a test

To create a test, perform one of the following steps:

  • On the Load Tests page, click Create.
  • On the Home page, click Create a test.

Edit a test

  • Highlight a test in the grid.
  • Click Edit, or click the test name link in the test grid.

Duplicate a test

Click Duplicate to copy a test definition to a new test.

Note: To use this feature, the test name must contain no more than 120 characters.

Give your test a label

Organize your tests by assigning them labels. For details, see Assign labels.

View test settings

Use the left- or right-facing arrows in the top right to show or hide the Summary pane. This pane shows information about the last 5 runs and lists test settings such as the number of Vusers per type, the duration, and the test ID.

Tip: You can resize the pane by dragging its border.

Back to top

Navigation bar

The Load Tests navigation bar provides the following options:

General. Configure basic, run configuration, log, and load generator settings. For details, see Define general test settings.

Scripts. Shows a list of the scripts and lets you set a script schedule. For details, see Configure a schedule for your script and Manage scripts.
Monitors. Shows a list of monitors. For details, see Add monitors to a load test.
Distribution. Choose the load generator machines for your Vusers. Optionally, assign scripts and the number of Vusers for on-premises load generators. For details, see Configure on-premises load generator locations.
Rendezvous. Set up a rendezvous for your Vusers. For details, see Configure rendezvous settings.
SLA. Shows the SLA for the test. For details, see Configure SLAs.
Single user performance. Collect client-side breakdown data. For details, see Generate single user performance data.
Streaming. Shows a list of the streaming agents. For details, see Data streaming. (This option is only visible if data streaming has been enabled for your tenant via a support ticket.)
Schedules. Set a schedule for the test run. For details, see Schedule a test run.
Runs. Opens the Runs pane listing the runs for the selected test along with run statistics such as run status, regressions, and failed transactions. For details, see Find out the status of my test run.
Trends. For details, see Trends.

Back to top

Configure a schedule for your script

In the Scripts page, you configure a schedule for each script.

There are three modes of script schedules: Simple, Manual, and Advanced. For all modes, you configure basic settings, which may differ depending on the run mode that you selected in the General settings.

To configure the Simple, Manual, or Advanced mode settings, click the arrow adjacent to a script to expand its details.

Back to top

Configure a goal for a load test

You can configure a load test to run in Goal Oriented mode. If you select this option, you configure a goal for the test, and when the test runs, it continues until the goal is reached. For more details, see How LoadRunner Cloud works with goals below.

To configure a goal-oriented test:

  1. In the load test's General page, select Run Mode > Goal Oriented. For details, see Run mode.

  2. In the load test's Scripts page, configure the % of Vusers and Location for each script. For details, see Configure a schedule for your script.

  3. In the load test's Scripts page, click the Goal settings button. In the Goal Settings dialog box that opens, configure the following:

    Setting Description
    Goal type

    Select the goal type:

    • Hits per second. The number of hits (HTTP requests) to the Web server per second.
    • Transactions per second. The number of transactions completed per second. Only passed transactions are counted.
    Transaction name

    If you selected Transactions per second as the goal type, select or enter the transaction to use for the goal.

    Goal value

    Enter the number of hits per second or transactions per second that must be reached for the goal to be fulfilled.

    Vusers

    Enter the minimum and maximum number of Vusers to be used in the load test.
    Ramp up

    Select the ramp up method and time. This determines the amount of time in which the goal must be reached.

    • Ramp up automatically with a maximum duration of. Ramp up Vusers automatically to reach the goal as soon as possible. If the goal is not reached within the maximum duration, the ramp up stops.

    • Reach target number of hits or transactions per second <time> after test is started. Ramp up Vusers to try to reach the goal in the specified duration.

    • Step up N hits or transactions per second every. Set the number of hits or transactions per second that are added and at what time interval to add them. Hits or transactions per second are added in these increments until the goal is reached.

    Action if target cannot be reached

    Select what to do if the target is not reached in the time frame set by the Ramp up setting.

    • Stop test run. Stop the test as soon as the time frame has ended.

    • Continue test run without reaching goal. Continue running the test for the time specified in the Duration setting, even though the goal was not reached.

    Duration

    Set a duration time for the load test to continue running after the time frame set by the Ramp up setting has elapsed.

    Note: The total test time is the sum of the Ramp up time and the Duration.

The following do not support the Goal Oriented run mode:

  • Adding Vusers

  • Generating single user performance data

  • The Timeline tab when scheduling a load test

  • Network emulations for cloud locations

  • Changing the load in a running load test

  • TruClient - Native Mobile scripts

How LoadRunner Cloud works with goals

LoadRunner Cloud runs Vusers in batches to reach the configured goal value. Each batch has a duration of 2 minutes.

In the first batch, LoadRunner Cloud determines the number of Vusers for each script according to the configured percentages and the minimum and maximum number of Vusers in the goal settings.

After running each batch, LoadRunner Cloud evaluates whether the goal has been reached. If the goal has not been reached, LoadRunner Cloud calculates the number of additional Vusers to add in the next batch.

If the goal has been reached, or there are no remaining Vusers to be added, the ramp up ends. Note that the achieved value can be greater than the defined goal value; LoadRunner Cloud does not remove Vusers to reduce the load.

During run time, if the goal was reached but the measured value subsequently drops below the configured goal, LoadRunner Cloud tries to add Vusers (if there are any remaining to be added) to reach the goal again.

LoadRunner Cloud does not change think time and pacing configured for scripts.
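
The following is a minimal sketch, in Python, of how a batch-based ramp up toward a goal value might proceed. The 2-minute batch length comes from the description above; the stubbed measurement and the linear scaling estimate are illustrative assumptions, not LoadRunner Cloud's actual algorithm.

    # Illustrative sketch of a goal-oriented ramp up in batches.
    # measure_throughput() stands in for running one 2-minute batch and
    # measuring hits or transactions per second; the linear estimate of how
    # many Vusers to add next is an assumption for illustration only.

    def measure_throughput(vusers: int) -> float:
        """Stub: pretend each Vuser contributes roughly 0.8 hits per second."""
        return vusers * 0.8

    def ramp_up(goal_value: float, min_vusers: int, max_vusers: int) -> int:
        vusers = min_vusers
        while True:
            achieved = measure_throughput(vusers)            # one 2-minute batch
            if achieved >= goal_value or vusers >= max_vusers:
                return vusers                                 # goal met, or no Vusers left to add
            per_vuser = achieved / vusers                     # estimated contribution per Vuser
            needed = int((goal_value - achieved) / per_vuser) + 1
            vusers = min(vusers + needed, max_vusers)         # add Vusers for the next batch

    print(ramp_up(goal_value=100, min_vusers=10, max_vusers=500))   # 126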

Back to top

Configure cloud load generator locations

For each script in the test, you configure a location—Cloud or On Premise. This is the location of the load generators that run the script. For details, see Configure a schedule for your script.

Tip: To learn more about the machine types used for cloud load generators, see Cloud machine types.

For cloud locations, you configure the distribution of Vusers and the network emulation.

To configure the settings for cloud locations:

  1. In the Distribution pane, click the Cloud tab.
  2. Click Edit locations and select the locations to add to your test. For details, see Vuser distribution locations.

    Tip: Click Group by to group the load generators by location, vendor, vendor region, or geographical area.

  3. In the Cloud tab, click Edit network emulations and select up to five emulations for your test. For details, see Network emulations.
  4. For each location that you selected:

    1. Enter the percentage of Vusers you want to distribute to this location. The Vuser distribution must total 100 percent.

    2. Enter the percentage of those Vusers distributed to a location for which you want to assign a network emulation.

The distribution of locations and network emulations is applied equally to each of the test scripts.

Example

In the following example, in the test's Scripts page, we deployed 15 Vusers for a TruClient script and 100 Vusers for a Web HTTP script.

We set the location to Bahrain for 50 percent of the Vusers, and Ireland for the other 50 percent.

When the test starts, it launches 7 Vusers for the TruClient script and 50 Vusers for the Web HTTP script in the Bahrain region.

In Ireland, the test launches the remaining Vusers: 8 Vusers for the TruClient script and 50 Vusers for the Web HTTP script.
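
A minimal sketch of the arithmetic behind this split, in Python. Assigning the rounding remainder to the last location is an assumption chosen to reproduce the numbers in this example; it is not necessarily how LoadRunner Cloud breaks ties.

    # Split each script's Vusers across locations by percentage.
    # Rounding down for all but the last location, and giving the remainder to
    # the last one, is an illustrative assumption that matches the example above.

    def split_vusers(total, percentages):
        counts = [int(total * p / 100) for p in percentages[:-1]]
        counts.append(total - sum(counts))      # remainder goes to the last location
        return counts

    print(split_vusers(15, [50, 50]))    # [7, 8]   -> Bahrain, Ireland (TruClient)
    print(split_vusers(100, [50, 50]))   # [50, 50] -> Bahrain, Ireland (Web HTTP)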

Back to top

Cloud LG infrastructure capacity

By default, LoadRunner Cloud assigns a maximum number of Vusers for each cloud load generator per protocol. For example, the default configuration for Vusers running Web HTTP scripts is a maximum of 2,500 Vusers per load generator. In this case, if you choose to run 10,000 Web HTTP Vusers in a single location, LoadRunner Cloud provisions four cloud load generators in that location.
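
As a rough sketch of that sizing calculation in Python (the 2,500 Vuser default comes from the example above; modeling provisioning as a simple ceiling division is an assumption):

    import math

    # Number of cloud load generators provisioned for one protocol in one
    # location, given a per-load-generator Vuser limit. Illustrative only.

    def load_generators_needed(vusers, max_vusers_per_lg=2500):
        return math.ceil(vusers / max_vusers_per_lg)

    print(load_generators_needed(10_000))   # 4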

In certain instances, you may want to change the default configuration:

  • High CPU or Memory utilization. Lowering the number of Vusers per cloud load generator may help reduce the CPU and memory utilization. For details, see LG alerts.

  • Environment limitations. Some environments do not allow you to load the system with more than a certain number of Vusers originating from a single IP address. You can work around this limitation by reducing the number of Vusers per cloud load generator. Another option is to enable the Multiple IPs feature. For details, see Enable multiple IPs from cloud and on-premises load generators.

To customize the configurations per protocol (admins):

  1. Make sure you have access to Project Management. Click your login name in the top banner and choose Project Management from the dropdown.

  2. Click on a project and go to the Load Generators page.

  3. Click on the Cloud tab. If no data is visible, you must enable this capability with a service request.

  4. Activate or deactivate a protocol for end users by toggling the switch in the Active column.

  5. Optionally, modify the maximum number of allowed Vusers for the active protocols.

  6. Click Apply. Note any pop-up or informational messages indicating how the changes will affect your tests.

Set the Vuser limit per test

In the load test's General options, you can set a limit to the number of Vusers that can run on a cloud load generator, per protocol, for each test.

This capability is disabled by default. To enable it, your admin must submit a service request. Once this capability is enabled, your admin can activate or deactivate protocols for each project and set the maximum number of Vusers per protocol. For details, see Cloud LG infrastructure capacity.

Once this feature has been enabled, follow these steps to configure the Vuser limits for your test:

  1. Set the Enable Cloud LG Vusers configuration option to Manual in the General settings. For details, see Load generators settings.

  2. Click the Edit button adjacent to the Manual option. The Vusers per cloud load generator dialog box opens.

  3. In the Vusers per cloud load generator dialog box, specify the maximum number of Vusers that you want to be able to run on each cloud load generator.

  4. Click Apply. Note any pop-up windows and informational messages describing how your changes will affect Vuser multipliers and costs.

  5. In the Load Test > Scripts pane, click the License button to learn about your license consumption. If the multiplier was changed due to your changes, the button name will indicate 'Multiplied'.

    Potential implications of modifying the default values

    The following potential implications may apply when modifying the maximum number of Vusers:

    License consumption. License consumption may increase depending on the actual configuration. Pay attention to the pop-up windows and informational messages issued when you apply your changes.

    Dedicated IPs. When using dedicated IPs for cloud load generators, reducing the maximum number of Vusers per cloud load generator will increase the number of dedicated IPs required for your future tests.

Back to top

Configure on-premises load generator locations

This section describes how to choose on-premises load generators for your test. For details about setting a script's location, see Configure a schedule for your script.

To configure the settings for on-premises locations:

  1. In the Distribution pane, click the On Premise tab.
  2. Click + Add from Assets to select the load generators to use for your test, or click + Create to add a new load generator. Sort by any column by clicking the header name. For details, see Load generator assets.
  3. If you selected Enable manual Vuser distribution for on-premises load generators as described in Define general test settings, you can assign scripts and Vusers per load generator. When this setting is enabled, a dropdown box presents two options: Simple and Advanced.

    Simple assignment

    With the Simple assignment method, you indicate which scripts in a load test will run on which on-premises load generators.

    In the Distribution > On Premise page, the Assigned Scripts pane to the right of the load generator list shows the scripts assigned to the selected load generator.

    Click + Select scripts to assign or unassign scripts to the selected load generator. To assign a new script, select it and click OK. To unassign a script, deselect it and click OK.

    Advanced assignment

    With the Advanced assignment method, you can also set the number of Vusers per script to run on specific on-premises load generators.

    Before setting the number of Vusers, make sure to assign each script to a load generator as described in Simple assignment.

    After you have assigned the scripts, set the number of Vusers either manually or automatically. For details, see Assign Vusers to on-premises load generators.

    Note:  

    • You can only assign scripts of the same type to a load generator. If a script is already assigned to a selected load generator, only scripts of the same type are enabled for assigning. Scripts of other types are disabled.
    • Only scripts that are configured with a Location of On-Premise (in the Scripts page) are displayed in the list.
    • If you change the location of a script that is already assigned to a load generator, the script is automatically unassigned from the load generator.

Back to top

Assign Vusers to on-premises load generators

LoadRunner Cloud lets you assign the number of Vusers per load generator, either manually or automatically.

Manually enter the number of Vusers

To manually enter the number of Vusers:

  1. Navigate to the load test's Distribution page and open the On Premise tab.

  2. Select the relevant load generator.

  3. Manually enter the number of Vusers in the box adjacent to the script name.

Note: Make sure that the number of Vusers that you assign to your script matches the total number of Vusers in your load test. For example, if the script has 500 Vusers, you must manually assign all 500. In addition, the number of Vusers defined in your test must not exceed the maximum number of Vusers defined in your Vuser license.
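
A minimal Python sketch of the consistency check this note implies. The function and the license value shown are illustrative assumptions, not part of LoadRunner Cloud.

    # Per-load-generator assignments for a script must add up to the script's
    # Vuser count, and the test total must not exceed the license limit.
    # Illustrative only.

    def validate_assignment(script_vusers, per_lg, test_total, license_limit):
        problems = []
        if sum(per_lg.values()) != script_vusers:
            problems.append(f"assigned {sum(per_lg.values())} of {script_vusers} Vusers")
        if test_total > license_limit:
            problems.append(f"test total {test_total} exceeds license limit {license_limit}")
        return problems

    print(validate_assignment(500, {"LG-1": 300, "LG-2": 200}, 500, 1000))   # []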

Automatically set the number of Vusers

To assign an equal amount of Vusers to your on-premises load generators:

  1. Navigate to the load test's Distribution page and open the On Premise tab.

  2. Click Distribute evenly. LoadRunner Cloud automatically distributes Vusers evenly over all assigned on-premises load generators, according to the script assignment configured in the test.

The following guidelines apply:

  • The even distribution feature only affects Vusers for scripts assigned to the test's on-premises load generators. For example, where both LG-1 and LG-2 are allowed to run 500 Vusers, the distribution will be as follows:

    Script name # of Vusers LG assignment # of Vusers running after clicking Distribute evenly
    Script-A 100 LG-1 100 on LG-1
    Script-B 200 LG-2 200 on LG-2
    Script-C 300 LG-1, LG-2 150 on both LG-1 and LG-2
  • If a load generator is limited to a number of Vusers less than its even portion, the maximum number of Vusers is assigned to it. For example, if LG-1 is allowed to run 30 Vusers and LG-2 500 Vusers, the distribution will be as follows (a sketch of this logic appears after the table):

    Script name # of Vusers LG assignment # of Vusers running after clicking Distribute evenly
    Script-A 100 LG-1, LG-2 30 on LG-1 and 70 on LG-2
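
The following Python sketch reproduces the distribution in these examples. The redistribution rule (give a capped load generator its maximum, then split the remainder among the others) is an assumption chosen to match the tables above; it is not LoadRunner Cloud's actual implementation.

    # "Distribute evenly" for one script, honoring per-load-generator caps.
    # Illustrative only.

    def distribute_evenly(total_vusers, lg_caps):
        assignment = {lg: 0 for lg in lg_caps}
        remaining_lgs = dict(lg_caps)
        remaining = total_vusers
        while remaining > 0 and remaining_lgs:
            share = remaining // len(remaining_lgs)
            capped = {lg: cap for lg, cap in remaining_lgs.items() if cap <= share}
            if not capped:
                for lg in remaining_lgs:          # every remaining LG takes its even share
                    assignment[lg] += share
                break
            for lg, cap in capped.items():        # capped LGs take their maximum
                assignment[lg] += cap
                remaining -= cap
                del remaining_lgs[lg]
        return assignment

    print(distribute_evenly(300, {"LG-1": 500, "LG-2": 500}))   # {'LG-1': 150, 'LG-2': 150}
    print(distribute_evenly(100, {"LG-1": 30, "LG-2": 500}))    # {'LG-1': 30, 'LG-2': 70}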

View the number of assigned Vusers

To see the number of Vusers assigned per script, click + Select scripts and refer to the Vusers column.

Back to top

Configure rendezvous settings

When performing load testing, you need to emulate heavy user load on your system. To help accomplish this, you can instruct Vusers to perform a task at exactly the same moment using a rendezvous point. When a Vuser arrives at the rendezvous point, it waits until the configured percentage of Vusers participating in the rendezvous arrive. When the designated number of Vusers arrive, they are released.

You can configure the way LoadRunner Cloud handles rendezvous points included in scripts.

If scripts containing rendezvous points are included in a load test, click the Rendezvous tab and configure the following for each relevant rendezvous point:

  • Enable or disable the rendezvous point. If you disable it, it is ignored by LoadRunner Cloud when the script runs. Other configuration options described below are not valid for disabled rendezvous points.

  • Set the percentage of currently running Vusers that must reach the rendezvous point before they can continue with the script.

  • Set the timeout (in seconds) between Vusers for reaching the rendezvous point. If a Vuser does not reach the rendezvous point within the configured timeout (measured from when the previous Vuser reached it), all the Vusers that have already reached the rendezvous point are released to continue with the script. A sketch of this release logic follows this list.
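
The following Python sketch illustrates the release rule described above. It is illustrative only and makes simplifying assumptions (for example, that the group of running Vusers is fixed); it is not LoadRunner Cloud's implementation.

    # Vusers waiting at a rendezvous point are released when either the
    # percentage threshold of running Vusers is met, or the gap since the
    # previous arrival exceeds the timeout. Illustrative only.

    def release_points(arrival_times, running_vusers, percent, timeout_s):
        """Return the arrival indexes at which the waiting group is released."""
        releases, waiting, last_arrival = [], 0, None
        needed = max(1, round(running_vusers * percent / 100))
        for i, t in enumerate(arrival_times):
            if last_arrival is not None and t - last_arrival > timeout_s and waiting:
                releases.append(i - 1)        # timeout expired: release the waiting Vusers
                waiting = 0
            waiting += 1
            last_arrival = t
            if waiting >= needed:             # threshold reached: release the group
                releases.append(i)
                waiting = 0
        return releases

    # 10 running Vusers, 50% threshold, 30-second timeout between arrivals
    print(release_points([0, 2, 5, 9, 12, 60, 61, 62, 63, 64], 10, 50, 30))   # [4, 9]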

Note:  

  • When you select a rendezvous point in the list, all the scripts in which that rendezvous point is included are displayed on the right (under the configuration settings). If the script is disabled, its name is grayed out.

  • If no rendezvous points are displayed for a script that does contain such points, go to Assets > Scripts and reload the script.

Back to top

Configure SLAs

Service level agreements (SLAs) are specific goals that you define for your load test run. After a test run, LoadRunner Cloud compares these goals against performance related data that was gathered and stored during the course of the run, and determines whether the SLA passed or failed. Both the Dashboard and Report show the run's compliance with the SLA.

On the SLA page, you set a percentile (the percentage of transactions expected to complete successfully, 90% by default) and the following goals:

Goal Description

Percentile TRT (sec)

The expected transaction response time within which the specified percentage of transactions should complete (by default, 3 seconds). If the response time for that percentage of transactions exceeds the SLA's value, the SLA is considered "broken".

For example, assume you have a transaction named "Login" in your script, where the 90th percentile TRT (sec) value is set to 2.50 seconds. If more than 10% of successful transactions that ended during a 5-second window have a response time greater than 2.50 seconds, the SLA is considered "broken" and shows a Failed status. For details about how the percentiles are calculated, see Percentiles. (A sketch of this check appears after the table.)

Failed TRX (%)

The percentage of allowed failed transactions, by default 10%. If the percentage of failed transactions exceeds the SLA's value, the SLA is considered "broken" and shows a Failed status.
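
A minimal Python sketch of the percentile check in that example. The 5-second window, the 90th percentile, and the 2.50-second limit come from the text above; the evaluation function itself is an illustrative assumption, not LoadRunner Cloud's implementation.

    # Evaluate one 5-second window of successful transactions against a
    # 90th-percentile TRT SLA of 2.50 seconds. Illustrative only.

    def percentile_trt_broken(response_times, percentile=90.0, trt_limit=2.50):
        if not response_times:
            return False
        over_limit = sum(1 for rt in response_times if rt > trt_limit)
        # Broken if more than (100 - percentile)% of transactions exceed the limit.
        return over_limit / len(response_times) * 100 > (100 - percentile)

    window = [1.2, 1.8, 2.1, 2.6, 2.7, 1.4, 1.9, 2.0, 1.1, 2.55]
    print(percentile_trt_broken(window))   # True: 3 of 10 (30%) exceed 2.50 s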

You can set a separate SLA for each transaction in your test. You can also select multiple SLAs at one time and use bulk actions to apply one or more settings to the set.

You configure one or more SLAs (service level agreements) for each load test.

To configure the SLA:

  1. In the Load Tests tab, choose a test and open the SLA pane.
  2. Set percentile settings:

    1. Select the percentage of transactions expected to complete successfully. The default value is 90%.

    2. Set the Percentile TRT (sec) (transaction response time) value. This is the expected time it takes the specified percent (from the previous step) of transactions to complete.

      For example, if you set your percentile to 90 and the Percentile TRT (sec) value to 2.50 seconds, then if more than 10% of successful transactions that ended during a 5 second interval have a response time higher than 2.50 seconds, an SLA warning is recorded (using the average over time algorithm). For details, see Percentiles.

      The default Percentile TRT (sec) value is 3.00 seconds.

      Tip: If you have run your test more than once, LoadRunner Cloud will display a suggested percentile TRT value. Click the button to use the suggested value.

    3. Check Stop to stop the test if the SLA is broken.
    4. Clear the Enable option adjacent to the Percentile TRT (sec) column if you do not want the percentile TRT (seconds) value to be used during the test run.

      Note: For tenants created from January 2021 (version 2021.01), SLAs will be disabled by default. You can enable them manually or use the Bulk actions dropdown to enable SLAs for multiple scripts. To change this behavior and have all SLAs enabled by default, open a support ticket.

  3. Set failed transaction settings:
    1. Set the Failed TRX (%) value (default 10%). If the number of transaction failures exceeds this value, the SLA is assigned a status of Failed.
    2. Set the Enable option if you want the Failed TRX (%) value to be used during the test run.
  4. Optionally, select the Set test status to Failed if one or more Vusers fail option to allow your SLAs to be dependent on Vuser success. If even one Vuser fails, it will give the test a Failed status. By default, this option is disabled. To enable it, submit a support ticket requesting this option.

Tip: Select multiple SLAs at one time and use Bulk actions to apply one or more settings to the selection.

Back to top

Generate single user performance data

When enabled, this feature lets you create client side breakdown data to analyze user experience for a specific business process.

Note: You cannot generate single user performance data for:

  • Load tests configured to the Iterations run mode.
  • Scripts configured to run on on-premises load generators.

Select one or more of the following report types:

Report type How to
NV Insights

Generate a comprehensive report based on the selected script that provides information about how your application performs for a specific business process.

Generate an NV Insights report:

  1. Select the Single user performance pane.
  2. Under the Client Side Breakdown tab, select the NV Insights check box.
  3. Click Select Script and select a script for the report.

Note:

  • It can take several minutes for the NV Insights report to be generated when viewing it for the first time in a load test.

  • The NV Insights report does not support scripts in which WinInet is enabled.

For details, see The NV Insights report.

TRT Breakdown

Generate TRT breakdown data:

  1. Select the Single user performance pane.
  2. Under the Client Side Breakdown tab, select the Transaction Response Time Breakdown check box.
  3. Click Select Scripts and select up to five scripts.
  4. To display the results, from the Dashboard, select the Breakdown widget.

For details, see Transaction response time breakdown data.

WebPage

Generate a WebPage report:

  1. Select the Single user performance pane.
  2. Under the WebPage Test Report tab, enter a URL of your application. You can add up to 3 URLs.
  3. Select a network speed.

    Network speed Description

    Cable 5/1 Mbps, 28 ms RTT
    DSL 1.5 Mbps/384 Kbps, 50 ms RTT
    FIOS 20/5 Mbps, 4 ms RTT
    56K Dial-Up 49/30 Kbps, 120 ms RTT
    Mobile 3G 1.6 Mbps/786 Kbps, 300 ms RTT
    Mobile 3G - Fast 1.6 Mbps/786 Kbps, 150 ms RTT
    Native Connection No traffic shaping

For details, see WebPage Test report data.

Back to top

See also: