Define a load test

A load test is a test designed to measure an application's behavior under normal and peak conditions. You add and configure scripts, monitors, and SLAs to a load test definition.

Get started

From your Home page, you can select a project, or create, edit, or duplicate a test.

Select a project

On the Home page, select a project from the menu bar.

Create a test

Perform one of the following steps:

  • On the Home page, click Create a test.
  • On the Load Tests page, click Create.

Edit a test

Highlight a test in the grid, and then click Edit or click the test name link in the test grid.

Duplicate a test

Click Duplicate to copy a test definition to a new test.

Note: To use this feature, the test name must contain 120 characters or fewer.

Give your test a label

Organize your tests by assigning them labels. For details, see Assign labels.

Back to top

Navigation bar

The Load Tests navigation bar provides the following options:

General. Configure basic, run configuration, log, and load generator settings. For details, see Define general test settings.

Scripts. Shows a list of the scripts and lets you set a script schedule. For details, see Configure a schedule for your script and Manage scripts.
Monitors. Shows a list of monitors. For details, see Add monitors to a load test.
Distribution. Choose the load generator machines for your Vusers. Optionally, assign scripts and the number of Vusers for on-premises load generators. For details, see Configure load generator locations.
Rendezvous. Set up a rendezvous for your Vusers. For details, see Configure rendezvous settings.
SLA. Shows the SLA for the test. For details, see Configure SLAs.
Single user performance. Collect client-side breakdown data. For details, see Generate single user performance data.
Streaming. Shows a list of the streaming agents. For details, see Data streaming. (This option is only visible when data streaming is enabled for your tenant via a support ticket.)
Schedules. Set a schedule for the test run. For details, see Schedule a test run.
Runs. Opens the Runs pane listing the runs for the selected test along with run statistics such as run status, regressions, and failed transactions. For details, see Find out the status of my test run.
Trends. For details, see Trends.

Back to top

Configure a schedule for your script

In the Scripts page, you configure a schedule for each script.

There are three modes of script schedules: Simple, Manual, and Advanced. For all modes, you configure basic settings, which may differ depending on the run mode that you selected in the General settings.

To configure the Simple, Manual, or Advanced mode settings, click the arrow adjacent to a script to expand its details.

Back to top

Configure a goal for a load test

You can configure a load test to run in Goal Oriented mode. If you select this option, you configure a goal for the test, and when the test runs, it continues until the goal is reached. For more details, see How LoadRunner Cloud works with goals below.

To configure a goal-oriented test:

  1. In the load test's General page, select Run Mode > Goal Oriented. For details, see Run mode.

  2. In the load test's Scripts page, configure the % of Vusers and Location for each script. For details, see Configure a schedule for your script.

  3. In the load test's Scripts page, click the Goal settings button. In the Goal Settings dialog box that opens, configure the following:

    Goal type

    Select the goal type:

    • Hits per second. The number of hits (HTTP requests) to the Web server per second.
    • Transactions per second. The number of transactions completed per second. Only passed transactions are counted.

    Transaction name

    If you selected Transactions per second as the goal type, select or enter the transaction to use for the goal.

    Goal value

    Enter the number of hits per second or transactions per second that must be reached for the goal to be fulfilled.

    Vusers

    Enter the minimum and maximum number of Vusers to be used in the load test.

    Ramp up

    Select the ramp up method and time. This determines the amount of time in which the goal must be reached.

    • Ramp up automatically with a maximum duration of. Ramp up Vusers automatically to reach the goal as soon as possible. If the goal can’t be reached within the maximum duration, the ramp up stops.

    • Reach target number of hits or transactions per second <time> after test is started. Ramp up Vusers to try to reach the goal in the specified duration.

    • Step up N hits or transactions per second every. Set the number of hits or transactions per second that are added, and at what time interval to add them. Hits or transactions per second are added in these increments until the goal is reached.

    Action if target cannot be reached

    Select what to do if the target is not reached in the time frame set by the Ramp up setting.

    • Stop test run. Stop the test as soon as the time frame has ended.

    • Continue test run without reaching goal. Continue running the test for the Duration time setting, even though the goal was not reached.

    Duration

    Set a duration time for the load test to continue running after the time frame set by the Ramp up setting has elapsed.

    Note: The total test time is the sum of the Ramp up time and the Duration.

The following do not support the Goal Oriented run mode:

  • Adding Vusers

  • Generating single user performance data

  • The Timeline tab when scheduling a load test

  • Network emulations for cloud locations

  • Changing the load in a running load test

  • TruClient - Native Mobile scripts

How LoadRunner Cloud works with goals

LoadRunner Cloud runs Vusers in batches to reach the configured goal value. Each batch has a duration of 2 minutes.

In the first batch, LoadRunner Cloud determines the number of Vusers for each script according to the configured percentages and the minimum and maximum number of Vusers in the goal settings.

After running each batch, LoadRunner Cloud evaluates whether the goal has been reached. If it has not, LoadRunner Cloud calculates the number of additional Vusers to add in the next batch.

If the goal has been reached, or there are no remaining Vusers to be added, the ramp up ends. Note that the achieved value can be greater than the defined goal value; LoadRunner Cloud does not remove Vusers to reduce the load.

During run time, if the goal was reached, but subsequently dropped to below the configured value, LoadRunner Cloud will try to add Vusers (if there are any remaining to be added) to reach the goal again.

LoadRunner Cloud does not change think time and pacing configured for scripts.
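The batch-based ramp-up described above can be sketched roughly as follows. This is an illustration only, not the actual LoadRunner Cloud implementation: the way the next batch size is calculated (a linear estimate from the measured rate) and the `measure_rate` callback are assumptions.

```python
def ramp_to_goal(goal_value, min_vusers, max_vusers, measure_rate):
    """Illustrative sketch of goal-oriented ramp-up in fixed-length batches.

    measure_rate(vusers) is a hypothetical stand-in for running one
    2-minute batch and measuring the achieved hits or transactions
    per second.
    """
    vusers = min_vusers
    while True:
        rate = measure_rate(vusers)      # run one batch, measure the rate
        if rate >= goal_value:
            return vusers, rate          # goal reached (may overshoot it;
                                         # Vusers are never removed)
        if vusers >= max_vusers:
            return vusers, rate          # no remaining Vusers to add
        # Assumption: estimate how many more Vusers are needed, treating
        # the rate as roughly linear in the number of Vusers.
        per_vuser = max(rate / vusers, 1e-9)
        needed = int((goal_value - rate) / per_vuser) + 1
        vusers = min(vusers + needed, max_vusers)

# Example: suppose each Vuser contributes roughly 2 hits per second.
vusers, rate = ramp_to_goal(100, 10, 200, lambda v: v * 2.0)
```

In this example the first batch of 10 Vusers measures 20 hits/sec, so 41 more Vusers are estimated and the second batch of 51 Vusers reaches the goal.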

Back to top

Configure load generator locations

For each script in the test, you configure a location—Cloud or On Premise. This is the location of the load generators that run the script. For details, see Configure a schedule for your script.

You configure the settings for each location type in the Distribution pane:

  1. In the Load Tests tab, open the Distribution pane.
  2. Click the tab for the location type you want to configure—Cloud or On Premise.

Back to top

Configure rendezvous settings

When performing load testing, you need to emulate heavy user load on your system. To help accomplish this, you can instruct Vusers to perform a task at exactly the same moment using a rendezvous point. When a Vuser arrives at the rendezvous point, it waits until the configured percentage of Vusers participating in the rendezvous arrive. When the designated number of Vusers arrive, they are released.

You can configure the way LoadRunner Cloud handles rendezvous points included in scripts.

If scripts containing rendezvous points are included in a load test, click the Rendezvous tab and configure the following for each relevant rendezvous point:

  • Enable or disable the rendezvous point. If you disable it, it is ignored by LoadRunner Cloud when the script runs. Other configuration options described below are not valid for disabled rendezvous points.

  • Set the percentage of currently running Vusers that must reach the rendezvous point before they can continue with the script.

  • Set the timeout (in seconds) between Vusers for reaching the rendezvous point. This means that if a Vuser does not reach the rendezvous point within the configured timeout (from when the previous Vuser reached the rendezvous point), all the Vusers that have already reached the rendezvous point are released to continue with the script.
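The percentage and timeout rules above can be sketched as a small simulation. This is an illustrative model, not LoadRunner Cloud's implementation; the function name and the representation of arrivals as timestamps are assumptions.

```python
def rendezvous_releases(arrival_times, running_vusers, percentage, timeout):
    """Sketch of the rendezvous release rule (illustrative only).

    Waiting Vusers are released when either:
      - the configured percentage of running Vusers has arrived, or
      - the gap since the previous arrival exceeds the timeout.
    Returns a list of (release_time, vusers_released) tuples.
    """
    required = max(1, int(running_vusers * percentage / 100))
    waiting, releases = [], []
    for i, t in enumerate(arrival_times):
        # Timeout: the next Vuser took too long to arrive, so the Vusers
        # already waiting are released to continue with the script.
        if waiting and t - arrival_times[i - 1] > timeout:
            releases.append((arrival_times[i - 1] + timeout, len(waiting)))
            waiting = []
        waiting.append(t)
        if len(waiting) >= required:     # percentage reached: release now
            releases.append((t, len(waiting)))
            waiting = []
    return releases
```

For example, with 10 running Vusers and a 50% setting, five arrivals trigger a release; with a 3-second timeout, arrivals at 0 s, 1 s, and 10 s release the first two Vusers at 4 s (1 s + the 3 s timeout).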

Note:  

  • When you select a rendezvous point in the list, all the scripts in which that rendezvous point is included are displayed on the right (under the configuration settings). If the script is disabled, its name is grayed out.

  • If no rendezvous points are displayed for a script that does contain such points, go to Assets > Scripts and reload the script.

Back to top

Configure SLAs

Service level agreements (SLAs) are specific goals that you define for your load test run. After a test run, LoadRunner Cloud compares these goals against performance-related data that was gathered and stored during the course of the run, and determines whether the SLA passed or failed. Both the Dashboard and Report show the run's compliance with the SLA.

On the SLA page, you set a percentile, the percentage of transactions expected to complete successfully (by default 90%), and the following goals:

Percentile TRT (sec)

The maximum transaction response time within which the specified percentage of transactions is expected to complete (by default 3 seconds). If more than the allowed percentage of transactions exceeds this response time, the SLA is considered "broken".

For example, assume you have a transaction named "Login" in your script, where the 90th percentile TRT (seconds) value is set to 2.50 seconds. If more than 10% of successful transactions that ended during a 5 second window have a response time greater than 2.50 seconds, the SLA is considered "broken" and shows a Failed status. For details about how the percentiles are calculated, see Percentiles.

Failed TRX (%)

The percentage of allowed failed transactions, by default 10%. If the percentage of failed transactions exceeds the SLA's value, the SLA is considered "broken" and shows a Fail status.
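The two SLA goals above amount to a pair of threshold checks, sketched below under simplifying assumptions: LoadRunner Cloud actually evaluates the percentile over 5-second windows with an average-over-time algorithm, whereas this sketch checks a single flat set of transaction results.

```python
def evaluate_sla(response_times, failed_count, percentile=90,
                 percentile_trt=3.0, failed_trx_pct=10.0):
    """Illustrative SLA check (a sketch, not the product algorithm).

    response_times: response times (seconds) of successful transactions.
    failed_count:   number of failed transactions.
    Returns (trt_ok, failed_ok) for the two goals.
    """
    total = len(response_times) + failed_count
    # Percentile TRT goal: no more than (100 - percentile)% of successful
    # transactions may exceed the Percentile TRT (sec) value.
    slow = sum(1 for t in response_times if t > percentile_trt)
    trt_ok = slow <= len(response_times) * (100 - percentile) / 100
    # Failed TRX goal: the percentage of failed transactions must not
    # exceed the Failed TRX (%) value.
    failed_ok = failed_count <= total * failed_trx_pct / 100
    return trt_ok, failed_ok
```

For example, with the defaults, one slow transaction out of ten successful ones still passes (exactly 10%), but two slow ones out of ten break the Percentile TRT goal.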

You configure one or more SLAs (service level agreements) for each load test.

To configure an SLA:

  1. In the Load Tests tab, choose a test and open the SLA pane.
  2. Set percentile settings:

    1. Select the percentage of transactions expected to complete successfully. The default value is 90%.

    2. Set the Percentile TRT (sec) (transaction response time) value. This is the expected time it takes the specified percent (from the previous step) of transactions to complete.

      For example, if you set your percentile to 90 and the Percentile TRT (sec) value to 2.50 seconds, then if more than 10% of successful transactions that ended during a 5 second interval have a response time higher than 2.50 seconds, an SLA warning is recorded (using the average over time algorithm). For details, see Percentiles.

      The default Percentile TRT (sec) value is 3.00 seconds.

      Tip: If you have run your test more than once, LoadRunner Cloud will display a suggested percentile TRT value. Click the button to use the suggested value.

    3. Check Stop to stop the test if the SLA is broken.
    4. Clear the Enable option adjacent to the Percentile TRT (sec) column if you do not want the percentile TRT (seconds) value to be used during the test run.

      Note: For tenants created from January 2021 (version 2021.01), SLAs will be disabled by default. You can enable them manually or use the Bulk actions dropdown to enable SLAs for multiple scripts. To change this behavior and have all SLAs enabled by default, open a support ticket.

  3. Set failed transaction settings:
    1. Set the Failed TRX (%) value (default 10%). If the percentage of failed transactions exceeds this value, the SLA is assigned a status of Failed.
    2. Select the Enable option if you want the Failed TRX (%) value to be used during the test run.

Tip: Select multiple SLAs at one time and use Bulk actions to apply one or more actions to the set.

Back to top

Generate single user performance data

When enabled, this feature lets you create client side breakdown data to analyze user experience for a specific business process.

Note: You cannot generate single user performance data for:

  • Load tests configured for the Iterations run mode.
  • Scripts configured to run on on-premises load generators.

Select one or more of the following report types:

NV Insights

Generate a comprehensive report based on the selected script that provides information about how your application performs for a specific business process.

Generate an NV Insights report:

  1. Select the Single user performance pane.
  2. Under the Client Side Breakdown tab, select the NV Insights check box.
  3. Click Select Script and select a script for the report.

Note:

  • It can take several minutes for the NV Insights report to be generated when viewing it for the first time in a load test.

  • The NV Insights report does not support scripts in which WinInet is enabled.

For details, see The NV Insights report.

TRT Breakdown

Generate TRT breakdown data:

  1. Select the Single user performance pane.
  2. Under the Client Side Breakdown tab, select the Transaction Response Time Breakdown check box.
  3. Click Select Scripts and select up to five scripts.
  4. To display the results, open the Dashboard and select the Breakdown widget.

For details, see Transaction response time breakdown data.

WebPage

Generate a WebPage report:

  1. Select the Single user performance pane.
  2. Under the WebPage Test Report tab, enter a URL of your application. You can add up to 3 URLs.
  3. Select a network speed.

    Cable. 5/1 Mbps, 28 ms RTT
    DSL. 1.5 Mbps/384 Kbps, 50 ms RTT
    FIOS. 20/5 Mbps, 4 ms RTT
    56K Dial-Up. 49/30 Kbps, 120 ms RTT
    Mobile 3G. 1.6 Mbps/786 Kbps, 300 ms RTT
    Mobile 3G - Fast. 1.6 Mbps/786 Kbps, 150 ms RTT
    Native Connection. No traffic shaping

For details, see WebPage Test report data.

Back to top

Schedule a test run

Schedule a test to run on a specific date and time.

Caution: The most up-to-date test settings will be used when the test is launched.

To schedule a test:

  1. From the Load Tests page, open the Schedules pane.
  2. Click the Add schedules button.

    Note: You can configure up to 50 schedules for a load test.

  3. From the date picker, select a date to run the test.
  4. From the time picker, specify a time to run the test.
  5. To apply the schedule, click the toggle to On.

    The Schedule On button appears in the cart section indicating that a schedule is set for this load test.

Tip: On the Load Tests page, a test with a set schedule has a schedule icon displayed next to its status.

On the lower part of the page, underneath the list of set schedules, the following two tabs show additional schedule information:

Runs tab

The Runs tab displays a list of previously scheduled runs and their launch status.

  • Launched. The test was launched successfully. Click the run ID to view results.

  • Failed to launch. The test was not launched successfully.

    Possible reasons for a failed launch:

    • The number of Vusers scheduled in your test exceeds the amount remaining in your license.

    • A load generator or monitor that is defined in your test is down.

Timeline tab

Note:  

  • This tab is only displayed for tests whose license mode is set as VU.
  • This tab is not displayed for load tests configured for the Iterations run mode.

The Timeline tab displays the timeline for the selected schedule, as well as for any other schedule in the project that overlaps with the selected schedule.

If the current project uses global licenses (that is, it does not have dedicated licenses assigned to it), then schedules from other global license projects that overlap with the selected schedule are also displayed.

The timeline is colored according to its status, as follows:

  • OK (green). There are no conflicts and this schedule will run as planned.

  • Conflicting (orange). This schedule impacts another schedule and will cause it to breach your license count, but this schedule will still run as planned.

  • Blocked (red). This schedule will breach your license count and may not run.

Back to top

Run preview

The Run Preview window lets you see the following details before running your test:

  • how load generators are allocated (both cloud and on-premises)

  • how scripts are distributed across load generators

  • the number of Vusers allocated to each load generator per script

Knowing this information will help you better understand your load test runs and run them more efficiently. For example, if Vusers were distributed unevenly, you will be able to understand why CPU alerts occurred on specific load generator machines.

In addition, the run preview lets you verify the changes that you may have made to a test's settings. For example, if you adjusted the distribution locations or if you selected a different on-premises load generator, the run preview lets you see its impact on the load distribution.

The grid shows the load generator vendors, their locations, the scripts that will be used in the load test, and the number of Vusers running on each load generator.

To show the run preview:

  1. On the Load Tests page, click on a test.
  2. In the right corner of the masthead, click Run Preview. The Run Preview window opens showing your Vuser distribution. If LoadRunner Cloud detects a potential error, you will be alerted when you open the window.

  3. Optionally, expand the Group by dropdown to select another way to display the information—by Vendor, Location, Load generator, or Script.

Back to top

Run the test

Once you configure all of the settings, you are ready to run the load test. Make sure you have set up Monitors and Data streaming if they are relevant for your test run.

Before you run the test, click Run preview to review the Vuser distribution, and check the Run Configuration Summary toolbar in the masthead:

Note:

  • Before running your load test, if you defined any server monitors, make sure that they are up and accessible so that LoadRunner Cloud can monitor the test and display results in the dashboard.
  • An on-premises load generator can only be used by one running test at a time.
  • The number of Vusers defined in your test must not exceed the maximum number of Vusers defined in your Vuser license.

To run the test:

  1. Click Run Test. The Dashboard opens showing the default metrics for your test run in real time.

    During the test run, the dashboard's toolbar shows summary information such as the elapsed time, number of running Vusers, and so forth.

  2. To pause the scheduling during the test run, click the Pause scheduling button on the toolbar. To resume, click the button again. For details, see Pause scheduling during a test run.
  3. To add Vusers during a test run, click the Change Load button. For details, see Change the Vuser load dynamically.
  4. To end a test run before its completion, click Stop Test.

Back to top

See also: