Test Express Designer

Use the Test Express Designer to simplify the process of creating, designing, and running performance tests.

Test Express Designer overview

The Test Express Designer guides you through each step, including selecting scripts for a test, scheduling how many Vusers to run in each script and for how long to run them, and selecting a post-run action.

During the test run, you can view the performance test initialization steps performed by Performance Center and the status of each step. If all steps complete successfully, the performance test starts running and the Performance Test Run page opens.

To access

From the Performance Center navigation toolbar, click the toolbar icon and select Create New Test Express > Test.

Important information: You can refine and adjust your performance test using the Performance Test Designer window. For details, see Performance Test Designer window.

Back to top

Step 1 - Scripts Pane

This pane enables you to select VuGen scripts to run in the performance test.

User interface elements are described below (unlabeled elements are shown in angle brackets):

UI Elements

Description

Add selected scripts to groups. Adds the selected VuGen scripts to the Step 2 - Design pane.

Tip: You can also add a script by dragging it from the tree to the Groups table in the Design pane.

<refresh button> Refreshes the resources tree.
<resources tree> Displays the available VuGen scripts.

Back to top

Step 2 - Design

This pane enables you to define a performance test. Defining a performance test includes setting the test run duration and selecting the load generators on which to run the Vuser groups.

User interface elements are described below:

UI Elements

Description

Test Name

The name of the test.

Test Folder

The test plan folder name.

Groups

Name. The name of the Vuser group.

Script. The name of the Vuser script.

Vusers. The number of Vusers assigned to the group.

Load Generators. The load generator on which the group runs. Click the down arrow to select a load generator.

Command Line. Opens the Script Command Line dialog box, enabling you to type the name and value of the parameter you want to send using the format <Parameter_Name> <value>. For information about the command line parsing functions, or for details about including arguments on a command line, see the LoadRunner Online Function Reference, provided with Virtual User Generator.
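The <Parameter_Name> <value> format pairs each parameter name with the value that follows it. As an illustrative sketch only (the function name and sample parameters below are hypothetical; inside a VuGen script, attributes passed this way are typically read with functions such as lr_get_attrib_string, documented in the LoadRunner Online Function Reference), the pairing can be modeled as:

```python
def parse_command_line(cmdline: str) -> dict:
    """Split a "<Parameter_Name> <value> ..." string into a name/value map."""
    tokens = cmdline.split()
    if len(tokens) % 2 != 0:
        raise ValueError("expected alternating parameter names and values")
    # Even-indexed tokens are names, odd-indexed tokens are their values.
    return dict(zip(tokens[0::2], tokens[1::2]))

# Hypothetical command line: two parameters, "env" and "users".
params = parse_command_line("env staging users 50")
```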

Start/End Vusers

You can start all Vusers simultaneously or start a specific number of Vusers gradually.

  • To start all Vusers simultaneously: Move the slider all the way to the right.
  • To start a specific number of Vusers gradually: Move the slider to the appropriate predefined setting. The Scheduler runs X Vusers, waits the predefined time, and then runs another X Vusers.
Duration

The duration of the test run in hours and minutes.

Note: This indicates the test run duration once the gradual initialization and stopping of all Vusers is done.
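The gradual-start behavior described above can be sketched as follows. This is a minimal illustration, assuming hypothetical batch-size and interval values (in practice both come from the slider's predefined settings); the configured duration begins only after this ramp-up completes.

```python
def ramp_up_schedule(total_vusers: int, batch_size: int, interval_s: int):
    """Return (start_time_s, vusers_running) milestones for a gradual start:
    start `batch_size` Vusers, wait `interval_s` seconds, repeat until done."""
    schedule = []
    started = 0
    t = 0
    while started < total_vusers:
        batch = min(batch_size, total_vusers - started)
        started += batch
        schedule.append((t, started))
        t += interval_s
    return schedule

# Hypothetical settings: 10 Vusers, started 4 at a time, every 30 seconds.
milestones = ramp_up_schedule(10, 4, 30)  # [(0, 4), (30, 8), (60, 10)]
```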

Scheduler Preview

Displays a preview graph by Vuser group. For details, click the Scheduler Preview tooltip icon.

Back to top

Step 3 - Run

This pane enables you to run the performance test and determine the action to be taken when the test run is complete.

User interface elements are described below:

UI Elements

Description

Post-Run Actions

The action to be taken when the test run is complete.

If the system administrator has set the post-run action, the selected action is the only option available for all tests across the project, and you cannot change the setting.

If the system administrator has not set the post-run action, select one of the following actions:

  • Do not collate results. Frees the machines immediately after the performance test ends; the run results are left on the load generators. You can analyze the results at a later stage from the Results tab.
  • Collate results. When the run has finished, the run results are collected from all the load generators. This is recommended because collating the results takes only a few minutes, and can prevent the results from being lost or becoming inaccessible if a load generator later becomes unavailable.
  • Collate and analyze results. When the run has finished, the run results are collected and analyzed. Data analysis requires some time, depending on the size of the results file. If there is no timeslot available to include the data analysis, select the Collate option instead, and run late Analysis when a data processor becomes available. You run late Analysis from the Results tab. For user interface details, see Manage results.

Note: This feature controls the action performed immediately after the run ends. It does not prevent collating or analyzing results from the Test Runs page at a later time.

Test Validation

Displays the validation results of your performance test.

  • Level. The type of message: Error or Warning.
  • Details. Describes the error or warning.
Save and Run

Saves and runs the performance test.

The Process Details page opens, displaying the performance test initialization steps performed by Performance Center and the status for each step. For user interface details, see Process Details Page.

If each step is completed successfully, the performance test starts running and the Performance Test Run page opens. For user interface details, see Performance test run page (online screen).

Save

Saves the performance test.

Back to top

Process Details Page

This page displays the performance test initialization steps performed by Performance Center and the status for each step.

User interface elements are described below (unlabeled elements are shown in angle brackets):

UI Elements

Description

<progress chart> Displays the progress of the test initialization steps as a percentage.
Step Name

Displays the following steps:

  • RegisterRun. Performance Center initializes the run process.
  • ValidateScripts. Performance Center checks that the selected scripts' Run Logic runtime settings are in sync with the scripts' state.
  • GetReservationData. Performance Center retrieves the required resource information from the selected timeslot.
  • CheckDiskSpace. Performance Center checks that there is enough disk space on the Controller.
  • LaunchController. Performance Center initializes the Controller so that other testing entities, such as load generators and scripts, can connect to it. If there is a problem launching the Controller, Performance Center automatically attempts to find an alternative Controller. This attempt appears as an additional initialization step. If no alternative Controller is available, the step fails.
  • ConnectToLGs. Performance Center checks that the required load generators are valid and connects them to the Controller. If this step fails, Performance Center attempts to find alternative load generators automatically. If this step still fails, check the test definitions and select different load generators for the test.
  • DownloadScripts. Performance Center downloads the required Vuser scripts.
  • StartControllerServices. Performance Center initializes the Controller's configuration settings in preparation to run the performance test.
  • MapVirtualHosts. Performance Center maps virtual hosts to actual hosts.
  • LoadLTOMToController. Performance Center creates the performance test and adds the Vuser scripts to the Controller.
  • StartRun. Performance Center starts the performance test run.
Description

A detailed description of the current status of the step.
Status

Displays whether the step passed or failed.

System Messages

Displays error messages generated when a step fails. These error messages can also be viewed from the Event Log. For details about the event log, see View test run event logs.
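The initialization sequence above runs the steps in order and stops at the first failure, which is then reported in the Status column. A minimal sketch of that control flow (the function names are hypothetical; the step names are those listed above):

```python
STEPS = [
    "RegisterRun", "ValidateScripts", "GetReservationData", "CheckDiskSpace",
    "LaunchController", "ConnectToLGs", "DownloadScripts",
    "StartControllerServices", "MapVirtualHosts", "LoadLTOMToController",
    "StartRun",
]

def run_initialization(execute):
    """Run the steps in order, stopping at the first failure.

    `execute` maps a step name to True (passed) or False (failed).
    Returns (statuses, succeeded)."""
    statuses = {}
    for step in STEPS:
        passed = execute(step)
        statuses[step] = "Passed" if passed else "Failed"
        if not passed:
            return statuses, False  # later steps are not attempted
    return statuses, True

# Hypothetical run where connecting the load generators fails.
statuses, ok = run_initialization(lambda step: step != "ConnectToLGs")
```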

Back to top
