Design a performance test

This section provides best practices and instructions for designing a performance test.

Performance test design best practices

Before you design a performance test, we recommend that you complete the following:

• Create scripts: Create scripts for your test in the relevant application (VuGen scripts, JMeter scripts, Unified Functional Testing tests, Service Test tests), and upload them to Performance Center. For details, see the relevant User Guide.

• Design topologies: Design topologies of your application under test. For details, see Design topologies.

• Create monitor profiles: Configure the monitor machines or monitor-over-firewall agents that you want to use to monitor the test run, and define monitor profiles for these monitors. For details, see Configure monitor profiles.

• Enable diagnostics: To enable diagnostic modules to collect diagnostics data from the test run, set up the relevant diagnostics components (server/mediators). For details, see Manage diagnostics servers.

• Create virtualized services: To use simulated services during your test run instead of loading actual services, create your virtualized services in Service Virtualization Designer. For details on creating projects that contain virtualized services, see the Service Virtualization Help Center.

Back to top

Design a performance test

This task describes how to design a performance test.

  1. Prerequisites

    • Make sure the relevant scripts/tests have been uploaded/saved to Performance Center. You can use VuGen scripts and JMeter scripts for performance testing, as well as UFT GUI tests, and UFT API tests.

      Where to find details for each script/test type:

      • VuGen scripts: see Upload VuGen scripts.
      • JMeter scripts: see Set up the JMeter test.
      • UFT GUI tests: see the Unified Functional Testing User Guide, available from the UFT Help Center.
      • UFT API tests: see the Unified Functional Testing User Guide, available from the UFT Help Center.
    • Under Resources, select Hosts and make sure that there is at least one Controller, one load generator, and one data processor in the host pool of your project. If not, contact your administrator to add them.

    • For optional pre-design best practices, see Performance test design best practices.
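The host-pool prerequisite above can be expressed as a simple check: the pool must cover all three host purposes. The sketch below is illustrative only; the dictionary layout and purpose names are assumptions based on the terms used in this step, not a Performance Center data structure or API.

```python
# Sketch: verify that a host pool covers the three required purposes
# (Controller, load generator, data processor), per the prerequisite
# above. Illustrative only -- not Performance Center code.

REQUIRED_PURPOSES = {"Controller", "Load Generator", "Data Processor"}

def missing_purposes(host_pool):
    """Return the set of required purposes not covered by any host."""
    covered = set()
    for host in host_pool:
        covered.update(host["purposes"])
    return REQUIRED_PURPOSES - covered

pool = [
    {"name": "host1", "purposes": {"Controller", "Load Generator"}},
    {"name": "host2", "purposes": {"Data Processor"}},
]
assert not missing_purposes(pool)  # all three purposes are covered
```

If the returned set is non-empty, the step above applies: contact your administrator to add the missing host types to the pool.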

  2. Create a new performance test

    1. From the Performance Center navigation toolbar, click and select Test Management (under Testing).

    2. In the test management tree, select the Subject root folder and click New Folder. Type a folder name and click OK.

    3. Select the folder from the tree.

    4. Click New Test. Fill in the fields in the Create New Performance Test dialog box. For details, see Create a test.

  3. Design a workload for the test

    Designing a workload involves creating Vuser groups, distributing Vusers among the Vuser groups, assigning hosts to the Vuser groups, and defining a run schedule for the test. For task details, see Define a performance test workload.

    Note: Non-English national characters are not supported in group names.
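The workload concepts in this step, distributing Vusers among groups and ramping them up on a schedule, can be illustrated with a small sketch. The proportional-split and ramp-up logic below is generic and does not reflect Performance Center's actual scheduler; it also applies the group-name restriction from the note above as a plain ASCII check, which is an assumption about how that restriction is enforced.

```python
# Sketch: distribute Vusers among Vuser groups and build a simple
# ramp-up schedule. Illustrative only -- not Performance Center's
# scheduler.

def validate_group_name(name):
    # Per the note above, non-English national characters are not
    # supported in group names (checked here as plain ASCII).
    if not name.isascii():
        raise ValueError(f"Group name {name!r} contains non-ASCII characters")
    return name

def distribute_vusers(total, weights):
    """Split `total` Vusers among groups in proportion to `weights`."""
    groups = {validate_group_name(g): 0 for g in weights}
    weight_sum = sum(weights.values())
    allocated = 0
    for i, (name, w) in enumerate(weights.items()):
        if i == len(weights) - 1:
            groups[name] = total - allocated  # remainder goes to the last group
        else:
            share = total * w // weight_sum
            groups[name] = share
            allocated += share
    return groups

def ramp_up_schedule(vusers, step, interval_s):
    """Yield (time_s, running_vusers) pairs, starting `step` more
    Vusers every `interval_s` seconds until all are running."""
    t, running = 0, 0
    while running < vusers:
        running = min(running + step, vusers)
        yield (t, running)
        t += interval_s

print(distribute_vusers(100, {"web": 3, "api": 1}))  # {'web': 75, 'api': 25}
print(list(ramp_up_schedule(10, 4, 30)))             # [(0, 4), (30, 8), (60, 10)]
```

In the actual product, this distribution and schedule are defined in the Performance Test Designer, as described in Define a performance test workload.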

  4. Integrate virtualized services - optional

    Configure and integrate virtualized services into the performance test. For task details, see Add virtualized services to performance tests.

    You can start adding projects that contain virtualized services to your performance test from the Performance Test Designer at any point in the design process, but we recommend adding them after you have added the relevant scripts to the test.

  5. Select a topology for the test - optional

    Note: Before you can select a topology for a test, you must design the topology. To design topologies, see How to design topologies.

    In the Performance Center Designer's Topology tab, click Select Topology and select a topology for the test. For user interface details, see Performance Test Designer > Topology.

  6. Select or create monitor profiles to monitor the test - optional

    In the Performance Center Designer's Monitors tab, click Add Monitor Profile or Add Monitor OFW. The respective tree opens on the right.

    • To select existing profiles or monitor-over-firewall agents, select the monitors to add to the test, and click Add selected monitors.

      Note: Before you can select monitors for a test, you must configure monitor machines and create monitor profiles. For details, see Create and configure monitor profiles.

      Similarly, you must define monitor-over-firewall agents in the system before you can select them to monitor a test.

    • To create a new monitor profile or monitor-over-firewall agent, click the Create Monitor or Create Monitor OFW button.

      For a monitor profile: In the Create Monitor Profile dialog box, enter a name and description, and select a folder for storing the profile. Click OK, and create a new monitor profile as described in Create and configure monitor profiles.

      For a monitor-over-firewall agent: In the Create Monitor Over Firewall dialog box, enter a name and the machine key, select a folder for storing the profile, and select the MI Listener with which the monitor connects. Click OK.

      After you have created new monitor profiles or monitor-over-firewall agents, select the monitors to add to the test, and click Add selected monitors/monitors over firewall.

    For user interface details, see Performance Test Designer > Monitors.
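Conceptually, a monitor profile pairs the machines to monitor with the counters to collect from each. The sketch below shows one minimal way to represent that pairing as data; the field names, monitor types, and counter names are illustrative assumptions, not Performance Center's schema.

```python
# Sketch: a monitor profile as data -- which machines to monitor and
# which counters to collect from each. Field names are illustrative
# assumptions, not Performance Center's schema.

def make_monitor_profile(name, entries):
    """entries: list of (machine, monitor_type, counters) tuples."""
    return {
        "name": name,
        "entries": [
            {"machine": machine, "type": mtype, "counters": list(counters)}
            for machine, mtype, counters in entries
        ],
    }

profile = make_monitor_profile(
    "web-tier",
    [("web01", "Windows Resources", ["% Processor Time", "Available MBytes"]),
     ("db01", "SQL Server", ["Transactions/sec"])],
)
assert len(profile["entries"]) == 2
```

In the product itself, this information is captured through the dialog boxes described above rather than as code.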

  7. Enable and configure J2EE/.NET diagnostics - optional

    Enable and configure diagnostic modules to collect J2EE/.NET diagnostics data from the test run. For details, see Diagnostics.

  8. Define service level agreements for the test - optional

    Define service level agreements to measure performance metrics against performance goals. For details, see Define service level agreements.
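To illustrate the idea behind a service level agreement, the sketch below checks a transaction's 90th-percentile response time against a goal. The nearest-rank percentile method, the status strings, and the threshold are illustrative assumptions, not Performance Center's SLA engine.

```python
# Sketch: evaluate a response-time SLA against a percentile goal.
# Illustrative only -- not Performance Center's SLA engine.
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of samples."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def sla_status(samples_s, goal_s, pct=90):
    """Return 'Passed' if the pct-th percentile meets the goal."""
    return "Passed" if percentile(samples_s, pct) <= goal_s else "Failed"

response_times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.2, 0.7, 1.3, 0.95, 1.05]
print(sla_status(response_times, goal_s=2.0))  # Passed (90th percentile = 1.3 s)
```

In the product, the metrics, goals, and load criteria are configured in the SLA wizard, as described in Define service level agreements.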

  9. Results

    When you save the test, it goes through a validation process. The test is valid only if it contains no errors. The result of the validation is stated at the bottom of the Test Designer window.

    If there are errors, click the link to open the Test Validation Results dialog box and view the details of the validation results. For user interface details, see Test Validation Results dialog box.

    Note: If you make changes to the test, and the test is linked to a timeslot, the timeslot is updated with these changes automatically.

  10. Add the test to a test set

    1. From the Performance Center navigation toolbar, click and select Test Management (under Testing).

    2. Click the Assigned Test Set drop-down arrow, and assign the test to an existing test set, or click Edit, and create a new test set in the Manage Test Set dialog box. For details, see Create, edit, or assign a test set.

Back to top