How to Design a Performance Test

This task describes how to design a performance test.

  1. Prerequisites

    • Make sure the relevant scripts and tests have been uploaded or saved to Performance Center. You can use VuGen scripts and JMeter scripts for performance testing, as well as UFT GUI tests and UFT API tests. For an illustrative scripted check of this prerequisite, see the sketch after this list.

      Script/Test       Details
      VuGen             Upload VuGen Scripts
      JMeter            Set up the JMeter test
      UFT GUI tests     Unified Functional Testing User Guide, available from the UFT Help Center
      UFT API tests     Unified Functional Testing User Guide, available from the UFT Help Center
    • Under Lab Resources, select Testing Hosts, and make sure that the host pool of your project contains at least one Controller, one load generator, and one data processor. If it does not, contact your administrator to add them.

    • For optional pre-design best practices, see Performance Test Design Best Practices.
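
    The first prerequisite can also be checked programmatically. The following is a minimal sketch, not an official procedure: it assumes the Performance Center REST API is reachable under /LoadTest/rest, and the server name, domain, project, credentials, script name, and JSON response shape are all placeholders to verify against the Performance Center REST API Reference for your version.

      # Sketch: confirm that a script needed by the test is already uploaded.
      # All names, paths, and the response format below are illustrative assumptions.
      import requests

      BASE = "http://pc-server/LoadTest/rest"      # hypothetical PC server URL
      DOMAIN, PROJECT = "DEFAULT", "MyProject"     # hypothetical domain/project

      session = requests.Session()

      # Authenticate with basic credentials; the returned session cookies are
      # reused automatically by the Session object on later calls.
      resp = session.get(BASE + "/authentication-point/authenticate",
                         auth=("pc_user", "pc_password"))
      resp.raise_for_status()

      # List the scripts stored in the project and check that the one the
      # test needs has already been uploaded.
      resp = session.get(
          "{0}/domains/{1}/projects/{2}/Scripts".format(BASE, DOMAIN, PROJECT),
          headers={"Accept": "application/json"})
      resp.raise_for_status()
      names = [s.get("Name") for s in resp.json().get("Scripts", [])]
      print("my_vugen_script uploaded:", "my_vugen_script" in names)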

  2. Create a new performance test

    1. On the My Performance Center navigation bar, select Test Management > Test Plan.

    2. In the Test Plan Tree, select the Subject root folder and click New Folder. Type a folder name and click OK.

    3. Select the folder from the tree.

    4. Click New Test. Fill in the fields in the Create New Performance Test dialog box. For details, see Test Plan Module.

    Tip: To simplify the process of creating, designing, and running performance tests, you can use Test Express Designer to guide you through each step. For details, see Test Express Designer.

  3. Design a workload for the test

    Designing a workload involves creating Vuser groups, distributing Vusers among the Vuser groups, assigning hosts to the Vuser groups, and defining a run schedule for the test. For task details, see How to Define a Performance Test Workload.

    Note: Non-English national characters are not supported in group names.
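
    To make these moving parts concrete, the sketch below models a hypothetical workload in plain Python: two Vuser groups, a Vuser split between them, a load generator assigned to each, and a simple ramp-up schedule. The group names, scripts, hosts, and ramp-up values are invented for illustration and are not Performance Center settings; define the real workload in the Performance Test Designer as described in the linked topic.

      # Illustrative model of the values a workload captures; all names are
      # hypothetical and do not come from Performance Center.
      from dataclasses import dataclass

      @dataclass
      class VuserGroup:
          name: str            # group name (no non-English national characters)
          script: str          # uploaded script the group runs
          vusers: int          # Vusers assigned to this group
          load_generator: str  # load generator host assigned to the group

      groups = [
          VuserGroup("browse_users", "browse_catalog", 80, "lg_host_1"),
          VuserGroup("buy_users", "checkout_flow", 20, "lg_host_2"),
      ]

      # A simple ramp-up schedule: start `step` Vusers every `interval_s`
      # seconds until the group's full Vuser count is running.
      def ramp_up_plan(group, step=10, interval_s=30):
          started, t = 0, 0
          while started < group.vusers:
              batch = min(step, group.vusers - started)
              started += batch
              yield (t, batch)   # (offset in seconds, Vusers started)
              t += interval_s

      for g in groups:
          print(g.name, list(ramp_up_plan(g)))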

  4. Integrate virtualized services - optional

    Configure and integrate virtualized services into the performance test. For task details, see How to Add Virtualized Services to Performance Tests.

    You can add projects that contain virtualized services to your performance test from the Performance Test Designer at any point in the design process. However, we recommend adding them after you have added the relevant scripts to the test.

  5. Select a topology for the test - optional

    Note: Before you can select a topology for a test, you must design the topology. To design topologies, see How to Design Topologies.

    In the Performance Test Designer's Topology tab, click Select Topology and select a topology for the test. For user interface details, see Performance Test Designer > Topology.

  6. Select or create monitor profiles to monitor the test - optional

    In the Performance Test Designer's Monitors tab, click Add Monitor Profile or Add Monitor OFW. The respective tree opens on the right.

    • To select existing profiles or monitor-over-firewall agents, select the monitors to add to the test, and click Add selected monitors.

      Note: Before you can select monitors for a test, you must configure monitor machines and create monitor profiles. For details, see Create and configure monitor profiles.

      Similarly, you must define monitor-over-firewall agents in the system before you can select them to monitor a test.

    • To create a new monitor profile or monitor-over-firewall agent, click the Create Monitor or Create Monitor OFW button.

      For a monitor profile: In the Create Monitor Profile dialog box, enter a name and description, and select a folder for storing the profile. Click OK, and create a new monitor profile as described in Create and configure monitor profiles.

      For a monitor-over-firewall agent: In the Create Monitor Over Firewall dialog box, enter a name and the machine key, select a folder in which to store the profile, and select the MI Listener with which the monitor connects. Click OK.

      After you have created new monitor profiles or monitor-over-firewall agents, select the monitors to add to the test, and click Add selected monitors/monitors over firewall.

    For user interface details, see Performance Test Designer > Monitors.

  7. Enable and configure J2EE/.NET diagnostics - optional

    Enable and configure diagnostic modules to collect J2EE/.NET diagnostics data from the test run. For details, see Working with Diagnostics.

  8. Define service level agreements for the test - optional

    Define service level agreements to measure performance metrics against performance goals. For details, see How to Define Service Level Agreements.
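
    As a rough illustration of what an SLA evaluates, the sketch below checks a set of measured transaction response times against a made-up goal: 90% of transactions must complete within 3 seconds. The threshold, percentage, and sample values are invented for the example; actual SLAs are defined as described in How to Define Service Level Agreements.

      # Illustrative only: an SLA-style check with invented numbers.
      def meets_sla(response_times_s, threshold_s=3.0, required_ratio=0.90):
          """Return True if enough transactions finished within the threshold."""
          within = sum(1 for t in response_times_s if t <= threshold_s)
          return within / len(response_times_s) >= required_ratio

      # 7 of the 10 sample times are within 3 seconds (70% < 90%), so the goal is not met.
      checkout_times = [1.2, 2.8, 3.4, 0.9, 2.1, 5.0, 1.7, 2.5, 2.9, 3.1]
      print("SLA met:", meets_sla(checkout_times))  # -> False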

  9. Results

    When you save the test, it goes through a validation process; the test is valid only if it contains no errors. The result of the validation is displayed at the bottom of the Test Designer window.

    Click the link to open the Test Validation Results dialog box and view the details of the validation results. For user interface details, see Test Validation Results Dialog Box.

    Note: If you make changes to the test, and the test is linked to a timeslot, the timeslot is updated with these changes automatically.

  10. Add the test to a test set

    1. On the My Performance Center navigation bar, select Test Management > Test Lab.

    2. Click the New Folder button and specify a name for the test set folder. Click OK.

    3. Select the folder you created above, and click the New Test Set button. The Create New Performance Test Set dialog box opens.

    4. Enter the test set name and click OK. The test set is added to the tree.

    5. Select the test set to which you want to assign the performance test.

    6. Click the Assign Test to Test Set button. The Assign Test to Test Set dialog box opens.

    7. Select your test and click OK.

