Run tests on OpenText Enterprise Performance Engineering

You can run DevWeb performance tests from your IDE on OpenText Enterprise Performance Engineering. This enables you to scale up the number of Vusers.

Preparing for testing

You must have a valid license to execute a load test on OpenText Enterprise Performance Engineering, and to obtain information on available hosts.

Refer to the following information sources to help you prepare for testing:

  • To get information on usage, available flags, and examples, run the following command from your IDE:

    ScalUP ent -help


Run performance tests from your IDE

In your IDE, you define and run the task to execute the performance test.

For an example of running the task from Visual Studio Code, see Scale your test from VS Code.

Note: The ScalUP tool uses the system environment variables http_proxy, https_proxy, and no_proxy, if defined.

To run tests with any IDE:

  1. Create your DevWeb script, and test it in OpenText Performance Engineering for Developers.
  2. Configure the required settings in the ent_uploader.yml file. For details, see Configure the YAML file.
  3. Open the script in your IDE.
  4. Set up a task for the ScalUP tool to run the test.

    When setting up the task in your IDE, configure the arguments and flags for the command line using the following format.

    Windows: %DEVWEB_PATH%\ScalUP.exe ent <flags> run <DevWeb script folder path>
    Linux or macOS: $DEVWEB_PATH/ScalUP ent <flags> run <DevWeb script folder path>

    The following flags are available:

    -host= The OpenText Enterprise Performance Engineering host name with http/https protocol.
    -tenantId= The tenant ID of the user (if required for the environment).
    -domain= The domain name of the user.
    -project= The project name.
    -authenticationMode= The authentication type used to connect to the machine.
    -userName= (For basic authentication) The user name to use to log into the machine.
    -password= (For basic authentication) The password to use to log into the machine.
    -clientId= (For SSO authentication) The client ID required to log into the machine.
    -clientSecret= (For SSO authentication) The client secret required to log into the machine.
    -keyLocation= The full path to a key file used to decrypt encrypted strings in the configuration file.
    -testPlan= The script is uploaded to a test plan folder with this name.
    -testSet= The performance test is created under a test set with this name.
    -testSetFolder= The test set is created under a test set folder with this name.
    -report= The relative or root path to the location for the results report. If a path is not specified, then the report is not generated.

    All flags apart from -report can be configured in the ent_uploader.yml file. If you add flags in the command line, the command-line definitions override the YAML file definitions.

    For example, you might add to the command line:

    -report=<report location> -userName=KimW

    Note: We recommend that you do not add sensitive information like passwords in plain text. For details, see Mask and encrypt data.
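
    For instance, a complete Windows command line that combines the format and flags might look like the following. The report path, user name, and script folder are illustrative values only:

    %DEVWEB_PATH%\ScalUP.exe ent -report=C:\DevWebReports -userName=KimW run C:\DevWebScripts\MyScript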

  5. Run the task in the IDE.

    The DevWeb script folder is uploaded to the test plan folder configured in the ent_uploader.yml file (parameter testPlan). If a test plan name is not configured, the script is uploaded by default to AutoTestPlan.

    A performance test is created for the script in the same test plan folder, using the label <Script Name>_PerformanceTest.

  6. The performance test is executed, using the configuration settings in the relevant uploader configuration file.

    When the test is run, the test instance is assigned to the test set and test set folder that are configured in the ent_uploader.yml file. If these parameters are not defined, then AutoTestSet and AutoTestSetFolder are used by default.

    The test run is given an ID number, for example, Test Id 184.

    The ID can be used together with the test name to identify the run in OpenText Enterprise Performance Engineering.

    You can watch the run progress from the OpenText Enterprise Performance Engineering UI.

  7. When the test is complete, OpenText Performance Engineering for Developers displays the summary report. If email notification is configured, the results are emailed.

    You do not need to wait for the task to complete. You can close it locally in your IDE, or the tool exits after reaching the timeout. In both these cases, the test continues to run on OpenText Enterprise Performance Engineering. You can later return to the test in your IDE to see if the run is complete and to get the results. For details, see Get test run results.


Get test run results

You can run a task in your IDE to check the results of the performance test on OpenText Enterprise Performance Engineering. This option enables you to get the test results after OpenText Performance Engineering for Developers has exited (before test execution was complete and results were available).

Tip:  

  • If you want the ScalUP tool to continue to run until the test is finished, configure the value -1 in waitForRunTimeout in the YAML file. For details, see Configure the YAML file.

  • In Visual Studio Code, you can use the Get test run results from OpenText Enterprise Performance Engineering task to get the results.

To get test run results:

  1. Configure the arguments for the command line to get the test run results. The following options are available (on Windows OS):

    • Results for the last run for the specified test:

      %DEVWEB_PATH%\ScalUP.exe ent <flags> getResults <script folder path>

    • Results for the specified run ID. The applied login information is taken from the ent_uploader.yml file located in the same folder as ScalUP.exe:

      %DEVWEB_PATH%\ScalUP.exe ent <flags> getResults <test run ID in OpenText Enterprise Performance Engineering>

    • Results for the specified run ID, with a script folder path. The ent_uploader.yml file located in the specified script directory overrides the one located with ScalUP.exe:

      %DEVWEB_PATH%\ScalUP.exe ent <flags> getResults <test run ID in OpenText Enterprise Performance Engineering> <script folder path>

    Note: On Linux or macOS, use $DEVWEB_PATH/ScalUP instead of %DEVWEB_PATH%\ScalUP.exe.
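
    For example, to fetch the results for run ID 184 using the ent_uploader.yml file in a local script folder, assuming the connection settings are already defined in that file (the paths are illustrative):

    %DEVWEB_PATH%\ScalUP.exe ent getResults 184 C:\DevWebScripts\MyScript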

  2. Run the task. The progress pane indicates if the load test is still running, or shows the summary report if it has finished.
  3. If the test is still running, you can return later and run the getResults task again.


Configure the YAML file

The settings in the primary ent_uploader.yml configuration file (located in the same directory as the ScalUP executable) apply for all scripts that you execute on OpenText Enterprise Performance Engineering.

For local configuration, you can copy the ent_uploader.yml file to a specific script directory, and customize the file settings for use with that specific script. During the script run, the settings in the local ent_uploader.yml file override the settings in the original (primary) file.

The ent_uploader.yml file contains customizable default values for the script run, with explanations for each parameter. These settings include:

environment

These are required values for communication with the OpenText Enterprise Performance Engineering machine. These settings can be defined in the ent_uploader.yml file, or in the command line flags. (For details, see Run performance tests from your IDE.)

  • The authenticationMode key enables you to define the authentication type used to connect to the machine: basic, or accessKey for SSO authentication.

  • If basic is defined, enter the user name and password in the basic section.

    The Password key can be deleted from the file, and instead entered during runtime.

  • If accessKey is defined, enter the client ID and client secret in the accessKey section.

Note: We recommend that you do not add sensitive information like passwords and client secrets in plain text. For details, see Mask and encrypt data.
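
As an illustration only, an environment section might look something like the following sketch. The key names are inferred from the command line flags and the section names described above; the commented template in your ent_uploader.yml file is the authoritative reference, and all values are placeholders:

  environment:
    host: https://myserver.mycompany.com   # with http/https protocol
    tenantId: ""                           # only if required for the environment
    domain: DEFAULT
    project: MyProject
    authenticationMode: basic              # basic, or accessKey for SSO
    basic:
      userName: KimW
      password: ""                         # can be omitted and entered at runtime
    # accessKey:
    #   clientId: ""
    #   clientSecret: ""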

workspace

Define the folder structure to use for the performance test. Do this by defining names for the following parameters: testPlan, testSetFolder, and testSet.

If the names are not defined, default names prefixed with Auto are used at runtime, for example, AutoTestPlan.
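
For example, a workspace section based on these parameter names might look like the following sketch (the names are illustrative, and the exact layout follows the template in your ent_uploader.yml file):

  workspace:
    testPlan: MyTestPlan
    testSetFolder: MyTestSetFolder
    testSet: MyTestSet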

hosts/controller

Define the load generators and Controller machine to use for the test. For details, see Define hosts and Controller for tests.

trend report

Use to add the test run data to a trend report in OpenText Enterprise Performance Engineering. The name you define indicates whether to add to an existing or new trend report:

  • Define a unique name to create a new trend report.

  • Define the name of an existing trend report, so that the test run data is added to that report.

If required, you can define start and end points within the test run, so that the trend data is collected from between these points.

To ensure the test run data is added to the trend report, you must do one of the following:

  • Keep the task open in ScalUP until the test completes (set waitForRunTimeout to -1).

  • Run the getResults task after the test completes. See Get test run results.

encryption

For encrypted data, define the key location. For details, see Mask and encrypt data.

Scenario and general settings

These include parameters for the number of Vusers to run, ramp up time, duration, and logging.
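
As a purely hypothetical illustration (the parameter names below are invented for this example; the actual names are documented in the comments of the scenario section in your ent_uploader.yml file), the settings might look something like:

  scenario:
    vusers: 10       # hypothetical key: number of Vusers to run
    rampUp: 60       # hypothetical key: ramp up time, in seconds
    duration: 300    # hypothetical key: test duration, in seconds
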
waitForRunTimeout

Define the number of seconds for OpenText Performance Engineering for Developers to wait while checking the status of the test run. After the time elapses, the program exits, but the test continues to run on OpenText Enterprise Performance Engineering.

The default is 10 seconds. You can use -1 to wait continuously until the run is complete.

You can later check status again. For details, see Get test run results.
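
For example, to keep the tool waiting until the run completes (the placement of the key follows the template in your ent_uploader.yml file):

  waitForRunTimeout: -1   # wait until the run is complete; the default is 10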

Script exclude

This is a list of regular expressions defining files to exclude from the script when it is uploaded to OpenText Enterprise Performance Engineering.
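
As an illustration only (the actual key name and layout are shown in your ent_uploader.yml file; the key name and patterns below are hypothetical), such a list might look like:

  scriptExclude:         # hypothetical key name
    - ".*\\.har$"        # for example, exclude recording files
    - "results.*"        # for example, exclude local results folders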


Define hosts and Controller for tests

You can configure the ent_uploader.yml file to instruct OpenText Enterprise Performance Engineering on which load generator hosts (on-premises or cloud-based) to use for your performance tests. You can also define a specific Controller machine to use for the test.

If you choose not to define hosts and/or a Controller machine in ent_uploader.yml, the performance test is run using the following rules in OpenText Enterprise Performance Engineering:

  • For a new test, the Controller machine and one load generator are automatically selected to run the test.

  • For a test that already exists in OpenText Enterprise Performance Engineering, nothing is changed in the existing load generator configuration for the test.

To define load generators and Controller for tests:

  1. To obtain a table listing the load generator hosts and host attributes available to use when defining the test run, execute the relevant command from the DEVWEB_PATH:

    List on-prem hosts: ScalUP.exe ent info hosts
    List cloud-based host templates: ScalUP.exe ent info cloudTemplates
    List host attributes: ScalUP.exe ent info attributes

    Note: If one of the listed hosts is not available at the time of the test run (because it has been scheduled for another test), then the test will not run on that load generator.

  2. Open the ent_uploader.yml file (primary file, or local version for the script).
  3. In the hosts section, uncomment and update each type area as relevant, to define the load generators to run the test. You can define more than one type, in any combination:

    type: specific

    Add specific on-premises hosts.

    Define the name of the host in the options: name property, by copying it from the on-prem hosts table. Add additional hosts by repeating the type: specific parameters.

    For an example, see the sketch after this list.

    type: automatch

    Instruct OpenText Enterprise Performance Engineering to automatically select on-premises hosts for the run.

    • options: amount. Define the number of hosts to select. For example, amount:10.

      You can define both an automatch amount, and specifically named hosts or cloud hosts, up to the limit of available hosts.

    • options: attributes. You have the option to include items from the attributes list, so that the automatch selects hosts that are assigned those attributes. For example, Host memory:High.

    type: cloud

    Add host templates, used by OpenText Enterprise Performance Engineering to provision your cloud-based load generators.

    Define the name of the template in the options: name property, by copying valid templates from the cloud templates table. Add additional templates by repeating the type: cloud parameters.

    Note: Use of cloud-based hosts is supported for OpenText Performance Engineering for Developers and OpenText Enterprise Performance Engineering.
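
    Putting the types together, an uncommented hosts section might look something like the following sketch. The exact YAML layout follows the commented template in your ent_uploader.yml file; the host, attribute, and template names are illustrative:

    hosts:
      - type: specific
        options:
          name: loadgen-01            # name copied from the on-prem hosts table
      - type: automatch
        options:
          amount: 2                   # number of hosts to select automatically
          attributes:
            - "Host memory:High"      # item from the attributes list
      - type: cloud
        options:
          name: MyCloudTemplate       # name copied from the cloud templates table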

  4. If you want to define a specific Controller to run the test, uncomment the controller section and specify the machine name.

  5. Save the file.

    When you run the performance test, any hosts/Controller defined for the test within OpenText Enterprise Performance Engineering are overwritten by your definitions in ent_uploader.yml.


Scale tests FAQ

Review the frequently asked questions before running your performance test.

Is a performance test run from the integration the same as a test run from the OpenText Enterprise Performance Engineering UI?

Yes—when a performance test is executed for the integration using the ScalUP tool, it is run like any other OpenText Enterprise Performance Engineering test.

How is my performance test created in OpenText Enterprise Performance Engineering the first time I run the test?

When you execute your test, the script is uploaded to the folder defined in ent_uploader.yml; or, if the folder is not defined, to the AutoTestPlan folder. A performance test is also created, using the label <Script Name>_PerformanceTest.

What happens if I make changes to the test in OpenText Enterprise Performance Engineering?

You can make configuration changes to the test in the OpenText Enterprise Performance Engineering UI. When you execute the test again from OpenText Performance Engineering for Developers, some of your configuration items might be overwritten.

How can I change the script schedule settings?

You can change schedule settings, for example the number of Vusers or duration, in the scenario section of the ent_uploader.yml configuration file. When you execute the test, these settings overwrite the settings in the OpenText Enterprise Performance Engineering UI.

How can I add additional groups to my performance test?

Whenever you execute the test from OpenText Performance Engineering for Developers, only the uploaded group is used for the test. You cannot run multiple groups.

How can I configure a specific load generator for my performance test?

You can configure load generators in the hosts section of the ent_uploader.yml file. For details, see Define hosts and Controller for tests.


See also: