Jenkins plug-in
This section describes the Jenkins plug-in and how it enables you to make continuous performance testing part of your build process. The plug-in enables you to run performance tests and to view OpenText Enterprise Performance Engineering Trend reports.
Jenkins continuous integration overview
As more software companies adopt continuous integration practices, you may also need to integrate performance tests into your testing process. This integration helps developers ensure that new builds do not introduce regressions.
The Application Automation Tools plug-in for the Jenkins continuous integration server provides a mechanism for running performance tests as part of a build script. The plug-in allows you to trigger a performance test as a build step and present the results in the Jenkins user interface.
You can integrate performance tests that have service level agreements (SLAs). This allows you to quickly determine whether the test passed or failed, and whether performance was affected.
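If you manage Jenkins jobs as code, the same flow can be sketched as a declarative Pipeline. The skeleton below is illustrative only: the stage names, shell command, and echo placeholders are assumptions, and the performance-test step itself is the plug-in build step you configure in the tasks that follow.

```groovy
// Illustrative CI skeleton only; the actual performance-test step comes from
// the plug-in and is added via the job UI or the Pipeline Snippet Generator.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Build and deploy the application under test (placeholder).
                sh './build.sh'
            }
        }
        stage('Performance test') {
            steps {
                // Placeholder for the plug-in's performance-test build step.
                echo 'Run performance test here'
            }
        }
    }
    post {
        failure {
            // With "Set step status according to SLA" enabled, an SLA breach
            // fails the step, so ordinary failure handling catches regressions.
            echo 'Build failure or SLA breach detected'
        }
    }
}
```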
Set up a Jenkins server and install the plug-in
This task describes how to install a Jenkins server and the CI plug-in.
1. Install the Jenkins server.

   For the supported server versions, see the Integrations section of the System Requirements guide.
2. Install the OpenText Application Automation Tools plug-in.

   For details on downloading and installing the plug-in, see the Application Automation Tools wiki page.
Note: The Jenkins plug-in requires an administrator account.
Set up a Jenkins job to run tests
This task describes how to set up a Jenkins job to run performance tests.
1. Go to the Jenkins Server home page.
2. Click the New Job link or select an existing job.
3. Enter a Job name (for a new job).
4. Select Build a free-style software project and click OK.
5. In the Project Configuration section, scroll down to the Build section.
6. Perform the following according to the plug-in you are using:

   Plug-in | Description
   ---|---
   Application Automation Tools | In the Add build step list, select Execute tests using OpenText Enterprise Performance Engineering.
   OpenText Enterprise Performance Engineering integration with Git | In the Add build step list, select Run Performance Test Using OpenText Enterprise Performance Engineering.
7. (Optional) Enter a description of the build step.
8. Enter the hostname or IP address of an OpenText Enterprise Performance Engineering server.

   Example: If the server URL is http://MY_SERVER/loadtest, enter MY_SERVER.
9. If you are running the Jenkins job over a secured OpenText Enterprise Performance Engineering server, select the Use HTTPS protocol option.
10. Enter the server credentials, project, and domain.

   Depending on the authentication type required by your server, credentials can be a user name and password, or an API key for SSO or LDAP authentication.

   To use SSO or LDAP authentication, select Use Token For Authentication and enter the Client ID and API key secret obtained from your site administrator. For details, see Set up API access.
11. Perform the following according to the plug-in you are using.

   Application Automation Tools: Enter the Test ID. You can get the ID from the OpenText Enterprise Performance Engineering > Test Management > Test Lab > Performance Test Set view. If the column is not visible, you can select it by clicking the Select Columns button.

   OpenText Enterprise Performance Engineering integration with Git: In the Run Test section, you can choose one of the following:

   - Run an existing test. Provide a Test ID. You can get the ID from OpenText Enterprise Performance Engineering > Test Management: select your test and find the ID in the General Details (ID <number>).
   - Create a new test. In the Test To Create field, provide text in YAML syntax representing a performance test, or a path to a YAML file relative to the workspace. The parameters used in each case are different. See Create a test from YAML input.
12. Select an option for adding the Test Instance ID:
   - Automatically select existing or create new if none exists: The existing test instance is located, or a new test instance is created.
   - Manual selection: Enter the Test Instance ID from the Assign and Select Test Set dialog box. For details, see To edit a test set.
13. Indicate whether to use a local proxy.
14. Choose a Post Run Action: Collate Results, Collate and Analyze, or Do Not Collate.
15. Select a trend report option.

   Option | Description
   ---|---
   Do Not Trend | No trend report is created.
   Use trend report associated with the test | If Auto Trending is selected in the Load Test, select this option to automatically publish trend results.
   Add run to trend report with ID | If you select this option, enter the trend report ID.
16. Enter a duration for the Ad-Hoc timeslot. The minimum time is 30 minutes.
17. Choose whether to use VUFDs licenses.
18. Choose whether the SLA status determines the build step status. If you do not enable the Set step status according to SLA option, the build step is labeled as Passed, provided that no failures occurred.
Run the job
Run or schedule the job as you would with any standard Jenkins job.
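For example, to run performance tests on a fixed schedule, you can use Jenkins' standard cron triggers (Build periodically in a freestyle job, or a triggers block in a Pipeline job). A minimal sketch, assuming a nightly weekday run; the schedule and the echo placeholder are examples, not plug-in syntax:

```groovy
pipeline {
    agent any
    triggers {
        // Run on weeknights between 02:00 and 02:59; 'H' lets Jenkins spread
        // the exact start minute to balance load across jobs.
        cron('H 2 * * 1-5')
    }
    stages {
        stage('Performance test') {
            steps {
                echo 'Trigger the performance test build step here' // placeholder
            }
        }
    }
}
```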
Configure trending report charts on Jenkins
This task describes how to configure trending report charts on Jenkins.
1. Install the Plot plug-in from the Jenkins plug-ins site.
2. Open your job configuration, and add a new post-build action: Plot build data.
3. Click Add Plot.
4. Configure the following settings, and leave the other settings blank or unselected. For setting details, see the Data Series files tables.

   Name | Description
   ---|---
   Plot group | Enter a meaningful name for the group. For example, Performance Trending.
   Plot title | Enter a name that is related to the measurement. For example, Average Transaction Response Time.
   Number of builds to include | Enter the number of builds to include. We recommend no more than 10 builds.
   Plot y-axis label | Enter the appropriate measurement unit. For example, seconds.
   Plot style | Select the plot style. Line or Line 3D are the most suitable for trending.
   Data series file | Enter one of the measurement types listed in the following table. Tip: We recommend using the bolded measurements for trending, because they compare like-for-like measurements to provide an indication of the application's performance and workload.
   Load data from csv file | Select this check box.
   Include all columns | Make sure this is selected.
   Display original csv above plot | Select this check box.
5. Repeat steps 1-4 to add more plots.
6. Save the configuration.
Note: If you get a "file does not exist" error ("<file_name.csv> doesn't match anything"), you can ignore it; the file is created during the job run.
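If your job is a Pipeline rather than a freestyle job, the Plot plug-in also exposes a plot step that accepts the same settings. The sketch below is an approximation under that assumption; verify the exact parameter names with your Jenkins instance's Pipeline Snippet Generator. The data series file references one of the measurement CSVs listed in the tables below.

```groovy
// Hedged sketch: plot average transaction response time across builds.
// Confirm parameter names with the Pipeline Snippet Generator before use.
plot csvFileName: 'plot-avg-trt.csv',            // plug-in's own storage file
     csvSeries: [[file: 'pct_average_trt.csv']], // measurement CSV from the run
     group: 'Performance Trending',              // plot group shown in Jenkins
     title: 'Average Transaction Response Time',
     numBuilds: '10',                            // we recommend no more than 10
     style: 'line',
     yaxis: 'seconds'
```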
Data Series files
TRT (Transaction Response Time)
Data Series File | Comment | Unit |
---|---|---|
pct_minimum_trt.csv | Minimum transaction response time | Seconds |
pct_maximum_trt.csv | Maximum transaction response time | Seconds |
pct_average_trt.csv | Average transaction response time | Seconds |
pct_median_trt.csv | Median transaction response time | Seconds |
pct_percentile_90_trt.csv | 90th percentile of transaction response time | Seconds |
pct_stddeviation_trt.csv | Standard deviation of transaction response time | Seconds |
pct_count1_trt.csv | Number of occurrences of the transaction | Count |
TPS (Transactions per second)
Data Series File | Comment | Unit |
---|---|---|
pct_minimum_tps.csv | Minimum transactions per second | Count |
pct_maximum_tps.csv | Maximum transactions per second | Count |
pct_average_tps.csv | Average transactions per second | Count |
pct_median_tps.csv | Median transactions per second | Count |
pct_sum1_tps.csv | Total number of transactions per second for a specific transaction | Count |
TRS (Transaction Summary)
Data Series File | Comment | Unit |
---|---|---|
pct_count1_trs.csv | Total number of occurrences for a specific transaction | Count |
UDP (User-defined data point)
Data Series File | Comment | Unit |
---|---|---|
pct_minimum_udp.csv | Minimum value for a user-defined data point | Unit |
pct_maximum_udp.csv | Maximum value for a user-defined data point | Unit |
pct_average_udp.csv | Average value for a user-defined data point | Unit |
pct_median_udp.csv | Median value for a user-defined data point | Unit |
pct_stddeviation_udp.csv | Standard deviation of the values for a user-defined data point | Unit |
pct_count1_udp.csv | Number of occurrences for a specific user-defined data point | Count |
pct_sum1_udp.csv | Sum of the values reported for a specific user-defined data point | Unit |
VU (Running Vusers)
Data Series File | Comment | Unit |
---|---|---|
pct_maximum_vu.csv | Maximum number of running Vusers in the scenario | Count |
pct_average_vu.csv | Average number of running Vusers in the scenario | Count |
WEB
Data Series File | Comment | Unit |
---|---|---|
pct_minimum_web.csv | Minimum value of web statistics (such as number of connections, throughput, and hits per second) | Unit |
pct_maximum_web.csv | Maximum value of web statistics | Unit |
pct_average_web.csv | Average value of web statistics | Unit |
pct_median_web.csv | Median value of web statistics | Unit |
pct_sum1_web.csv | Total value of web statistics | Unit |
Configure the integration with OpenText Software Delivery Management
You can bring performance test run results into OpenText Software Delivery Management using OpenText Software Delivery Management pipelines, and include them in the overall analysis of your product.
For details on configuring the integration, see Automated testing flow in the OpenText Software Delivery Management Help Center.