Jenkins plug-in
This section describes the Jenkins plug-in and how it enables you to use continuous performance testing. The plug-in lets you run performance tests using LoadRunner Enterprise and view LoadRunner Enterprise trend reports.
Jenkins continuous integration overview
As more software companies adopt continuous integration practices, you may also need to integrate performance tests into your testing process. This integration helps developers ensure that new builds do not introduce performance regressions.
The Application Automation Tools plug-in for the Jenkins continuous integration server provides a mechanism for running performance tests as part of a build script. The plug-in allows you to trigger a LoadRunner Enterprise test as a build step and present the results in the Jenkins user interface.
You can integrate performance tests that have service level agreements (SLAs). This allows you to quickly determine whether the test passed or failed and whether performance was affected.
Set up a Jenkins server and install the plug-in
This task describes how to install a Jenkins server and the CI plug-in.
- Install the Jenkins server. For the supported server versions, see the Integrations section of the System Requirements guide.
- Install the OpenText Application Automation Tools plug-in. For details on downloading and installing the plug-in, see the Application Automation Tools wiki page.
Note: The Jenkins plug-in requires an administrator account.
Set up a Jenkins job to run tests in LoadRunner Enterprise
This task describes how to set up a Jenkins job to run performance tests.
- Go to the Jenkins Server home page.
- Click the New Job link or select an existing job.
- Enter a Job name (for a new job).
- Select Build a free-style software project and click OK.
- In the Project Configuration section, scroll down to the Build section.
- Perform the following, according to the plug-in you are using:
  - Application Automation Tools: Expand the Add build step drop-down, and select Execute tests using LoadRunner Enterprise.
  - LoadRunner Enterprise integration with Git: Expand the Add build step drop-down, and select Run Performance Test Using LoadRunner Enterprise.
- (Optional) Enter a description of the build step.
- Enter the hostname or IP address of a LoadRunner Enterprise server. Example: If the LoadRunner Enterprise server URL is http://MY_SERVER/loadtest, enter MY_SERVER.
- If you are running the Jenkins job over a secured LoadRunner Enterprise server, select the Use HTTPS protocol option.
- Enter the server credentials, project, and domain. Depending on the authentication type required by your LoadRunner Enterprise server, credentials can be a user name and password, or an API key for SSO or LDAP authentication. To use SSO or LDAP authentication, select Use Token For Authentication and enter the Client ID and API key secret obtained from your LoadRunner Enterprise site administrator. For details, see Set up API access.
- Perform the following, according to the plug-in you are using:
  - Application Automation Tools: Enter the Test ID. You can get the ID from the LoadRunner Enterprise > Test Management > Test Lab > Performance Test Set view. If the column is not visible, you can select it by clicking the Select Columns button.
  - LoadRunner Enterprise integration with Git: In the Run Test section, select one of the following:
    - Run an existing test. Provide a Test ID. You can get the ID from LoadRunner Enterprise > Test Management: select your test and find the ID in the General Details (ID <number>).
    - Create a new test. In the Test To Create field, provide either text in YAML syntax representing a LoadRunner Enterprise test, or a path to a YAML file relative to the workspace (see the YAML sketch after this procedure). The parameters used in each case differ. For details, see Create a LoadRunner Enterprise test from YAML input.
- Select an option for adding the Test Instance ID:
  - Automatically select existing or create new if none exists: The existing test instance is located, or a new test instance is created.
  - Manual selection: Enter the Test Instance ID from the Assign and Select Test Set dialog box. For details, see To edit a test set.
- Choose whether to use a Local Proxy.
- Choose a Post Run Action (Collate Results, Collate and Analyze, or Do Not Collate).
- Select a trend report option:
  - Do Not Trend: No trend report is created.
  - Use trend report associated with the test: If Auto Trending is selected in the load test, select this option to automatically publish trend results.
  - Add run to trend report with ID: If you select this option, enter the trend report ID.
- Enter a duration for the Ad-Hoc timeslot. The minimum time is 30 minutes.
- Choose whether to use VUFD licenses.
- Choose whether to consider the SLA status when determining the build step status. If you do not enable the Set step status according to SLA option, the build step is labeled Passed provided that no failures occurred. (For Pipeline jobs, the settings in this procedure are sketched in the examples below.)
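The steps above configure a freestyle project through the Jenkins UI. If you use Pipeline jobs instead, the same settings can typically be expressed as a build step in a Jenkinsfile. The sketch below is illustrative only: the step name (pcBuild) and its parameter names are our assumption based on the Application Automation Tools plug-in's pipeline support, and the server, domain, project, test ID, and credentials ID values are placeholders. Generate the exact syntax for your plug-in version with Jenkins' Pipeline Syntax snippet generator.

```groovy
// Illustrative sketch only: the step and parameter names are assumptions
// based on the Application Automation Tools plug-in's pipeline support.
// Verify them with your Jenkins instance's Pipeline Syntax snippet generator.
node {
    pcBuild(
        serverAndPort: 'MY_SERVER',            // LoadRunner Enterprise server (hostname or IP)
        pcServerName: 'MY_SERVER',
        HTTPSProtocol: false,                  // true for a secured server
        credentialsId: 'lre-credentials',      // Jenkins credentials: user/password or API key
        almDomain: 'DEFAULT',                  // placeholder domain
        almProject: 'MyProject',               // placeholder project
        testId: '123',                         // placeholder Test ID from Test Lab
        autoTestInstanceID: 'AUTO',            // auto-select or create a test instance
        timeslotDurationHours: '0',
        timeslotDurationMinutes: '30',         // Ad-Hoc timeslot; 30 minutes is the minimum
        postRunAction: 'Collate and Analyze',  // or Collate Results / Do Not Collate
        vudsMode: false,                       // set true to use VUFD licenses
        statusBySLA: true                      // fail the step when the SLA is breached
    )
}
```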
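For the Git-integration plug-in's Create a new test option, an earlier build step can write the YAML definition into the workspace so the Test To Create field can reference it by a workspace-relative path. A minimal sketch, assuming a schema along the lines of Create a LoadRunner Enterprise test from YAML input; the field names and values here are illustrative placeholders, not the confirmed schema:

```groovy
// Write a hypothetical LoadRunner Enterprise test definition into the
// workspace. The YAML field names below are placeholders -- consult
// "Create a LoadRunner Enterprise test from YAML input" for the real schema.
node {
    writeFile file: 'lre_test.yaml', text: '''controller: my_controller
lg_amount: 2
group:
- group_name: checkout_vusers
  vusers: 50
  script_path: MyScripts/checkout
'''
    // Point the "Test To Create" field at lre_test.yaml (workspace-relative).
}
```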
Run the job
Run or schedule the job as you would with any standard Jenkins job.
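For example, a Pipeline job can run the test on a schedule using Jenkins' standard cron trigger syntax (the stage body below is a placeholder):

```groovy
// Standard Jenkins cron trigger: runs the performance test job on a schedule.
pipeline {
    agent any
    triggers {
        cron('H 2 * * 1-5')   // around 02:00, Monday through Friday
    }
    stages {
        stage('Performance test') {
            steps {
                // Place the LoadRunner Enterprise build step here
                // (see the sketch in the previous section).
                echo 'Running LoadRunner Enterprise test...'
            }
        }
    }
}
```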
Configure trending report charts on Jenkins
This task describes how to configure trending report charts on Jenkins.
- Install the Plot plug-in from the Jenkins plug-ins site.
- Open your job configuration, and add a new post-build action: Plot build data.
- Click Add Plot.
- Configure the following settings, and leave the other settings blank or unselected (see the Data Series files table below for setting details):

Name | Description |
---|---|
Plot group | Enter a meaningful name for the group. For example, Performance Trending. |
Plot title | Enter a name that is related to the measurement. For example, Average Transaction Response Time. |
Number of builds to include | Enter the number of builds to include. We recommend no more than 10 builds. |
Plot y-axis label | Enter the appropriate measurement unit. For example, Seconds. See the recommended unit for each measurement in the table below. |
Plot style | Select the plot style. Line or Line 3D are the most suitable for trending. |
Data series file | Enter one of the measurement types listed in the table below. |
Load data from csv file | Select this check box. |
Include all columns | Make sure this is selected. |
Display original csv above plot | Select this check box. |

Tip: We recommend using the bolded measurements for trending because they compare like-for-like measurements to provide an indication of the application’s performance and workload.
- Repeat the two previous steps (Add Plot and configure) to add more plots.
- Save the configuration.
Type | Data Series File | Comment | Unit |
---|---|---|---|
TRT (Transaction Response Time) | pct_minimum_trt.csv | Minimum transaction response time | Seconds |
 | pct_maximum_trt.csv | Maximum transaction response time | Seconds |
 | pct_average_trt.csv | Average transaction response time | Seconds |
 | pct_median_trt.csv | Median transaction response time | Seconds |
 | pct_percentile_90_trt.csv | 90th percentile of transaction response time | Seconds |
 | pct_stddeviation_trt.csv | Standard deviation of transaction response time | Seconds |
 | pct_count1_trt.csv | Number of occurrences of the transaction | Count |
TPS (Transactions per second) | pct_minimum_tps.csv | Minimum transactions per second | Count |
 | pct_maximum_tps.csv | Maximum transactions per second | Count |
 | pct_average_tps.csv | Average transactions per second | Count |
 | pct_median_tps.csv | Median transactions per second | Count |
 | pct_sum1_tps.csv | Total number of transactions per second for a specific transaction | Count |
TRS (Transaction Summary) | pct_count1_trs.csv | Total number of occurrences for a specific transaction | Count |
UDP (User-defined data point) | pct_minimum_udp.csv | Minimum value for a user-defined data point | Unit |
 | pct_maximum_udp.csv | Maximum value for a user-defined data point | Unit |
 | pct_average_udp.csv | Average value for a user-defined data point | Unit |
 | pct_median_udp.csv | Median value for a user-defined data point | Unit |
 | pct_stddeviation_udp.csv | Standard deviation value for a user-defined data point | Unit |
 | pct_count1_udp.csv | Number of occurrences for a specific user-defined data point | Count |
 | pct_sum1_udp.csv | Sum of the values reported for a specific user-defined data point | Unit |
VU (Running Vusers) | pct_maximum_vu.csv | Maximum number of running Vusers in the scenario | Count |
 | pct_average_vu.csv | Average number of running Vusers in the scenario | Count |
WEB | pct_minimum_web.csv | Minimum value of web statistics (such as number of connections, throughput, and hits per second) | Unit |
 | pct_maximum_web.csv | Maximum value of web statistics | Unit |
 | pct_average_web.csv | Average value of web statistics | Unit |
 | pct_median_web.csv | Median value of web statistics | Unit |
 | pct_sum1_web.csv | Total value of web statistics | Unit |
Note: If you get a file does not exist error (“<file_name.csv> doesn’t match anything”), you can ignore it; the file is created during the job run.
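If your trending job is a Pipeline rather than a freestyle project, the Plot plug-in also exposes a plot step. The sketch below mirrors the settings described above for one measurement (pct_average_trt.csv); the parameter names follow the Plot plug-in's pipeline support as we understand it, so verify them with the Pipeline Syntax snippet generator.

```groovy
// Sketch of the plot configuration as a Pipeline step; parameter names are
// our reading of the Plot plug-in's pipeline support -- verify with the
// Pipeline Syntax snippet generator before relying on them.
node {
    plot(
        group: 'Performance Trending',            // plot group
        title: 'Average Transaction Response Time',
        yaxis: 'Seconds',                         // unit from the table above
        style: 'line',                            // line suits trending
        numBuilds: '10',                          // recommended maximum
        csvFileName: 'plot-avg-trt.csv',          // plug-in bookkeeping file
        csvSeries: [[
            file: 'pct_average_trt.csv',          // created during the job run
            inclusionFlag: 'OFF',
            exclusionValues: '',
            displayTableFlag: true,               // "Display original csv above plot"
            url: ''
        ]]
    )
}
```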
Configure the integration with ALM Octane
You can bring LoadRunner Enterprise performance test run results into ALM Octane using ALM Octane pipelines, and include them in the overall analysis of your product.
For details on configuring the integration, see Automated testing flow in the ALM Octane Help Center.