LoadRunner Enterprise and Jenkins
This section describes the Jenkins plugin and how it enables continuous performance testing: you can run performance tests using LoadRunner Enterprise and view LoadRunner Enterprise trend reports.
Continuous integration with Jenkins overview
As more software companies adopt continuous integration practices, you may also need to integrate performance tests into your testing process. This integration helps developers ensure that new builds do not introduce performance regressions.
The Micro Focus Application Automation Tools plugin for the Jenkins continuous integration server provides a mechanism for executing performance tests as part of a build script. This plugin allows you to trigger a LoadRunner Enterprise test as a build step and present the results in the Jenkins user interface.
You can integrate performance tests that have service level agreements (SLAs). This allows you to quickly determine whether the test passed or failed and whether performance was affected.
Set up a Jenkins server and install the plugin
This task describes how to install a Jenkins server and the CI plugin.
1. Install the Jenkins server.
   For the supported server versions, see the Integration with non-Micro Focus products section of the System Requirements.
2. Install the Micro Focus Application Automation Tools plugin.
   For details on downloading and installing this plugin, see the Application Automation Tools wiki page.
Note: The Jenkins plugin requires an administrator account.
Set up a Jenkins job to run tests in LoadRunner Enterprise
This task describes how to set up a Jenkins job to run performance tests.
1. Go to the Jenkins Server home page.
2. Click the New Job link or select an existing job.
3. Enter a Job name (for a new job).
4. Select Build a free-style software project and click OK.
5. In the Project Configuration section, scroll down to the Build section.
6. Perform the following according to the plugin you are using:
   - Micro Focus Application Automation Tools: Expand the Add build step drop-down, and select Execute tests using LoadRunner Enterprise.
   - Micro Focus LoadRunner Enterprise integration with Git: Expand the Add build step drop-down, and select Run Performance Test Using LoadRunner Enterprise.
7. (Optional) Enter a description of the build step.
8. Enter the hostname or IP address of a LoadRunner Enterprise server.
   Example: If the LoadRunner Enterprise server URL is http://MY_SERVER/loadtest, enter MY_SERVER.
9. If you are running the Jenkins job over a secured LoadRunner Enterprise server, select the Use HTTPS protocol option.
10. Enter the server credentials, project, and domain.
11. Perform the following according to the plugin you are using:
    - Micro Focus Application Automation Tools: Enter the Test ID. You can get the ID from the LoadRunner Enterprise > Test Management > Test Lab > Performance Test Set view. If the ID column is not visible, you can select it by clicking the Select Columns button.
    - Micro Focus LoadRunner Enterprise integration with Git: In the Run Test section, select one of the following:
      - Run an existing test. Provide a Test ID. You can get the ID from LoadRunner Enterprise > Test Management: select your test and find the ID in the General Details pane (ID <number>).
      - Create a new test. In the Test To Create field, provide text in YAML syntax representing a LoadRunner Enterprise test, or provide a path to a YAML file relative to the workspace (a sketch of preparing such a file appears after this procedure). The parameters to be used in each case are different. See Create a LoadRunner Enterprise test from YAML input.
12. Select an option for adding the Test Instance ID:
    - Automatically select existing or create new if none exists: If you select this option, LoadRunner Enterprise locates the existing test instance, or creates one if none exists.
    - Manual selection: Enter the Test Instance ID from the Assign and Select Test Set dialog box, available from LoadRunner Enterprise > Test Management > Performance Test Summary view > Assign To. For details, see General Details Pane.
13. Choose whether to use a Local Proxy.
14. Choose a Post Run Action (Collate Results, Collate and Analyze, or Do Not Collate).
15. Select a trend report option:
    - Do Not Trend: No trend report is created.
    - Use trend report associated with the test: If Auto Trending is selected in the load test, select this option to automatically publish trend results.
    - Add run to trend report with ID: If you select this option, enter the trend report ID.
16. Enter a duration for the Ad-Hoc timeslot. The minimum time is 30 minutes.
17. Choose whether to use VUDs licenses.
18. Choose whether to consider the SLA status when determining the build step status. If you do not enable the Set step status according to SLA option, the build step is labeled Passed as long as no failures occurred.
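For the Create a new test option in step 11, one convenient pattern is to have an earlier build step write the YAML file into the workspace and then pass its relative path in the Test To Create field. Below is a minimal Python sketch of that pattern; the YAML keys in it are hypothetical placeholders, not the official schema, so take the actual test parameters from Create a LoadRunner Enterprise test from YAML input.

```python
import os

# NOTE: the keys below are hypothetical placeholders for illustration only;
# use the schema from "Create a LoadRunner Enterprise test from YAML input".
TEST_YAML = """\
# Hypothetical structure; not the official schema.
test_name: checkout_load_test
controller: my_controller        # placeholder host
scripts:
  - path: Scripts/checkout       # placeholder script location
    vusers: 50
"""

workspace = os.environ.get("WORKSPACE", ".")  # Jenkins sets WORKSPACE for each build
target = os.path.join(workspace, "lre_test.yaml")
with open(target, "w") as f:
    f.write(TEST_YAML)
print("Wrote", target)  # then pass "lre_test.yaml" in the Test To Create field
```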
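The Set step status according to SLA option in the last step boils down to a simple decision rule. Here is a minimal Python sketch of that logic, with hypothetical status strings for illustration:

```python
def build_step_status(run_failed: bool, sla_status: str, use_sla_option: bool) -> str:
    """Illustrative pass/fail rule for the build step (status values are hypothetical)."""
    if run_failed:
        # A run that fails to execute fails the step regardless of the option.
        return "Failed"
    if use_sla_option and sla_status != "Passed":
        # With the SLA option enabled, a run that misses its SLA fails the step.
        return "Failed"
    # Without the SLA option, any run with no failures is labeled Passed.
    return "Passed"

# Example: the run completed, but the test missed its SLA.
print(build_step_status(False, "Failed", use_sla_option=True))   # Failed
print(build_step_status(False, "Failed", use_sla_option=False))  # Passed
```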
Run the job
Run or schedule the job as you would with any standard Jenkins job.
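Beyond the Jenkins UI and built-in schedulers, you can also queue the job programmatically through Jenkins' standard remote build endpoint. A minimal sketch, assuming a Jenkins user with an API token; the URL, job name, and credentials below are placeholders:

```python
import requests

JENKINS_URL = "http://jenkins.example.com:8080"  # placeholder
JOB_NAME = "lre-perf-test"                       # placeholder
USER, API_TOKEN = "ci-user", "your-api-token"    # placeholder credentials

# POST /job/<name>/build queues a new run of the job; jobs that define
# parameters use /buildWithParameters instead.
resp = requests.post(f"{JENKINS_URL}/job/{JOB_NAME}/build", auth=(USER, API_TOKEN))
resp.raise_for_status()
# Jenkins responds with 201 and a Location header pointing at the queue item.
print("Queued:", resp.headers.get("Location"))
```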
Configure trending report charts on Jenkins
This task describes how to configure trending report charts on Jenkins.
1. Install the plot plugin (it is available from Jenkins Plugins, https://plugins.jenkins.io/).
2. Open your job configuration, and add a new post build action: Plot build data.
3. Click Add Plot.
4. Configure the following settings and leave the other settings blank or unselected (see the Data Series files table below for setting details):

| Name | Description |
|---|---|
| Plot group | Enter a meaningful name for the group. For example, Performance Trending. |
| Plot title | Enter a name that is related to the measurement. For example, Average Transaction Response Time. |
| Number of builds to include | Enter the number of builds to include. We recommend no more than 10 builds. |
| Plot y-axis label | Enter the appropriate measurement unit. For example, Seconds. See the recommended unit for each measurement in the table below. |
| Plot style | Select the plot style. Line or Line 3D are the most suitable for trending. |
| Data series file | Enter one of the measurement types listed in the table below. |
| Load data from csv file | Select this check box. |
| Include all columns | Make sure this is selected. |
| Display original csv above plot | Select this check box. |

Tip: For trending, we recommend measurements that compare like for like, such as average and percentile transaction response times, transaction counts, and running Vusers, because they provide an indication of the application's performance and workload. A short sketch showing how to read one of these data series files appears after this procedure.
5. Repeat steps 3-4 to add more plots.
6. Save the configuration.
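To sanity-check a data series file before wiring it into a plot, you can read it directly. A minimal Python sketch; it assumes the layout the plot plugin's Load data from csv file mode expects, a header row of series labels followed by a row of values (inspect a file from a finished run to confirm the exact layout):

```python
import csv

CSV_PATH = "pct_average_trt.csv"  # one of the generated data series files

with open(CSV_PATH, newline="") as f:
    rows = list(csv.reader(f))

# Assumption: row 0 holds the series labels (e.g. transaction names)
# and row 1 holds the matching values for the run.
labels, values = rows[0], rows[1]
for label, value in zip(labels, values):
    print(f"{label}: {value}")
```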
| Type | Data Series File | Comment | Unit |
|---|---|---|---|
| TRT (Transaction Response Time) | pct_minimum_trt.csv | Minimum transaction response time | Seconds |
| | pct_maximum_trt.csv | Maximum transaction response time | Seconds |
| | pct_average_trt.csv | Average transaction response time | Seconds |
| | pct_median_trt.csv | Median transaction response time | Seconds |
| | pct_percentile_90_trt.csv | 90th percentile of transaction response time | Seconds |
| | pct_stddeviation_trt.csv | Standard deviation of transaction response time | Seconds |
| | pct_count1_trt.csv | Number of occurrences of the transaction | Count |
| TPS (Transactions Per Second) | pct_minimum_tps.csv | Minimum transactions per second | Count |
| | pct_maximum_tps.csv | Maximum transactions per second | Count |
| | pct_average_tps.csv | Average transactions per second | Count |
| | pct_median_tps.csv | Median transactions per second | Count |
| | pct_sum1_tps.csv | Total number of transactions per second for a given transaction | Count |
| TRS (Transaction Summary) | pct_count1_trs.csv | Total number of occurrences for a given transaction | Count |
| UDP (User Defined Data Point) | pct_minimum_udp.csv | Minimum value for a user defined data point | Unit |
| | pct_maximum_udp.csv | Maximum value for a user defined data point | Unit |
| | pct_average_udp.csv | Average value for a user defined data point | Unit |
| | pct_median_udp.csv | Median value for a user defined data point | Unit |
| | pct_stddeviation_udp.csv | Standard deviation value for a user defined data point | Unit |
| | pct_count1_udp.csv | Number of occurrences for a given user defined data point | Count |
| | pct_sum1_udp.csv | Sum of the values reported in a given user defined data point | Unit |
| VU (Running Vusers) | pct_maximum_vu.csv | Maximum number of running Vusers in the scenario | Count |
| | pct_average_vu.csv | Average number of running Vusers in the scenario | Count |
| WEB | pct_minimum_web.csv | Minimum value of web statistics (number of connections, throughput, hits per second, and so on) | Unit |
| | pct_maximum_web.csv | Maximum value of web statistics | Unit |
| | pct_average_web.csv | Average value of web statistics | Unit |
| | pct_median_web.csv | Median value of web statistics | Unit |
| | pct_sum1_web.csv | Total value of web statistics | Unit |
Note: If you get a "file does not exist" error ("<file_name.csv> doesn't match anything"), you can ignore it; the file is created during the job execution.
Configure the LoadRunner Enterprise-Octane integration
You can bring performance test run results into ALM Octane using ALM Octane pipelines, and include them in the overall analysis of your product.
For details on configuring the integration, see Automated testing flow in the Octane Help Center.