Performance Center and Jenkins
This section describes the Jenkins plugin and how it enables you to apply continuous performance testing as part of your delivery process. With the plugin you can run performance tests using Performance Center and view Performance Center trend reports.
As more software companies adopt continuous integration practices, you may also need to integrate performance tests into your testing process. This integration helps developers ensure that new builds do not introduce regressions.
The Micro Focus Application Automation Tools plugin for the Jenkins continuous integration server provides a mechanism for executing performance tests as part of a build script. This plugin allows you to trigger a Performance Center test as a build step and present the results in the Jenkins user interface.
You can integrate performance tests that have service level agreements (SLAs). This lets you quickly determine whether the test passed or failed and whether performance was affected.
Install the Jenkins server.
For the supported server versions, see the Integration with non-Micro Focus products section of the System Requirements.
Install the Micro Focus Application Automation Tools plugin.
For details on downloading and installing this plugin, see the Application Automation Tools wiki page.
Note: The Jenkins plugin requires an administrator account.
Go to the Jenkins Server home page.
Click the New Job link or select an existing job.
Enter a Job name (for a new job).
Select Build a free-style software project and click OK.
In the Project Configuration section, scroll down to the Build section.
Micro Focus Application Automation Tools
Expand the Add build step drop-down, and select Execute tests using Performance Center.
Micro Focus Performance Center integration with Git
Expand the Add build step drop-down and select Run Performance Test Using Performance Center.
(Optional) Enter a description of the build step.
Enter the hostname or IP address of a Performance Center server.
Example: If the Performance Center server URL is
If you are running the Jenkins job over a secured Performance Center server, select the Use HTTPS protocol option.
Enter the server credentials, project, and domain.
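The hostname entered above and the Use HTTPS protocol option together determine the base URL used to reach the Performance Center server. As a rough illustration only (the helper name and logic below are hypothetical, not part of the plugin):

```python
def pc_base_url(host: str, use_https: bool = False) -> str:
    """Build a Performance Center server base URL.

    `host` is the hostname or IP address entered in the build step;
    `use_https` mirrors the "Use HTTPS protocol" option.
    (Illustrative helper only; not the plugin's actual code.)
    """
    scheme = "https" if use_https else "http"
    return f"{scheme}://{host}"

# Example: a secured server reached over HTTPS.
print(pc_base_url("mypcserver.example.com", use_https=True))
```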
In the Run Test section, select one of the following:
Run an existing test. If you run an existing test, provide the Test ID. You can get the ID from the Performance Center > Test Management > Test Lab > Performance Test Set view. If the ID column is not visible, you can display it by clicking the Select Columns button.
Create a new test. If you create a new test, in the Test To Create field provide either text in YAML syntax representing a Performance Center test, or a path to a YAML file relative to the workspace. The parameters to use differ in each case. See Create a Performance Center test from YAML input.
Automatically select existing or create new if none exists (Performance Center 12.55 or later). If you select this option, Performance Center locates the existing test instance or creates a new one.
Manual selection. Enter the Test Instance ID (available from the Performance Center > Test Management > Test Lab > Performance Test Set view).
Choose whether to use a Local Proxy.
Choose a Post Run Action (Collate Results, Collate and Analyze, or Do Not Collate).
Select a trend report option:
Do Not Trend. No trend report is created.
Use trend report associated with the test (Performance Center 12.55 or later). If Auto Trending is selected in the load test, select this option to automatically publish trend results.
Add run to trend report with ID. If you select this option, enter the trend report ID (supported for Performance Center 12.53 or later).
Enter a duration for the Ad-Hoc timeslot. The minimum time is 30 minutes.
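The 30-minute minimum for the Ad-Hoc timeslot can be expressed as a simple validation (the helper below is hypothetical, for illustration only):

```python
def validate_timeslot_minutes(minutes: int) -> int:
    """Reject Ad-Hoc timeslot durations below the 30-minute minimum.

    (Illustrative check only; the plugin enforces this in its own UI.)
    """
    if minutes < 30:
        raise ValueError("Ad-Hoc timeslot must be at least 30 minutes")
    return minutes

print(validate_timeslot_minutes(45))
```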
Choose whether to use VUDs licenses.
Choose whether to consider the SLA status when determining the build-step status. If you do not enable the Set step status according to SLA option, the build step is marked as Passed as long as no failures occurred.
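The pass/fail behavior described in this step can be sketched as follows. This is illustrative logic only, not the plugin's actual code, and the function name is hypothetical:

```python
def build_step_status(sla_passed: bool, run_failed: bool,
                      set_status_by_sla: bool) -> str:
    """Decide the build-step status for a finished test run.

    Mirrors the documented behavior: without the
    "Set step status according to SLA" option, any run that
    completed without failures is marked Passed, regardless of SLA.
    """
    if run_failed:
        return "Failed"
    if set_status_by_sla and not sla_passed:
        return "Failed"
    return "Passed"

# An SLA breach only fails the step when the option is enabled.
print(build_step_status(sla_passed=False, run_failed=False,
                        set_status_by_sla=False))  # → Passed
```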
Run the job
Run or schedule the job as you would with any standard Jenkins job.
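Besides the Jenkins UI, jobs can be triggered remotely through Jenkins' standard remote access API (a POST to `/job/<name>/build`, with valid credentials and, if CSRF protection is enabled, a crumb). A minimal sketch of building that URL; the server address and job name below are examples:

```python
from urllib.parse import quote

def jenkins_build_url(jenkins_url: str, job_name: str) -> str:
    """Return the standard Jenkins remote-trigger URL for a job.

    POSTing to this URL (authenticated) queues a build, the same
    as clicking Build Now in the UI.
    """
    return f"{jenkins_url.rstrip('/')}/job/{quote(job_name)}/build"

print(jenkins_build_url("http://jenkins.example.com:8080", "pc-nightly-load"))
```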
To view Performance Center Trend reports:
Install the plot plugin from https://wiki.jenkins.io/display/JENKINS/Plot+Plugin.
Open your job configuration, and add a new post build action: Plot build data.
Click Add Plot.
Configure the following settings and leave the other settings blank or unselected (see the Data Series files table below for setting details):
|Setting||Description|
|Plot group||Enter a meaningful name for the group. For example, Performance Trending.|
|Plot title||Enter a name that is related to the measurement. For example, Average Transaction Response Time.|
|Number of builds to include||Enter the number of builds to include. We recommend no more than 10 builds.|
|Plot y-axis label||Enter the appropriate measurement unit. For example, Seconds. See the recommended unit for each measurement in the table below.|
|Plot style||Select the plot style. Line or Line 3D are the most suitable for trending.|
|Data series file||Enter one of the measurement types listed in the table below.|
|Load data from csv file||Select this check box.|
|Include all columns||Make sure this is selected.|
|Display original csv above plot||Select this check box.|
Tip: We recommend using the bolded measurements for trending because they compare like-for-like measurements to provide an indication of the application’s performance and workload.
Repeat steps 1-4 to add more plots.
Save the configuration.
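For reference, the generated pct_*.csv files can be inspected like any small CSV. The sketch below assumes the layout the Plot plugin reads when Load data from csv file is selected: a header row of labels followed by a row of values. The sample content is illustrative, not output from a real run:

```python
import csv
import io

# Illustrative stand-in for a file such as pct_average_trt.csv:
# one label per transaction, then one value per transaction.
sample = "login,search,checkout\n0.82,1.10,2.45\n"

reader = csv.reader(io.StringIO(sample))
labels = next(reader)                       # transaction names
values = [float(v) for v in next(reader)]   # measurements
averages = dict(zip(labels, values))
print(averages)
```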
|Type||Data Series File||Comment||Unit|
|TRT (Transaction Response Time)||pct_minimum_trt.csv||Minimum transaction response time||Seconds|
|pct_maximum_trt.csv||Maximum transaction response time||Seconds|
|pct_average_trt.csv||Average transaction response time||Seconds|
|pct_median_trt.csv||Median transaction response time||Seconds|
|pct_percentile_90_trt.csv||90th percentile of transaction response time||Seconds|
|pct_stddeviation_trt.csv||Standard deviation transaction response time||Seconds|
|pct_count1_trt.csv||Number of occurrences of the transaction||Count|
|TPS (Transactions per second)||pct_minimum_tps.csv||Minimum transactions per second||Count|
|pct_maximum_tps.csv||Maximum transactions per second||Count|
|pct_average_tps.csv||Average transactions per second||Count|
|pct_median_tps.csv||Median transactions per second||Count|
|pct_sum1_tps.csv||Total number of transactions per second for a given transaction||Count|
|TRS (Transaction Summary)||pct_count1_trs.csv||Total number of occurrences for a given transaction||Count|
|UDP (User defined data point)||pct_minimum_udp.csv||Minimum value for a user defined data point||Unit|
|pct_maximum_udp.csv||Maximum value for a user defined data point||Unit|
|pct_average_udp.csv||Average value for a user defined data point||Unit|
|pct_median_udp.csv||Median value for a user defined data point||Unit|
|pct_stddeviation_udp.csv||Standard deviation value for user defined data point||Unit|
|pct_count1_udp.csv||Number of occurrences for a given user defined data point||Count|
|pct_sum1_udp.csv||Sum of the values reported in a given user defined data point||Unit|
|VU (Running Vusers)||pct_maximum_vu.csv||Maximum number of running Vusers in the scenario||Count|
|pct_average_vu.csv||Average number of running Vusers in the scenario||Count|
|WEB||pct_minimum_web.csv||Minimum value of web statistics (# of connections, throughput, hits per second, etc.)||Unit|
|pct_maximum_web.csv||Maximum value of web statistics||Unit|
|pct_average_web.csv||Average value of web statistics||Unit|
|pct_median_web.csv||Median value of web statistics||Unit|
|pct_sum1_web.csv||Total value of web statistics||Unit|
Note: If you get a file-does-not-exist error (“<file_name.csv> doesn’t match anything”), you can ignore it; the file is created during job execution.
You can bring performance test run results into ALM Octane using ALM Octane pipelines, and include them in the overall analysis of your product.
For details on configuring the integration, see Automated testing flow in the Octane Help Center.