Define service level agreements (SLAs)

This topic explains how to define service level agreements (SLAs). SLAs measure performance test goals over time intervals during a test run, or over a whole performance test run.

SLA overview

Service level agreements (SLAs) are specific goals that you define for your performance test. During a test run, LoadRunner Enterprise measures performance and collects data. This data is compared against thresholds defined in the SLAs.

After a test run, LoadRunner Analysis compares these goals against performance-related data that was gathered and stored during the course of the run, and determines whether the SLA passed or failed.

Determine the SLA status

Depending on the measurements that you are evaluating for your goal, LoadRunner Enterprise determines the SLA status in one of the following ways:

SLA status determined at time intervals over a timeline

Analysis displays SLA statuses at set time intervals over a timeline within the test run.

At each time interval in the timeline (for example, every 10 seconds), Analysis checks whether the measurement deviated from the threshold defined in the SLA, as illustrated in the sketch after the following list.

Measurements that can be evaluated in this way:

  • Average Transaction Response Time

  • Errors per Second
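
The following sketch illustrates the per-interval check described above. It is a minimal illustration only, not LoadRunner code; the use of the worst value observed in each interval is an assumption made for the example.

  # Hypothetical sketch: evaluating an SLA at fixed time intervals.
  # samples is a list of (elapsed_seconds, measured_value) pairs from the run.
  # Any interval whose value exceeds the threshold produces a Failed status.
  def interval_sla_status(samples, threshold, interval=10):
      worst_per_interval = {}
      for elapsed, value in samples:
          bucket = int(elapsed // interval) * interval  # start of the interval
          worst_per_interval[bucket] = max(worst_per_interval.get(bucket, 0.0), value)
      return {start: ("Failed" if worst > threshold else "Passed")
              for start, worst in worst_per_interval.items()}

  # Errors per Second sampled during a run, evaluated against a threshold of 2
  print(interval_sla_status([(3, 0.5), (12, 1.0), (15, 2.5), (27, 1.8)], threshold=2))
  # {0: 'Passed', 10: 'Failed', 20: 'Passed'}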

SLA status determined over the whole run

Analysis displays a single SLA status for the whole test run.

Measurements that can be evaluated in this way:

  • Transaction Response Time - Percentile

  • Total Hits per run

  • Average Hits (hits/second) per run

  • Total Throughput (bytes) per run

  • Average Throughput (bytes/second) per run
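
A minimal sketch of the whole-run comparison described above, using hypothetical helper functions rather than LoadRunner APIs. The threshold directions follow the notes later in this topic: a hits or throughput value lower than its threshold fails, and the percentile goal passes when at least the configured percentage of transactions respond within the threshold time.

  # Hypothetical sketch: whole-run SLA checks.
  # Hits and throughput goals fail when the run value is below the threshold.
  def whole_run_status(measured, threshold):
      return "Passed" if measured >= threshold else "Failed"

  # Transaction Response Time - Percentile: passes when at least target_percent
  # of the transactions respond within threshold_seconds.
  def percentile_status(response_times, target_percent, threshold_seconds):
      within = sum(1 for t in response_times if t <= threshold_seconds)
      actual_percent = 100.0 * within / len(response_times)
      return "Passed" if actual_percent >= target_percent else "Failed"

  print(whole_run_status(measured=12500, threshold=10000))       # Total Hits: Passed
  print(percentile_status([0.8, 1.2, 2.5, 0.9, 1.1], 80, 1.5))   # 80% within 1.5s: Passed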

APDEX status determined over the whole run

Analysis displays an APDEX (Application Performance Index) score for the whole test run. APDEX is an open standard for measuring user satisfaction with response times. For each transaction, the score is based on the ratio of satisfied, tolerating, and frustrated requests to the total number of requests made. Each satisfied request counts as one request, each tolerating request counts as half a satisfied request, and each frustrated request counts as zero. The resulting score is a value between 0 and 1; the higher the value, the better the performance of the measured metric. A worked example of the calculation follows the list below.

Measurements that can be evaluated in this way:

  • Transaction Response Time (APDEX)
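
The following is a minimal sketch of the APDEX calculation described above. The Satisfied and Frustrated threshold values shown are arbitrary examples; in LoadRunner Enterprise they are set per transaction when you enable APDEX in the SLA tab.

  # Hypothetical sketch: APDEX score for a single transaction.
  # satisfied:  response time <= satisfied_threshold, counts as 1
  # tolerating: between the two thresholds, counts as 0.5
  # frustrated: slower than frustrated_threshold, counts as 0
  def apdex(response_times, satisfied_threshold, frustrated_threshold):
      satisfied = sum(1 for t in response_times if t <= satisfied_threshold)
      tolerating = sum(1 for t in response_times
                       if satisfied_threshold < t <= frustrated_threshold)
      return (satisfied + 0.5 * tolerating) / len(response_times)

  # Example: satisfied up to 2 seconds, frustrated above 8 seconds
  print(apdex([1.0, 1.5, 3.0, 9.0], satisfied_threshold=2.0, frustrated_threshold=8.0))
  # (2 satisfied + 0.5 * 1 tolerating) / 4 requests = 0.625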

Note:  

  • You can define and edit SLAs in LoadRunner Enterprise or in LoadRunner Analysis.

  • For details about viewing post-run SLA statuses in LoadRunner Enterprise, see View SLA reports.

  • For details about viewing post-run SLA statuses in Analysis reports, see the LoadRunner Professional Help Center.

Back to top

Define SLAs

This section explains the workflow for defining service level agreements.

  1. Prerequisites

    Create a performance test. For details, see Design a test.

    Note: To define Average Transaction Response Time or Transaction Response Time Percentile SLAs, your performance test must include a script that contains at least one transaction.

  2. Open the SLA tab and select a measurement

    1. In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts.

    2. Select a performance test in the test management tree and click Edit Test.

    3. In the Performance Test Designer window, select the SLA tab.

    4. Select a measurement for the SLA.

      Transaction Response Time

      • Transaction Response Time - Percentile. Measures the percentage of transactions whose transaction response time falls below a specific threshold.

      • Transaction Response Time - Average. Measures whether the average transaction response time of the transactions over a specified time interval exceeds the defined threshold.

      For details, see Define a Transaction Response Time - optional.

      Errors per Second

      Measures whether the errors per second over a specified time interval exceed the defined threshold. For details, see Define Errors per Second - optional.

      Measurements per Run

      • Total Hits. Measures whether the total number of hits over the whole test run reaches the defined threshold.

      • Average Hits per Second. Measures whether the average hits per second over the whole test run reaches the defined threshold.

      • Total Throughput. Measures whether the total throughput over the whole test run reaches the defined threshold.

      • Average Throughput. Measures whether the average throughput over the whole test run reaches the defined threshold.

      For details, see Define Measurements per Run - optional.

  3. Define a Transaction Response Time - optional

    1. Click Transaction Response Time.

    2. Select a measurement type:

      • Percentile. Measures the percentage of transactions whose transaction response time falls below a specific threshold. The SLA is measured over the whole run.

        If you select this option, enter the percentage of transactions to measure against the configured threshold.

      • Average. Measures whether the average transaction response time of the transactions over a specified time interval exceeds the defined threshold.

        If you select this option, select a load criterion to consider when evaluating the goal and define appropriate load value ranges for the load criterion. For details, see Select load criteria and define ranges.

    3. Select the transactions to include in your goal.

      Click Select Transactions to display the transactions in the scripts participating in the test. To add transactions that are critical to your test to the main SLA list, select them and click OK. The selected transactions are added to the list with their default threshold values.

      Note: If you delete a transaction from this list by deselecting it, and later add the same transaction, it is added with its default threshold value.

    4. Set thresholds for the measurements.

      You can also set threshold values for satisfactory response time and tolerable response time, for each transaction. To do this, select Enable APDEX, and define different thresholds for each transaction in the Satisfied, Tolerating, and Frustrated thresholds field in the thresholds table.

      Note: If the resulting value over the whole run exceeds the defined threshold values, the SLA produces a Failed status.

  4. Define Errors per Second - optional

    1. Click Errors per Second.

    2. Select a load criterion to consider when evaluating the goal and define appropriate load value ranges for the load criterion. For details, see Select load criteria and define ranges.

    3. Set thresholds for the measurements.

      Note: If the resulting value over the specified time interval exceeds the threshold values, the SLA produces a Failed status for that time interval.

  5. Define Measurements per Run - optional

    1. Click Measurements per Run.

    2. Set thresholds for each measurement type you want to include (Total Hits, Average Hits per Second, Total Throughput, Average Throughput).

      Note: If the resulting value over the whole run is lower than the threshold value, the SLA produces a Failed status.

  6. Results

    During post-run analysis, LoadRunner Analysis compares the data collected from the test run against the settings defined in the SLAs, and determines SLA statuses and APDEX scores, which are included in the default Summary Report and in the SLA Report.

    To learn more, see the Analysis section of the LoadRunner Professional Help Center.

Back to top

Select load criteria and define ranges

You can select a load criterion for your goal and define appropriate load value ranges.

Example: You can define the SLA to show the behavior of errors per second when there are less than 5 running Vusers, when there are between 5 and 10 running Vusers, and when there are 10 or more running Vusers.

  1. Select the load criterion to consider when evaluating the goal.

    None

    Does not consider any load criterion.

    Running Vusers

    Considers the impact of the running Vusers.

    Throughput

    Considers the impact of throughput.

    Hits per Second

    Considers the impact of the hits per second.

    Transactions per second

    Considers the impact of the transactions per second.

    Note: Available for Average Transaction Response Time and Errors per Second only.

    Transactions per second (passed)

    Considers the impact of the transactions per second that passed the evaluation.

    Note: Available for Average Transaction Response Time and Errors per Second only.

  2. Enter load values to consider when evaluating the goal.

    Value ranges must be consecutive, spanning all values from zero to infinity.

    Less than

    The lower range is always from 0 up to, but not including, the value entered here.

    Example: If you enter 5, the lower range is between 0 and 5, but does not include 5.

    Between

    The in-between ranges include the lower value of the range, but not the upper value. You can set up to three in-between ranges.

    Example: If you enter 5 and 10, the range is from 5 and up to, but not including, 10.

    Greater than

    The upper range is from the value entered here, and higher.

    Example: If you enter 10, the upper range is from 10 and up.
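
    For illustration of the range boundaries described above (a hypothetical helper, not part of LoadRunner Enterprise): the lower range excludes its upper boundary, in-between ranges include their lower boundary only, and the upper range includes its starting value.

      # Hypothetical sketch: assigning a load value (e.g. running Vusers) to a range.
      # With boundaries (5, 10) the ranges are [0, 5), [5, 10), and [10, infinity).
      # The UI allows up to three in-between ranges; one is shown here for brevity.
      def load_range(value, boundaries=(5, 10)):
          lower, upper = boundaries
          if value < lower:
              return f"Less than {lower}"
          if value < upper:
              return f"Between {lower} and {upper}"
          return f"Greater than or equal to {upper}"

      print(load_range(4))   # Less than 5
      print(load_range(5))   # Between 5 and 10
      print(load_range(10))  # Greater than or equal to 10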

Back to top

View SLA reports

The SLA report displays the post-run SLA statuses of the SLA goals defined for the performance test.

Note: The SLA report is available only if SLAs were defined prior to running the performance test.

  1. In the top banner, click the module name or the dropdown arrow and select Test Management > Tests & Scripts.

  2. Select a performance test in the test management tree, and click the Runs tab.

  3. Right-click a test run and select SLA Report.

  4. To export a section of the SLA report, choose the type of file to which you want to export it (Excel, PDF, CSV, or Word).

  5. The details of the performance test run to which the SLA report data relates are displayed at the top of the report, and the results for each SLA are displayed in separate grids. An icon next to each result indicates one of the following statuses:

    • Failed SLA status.

    • Passed SLA status.

    • No data is available for the SLA status.

Back to top

See also: