Analysis (beta)

StormRunner Load's analysis capabilities provide a dashboard and an executive report that enable you to:

  • view and analyze test results for multiple runs of a load test
  • compare multiple metrics
  • view summary data in pre-defined tables

Where do I find it?

Do one of the following:

Analysis

  1. Navigate to the Analysis page.
  2. Click a load test.

Load Tests

  1. Navigate to the Load Tests page.
  2. Select a test and navigate to the Runs tab.
  3. For a selected test run, click the more options button and select Executive report.

Note: When you are in the Analysis page and have selected a load test, you can toggle between the dashboard and the executive report by clicking the relevant tab at the top of the page.


Analysis dashboard

A dashboard view is contained in a tab. You can create multiple tabs for a load test, with each tab showing a different dashboard. For example, tab 1 can display a dashboard with data for runs one and two of a load test, while tab 2 displays a different dashboard with data for runs three and four. You can create up to 20 dashboards, which persist for the current user.

To create and work with tabs, do one of the following:

Action Method
Create a tab
  • Right-click an existing tab and select New tab.

  • Click the new tab button.

The new tab that is created contains metrics for the latest run of the selected load test. If there is an active run of the load test (that is, it is currently running) then that is considered the latest run.

Rename a tab: Right-click a tab and select Rename.
Duplicate a tab: Right-click a tab and select Duplicate. The duplicated tab's name is the original tab name with -1 appended. For example, if the original tab name is MyTab, the duplicated tab name is MyTab-1.
Close a tab
  • Right-click a tab and select Close.
  • Click the X next to the tab name.

Note:

  • A closed tab cannot be re-opened.

  • There must be at least one tab open.

A dashboard view comprises two panes. The left pane shows which runs and metrics are displayed and provides various configuration options. The right pane shows the data in a graph.

Configuration pane

The configuration pane displays a hierarchical tree of the metrics available for the selected load test run, as well as the values of the metrics you select to view in the graph. An additional column shows the percentage of SLA breaks for a metric, when applicable.

You can configure load test runs and metrics.

Load test runs

By default, the dashboard displays runs as follows:

  • The first time you access Analysis for a load test, the latest run of the selected load test is displayed. If there is an active run of the load test (that is, it is currently running) then that is considered the latest run.
  • When you subsequently access Analysis for the same load test, the last view displayed is reloaded.

You can change the run, or add additional runs. To add or remove runs from the dashboard, open the run selection dialog box and select the check boxes for the required runs. For each selected run, a Value column is added to the tree; it displays the values of the selected metrics at the time point currently selected in the graph.

Metrics

Metrics that are selected for the dashboard are marked with a selection icon. By default, the following metrics are selected:

  • Running Vusers

  • Hits per second

  • Throughput

  • Errors

Metrics that breach a configured SLA are marked with one of two additional indicators:

  • An SLA was breached, but the breach did not cause the load test run to fail.

  • An SLA was breached and the breach caused the load test run to fail.

The percentage of SLA breaks for the metric is displayed in the Break column.
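
To make the Break column concrete, the following is a minimal sketch of how such a percentage can be derived. It assumes the value is simply the share of measured data points that exceed the SLA threshold; this is an illustration, not StormRunner Load's exact calculation, and the function name and sample values are hypothetical.

  # Illustrative only: assumes the Break percentage is the share of measured
  # data points that exceed the SLA threshold for the metric.
  def sla_break_percentage(metric_samples, sla_threshold):
      """Return the percentage of samples that breach the SLA threshold."""
      if not metric_samples:
          return 0.0
      breaches = sum(1 for value in metric_samples if value > sla_threshold)
      return 100.0 * breaches / len(metric_samples)

  # Example: transaction response times (seconds) sampled during a run,
  # checked against an SLA threshold of 3 seconds.
  samples = [1.8, 2.4, 3.5, 2.9, 4.1, 2.2]
  print(f"{sla_break_percentage(samples, 3.0):.1f}% SLA breaks")  # 33.3% SLA breaks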

Use the following actions to configure the metrics you want to include in the dashboard:

Action Description
Select or deselect a metric for viewing

To select a metric, click the empty cell to the left of the metric name. A selection icon appears to show that the metric is selected.

To deselect a metric, click the selection icon next to the metric name.

Select or deselect sub-metrics by type

Right-click a metric name to display a menu showing the number of sub-metrics by type. For example, right-click a script name to show the number of related transactions, locations, emulations, and so forth. Select a metric type and then click Show or Hide to show or hide all the metrics of that type under the parent metric.

Note: When hiding sub-metrics, you can also select All to hide all the metrics under the parent metric, regardless of their type.

Search for a metric: Enter a string in the search box and click the magnifying glass icon. The metrics whose names contain the string are displayed in the tree.
Open or close a metric: Click the arrow next to a metric name to open or close the tree for that specific metric.
Open all selected metrics: Click the open all button to open the tree for all selected metrics.
Deselect all metrics: Click the deselect all button to deselect all metrics.
Change the source run for which metrics are displayed

The exact metrics associated with a load test can vary from run to run, for example, if the load test configuration is changed between runs. If you have selected multiple runs in a dashboard, the metrics displayed are those associated with the most recent run. To change the source run:

Click the menu next to the run number at the top of a Value column and select Select Metrics Source. In the dialog box that opens, select the run you want to use as the source run.

Resize value columns

Click the menu next to the run number at the top of a Value column and select:

  • Autosize This Column (for one column)
  • Autosize All Columns (for all columns)
Highlight metrics

To highlight a metric, click on it. To highlight multiple metrics, use Ctrl+click or Shift+click.

The data lines for the selected metrics are also highlighted in the data graph.

Note: The number of selected metrics is displayed at the bottom of the configuration pane.

Data graph

The graph of the selected metrics is a line graph, with a line for each metric. The lines are color-coded; the color key is shown in the Value column in the Configuration pane.

SLA breaches are shown as a red shaded area and the SLA threshold is denoted by a horizontal, dashed line.

The x-axis of the graph shows the load test run's time line and the y-axis shows the metric values. If you have selected multiple runs in the dashboard, by default the time line is:

  • If the selected runs do not include an active run, the time line is that of the longest run.

  • If the selected runs include an active run, the time line is that of the active run.

Use the following actions to customize the graph view:

Action Description
View a specific time

Drag the time indicator to a specific time in the run. The metric values displayed in the Value column in the Configuration pane are updated accordingly. The specific time selected is displayed in a time counter at the top left of the graph.

Zoom to a specific frame

The time frame for which results are displayed is shown at the top of the graph. You can change the time range using one of these options:

  • Drag, stretch, or shrink the time slider.
  • Place the cursor in the graph area and drag it left or right to select a time range.
Select a preset time frame

Note: This option is available for active runs only.

From the drop-down list to the right of the time slider, select a preset time frame to view. The graph then shows the last selected number of minutes of the run.

Available options are 5, 15, 30, and 60 minutes. The default time frame is five minutes.

The graph is refreshed automatically every five seconds.

If you use the time slider to select a time frame, Manual appears in the preset box and the graph is no longer automatically refreshed.

Display a tooltip: Hover over a data point to display a tooltip that shows the metric name, the value, and the run ID.
Enable or disable tooltips: Toggle the Show point details icon to enable or disable tooltips.
Select the graph scale

To change the scale of the graph, select one of the following options from the drop-down menu at the top right of the graph:

  • No scale. The Y axis is not scaled and actual values are shown.
  • Scale. Uses a linear scale for the Y axis and shows actual values.
  • Log scale. Uses a logarithmic scale for the Y axis. This can be helpful when the data covers a large range of values, as it reduces the range to a more manageable size.
Highlight lines

To highlight a line in the graph, click on it. To highlight multiple lines, use Ctrl+click.

The metrics for highlighted lines are also highlighted in the metrics tree in the Configuration pane.

Stop a test run

Note: This option is available for active runs only.

Click Stop Test to stop an active test run.


Executive report

The executive report provides a two-page summary of a load test's results for a quick, high-level overview.

The report is for a single run of a load test. By default, the last run of the selected load test is included in the report. To change the run included in the report, open the run selection dialog box and select the required run.

The report comprises the following four sections:

Summary

The Summary table contains general information about the load test run and can include the following fields (a short sketch showing how the average and total fields relate follows the table):

Name Description
Date & Time: The date and time the test ended.
Status

The status of the test.

  • Aborted. You stopped the test during initialization.
  • Stopped. You stopped the test and no SLAs were broken.
  • Failed. The test has ended and one or more SLAs were broken.
  • Halted. StormRunner Load stopped the test due to a security limitation.
  • Passed. The test has ended without any failed transactions.
  • System error. StormRunner Load was unable to run the test due to a server error.
Duration: The duration of the test, including ramp-up and tear-down.
Vusers

The number of Vusers that ran the test.

Note: There is a row for each user type that ran the test.

Average Throughput: The average amount of data received from the server per second.
Total Throughput: The total amount of data received during the test.
Average Hits Per Sec: The average number of hits (HTTP requests) to the Web server per second.
Total Hits Per Sec: The total number of hits (HTTP requests) to the Web server during the test.
Total Transactions Passed: The total number of transactions that completed successfully.
Total Transactions Failed: The total number of transactions that did not complete successfully.
Total Errors: The total number of script errors.
Total Passed SLA: The total number of SLAs that were met.
Total Failed SLA: The total number of SLAs that were breached.
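
The average and total fields above are related by simple arithmetic over the run duration. The following is a minimal sketch of that relationship, assuming the averages are computed over the full test duration; the values are hypothetical examples, not data from StormRunner Load.

  # Hypothetical summary values for a 10-minute run.
  test_duration_sec = 600
  total_throughput_bytes = 1_200_000_000   # Total Throughput
  total_hits = 90_000                      # total HTTP requests during the test

  # Averages over the full duration (assumption for illustration).
  average_throughput = total_throughput_bytes / test_duration_sec   # bytes per second
  average_hits_per_sec = total_hits / test_duration_sec

  print(f"Average Throughput: {average_throughput:,.0f} bytes/sec")   # 2,000,000 bytes/sec
  print(f"Average Hits Per Sec: {average_hits_per_sec:.0f}")          # 150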

Top 10 Transactions

This table shows data for the top ten transactions by frequency and duration and includes the following data:

Name Description
Transaction: The transaction name.
Script: The script in which the transaction is included.
%Breakers: The percentage of transactions that broke an SLA threshold at any given time in the load test run.
SLA Status

The status of the SLA.

  • N/A. The transaction was not monitored.
  • Pass. The SLA was not broken.
  • Fail. The SLA was broken.
Avg.: The average time it took for a transaction to complete.
Min.: The minimum time it took for a transaction to complete.
Max.: The maximum time it took for a transaction to complete.
Std.: The standard deviation of transaction response time.
Passed: The number of transactions that ended with status Passed.
Failed: The number of transactions that ended with status Failed.
90%th: The average of the 90th percentile of transaction response time.
SLA Threshold: The SLA defined in the Load Test > SLA pane for the transaction.
90% Trend

The percentage increase or decrease, compared to the previous run, in the maximum 90th percentile transaction response time across all transactions.

Example:

Run 1: The 90th percentile is 5 seconds, so the 90th percentile trend is 0.

Run 2: The 90th percentile is 6 seconds, so the 90th percentile trend is -0.20 (a 20% increase).
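
To make the 90th percentile and trend values concrete, here is a minimal sketch in Python. It assumes the trend is computed as (previous - current) / previous, which reproduces the example above (5 seconds to 6 seconds gives -0.20), and uses a simple nearest-rank 90th percentile for illustration; the function names and response times are hypothetical, not StormRunner Load's internal code.

  import math

  def percentile_90(response_times):
      """Nearest-rank 90th percentile of a list of transaction response times."""
      ordered = sorted(response_times)
      index = max(0, math.ceil(0.9 * len(ordered)) - 1)
      return ordered[index]

  def trend_90(previous_p90, current_p90):
      """Negative values mean the 90th percentile is slower than in the previous run."""
      return (previous_p90 - current_p90) / previous_p90

  print(percentile_90([1.2, 1.5, 2.0, 2.2, 3.1]))   # 3.1 seconds

  run_1_p90 = 5.0   # seconds, 90th percentile in the previous run
  run_2_p90 = 6.0   # seconds, 90th percentile in the current run
  print(trend_90(run_1_p90, run_2_p90))             # -0.2, that is, a 20% increase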

Hits and Throughput per second

This graph shows the following data throughout the test duration:

  • Running Vusers
  • Hits per second
  • Throughput

The average and maximum values for these metrics are displayed in a table under the graph.

Total Passed and Failed Transactions per second

This graph shows the following data throughout the test duration:

  • Running Vusers
  • Passed transactions per second
  • Failed transactions per second

The average and maximum values for these metrics are displayed in a table under the graph.

