Analysis and reporting

ALM Octane contains the information you need to analyze development quality. Track quality in the Backlog, Quality, and Pipelines modules, or in the Dashboard.

Tip: With OData support, you can also generate sophisticated reports that directly access ALM Octane entities to help with your analysis. For details, see OData support for extended reporting.
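
Because the OData service follows standard OData query conventions, you can also pull raw entity data into your own scripts or BI tools. The following Python sketch illustrates the idea; the root URL, entity name, field names, filter values, and credentials are assumptions for illustration only, so adapt them to the metadata that your ALM Octane OData service actually exposes.

Example:

import requests

# Hypothetical OData root URL -- substitute your own server, shared space,
# and workspace. Entity and field names are assumptions; check the $metadata
# document exposed by your ALM Octane OData service.
ODATA_ROOT = "https://myserver.example.com/odata/v4/shared_spaces/1001/workspaces/1002"

# Request open defects, newest first, selecting only a few fields.
params = {
    "$select": "id,name,severity",
    "$filter": "phase eq 'opened'",
    "$orderby": "creation_time desc",
    "$top": "50",
}

response = requests.get(
    f"{ODATA_ROOT}/defects",
    params=params,
    auth=("my_user", "my_password"),  # authentication details depend on your setup
)
response.raise_for_status()

for defect in response.json().get("value", []):
    print(defect["id"], defect["name"])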

Analyze release quality

To analyze release quality, ensure that you associate tests and defects with features, user stories, and requirements, using:

  • The Backlog Coverage field in a test

  • The Covered requirement field in a test

  • The Feature field in a defect

Associating tests and defects enables ALM Octane to provide better quality analysis.
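
If you need to create these associations in bulk, you can also set them programmatically through the ALM Octane REST API rather than through the fields in the UI. The sketch below is a minimal illustration only: the server URL, IDs, authentication, entity name, and especially the covered_content field name are assumptions, so verify them against the REST API documentation for your ALM Octane version.

Example:

import requests

# Hypothetical values -- replace with your own server, shared space, workspace,
# IDs, and authentication. ALM Octane instances typically require a sign-in
# request to establish a session; that step is omitted here for brevity.
BASE = "https://myserver.example.com/api/shared_spaces/1001/workspaces/1002"
session = requests.Session()

test_id = "2005"     # the test to update (hypothetical ID)
feature_id = "1042"  # the feature the test should cover (hypothetical ID)

# Link the test to the feature through its backlog coverage relation
# (assumed here to be a field named covered_content).
payload = {
    "covered_content": {
        "data": [{"type": "feature", "id": feature_id}]
    }
}

response = session.put(f"{BASE}/manual_tests/{test_id}", json=payload)
response.raise_for_status()
print("Updated backlog coverage for test", test_id)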

To analyze release quality, use any of the following:

Backlog module

Use different grid columns including:

  • Defect count

  • Risky commit count

  • Test Coverage

    The Test Coverage widget summarizes the last unique native run statuses for each test. If you run a test with the same configuration as a previous run, ALM Octane updates the result with the new run.

    Hover over the widget to open the detailed run results, or click the View runs link to view a filtered list of the last runs.

Backlog module, Overview tab and Dashboard module

View test run results by adding different widgets, including:

  • Feature quality status

  • Open defects by feature

  • Defect daily injection

To see a specific epic or feature, select the node in the backlog tree. In the Dashboard, add a filter to display items relevant to your release.

Backlog module, Features tab

ALM Octane displays a risk icon for features associated with risky commits. You may want to increase testing on these features or postpone their release.

For more details, see Identify risky commits and features at risk.

Note: Charts and graphs based on test runs display only the following summarized run statuses: Passed, Failed, Planned, Skipped, and Blocked. Each individual, native test run status falls into one of these categories. This is for clarity of analysis.

Back to top

Transition from release to product quality

When you first create tests, you usually create them for a specific feature assigned to a release.

For example, you have a feature to improve the shopping cart area of the application. In the Backlog release tree, there is a feature node for the shopping cart. The feature contains user stories, and each user story has acceptance tests.

Example:  

Feature: Shopping Cart
   User Story: Adding to cart
      Test: Adding one item to cart
      Test: Adding multiple items to cart
      Test: Adding from wish list to cart

To track product quality, use the same tests. Group them in the context of an application module, such as product navigation:

Example:  

Application module: Navigation
   Sub-application module: Adding to cart
      Test: Adding one item to cart
      Test: Adding multiple items to cart
      Test: Adding from wish list to cart

You can also use the tests for other functional areas:

Example:  

Application module: Navigation
   Sub-application module: Adding to cart
      Test: Adding one item to cart
      Test: Adding multiple items to cart
      Test: Adding from wish list to cart
Application module: Multiple selections
   Test: Adding multiple items to cart
   Test: Deleting multiple items from cart

By doing this, you shift from testing release quality to testing product quality.

Back to top

Analyze product quality

To analyze product quality, ensure that you associate features, user stories, defects, and tests with application modules.

ALM Octane uses the test results and defect analysis to paint a picture of application health across releases. Using these measures, you gain a global view of your application's quality at any given point in time.

To analyze product quality, use any of the following:

Quality module tabs

Use different grid columns including:

  • Defect count

  • Risky commit count

  • Last runs

    The Last runs widget summarizes the last unique native run statuses for each test. If you run a test with the same configuration as a previous run, ALM Octane updates the result with the new run.

    Hover over the widget to open the detailed run results, or click the View runs link to view a filtered list of the last runs.

Quality module, Overview tab and Dashboard module

View test run results by adding different widgets, including:

  • Open defects by application module

  • Run status by application module

  • Quality by application module

To see a specific application module, select its node in the application module tree. In the Dashboard, add a filter to display items relevant to your application modules.

Back to top

Analyze build quality using pipeline data

ALM Octane collects information about pipeline runs, build failures, automated test runs, and SCM commits.

Link this information with your backlog and application modules to get a comprehensive picture of how your build quality relates to your release and product progress and quality.

Track and analyze pipelines in Dashboard widgets and in the Pipelines module.

DevOps and Analytics dashboard widgets

In the Dashboard, select widgets from the DevOps and Analytics section or create custom graphs based on pipeline information.

Create a custom widget, and in Scope > Item type, select Pipeline runs, Test runs, Test run history (automated), or Commits.

If you add tags to your pipeline runs, you can use a graph to view pipeline runs, divided according to your tags.

Example:  

  1. Add tags to your pipeline runs that describe different reasons for pipeline run failure. For example,

    • Database problem

    • Broken builds

    • Out of memory

    • Merge issues

  2. Use the Pipeline runs by failure type widget to see how many of your pipeline runs are failing for each reason you listed.
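
Outside the Dashboard, you can produce the same breakdown yourself once you have the pipeline run records and their tags, for example from an export. The following Python sketch is for illustration only; the record structure and tag values are hypothetical.

Example:

from collections import Counter

# Hypothetical export of failed pipeline runs and their failure-type tags.
# In practice, load this from a CSV or API export of your pipeline runs.
failed_runs = [
    {"pipeline": "nightly", "tags": ["Database problem"]},
    {"pipeline": "nightly", "tags": ["Out of memory"]},
    {"pipeline": "pull-request", "tags": ["Broken builds", "Merge issues"]},
    {"pipeline": "pull-request", "tags": ["Merge issues"]},
]

# Count how many failed runs carry each failure-type tag.
failure_counts = Counter(tag for run in failed_runs for tag in run["tags"])

for tag, count in failure_counts.most_common():
    print(f"{tag}: {count}")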

Automated test run history widgets

A custom graph based on the item type Test runs displays only the tests' last runs.

To create a graph that includes automated test runs from the past, select the Item type: Test run history (automated).

  • The graph displays the number of automated test runs per category. The Y Axis is set to Count, and you configure the category in the X Axis. For example, you can see the number of runs per pipeline, environment, or release.

  • To view the list of test runs, click a column or category in the graph.

Filter the graph by Pipeline run to see the automated test runs from specific pipeline runs.

Caution: If you filter based on Latest pipeline run, you can filter by Pipeline as well, but do not filter by Pipeline Run.

Analyzing builds in the Pipelines module

In the Pipelines module, you can find several layers of information.

  • The Live Summary tab provides a high-level live view of each pipeline's history, its current status, and its progress.

  • In the Pipelines tab, you can see all the pipelines that are being tracked, and filter to see the ones that interest you.

    You can see summary information about each pipeline's last completed run, and more detailed information about the selected pipeline.

  • You can open an individual pipeline or pipeline run to learn more about its status, its run history, related code changes, affected application modules, and more.

    You can also find analytic information about failed tests and tools to help you analyze failures.

For details, see Run pipelines.

Back to top

Performing quality analysis

You can check the quality of your application development.

In the Dashboard module, add widgets to assess quality:

Feature Quality Status

This enables you to see quality by comparing the total number of test runs and the status of each.

Open defects by severity

This enables you to see problem areas based on open defects and their severity.

Features by risky commits

This enables you to recommend further regression testing before finishing the release.

Quality by application module

This graph enables you to see the quality of each area of the application.

You can configure the Criteria used to determine the level of quality. For example, you can use the number of defects, the percentage of failed tests, or the percentage of risky commits.

Back to top

See also: