Dashboard templates

Several built-in dashboard templates are available with pre-configured widgets. Each template contains the widgets that are most relevant for specific roles, such as a project or QA manager.

Use a dashboard template

You can add widgets from a template to a new dashboard or an existing dashboard.

To use a dashboard template:

  1. Select a new or existing dashboard to which you want to add widgets from a template:

    New dashboard: In the Dashboard module, on the toolbar, click + to open a new dashboard tab and select + Create new dashboard.
    Existing dashboard: Open the existing dashboard and click + Add widget in the dashboard toolbar. In the Widget Gallery, click Dashboard Templates.
  2. Click a template to view a preview of the template.

    The template preview is read-only. The widgets include demo data, not your actual workspace data.

  3. Hover over each widget title to learn about the graphs, their configuration, and content. Some graphs have additional icons as described in the table below. Note that the icons are not active in the demo template.

    Icon            Description
    Insights        Indicates that the graph has insights that allow you to drill down to the graph's results.
    Choose columns  Select which columns to display in this widget.
    Filter applied  Indicates that a filter has been applied to this widget. Hover over the icon to view the filter.

  4. After you determine which template fits your needs, select the widgets you want to include in the dashboard. Select or deselect a widget by clicking the checkbox in its upper left corner.

    Tip: By default, all widgets in the template are selected. To clear the selection, click Clear selection in the lower left corner.

  5. Click Use template. The widgets you selected are added to the dashboard, and you can configure and filter each of the widgets. For details, see Configure widget settings.


Value Stream Management template

The Value Stream Management template is most suitable for PMs and team leads. It includes widgets with a recommended set of KPIs that reflect the current value stream management status.

The widgets include measurements and insights that help you assess whether the development speed, quality, and flow are improving. This helps you evaluate how a value stream is performing.


Agile Quality template

The Agile Quality template is most suitable for QA managers who need to understand where the bottlenecks are in the testing process and identify problematic areas. The Agile Quality dashboard gives you the ability to:

  • View quality metrics to track quality release criteria.
  • Determine investment areas and testing coverage to optimize your testing strategy.
  • Identify risk areas so you can focus and optimize testing activities in high-risk areas.
  • Track automation stability and progress, and improve automation suites.
  • Continuously improve your development cycle by identifying bottlenecks in QA processes and tracking quality trends across releases.

The Agile Quality template contains OpenText best practices and a recommended set of KPIs for lifecycle management in Agile & DevOps. The widgets include the tracking and insights needed by various personas, providing the visibility required to manage, learn, and optimize the application delivery process.

The following KPIs are included in this dashboard template:

KPI Description

Daily Defects Inflow/Outflow

How many defects were fixed or opened on a given day?

Quality by application module

Which application modules have quality issues? Which areas do not fit my release quality criteria?

Active defects across releases trend

How many defects with high or critical severities existed per release?

Risky commits per application module

How many code changes were made per application module? How many of these code changes are considered risky commits?

CI automation test per application module

How many CI automated tests were executed in the selected timeline for each application module?

Manual test execution per application module

How many manual tests were executed in the selected timeline for each application module?

Story points per application module

What was the progress of the planned story points for each application module?

Test strategy coverage per application module

How many tests were used to cover each application module? What is the percentage of automation for each module?

Open defects per application module

How many defects are open for each application module?

Main CI test failure trends

What is the trend of failed test runs in the CI pipeline? How many of them failed due to instability? Are we handling test failures immediately or do we have unhandled tests that are continuously failing?

Aging failed tests per application module

Which tests in my automation suite are failing for extended periods of time, per application module?

Problematic tests

Which are the problematic tests in my automation suite? Which type of problem is the most common?

Manual to automation trend

What was the trend of manual tests that were converted to automated tests?

Trend of manual test runs in the release timeline

What is the trend of manual test execution during the release timeline?

Automated tests with long durations

How many tests had a long duration, exceeding 5 minutes?

Tip: You can change the default of 5 minutes by adjusting the Duration filter applied to the widget (enter a value in milliseconds).

Planned content progress

What was the progress of planned features per milestone, sprint, or release?

Features without regression test coverage

How many features are already in the In Testing or Done phases but do not have covering tests?

Defect resolution time by phase across releases

How much time did defects spend in each phase, across releases? Where were my bottlenecks?

Coverage per environment

What is the latest status of test runs per environment?

Escaped defects across releases

What is the trend of escaping defects across releases? Are we improving quality?
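To illustrate the kind of measurement behind a widget such as Daily Defects Inflow/Outflow, the following sketch counts defects opened and fixed per day from a list of defect records. The record fields (`opened`, `fixed`) are illustrative assumptions, not the product's actual data model.

```python
from collections import Counter
from datetime import date

# Hypothetical defect records; `fixed` is None while the defect is still open.
defects = [
    {"id": 1, "opened": date(2024, 5, 1), "fixed": date(2024, 5, 3)},
    {"id": 2, "opened": date(2024, 5, 1), "fixed": None},
    {"id": 3, "opened": date(2024, 5, 3), "fixed": date(2024, 5, 3)},
]

# Inflow: defects opened per day. Outflow: defects fixed per day.
inflow = Counter(d["opened"] for d in defects)
outflow = Counter(d["fixed"] for d in defects if d["fixed"] is not None)

print(inflow[date(2024, 5, 1)])   # 2 defects opened on May 1
print(outflow[date(2024, 5, 3)])  # 2 defects fixed on May 3
```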


Release Management template

The Release Management template is most suitable for project managers who need to understand how things are progressing and if the projected dates are correct.

This dashboard template contains the following widgets:

Widget Description

Release cumulative flow per all phases

Where are the bottlenecks in the development process? Are we handling too many tasks?

Release forecast

How much work was done? When are we forecasted to complete all the planned work?

Feature status in order of rank

How are my team’s features progressing? Are we working according to priority?

Features WIP per team

How many features are in-progress per team? Is the work in progress balanced and according to the team capacity?

Features with open vulnerabilities

Which features have open vulnerabilities? A high number of open vulnerabilities indicates a major risk for the release.

Velocity tracking

How many user stories are we expected to complete? Are we working according to the expected velocity?

Throughput through the release timeline

What is the cumulative number of story points completed by each team throughout the release timeline?

Feature cycle time by phase

How much time did features spend in each phase? Where are my bottlenecks?

Tip: This graph has insights, indicated by the Insights icon. For details, see Insight cards.

Defect resolution time by phase

How much time did defects spend in each phase? Where are my bottlenecks?

Tip: This graph has insights, indicated by the Insights icon. For details, see Insight cards.

Defect daily injection

At what rate are defects being added or updated? Use this alongside the Release burn down graph to help you determine how much work should be planned for resolving defects vs. implementing user stories.

Automation effort per team

What is the effort of automation per team? Use quality stories to reflect the automation effort and measure the story points of automation per team.

Active defects per team

How many High & Critical open defects are there for each team? Is it balanced and in accordance with the team’s capacity?


Cross Release Trends & Analytics template

The Cross Release Trends & Analytics template is most suitable for managers who need to understand trends in quality, speed, and efficiency in order to evaluate development costs. This template also includes several widgets with DevOps Research and Assessment (DORA) metrics.

This dashboard template contains the following widgets:

Widget Description

Feature cycle time across releases

How does my average cycle time compare across releases? Am I seeing a positive or a negative trend?

Feature cycle time by phase across releases

How much time did features spend in each phase, across releases? Where were my bottlenecks?

Defect resolution time across releases

How does my average resolution time compare across releases? Am I seeing a positive or a negative trend?

Defect resolution time by phase across releases

How much time did defects spend in each phase, across releases? Where were my bottlenecks?

CI Automation failure trend over time

Track the trend of automation failure causes. What is the instability trend? How often do high numbers of failures stem from regressions? Do we resolve test failures immediately, or do we have a high number of continuously failing tests over time?

Automation ROI

What is the ROI of automation? Does a high investment in automation result in improved key indicators such as speed, cost, and efficiency?

Escaped defects across releases

What is the trend of escaping defects across releases? Are we improving quality?

Defects trend across releases

Track the trend of High & Critical defects. Does the trend show improvement? Are we managing to reduce the defect backlog and improve quality?

Done story points per team across releases

Track the throughput of the teams across releases. Has velocity improved over time? Are we managing to deliver more content within a release timeline?

Investment in content vs. quality across releases

What is the percentage of done backlog items per release, per type? How much is invested in new content vs. resolving defects and automation effort?

Done story points per investment area across releases

The done story points per investment area, across releases.

User stories moving to Done trend

Track whether the number of user stories reaching the Done phase is steady over time.

Commit trends

Is the flow of commits steady over time? Watch for peaks in the number of commits, as they may impact system stability.

Defects Opened trend

How many defects are opened per day? Watch for peaks in the number of defects, as they may impact system stability. Is there any correlation with the number of commits?
Aging defects per application module (>100 days)

How many defects are open for more than 100 days without changing phase? Which application module has a high number of aging defects?

Tip: You can change the default value of 100 days by adjusting the Days in phase filter applied to the widget.
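To illustrate what a widget like Aging defects per application module counts, here is a minimal sketch that flags defects whose current phase has not changed for more than a threshold number of days, grouped by module. The record fields and module names are assumptions for illustration, not the product's data model.

```python
from collections import Counter
from datetime import date

THRESHOLD_DAYS = 100  # matches the widget's default Days in phase filter

today = date(2024, 6, 1)
# Hypothetical open defects with the date each entered its current phase.
defects = [
    {"module": "Billing", "phase_entered": date(2024, 1, 10)},
    {"module": "Billing", "phase_entered": date(2024, 5, 20)},
    {"module": "Search",  "phase_entered": date(2023, 12, 1)},
]

# Count defects per module that have sat in the same phase too long.
aging_per_module = Counter(
    d["module"]
    for d in defects
    if (today - d["phase_entered"]).days > THRESHOLD_DAYS
)

print(dict(aging_per_module))
```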

DORA widgets

The Cross Release Trends & Analytics template includes some of the common DevOps Research and Assessment (DORA) metrics in several dashboard widgets. These metrics are relevant for release processes and release process items. Adjust the time range and change the resolution to days or weeks, depending on your methodology.

Widget Description
DORA: Deployment Frequency

How often do we release successfully to production? How many runs of the deployment process are completed per month?

Default: Last year

DORA: Lead Time for Changes in days

How long does it take for a code commit to reach the user in production? How long does it take to get from the In Progress phase to Done within Backlog items?

Default: BLIs in the last year

DORA: Mean Time to Recovery in days

How long does it take to recover from a service interruption in production? To see meaningful results, configure the filter to only show defects that represent service incidents.

Default: Last 3 months

DORA: Change Failure Rate

How often do changes lead to a failure in production that requires a hotfix? To see meaningful results, set the first data set to represent the hotfixes in production and the second data set to represent the total number of regular deployments.

Default: Last year
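As an illustration of how these four DORA metrics are derived, the following sketch computes them from sample release data. The records, field names, and time window are assumptions for the sketch, not the widgets' actual configuration.

```python
from datetime import date

# Hypothetical successful production deployments in a 30-day window.
deployments = [
    {"date": date(2024, 5, 2),  "is_hotfix": False},
    {"date": date(2024, 5, 9),  "is_hotfix": True},
    {"date": date(2024, 5, 16), "is_hotfix": False},
    {"date": date(2024, 5, 30), "is_hotfix": False},
]
# Backlog items: days from the In Progress phase to Done.
lead_times_days = [3, 5, 8, 2]
# Service incidents: days from detection to recovery.
recovery_days = [0.5, 1.5]

days_in_period = 30
deployment_frequency = len(deployments) / days_in_period          # per day
lead_time_for_changes = sum(lead_times_days) / len(lead_times_days)
mean_time_to_recovery = sum(recovery_days) / len(recovery_days)
# Change failure rate: hotfix deployments out of all deployments.
change_failure_rate = (
    sum(d["is_hotfix"] for d in deployments) / len(deployments)
)

print(lead_time_for_changes)   # 4.5 days
print(change_failure_rate)     # 0.25
```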


See also: