Dashboard templates
ALM Octane provides several built-in dashboard templates with pre-configured widgets. Each template contains the widgets that are most relevant for specific roles, such as a project or QA manager.
Use a dashboard template
To select a dashboard template, click Dashboard Templates in the toolbar.
The widgets displayed in the templates use demo data, not the data in your repository. To use a dashboard template with your repository, you must add the template to your own dashboard.
For details, see Dashboard.
Agile Quality dashboard
The Agile Quality dashboard is most suitable for QA managers who need to understand where the bottlenecks are in the testing process and to identify problematic areas. The Agile Quality dashboard gives you the ability to:
- View quality metrics to track quality release criteria.
- Determine investment areas and testing coverage to optimize your testing strategy.
- Identify risk areas, allowing you to optimize and focus testing activities in high risk areas.
- Track automation stability and progress, and improve automation suites.
- Continuously improve your development cycle by identifying bottlenecks in QA processes and tracking quality trends across releases.
The Agile Quality dashboard contains the OpenText best practices and recommended set of KPIs for lifecycle management in Agile & DevOps. The widgets include the tracking and insights needed by various personas, providing the visibility needed to manage, learn, and optimize the application delivery process.
The following KPIs are recommended:
KPI | Description
---|---
Daily Defects Inflow/Outflow | How many defects were fixed or opened on a given day?
Quality by application module | Which application modules have quality issues? Which areas do not fit my release quality criteria?
Active defects across releases trend | How many defects with high or critical severities existed per release?
Risky commits per application module | How many code changes were made per application module? How many of these code changes are considered risky commits?
CI automation test per application module | How many CI automated tests were executed in the selected timeline for each application module?
Manual test execution per application module | How many manual tests were executed in the selected timeline for each application module?
Story points per application module | What was the progress of the planned story points for each application module?
Test strategy coverage per application module | How many tests were used to cover each application module? What is the percentage of automation for each module?
Open defects per application module | How many defects are open for each application module?
Main CI test failure trends | What is the trend of failed test runs in the CI pipeline? How many of them failed due to instability? Are we handling test failures immediately, or do we have unhandled tests that are continuously failing?
Aging failed tests per application module | Track your automation suite to identify tests that have been failing for extended periods of time, per application module.
Problematic tests | Which are the problematic tests in my automation suite? Which type of problem is the most common?
Manual to automation trend | What was the trend of manual tests that were converted to automated tests?
Trend of manual test runs in the release timeline | What is the trend of manual test execution during the release timeline?
Defect resolution time by phase across releases | How much time did defects spend in each phase, across releases? Where were my bottlenecks?
Planned content progress | What was the progress of planned features per milestone, sprint, or release?
Features without regression test coverage | How many features already in the In Testing or Done phases do not have covering tests?
Automated tests with long durations | How many tests had a long duration, exceeding 5 minutes?
Coverage per environment | What is the latest status of test runs per environment?
Escaped defects across releases | What is the trend of escaped defects across releases? Are we improving quality?
Release Management dashboard
The Release Management dashboard is most suitable for project managers who need to understand how work is progressing and whether the projected dates are realistic.
The dashboard contains the following widgets:
Widget | Description
---|---
Release cumulative flow per all phases | Where are the bottlenecks in the development process? Are we handling too many tasks?
Release forecast | How much work was done? When are we forecasted to complete all the planned work?
Feature status in order of rank | How are my team’s features progressing? Are we working according to priority?
Features WIP per team | How many features are in progress per team? Is the work in progress balanced and in accordance with the team’s capacity?
Features with open vulnerabilities | Which features have open vulnerabilities? A high number of open vulnerabilities indicates a major risk for the release.
Velocity tracking | How many user stories are we expected to complete? Are we working according to the expected velocity?
Throughput through the release timeline | What is the cumulative number of story points completed by each team throughout the release timeline?
Feature cycle time by phase | How much time did features spend in each phase? Where are my bottlenecks? Tip: This graph has insights, indicated by the Insights icon. For details, see Dashboard templates.
Defect resolution time by phase | How much time did defects spend in each phase? Where are my bottlenecks? Tip: This graph has insights, indicated by the Insights icon. For details, see Dashboard templates.
Defect daily injection | At what rate are defects being added or updated? Use this alongside the Release burn down graph to help you determine how much work should be planned for resolving defects vs. implementing user stories.
Automation effort per team | What is the effort of automation per team? Use quality stories to reflect the automation effort and measure the story points of automation per team.
Active defects per team | How many high and critical open defects are there for each team? Is it balanced and in accordance with the team’s capacity?
Cross Release Trends & Analytics dashboard
The Cross Release Trends & Analytics dashboard is most suitable for managers who need to understand trends and evaluate the development costs.
The dashboard contains the following widgets:
Widget | Description
---|---
Feature cycle time across releases | How does my average cycle time compare across releases? Am I seeing a positive or a negative trend?
Feature cycle time by phase across releases | How much time did features spend in each phase, across releases? Where were my bottlenecks?
Defect resolution time across releases | How does my average resolution time compare across releases? Am I seeing a positive or a negative trend?
Defect resolution time by phase across releases | How much time did defects spend in each phase, across releases? Where were my bottlenecks?
CI Automation failure trend over time | Track the trend of automation failure causes. What is the instability trend? How often is there a high number of failures that are regressions? Do we resolve test failures immediately? Do we have a high number of continuously failing tests over time?
Automation ROI | What is the ROI of automation? Does a high investment in automation result in improved key indicators such as speed, cost, and efficiency?
Escaped defects across releases | What is the trend of escaped defects across releases? Are we improving quality?
Defects trend cross releases | Track the trend of high and critical defects. Does the trend show improvement? Are we managing to reduce the defect backlog and improve quality?
Done story points per team cross releases | Track the throughput of the teams across releases. Is velocity improving over time? Are we managing to deliver more content within a release timeline?
Investment on content vs quality across releases | What is the percentage of done backlog items per release, per type? How much is invested in new content vs. resolving defects and automation effort?
Done story points per investment area cross releases | How many story points were completed per investment area, across releases?
User stories moving to Done trend | Track whether the number of user stories reaching the Done phase is steady over time.
Commits trends | Is the flow of commits steady over time? Watch for peaks in the number of commits, as they may impact system stability.
Defects Opened trend | How many defects are opened per day? Watch for peaks in the number of defects, as they might impact system stability. Is there a correlation with the number of commits?
Aging defects per application module (>100 days) | How many defects have been open for too many days without changing phase? Which application module has a high number of aging defects?