Entity graph types

This section describes the graph types available for each entity type.

Requirement graphs

You can generate the following graphs for requirements:

Requirements Coverage Trend Graph

Shows how many requirements are currently in an ALM project, according to their test coverage status.

Progress Graph

Shows how many requirements accumulated in an ALM project at specific points during a period of time.

Specify the time interval displayed along the x-axis, and the requirement information by which ALM groups the data. Specify whether you want to view the number of requirements or the change in the number of requirements.

Summary Graph

Shows how many requirements are currently in an ALM project.

Specify the type of data displayed along the x-axis, and the requirement information by which ALM groups the data.

Trend Graph

Shows the history of changes to specific requirement fields in an ALM project, for each time interval displayed.

Specify the field for which you want to view the number of changes, and the time period for which you want to view data.

Each status change is recorded only once for the purpose of this graph. For example, if a field was changed from Not Completed to Passed and back to Not Completed, the change to Not Completed is recorded only once in this graph.
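
The once-only counting rule can be sketched as follows (illustrative Python, not ALM code; the function name and status values are assumptions for the example):

```python
def counted_changes(history):
    """Record each change target once, however many times the field
    flips to it (a sketch of the Trend Graph's once-only rule).
    `history` is the ordered list of values the field took."""
    seen, counted = set(), []
    for status in history[1:]:          # history[0] is the initial value
        if status not in seen:
            seen.add(status)
            counted.append(status)
    return counted

# Not Completed -> Passed -> Not Completed -> Passed:
# the repeat changes to Passed and Not Completed are not recorded again.
print(counted_changes(["Not Completed", "Passed", "Not Completed", "Passed"]))
# -> ['Passed', 'Not Completed']
```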

Test graphs

You can generate the following graphs for tests:

Progress Graph

Shows how many tests accumulated in an ALM project at specific points during a period of time.

Specify the time interval displayed along the x-axis and the test plan information by which ALM groups the data. Specify whether you want to view the number of tests or the change in the number of tests.

Summary Graph

Shows how many tests are currently in an ALM project.

Specify the type of data displayed along the x-axis, and the test plan information by which ALM groups the data.

Test Execution Status Trend Graph

Shows the history of execution status changes of tests in a project, at each point during a period of time.

Specify the test fields for which you want to view the number of changes, and the time period for which you want to view data.

Trend Graph

Shows the history of changes to specific Test Plan fields in an ALM project, for each time interval displayed.

Specify the field for which you want to view the number of changes, and the time period for which you want to view data.

Each status change is recorded only once for the purpose of this graph. For example, if a field was changed from Ready to Repair and back to Ready, the change to Ready is recorded only once in this graph.

Test instance graphs

You can generate the following graphs for test instances:

Progress Graph

Shows how many tests accumulated in test sets at specific points during a period of time.

Specify the time interval displayed along the x-axis, and the test information by which ALM groups the data. Specify whether you want to view the number of tests or the change in the number of tests.

If you create the graph in the Test Lab module, you can choose whether to include only the current test set or all test sets.

Summary Graph

Shows how many tests in an ALM project belong to test sets.

Specify the type of data displayed along the x-axis, and the test plan and test in test set information by which ALM groups the data.

If you create the graph in the Test Lab module, you can choose whether to include only the current test set or all test sets.

Test Execution Treemap

Shows the test execution status of each test set under the filtered test set folder. The treemap enables you to compare test sets by their failed ratio and by their total number of test instances.

In a treemap, each rectangle represents a test set:

  • Number of rectangles. To configure which test sets are included in the treemap, use the Test Set: Test Set Folder filter to select the test set folder whose test sets are included. See Configure test execution treemap graphs.

  • Rectangle size. Indicates the number of test instances in a test set.

  • Rectangle color. Indicates the failed ratio of test instances in a test set, that is, the percentage of failed test instances out of all executed test instances.

    • Red. By default, test sets whose failed ratio is higher than the tolerant value (0% by default, or your custom value) are displayed in red. Within this range, test sets with lower failed ratios are displayed in lighter red.

    • Green. By default, test sets whose failed ratio is equal to or lower than the tolerant value are displayed in green. Within this range, test sets with higher failed ratios are displayed in lighter green.

You can configure the treemap to define which types of statuses are grouped into failed and executed. See Configure test execution treemap graphs.

You can set your custom tolerant value and rectangle colors for the treemap. See Set treemap appearance.
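
The coloring rule can be summarized with a small sketch (illustrative Python, not part of ALM; the lighter/darker shading within each range is omitted, and the zero-executed case is an assumption):

```python
def treemap_color(failed, executed, tolerant=0.0):
    """Classify a test set rectangle by its failed ratio.
    `tolerant` is the tolerant value as a fraction (default 0%)."""
    if executed == 0:
        return "green"              # assumption: nothing executed, nothing to flag
    ratio = failed / executed       # failed ratio: failed / all executed instances
    return "red" if ratio > tolerant else "green"

print(treemap_color(failed=2, executed=10))                  # 20% > 0%  -> red
print(treemap_color(failed=0, executed=10))                  # 0% <= 0%  -> green
print(treemap_color(failed=1, executed=10, tolerant=0.15))   # 10% <= 15% -> green
```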

Planned vs. Actual - Last Test Run

Composed of a Plan line, a bar for each period, and an Actual line.

  • Plan line. Shows the ideal test execution progress, assuming the same execution rate in each period. It indicates the accumulated number of test instances that are planned to be executed by each period.

  • Bars. Show the number of test instances that are executed during each period.

    If a test instance is run multiple times, only the last run counts and is reflected in the bar for the last period.

    For example, you ran 5 instances on Day 1 and reran 2 of them on Day 2. If you check the graph on Day 1, you see a bar with 5 test instances for Day 1. If you check the graph on Day 2, you see a bar with 3 test instances for Day 1 and a bar with 2 test instances for Day 2.

  • Actual line. Accumulates the number of test instances that have been completed by each period.

We recommend you use this graph if you are interested in actual progress only.

Note: This graph is named Test Execution - Planned vs. Actual in versions earlier than ALM 16.0.1.

Planned vs. Actual - Test Run History

Composed of a Plan line, a bar for each day, and an Actual line.

  • Plan line. Shows the ideal test execution progress, assuming the same execution rate per day. It indicates the accumulated number of test instances that are planned to be executed per day.

  • Bars. Show the number of test instances that are executed per day.

    If a test instance is run multiple times, every run counts and is reflected in the bar for the corresponding day.

    For example, you ran 5 test instances on Day 1 and reran 2 of them on Day 2. If you check the graph on Day 1, you see a bar with 5 test instances for Day 1. If you check the graph on Day 2, you see a bar with 5 test instances for Day 1 and a bar with 2 test instances for Day 2.

  • Actual line. Accumulates the number of test instances that have been completed per day.

We recommend you use this graph if you are interested in both actual progress and actual daily workload.

Available in versions: 16.0.1 and later
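
The two Planned vs. Actual graphs differ only in whether a rerun replaces an instance's earlier runs. A sketch of both counting rules (illustrative Python, not ALM code), reproducing the examples above:

```python
from collections import Counter

def bars_last_run(runs):
    """Planned vs. Actual - Last Test Run: only each instance's
    last run counts, in the bar for that last period."""
    last_day = {}                                   # instance -> day of its last run
    for instance, day in runs:
        last_day[instance] = max(day, last_day.get(instance, day))
    return {day: n for day, n in sorted(Counter(last_day.values()).items())}

def bars_run_history(runs):
    """Planned vs. Actual - Test Run History: every run counts."""
    return {day: n for day, n in sorted(Counter(day for _, day in runs).items())}

# 5 instances run on Day 1; instances 1 and 2 are rerun on Day 2.
runs = [(i, 1) for i in range(1, 6)] + [(1, 2), (2, 2)]
print(bars_last_run(runs))     # {1: 3, 2: 2} - Day 1's bar drops to 3
print(bars_run_history(runs))  # {1: 5, 2: 2} - Day 1's bar stays at 5
```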

Trend Graph

Shows the history of changes to specific test instance fields in a project, at each point during a period of time.

Specify the field for which you want to view the number of changes, and the time period for which you want to view data.

Run graphs

You can generate the following graphs for test runs:

Summary Graph

Shows the status of runs in the project, grouped by Tester.

Test configuration graphs

You can generate the following graphs for test configurations:

Progress Graph

Shows how many test configurations accumulated in a project at each point during a period of time. The number of test configurations is displayed according to the criteria that you specify.

You can specify the time interval displayed along the x-axis, and the test configuration information by which data is grouped. You can also specify whether you want to view the history of the selected data field, and whether you want to view the number of test configurations or the change in the number of test configurations.

Summary Graph

Shows how many test configurations are currently in a project. The number of test configurations is displayed according to the criteria that you specify.

You can specify the type of data displayed along the x-axis, and the test configuration information by which data is grouped.

Trend Graph

Shows the history of changes to specific Test Configuration fields in a project, at each point during a period of time.

Specify the field for which you want to view the number of changes, and the time period for which you want to view data.

Defect graphs

You can generate the following graphs for defects:

Age Graph

Shows the lifetime of defects in an ALM project. The lifetime of a defect begins when it is reported, and ends when it is closed.

Specify the defect information by which ALM groups the data, and the data displayed along the y-axis. Specify the time interval that you want to use to divide the data.

The age of a Closed defect is the difference between the date on which it was reported and the date on which it was closed. After a defect is closed, its age remains static.
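
The age computation amounts to a simple date difference (illustrative Python, not ALM code; the fixed "today" date is an assumption so the example is reproducible):

```python
from datetime import date

def defect_age(reported, closed=None, today=date(2024, 6, 30)):
    """Age Graph rule: a defect's lifetime runs from the reported date
    until the defect is closed; after closure the age stays static."""
    end = closed if closed is not None else today
    return (end - reported).days

print(defect_age(date(2024, 6, 1), closed=date(2024, 6, 11)))  # 10 (static once closed)
print(defect_age(date(2024, 6, 1)))                            # 29 (still open, still aging)
```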

Cycle Time by Phase Graph

Shows how long defects remained in each phase.

Specify the field that defines phases, the end phases, and the data to display on the y-axis.
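
The per-phase durations come from the defect's status history; a sketch under assumed field names (illustrative Python, not ALM code):

```python
from datetime import date

def phase_durations(history, end_phases=("Closed",)):
    """Days spent in each phase, from an ordered (phase, entered) history.
    Accumulation stops once an end phase is reached."""
    durations = {}
    for (phase, entered), (_, left) in zip(history, history[1:]):
        if phase in end_phases:
            break
        durations[phase] = durations.get(phase, 0) + (left - entered).days
    return durations

history = [("New", date(2024, 6, 1)),
           ("Open", date(2024, 6, 3)),
           ("Fixed", date(2024, 6, 10)),
           ("Closed", date(2024, 6, 12))]
print(phase_durations(history))  # {'New': 2, 'Open': 7, 'Fixed': 2}
```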

Anomalies Graph

Shows how many defects remained in a specific phase for a specified duration. For example, how many defects remained in the Open status for more than 5 working days.

Specify the field that defines phases, the specific phase, the duration, and working days.
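
The working-day threshold can be sketched as follows (illustrative Python, not ALM code; Monday-Friday working days and the data shape are assumptions):

```python
from datetime import date, timedelta

def working_days(start, end):
    """Working days (assumed Mon-Fri) from `start` up to, not
    including, `end`."""
    days, d = 0, start
    while d < end:
        if d.weekday() < 5:      # 0-4 = Monday-Friday
            days += 1
        d += timedelta(days=1)
    return days

def anomalies(defects, phase="Open", min_days=5):
    """Count defects that stayed in `phase` for more than `min_days`
    working days. `defects` maps id -> (phase, entered, left)."""
    return sum(1 for p, start, end in defects.values()
               if p == phase and working_days(start, end) > min_days)

# Defect 1: Open Mon 2024-06-03 .. Mon 2024-06-17 -> 10 working days
# Defect 2: Open Mon 2024-06-03 .. Thu 2024-06-06 ->  3 working days
defects = {1: ("Open", date(2024, 6, 3), date(2024, 6, 17)),
           2: ("Open", date(2024, 6, 3), date(2024, 6, 6))}
print(anomalies(defects))   # 1
```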

Progress Graph

Shows the accumulation of defects in an ALM project, or the estimated/actual amount of time taken to fix these defects, at specific points during a period of time.

Specify the time interval displayed along the x-axis, the defect information by which ALM groups the data, and the data displayed along the y-axis. Specify whether you want to view the number of defects or the change in the number of defects.

Summary Graph

Shows a summary of the number of defects in an ALM project, or the estimated/actual amount of time taken to fix these defects.

Specify the type of data displayed along the x-axis, the type of data displayed along the y-axis, and the defect information by which ALM groups the data.

Trend Graph

Shows the history of changes to specific defect fields in an ALM project, for each time interval displayed.

Specify the field for which you want to view the number of changes, and the time period for which you want to view data. Each priority change is recorded only once for the purpose of this graph. For example, if a field was changed from Urgent to Very High and back to Urgent, the change to Urgent is recorded only once in this graph.

Back to top

Component graphs

You can generate the following graphs for business components:

Summary Graph

Shows how many components are currently in the ALM project. The number of components is displayed according to the criteria that you specify.

Specify the type of data displayed along the x-axis, and the test plan information by which ALM groups the data. By default, the graph appears as a bar chart.

Progress Graph

Shows how many components accumulated in a project at each point during a period of time. The number of components is displayed according to the criteria that you specify.

Specify the time interval displayed along the x-axis, and the Business Components information by which data is grouped.

You can also specify whether you want to view the history of the selected data field, and whether you want to view the number of components or the change in the number of components.

Trend Graph

Shows the history of changes to specific component fields in an ALM project, at each point during a period of time.

Specify the field for which you want to view the number of changes, and the time period for which you want to view data. The graph can be viewed as a bar chart only.
