Data drive a test in ALM

Relevant for: GUI tests, API testing, and business process tests

This task provides a general overview of the steps involved in data driving a test with data stored in ALM. After you are familiar with the steps, you can perform many of them in the order you choose. Some steps may not be necessary in all cases.

Prerequisites

  1. Connect to ALM.

  2. Make sure that your test is saved in your ALM project.

  3. For GUI testing: Make sure you have a test that uses data table parameters from the Global sheet.

Back to top

Import data into a test (API testing only)

  1. In the Data pane, click the New Data Source button and select Excel.

  2. In the New/Change Excel Data Source dialog box, select the .xls or .xlsx file containing the data, and select the Allow other tools to override the data option.

  3. Click OK to import the data source into your test.
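The imported workbook is expected to have a header row naming each column (the data table parameters), with one test iteration per data row. As a rough illustration, the sketch below models such a workbook with plain Python structures standing in for the .xlsx file; the sheet name, column names, and values are invented for the example:

```python
# Minimal model of an Excel data source: sheet name -> rows,
# where the first row holds the column headings (parameter names).
workbook = {
    "Sheet1": [
        ["Username", "Password", "ExpectedResult"],  # header row
        ["alice", "secret1", "success"],             # iteration 1
        ["bob", "badpass", "failure"],               # iteration 2
    ],
}

def iterations(workbook, sheet):
    """Yield one dict per data row, keyed by column heading."""
    rows = workbook[sheet]
    headers = rows[0]
    for row in rows[1:]:
        yield dict(zip(headers, row))

for it in iterations(workbook, "Sheet1"):
    print(it["Username"], it["ExpectedResult"])
```

Each yielded dict corresponds to one iteration of the data-driven test, with values looked up by parameter name.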

Back to top

Data drive test steps (API testing only)

For details, see Assign data to API test/component steps.

Back to top

Export test iteration parameter data to Excel

For details, see Export component parameters to an Excel file.

Back to top

Create a data resource file

  1. In ALM, in the Test Resources module, expand the Resources tree and select the required node.

  2. Select Resources > New Resource to add a resource under that node.

  3. In the New Resource dialog box:

    • In the Type list, select Data table.

    • In the Name box, enter a name for the data resource, for example, the name of the Microsoft Excel (.xls or .xlsx) file you plan to use.

    • Fill in the remaining fields (optional) and click OK to close the dialog box.

  4. In the Resource Viewer tab, click Upload File. Then browse to and upload the relevant .xls or .xlsx file.

    Tip: You can convert an internal data table from an open test to an uploadable data resource file by right-clicking the Data pane, selecting File > Export, saving the data table to the file system as an .xls or .xlsx file, and then uploading it as described above.

Back to top

Specify a default data table for new test configurations

  1. In the Parameters tab of the Test Plan module, select the data table resource you want to use as the default for all test configurations.

    For a GUI test, if you do not specify a data table resource, the data specified in the Resources pane of the Test Settings dialog box (File > Settings) is used instead.

    Note: In this tab, only the Parameter Name column is relevant for GUI tests.

  2. Click the Map Parameters button. In the Map Parameters dialog box, map the data table parameters (column headings) to the test parameters by entering the matching data table parameter names in the Resource Parameter Name column:

    • For an API test, type the Resource Parameter Name in the following format: <sheet name>.<column name>
    • For GUI tests, select the parameter from the list.

    All new configurations use the default mappings unless you specify otherwise in the Data tab of the Test Configurations tab.
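Conceptually, the mapping ties each test parameter to one workbook column, and at run time each iteration reads that column's value for the current row. The sketch below illustrates this resolution using the `<sheet name>.<column name>` format described above; the parameter names (`p_user`, `p_pwd`), sheet, and columns are hypothetical:

```python
# Data table modeled as sheet -> column -> list of values (one per iteration).
data_table = {
    "Sheet1": {
        "user": ["alice", "bob"],
        "pwd": ["secret1", "secret2"],
    },
}

# Parameter mapping as entered in the Map Parameters dialog box:
# test parameter -> "<sheet name>.<column name>" (API-test format).
mapping = {
    "p_user": "Sheet1.user",
    "p_pwd": "Sheet1.pwd",
}

def resolve(mapping, data_table, iteration):
    """Return the test-parameter values for one iteration (0-based)."""
    values = {}
    for param, resource in mapping.items():
        sheet, column = resource.split(".", 1)
        values[param] = data_table[sheet][column][iteration]
    return values

print(resolve(mapping, data_table, 0))  # first iteration's values
```

For GUI tests you select the parameter from a list instead of typing the `<sheet>.<column>` string, but the underlying mapping works the same way.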

Back to top

Define test configurations

Define test configurations for various run sessions. For each configuration, you specify whether to use the default resource file that you specified in the previous step, or whether to use a different data resource file.

  1. In the ALM Test Plan module, browse to and select the test to associate with your data table resource.

  2. With the test selected, click the Test Configurations tab. A default configuration is displayed in the grid. This configuration was created when your test was added to the ALM project.

  3. In the bottom pane of the Test Configurations tab, click the Data tab.

  4. In the Data tab, select the Override test data resource checkbox to use a different data resource file from the Test Resources module, or leave the checkbox cleared to use the default resource file you selected in the Parameters tab in the previous step.

  5. In the Data Resource box, browse to and select the relevant data resource file to associate with this configuration. (Relevant only if you selected the Override test data resource checkbox.)

  6. Click the Data Resource Settings button, and in the Filter Settings dialog box:

    • Map the data table parameters from your test to the column headers in the data table file. (Relevant only if you selected a different data resource file in the previous step.)

      For an API test, type the Resource Parameter Name in the following format: <sheet name>.<column name>

      For GUI tests, select the parameter from the list.

    • Apply filter conditions (text strings), as needed. You can apply one filter condition to each parameter.

    • Specify the rows on which to run iterations. For example, if you run a configuration named Gold, and users of this type are listed in rows 2-114, then specify these rows only.

    Note: If you apply filter conditions and specify rows, AND logic is used, meaning that the parameter value must equal the filter text value and the parameter value must be located in one of the specified rows.
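The combined effect of filter conditions and row ranges can be sketched as follows. This is an illustrative model only, not ALM's implementation; the parameter name `AccountType`, the filter text `Gold`, and the row numbers are invented, with row numbering following the spreadsheet convention (row 1 is the header, so data starts at row 2):

```python
# Data rows keyed by spreadsheet row number (row 1 is the header).
rows = {
    2: {"AccountType": "Gold", "User": "alice"},
    3: {"AccountType": "Silver", "User": "bob"},
    4: {"AccountType": "Gold", "User": "carol"},
}

def select_iterations(rows, filters, row_range):
    """AND logic: a row is run only if it falls inside row_range AND
    every filtered parameter's value equals its filter text."""
    lo, hi = row_range
    selected = []
    for num, row in sorted(rows.items()):
        if lo <= num <= hi and all(row[p] == v for p, v in filters.items()):
            selected.append(num)
    return selected

print(select_iterations(rows, {"AccountType": "Gold"}, (2, 4)))  # -> [2, 4]
```

Note that a row failing either condition (outside the range, or not matching the filter text) is skipped, which matches the AND behavior described in the note above.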

Back to top

Link configurations to requirements

If you want to make sure that your requirements are fully covered, link them to configurations. This enables you to select configurations to run based on requirement coverage when you plan your run session.

  1. In the Test Plan module, click the Req Coverage tab.

  2. Click the Select Req button. The Requirement Tree tab is displayed in the right pane.

  3. From the Requirement Tree tab, select a requirement to add to the Req Coverage grid. When you add the requirement, the Add Advanced Coverage dialog box opens.

  4. Select the test configurations that cover this requirement.

Back to top

Run test configurations

  1. In OpenText Functional Testing, make sure that in the Tools > Options > GUI Testing tab > Test Runs node, the Allow other products to run tests and components option is selected.

  2. In the ALM Test Lab module, select or create a test set.

  3. In the right pane, select the Execution Grid tab.

  4. Click the Select Tests button to display the Test Plan Tree and Requirements Tree tabs in the right pane.

  5. Do one of the following to select the configurations to run:

    • From the Test Plan Tree tab, select a test to add to the Execution Grid. When you add the test, all of its configurations are added to the Execution Grid. (The test itself is not added to the Execution Grid because ALM runs configurations, not tests.)

    • Below the Test Plan Tree tab, expand the Test Configurations pane, and add the specific configurations that you want to run to the Execution Grid.

    • Below the Requirements Tree tab, expand the coverage pane, and select a test to add to the Execution Grid. When you add the test, all of its configurations are added to the Execution Grid. (The test itself is not added to the Execution Grid because ALM runs configurations, not tests.)

  6. Click the Run button to run the selected configurations.

  7. After the run session, click the Launch Report button in the Last Run Report tab to view the results.
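The selection behavior in step 5 can be summarized in a small sketch: because ALM runs configurations rather than tests, adding a test to the Execution Grid expands to all of its configurations. The test and configuration names below are hypothetical:

```python
# Test -> its configurations (hypothetical names for illustration).
tests = {
    "LoginTest": ["LoginTest_Default", "Gold", "Silver"],
}

def add_test_to_execution_grid(grid, tests, test_name):
    """Adding a test adds every one of its configurations to the grid;
    the test itself is never an entry in the Execution Grid."""
    grid.extend(tests[test_name])
    return grid

grid = add_test_to_execution_grid([], tests, "LoginTest")
print(grid)
```

To run only some configurations, you add them individually from the Test Configurations pane instead of adding the whole test, as described in step 5.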

Back to top