AI-based testing (tech preview)

Supported for web and mobile testing

This topic explains how to use Artificial Intelligence (AI) Features in your UFT Developer tests to identify objects the way a person would. This enables you to run the same test on different platforms and versions, regardless of the objects' implementation.

AI-based testing overview

Using AI, UFT Developer identifies objects visually, as humans do. Object descriptions can include the control type, associated text, and, if there are multiple identical objects, an ordinal position.

Some advantages of AI-based object identification are: 

  • Tests are technology agnostic, identifying objects visually, regardless of the technology details used behind the scenes.

  • Tests are easier to maintain. An object that changes location, framework, or even shape, is still identified, as long as the object remains visually similar or its purpose remains clear.

Using AI in UFT Developer is computationally intensive. We recommend using a powerful computer to benefit from UFT Developer's AI Features and achieve optimal performance. For recommended system requirements, see Support Matrix.

AI-based testing is supported in Java and JavaScript tests. For details, see the Java and JavaScript SDK references. Describe AI test objects and then use them in your tests. You can use AI-based test objects and technology-specific, property-based test objects in the same test.
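
For example, a minimal Java sketch mixing the two kinds of test objects might look like the following. It assumes a Browser test object named browser has already been obtained (see Set the AI object context below); the button texts and the property-based Button description are illustrative only.

// Imports assumed: com.hp.lft.sdk.web.*, com.hp.lft.sdk.ai.*
// 'browser' is a Browser test object obtained earlier in the test.

// AI-based step: identify the button visually by its type and associated text.
browser.describe(AiObject.class,
        new AiObjectDescription(AiTypes.BUTTON, "Sign in")).click();

// Technology-specific, property-based step in the same test:
// identify a web Button by its implementation properties (illustrative description).
browser.describe(Button.class, new ButtonDescription.Builder()
        .buttonType("submit")
        .tagName("BUTTON")
        .build()).click();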

Run the AI Object Inspection interface to inspect your application, or mockup images of an application, and identify objects. Create the steps you want to test, and automatically generate code to add to your test.

You can open AI Object Inspection from the Windows Start menu or from the UFT Developer menu in your IDE (Eclipse or IntelliJ IDEA).

UFT Developer 2023 or earlier: AI Object Inspection is called AI Inspection.

Prerequisites

To use UFT Developer's AI capabilities, the following requirements must be met: 

  • UFT Developer is installed on a computer with a supported operating system and the required Windows features installed. For details, see AI-based testing.

  • AI Features are enabled in the UFT Developer runtime engine settings, in the AI tab. This setting is enabled by default.

Prepare your test project and create a web or mobile test.

To start identifying AI objects to use in your test, continue with one of the following tasks:

  • Inspect your application for objects

  • Inspect application mockups for objects

Inspect your application for objects

Use AI Object Inspection to identify objects in your application that you can use in your test steps.

This section describes the prerequisites for inspection, the inspection process, and how you can provide feedback on the identification, helping to shape the future of AI-based testing.

Prerequisites

Testing an application on a mobile device
  1. Make sure the Mobile Add-in is installed and loaded.

    To test web applications on your mobile device, make sure the Web Add-in is installed and loaded.

  2. Connect to Digital Lab (UFT Mobile) and open your mobile device and the application you want to test. For details, see Create mobile tests.

Testing a desktop web application
  1. Make sure the Web Add-in is installed and loaded.

  2. Make sure that your browser is set up with the OpenText UFT Agent extension. For details, see Set up web browsers and controls.

    UFT Developer 2023 and earlier: The extension is named Micro Focus UFT Agent.

  3. Open the application.

    For best identification results:

    • Maximize the browser window and set the zoom level to 100%.
    • Set your Windows display settings to display all text and apps at 100%.
    • Make sure no banners are displayed in the browser aside from your web page.

Supported browsers: Chrome, Edge, Firefox, Internet Explorer.

Headless browsers are not supported.

Identify all AI objects displayed in your application

  1. Open the AI Object Inspection window to inspect your application and detect all AI objects in it.

    To open the window, do one of the following: 

    • From the Windows Start menu, select AI Inspection.

    • In your IDE, click the AI Inspection toolbar button.

      UFT Developer 2021: The AI Inspection option is only available when the UFT Developer runtime engine is running.

  2. Click Select Application.
  3. Click a web application or the remote access window displaying your mobile application.

    Tip:  

    Press ESC to return to the AI Object Inspection window without selecting an application.

    Press CTRL to use the mouse for other operations on the screen before selecting an application.

    The Live Application tab displays the current screen of the application, highlighting all of the detected objects.

  4. Decide what type of objects you want to see: 

    Visual Element: The objects that UFT Developer detected visually.

    Text: Areas of text in the application.

    You can choose to view one or the other, or both.

  5. After identifying the AI objects, you can Add AI-based steps to your tests.

Help design the future of AI-based testing in UFT Developer

Click How is the detection? Help us to improve to open the Feedback Tool and send OpenText feedback about the object detection. For details, see the AI-based testing Feedback Tool topic in the UFT One Help Center.

Inspect application mockups for objects

Use AI Object Mockup Inspection to inspect application mockups and identify objects to use in your test. This enables you to design and prepare your test even before your application is fully developed.

This section describes the prerequisites for inspection, the inspection process, and how you can provide feedback on the identification, helping to shape the future of AI-based testing.

UFT Developer 2023 or earlier: AI Object Mockup Inspection is called AI Mockup Inspection.

Prerequisites

Instead of opening a web or mobile application, make sure you have a local folder that contains your mockup images in .jpg, .jpeg, or .png format.

Open the AI Object Inspection window to the Mockup Images tab

You can access the AI Object Inspection Mockup Images tab from within your IDE or from the Windows Start menu.

In your IDE

(UFT Developer 2021 R1 and later)

Click the AI Mockup Inspection toolbar button or select the option in the UFT Developer menu.

In your IDE

(UFT Developer 2021) 

  1. Click the AI Inspection toolbar button to open the AI Inspection window.

  2. Click Select Application and click on a web application (in a browser with the Micro Focus UFT Agent extension enabled).

    The Live Application tab displays the current screen of the application, highlighting all of the detected objects.

  3. Switch to the Mockup Images tab.

From the Windows Start menu
  1. Open AI Inspection.

  2. Click Select Application and click on a web application (in a browser with the OpenText UFT Agent extension enabled).

    UFT Developer 2023 and earlier: The extension is named Micro Focus UFT Agent.

    The Live Application tab displays the current screen of the application, highlighting all of the detected objects.

  3. Switch to the Mockup Images tab.

Inspect application mockups

After opening the AI Object Inspection window to the Mockup Images tab, perform the following steps:

  1. Select Web or Mobile as the inspection context and click Browse Folder to select a folder containing images.

    The AI Object Inspection window inspects the image that appears first in the file name order and highlights all identified visual elements.

  2. Decide whether to show Visual Element objects (objects that UFT Developer detected visually), Text objects (areas of text in the application), or both.

    Tip: If necessary, you can select Web or Mobile to change the AI identification context type.

  3. Click the down arrow near the folder icon to perform one of the following: 

    • Select a different folder

    • Synchronize the current folder in case its content changed

    • Open the current folder in Windows Explorer

  4. To show the image gallery and view all images in your folder, click the down arrow at the top of the pane. In the gallery, you can:

    • Navigate between images in the folder.

    • Search for a specific image.

    • Display images in a Grid View or Line View.

    • Sort images by file name or modification time.

  5. After identifying the AI objects, you can Add AI-based steps to your tests.

Help design the future of AI-based testing in UFT Developer

Click How is the detection? Help us to improve to open the Feedback Tool and send OpenText feedback about the object identification. For details, see AI-based testing Feedback Tool.

Add AI-based steps to your tests

You can add AI-based steps to your test using code generated in the AI Object Inspection window or manually, using the Java or JavaScript SDKs.

To use the code generated by the AI Object Inspection window, first make sure the correct Clipboard code language is selected in the settings. Click the cogwheel at the top right of the AI Object Inspection window, and select Java or JavaScript.

To add AI-based steps in your tests:

Set the AI object context

UFT Developer identifies AIObjects within the context of a web Browser object or a mobile Device object. Therefore, AIObjects are always described as children of Browser or Device objects. Before describing an AIObject, make sure to describe its parent.

For example, before you can use this AIObject, you have to describe the parent browser object in your test. 

Copy code
browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(com.hp.lft.sdk.ai.AiTypes.TWITTER)
        .locator(new com.hp.lft.sdk.ai.Position(com.hp.lft.sdk.ai.Direction.FROM_RIGHT, 0))
        .build()).click();
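
For instance, the following sketch shows the full sequence for a desktop web test. The browser launch and URL are assumptions for this example; in a mobile test, the parent would be a Device object obtained from the mobile SDK instead.

// Imports assumed: com.hp.lft.sdk.web.*, com.hp.lft.sdk.ai.*

// 1. Describe the parent context: here, a newly launched Chrome browser.
Browser browser = BrowserFactory.launch(BrowserType.CHROME);
browser.navigate("https://www.example.com"); // hypothetical page

// 2. Describe the AIObject as a child of that browser and perform the step.
browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(AiTypes.TWITTER)
        .locator(new Position(Direction.FROM_RIGHT, 0))
        .build()).click();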

Add steps from AI objects identified by AI Object Inspection

  1. Click an identified object in the AI Object Inspection window. This can be an object in the Mockup Images tab or the Live Application tab.

  2. In the dialog box that opens, modify the step action and edit the value field if relevant.

    UFT Developer 23.4: The Hover action is supported, but is not available in this dialog box. Add the step to your test and then manually change its action to Hover.

    The step generated for each object includes any information used to identify it uniquely, such as associated text or the object's ordinal position on the screen, if multiple identical objects are displayed. For details, see Associating text with objects and Identifying objects by relative location.

    To modify object identification details such as Text, Position, and Relation, click Edit, and make your changes in the Edit step pane on the right. You can define a position or a relation but not both.

    To add a position: 

    Click the + button for Position, then select a direction and a number.

    To add a relation (supported in UFT Developer 2022 and later): 

    Click the + button for Relation, then click an object in the application to use as an anchor. AI Object Inspection suggests a direction, which you can accept or modify.

    You can click the eye button to highlight the anchor in the application and make sure the correct object is selected.

    Note: Sometimes, an object can only be described uniquely by combining multiple properties, such as text together with a position or a relation.

    UFT Developer 2021 R1 and later: Edit the properties in your test after adding the step. In the Edit step pane, each property must describe the object uniquely. Otherwise, you cannot edit other properties in the pane or copy the step code.

    UFT Developer 2022 and later: You can continue to edit the properties in the Edit step pane, even if each property on its own does not uniquely describe the object. You can copy the step code only when the combination of properties provides a unique object description. If you move to another object before completing a unique object description, your edits are discarded, and the original description is used.

  3. Click Copy to copy the step, then paste the copied code into your test (see the example after these steps).

    Avoid adding steps with objects that were incorrectly identified, for example, a button identified as a text box or a check mark identified as a button. Such objects may be identified inconsistently and fail in subsequent test runs.
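
For reference, pasted steps typically look like the following sketches, which combine the control type with associated text or with an ordinal position. The texts and positions here are illustrative only.

// Imports assumed: com.hp.lft.sdk.ai.*; 'browser' is the parent Browser object.

// Step identified by control type and associated text.
browser.describe(AiObject.class,
        new AiObjectDescription(AiTypes.BUTTON, "ADD TO CART")).click();

// Step identified by control type and ordinal position:
// the second button from the right, using the zero-based index shown earlier.
browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(AiTypes.BUTTON)
        .locator(new Position(Direction.FROM_RIGHT, 1))
        .build()).click();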

Run the added step on your application

If you added the step from an object identified on a live application, you can click the Run on application button to run the step. This tests that your step is correct, and also advances the application to the next state, re-inspecting the application for the next step.

Note: For the step to run on the correct application, the application you are testing must be the most recent application that was in focus.

You can instruct UFT Developer to provide time for the application to load after running the step, before re-inspecting the application. Click the dot-menu icon next to the Run on application button and configure a delay.

Add verification steps to your test (optional)

You can add verification steps to check the existence of an object using code generated in the AI Object Inspection window or manually, using the Java or JavaScript SDKs.

To add verification steps from the AI Object Inspection window

  1. Click an identified object in the AI Object Inspection window. This can be an object in the Mockup Images tab or the Live Application tab.

  2. Select Verify as the action and select Exists or Does Not Exist to verify the existence of the object (a Does Not Exist sketch appears after these steps).

    Example verification step:

    Copy code
    com.hp.lft.report.Reporter.startReportingContext("Verify property: exists", com.hp.lft.report.ReportContextInfo.verificationMode());
    Verify.isTrue(browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.BUTTON, "ADD TO CART")).exists(), "Verification", "Verify property: exists");
    com.hp.lft.report.Reporter.endReportingContext();
  3. Click Copy to copy the step, then paste the step into your test.

The checkpoint passes if the application meets the expected condition. Otherwise, a step failure is reported in the run results.
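
A Does Not Exist verification follows the same pattern. The sketch below assumes a Verify.isFalse method alongside Verify.isTrue in the com.hp.lft.verifications package, and uses a hypothetical "SOLD OUT" label.

// A minimal sketch of a Does Not Exist verification; Verify.isFalse and the
// "SOLD OUT" label are assumptions for this example.
com.hp.lft.report.Reporter.startReportingContext("Verify property: does not exist",
        com.hp.lft.report.ReportContextInfo.verificationMode());
Verify.isFalse(browser.describe(AiObject.class,
        new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.BUTTON, "SOLD OUT")).exists(),
        "Verification", "Verify property: does not exist");
com.hp.lft.report.Reporter.endReportingContext();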

Inspect the next application page/screen

When you finish creating test steps for one page or screen in your application and you want to continue on another one, follow the steps below:

To inspect the next application page/screen

  1. Navigate to the desired location in the application.

  2. In the AI Object Inspection window, click Re-inspect to load the new application page or screen and reinspect it.

    • If you don't re-inspect and try to run a step from the AI Object Inspection window, UFT Developer runs the step based on the previous page's inspection, resulting in an error or performing the operation on the new page.

    • If multiple remote access windows or browser windows are open, the inspection session interacts with only one.

    • If you need to perform steps on the application to prepare it for inspection, you can use the Delayed re-inspect option.

      Click the down arrow near Delayed re-inspect and set the required delay.

      Click Go to begin the countdown, open the application, and perform steps such as hover or menu clicks to bring the application to the state you want to inspect.

      When the delay timer expires, the application is re-inspected.

  3. Add steps from the new page or screen to your test.

Run AI-based tests

After inspecting your application and creating test steps, run your AI-based test as you would run any other UFT Developer test. See Run UFT Developer tests.

You can run the same test on different operating systems and versions since it is not based on implementation details.

Troubleshooting

AI text identification requires the Windows mediaserver.exe service to be running. Otherwise, the following may occur:

  • AI Object Inspection cannot find objects By Text.
  • An error message indicates that an error occurred when calling the Media Server OCR service.

Solution: 

Open the Windows Services Manager and make sure the mediaserver.exe service is running. If it is not, start the service manually.
