AI-based testing (tech preview)

Supported for web and mobile testing

This topic explains how to use Artificial Intelligence (AI) Features in your UFT Developer tests to identify objects the way a person would. This enables you to run the same test on different platforms and versions, regardless of the objects' implementation.

AI-based testing overview

Using AI, UFT Developer identifies objects visually, as humans do. Object descriptions can include the control type, associated text, and an ordinal position, in case there are multiple identical objects.

Some advantages of AI-based object identification are: 

  • Tests are technology-agnostic: objects are identified visually, regardless of the technology details used behind the scenes.

  • Tests are easier to maintain. An object that changes location, framework, or even shape is still identified, as long as the object remains visually similar or its purpose remains clear.

Using AI in UFT Developer is computationally intensive. We recommend using a powerful computer to benefit from UFT Developer's AI Features and achieve optimal performance. For recommended system requirements, see Support Matrix.

AI-based testing is supported in Java and JavaScript tests. For details, see the Java and JavaScript SDK references. Describe AI test objects and then use them in your tests. You can use AI-based test objects and technology-specific, property-based test objects in the same test.
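
For example, a minimal Java sketch mixing both kinds of objects in one test might look like the following. The browser launch, the page URL, the edit field's name, and the button's AI class and text are assumptions made for illustration, not part of any real application:

import com.hp.lft.sdk.ai.AiObject;
import com.hp.lft.sdk.ai.AiObjectDescription;
import com.hp.lft.sdk.web.*;

// Assumption: Chrome is set up with the Micro Focus UFT Agent extension,
// and the page under test contains a "search" edit field and an "ADD TO CART" button.
Browser browser = BrowserFactory.launch(BrowserType.CHROME);
browser.navigate("https://www.example.com"); // placeholder URL

// Property-based test object: identified by technology-specific properties.
browser.describe(EditField.class, new EditFieldDescription.Builder()
        .name("search").build())
        .setValue("speakers");

// AI-based test object: identified visually by control type and associated text.
browser.describe(AiObject.class, new AiObjectDescription("button", "ADD TO CART"))
        .click();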

Run the AI Inspection interface to inspect your application, or mockup images of an application, and identify objects. Create the steps you want to test, and automatically generate code to add to your test.

You can open AI Inspection from the Start menu or from the UFT Developer menu in your IDE (Eclipse or IntelliJ IDEA).

Back to top

Prerequisites

To use UFT Developer's AI capabilities, the following requirements must be met: 

  • UFT Developer is installed on a computer with a supported operating system and the required Windows features installed. For details, see AI-based testing.

  • AI Features are enabled in the UFT Developer runtime engine settings, in the AI tab. This setting is enabled by default.

Prepare your test project and create a web or mobile test.

To start identifying AI objects to use in your test, see Inspect your application for objects or Inspect application mockups for objects.

Back to top

Inspect your application for objects

Use AI inspection to identify objects in your application that you can use in your test steps.

Prerequisites

If you are testing an application on a mobile device
  1. Make sure the Mobile Add-in is installed and loaded.

    To test web applications on your mobile device, make sure the Web Add-in is installed and loaded.

  2. Connect to UFT Mobile and open your mobile device and the application you want to test. For details, see Create mobile tests.

If you are testing a desktop web application
  1. Make sure the Web Add-in is installed and loaded.

  2. Make sure that your browser is set up with the Micro Focus UFT Agent extension. For details, see Set up web browsers and controls.

  3. Open the application.

    For best identification results:

    • Maximize the browser window and set the zoom level to 100%.
    • Set your Windows display settings to display all text and apps at 100%.
    • Make sure no banners are displayed in the browser aside from your web page.

Supported browsers: Chrome, Chromium-based Edge, Firefox, Internet Explorer.

Headless browsers are not supported.

Identify all AI objects displayed in your application

  1. Open the AI Inspection window to inspect your application and detect all AI objects in it.

    To open the window, do one of the following: 

    • Select Start > Micro Focus > AI Inspection.

    • In your IDE, click the AI Inspection toolbar button.

  2. Click Select Application.
  3. Click a web application or the remote access window displaying your mobile application.

    Tip:  

    Press ESC to return to the AI Inspection window without selecting an application.

    Press CTRL to use the mouse for other operations on the screen before selecting an application.

    The Live Application tab displays the current screen of the application, highlighting all of the detected objects.

  4. Select Visual Element, Text, or both to show the objects that UFT Developer detected visually, the areas of text detected in the application, or both.
  5. After identifying the AI objects, you can Add AI-based steps to your tests.

Help design the future of AI-based testing in UFT Developer

Click How is the detection? Help us to improve to open the Feedback Tool and send Micro Focus feedback about the object detection. For details, see the AI-based testing Feedback Tool topic in the UFT One Help Center.

Back to top

Inspect application mockups for objects

Use AI Mockup Identification to inspect application mockups and identify objects to use in your test. This enables you to design and prepare your test even before your application is fully developed.

Prerequisites

Instead of opening a web or mobile application, make sure you have a local folder that contains your mockup images in .jpg, .jpeg, or .png format.

Inspect application mockups

  1. Open the AI Inspection window from the Start menu or from your IDE:

    • Select Start > Micro Focus > AI Inspection.

    • In your IDE, click the AI Inspection toolbar button.

  2. Click Select Application and click a web application (in a browser with the Micro Focus UFT Agent extension enabled).

    The Live Application tab displays the current screen of the application, highlighting all of the detected objects.

  3. Switch to the Mockup Images tab.

  4. Select Web or Mobile as the inspection context and click Browse Folder to select a folder containing images.

    The AI Inspection window inspects the image that appears first in the file name order and highlights all identified visual elements.

  5. Select Visual Element, Text, or both to show the objects that UFT Developer detected visually, the areas of text detected in the application, or both.

    Tip: If necessary, you can select Web or Mobile to change the AI identification context type.

  6. To select a different folder, synchronize the current folder if its contents have changed, or open the current folder, click the down arrow next to the folder icon.

  7. To show the image gallery and view all images in your folder, click the down arrow at the top of the pane. In the gallery, you can:

    • Navigate between images in the folder.

    • Search for a specific image.

    • Display images in a Grid View or Line View.

    • Sort images by file name or modification time.

  8. After identifying the AI objects, you can Add AI-based steps to your tests.

Help design the future of AI-based testing in UFT Developer

Click How is the detection? Help us to improve to open the Feedback Tool and send Micro Focus feedback about the object identification. For details, see AI-based testing Feedback Tool.

Back to top

Add AI-based steps to your tests

You can add AI-based steps to your test using code generated in the AI Inspection window or manually, using the Java or JavaScript SDKs.

To use the code generated by the AI Inspection window, first make sure the correct clipboard code language is selected in the settings. Click the cogwheel at the top right of the AI Inspection window, and select Java or JavaScript.

Set the AI context

UFT Developer identifies AIObjects within the context of a web Browser object or a mobile Device object. Therefore, AIObjects are always described as children of Browser or Device objects. Before describing an AIObject, make sure to describe its parent.

For example, before you can use the following AIObject, you have to describe its parent browser object in your test.

// Click the AI object whose visual class is "twitter", using its ordinal
// position (first object counted from the right) to distinguish it from
// similar objects on the page.
browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass("twitter")
        .locator(new com.hp.lft.sdk.ai.Position(com.hp.lft.sdk.ai.Direction.FROM_RIGHT, 0))
        .build())
        .click();
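
The parent can be obtained in any way the Web SDK supports. A minimal sketch, assuming the page is already open in a browser that has the Micro Focus UFT Agent extension and that its title is known (the title below is a placeholder):

// Browser, BrowserFactory, and BrowserDescription come from com.hp.lft.sdk.web.
// Describe the parent browser by attaching to an open browser window.
Browser browser = BrowserFactory.attach(
        new BrowserDescription.Builder()
                .title("Advantage Shopping") // placeholder title
                .build());

// AIObjects described as children of this browser are searched for in the
// browser's visible page, as in the example above.

In a mobile test, the parent is a Device object obtained through the mobile SDK instead.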

Add steps from AI objects identified by AI Inspection

  1. Click an identified object in the AI Inspection window. This can be an object in the Mockup Images tab or the Live Application tab.

  2. In the tooltip that opens, modify the step action, edit the value field if relevant, and click Copy to copy the step.

    Alternatively, click Edit to modify additional step details including Text and Position in the right pane, then click Copy to copy the step.

    The step added for each object includes any information used to identify it uniquely, such as associated text or the object's ordinal position on the screen, if multiple identical objects are displayed (see the example sketch after these steps). For details, see Associating text with objects and Identifying objects by ordinal position.

  3. Paste the copied code into your test.

    Avoid adding steps with objects that were incorrectly identified. For example, if a button is identified as a text box, or a check mark is identified as a button, such objects may be identified inconsistently and fail in subsequent test runs.
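
For illustration only, a pasted step that combines associated text with an ordinal position might look like the sketch below. The AI class, text, and position values are placeholders, and it assumes the AiObjectDescription builder exposes text() alongside aiClass() and locator(), as the constructor form used elsewhere in this topic suggests; the code the tool actually generates may differ:

// Hypothetical step: a "button" with the text "ADD TO CART", taking the
// second matching button counted from the right if several are displayed.
browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass("button")
        .text("ADD TO CART")
        .locator(new com.hp.lft.sdk.ai.Position(com.hp.lft.sdk.ai.Direction.FROM_RIGHT, 1))
        .build())
        .click();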

Run the added step on your application

If you added the step from an object identified on a live application, you can click the Run on application button to run the step. This tests that your step is correct, and also advances the application to the next state, re-inspecting the application for the next step.

You can instruct UFT Developer to provide time for the application to load after running the step, before re-inspecting the application. Click the dot-menu icon next to the Run on application button and configure a delay.

Add verification steps to your test (optional)

You can add verification steps to check the existence of an object using code generated in the AI Inspection window or manually, using the Java or JavaScript SDKs.

To add verification steps from the AI Inspection window

  1. Click an identified object in the AI Inspection window. This can be an object in the Mockup Images tab or the Live Application tab.

  2. Select Verify as the action and select Exists or Does Not Exist to verify the existence of the object.

    Example verification step:

    com.hp.lft.report.Reporter.startReportingContext("Verify property: exists", com.hp.lft.report.ReportContextInfo.verificationMode());
    Verify.isTrue(browser.describe(AiObject.class, new AiObjectDescription("button", "ADD TO CART")).exists(), "Verification", "Verify property: exists");
    com.hp.lft.report.Reporter.endReportingContext();
  3. Click Copy to copy the step, then paste the step into your test.

The verification step passes if the application is in the expected state. Otherwise, a step failure is reported in the run results.

Back to top

Inspect the next application page/screen

When you finish creating test steps for one page or screen in your application and you want to continue on another one, follow the steps below:

To inspect the next application page/screen

  1. Navigate to the desired location in the application.

  2. In the AI Inspection window, click Re-inspect to load the new application page or screen and reinspect it.

    • If you don't re-inspect and you try to run a step from the AI Inspection window, UFT Developer runs the step based on the previous page's inspection, resulting in an error or in the operation being performed incorrectly on the new page.

    • If multiple remote access windows or browser windows are open, the inspection session interacts with only one.

    • If you need to perform steps on the application to prepare it for inspection, you can use the Delayed re-inspect option.

      Click the down arrow near Delayed re-inspect and set the required delay.

      Click Go to begin the countdown, open the application, and perform steps such as hover or menu clicks to bring the application to the state you want to inspect.

      When the delay timer expires, the application is re-inspected.

  3. Add steps from the new page or screen to your test.

Back to top

Run AI-based tests

After inspecting your application and creating test steps, run your AI-based test as you would run any other UFT Developer test. See Run UFT Developer tests.

You can run the same test on different operating systems and versions since it is not based on implementation details.

Back to top

Troubleshooting

AI text identification requires the Windows mediaserver.exe service to be running. Otherwise, the following may occur:

  • AI inspection cannot find objects by text.
  • An error message indicates that an error occurred when calling the Media Server OCR service.

Solution: 

Open the Windows Services Manager and make sure the mediaserver.exe service is running. If it is not, start the service manually.

Back to top

See also: