Improve AI-based test object identification

Supported for web and mobile testing

This topic discusses some of the elements that UFT Developer's Artificial Intelligence (AI) Features use to support unique identification of objects in your application.

Associating text with objects

The text associated with an object can help identify an object uniquely. For example:

browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.BUTTON, "ADD TO CART")).click();

Using the ADD TO CART text to describe the object in this step ensures that we click the correct button.

When detecting objects in an application, if there are multiple labels around the field, UFT Developer uses the one that seems most logical for the object's identification.

However, if you decide to use a different label in your object description, UFT Developer still identifies the object.

Example: When detecting objects in an application, a button is associated with the text on the button, but a field is associated with its label, as opposed to its content.

If a field has multiple labels and UFT Developer chooses one for detection, UFT Developer will still identify this field correctly when running a test step that uses a different label to describe the field.
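
For example, the following minimal sketch assumes a field with two nearby labels, Phone and Mobile (hypothetical names), detected as an input control (AiTypes.INPUT and the setText operation are assumptions here); either description identifies the same field:

// Hypothetical labels: UFT Developer detected the field using the "Phone" label,
// but describing it by the nearby "Mobile" label identifies the same field.
browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.INPUT, "Phone")).setText("555-0100");
browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.INPUT, "Mobile")).setText("555-0100");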

In some cases, UFT Developer AI combines multiple text strings that are close to each other into one text string to identify one object.

You can edit the combined string and leave just one of the original strings to use for object identification. Make sure to remove a whole string, not part of it.

Example:  

For a text box whose label Password appears together with the text Forgot?, AI Object Inspection combines the two into one string and uses "Password Forgot?" to describe the object in the code it generates.

You can then remove the whole Forgot? string and change the code to use just "Password" for the description without causing a test failure.
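
For example, a sketch of the edited description (the AiTypes.INPUT class and the setText operation are assumptions here):

// Generated description with the combined string:
// browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.INPUT, "Password Forgot?")).setText("mySecret");

// Edited description, keeping only the whole "Password" string:
browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.INPUT, "Password")).setText("mySecret");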

Back to top

Text recognition in multiple languages

To enable UFT Developer's AI features to recognize text in languages other than English, select the relevant OCR languages in the AI tab of the UFT Developer runtime engine settings. For details, see AI Object Detection Settings.

You can also customize the languages temporarily within a test run by adding AI run settings steps to your test. For details, see the AIRunSettings Class in the Java and JavaScript SDK references.

For example:

// Configure OCR settings to enable detection for English, German, French, Hebrew and Traditional Chinese
AiRunSettings.updateOCRSettings((new AiOCRSettings()).setLanguages(new String[]{"en", "de", "fr", "he", "zht"}));

Or, you can create a set with your language list and use it to assign the language setting:

// Configure OCR settings to enable detection for English and Russian
Set<String> languages = new HashSet<>();
languages.add(AiOCRLanguages.RUSSIAN);
languages.add(AiOCRLanguages.ENGLISH);
AiRunSettings.updateOCRSettings(new AiOCRSettings().setLanguages(languages));

Back to top

Identifying objects by relative location

UFT Developer must be able to identify an object uniquely to run a step on the object. When multiple objects match your object description, you can add the object location to provide a unique identification. The location can be ordinal, relative to similar objects in the application, or proximal, relative to a different AI object, considered an anchor.

In your object's description, include the Locator details, such as position or relation. You cannot describe an object by using both position and relation.

To describe an object's ordinal location

Provide the following information:

  • The object's occurrence, counted from zero. For example, 0, 1, and 2 for the first, second, and third occurrences.

  • The orientation, that is, the direction in which to count occurrences: FromLeft, FromRight, FromTop, FromBottom.

For example, the following code snippet clicks the first Twitter image from the right in your application.

browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(com.hp.lft.sdk.ai.AiTypes.TWITTER)
        .locator(new com.hp.lft.sdk.ai.Position(com.hp.lft.sdk.ai.Direction.FROM_RIGHT, 0))
        .build()).click();

To describe an object's location in proximity to a different AI object

Supported in UFT Developer 2022 and later

Provide the following information:

  • The description of the anchor object.

    The anchor must be an AI object that belongs to the same context as the object you are describing.

    The anchor can also be described by its location.

  • The direction of the anchor object compared to the object you are describing: WithAnchorOnLeft, WithAnchorOnRight, WithAnchorAbove, WithAnchorBelow.

UFT Developer returns the AI object that matches the description and is closest and most aligned with the anchor, in the specified direction.

Note: When you identify an object by a relative location, the application size must remain unchanged in order to successfully run the test script.

For example, this test clicks on the download button to the right of the Linux text, below the Latest Edition text block:

AiObject latestEdition = browser.describe(AiObject.class, new AiObjectDescription(com.hp.lft.sdk.ai.AiTypes.TEXT_BLOCK, "Latest Edition"));
AiObject linuxUnderLatestEdition = browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(com.hp.lft.sdk.ai.AiTypes.TEXT)
        .text("Linux")
        .locator(new com.hp.lft.sdk.ai.Relation(com.hp.lft.sdk.ai.RelationType.WITH_ANCHOR_ABOVE, latestEdition))
        .build());
AiObject downloadButton = browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(com.hp.lft.sdk.ai.AiTypes.BUTTON)
        .text("download")
        .locator(new com.hp.lft.sdk.ai.Relation(com.hp.lft.sdk.ai.RelationType.WITH_ANCHOR_ON_LEFT, linuxUnderLatestEdition))
        .build());
downloadButton.click();

For details, see the Locator Class in the Java and JavaScript SDK references.

Back to top

Describe a control using an image

Supported in UFT Developer 24.2 and later

If your application includes a control type that is not supported by AI object identification, you can provide an image of the control that UFT Developer can use to identify the control. Specify the class name that you want UFT Developer to use for this control type by registering it as a custom class.

Once you register a custom class, you can use it in AIUtil steps as a control type.

public void test() throws GeneralLeanFtException, IOException {

    Browser browser = BrowserFactory.launch(BrowserType.CHROME);
    browser.navigate("https://www.advantageonlineshopping.com/#/");

    // Register the image as the custom class "MyClass", then describe and click the control.
    AiUtil.registerCustomClass("MyClass", "C:\\Users\\MyUser\\Pictures\\MyImage.PNG");
    AiObject aiObject = browser.describe(AiObject.class, new AiObjectDescription("MyClass"));
    aiObject.click();
}

You can even use a registered custom class to describe an anchor object, which is then used to identify other objects in its proximity.
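
For example, the following sketch continues the example above and assumes a page layout in which a standard button appears to the right of the registered MyClass control; the custom control serves as the anchor:

// Use the registered custom class as the anchor (assumed layout: the button is to the
// right of the custom control, so the anchor is on the button's left).
AiObject customControl = browser.describe(AiObject.class, new AiObjectDescription("MyClass"));
AiObject buttonNearCustom = browser.describe(AiObject.class, new AiObjectDescription.Builder()
        .aiClass(com.hp.lft.sdk.ai.AiTypes.BUTTON)
        .locator(new com.hp.lft.sdk.ai.Relation(com.hp.lft.sdk.ai.RelationType.WITH_ANCHOR_ON_LEFT, customControl))
        .build());
buttonNearCustom.click();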

Note:  

  • You cannot use a registered class as an anchor to identify another registered class by proximity.

  • The image you use to describe the control must match the control exactly.

  • When running tests on a remote machine, the image used to describe the control must be located on the machine running the test.

For more details, see the AIUtil.RegisterCustomClass method in the Java and JavaScript SDK References.

Back to top

Automatic scrolling

When running a test, if the object is not displayed in the application but the web page or mobile app is scrollable, UFT Developer automatically scrolls further in search of the object. Once an object matching the description is identified, no further scrolling is performed. Identical objects displayed in subsequent application pages or screens will not be found.

By default, UFT Developer scrolls down twice. You can customize the direction of the scroll and the maximum number of scrolls to perform, or disable scrolling if necessary.

  • Globally customize the scrolling in the AI tab of the UFT Developer runtime engine settings. For details, see AI Object Detection Settings.

  • Customize the scrolling temporarily within a test run by adding AI run settings steps to your test. For details, see the AIRunSettings Class in the Java and JavaScript SDK references.

    For example:

    // Configure autoscroll settings to enable scrolling up
    AiRunSettings.updateAutoScrollSettings((new AiAutoScrollSettings()).enable(ScrollDirection.UP, 10));
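
    Similarly, you can turn off automatic scrolling for part of a run. This is a sketch that assumes AiAutoScrollSettings exposes a disable() counterpart to enable(); confirm against the AIRunSettings Class reference:

    // Assumption: AiAutoScrollSettings provides a disable() method; verify in the SDK reference.
    AiRunSettings.updateAutoScrollSettings((new AiAutoScrollSettings()).disable());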

Back to top

See also: