Test with Kafka activities
Relevant for: API testing
This topic describes how to add and configure Kafka activities in your API test.
Overview
Use UFT One's native API testing capabilities to test your Kafka server.
Kafka activities enable you to test the main API functionality of your Kafka server. You can use these activities in UFT One to verify Kafka's data exchange processes, including publishing messages to topics, and consuming or deleting messages from topics.
Kafka activities are independent of each other. You can add and configure one or more activities to suit your testing needs.
Prerequisites
Before you add a Kafka activity in UFT One, do the following:
- Make sure you have a Kafka server (Kafka broker) set up.
- If your Kafka server uses SSL or Kerberos for authentication, configure the authentication settings for your test according to the authentication mode, as described below.
SSL authentication
- Create an API test or open an existing API test.
- Right-click the Start node in the canvas and select Properties.
- In the Test Settings tab of the Properties pane, under Kafka SSL, configure the following SSL settings:
  UFT One version 2022 and later:
  - Enable SSL: Select Yes.
  - CA certificate path: Optional. The path to the CA certificate file. You can leave this field empty only when your CA certificate is issued by one of the most trusted certification authorities in the industry.
  - Client public key certificate path: Optional. The path to the public key certificate file. This is mandatory if your server requires client authentication.
  - Client private key path: Optional. The path to the private key certificate file. This is mandatory if your server requires client authentication.
  UFT One version 2021 R1 and earlier:
  Set Enable SSL to Yes and specify the path of the CA certificate.
  Note: Leave the following properties empty. They are not supported: Client Keystore, Client Key Store Password, and Client Key Password.
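For reference, the paths above map to the CA certificate, client certificate, and client private key of a standard Kafka SSL client configuration. The following minimal sketch uses the open-source confluent_kafka Python client (not part of UFT One); the broker address and file paths are placeholders only:

# Sketch: standard Kafka client SSL settings that correspond to the UFT One fields above.
# The broker address and certificate paths are placeholders, not values from this topic.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker.example.com:9093",   # SSL listener of your Kafka server
    "security.protocol": "SSL",                        # Enable SSL = Yes
    "ssl.ca.location": "/certs/ca.pem",                # CA certificate path
    "ssl.certificate.location": "/certs/client.pem",   # Client public key certificate path
    "ssl.key.location": "/certs/client.key",           # Client private key path
}
producer = Producer(conf)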
Kerberos authentication (UFT One version 2023 and later)
- Create an API test or open an existing API test.
- Right-click the Start node in the canvas and select Properties.
- In the Test Settings tab of the Properties pane, under Kafka SASL, configure the following SASL settings:
  - Enable SASL: Select Yes.
  - Mechanism: The mechanism used to perform authentication. Select GSSAPI, a security mechanism that authenticates using Kerberos V5.
  - Service Name: The Kafka server name.
Note: UFT One supports testing Kafka servers that use Kerberos authentication only on Windows.
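For reference, these SASL settings correspond to a standard Kafka client GSSAPI (Kerberos) configuration. A minimal sketch using the confluent_kafka Python client (not part of UFT One; the broker address, group ID, and service name are placeholder assumptions):

# Sketch: standard Kafka client SASL/GSSAPI (Kerberos) settings.
# All values below are placeholders; check your broker configuration.
from confluent_kafka import Consumer

conf = {
    "bootstrap.servers": "broker.example.com:9092",  # placeholder broker address
    "security.protocol": "SASL_PLAINTEXT",           # or SASL_SSL if SSL is also enabled
    "sasl.mechanism": "GSSAPI",                      # Mechanism = GSSAPI (Kerberos V5)
    "sasl.kerberos.service.name": "kafka",           # Service Name (often "kafka"; confirm with your broker)
    "group.id": "uft-doc-example",                   # placeholder consumer group
}
consumer = Consumer(conf)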
Run tests of previous versions
When running tests that were created in UFT One versions earlier than 2022 on UFT One version 2022 or later, do the following before running tests with SSL authentication configured:
- Open the UFT.exe.config file in the <UFT One installation folder>\bin folder.
- Add the following line to the <appSettings> section of the file:
<add key="ValidateWithSchema" value="false"/>
Publish a message to a Kafka topic
This activity allows you to publish a message to a partition on a Kafka topic.
In the process, UFT One acts as a producer to test if a message can be successfully sent to the correct partition of the correct topic on your Kafka server.
To publish a message to a Kafka topic
- Create an API test or open an existing API test, and expand Kafka activities under the Kafka node in Toolbox > Standard Activities.
- Drag a Publish Message to Kafka Topic activity from the Toolbox pane to the Test Flow in the canvas.
- In the Input/Checkpoints tab of the Properties pane, set the activity properties.
You can enter the name of an existing topic hosted on your Kafka server or a new name. If you enter a new name, after this activity is successfully executed, a new topic with only one partition (partition 0) is created on your Kafka server.
Note: If you leave the Partition parameter empty, the default value is used, that is, UFT One sends the specified message to partition 0. If you set this parameter to a negative value, the message will be sent to any available partition on your topic.
- If necessary, add checkpoints to validate this activity.
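For readers who want to see the underlying Kafka operation, the following minimal sketch publishes a message to a specific partition and verifies delivery using the open-source confluent_kafka Python client. This is only an illustration, not UFT One's implementation; the broker address, topic name, and message are placeholders:

# Sketch: publish a message to a specific partition and confirm delivery.
# Broker, topic, and payload are placeholders.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker.example.com:9092"})

def on_delivery(err, msg):
    # Report whether the message reached the expected topic and partition.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}] at offset {msg.offset()}")

# Send to partition 0 of the topic; omit partition= to let Kafka choose.
producer.produce("demo-topic", value=b"hello kafka", partition=0, on_delivery=on_delivery)
producer.flush()  # wait for the delivery callback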
Receive messages from a Kafka topic
This activity enables you to fetch, from a specified topic, all messages that have not yet been consumed by other consumers in the same consumer group.
In the process, UFT One acts as a consumer. You must specify a Consumer Group ID for UFT One to retrieve messages that are not consumed by other consumers in the consumer group.
To receive messages from a Kafka topic
- Create an API test or open an existing API test, and expand Kafka activities under the Kafka node in Toolbox > Standard Activities.
- Drag a Receive Messages from Kafka Topic activity from the Toolbox pane to the Test Flow in the canvas.
- In the Input/Checkpoints tab of the Properties pane, set the activity properties.
You can set the Group ID to any value according to your test scenario. Remember to change the Group ID if you want to receive the same messages from the topic again.
- If necessary, add checkpoints to validate this activity.
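For reference, the following minimal sketch shows the equivalent group-based consume with the confluent_kafka Python client (broker, topic, and Group ID are placeholders). It also illustrates why changing the Group ID lets you receive the same messages again: a group that has already consumed the messages does not receive them a second time.

# Sketch: consume messages that the given consumer group has not yet consumed.
# Broker, topic, and group ID are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": "uft-demo-group",        # change this value to re-read messages already consumed by the group
    "auto.offset.reset": "earliest",     # a brand-new group starts from the beginning
})
consumer.subscribe(["demo-topic"])

try:
    while True:
        msg = consumer.poll(5.0)
        if msg is None:                  # no more unconsumed messages within the timeout
            break
        if msg.error():
            raise RuntimeError(msg.error())
        print(f"partition {msg.partition()}, offset {msg.offset()}: {msg.value()}")
finally:
    consumer.close()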
Receive a message from a specific offset/partition
This activity enables you to fetch a message from a specific partition at a defined offset.
In the process, UFT One acts as a consumer and always uses a new consumer group ID to retrieve a message from the specified offset on your Kafka server.
To receive a message from a specific offset/partition
- Create an API test or open an existing API test, and expand Kafka activities under the Kafka node in Toolbox > Standard Activities.
- Drag a Receive Message from Specific Partition/Offset activity from the Toolbox pane to the Test Flow in the canvas.
- In the Input/Checkpoints tab of the Properties pane, set the activity properties.
Note: If you leave the Partition and Offset parameters empty, default values are used, that is, UFT One retrieves a message from partition 0 at offset 0.
- If necessary, add checkpoints to validate this activity.
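For reference, retrieving a single message from a specific partition and offset corresponds to assigning a Kafka consumer directly to that partition and offset. A minimal sketch with the confluent_kafka Python client (broker, topic, partition, offset, and group ID are placeholders):

# Sketch: read the single message stored at a specific partition and offset.
# All names and numbers are placeholders.
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": "uft-demo-offset-reader",            # throwaway group ID
})
# Start reading at partition 1, offset 5.
consumer.assign([TopicPartition("demo-topic", 1, 5)])

msg = consumer.poll(5.0)                             # fetch the message at that offset
if msg is not None and not msg.error():
    print(f"offset {msg.offset()}: {msg.value()}")
consumer.close()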
Receive messages starting from a specific offset/partition
This activity enables you to fetch all messages starting from a defined offset in a specific partition.
In this process, UFT One acts as a consumer and always uses a new consumer group ID to retrieve all messages starting from the defined offset on your Kafka server.
To receive messages starting from a specific partition/offset
- Create an API test or open an existing API test, and expand Kafka activities under the Kafka node in Toolbox > Standard Activities.
- Drag a Receive Messages Starting from Specific Partition/Offset activity from the Toolbox pane to the Test Flow in the canvas.
- In the Input/Checkpoints tab of the Properties pane, set the activity properties.
Note: If you leave the Partition and Offset parameters empty, default values are used, that is, UFT One retrieves all messages starting from partition 0 at offset 0.
- If necessary, add checkpoints to validate this activity.
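For reference, retrieving all messages starting from a specific offset corresponds to assigning a consumer to that offset and reading up to the partition's current high watermark. A minimal sketch with the confluent_kafka Python client (all names and values are placeholders):

# Sketch: read every message from a given offset to the current end of the partition.
# All names and numbers are placeholders.
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": "uft-demo-range-reader",             # throwaway group ID
})
tp = TopicPartition("demo-topic", 0, 3)              # partition 0, starting offset 3
consumer.assign([tp])

# high is the offset one past the last message currently in the partition.
low, high = consumer.get_watermark_offsets(tp, timeout=5.0)

while True:
    msg = consumer.poll(5.0)
    if msg is None or msg.error():
        break
    print(f"offset {msg.offset()}: {msg.value()}")
    if msg.offset() >= high - 1:                     # reached the last available message
        break
consumer.close()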
Receive all messages from a Kafka topic
This activity enables you to fetch all messages from a Kafka topic.
In this process, UFT One acts as a consumer and always uses a new consumer group ID to retrieve all messages from the specified topic on your Kafka server.
To receive all messages from a topic
- Create an API test or open an existing API test, and expand Kafka activities under the Kafka node in Toolbox > Standard Activities.
- Drag a Receive All Messages from Topic activity from the Toolbox pane to the Test Flow in the canvas.
- In the Input/Checkpoints tab of the Properties pane, set the activity properties.
- If necessary, add checkpoints to validate this activity.
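For reference, reading every message in a topic corresponds to consuming with a brand-new consumer group that starts from the earliest offset, so nothing is treated as already consumed. A minimal sketch with the confluent_kafka Python client (broker and topic are placeholders):

# Sketch: read all messages in a topic by using a fresh consumer group.
# Broker and topic are placeholders.
import uuid
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": f"uft-demo-{uuid.uuid4()}",          # fresh group ID, so no offsets are already committed
    "auto.offset.reset": "earliest",                 # start from the beginning of every partition
})
consumer.subscribe(["demo-topic"])

while True:
    msg = consumer.poll(5.0)
    if msg is None:                                  # no more messages within the timeout
        break
    if not msg.error():
        print(f"partition {msg.partition()}, offset {msg.offset()}: {msg.value()}")
consumer.close()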
Delete messages from an offset/partition
This activity enables you to delete all messages at offsets smaller than a specified offset in a partition. Once you successfully execute this activity, you cannot recover the deleted messages.
To delete messages from an offset/partition
- Create an API test or open an existing API test, and expand Kafka activities under the Kafka node in Toolbox > Standard Activities.
- Drag a Delete Messages from Partition/Offset activity from the Toolbox pane to the Test Flow in the canvas.
- In the Input/Checkpoints tab of the Properties pane, set the activity properties.
Note: If you leave the Partition and Offset parameters empty, default values are used and no messages are deleted, because offset 0 is the smallest offset.
- If necessary, add checkpoints to validate this activity.
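For reference, this activity corresponds to Kafka's delete-records operation, which truncates a partition below a given offset. The following sketch assumes a recent confluent_kafka Python client (the AdminClient delete_records API is available in client version 2.4.0 and later); broker, topic, partition, and offset are placeholders:

# Sketch: delete all messages in a partition with offsets smaller than a given offset.
# Assumes confluent-kafka >= 2.4.0; all names and numbers are placeholders.
from confluent_kafka import TopicPartition
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "broker.example.com:9092"})

# Delete messages in partition 0 of demo-topic with offsets smaller than 5.
futures = admin.delete_records([TopicPartition("demo-topic", 0, 5)])
for tp, future in futures.items():
    result = future.result()   # raises if the deletion failed
    print(f"{tp.topic} [partition {tp.partition}]: new low watermark {result.low_watermark}")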