Performance Application Lifecycle (PAL) enables complete end-to-end testing and DevOps feedback. It lets you accurately compare performance test results against benchmarks taken from real production data. Analyzing these results gives you a framework for building performance test scenarios that resemble realistic environments as closely as possible, which reduces test assumptions and risk.
PAL enables you to use production user-traffic and system-monitoring data to design a performance test that closely resembles production behavior. You can import production data from Microsoft IIS W3C Extended Log Format (IIS W3C), Apache, and Micro Focus Real User Monitor (RUM).
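To illustrate the kind of usage data PAL extracts from such logs, here is a minimal sketch (not part of PAL itself) that tallies hits per URL from an IIS W3C Extended Format log. The `count_requests` helper and the sample log lines are hypothetical; only the log format (the `#Fields` directive naming the space-separated columns) comes from the IIS W3C specification.

```python
from collections import Counter

def count_requests(log_lines):
    """Count hits per URL stem in an IIS W3C Extended Format log.

    The #Fields directive names the space-separated columns, so we
    locate cs-uri-stem dynamically instead of hard-coding a position.
    """
    fields = []
    hits = Counter()
    for line in log_lines:
        line = line.strip()
        if line.startswith("#Fields:"):
            fields = line.split()[1:]  # column names follow the directive
        elif line and not line.startswith("#") and fields:
            row = dict(zip(fields, line.split()))
            hits[row.get("cs-uri-stem", "-")] += 1
    return hits

# Hypothetical sample data in IIS W3C Extended Log Format.
sample = [
    "#Software: Microsoft Internet Information Services 10.0",
    "#Fields: date time cs-method cs-uri-stem sc-status time-taken",
    "2024-01-15 08:00:01 GET /login 200 120",
    "2024-01-15 08:00:02 GET /search 200 310",
    "2024-01-15 08:00:03 GET /login 200 95",
]
print(count_requests(sample).most_common())  # [('/login', 2), ('/search', 1)]
```

A per-URL hit distribution like this is exactly what lets PAL rank business flows by how heavily production users exercise them.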
PAL addresses the basic need to plan testing initiatives correctly. Without a clear understanding of what users do in production, and of how the production system behaves, it is very difficult to:
- focus testing on the most widely used business cases and scenarios
- test the system under appropriate loads
- define testing targets (for example, Service Level Agreements)
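Production data can answer the "appropriate load" question directly. As an illustration (not a PAL feature), Little's Law estimates the concurrency a test must generate from observed production throughput and response time; the function name and the example numbers below are hypothetical.

```python
def concurrent_users(requests_per_sec, avg_response_sec, think_time_sec=0.0):
    """Little's Law: concurrency = arrival rate x time each request/user
    spends in the system. Adding per-user think time models real users
    who pause between requests."""
    return requests_per_sec * (avg_response_sec + think_time_sec)

# e.g. a production peak of 50 req/s, 0.4 s average response, 8 s think time
print(concurrent_users(50, 0.4, 8))  # 420.0 concurrent users
```

A figure derived this way gives the test a defensible Vuser target instead of a guess, and the same production numbers double as the Service Level Agreement baseline.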
The PAL flow includes the following main steps:
| Step | Description |
|------|-------------|
| Import | Import a data set from a production system. Different production monitoring systems provide different data, which may influence what information is available to you. |
| Create scripts and tests | After you upload your data set to Performance Center, it analyzes the data and creates a PAL scenario with business flows. You can translate each business flow into a script; if a flow is not significant, you can exclude it. When all your business flows are translated to scripts, create your performance test and assign the scripts to it. |
| Run | Run your performance test. |
| Compare | Compare the performance test results with your production data. If necessary, readjust and rerun your test. |
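The Compare step boils down to checking each business flow's test metrics against its production benchmark. Here is a minimal sketch of that idea, assuming hypothetical per-flow throughput numbers and a hypothetical `flag_deviations` helper (PAL performs this comparison in its own reports):

```python
def flag_deviations(production, test, tolerance=0.15):
    """Flag business flows whose test throughput deviates from the
    production benchmark by more than the given relative tolerance."""
    flagged = {}
    for flow, prod_rate in production.items():
        test_rate = test.get(flow, 0.0)
        deviation = abs(test_rate - prod_rate) / prod_rate
        if deviation > tolerance:
            flagged[flow] = round(deviation, 2)
    return flagged

# Hypothetical hits-per-hour figures per business flow.
production_hits_per_hour = {"login": 1200, "search": 900, "checkout": 150}
test_hits_per_hour = {"login": 1150, "search": 500, "checkout": 160}

print(flag_deviations(production_hits_per_hour, test_hits_per_hour))
# {'search': 0.44}  -> the search flow needs readjusting before a rerun
```

Any flagged flow points at the part of the test to readjust before rerunning, which is exactly the feedback loop the Compare step describes.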