High-Level Steps for Server Cluster Configuration
Server clusters are configured using the server.conf file. You can define all of the nodes in a cluster (both cluster-specific and node-specific configuration) in one server.conf file, regardless of whether the nodes run on a single machine or are distributed across different machines. Using a single server.conf file ensures that each node reflects the correct settings. It also enables scripts such as kStatus.sh to gather information from all of the nodes in the server cluster, not just the nodes that reside on the machine from which you run the script.
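As an illustration, a shared server.conf that defines a two-node cluster might be organized along these lines. The parameter names and the `@node` block syntax shown here are assumptions for illustration only; consult the product's configuration reference for the actual directives.

```
# Cluster-wide settings shared by every node
# (parameter names are illustrative, not the product's actual directives)
com.example.server.CLUSTER_NAME=prodCluster

# Node-specific settings: one @node block per node in the cluster
@node
com.example.server.NODE_NAME=node1
com.example.server.HTTP_PORT=8080

@node
com.example.server.NODE_NAME=node2
com.example.server.HTTP_PORT=8081
```

Because every node reads the same file, a status script can enumerate the `@node` blocks to reach all nodes, wherever they run.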
You can use the same server.conf file on different machines that host nodes participating in the same cluster. If you do, keep in mind that you must change the machine-specific parameter settings in the file.
To configure a server cluster:
1. Stop the PPM Server. (See Start and stop the PPM Server on a single-server system.)
2. If your cluster is to include nodes hosted on different machines, make sure that you have set up a shared folder. (See Create a shared folder for the server.conf file.)
3. If you are using an external Web server, do the following:
   a. Stop the external Web server.
   b. Edit the workers.properties file to include information for the multiple cluster nodes. Each node requires its own external Web port.
      For information about how to configure the workers.properties file, see Configure the Workers Properties file.
4. Configure the server nodes on the file system.
5. Manually configure the server nodes in the server.conf file.
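If the external Web server routes requests to the cluster nodes through Apache mod_jk, the workers.properties entries for a two-node cluster typically look like the following. This sketch assumes mod_jk and AJP connectors; the host names and ports are examples only.

```
# One AJP worker per cluster node
worker.node1.type=ajp13
worker.node1.host=host1.example.com
worker.node1.port=8009

worker.node2.type=ajp13
worker.node2.host=host2.example.com
worker.node2.port=8010

# Load-balancer worker that fronts the two nodes
worker.loadbalancer.type=lb
worker.loadbalancer.balance_workers=node1,node2

worker.list=loadbalancer
```

Each node gets a distinct AJP port, which is why the procedure above calls for a separate external Web port per node.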
The next sections provide the steps you use to configure the following server cluster setups (see Table 1. Server configuration parameters affected by clustering):
External Web server, single machine
External Web server, multiple machines
Hardware load balancer, multiple machines
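In any of these setups, each node in the shared configuration file must be assigned a unique port. Before distributing the file, you might run a quick sanity check such as the following sketch. The `@node`/`EXTERNAL_WEB_PORT` layout is hypothetical; adjust the pattern to match your file's actual parameter names.

```shell
# Sanity check: every node defined in a shared server.conf should have a
# unique external Web port. The sample file below is illustrative only.
cat > /tmp/server.conf.sample <<'EOF'
@node
com.example.server.EXTERNAL_WEB_PORT=8009
@node
com.example.server.EXTERNAL_WEB_PORT=8010
EOF

# Extract the port values and look for duplicates
ports=$(grep 'EXTERNAL_WEB_PORT' /tmp/server.conf.sample | cut -d= -f2)
dupes=$(echo "$ports" | sort | uniq -d)

if [ -z "$dupes" ]; then
  echo "OK: all node ports are unique"
else
  echo "ERROR: duplicate ports: $dupes"
fi
```

Running the same check against the real file on each machine helps catch copy-paste mistakes in the machine-specific settings before any node is started.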