High-Level Steps for Server Cluster Configuration

Server clusters are configured in the server.conf file. You can define all of the nodes in a cluster (both cluster-wide and node-specific configuration) in a single server.conf file, regardless of whether the nodes run on one machine or are distributed across several machines. Using a single server.conf file ensures that each node picks up the correct settings. It also enables scripts such as kStatus.sh to gather information from all of the nodes in the server cluster, not just the nodes that reside on the machine from which you run the script.

You can use the same server.conf file on different machines that host nodes participating in the same cluster. If you do, keep in mind that you must change the machine-specific parameter settings in the file on each machine.

To configure a server cluster:

  1. Stop the PPM Server. (See Start and stop the PPM Server on a single-server system.)

  2. If your cluster is to include nodes hosted on different machines, make sure that you have set up a shared folder. (See Create a shared folder for the server.conf file.)
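      If the shared folder is hosted on a UNIX machine, the setup might resemble the following sketch. This is an illustration only, assuming NFS, hypothetical host names, and a hypothetical share path (/opt/ppm/shared); adapt the paths and export options to your environment.

      ```
      # On the machine that hosts the shared folder
      # (/etc/exports entry; hypothetical client host and path):
      /opt/ppm/shared   ppm-node2.example.com(rw,sync)

      # On each other machine that hosts cluster nodes
      # (/etc/fstab entry; hypothetical server host and mount point):
      ppm-host1.example.com:/opt/ppm/shared   /opt/ppm/shared   nfs   defaults   0 0
      ```

      Mounting the share at the same path on every machine lets all nodes reference the single server.conf file identically.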

  3. If you are using an external Web server, do the following:

    1. Stop the external Web server.

    2. Configure the workers.properties file to include information for all of the cluster nodes. Each node requires a defined external Web port (set with the EXTERNAL_WEB_PORT configuration parameter).

      For information about how to configure the workers.properties file, see Configure the Workers Properties file.
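      As a rough illustration, a workers.properties file for a two-node cluster behind a mod_jk-style external Web server might look like the following sketch. The worker names, host, and port numbers are hypothetical; each worker's port is assumed to match the EXTERNAL_WEB_PORT value of the corresponding node.

      ```
      # Hypothetical two-node load-balanced configuration
      worker.list=loadbalancer

      # Node 1 (port assumed to match that node's EXTERNAL_WEB_PORT)
      worker.node1.type=ajp13
      worker.node1.host=localhost
      worker.node1.port=8009

      # Node 2 (port assumed to match that node's EXTERNAL_WEB_PORT)
      worker.node2.type=ajp13
      worker.node2.host=localhost
      worker.node2.port=8010

      # Load balancer that distributes requests across both nodes
      worker.loadbalancer.type=lb
      worker.loadbalancer.balance_workers=node1,node2
      ```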

  4. Configure the server nodes on the file system.

  5. Manually configure the server nodes in the server.conf file.
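      The node entries in server.conf might be sketched as follows. Apart from EXTERNAL_WEB_PORT, the parameter names, values, and node-section syntax shown here are assumptions for illustration; verify the exact syntax for your PPM version against the server.conf parameter reference.

      ```
      # Cluster-wide parameters shared by all nodes (hypothetical values)
      com.kintana.core.server.BASE_URL=http://ppm.example.com:80

      # Node-specific sections (hypothetical names and ports; each node
      # needs its own EXTERNAL_WEB_PORT when an external Web server is used)
      @node
      com.kintana.core.server.KINTANA_SERVER_NAME=node1
      com.kintana.core.server.EXTERNAL_WEB_PORT=8009

      @node
      com.kintana.core.server.KINTANA_SERVER_NAME=node2
      com.kintana.core.server.EXTERNAL_WEB_PORT=8010
      ```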

The next sections provide the steps used to configure the following server cluster setups (see Table 1. Server configuration parameters affected by clustering):

  • External Web server, single machine

  • External Web server, multiple machines

  • Hardware load balancer, multiple machines