When all of the .xml files representing actual data seem ready to import, we recommend running the script on the TEST instance against a subset of the actual time sheet data to be imported later on the production instance. This subset should be large enough to evaluate the CPU resource demand of the script and to determine an optimal number of threads for the script to use.
Before proceeding, make sure that the database is correctly configured and sized to accommodate the time sheet data to be imported. Also see Configuring PPM Environments.
Note: If the performance of the script does not meet the required throughput even with higher values of the -maxThreadcount argument, you can run multiple concurrent executions of the script, each against a different set of .xml data. With this approach, the CPU demand is a function of the combined number of active threads across all of the script executions.
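As a rough illustration, running multiple concurrent executions might look like the following sketch. The script invocation here is a placeholder (run_importer is a hypothetical stand-in, and only -maxThreadcount is taken from this document); substitute the actual Time Sheet Data Importer command and arguments for your installation.

```shell
#!/bin/sh
# Hypothetical stand-in for the real importer invocation, e.g.:
#   sh ./<importer-script>.sh -directory "$1" -maxThreadcount "$2"
# (script name and -directory argument are placeholders, not documented flags)
run_importer() {
  echo "importing from $1 with $2 threads"
}

# Three concurrent executions, each against its own set of .xml files.
# Total CPU demand behaves like the sum of active threads: 3 runs of
# 4 threads each demand roughly what one 12-thread run would.
for dir in batch1 batch2 batch3; do
  run_importer "$dir" 4 &
done
wait   # block until every concurrent execution finishes
```

Keeping each execution pointed at a disjoint directory of .xml files avoids two runs processing the same file.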
To test a subset of the .xml files representing actual data:

1. Save the subset of the .xml files in the desired directory on the TEST instance.

2. Run the script on the TEST instance without using test mode. See Running the Time Sheet Data Importer Script.

3. While the script is running, monitor the following:

   - Usage of system resources (CPU, memory, disk space)
   - Table space available in the database

4. As needed, rerun the script with various values of the -maxThreadcount argument to identify the optimal value to use from a performance standpoint.

5. Starting at step 2, repeatedly run the script against corrected source .xml files until all errors in the files are corrected, that is, until the script moves all of the .xml files to the SUCCESS subdirectory.

   When all the .xml files chosen for this procedure are in the SUCCESS subdirectory, testing on the TEST instance is complete.

6. Cancel the time sheets that were created for test purposes.
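The tuning loop in steps 2 through 4 can be sketched as a simple sweep over -maxThreadcount values, timing each run and snapshotting disk usage beforehand. This is an illustrative sketch only: run_importer is a hypothetical stand-in for the actual importer invocation, and resource monitoring is reduced to a df snapshot (in practice you would also watch CPU, memory, and database table space).

```shell
#!/bin/sh
# Hypothetical stand-in for the real importer invocation, e.g.:
#   sh ./<importer-script>.sh -directory xml_subset -maxThreadcount "$1"
# (script name and -directory argument are placeholders)
run_importer() {
  :   # placeholder for the actual import work
}

# Try several thread counts and record elapsed time for each run.
for threads in 2 4 8 16; do
  df -h . > "disk_before_$threads.txt"        # disk-space snapshot before the run
  start=$(date +%s)
  run_importer "$threads"
  elapsed=$(( $(date +%s) - start ))
  echo "maxThreadcount=$threads elapsed=${elapsed}s"
done
```

Comparing the elapsed times (and the resource snapshots) across runs identifies the value beyond which adding threads no longer improves throughput.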