Sizing and performance

This topic describes how certain sizing configuration settings can affect Application Server performance.

About Java memory

The effective operation of a large Java application such as the BMC Server Automation Application Server depends critically on the availability of sufficient heap memory. To correctly size Java memory for the Application Server, consider the following information and recommendations.

Process space, Java heap, and native heap

A Java process comprises two distinct memory areas: the Java heap and the native heap.

  • The Java heap contains Java objects and accounts for most of the memory required by a running Application Server. The Java heap is managed by the Java garbage collector and is sometimes called the GC heap.
  • The native heap (sometimes called the C heap) contains thread stacks, file handles, and other objects not managed by the Java garbage collector.

Both heaps, together with the Java executable code, must fit within the footprint of a single process. Increasing the maximum size of the Java heap decreases the maximum possible size of the native heap that can fit within a certain process size.

If the maximum Java heap size is set too low, it is possible to run out of Java heap memory. If the maximum Java heap size is set too high, it is possible to run out of native heap memory. In addition, peak memory use for either the Java heap or the native heap depends on the precise workload being considered, and timing effects between concurrently operating threads. Therefore, the recommendations that follow are recommendations only, not guarantees or absolute limits.

32-bit processes

A process running under a 32-bit operating system is limited to 4 GB of virtual address space, from which the operating system must reserve a significant portion for itself. For example, 32-bit Microsoft Windows divides the entire address space in half, allowing an application process only 2 GB total private process space. For large Java applications like Application Servers, this process space limit imposes a ceiling on the number of threads that can be accommodated within a single Application Server.
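The address-space budget described above can be sketched as simple arithmetic. The 2 GB process space and 1024 MB Java heap figures are from this topic; the code/overhead estimate and per-thread stack size are assumed example values, not product figures.

```python
# Back-of-the-envelope memory budget for a 32-bit Windows process.
# 2048 MB private process space and a 1024 MB Java heap come from this
# topic; the 256 MB code/overhead figure and 512 KB per-thread stack
# are assumed example values.

MB = 1024 * 1024

process_space = 2048 * MB   # 32-bit Windows private address space
java_heap     = 1024 * MB   # recommended maximum Java heap (32-bit)
code_overhead = 256 * MB    # assumed: JVM executable, shared libraries

# Whatever the Java heap does not claim is left for the native heap.
native_heap = process_space - java_heap - code_overhead

thread_stack = 512 * 1024   # assumed per-thread stack size
max_threads = native_heap // thread_stack

print(f"Native heap left: {native_heap // MB} MB")
print(f"Rough thread ceiling: {max_threads}")
```

Raising the Java heap in this model directly shrinks the native heap, and with it the thread ceiling, which is why the 32-bit recommendations later in this topic cap both heap size and work item threads.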

64-bit processes

A process running under a 64-bit operating system has access to a much larger virtual address space than a process running under a 32-bit operating system. However, compared to a 32-bit Java process performing equivalent work, a 64-bit Java process also requires a larger Java heap, typically at least 50 percent larger. For some background on 32-bit versus 64-bit memory footprints, see this research paper.

Recommended Java heap settings

This section describes recommended Java heap sizes for Application Servers running under different operating systems. These recommendations must be adjusted in light of observed conditions, especially out-of-memory errors.

For 32-bit processes, BMC recommends operating system-specific Java heap size values according to the following table. Due to memory constraints, job servers using 32-bit processes should be configured to use no more than 50 work item threads.

For 64-bit processes, BMC recommends that the Java heap size be increased at least to the value indicated in the table, if there is sufficient physical memory to support this setting.

The following table shows recommended maximum heap sizes for various operating systems:

Operating system       Max Java heap (32-bit)    Max Java heap (64-bit)
Microsoft Windows      1024 MB                   6144 MB
                       1536 MB                   6144 MB
Oracle Solaris         2048 MB                   Not applicable

The preceding values are set through the blasadmin utility. The relevant blasadmin setting specifies the maximum heap size for this Application Server.

About thread pools

An Application Server maintains several thread pools, each dedicated to a specific purpose. Selecting appropriate sizes for each of the various thread pools is one of the most important configuration choices for an Application Server.

Each thread consumes resources, especially memory, even when idle. Java threads might also consume operating system resources such as thread handles. While it is running, a thread consumes even more memory. Regardless of additional performance considerations, available process size limits the number of threads available in an Application Server.

Threads within the same process share certain data structures, especially caches, which are not shared between threads in different processes. Increasing the number of threads within a single process, especially threads within a particular thread pool, has the following consequences:

  • Serendipity: As more threads contribute to the process-wide caches, any given item request from any thread is more likely to be fulfilled from the cache because another thread is more likely to have already placed the element in the cache. This phenomenon has a mildly positive effect on overall performance as the number of threads increases.
  • Contention: Because some operations on some data structures require exclusive access, as the number of threads increases, there is a greater likelihood of one thread having to wait for another thread's exclusive access to conclude. This effect degrades per-thread performance as the number of threads increases; that is, increasing the number of threads is subject to diminishing returns, sometimes sharply so.

As the number of threads in a process grows, the negative contention effects grow more rapidly than do the positive serendipity effects. Doubling the number of threads in a pool improves performance but does not double it. Each additional thread provides an increasingly smaller net benefit while consuming as much memory and other resources as any other thread.
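The diminishing returns described above can be illustrated with an Amdahl's-law-style model: if some fraction of each operation requires exclusive access to shared data structures, adding threads yields progressively smaller gains. The 10 percent serial fraction below is an assumed example value, not a measured Application Server figure.

```python
# Minimal model of the contention effect: a "serial fraction" of each
# operation needs exclusive access to shared structures, so throughput
# does not scale linearly with thread count (Amdahl's law). The 10%
# serial fraction is an assumed example value.

def throughput(threads: int, serial_fraction: float = 0.10) -> float:
    """Relative throughput compared to a single thread."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / threads)

for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:3d} threads -> {throughput(n):5.2f}x")
```

Note that doubling the thread count never doubles throughput in this model, and each doubling buys less than the previous one, matching the behavior described above.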

About database connections

Connections between an Application Server and the database are managed in three connection pools, each devoted to a different purpose: a pool for jobs, a pool for clients, and a pool for other general connections. Each connection pool allows you to configure a minimum and maximum number of connections, although BMC recommends leaving the minimum value at zero for all connection pools.

Configuring a database pool's maximum size to be too high wastes resources, and for large installations, might risk exceeding the total capacity of the database server. You must also ensure that the database server has sufficient capacity to service all the connections from all the connection pools for all the Application Servers in the environment. BMC recommends working with the database administrator and database vendor to ensure that you have this capacity, particularly for very large installations.

Conversely, configuring a database pool with a maximum size that is too low can degrade performance because a thread requesting a database connection from an empty connection pool is blocked until a connection becomes available.
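The blocking behavior described above can be sketched with a bounded pool built on Python's `queue.Queue`. The class and names here are illustrative only, not BMC Server Automation APIs: a request against an exhausted pool waits (or times out) instead of succeeding.

```python
# Sketch of a bounded connection pool: a thread requesting a connection
# from an empty pool blocks until another thread returns one. All names
# here are illustrative, not BMC Server Automation APIs.

import queue

class ConnectionPool:
    def __init__(self, max_size: int):
        self._pool = queue.Queue(maxsize=max_size)
        for i in range(max_size):
            self._pool.put(f"conn-{i}")   # stand-in for a real connection

    def acquire(self, timeout=None):
        # Blocks while the pool is empty, like a thread waiting on an
        # exhausted database connection pool.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        self._pool.put(conn)

pool = ConnectionPool(max_size=1)
conn = pool.acquire()

# A second request against the now-empty pool times out instead of
# succeeding, illustrating the degraded performance described above.
try:
    pool.acquire(timeout=0.1)
    blocked = False
except queue.Empty:
    blocked = True

pool.release(conn)
print("second request blocked:", blocked)
```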


Settings for job-related database connections are actually divided across two job connection pools, with 75% of the value that you set applied to a primary pool and 25% to a secondary pool that is reserved for nested jobs. For example, if you set the maximum number of job-related connections to a value of 100, the primary pool has 75 connections and the secondary pool has 25 connections.

This means that you might run out of job-related connections sooner than expected. Although you set the maximum number of connections to a value of 100, you might receive an error that no more connections are available when all 75 connections from the primary pool are being used, even though not all 25 connections in the secondary pool are being used.
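The 75/25 split described above is simple arithmetic; the following sketch makes it explicit. The function name is illustrative, but the ratios and the example value of 100 come from this topic.

```python
# The 75% / 25% split of job-related connections described above.
# The function name is illustrative; the ratios come from this topic.

def job_pool_split(max_connections: int) -> tuple[int, int]:
    primary = max_connections * 75 // 100   # primary job pool
    secondary = max_connections - primary   # reserved for nested jobs
    return primary, secondary

print(job_pool_split(100))   # (75, 25)
```

With a configured maximum of 100, only the 75 primary-pool connections are available to ordinary jobs, which is why "no more connections" errors can occur before the configured maximum is reached.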
