The embodiments of the present invention relate to adaptively preventing out-of-memory conditions within a program executing on a computer. Modern computers can execute several programs concurrently. In general, each program can execute within its own “process.” To facilitate execution of multiple programs, many computers utilize virtual memory. Virtual memory refers to a technique in which an executing program is “tricked” into thinking, or behaving as if, the program has access to a contiguous range of addresses of working memory, e.g., random access memory (RAM). The actual memory used by the program, however, may be fragmented and may reside within RAM, within disk storage, or within a combination of both RAM and disk storage.
Virtual memory is apportioned on a per-process basis. As such, the virtual memory allotted to a particular process is independent of the virtual memory allotted to any other process currently executing within the computer. For example, many 32-bit operating systems provide two gigabytes of virtual memory to each process. Each individual operating system, however, may differ with respect to the specific implementation and management of virtual memory.
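The per-process apportionment described above can be observed directly on many systems. The following is a minimal illustrative sketch, assuming a POSIX operating system and Python's standard `resource` module (which is not available on Windows); it reports the address-space limit the operating system imposes on the current process.

```python
import resource

def virtual_memory_limit():
    """Return the (soft, hard) virtual address-space limit, in bytes,
    for the current process.  A value of resource.RLIM_INFINITY means
    the operating system imposes no explicit per-process cap."""
    return resource.getrlimit(resource.RLIMIT_AS)

soft, hard = virtual_memory_limit()
print("soft limit:", soft, "hard limit:", hard)
```

Because the limit is queried per process, two concurrently running processes may observe different values, consistent with the independence of per-process virtual memory noted above.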
For a running program to continue to function properly, the process in which the program executes must have virtual memory available for use. As the process reserves virtual memory, less is available. When the process within which the program executes runs out of virtual memory, the program executing within that process will likely become unstable or fail altogether.
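One way a program can avoid failing altogether when the process's virtual memory is exhausted is to guard each large reservation and back off on failure. The sketch below is a hypothetical illustration in Python, where an exhausted reservation surfaces as a `MemoryError`; the function name and fallback behavior are assumptions for illustration, not a prescribed implementation.

```python
def try_reserve(num_bytes):
    """Attempt to reserve num_bytes of working memory for this process.

    Returns the reserved buffer on success, or None when the process
    cannot obtain the requested virtual memory, allowing the caller
    to degrade gracefully rather than become unstable or fail."""
    try:
        return bytearray(num_bytes)
    except (MemoryError, OverflowError):
        # The process's available virtual memory could not satisfy
        # the request; signal failure instead of crashing.
        return None
```

A modest request (for example, one kilobyte) would normally succeed, whereas a request far beyond any plausible address space would return `None` instead of terminating the program.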