In the field of computer programming, compilers are well-known computer programs used to translate sets of program instructions written in one programming language (typically a "high-level" or human-readable computer language) into another (typically a "low-level" or machine-readable computer language). Generally, the process through which a compiler generates computer-executable code consists of three main stages. The first stage, also known as the frontend stage, performs the initial analysis (e.g., lexical, syntactic, and semantic) of the program instructions and generates an intermediate representation (IR) of the source code for further processing.
The next, or middle, stage performs optimizations on the resulting intermediate representation, typically simplifying control flow, eliminating dead or otherwise useless portions of the code, and discovering and propagating constant values. Often, the middle stage will generate and output a second IR for the third and final stage. The third and final stage, also known as the backend stage, generates the computer-readable assembly code and performs further optimizations and actions in preparation for code execution.
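The middle-stage transformations mentioned above can be illustrated with a minimal sketch. The three-address IR format, the `optimize` function, and the instruction names (`const`, `add`, `print`) below are all hypothetical, chosen only to show constant propagation, constant folding, and dead-code elimination working together; no real compiler's IR is implied.

```python
def optimize(ir):
    """Propagate constants, fold arithmetic, and drop unused assignments.

    Each instruction is a tuple (dest, op, args); `dest` is None for
    side-effecting operations such as "print".
    """
    consts = {}   # names currently known to hold a constant value
    folded = []
    for dest, op, args in ir:
        # Constant propagation: replace operands already known to be constant.
        args = [consts.get(a, a) for a in args]
        if op == "const":
            consts[dest] = args[0]
            folded.append((dest, "const", args))
        elif op == "add" and all(isinstance(a, int) for a in args):
            # Constant folding: both operands are known, so compute now.
            consts[dest] = args[0] + args[1]
            folded.append((dest, "const", [consts[dest]]))
        else:
            folded.append((dest, op, args))
    # Dead-code elimination: walking backwards, keep only instructions whose
    # result is used later, plus side-effecting ones (here, just "print").
    used, kept = set(), []
    for dest, op, args in reversed(folded):
        if op == "print" or dest in used:
            used.update(a for a in args if isinstance(a, str))
            kept.append((dest, op, args))
    return list(reversed(kept))

program = [
    ("x", "const", [2]),
    ("y", "const", [3]),
    ("z", "add", ["x", "y"]),       # folds to the constant 5
    ("unused", "add", ["x", "x"]),  # result never read: eliminated
    (None, "print", ["z"]),
]
```

On this toy program, every arithmetic result is compile-time computable, so the pass reduces the five instructions to a single `print` of the folded constant 5.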
Since modern computer programs often contain huge amounts of programmed instructions, optimization during compilation to reduce execution time has become a compelling interest. One type of optimization, known as inter-procedural optimization, involves analyzing the entirety of a program's source code, as opposed to limiting the analysis and resulting optimization to certain target regions or program constructs. Since a greater quantity of information is available for analysis (compared to targeted optimization techniques), the optimization as a whole can be more effective. However, for many programs, certain portions of the program's source code may use data (values) that are not known or available at compile time, and only become so at run time (execution). As such, static inter-procedural optimization of these programs may be less effective.