Several areas of the compiler optimization literature bear on this work. Well-known techniques for optimizing compilation use static analysis based on abstract interpretation. The most important work in this field is the extensive work of Patrick and Radhia Cousot, together with the derivative and related work of Samson Abramsky, Chris Hankin, Alan Mycroft, and Flemming Nielson. This body of work is primarily foundational in nature and does not emphasize the application of the techniques in a practical setting, e.g., an optimizing C compiler. Among applications of abstract interpretation, the most closely related is the work on strictness analysis, a technique in which functions are analyzed to discover when they are strict in (dependent upon) their arguments, in the sense that a non-terminating argument computation implies non-termination of the function as a whole. This notion of strictness is closely related to the definition of Boolean operations on undefined values, in that it provides for defined results in the presence of undefined inputs. However, the setting of strictness analysis is quite different, because it pertains to functional programs whose values are higher-order types in the lambda calculus, where non-termination is a routine feature of computation.
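To make the idea of strictness analysis concrete, the following is a minimal sketch (not any published algorithm) of an abstract interpretation over the two-point domain, for a hypothetical tiny expression language: the abstract value 0 means "certainly undefined (non-terminating)" and 1 means "possibly defined". A function is strict in a parameter if binding that parameter to the abstract bottom forces the whole body to bottom.

```python
# Strictness analysis sketch over a two-point abstract domain.
# Abstract values: 0 = certainly undefined, 1 = possibly defined.
# The expression language here is hypothetical, chosen for illustration.

def analyse(expr, env):
    """Return the abstract value of expr under the abstract environment."""
    kind = expr[0]
    if kind == "var":
        return env[expr[1]]
    if kind == "const":
        return 1
    if kind == "plus":            # addition is strict in both operands
        return min(analyse(expr[1], env), analyse(expr[2], env))
    if kind == "if":              # a conditional is strict in its test only;
        c = analyse(expr[1], env) # either branch might produce the result
        return min(c, max(analyse(expr[2], env), analyse(expr[3], env)))
    raise ValueError(f"unknown expression kind: {kind}")

def is_strict_in(body, param, params):
    """f is strict in param if an undefined argument forces an undefined result."""
    env = {p: 1 for p in params}
    env[param] = 0                # pass abstract bottom for the tested argument
    return analyse(body, env) == 0

# f(x, y) = if x then y else 1, modelled abstractly:
body = ("if", ("var", "x"), ("var", "y"), ("const", 1))
print(is_strict_in(body, "x", ["x", "y"]))  # True: the test is always evaluated
print(is_strict_in(body, "y", ["x", "y"]))  # False: y is needed on one branch only
```

The analysis is safe rather than exact: it reports strictness only when every execution path demands the argument, which is precisely the property a compiler needs before it can evaluate an argument eagerly.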
Another relevant area is constant propagation. In addition to the classic results, based on dataflow analysis, outlined in compiler texts such as Aho and Ullman, and Muchnick, Wegman and Zadeck have developed a conditional constant propagation algorithm, and several refinements of their technique have been reported. Various algorithms based on value numbering are likewise concerned with the compile-time analysis of values.
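The classic dataflow formulation can be illustrated with a minimal sketch of the standard constant-propagation lattice (this is the simple textbook lattice, not Wegman and Zadeck's full sparse conditional algorithm): each variable is TOP (as yet undetermined), a known constant, or BOTTOM (provably not constant), and values from merging control-flow paths are combined with a meet operator.

```python
# Sketch of the classic constant-propagation lattice: TOP / constant / BOTTOM,
# with a meet applied where control-flow paths merge. The three-address
# statement format used here is hypothetical, chosen for illustration.

TOP, BOTTOM = "TOP", "BOTTOM"

def meet(a, b):
    """Lattice meet: agreement keeps the constant, disagreement loses it."""
    if a == TOP:
        return b
    if b == TOP:
        return a
    return a if a == b else BOTTOM

def transfer(stmts, env):
    """Abstractly execute straight-line code, folding constants where possible."""
    env = dict(env)
    for dst, op, x, y in stmts:
        if op == "const":
            env[dst] = x
        elif op == "add":
            a, b = env.get(x, TOP), env.get(y, TOP)
            if isinstance(a, int) and isinstance(b, int):
                env[dst] = a + b          # both operands known: fold the add
            elif BOTTOM in (a, b):
                env[dst] = BOTTOM         # a non-constant operand poisons the result
            else:
                env[dst] = TOP            # still optimistic about an unseen path
    return env

# Two branches that happen to assign the same constant to x:
then_env = transfer([("x", "const", 4, None)], {})
else_env = transfer([("x", "const", 4, None)], {})
merged = {"x": meet(then_env["x"], else_env["x"])}
print(merged["x"])                        # 4: the paths agree, x stays constant
print(transfer([("y", "add", "x", "x")], merged)["y"])  # 8: folded after the join
```

The optimistic initial value TOP is what distinguishes this family of algorithms from a naive pessimistic pass: a variable is assumed constant until some path proves otherwise, which lets constants propagate through joins that a pessimistic analysis would give up on.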