In the latter half of the twentieth century, there began a phenomenon known as the information revolution. While the information revolution is a historical development broader in scope than any one event or machine, no single device has come to represent the information revolution more than the digital electronic computer. The development of computer systems has surely been a revolution. Each year, computer systems grow faster, store more data, and provide more applications to their users.
A modern computer system typically comprises one or more central processing units (CPUs) and supporting hardware necessary to store, retrieve and transfer information, such as communication buses and memory. It also includes hardware necessary to communicate with the outside world, such as input/output controllers or storage controllers, and devices attached thereto such as keyboards, monitors, tape drives, disk drives, communication lines coupled to a network, etc. The CPU or CPUs are the heart of the system. They execute the instructions which form a computer program and direct the operation of the other system components.
From the standpoint of the computer's hardware, most systems operate in fundamentally the same manner. Processors are capable of performing a limited set of very simple operations, such as arithmetic, logical comparisons, and movement of data from one location to another. But each operation is performed very quickly. Sophisticated software at multiple levels directs a computer to perform massive numbers of these simple operations, enabling the computer to perform complex tasks. What is perceived by the user as a new or improved capability of a computer system is made possible by performing essentially the same set of very simple operations, but using software with enhanced function, along with faster hardware.
Almost all modern general purpose computer systems support some form of sharing of information with other computer systems, as via the Internet or some other network, and almost all large systems support multi-tasking on behalf of multiple users, in which multiple processes are simultaneously active, and computer system resources, such as processor and memory resources, are allocated among the different processes. Often, the actual users are physically remote from the computer system itself, communicating with it across a network. In the case of the Internet, the actual users may communicate with the computer through multiple intermediate computer systems and routers, and be so remote that they are difficult to identify.
Making the capabilities of computer systems widely available provides great benefits, but there are also risks which must be addressed. In such an environment of multiple users, some of them remote, sharing resources on a computer system, and communicating with other computer systems which may similarly share resources, data security and integrity are a significant concern.
If the capabilities of a system are to be made widely available, it is impractical to vet all persons using the system's capabilities. It must be assumed that it will be possible for unscrupulous persons to use the system, and the system should therefore be designed so that those who use it cannot compromise its data integrity or access unauthorized data. Widely available systems therefore have various protection mechanisms, whereby the operations a user can perform are limited, data is isolated, and users are protected from one another. However, it is generally necessary to allow some individual or relatively small group of individuals greater access to the system for purposes of performing maintenance operations, administering system access by others, and so forth. Special access mechanisms exist for this purpose. Thus, an entire hierarchy of access mechanisms may exist for accessing different capabilities of a system by different classes of users.
In theory, these various access mechanisms for different users and associated security and protection measures protect the system and its data. However, these mechanisms are enormously complex. It is difficult to design systems of such complexity which are foolproof. Human ingenuity being what it is, unscrupulous persons all too often find ways to defeat the protection mechanisms. Those skilled in the art of computer security and data integrity therefore seek new and improved mechanisms for system protection. As these new and improved mechanisms are developed, interlopers likewise seek ways to thwart the improved protection mechanisms. Thus, an arms race of sorts exists between those who seek to protect computer systems and those who seek to defeat that protection, requiring continuing improvements to computer system protection mechanisms. Often, it is the anticipation of the security exposure which requires the greatest skill on the part of those who protect computer systems; the fix may be relatively straightforward once the exposure is understood and appreciated. It cannot be expected that any single new development will, once and for all, put an end to attempts to defeat computer system protection mechanisms, but any development which makes it more difficult for the interloper has potential value.
One form of protection mechanism is a mechanism which leaves evidence of a security or integrity breach. Such a mechanism does not directly prevent the interloper from accessing or altering unauthorized data, but leaves telltale evidence of tampering, so that remedial action may be taken and/or the interloper may be discovered and apprehended. Various types of such mechanisms have been known for centuries. For example, the wax-sealed envelope did not prevent anyone from opening and reading its contents, but provided visible evidence that the contents had been read. Analogous mechanisms exist on modern computer systems to provide evidence of unauthorized access.
In particular, it is sometimes desirable to detect whether some piece of data has been altered. The data could be, for example, an executable code module which performs some important low-level function, although it could be something other than executable code. Various low-level executable code modules are responsible for enforcing security and data integrity in a large system, but if these modules themselves can be altered, then of course the integrity of the system is compromised. Such an executable code module or other vital data can be protected by use of a digital signature. Typically, the digital signature is some data which is derived from the executable code module and/or its attributes according to a known function, and then encrypted using a private encryption key according to a public/private key algorithm. The known function used to derive the data to be encrypted into the signature is generally referred to as a hashing function, indicating that a relatively large number of bits is reduced to a smaller number of bits for convenience of encryption, although bit reduction is not strictly necessary; if the data being protected is sufficiently small, the protected data itself could be encrypted directly (i.e., the hashing function could be the identity function). Using a digital signature, it is possible to verify whether the protected data has been altered by (a) decrypting the digital signature with the public key, and (b) comparing the decrypted signature with the result of applying the known hashing function to the protected data to be verified. If the two are not identical, then the protected data has been altered. Because the digital signature of the protected data is produced with a private key, it is theoretically extremely difficult or impossible for an interloper to forge the digital signature, even if he can forge the protected data.
Thus, a digital signature mechanism provides a means for detecting alterations of protected data (while not necessarily preventing alteration of the protected data).
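The sign-and-verify procedure described above can be sketched in a few lines of Python. This is a minimal illustration only, assuming a textbook RSA key pair with toy parameters; the module name and key values are hypothetical, and a real system would use a vetted cryptographic library with keys of at least 2048 bits.

```python
import hashlib

# Toy RSA key pair -- illustrative only, NOT secure. Real systems use
# vetted crypto libraries and much larger keys.
p = 2147483647            # prime (2**31 - 1)
q = 2305843009213693951   # prime (2**61 - 1)
n = p * q                 # public modulus
e = 65537                 # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)       # private exponent (modular inverse; Python 3.8+)

def hash_to_int(data: bytes) -> int:
    # The "known hashing function": reduce the protected data to a
    # number small enough to encrypt directly.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data: bytes) -> int:
    # Hash the protected data, then encrypt the hash with the PRIVATE
    # exponent to produce the digital signature.
    return pow(hash_to_int(data), d, n)

def verify(data: bytes, signature: int) -> bool:
    # (a) Decrypt the signature with the PUBLIC exponent, and
    # (b) compare it with the hash of the data being verified.
    return pow(signature, e, n) == hash_to_int(data)

# Hypothetical protected data standing in for an executable code module.
module = b"low-level executable code module"
sig = sign(module)
assert verify(module, sig)                # unaltered data verifies
assert not verify(module + b"X", sig)     # altered data is detected
assert not verify(module, (sig + 1) % n)  # forged signature is detected
```

Note that, as the passage above observes, verification only *detects* tampering after the fact: an interloper who can overwrite the module can still do so, but without the private exponent he cannot produce a signature that matches the altered data.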
Although digital signatures provide one useful means for detecting alteration, the arms race continues, and it must be assumed that interlopers will attempt to circumvent this and other mechanisms for protecting system integrity. Continued secure and correct operation of computer systems requires constant vigilance, and in particular, requires that potential avenues of attack be anticipated in advance, and appropriate countermeasures taken. Anticipating such potential avenues of attack is a difficult and demanding art.