Traditionally, personal and corporate data security functions are implemented in the form of add-on software modules on top of a hardware architecture essentially identical to consumer-grade personal computers, which are designed for affordability. Sometimes security-specific add-on hardware modules are also implemented, e.g. for the purpose of authenticating the user more securely (e.g. smart cards, biometrics). But even in these cases the bulk of the security functionality is implemented traditionally as add-on software components that are sometimes integrated into the operating system but mostly reside in memory and are executed just like any other software application.
A significant problem with this traditional approach is that security functionality implemented in software may be compromised in a number of different ways. In the normal course of operating a computer, the user occasionally adds or modifies software components; it is this ability to add and replace software components that gives the general-purpose computing architecture its flexibility and usability in a wide variety of tasks and assignments. It is this same ability to modify or add software modules that opens a window of opportunity for an attacker to compromise the security of the computer system.
When a new software component is introduced, there is a risk that it includes functionality intended to effect an attack, or that it includes a programming error that could be exploited externally to facilitate an attack. Moreover, because security software is distributed and installed in the same manner as application software, it is vulnerable to the same risks.
In a traditional general-purpose computer, the entire random-access memory (RAM) is organized in a single large uniform bank that can be physically accessed by the processor, or by all processors if the system contains a plurality of processors. The uniformity of memory access provides the greatest flexibility in the usage of RAM, which is one of the critical resources in the computer, and leads to the most efficient utilization of RAM by the operating system and application software. While cost-effective, the uniform RAM architecture also means that programs running concurrently can access each other's memory regions, or the memory occupied by the operating system or its components. As such, this feature of the uniform RAM architecture has been the most frequently used vehicle for compromising a computer's security.
Modern computer systems also employ a mechanism called “virtual memory”, in which a hardware component embedded in the processor, called a memory management unit (“MMU”), performs memory address translation. Virtual memory allows the RAM to be partitioned into sections, each dedicated to a certain software component or a group thereof, and prevents inadvertent access to memory that belongs to a different software component or to the operating system. The virtual memory mechanism has proven quite effective at preventing erroneous software behavior from impacting the stability of the system as a whole, but it was not intended to prevent malicious sabotage, and every operating system includes a documented mechanism, intended for diagnostic purposes, that circumvents the protections furnished by the MMU. These mechanisms are often exploited to compromise the security of the computer and the data contained therein.
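The partitioning described above can be illustrated with a toy sketch of per-process address translation. This is not any particular MMU design (real MMUs implement multi-level page tables in hardware); the class and page size below are illustrative assumptions:

```python
# Toy sketch of MMU-style address translation (illustrative only; real MMUs
# implement this in hardware with multi-level page tables and access bits).

PAGE_SIZE = 4096

class ToyMMU:
    def __init__(self):
        # One page table per process: virtual page number -> physical page number.
        self.page_tables = {}

    def map_page(self, pid, vpn, ppn):
        self.page_tables.setdefault(pid, {})[vpn] = ppn

    def translate(self, pid, vaddr):
        vpn, offset = divmod(vaddr, PAGE_SIZE)
        table = self.page_tables.get(pid, {})
        if vpn not in table:
            # A real MMU raises a page fault here; code simply cannot
            # reach physical RAM it has no mapping for.
            raise MemoryError(f"page fault: pid {pid} has no mapping for page {vpn}")
        return table[vpn] * PAGE_SIZE + offset

mmu = ToyMMU()
mmu.map_page(pid=1, vpn=0, ppn=7)   # process 1 owns physical page 7
assert mmu.translate(1, 100) == 7 * PAGE_SIZE + 100
# Process 2 has no mapping for the same virtual address, so its access faults:
try:
    mmu.translate(2, 100)
except MemoryError:
    pass
```

The point of the sketch is that isolation comes from each process having its own translation table; the diagnostic mechanisms mentioned above work precisely by installing mappings into another process's table.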
In one conventional approach to achieving an elevated level of security, some portion of the security mechanism is implemented in a separate and dedicated hardware module, which is designed with additional tamper-resistant features and thereby raises the level of difficulty for a potential intruder. Perhaps one of the earliest non-classified examples of hardware-enhanced computer security was the IBM HSM (Hardware Security Module), a small stand-alone computer with its own memory and storage subsystem, built into a rugged enclosure designed similarly to an office safe. The Personal Identification Numbers (PINs) of bank cards were stored in the HSM such that even bank employees did not have access to these codes in clear-text form. When an automated teller machine needed to verify the identity of a card holder, a cryptographic challenge-response sequence was initiated such that the PIN was never transmitted verbatim over the communication links, and the HSM performed the verification process securely.
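A minimal sketch of a challenge-response check in the spirit of the HSM example follows. This is not the actual banking protocol; it is a generic HMAC-based illustration (the function names are invented for this sketch) of how a secret can be verified without ever crossing the communication link:

```python
import hashlib
import hmac
import os

# Illustrative challenge-response sketch: the secret PIN stays inside the
# verifier (the "HSM"); only a random challenge and a keyed digest travel
# over the link, so eavesdropping never reveals the PIN itself.

def make_challenge() -> bytes:
    return os.urandom(16)

def card_response(pin: bytes, challenge: bytes) -> bytes:
    # Computed on the card/terminal side; the PIN is never transmitted.
    return hmac.new(pin, challenge, hashlib.sha256).digest()

def hsm_verify(stored_pin: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(stored_pin, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

chal = make_challenge()
resp = card_response(b"1234", chal)
assert hsm_verify(b"1234", chal, resp)      # correct PIN verifies
assert not hsm_verify(b"9999", chal, resp)  # wrong PIN is rejected
```

Because the challenge is random per transaction, a captured response cannot be replayed against a later challenge.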
The smart-card user-authentication mechanism of the global standard cellular phone system (based on GSM) works similarly, except that the hardware security module is miniaturized to the size of a fingernail and each user is furnished with such a device, the Subscriber Identity Module (SIM) card. The SIM card's construction makes it difficult to disassemble without damaging the embedded memory chip.
Another conventional approach is the Trusted Platform Module (TPM) that is built into some personal computers presently manufactured. The TPM is somewhat similar to a SIM card in that it is a small memory chip with restricted access, containing security-related identification information and encryption keys. The pivotal idea of the TPM is to prevent an attacker from modifying this identification information to falsely identify the computer or its user and thus circumvent the security mechanisms present elsewhere in the system. Its downside, however, is that the keys and numbers contained in the TPM are just one part of the protection, while the remaining parts are implemented traditionally in the operating system and application software components. Thus the TPM does provide an additional layer of protection, making it impossible for an unauthorized user to modify certain key security-related information tokens. However, the TPM leaves significant vulnerabilities in the other parts of the system software and its communications that can be exploited for a successful attack.
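The integrity-protection idea behind the TPM can be sketched with a simplified version of its measurement "extend" operation. This is a hash-chain illustration, not the actual TPM command set or register layout: each component's hash is folded into a running measurement, so a modified component unavoidably yields a different final value:

```python
import hashlib

# Simplified sketch of a TPM-style measurement register: each measured
# component is hashed and folded into the running value, so the final
# measurement commits to every component in order.

def extend(register: bytes, component: bytes) -> bytes:
    return hashlib.sha256(register + hashlib.sha256(component).digest()).digest()

clean = b"\x00" * 32
for comp in (b"bootloader", b"kernel", b"drivers"):
    clean = extend(clean, comp)

tampered = b"\x00" * 32
for comp in (b"bootloader", b"kernel-modified", b"drivers"):
    tampered = extend(tampered, comp)

assert clean != tampered  # any change to a measured component is detectable
```

The sketch also makes the limitation above concrete: the register proves what was measured, but the software that acts on that measurement still runs outside the tamper-resistant boundary.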
Accordingly, a need remains for improved approaches to computer system security.