Security problems arise during the use of software in computers whenever the host computer's architectural arrangement permits applications programs to be copied and/or altered. Pirates, whether they are "authorized" users or not, freely copy software for unauthorized sale and use. Software theft has become a multi-billion dollar illegal industry that is unstoppable by the prior art. Alteration of application programs by other computer programs also causes major computer security problems.
Rogue computer programs called "viruses" or "worms" alter software to produce unauthorized, undesirable, and often damaging effects. Such self-replicating, secretly-operating programs are most often transferred from a rogue-contaminated computer into a new host computer by authorized operators who do not realize that these programs have entered by means of diskettes, modems, or networks . . . and have attached themselves so as to lie hidden in unused areas of the host computer's data storage and active memory; integrated themselves into operating systems; and/or attached themselves to other host-stored applications programs. Once inside, a cleverly written rogue poses a continuing threat from within the host computer, and is capable of compromising the security of anything that passes through the infected computer to any other computer, since it is able to copy, alter, destroy, and/or scramble any information that is electrically accessible to any program operating in the host computer. As a result, rogue programs have been used successfully to circumvent security programs for espionage, sabotage, and extortion.
Copying and alteration are enabled by the basic architectural arrangement of prior art computers, which permits all host-run programs to have equal and unrestricted access to all of the host computer's resources, including: mass data storage devices; console I/O; inter-computer communications; computer peripherals; and any prior art security device attached to the host computer. Typically, a copy of an applications program stored on a mass data storage device is down-loaded into the RAM of the host computer. Once in RAM, that program copy is able to be altered and/or copied to any host resource, because the host resources are directly controlled by the command coding of the program operating in the host computer's memory, regardless of whether the program in operation is a well-behaved program or an insidious rogue program.
Mass data storage devices are an especially vulnerable resource, since host-loaded programs are able to command any information to be copied into RAM, altered, or eliminated . . . including copies of other applications programs. Computers are unable to determine the intent of a program, yet no means is provided to prevent rogue-infected applications programs from accessing information directly. As a result, any program operating in the host computer's memory is able to avoid information-protecting security software; run any other software while monitoring its operation; and alter, copy, or destroy any information (program or data) that is electrically accessible to programs having a different intent. Even the prior art security devices and their controlling host-run security software are subject to rogue attack, since they require the use of secure, dependable host-run programs to maintain security . . . programs that are able to be altered by other (possibly contaminated) host-run applications programs.
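The point above can be made concrete with a purely illustrative sketch (my own toy example, not part of the disclosed apparatus): once any code is executing in the host, it holds the same privileges over host resources as the program the operator intended to run. Here a directory of files stands in for a mass data storage device, and the function names are invented for the illustration.

```python
# Toy illustration: loaded code -- well-behaved or rogue -- has identical,
# unmediated access to host resources in a conventional architecture.

import os
import tempfile

def well_behaved(workdir):
    # Intended behavior: read only its own data file.
    with open(os.path.join(workdir, "data.txt")) as f:
        return f.read()

def rogue(workdir):
    # Nothing in the architecture stops loaded code from copying,
    # altering, or destroying any information it can reach.
    stolen = {}
    for name in os.listdir(workdir):
        path = os.path.join(workdir, name)
        with open(path) as f:
            stolen[name] = f.read()      # copy the information out
        with open(path, "w") as f:
            f.write("corrupted")         # alter/destroy it in place
    return stolen

workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "data.txt"), "w") as f:
    f.write("sensitive program text")

print(well_behaved(workdir))   # the program the operator meant to run
print(rogue(workdir))          # a rogue running with identical privileges
```

Nothing distinguishes the two callers at the architectural level; only the programmer's intent differs, and the computer cannot inspect intent.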
To prevent rogue activity, a special architecture is required, wherein the operating system in the host computer is electrically separated from potentially contaminating applications programs, which are run in an independent, isolated computer, so as to prevent direct access to the host computer's resources. However, the prior art does not provide such an arrangement. As a result, only secure, dependable, well-behaved programs are able to be used in computers needing security. This precludes using any even remotely suspect program, and it hampers the ability to test and upgrade software, severely limiting the ability to maintain adequate security.
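The isolation called for above can be sketched in miniature (a minimal illustration of the principle, assuming a request/response boundary of my own invention; the patent's apparatus achieves this electrically, not in software): the suspect application never touches host resources directly, and may only send requests to a host-controlled mediator that refuses anything outside a fixed policy.

```python
# Minimal sketch of mediated isolation: the application sees only a narrow
# request channel; the host side enforces policy at the boundary.

ALLOWED = {"read_console", "write_console"}   # assumed policy for the sketch

class Mediator:
    """Stands in for the electrically separate host side."""
    def __init__(self):
        self.console = []

    def handle(self, request, payload=None):
        if request not in ALLOWED:
            return ("denied", request)    # e.g. "read_disk" never reaches a disk
        if request == "write_console":
            self.console.append(payload)
            return ("ok", None)
        return ("ok", "user input")       # canned console read for the sketch

def application(send):
    # The isolated (possibly rogue) application sees only the `send` channel.
    send("write_console", "hello")
    return [send("read_disk"), send("read_console")]

m = Mediator()
results = application(m.handle)
print(results)   # the "read_disk" request is stopped at the boundary
```

The rogue request fails at the boundary rather than inside the application, so even a contaminated program cannot reach host resources it was never granted.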
No provision is made in the prior art to run suspect programs in an isolated architecture. There are no provisions for up-loading a suspect program into the security device itself without compromising security. There are no provisions for physical distribution of applications software within a protected architecture and apparatus, nor does the prior art permit the actual operation of applications programs within the distribution means so as to eliminate any need for down-loading software.
A rogue program hidden WITHIN A PRIOR ART SECURITY DEVICE that is able to down-load information into the host computer, information which, in turn, is able to become a part of host-run programming code, is easily able to compromise the information contained within the host computer. Such a security device must be manufactured by a friendly source, and once wired in, it must remain a permanent part of the host computer. Prior security devices have no provisions to protect against replacement with an unfriendly "secure" program. As a result, the host computer is not protected from the security device, and the security device is not protected from the host computer.
Physical information security, that is, the ability to physically remove from a computer all existing copies of sensitive information and lock them up in a safe or keep them under guard, is rendered moot by the ability of host computers to make security-compromising copies of stored information . . . with or without the operator's knowledge. No means is provided for physical security whereby the only-existing-copy of an applications program is able to be physically removed from a host computer and kept in a safe until needed, because prior art devices are able to leave a security-compromising copy behind.
Processors, memories, inter-resource communications means, and component interconnections are security-sensitive. If security-sensitive components are physically accessible, unauthorized equipment is able to be attached to circumvent security measures. Sensitive information in prior security devices is not protected by dedicated security-sensitive components that are both electrically inaccessible and housed in a single sealed removable cartridge, so that sensitive information is protected no matter what kind of computer it is plugged into, or who plugs it in. Removal of a prior art security device from the host computer does not remove these security-sensitive components simultaneously with the applications program and other secret information, so as to enable physical information security to be effective.
Software and other transportable information is not protected from environmental factors that easily damage or destroy the transporting apparatus and, as a result, the information inside.
Prior art security methods that use removable cartridges typically use connectors between a host computer's data and address bus and the information-containing cartridges. Such plug-in cartridges often spark or arc upon insertion into or removal from their sockets. Such methods become unacceptable in certain hazardous environments where explosive gases or a high percentage of oxygen are present, and where a single spark is able to ignite a fire or explosion. Such environments include the use of computers at fuel depots, and in industrial environments where computers are becoming common.
Even ordinary environments are hazardous to conventional computing equipment. Diskettes and disk drives used for software distribution contain delicate mechanical and electrical parts that fail in the presence of dirt or moisture. The result is that the prior art does not permit such devices to be used in dirty, wet, chemical-filled, explosive, or other hazardous physical environments while simultaneously maintaining software security. If the information-containing hardware is damaged or destroyed, security has failed, because the secure information is rendered useless or inaccessible.
A related security problem arises during the use of computers which require security. Prior connectorless data communications methods, such as those used between some terminals and host computers, are subject to eavesdropping by nearby equipment when electromagnetic means are used for the transfer of information.
Solutions to the above problems are not provided in the prior art as indicated in the following examples.
U.S. Pat. No. 4,652,990 of Pailen et al. discloses a user access control method, wherein a portable processor and ROM cartridge called a Key is provided with a means for connecting the Key memory to a Key Carrier Computer Bus, which is connected to a microprocessor within a security unit called a Key Carrier, which is connected between a host computer and a terminal to prevent access to the programs within the host computer by a person using that particular terminal. Authorized users insert their Key into the open bus structure of the Key Carrier. The host computer and the security unit then exchange information so that the program in the host computer is able to determine if authentication has been achieved. If so, the applications programs are then permitted to be run within the host computer.
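The exchange described above can be sketched as a simple challenge-response authentication (an illustrative reconstruction only: the shared secret, the HMAC construction, and all names here are my assumptions; the patent does not specify this algorithm). The host issues a random challenge, the Key's portable processor answers with a keyed digest, and the host-run security program grants access only on a match.

```python
# Hedged sketch of a challenge-response exchange between a host-run
# security program and a portable Key processor (construction assumed).

import hmac
import hashlib
import os

SECRET = b"key-cartridge-secret"   # assumed shared secret for the sketch

def key_respond(challenge, secret=SECRET):
    # Runs inside the Key's portable processor.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def host_authenticate(respond):
    # Runs as the host's security program.
    challenge = os.urandom(16)
    expected = hmac.new(SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(respond(challenge), expected)

assert host_authenticate(key_respond)                    # genuine Key accepted
assert not host_authenticate(lambda c: b"\x00" * 32)     # forged Key rejected
```

Note that in this scheme the verifying code itself runs in the host, which is precisely the weakness the following paragraphs identify.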
Once an authorized user has been authenticated, he has access to applications programs which are down-loaded into the host computer. Since the Key's primary purpose is to determine the authorization of persons who are allowed to copy programs, no protection is provided against copying at all, even by "authorized users". No provision is made to make copying of applications programs unnecessary by containing them in permanent ROMs within the Key cartridge. The authorized user is, as a result, able to make as many copies of the applications programs as he wishes . . . which enables him to become a pirate.
The Key system lacks several features, and this prevents it from providing protection from rogue programs operating from within the host computer, and from copying by users, authorized or not. As is common in the prior art, the applications programs are located in the host computer, and the Key system is designed simply to prevent a user from accessing those programs. An electrical and architectural separation is not made between the security program running in the host computer and an isolated, dedicated computer for the applications program, so as to protect the host computer's resources. The applications program is not contained within the portable Key cartridge, which lacks a RAM that would permit an applications program to actually run inside the cartridge rather than inside the host computer. Instead, the Key system relies on a special Key controller program that must operate within the host computer. This program complements the program running in the Key Carrier. It is this host-run program that determines whether authorization has been verified by the security apparatus, and permits access to the actual applications program down-loaded into the same host computer.
Damaging viruses are generally introduced inadvertently from a virus-contaminated applications program being run by an authorized user. Modern applications programs are quite complex, and even expert programmers have great difficulty in determining for sure that a given program is virus free, let alone the average software user. Since the Key system leaves the applications programs, including the security-controlling program inside the host computer, such programs are just as subject to viral attack from a contaminated program operating in the host computer, under the Key system, as with the rest of the prior art.
A virus, once operating within the host computer, is able to attach itself to the Key controller program; record, duplicate, or simulate any of the communications between the security device and the host computer; or simply permit access by an unauthorized person on a separate terminal. Such a rogue program is able to extract security information from other applications programs and permit their use, effectively bypassing the security system imposed. Once the virus program has gained program control, all of the host computer's resources are available to it, unprotected by the Key system.
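Why host-run security checks fail in this way can be shown with a deliberately simplified sketch (the class and function names are invented; this is not the patent's mechanism): because the security check and the code it guards share one machine and one memory, a rogue with ordinary program privileges can simply replace the check in place.

```python
# Illustrative only: a rogue program patching a host-run security check.

class KeyController:
    """Stands in for the host-run Key controller program."""
    def authorized(self):
        return False            # no valid Key cartridge is present

def run_application(controller):
    if not controller.authorized():
        return "access denied"
    return "secret application output"

controller = KeyController()
assert run_application(controller) == "access denied"

# A rogue running in the same host overwrites the check -- no Key needed:
KeyController.authorized = lambda self: True
assert run_application(controller) == "secret application output"
```

The rogue never forges credentials or breaks the cryptography; it attacks the host-resident check directly, which is exactly the class of attack an electrically separate architecture is meant to foreclose.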
Since the Key cartridge does not prevent applications programs from being copied by an "authorized user", the applications program is unable to be maintained as the only-existing-copy of said program. If the Key cartridge is locked up in a safe when the program is not being used, a thief is still able to break in and steal the host computer with the applications program inside; dismantle the computer; access the stored information directly; and disassemble the security program to determine how to circumvent it. The thief also is able to steal the diskettes or other storage devices that said program has been copied onto. As a result, the Key system does not provide for the physical security of applications programs.
Additionally, the Key system is designed to cut off communications between a host computer and a terminal. Many modern computers have discarded terminals altogether in favor of an integrated video-keyboard-computer such as the IBM personal computer. The Key system requires a remote terminal in order to cut off user access, and is, as a result, not applicable to many of today's common computers.
U.S. Pat. No. 4,521,853 of Guttag discloses a method for protecting information contained in a memory which is on the same silicon chip as a microprocessor. Peripheral devices are prevented from accessing the on-chip memory through the common bus arrangement connecting the CPU with off-chip memory. This apparatus is designed to function as the main processor of the computer; it is wired in, rather than being an addition to host computers of various types. A standard bus arrangement is used that is not isolated from a host computer to prevent the addition of security-defeating equipment. Host resources are not protected from a rogue program being operated by this processor, as it is connected directly by its bus system to the host resources in the conventional manner. Rogue programs are able to gain entry into the host computer because the applications programs are run inside that same host computer, rather than within a secure cartridge and architecture.
Like the Key system referred to above, this arrangement does not provide for the physical security of applications software, nor does it provide a convenient and secure method for the distribution of software in a secure cartridge. The device is not designed to be removed from the computer and locked up in a safe at night, nor is the software protected from destruction by hostile environments.
U.S. Pat. No. 4,328,542 of Anastas et al. uses wired-in multiple processors that are designed for the implementation and secure operation of particular parallel programming algorithms. The type of security provided is to prevent interference between multiple applications run by multiple processors working on common data, even using common programs in common memory. This method was designed to operate using well-behaved, coordinated programs written for parallel processing. It does not provide security in the sense that a not-so-well-behaved rogue program is prevented from tampering with or copying information in RAM or on mass data storage devices. This example of the prior art does not address the problem of rogue-contaminated programs being down-loaded into the parallel architecture from non-secure mass storage devices. The method uses access authorization registers, an elaborate system of mating hardware, and a specialized software structure to verify the authorization of applications programs to be used in the computer itself, rather than providing a secure architecture with a separate processor and dedicated memory to run applications programs. Host resources remain accessible to all programs, including rogue programs that are able to gain entry by means of contaminated authorized programs.