One byproduct of the "information age" is the huge amount of data which is stored in various storage media and transmitted over various transmission media. In order to reduce the amount of storage media required, to reduce the time required to retrieve data, and to reduce required transmission times and/or bandwidths, it has been common practice for some years to apply some form of compression to the raw or clear data before it is stored or transmitted. Depending on the nature of the data, the acceptable computation penalty and other factors, compression ratios in excess of two to one can be achieved with relatively simple systems, with far higher compression ratios being available from more sophisticated compression techniques, such as where two or more compression techniques are chained. For example, when text data is to be transmitted, a run-length encoding (RLE) technique may be utilized to eliminate, or reduce the transmission bandwidth required for, the white space around the actual text, and the text may then be further compressed using a compression algorithm such as Huffman encoding, Lempel-Ziv (LZ) encoding, one of the many variations on LZ encoding such as Lempel-Ziv-Welch (LZW), or a combination of two or more such techniques. When the data is retrieved from memory, or at the receiving end of a transmission, the data may be decompressed for utilization.
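The run-length stage described above can be sketched as follows. This is a hedged illustration only: the escape byte, count format, and run-length threshold are arbitrary choices made for this example, not part of any standard RLE scheme or of the system described herein.

```python
import re

def rle_spaces(text):
    """Toy RLE: replace each run of 3-255 spaces with an escape byte
    ('\x00') followed by a one-byte run count."""
    return re.sub(
        r" {3,255}",
        lambda m: "\x00" + chr(len(m.group(0))),
        text,
    )

def rle_decode(encoded):
    """Invert rle_spaces: expand escape byte + count back into spaces."""
    return re.sub(
        "\x00(.)",
        lambda m: " " * ord(m.group(1)),
        encoded,
        flags=re.S,  # count byte may be any value, including '\n' (10)
    )

line = "Name" + " " * 20 + "Value"
packed = rle_spaces(line)
assert rle_decode(packed) == line
assert len(packed) < len(line)  # the 20-space run shrank to 2 bytes
```

The RLE output would then typically be handed to a second-stage compressor such as a Huffman or LZ-family encoder, as the passage above notes.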
Another problem with the huge quantity of data currently available, particularly where the computer systems storing or utilizing the data are networked, is that the data may be, and frequently is, surreptitiously observed or obtained by unauthorized people or organizations. Where the data is stored or transmitted in compressed form, the information obtained by unauthorized access to memory or transmission media cannot be utilized in the form obtained; however, compression algorithms, which are usually publicly available or specified in advance, do not thereby provide security for the data. Even if a compression algorithm were not known, the compressed data would not be secure, since compression operates on redundancy, and the detection and analysis of redundant information is the very basis of cryptographic code breaking. Therefore, compression alone, regardless of its degree of sophistication, is not much of a challenge for experienced cryptanalysts to decipher.
Therefore, it is desirable that valuable or sensitive information which is to be stored or transmitted be stored or transmitted in encrypted form. However, both encryption and compression are time and computer cycle intensive. The independent, sequential performance of compression and encryption as separate operations on clear data before storage or transmission, and the reversal of these processes to permit utilization of the data, therefore places an added burden on the data processing system performing these functions, which may significantly increase the response time of the system to service requests and/or require the use of more powerful, and therefore more expensive, processing equipment. It would therefore be desirable if encryption and compression could be integrated so as to be automatically performed together as a single concryption operation, the term "concryption" being sometimes used hereinafter to refer to the integrated performance of compression and encryption on data, with a performance penalty for the combined operation that is closer to that of either technology performed separately than to that of the two technologies performed as separate, sequential functions.
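The sequential, two-pass approach whose cost motivates the integrated concryption operation can be sketched as follows. This is a baseline illustration, not the integrated operation itself: zlib stands in for the compressor, and a toy SHA-256 keystream XOR stands in for the encryptor (chosen only so the example is self-contained; it is not a vetted cipher and should not be used for real security).

```python
import hashlib
import zlib

def keystream(key, length):
    """Derive a byte stream from key by chained SHA-256 hashing.
    Toy construction for illustration only -- not a real cipher."""
    out = bytearray()
    block = key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out.extend(block)
    return bytes(out[:length])

def compress_then_encrypt(clear, key):
    """First pass: compress. Second pass: encrypt the compressed bytes.
    Two full traversals of the data -- the cost the text describes."""
    compressed = zlib.compress(clear)
    return bytes(a ^ b
                 for a, b in zip(compressed, keystream(key, len(compressed))))

def decrypt_then_decompress(sealed, key):
    """Reverse the two passes to recover the clear data."""
    compressed = bytes(a ^ b
                       for a, b in zip(sealed, keystream(key, len(sealed))))
    return zlib.decompress(compressed)

data = b"sensitive report text " * 50
key = b"shared-secret"
sealed = compress_then_encrypt(data, key)
assert decrypt_then_decompress(sealed, key) == data
assert len(sealed) < len(data)  # compression occurred before encryption
```

Note that compression must precede encryption in this sequential scheme: encrypted output is effectively redundancy-free, so compressing it afterward would gain little. The integrated concryption operation is intended to avoid the two separate full passes shown here.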