Data compression is an extremely useful tool for storing and transmitting large amounts of data. For example, the time required to transmit an image, such as a facsimile transmission of a document, is reduced drastically when compression is used to decrease the number of bits required to recreate the image.
In some compression systems, an input file or set of data is translated into a sequence of decisions under the direction of a decision model. Each decision has an associated likelihood, and based on this likelihood, an output code is generated and appended to the compressed file. Such compression systems have three parts: a decision model, a probability estimation method, and a bit-stream generator. The decision model, typically referred to as a context model, receives the input data and translates it into a set of decisions which the compression system uses to encode the data. The probability estimation method is the procedure for developing an estimate of the likelihood of each decision. The bit-stream generator performs the final bit-stream encoding to produce the output code, which is the compressed data set or compressed file. Compression can effectively occur in either or both the decision model and the bit-stream generator.
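The three-part structure described above can be sketched as follows. All class and method names here are illustrative assumptions, not part of any standard coder, and the bit-stream generator is a deliberate placeholder that emits one bit per decision; a real bit-stream generator would use the probability estimate to represent likely decisions with fewer bits.

```python
# Hypothetical sketch of the three-part compression pipeline: decision
# (context) model, probability estimation method, and bit-stream generator.

class ContextModel:
    """Decision model: translates input bits into (context, decision) pairs.
    Here the context is simply the previous bit (an order-1 model)."""
    def decisions(self, bits):
        prev = 0
        for b in bits:
            yield prev, b          # (context, decision)
            prev = b

class ProbabilityEstimator:
    """Develops a per-context estimate of P(bit = 1) from past data."""
    def __init__(self):
        self.counts = {}           # context -> [count of 0s, count of 1s]
    def p_one(self, ctx):
        c0, c1 = self.counts.get(ctx, [1, 1])   # Laplace smoothing
        return c1 / (c0 + c1)
    def update(self, ctx, bit):
        c = self.counts.setdefault(ctx, [1, 1])
        c[bit] += 1

class BitstreamGenerator:
    """Placeholder bit generator: records each decision verbatim.
    A real coder would use the estimate p1 to emit fewer bits."""
    def __init__(self):
        self.out = []
    def encode(self, bit, p1):
        self.out.append(bit)

def compress(bits):
    model, est, gen = ContextModel(), ProbabilityEstimator(), BitstreamGenerator()
    for ctx, bit in model.decisions(bits):
        gen.encode(bit, est.p_one(ctx))
        est.update(ctx, bit)       # adapt the estimate after each decision
    return gen.out
```

Because the bit generator is a passthrough, the "compressed" output equals the input; the sketch shows only how the three parts interact.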
A binary coder is one type of coding system that encodes data as a sequence of binary decisions.
Finite state machine (FSM) coders are binary entropy coders that are well known in the art. An FSM-coder is a lossless, multi-context binary entropy coder. Finite state machines are used both for bit generation (creating a bitstream given bits with known or assumed probabilities) and for probability estimation (estimating probabilities based on past data from the same context). During encoding, the FSM-coder takes a series of bits with associated contexts and produces a coded bitstream that represents those bits with as little data as possible. During decoding, the FSM-coder takes the coded bitstream and the sequence of contexts and reproduces the original sequence of bits. One example of such a coder is described in U.S. Pat. No. 5,272,478, entitled "Method and Apparatus for Entropy Encoding", issued Dec. 21, 1993. See also U.S. Pat. No. 5,475,388, entitled "Method and Apparatus for Using Finite State Machines to Perform Channel Modulation and Error Correction and Entropy Coding", issued Dec. 12, 1995.
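The probability-estimation half of a finite state machine coder can be illustrated with a toy state table. The four states, their estimates, and their transitions below are invented for illustration only; they are not taken from the cited patents, and real coders use far larger tables. Each state carries an estimate for the less probable symbol (LPS) and transitions that react to observing the more probable symbol (MPS) or the LPS.

```python
# Toy finite-state probability estimator. The table is hypothetical:
# state -> (LPS probability estimate, next state on MPS, next state on LPS)
STATE_TABLE = {
    0: (0.45, 1, 0),   # near 50/50
    1: (0.30, 2, 0),
    2: (0.15, 3, 1),
    3: (0.05, 3, 2),   # highly skewed toward the MPS
}

class FSMEstimator:
    """Tracks one context: the current MPS value plus a table state."""
    def __init__(self):
        self.state = 0
        self.mps = 0
    def p_lps(self):
        return STATE_TABLE[self.state][0]
    def observe(self, bit):
        _, on_mps, on_lps = STATE_TABLE[self.state]
        if bit == self.mps:
            self.state = on_mps    # grow confidence in the MPS
        else:
            if self.state == 0:
                self.mps ^= 1      # swap MPS at the least confident state
            self.state = on_lps    # back off toward 50/50
```

A run of identical bits drives the estimator toward the most skewed state, so subsequent bits from the same context are treated as highly predictable; an unexpected bit backs the state off toward 50/50.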
Binary entropy coders can be used as the lossless coding portion of image compression systems. They approach the best possible compression by coding symbols whose probabilities exceed 50% with less than one bit each, on average, and by permitting contexts (and thus probability estimates) to change independently for each bit of the data to be compressed. Other binary entropy coders include the IBM Q-coder, the IBM/Mitsubishi QM-coder, and the ABS-coder, which is described in U.S. Pat. No. 5,381,145, entitled "Method and Apparatus for Parallel Encoding and Decoding of Data", issued Jan. 10, 1995, and U.S. Pat. No. 5,583,500, entitled "Method and Apparatus for Parallel Encoding and Decoding of Data", issued Jan. 10, 1996.
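The advantage of coding skewed binary decisions with less than one bit each can be quantified with the binary entropy function; the figures below are straightforward arithmetic, not values taken from the cited patents.

```python
import math

def binary_entropy(p):
    """Ideal average code length, in bits per symbol, for a binary
    source in which one symbol occurs with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A 50%-probable decision costs a full bit, but a 90%-probable decision
# ideally costs under half a bit on average -- the gain an entropy coder
# captures over storing one bit per decision.
```

For example, binary_entropy(0.9) is approximately 0.469 bits per symbol, so a stream of such decisions is ideally compressible to under half its raw size.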
The FSM-coder is relatively fast and simple to implement in software. The FSM-coder is currently used in reversible wavelet-based image compression systems that the corporate assignee of the present invention has proposed for standardization.