Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Generally, entropy coders may refer to technologies adapted to map a string of symbols with statistical redundancy into a more efficient representation by removing the statistical redundancy. An illustrative entropy coder may map a binary substring of “0000” to a substring of “0”, a binary substring of “0” to a substring of “10”, and a binary substring of “1” to a substring of “11”. For example, this entropy coder may map a binary string of “0000001000000001” to a compressed representation of “01010110011”. Thus, this entropy coder can remove statistical redundancy in binary strings containing substantially more zero values than one values.
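The illustrative mapping above can be sketched as a greedy prefix-code encoder and decoder. This is a minimal, non-authoritative illustration of the example mapping described here; the function names `encode` and `decode` are assumptions for illustration only:

```python
def encode(bits: str) -> str:
    """Greedily map "0000" -> "0", "0" -> "10", "1" -> "11"."""
    out = []
    i = 0
    while i < len(bits):
        if bits.startswith("0000", i):
            out.append("0")   # a run of four zeros compresses to one bit
            i += 4
        elif bits[i] == "0":
            out.append("10")  # a lone zero expands to two bits
            i += 1
        else:
            out.append("11")  # a one expands to two bits
            i += 1
    return "".join(out)


def decode(code: str) -> str:
    """Invert encode(); the codewords "0", "10", "11" are prefix-free,
    so the compressed string can be parsed unambiguously left to right."""
    out = []
    i = 0
    while i < len(code):
        if code[i] == "0":
            out.append("0000")
            i += 1
        elif code[i : i + 2] == "10":
            out.append("0")
            i += 2
        else:  # "11"
            out.append("1")
            i += 2
    return "".join(out)
```

As in the example, `encode("0000001000000001")` yields `"01010110011"`, an 11-bit representation of the original 16-bit string, and decoding recovers the input exactly. Note that the savings depend on the input statistics: strings dominated by zeros compress, while strings with many ones would expand under this mapping.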
Conventional entropy coders may have higher computational complexities. In one example, some conventional entropy coders, such as arithmetic coders, may utilize floating point operations. In another example, other conventional entropy coders, such as Huffman coders, may utilize large mapping tables. These higher computational complexities may be unsuitable for implementations that might benefit from data compression but also desire or demand the lower costs or faster processing speeds afforded by lower computational complexities.