In single-ended parallel Input/Outputs (IOs) such as a Double Data Rate (DDR) transfer-mode bus, data bus inversion (DBI) is an increasingly popular coding scheme that reduces signaling power and simultaneous switching output (SSO) noise, thereby improving power and signal integrity. In an 8-bit DBI-encoded data bus, for example, the eight data bits are transmitted either as-is or inverted in each bit unit interval. An extra coding overhead bit (also referred to as the DBI bit) accompanies the transmitted data bits in each bit unit interval to inform the receiver whether an intentional bus-wide inversion took place. For example, when the coding overhead bit is set to a logical one, it may indicate that the data bits of the bus are inverted. The receiver receives the data bits and the overhead bit, and if the overhead bit indicates that the data bits are inverted, the receiver re-inverts the data bits to recover their proper values.
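The receiver-side step described above can be sketched as follows. This is a minimal illustration, assuming the 8-bit bus is modeled as an integer; the names are illustrative and not drawn from any particular interface specification:

```python
BUS_WIDTH = 8
BUS_MASK = (1 << BUS_WIDTH) - 1  # 0xFF: all eight data lanes

def dbi_decode(received, dbi_bit):
    """If the DBI bit signals a bus-wide inversion, re-invert the received
    word to recover the original data; otherwise pass it through as-is."""
    return (received ^ BUS_MASK) if dbi_bit else received
```

For example, if the transmitter sent 0x0F inverted (0xF0 on the wire, DBI bit set), `dbi_decode(0xF0, 1)` recovers 0x0F.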
One example conventional DBI scheme, referred to as DBI-DC, is used where the receiver end is resistively terminated. In DBI-DC, the inversion criterion is to invert the bits of the bus when more than half of the bits of the parallel data pattern are binary ones (in a ground-terminated interface), thereby limiting the maximum number of transmitted ones to half of the data bus width. This reduces the static current and power delivered by the parallel transmitting drivers in a given bit time interval.
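The DBI-DC inversion criterion can be sketched in Python, assuming a ground-terminated 8-bit bus modeled as an integer (the bus width and function names are illustrative):

```python
BUS_WIDTH = 8
BUS_MASK = (1 << BUS_WIDTH) - 1  # 0xFF for an 8-bit bus

def dbi_dc_encode(data):
    """DBI-DC: invert the word when more than half of its bits are ones,
    so the transmitted word never carries more than BUS_WIDTH // 2 ones.
    Returns (bits placed on the wire, DBI bit)."""
    ones = bin(data & BUS_MASK).count("1")
    if ones > BUS_WIDTH // 2:
        return data ^ BUS_MASK, 1  # inverted word, DBI bit set
    return data, 0                 # word sent as-is
```

For instance, a word with seven ones such as 0b11110111 is inverted to 0b00001000 (a single one) with the DBI bit set, capping the static drive current for that interval.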
Another example conventional DBI scheme is referred to as DBI-AC. In a conventional DBI-AC scheme the receiver end is unterminated to avoid static power consumption, and the inversion criterion is to invert the bus when more than half of the parallel data pattern would undergo transitions (zero-to-one or one-to-zero) in progressing from the current bit time interval to the next. For example, if a unit interval would otherwise cause five data bit transitions, the DBI logic inverts the bus so as to cause only three data bit transitions.
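Unlike DBI-DC, the DBI-AC criterion compares the next word against the bits currently on the wire rather than counting ones. A minimal sketch under the same assumptions (8-bit bus modeled as an integer; illustrative names):

```python
BUS_WIDTH = 8
BUS_MASK = (1 << BUS_WIDTH) - 1

def dbi_ac_encode(prev_wire, data):
    """DBI-AC: invert the next word when sending it as-is would toggle
    more than half of the data lanes relative to the bits currently on
    the wire. Returns (bits placed on the wire, DBI bit)."""
    transitions = bin((prev_wire ^ data) & BUS_MASK).count("1")
    if transitions > BUS_WIDTH // 2:
        return data ^ BUS_MASK, 1  # lanes that would have toggled now stay put
    return data, 0
```

This reproduces the example in the text: with 0b00000000 on the wire, the word 0b00011111 would cause five transitions, so it is sent inverted as 0b11100000, causing only three.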
This reduces the dynamic current and power consumed by the transmitting driver since data transitions consume power in an unterminated link. Moreover, a large number of bit transitions may cause a phenomenon referred to as “ground bounce” or “supply bounce,” where the voltage supplied to a gate may temporarily dip and cause jitter. Conventional DBI-AC attempts to reduce ground bounce and supply bounce by reducing the number of transitions in a bit time interval.
In conventional DBI-AC, where N is an even integer number of data bits on a parallel bus, the maximum number of transitions permitted per bit time interval is N/2+1 (where the “+1” accounts for a possible transition of the DBI bit itself). In an example where the bus width is N=8 plus one DBI bit, the maximum number of transitions in the transmitted bus is limited by DBI-AC to 8/2+1=5. There is still a need in the art for improved DBI-AC techniques.
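The N/2+1 bound can be checked empirically. The sketch below (using the same illustrative DBI-AC encoder assumed above) drives a random data stream through the encoder and counts per-interval toggles across the eight data lanes plus the DBI lane:

```python
import random

BUS_WIDTH = 8
BUS_MASK = (1 << BUS_WIDTH) - 1

def dbi_ac_encode(prev_wire, data):
    """Invert when more than half the data lanes would toggle."""
    t = bin((prev_wire ^ data) & BUS_MASK).count("1")
    return (data ^ BUS_MASK, 1) if t > BUS_WIDTH // 2 else (data, 0)

def worst_case_toggles(num_words=100_000, seed=0):
    """Worst per-interval toggle count over a random data stream,
    counting the DBI bit as one additional lane."""
    rng = random.Random(seed)
    prev_wire, prev_dbi, worst = 0, 0, 0
    for _ in range(num_words):
        wire, dbi = dbi_ac_encode(prev_wire, rng.randrange(1 << BUS_WIDTH))
        toggles = bin(prev_wire ^ wire).count("1") + (prev_dbi ^ dbi)
        worst = max(worst, toggles)
        prev_wire, prev_dbi = wire, dbi
    return worst
```

Over a long random stream the worst case stays at or below N/2+1 = 5 toggles per interval (and in practice reaches that bound, e.g. when exactly four data lanes toggle and the DBI bit also changes state).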