A content addressable memory (CAM) is a memory device that accelerates any application requiring fast searches of a database, list, or pattern, such as in database machines, image or voice recognition, or computer and communication networks. CAMs provide benefits over other memory search algorithms by simultaneously comparing the desired information (i.e., the input search data) against the entire list of pre-stored entries. As a result of their unique searching algorithm, CAM devices are frequently employed in network equipment, particularly routers and switches, computer systems and other devices that require rapid content searching.
In order to perform a memory search in the above-identified manner, CAMs are organized differently than other memory devices (e.g., random access memory (RAM), dynamic RAM (DRAM), etc.). For example, data is stored in a RAM in a particular location, called an address. During a memory access, the user supplies an address and either writes data into, or reads data back from, the specified address.
In a CAM, however, data is stored in locations in a somewhat random fashion. The locations can be selected by an address bus, or the data can be written into the first empty memory location. Every location has a status bit that keeps track of whether the location is storing valid information in it or is empty and available for writing.
Once information is stored in a memory location, it is found by comparing every bit in memory with data placed in a match detection circuit. When the content stored in the CAM memory location does not match the data placed in the match detection circuit, the CAM device returns a no match indication. When the content stored in the CAM memory location matches the data placed in the match detection circuit, the CAM device returns a match indication. In addition, the CAM may return the identification of the address location in which the desired data is stored. Thus, with a CAM, the user supplies the data and gets back the address if there is a match found in memory.
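The lookup behavior described above can be sketched in software as follows. This is an illustrative behavioral model only (the function and variable names are hypothetical, and a real CAM performs all comparisons in parallel in hardware rather than in a loop):

```python
# Hypothetical software model of a CAM lookup (illustrative only; real CAM
# hardware compares every entry simultaneously, not sequentially).
def cam_search(entries, search_word):
    """Return the addresses of every valid entry whose content matches.

    entries: list of (valid_bit, stored_word) tuples, indexed by address.
    The valid bit models the per-location status bit described above.
    """
    return [addr for addr, (valid, word) in enumerate(entries)
            if valid and word == search_word]

# Example memory: address 2 holds the searched-for word; address 1 holds
# matching data but its status bit marks the location as empty.
memory = [(1, 0b1010), (0, 0b0110), (1, 0b0110), (1, 0b1111)]
print(cam_search(memory, 0b0110))  # → [2]
```

Note the inversion relative to a RAM: the caller supplies data and receives back the address (or addresses) at which that data is stored, with invalid locations excluded by the status bit.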
Locally, CAMs perform an exclusive-NOR (XNOR) function, so that a match is indicated only if both the stored bit and the corresponding input bit are the same state. CAMs are designed so that any number of stored bits may be simultaneously detected for a match with the input bits in the match detection circuit. One way in which this is achieved is by coupling a plurality of storage devices and logic circuits to a common Matchline, as depicted in FIG. 1.
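The per-bit XNOR relationship can be expressed as a short sketch (illustrative model; function names are hypothetical):

```python
def bit_match(stored_bit, input_bit):
    # XNOR: returns 1 (match) only when both bits have the same state.
    return 1 if stored_bit == input_bit else 0

def word_match(stored_word, input_word, width):
    # A word matches only if every bit position matches, i.e., the XNOR
    # of every stored/input bit pair evaluates to 1.
    return all(bit_match((stored_word >> i) & 1, (input_word >> i) & 1)
               for i in range(width))
```

In the hardware of FIG. 1, this all-bits-must-match condition is realized implicitly: each cell can only pull the shared Matchline low, so the line remains high only when no cell detects a mismatch.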
Turning to FIG. 1, a schematic diagram of a conventional match detection circuit 175 is depicted. A source terminal of a precharge transistor 100 is coupled to a positive voltage source (e.g., VDD). The gate of transistor 100 is configured to receive a Precharge_N signal. A drain terminal of transistor 100 is coupled to a Matchline 185 for precharging the Matchline 185 to a predetermined voltage level (e.g., VDD).
Transistors 115, 125, 130 and 135 make up a flip-flop memory storage cell for storing a true logic state Q of a stored bit and a complementary logic state Q′ of the stored bit. Sources of transistors 115 and 130 are coupled to VDD and sources of transistors 125 and 135 are coupled to ground, thereby enabling the writing of a logic HIGH (e.g., “1”) or a logic LOW (e.g., “0”) into the flip-flop depending upon the data value driven on the bit line DBIT. As is known in the art, the flip-flop is accessed when both the word select line (WS) and the column select (DBIT) are simultaneously activated.
As for the comparison portion of match detection circuit 175, the gate of transistor 105 is coupled to Q and the gate of transistor 140 is coupled to Q′. Respective drain terminals of transistors 105 and 140 are coupled to the Matchline 185 and respective sources of transistors 105 and 140 are coupled to respective drains of transistors 110 and 150. Respective sources of transistors 110 and 150 are coupled to ground.
During a comparison operation, the Matchline 185 is precharged to VDD. Then the logic state of input bit MBIT is compared with the logic state of the stored bit Q. If the logic state of MBIT matches the logic state of Q, at least one transistor in each of the series-connected transistor pairs (i.e., 105 and 110, or 140 and 150) is inactive, and therefore, the Matchline remains at VDD, signifying that a matched bit is detected. In practice, many stored bits are simultaneously compared with many input bits, and if all input bits match their associated stored bits, the Matchline 185 remains at a logic HIGH level.
In practice, however, it is more likely than not that at least one bit of a string of input bits will not match its corresponding stored bit. In such a case, both transistors of at least one pair of series-connected transistors (i.e., 105 and 110, or 140 and 150) will be active and the Matchline 185 will be discharged from VDD to ground, thereby signifying that a mismatch was detected.
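The precharge/evaluate cycle just described can be modeled behaviorally as follows (an illustrative sketch of the FIG. 1 operation; the names and the two-level VDD/GND abstraction are hypothetical):

```python
VDD, GND = 1, 0  # abstract logic levels standing in for the supply rails

def matchline_state(stored_bits, input_bits):
    """Behavioral model of the Matchline 185 evaluate phase.

    The Matchline is first precharged to VDD. Any cell whose stored bit Q
    differs from its input bit MBIT turns on both transistors of one
    series-connected pull-down pair and discharges the line to ground.
    """
    matchline = VDD  # precharge phase
    for q, mbit in zip(stored_bits, input_bits):
        mismatch = (q != mbit)  # both series transistors conduct
        if mismatch:
            matchline = GND  # line discharged: mismatch detected
    return matchline
```

Because every cell shares the one Matchline, a single mismatching bit anywhere in the word is sufficient to discharge it; the line stays at VDD only for a full-word match.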
In the above-identified process, the searched data (i.e., the input bits) is simultaneously compared with every data word in the CAM in order to find a match between the stored data and the input data. Since the comparison operation is conducted simultaneously on the entire memory, and is typically repeated at a very high frequency, this operation consumes a significant amount of power.
Power dissipation, P, in complementary metal-oxide semiconductor (CMOS) circuits, such as that depicted in FIG. 1, is related to the magnitude of Matchline signal swing, V, the load capacitance, C, and the frequency of operation, F, as P = C·F·V². Since the magnitude of Matchline signal swing, V, for typical match detection circuits is from VDD to ground, the power dissipated by the circuit is exceedingly high. Therefore, it is desirable to find a way to reduce power dissipation of CAM match detection circuits while maintaining the same levels of accuracy.
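The quadratic dependence of power on signal swing is the key point: because V enters the expression squared, reducing the Matchline swing yields a disproportionate power saving. A short numeric illustration (the capacitance, frequency, and voltage values are arbitrary examples, not figures from this document):

```python
def matchline_power(c_farads, f_hz, v_swing):
    # Dynamic power dissipated charging and discharging the Matchline:
    # P = C * F * V^2
    return c_farads * f_hz * v_swing ** 2

# Example: with an assumed 50 fF Matchline at 500 MHz, cutting the swing
# from a full 1.2 V rail-to-rail transition down to 0.3 V reduces power
# by a factor of (1.2 / 0.3)^2 = 16.
p_full = matchline_power(50e-15, 500e6, 1.2)
p_low = matchline_power(50e-15, 500e6, 0.3)
```

This is why match detection schemes that limit Matchline swing, rather than discharging the line fully from VDD to ground, are attractive for reducing CAM power.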