Neural networks are a class of electronic circuits that emulate higher-order brain functions such as memory, learning, and perception or recognition. Associative networks are one category of neural device; they sense an input event and output a pattern of signals identifying that event.
Associative networks generally take the form of a matrix comprising a set of horizontal lines which cross and contact a set of vertical lines. The horizontal lines simulate the function of axons in the cortex of the brain and are used as inputs. The vertical lines simulate the function of dendrites extending from neurons. Each vertical line terminates at a voltage summing device which acts to simulate the function of the neuron cell body. Examples of such associative networks are found in the pending applications entitled: "Semiconductor Cell For Neural Network Employing A Four-Quadrant Multiplier", Ser. No. 283,553, filed 12/09/88; "EXCLUSIVE-OR Cell For Neural Network And The Like", Ser. No. 309,247, filed 02/10/89; and "Neural Network Employing Leveled Summing Scheme With Blocked Array", Ser. No. 357,411, filed 05/26/89, now U.S. Pat. Nos. 4,950,917; 4,904,881; and 5,040,134, respectively, all of which are assigned to the assignee of the present application.
Within an associative network, neural synapses are simulated by circuit cells which provide electrical connection between the horizontal and vertical lines of the network. Each synapse provides a weighted electrical connection between an input and a voltage summing element, i.e., a neuron. These synapse cells may be either analog or digital in nature. Analog circuitry is often preferred over digital circuitry for neural networks because of its superior density and because neural networks generally do not require very high precision.
For an analog implementation, the weighted sum of input signals is usually computed by summing analog currents or charge packets. Examples of circuit devices useful as synapse cells in neural networks are described in the co-pending applications entitled: "Adaptive Synapse Cell Providing Both Excitatory And Inhibitory Connections In An Associative Network", Ser. No. 379,933, filed 07/13/89; and "Improved Synapse Cell Employing Dual Gate Transistor Structure", Ser. No. 419,685, filed 10/11/89, now U.S. Pat. Nos. 4,956,564 and 4,961,002, respectively.
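The weighted-sum operation performed by such a network can be sketched in software as follows. This is an illustrative model only; the function and variable names are hypothetical and do not appear in the referenced applications, and the arithmetic stands in for the analog current summing described above.

```python
def neuron_outputs(weights, inputs):
    """Model the summed signal at each vertical (dendrite) line.

    weights[i][j] is the synaptic weight connecting horizontal input
    line i to vertical summing line j; inputs[i] is the signal applied
    on horizontal line i.  Each neuron j accumulates the products
    weights[i][j] * inputs[i] over all inputs, emulating the analog
    summing of currents at the voltage summing device.
    """
    n_neurons = len(weights[0])
    return [sum(row[j] * x for row, x in zip(weights, inputs))
            for j in range(n_neurons)]

# Three horizontal input lines crossing two vertical summing lines.
weights = [[0.5, -0.2],
           [0.3,  0.8],
           [-0.1, 0.4]]
inputs = [1.0, 0.5, 2.0]
sums = neuron_outputs(weights, inputs)
```

For this three-input, two-neuron example, the first neuron sums 0.5·1.0 + 0.3·0.5 + (−0.1)·2.0 = 0.45.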
One of the major drawbacks associated with analog neural networks is that they are subject to inaccuracies resulting from external variations, chiefly variations in temperature and power supply. This dependence on power supply and temperature introduces small error or offset terms into the calculations being performed. A typical network cycles through a series of weight changes until the entire network converges to a certain pattern, which depends on the pattern of inputs applied. The presence of error terms interferes with this learning process, ultimately leading to slower convergence times. Therefore, what is needed is a means for reducing an analog neural network's sensitivity to power supply and temperature variations.
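The effect of such offset terms on learning can be illustrated with a toy simulation. The model below is a hypothetical sketch, not taken from the patent: a single weight is adjusted iteratively toward a target output, while each reading of the summed output carries a drifting offset standing in for thermal or supply variation. With no offset the residual error shrinks toward zero; with a drifting offset the error cannot settle below the magnitude of the drift.

```python
import math

def final_error(offset_amplitude, steps=200, target=1.0, rate=0.5):
    """Iteratively adjust a single weight toward `target`.

    On each step the measured output is corrupted by a slowly
    varying offset (modeled here as a sinusoid, standing in for
    temperature or supply drift).  Returns the residual error
    after `steps` weight updates.
    """
    w = 0.0
    err = target
    for t in range(steps):
        offset = offset_amplitude * math.sin(0.3 * t)  # drifting offset term
        err = target - (w + offset)   # error as measured through the offset
        w += rate * err               # weight update driven by corrupted error
    return abs(err)
```

Running `final_error(0.0)` yields an error that is negligibly small, whereas `final_error(0.1)` leaves a residual error on the order of the offset amplitude, illustrating how offset terms prevent clean convergence.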
As will be seen, the present invention greatly increases the computation accuracy of an analog neural network by increasing the network's tolerance to temperature and power supply variations.